Abate talks to Maxime Gariel, CTO of Xwing, about the autonomous flight technology they are developing.
At Xwing, they are converting traditional aircraft into remotely operated aircraft. They do this by retrofitting planes with multiple sensors, including cameras, radar, and lidar, and by developing sensor fusion and highly accurate perception algorithms that allow their planes to understand the world around them.
Xwing’s autonomous flight technology allows a plane to taxi at the airport, take off, fly to a destination, avoid airborne and ground threats, and land, all without any human input. This technology not only enables autonomous flight but may also enhance the safety of manned aircraft by improving a plane’s ability to understand its surroundings.
Maxime Gariel is the CTO of Xwing, a San Francisco-based startup whose mission is to dramatically increase human mobility using fully autonomous aerial vehicles. Xwing is developing a detect-and-avoid system for unmanned and remotely piloted vehicles. Maxime is a pilot, but he is passionate about making airplanes fly themselves.
Maxime joined Xwing from Rockwell Collins where he was a Principal GNC Engineer. He worked on autonomous aircraft projects including DARPA Gremlins and the AgustaWestland SW4 Solo autonomous helicopter. Before becoming Chief Engineer of the SW4 Solo’s flight control system, he was in charge of the system architecture, redundancy, and safety for the project.
Before Rockwell Collins, he worked on ADS-B based conflict detection as a postdoc at MIT and on autoland systems for airliners at Thales. Maxime earned his MS and PhD in Aerospace Engineering from Georgia Tech and his BS from ISAE-Supaéro (France).
This interview has been edited for clarity.
Welcome to the Robohub Podcast. How are you doing? Can you tell us a little bit about your background?
I’m Maxime Gariel, I’m the CTO at Xwing, and my background has always been in aerospace engineering. I started my career developing autopilots for airliners, especially auto-landing, and that’s when I discovered what I loved. I knew I loved airplanes, but making them fly themselves like that was a revelation.
After that, I got my master’s and Ph.D. in aerospace engineering, looking at air traffic management and what happens when you add more technology into this already very complex system, where you have sectors, a lot of humans directing airplanes, pilots, and airports. What happens when you put more and more technology into that system, like ADS-B, or self-separation, where you allow pilots to separate themselves? What happens when you have a problem, when you have failures? It was always about looking at what is your plan B, your plan C, and ensuring that the system can degrade gracefully.
Then I did a short postdoc at MIT looking at conflict avoidance between aircraft, and then spent several years at Rockwell Collins converting existing helicopters into fully automated helicopters: taking 1980s helicopters and adding redundant flight control systems, going from fully manned to fully unmanned with a high level of reliability and redundancy. I joined Xwing when we opened the office about five years ago.
So for quite a large portion of your career, you’ve been working on building autonomous aircraft. How advanced is this industry? How long has this been around?
What is very interesting is, you know, the first autopilot was built in 1912. So automation in the aviation industry is not something new; we are standing on the shoulders of giants. Lots of work has been done over decades, and now we’re reaching a point where all of the pieces of technology are mature enough that we can reliably control those aircraft remotely and monitor them remotely.
It’s really about the confluence of all the technology and safety work that has been done over the past 100 years, plus some new technology that comes in particular from the self-driving car industry. On the perception side, we’re leveraging a lot of that technology.
Can you tell us a little bit about what your team at Xwing is building and the product that they’re creating?
The mission at Xwing is to dramatically change how humans move over distances that are typically too long to drive but too short to fly commercial. So think about regional mobility, 100 to 500 miles. What we are doing is putting together all of the technology so that people and goods can move autonomously throughout the airspace, using what will eventually be clean vehicles. But first of all, we’re focusing on the autonomy part.
The first product that we are looking at is retrofitting existing aircraft, and the target is the Cessna Caravan. It’s an aircraft that can carry 10 to 14 passengers, depending on where you are in the world, for distances up to a thousand miles, so it’s a pretty sizable aircraft.
We are adding our technology so that we can operate it from the ground without a pilot on board. It is very exciting work with lots of challenges, both on the technology side and on the regulatory side. It has never been done before, so it’s pushing the boundaries and working with the regulators, in this case the FAA, to get approval to do so.
So this company is building autonomous aircraft to be able to ship goods, maybe not people but just goods, from one place to another without anybody on board the airplane?
That’s correct. We’re taking a very pragmatic approach. Our goal is to change human mobility, but we’re doing that one step at a time, so we’re starting with cargo: you don’t have to deal with people, you don’t put people at risk initially, and the public acceptance is going to be easier. And, you know, the U.S. has over 5,000 small airports that we can leverage, and very few of those are used for commercial flights.
Our goal is to leverage this infrastructure to bring, initially, cargo to all the people around us and to bring communities together. You have a lot of small communities that don’t have access to fresh food or legal documents overnight, so it’s really about shortening distances and bringing a city experience every day to people around the US and around the world.
What are some of the benefits of taking the human pilot out of the loop for these autonomous flights? What are some of the extra features that can be added and some of the advantages?
So one of the main benefits is making the asset more available. Right now we have enough aircraft, but it’s really difficult for the operators to get pilots to fly the routes that they need. When there is demand somewhere, we may be able to get an aircraft, but sometimes it’s really difficult to get a pilot to live in those locations and fly maybe a couple of hours a day.
It’s about going from having to find the demand and the pilot and the aircraft to only having to deal with the aircraft. For the demand, you can rebalance your network much more easily; you can shift all of your assets overnight to where they’re needed. So it opens up a lot of new markets.
At the same time, our first market is to target logistics companies like FedEx and UPS. Every day they fly what we call “feeder flights,” where a large aircraft comes in very early in the morning to a major hub, and they offload all the packages either to small airplanes or to trucks. The smaller planes will fly another 100 miles, for instance, and then offload to a truck so that the packages can be at people’s houses by 10 a.m. on the business day, in some cases.
So it’s helping the larger logistics companies deliver on the promise of next-day delivery and easing some of the problems of hiring pilots. There is a significant shortage of pilots. COVID made that even worse, because lots of pilots went into early retirement at the beginning of COVID, and now that demand is picking up, it’s really hard to hire pilots again.
If you were to break down the problem space of autonomous flight into three categories:
> perception of the environment
> planning for the route
> control of the vehicle
How would you describe the challenges that each of these has at a high level?
The problem we are tackling is the typical robotics problem, as you broke it down: perception, planning, and control. Perception is one of the new pieces and the most challenging piece that we have to deal with.
In our case, the perception starts on the airport surface, where we have to be able to taxi, and that has never been done before, actually. We need to be able to see around us: other aircraft, you know, the markings, other vehicles that may be driving around, people, animals. So this has a lot in common with the self-driving car problem.
So we have to solve this self-driving car problem in a more controlled environment. At the airport there are fewer things that can happen, and for us, if we have an obstacle, we stop. So it’s the self-driving car problem, but easier.
Then, when we start flying, we have what is called a “detect and avoid” problem to solve, which is the ability of the aircraft to detect other traffic in flight, airplanes or helicopters or whatever it may be, and then safely reroute around it. This is a very challenging problem because we have to look really far.
We have to be able to see other airplanes up to five miles away, so the sensors for self-driving cars are not well suited. The reason we have to see that far is because we fly quite fast. We fly at 170 knots, and another airplane may also be flying at 170 knots, so that’s 340 knots, which is around 390 miles per hour of closing speed. That’s why you have to see really far: to have enough time to safely avoid and maintain a safe distance.
The separation distance we want between aircraft is at least 2,000 feet. So you need to have enough time to see the other aircraft, make your decision, change your trajectory, and maintain 2,000 feet of separation.
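As an aside, the closing-speed arithmetic above can be sanity-checked in a few lines. The speeds and detection range are the figures quoted in the conversation; the head-on geometry is an assumed worst case:

```python
# Back-of-the-envelope check on the closing-speed numbers above.
# Assumptions: head-on encounter, both aircraft at 170 knots,
# detection range of 5 statute miles (figures from the conversation).

KNOTS_TO_MPH = 1.15078          # 1 knot = 1 nautical mile per hour

own_speed_kt = 170.0
intruder_speed_kt = 170.0
closing_speed_kt = own_speed_kt + intruder_speed_kt    # 340 knots head-on

closing_speed_mph = closing_speed_kt * KNOTS_TO_MPH    # ~391 mph

detection_range_mi = 5.0
time_to_impact_s = detection_range_mi / closing_speed_mph * 3600

print(f"closing speed: {closing_speed_mph:.0f} mph")
print(f"time from detection to impact: {time_to_impact_s:.0f} s")
```

That works out to roughly 46 seconds from first detection to impact, which is the whole budget for tracking, deciding, and maneuvering.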
How does that compare to, say, a human pilot who’s flying a plane and seeing the other planes out there? Are they also just visually detecting these other planes, and then maybe looking at sensors like radar to figure out what the path is going to be?
Yes. When pilots are in contact with air traffic control, air traffic control is going to let them know of other traffic and, in some cases, ask the pilot to change their trajectory to avoid other aircraft. But not every aircraft is talking to air traffic control. That is when it becomes challenging.
We have to be able to detect airplanes that do not talk to air traffic control and may not even be equipped with what is called a transponder. Most aircraft have a transponder that broadcasts their identity and, in some cases, what is called ADS-B, where they also broadcast their position and velocity. But not all aircraft have that, so we need to get to the lowest common denominator, which is being able to detect all planes, you know, the very old World War II airplanes that don’t have a radio or a transponder. We need to be able to see all of these aircraft. When you have a pilot, the pilot typically has to look out and find those airplanes, and it’s very challenging. It’s very difficult, and in many cases pilots miss these other airplanes, so our system can actually do better than a pilot.
That’s a little bit terrifying to hear. Is this a large number of planes, or is it just all the small little planes?
It’s all little planes. All of the large airliners that we commonly fly to go from point A to point B fly in different classes of airspace: they fly in airspace where everybody needs to have a transponder and everybody has to talk to air traffic control. The market we are playing in, the small feeder flights, doesn’t fly as high. We fly between five and ten thousand feet, where all the general aviation people fly, so that’s where you have a lot more traffic. The airliners start from major airports and climb to eighteen thousand feet and above, where everybody is talking to air traffic control. We are flying in a different class of airspace.
And it sounds like the communication goes from a plane to air traffic control and then to other planes, so they’re communicating through this middleman. There’s no direct plane-to-plane communication that happens?
The pilot will talk to air traffic control, which will talk to other airplanes, but you share the frequency, so you can hear who is talking to whom, though you don’t always have the full view of where the other people are. Then the airplanes themselves may be equipped with a transponder that broadcasts their position so that other airplanes can grab this information and display it to the pilots.
So in some cases we know, from very far away, where the other airplanes are and where they’re going, and in some cases we have no idea and we have to detect them up close and do an avoidance maneuver. That’s the challenging portion of detect and avoid.
What are some of the benefits of having Xwing’s autonomous perception and flight planning on board versus a traditional aircraft?
The aircraft is able to go from one airport to the other without the need of a human. So, from a benefit standpoint, it removes the need for a pilot collocated with the aircraft. It’s also going to improve safety, as our system monitors the health of the aircraft throughout the flight and across flights, so that we can do preventive maintenance. But mostly, the aircraft is equipped with an almost 360-degree view: when we taxi we can see everything that’s happening; when we’re flying, the detect and avoid system can maintain a safe separation distance; and we can also optimize the trajectory of the aircraft, in our case to minimize fuel burn.
I want to go back to the previous question. We talked about the perception side of the problem. Then, on the planning side, the aircraft needs to have the full picture of who is around and what they’re doing so that we can reroute in real time, and that’s new. Currently, aircraft require the pilot to input modifications to the flight plan. In our case, we are able to dynamically reroute, either in case of contingency (if we lose the engine, the aircraft can determine what the closest airport is, how to get there, and whether it has enough energy to reach that airport) or, if we see other airplanes, the aircraft can navigate around them.
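A hypothetical sketch of that engine-out contingency logic: estimate the still-air glide range from altitude and an assumed glide ratio, then pick the nearest airport within reach. All numbers and names here (the glide ratio, the airport coordinates) are illustrative assumptions, not Xwing’s actual planner:

```python
# Toy engine-out planner: which airports can we still reach, and
# which is closest? Glide ratio and airports are made-up values.
import math

GLIDE_RATIO = 10.0   # assumed metres forward per metre of altitude lost

def glide_range_m(altitude_m: float) -> float:
    """Still-air distance the aircraft can cover with no engine."""
    return altitude_m * GLIDE_RATIO

def best_airport(position, altitude_m, airports):
    """airports: list of (name, (x_m, y_m)) in a local flat-earth frame."""
    reach = glide_range_m(altitude_m)
    reachable = [
        (math.dist(position, xy), name)
        for name, xy in airports
        if math.dist(position, xy) <= reach
    ]
    return min(reachable)[1] if reachable else None

airports = [("ALPHA", (25_000.0, 0.0)), ("BRAVO", (8_000.0, 6_000.0))]
print(best_airport((0.0, 0.0), altitude_m=2000.0, airports=airports))
```

At 2,000 m with a 10:1 glide ratio, the reach is 20 km, so only the closer of the two hypothetical airports qualifies. The real problem also has to account for wind, terrain, and energy margins.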
And finally, the controls problem. The controls problem is the part that has been worked on for decades, so there is no need to reinvent the wheel. Autopilots, flight control computers, servos: those are things that aviation companies have been doing for many years that we can leverage.
Some parts are new, though. No aircraft can take off on its own, so that’s a new part of the problem to be solved. Aircraft cannot taxi on their own either, so we have to solve the taxiing problem, which is actually quite challenging. Aircraft are meant to fly; they can also drive, but they’re not the best at driving.
And finally, landing is also one of the challenges. In that case it’s a challenge of controls but also of navigation. We want to make sure that we can land on the runway every time, regardless of the sensors that we’re using. Large aircraft can land automatically at about 55 airports in the US, and we want to target 5,000 without adding a lot of new technology. So how to solve this navigation and localization problem very accurately is one of the challenges.
Just to capture an idea of the scale of this challenge: right now, taxiing at an airport is a situation that has a lot of room for accidents, for large catastrophes. You know, we’ve seen some high-profile cases where a plane is just taxiing and another plane lands into it, and both planes have massive casualties. How difficult is this problem to solve, and what are you doing to solve the taxiing problem?
There have been a lot of incidents where airplanes would clip the wings of other airplanes, because they don’t realize how wide they are, or clip, you know, a light post or something like that. With the sensors we have on board, we know exactly where obstacles are with respect to us, so we can and we will stop before there is any incident, and we have really good situational awareness of the aircraft and its surroundings, so that taxiing will be much, much safer.
Is there any reason why these sensors and these software algorithms wouldn’t also be installed on manned flights with a pilot, just to decrease that error rate on their end?
It will be. Already over the past ten years or so, cameras have been installed on large airplanes to help the pilot see the wingtips and the runway centerline. There is no reason not to eventually add those new sensors. I think, you know, it takes a bit of time for new technology to get into the aviation realm because it needs to be certified, but once we certify it, then it can be used on much larger aircraft.
A lot of the technology that we are developing can actually help human pilots. If you think about detect and avoid, even though we want to close the loop entirely, so that the aircraft gets the detect and avoid data and re-plans automatically, just providing that situational awareness — information about other traffic, where they are with respect to us, what they’re doing — tremendously helps current pilots and could increase safety and reduce the number of accidents.
In small aviation there are a lot of accidents where two aircraft are landing on the same runway and one lands on top of the other, just because they cannot see through the nose of the aircraft; they can’t see the other person. With a detect and avoid system, that situation would never happen. For helicopter pilots it’s also challenging, because they can change heading and direction very quickly, so having all of this information provided to them would be extremely valuable.
So you’ve mentioned having cameras installed at various parts of the airplane and how that can create a situation where you don’t have these blind spots anymore. What are some of the other sensors, and some of the sensor fusion, happening on this plane to make this a magical product?
On the ground, for taxiing, we’re using a combination of cameras and lidar, as well as GPS, of course. Using the cameras and lidar together helps us train the cameras to learn what’s around, so eventually we could get rid of the lidar. In the short term, the lidar is really good for getting depth, for knowing how far away other airplanes and obstacles are.
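The lidar-as-ground-truth idea can be sketched schematically: project each lidar return into the camera image and use its measured depth as the training label for a learned monocular depth estimator. The projection model and the numbers below are toy placeholders, not Xwing’s pipeline:

```python
# Schematic of supervised depth labelling: lidar points are projected
# into the image, and the pixel at each projection gets the lidar
# depth as its label. The pinhole model here is a crude stand-in.

def project_to_image(lidar_point):
    """Stand-in for a real lidar->camera projection; returns (u, v, depth)."""
    x, y, z = lidar_point              # x forward, y left, z up, metres
    return (int(100 * y / x), int(100 * z / x), x)   # toy pinhole model

training_pairs = []
lidar_points = [(20.0, 1.0, 0.5), (35.0, -2.0, 1.0)]
for pt in lidar_points:
    u, v, depth = project_to_image(pt)
    # The image patch around (u, v) becomes a training example
    # labelled with the lidar-measured depth.
    training_pairs.append(((u, v), depth))

print(training_pairs)
```

Once enough of these (pixel, depth) pairs are collected across conditions, a camera-only model can be trained to predict depth, which is what would let the lidar be retired.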
Once we start flying, we use a combination of radar and ADS-B. ADS-B is an aviation-specific system where we use the existing transponder as well as GPS: each aircraft broadcasts its position and velocity, and other aircraft can receive that. So you can see from really far away, like 50 miles, where other airplanes are. That’s very helpful, but not everybody has it, and that’s the problem. For the aircraft that have a transponder, we use ADS-B; for the aircraft that are not equipped, we use the radar and the cameras. We fuse all of this to get the best track possible for all the other airplanes, and then we make decisions based on those tracks.
So the long-term vision is to take your lidar and use it as ground truth to train the video from your cameras to detect the distance to objects. Is this going to be a 3D point cloud of the environment around the plane?
The goal is to remove the lidar, right? If we have expensive sensors that we don’t need, we will be happy to remove them; we just don’t need to in the short term. But you’re right, the goal is to detect all of the objects around us and figure out: are they in the way, are they a danger to us, can we keep on taxiing safely, or do we need to reroute? When we are airborne, the problem is much more challenging.
As I was describing, at the distances we’re trying to detect, seeing a small airplane five miles away with a camera is extremely challenging, and the cameras also don’t work well at night. They don’t work well in the rain. We need to be able to detect those other vehicles in all weather conditions, and that’s where radar, for instance, works well: RF is not impacted too much by weather or by the time of day.
So another thing about this environment you’re working in, especially when you’re flying: you don’t have a lot of features around you, and that’s what makes it very different from, say, a self-driving car, or from taxiing, where you can judge the distance to objects a little bit from a structured, feature-rich environment. Once you’re in the air and a plane is five miles away and coming towards you, I can imagine it would be very difficult to gauge what that distance is, and there would be a large error bar there.
Yes, you’re completely right. If you take a Cessna 172 and a Cessna Caravan, from far away they look very similar, but one is twice as big as the other. So if you just look at this image against the blue sky, is it the Cessna 172 that is, say, two miles away, or is it the Cessna Caravan that is four miles away? Those two things are going to have the same number of pixels and almost the same shape.
So you’re completely right: using only cameras to detect at long distances, we have this range uncertainty problem, so the cameras are not very well suited. What they can be used for is to validate some of the other sensors.
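The range ambiguity being described follows directly from geometry: an object twice as large at twice the distance subtends the same angle and therefore covers the same number of pixels. The wingspans and distances below are illustrative assumptions, not exact aircraft figures:

```python
# Demonstration of the size/range ambiguity for a camera: doubling
# both the object size and its distance leaves the angular size
# (and hence pixel footprint) unchanged.
import math

def angular_size_deg(span_m: float, range_m: float) -> float:
    """Angle subtended by an object of width span_m at distance range_m."""
    return math.degrees(2 * math.atan(span_m / (2 * range_m)))

small = angular_size_deg(span_m=11.0, range_m=3200.0)   # small plane, ~2 mi
large = angular_size_deg(span_m=22.0, range_m=6400.0)   # twice as big, ~4 mi

print(round(small, 4), round(large, 4))  # identical angular sizes
```

A camera alone cannot break this tie; it needs either a recognized aircraft type (to fix the true size) or an independent range measurement from another sensor.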
One of the problems of ADS-B — ADS-B, as I was describing earlier, is this communication protocol where everybody broadcasts their position — is security: it’s not a secure protocol. Anybody can build a software-defined radio and start broadcasting. It’s illegal, but people can do this type of thing. If suddenly a bad actor wants to generate a lot of fake airplanes around me, I need a way to validate whether the targets I receive are fake or real.
So one way is to use the camera and ask, “do I see an object where I got this ADS-B ping?” The cameras can be used to validate the radar or to validate ADS-B and improve the overall solution.
The radar is really good at range, for instance, but it’s not very good in azimuth and elevation. The camera, on the other hand, is going to be very good in azimuth and elevation, because that’s the pixels, but really poor at range. So if you put the two together, then you have a really good position and velocity for the other aircraft.
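One simple way to realize this complementary fusion is to take the range from the radar (where it is accurate) and blend the two bearing estimates by inverse-variance weighting, so the fused bearing is dominated by the much tighter camera measurement. The numbers and variances below are illustrative assumptions, not Xwing’s actual tracker:

```python
# Minimal radar/camera fusion sketch: radar supplies range; the two
# bearing measurements are combined by inverse-variance weighting.
# A real tracker would use a Kalman filter over full state, not this.

def fuse_track(radar_range_m, radar_bearing_deg, radar_bearing_var,
               camera_bearing_deg, camera_bearing_var):
    w_radar = 1.0 / radar_bearing_var
    w_cam = 1.0 / camera_bearing_var
    bearing = ((w_radar * radar_bearing_deg + w_cam * camera_bearing_deg)
               / (w_radar + w_cam))
    fused_var = 1.0 / (w_radar + w_cam)   # tighter than either sensor alone
    return radar_range_m, bearing, fused_var

rng, brg, var = fuse_track(
    radar_range_m=8000.0,
    radar_bearing_deg=12.0, radar_bearing_var=4.0,     # radar: coarse bearing
    camera_bearing_deg=10.5, camera_bearing_var=0.04,  # camera: fine bearing
)
print(rng, round(brg, 2), round(var, 4))
```

The fused bearing sits almost on top of the camera estimate, while the range comes straight from the radar, which is exactly the complementarity being described.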
That’s a bit of a scary point you bring up, that you can just broadcast these fake plane signals over ADS-B. Is this something that’s been done before, or is it just a theoretical vulnerability?
Some researchers have done it. The consequences of doing it are significant, because you’re messing with the federal government, so I don’t think anybody is going to start doing this from their bedroom, but somebody who has ill intentions could do it.
So it can happen. Research has shown that it can be done, but I haven’t seen it done with bad intentions yet.
But regardless of whether it’s bad intentions, or faulty readings from some of the sensors, or some sort of weird reflection, the sensor fusion algorithm can compensate for that and use each sensor to double-check the outputs from the others, and that’s how you create this redundancy when you’re flying.
When you go flying, you don’t think twice about the safety. You assume that you’re going to get there, so aviation has this really high requirement for safety.
The target level of safety is one catastrophic failure (one failure that can lead to a loss of life) per billion flight hours.
When Airbus or Boeing designs the aircraft, they have to demonstrate, theoretically, that the aircraft will have a level of safety and reliability of one catastrophic failure per billion flight hours.
In our case, we have smaller airplanes, so the required level of safety is lower, but we’re still talking about one per hundred million flight hours, which is very significant. The way to achieve it is to have redundancy and various ways of doing the same thing. That’s why you have different sensors: when they all work, you get the best solution possible, and when one starts failing, you still have a backup. Aviation is all about having a plan A, B, C, and D.
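The arithmetic behind “plan A, B, C, and D” is worth seeing once: if independent channels each fail with some small probability per hour, requiring all of them to fail multiplies those probabilities together. The 1e-4 rate below is a made-up illustration, not a certified failure rate for any real system:

```python
# Toy illustration of how redundancy buys orders of magnitude of
# reliability, in the spirit of the certification targets above.
# Assumes channel failures are independent, which real safety
# analyses must justify (common-cause failures break this).

per_hour_failure = 1e-4          # hypothetical single-channel rate

# Two independent channels must both fail for the function to be lost.
dual_failure = per_hour_failure ** 2       # ~1e-8 per flight hour
triple_failure = per_hour_failure ** 3     # ~1e-12 per flight hour

print(f"{dual_failure:.0e} {triple_failure:.0e}")
```

So two modest channels can, on paper, reach the one-in-a-hundred-million-hours regime mentioned above, which is why aviation leans so heavily on redundancy rather than on one perfect sensor.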
So you touched earlier on training your cameras based on your lidar output, doing some machine learning, so you can get some nice distance and trajectory estimates of other planes and objects flying by.
Are there any other interesting machine learning algorithms that are being done under the hood in this plane?
We have to certify the aircraft with the FAA and the community, so we need to use deterministic methods as much as possible. We are not relying on AI and machine learning for safety-critical items, and we don’t need to. That’s the nice thing: we don’t need to. We could, but it’s not required. The reason is that we need to demonstrate, to the really high level of safety I was describing a few minutes ago, that the system will perform as intended, with an extremely small probability of failure, and right now we have no way of showing that AI can do that.
The other place where AI and machine learning can be used, for instance, is localization. If you think about landing, detecting a runway is something where machine learning and AI are perfect. It’s a perfect application: instead of looking for a cat or a dog, you look for a runway, and if that’s the runway, what is my position with respect to it?
So in that case, those techniques can be used to complement the existing sensors, to validate the GPS position that we have, or, in some cases, to augment it. It’s not the primary method; it’s an augmentation or a backup.
So is the attitude toward certification similar for self-driving cars as it is in the aviation space? Or is it a little bit more extreme because aviation is just like “safety is number one!”?
It’s much more extreme. For self-driving cars, Tesla is pushing updates on, I don’t know if it’s a daily basis, but very regularly, and nobody looks at them. People try it like guinea pigs.
In the aviation space, before we put our technology in the hands of customers, before we can use it to generate revenue, it has to be certified.
The FAA has to go through every single line of code, every requirement; we have to demonstrate the level of safety before we can put it on the market. So it’s much, much more challenging than in the self-driving car space.
To backtrack a bit, about the market, who are the customers for this? How would you categorize the market into different groups?
While the long-term vision is to move people and to move the world, the short-term goal, and what we’re very focused on, is to introduce this technology for cargo and for the logistics companies.
So FedEx, UPS, DHL, Amazon, and then a lot of other people that have urgent cargo needs. It has to be payloads that have a certain value, because it’s the value of time that you’re saving. But initially, we’re looking at the feeder market for the larger logistics companies.
So, for instance, FedEx owns its own fleet of Cessna Caravans. They have about 260 of them, and they lease them to small operators; a number of small operators fly them for FedEx. UPS, on the other hand, delegates to operators that own their own aircraft and are in different geographic parts of the US. All of those operators, a very fragmented market, provide the service to UPS and FedEx.
So that’s the market we’re entering. As we solve this problem of safe integration of aircraft in the airspace, we’ll be able to put passengers on board and expand around the world, but the market is already very significant in the US.
What are some of the next steps at Xwing?
So the next steps: there are several of them.
One of them is to continue the certification of our technology. This is a process that takes time and is very rigorous, so we keep on marching there pretty actively.
Then we want to demonstrate some use cases to our partners and carry cargo for them. We have a Part 135 certificate, meaning that Xwing is an airline and we are allowed to carry cargo for revenue.
We want to work with our partners to demonstrate the use cases, initially in what is called “OPA,” or “optionally piloted aircraft”: we’re going to have a safety pilot on board, but we’re going to be controlling the aircraft from the ground. That way we can demonstrate how you integrate into a larger airport, how you interact with air traffic control, and what all the corner cases are that we didn’t think of. You know where you operate, but if you go into other locations, what are all the specific cases that we need to account for in our final product?
So our goal is really to be with our partners, with our customers, at the airport, in the air, so that we can gather all those requirements and design the best product.
Our approach is not to design a product and then sell it. We really want to understand the needs. We are designing the product, operating the aircraft, and certifying it at the same time, so that we can update the technology, certify again, and put it into service until we are really happy with it. At that point, we can think about licensing it. But in the short term, being able to close the loop ourselves and iterate on the certification aspect is very important to move forward fast.
And at this stage of the company, are you already selling to paying customers, or are you waiting for full certification before you’re able to do this?
So we are an airline; we carry cargo for our customers, so we do have paying customers, and progressively we’re adding the technology. We’re not waiting to have the full solution before entering the market.
So you’re an airline delivering things yourself, and temporarily, for now, piloting these aircraft, while there’s also this sensor and data capture happening to train the autonomy?
Exactly. We have the pilot, and we put our technology on board so that we gather data for certification. For instance, detect and avoid is a new technology; nobody has certified such a system yet, so we need to gather enough flight data to show how it performs in all conditions. By having our sensors and our technology on this aircraft performing daily operations, we’re building up all the required information and all the data sets for certification. So it’s a win-win situation: the FAA gets a lot of data, we get a lot of flight hours, and then we can certify it faster.
Thank you very much for speaking with us today. This was amazing.
You’re very welcome. Thank you very much for having me.
Robohub Podcast is a non-profit robotics podcast where we interview experts in robotics, including researchers, entrepreneurs, policy makers, and venture capitalists. Our interviewers are researchers, entrepreneurs, and engineers involved in robotics. Our interviews are technical and, often, get into the details of what we are discussing, but we make an effort to have our interviews understandable to a general audience.