Robohub.org
Podcast, episode 356

Controlling a Drone After Sudden Rotor Failure #ICRA2022 with Sihao Sun

by Abate De Mey
06 June 2022





Dr. Sihao Sun discusses his award-winning research on controlling a drone's flight after a sudden rotor failure.

Typical research in this area has addressed the case where one of the four rotors of a quadrotor suddenly and spontaneously stops working. That previous work does not fully account for the real-life scenarios in which rotor failure actually occurs, such as collisions with other drones, walls, or birds, or operation in degraded-GPS environments.

Dr. Sihao Sun

Dr. Sihao Sun is a postdoctoral research assistant at the Robotics and Perception Group (RPG) at the University of Zurich, directed by Prof. Davide Scaramuzza. He is currently working on control and perception for aerial robots (drones).

In December 2020, he received his PhD degree in Aerospace Engineering from the Control and Simulation Group of Delft University of Technology. His work on quadrotor fault-tolerant flight control has been featured by media outlets such as IEEE Spectrum.





Transcript (edited for clarity):

Abate: [00:00:00] Can you tell me a little bit about your research?

Sihao Sun: I have a very broad research interest … the object of the research is aerial robotics, like quadcopters, but also hybrid or even flapping-wing drones.

Abate: What’s a hybrid drone?

Sihao Sun: A hybrid drone is a drone with a hybrid configuration, like with a wing and rotors… So in that sense, you can leverage the advantages of both configurations. … you can do vertical takeoff and landing and also forward flight with better aerodynamic efficiency.

Abate: So you can travel a much greater distance, as well as having the ability to go up and down easily.

And so you had a paper here at ICRA, “Upset recovery control for quadrotors subjected to a complete rotor failure from large initial disturbances”. Could you talk about that paper?

Sihao Sun: Yes. Yes. I think that’s an old paper. It was published back in 2020. So that work is also about this “motor failure” problem.

But the background is that, before that paper, the existing works trying to tackle the motor-failure problem of the quadrotor always assumed that the failure happens when the drone is near a hovering condition.

So a drone is hovering and all of a sudden one motor fails; that assumption is somewhat reasonable, but it's still a little bit far from reality. Because imagine that in reality, when the motor [fails], it could be because of a collision with other drones or with birds or something.

So that collision would change the attitude of the drones dramatically.

Abate: So what you’re saying is that if the drone is just hovering in the air, it’s unlikely that the motor would just randomly fail in that position. It’s more likely that it’s actually because of a collision, which is the cause of…

Sihao Sun: Yes.

So even if the motor fails when it is hovering, the software still needs some time to detect the failure of that motor. That detection roughly takes about 100 milliseconds. During that period, the drone itself doesn't know that one motor has failed, so it is still using its original flight controller and its original [00:03:00] flight algorithms.

But the whole system has changed, so the drone is very likely to flip over in that period. Which also means that even if the failure happens around the hovering condition, the fault-tolerance algorithm will take over when the drone is already tipping over.
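[Editor's note: a rough, back-of-envelope illustration of how far the attitude can diverge during that detection window; the angular-acceleration value below is an assumed figure for illustration, not one from the interview or the papers.]

```python
# Back-of-envelope: how far can the attitude diverge during the ~100 ms it
# takes to detect a rotor failure? Assumes a constant angular acceleration
# from the unbalanced thrust; the 50 rad/s^2 figure is purely illustrative.
import math

detection_delay = 0.1   # s, approximate detection time mentioned above
alpha = 50.0            # rad/s^2, assumed net angular acceleration

# Starting from hover (zero angular rate), attitude error grows quadratically:
# theta(t) = 0.5 * alpha * t^2
theta = 0.5 * alpha * detection_delay ** 2
print(f"attitude error after {detection_delay * 1000:.0f} ms: "
      f"{math.degrees(theta):.0f} deg")   # about 14 deg for these numbers
```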

So that's why we want to look into this scenario when a drone is almost upside down… Existing approaches, before that paper was published, assumed that the drone is near hovering and used that assumption in their algorithms, which means that these algorithms will fail in real-life scenarios. We have validated in our experiments that [it is] very hard to recover the drone with those approaches when it is almost flipping over.

So that is the motivation for that paper. We then designed an algorithm that is able to recover the drone not only near hovering but also when the drone is upside down, and [we are] able to recover it in that situation. It's very challenging, but we proved that it works.

Abate: This is for a quadrotor where only one of the motors fails?

Also, just to understand how stable a quadrotor can be with just three motors and without a fault-tolerance algorithm… Will it just immediately fall? Fly erratically and eventually hit the floor or a wall?

Sihao Sun: They just behave randomly. It also depends on which algorithm is being used to control the drone, because there are many different algorithms, such as PID or LQR, and these control methods are cascaded.

So for example, even if you have an identical controller to control the attitude, the controller to control the position might be different. Those variations in controllers also lead to different behaviors after a motor failure, so it's very hard to say what will happen, but it's very likely that the drone will crash.
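[Editor's note: a minimal sketch of the cascaded structure described here, with a slow outer position loop feeding a faster inner attitude loop; the gains, rates, and small-angle sign conventions are illustrative assumptions, not the controllers from the papers.]

```python
# Minimal sketch of a cascaded quadrotor controller: a slow outer loop turns
# position error into a tilt-angle command, and a fast inner loop turns
# attitude error into a body-torque command. All values are illustrative.
import numpy as np

G = 9.81
KP_POS, KP_ATT, KP_RATE = 1.0, 6.0, 0.1   # assumed proportional gains

def outer_loop(pos, pos_ref):
    """Slow loop (e.g. ~100 Hz): horizontal position error -> desired tilt."""
    accel_cmd = KP_POS * (pos_ref - pos)      # desired horizontal acceleration
    pitch_ref = accel_cmd[0] / G              # small-angle approximation
    roll_ref = -accel_cmd[1] / G
    return np.array([roll_ref, pitch_ref])

def inner_loop(att, rate, att_ref):
    """Fast loop (e.g. ~1 kHz): attitude error -> body-torque command."""
    rate_ref = KP_ATT * (att_ref - att)
    return KP_RATE * (rate_ref - rate)

# The limitation discussed in the interview: outer_loop knows nothing about
# the inner loop's actuator limits, so after a rotor failure it can command
# tilt angles that the damaged vehicle cannot actually track.
```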

Abate: Yeah. And so what exactly did your research do? And some of your research was presented here at the 2022 ICRA, correct?

Sihao Sun: I have two papers published at ICRA 2022. One paper is also trying to address the upside-down problem, but it's using another algorithm, nonlinear MPC (nonlinear model predictive control), which is widely used in the robotics community.

But nobody has ever tried to use that in this motor failure condition. So [both of the papers I mentioned] try to solve this upside-down recovery problem.

But the older paper uses a cascaded structure. So basically the outer-loop controllers… We always separate the controllers into an inner-loop and an outer-loop controller, because the inner-loop [controller] can react much more quickly, while the outer loop [gives] high-level commands to the inner loop but reacts much, much more slowly.

So in those cascaded structures, in most cases the outer loop does not know the constraints of the inner loop. Sometimes it just gives a very unreasonable command to the inner loop, which the inner loop cannot follow very well. That's the problem with the cascaded structure.

For that old paper we published two years ago at ICRA, this problem happens sometimes, which makes the upset-recovery controller not 100% successful. That's why we resorted to the nonlinear MPC method, which we published at this ICRA, because that method does not use the cascaded structure.

The motor-speed command, the lowest-level command, is calculated using the whole state information, so it does not suffer from this cascading issue. And we show that this approach behaves much, much better than the previous one.
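[Editor's note: a minimal sketch of the single-layer idea behind nonlinear MPC, where motor commands over a short horizon are chosen directly by one constrained optimization over the full state; the toy single-axis model and all parameters are assumptions, not the formulation from the paper.]

```python
# Toy receding-horizon optimization: pick rotor thrusts directly, subject to
# actuator limits, to drive a single-axis attitude error to zero. This is a
# simplified illustration of a non-cascaded (single-layer) controller, not
# the NMPC formulation from the ICRA 2022 paper.
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.02, 10            # 20 ms steps, 10-step horizon (assumed)
INERTIA, ARM = 0.01, 0.1          # toy inertia (kg m^2) and arm length (m)
U_MIN, U_MAX = 0.0, 6.0           # per-rotor thrust limits (N): constraints

def rollout_cost(u, state):
    """Propagate a toy 1-axis model: state = [angle, rate], u = rotor thrusts."""
    angle, rate = state
    cost = 0.0
    for k in range(HORIZON):
        torque = ARM * (u[2 * k] - u[2 * k + 1])      # differential thrust
        rate += (torque / INERTIA) * DT
        angle += rate * DT
        cost += angle**2 + 0.1 * rate**2 + 1e-3 * (u[2*k]**2 + u[2*k+1]**2)
    return cost

state0 = np.array([1.0, 0.0])                         # 1 rad attitude error
u0 = np.full(2 * HORIZON, 3.0)                        # initial guess
bounds = [(U_MIN, U_MAX)] * (2 * HORIZON)             # actuator constraints
sol = minimize(rollout_cost, u0, args=(state0,), bounds=bounds,
               method="L-BFGS-B")
print("first rotor-thrust commands:", sol.x[:2])
```

Because the thrust bounds sit inside the same optimization that sees the full state, there is no outer loop that can issue commands the actuators cannot follow.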

Abate: And does this require any special sensors with high-frequency inputs to make this work?

Sihao Sun: No. It's the same as other research: using VICON or GPS as external sensors for state estimation and an onboard IMU for the body-rate estimates. So it's all the same as others; we don't need to use any special sensors different from other research groups.
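[Editor's note: a minimal sketch of that sensing setup, with high-rate IMU prediction corrected by lower-rate external position fixes; the rates and gains are assumed values, not those used in the work.]

```python
# Simple predict/correct position estimator: the onboard IMU drives a
# high-rate prediction step, and external fixes (VICON indoors, GPS outdoors)
# provide lower-rate corrections. Rates and gains are illustrative only.
import numpy as np

IMU_DT = 0.001                    # s, assumed 1 kHz IMU updates
K_POS, K_VEL = 0.5, 1.0           # assumed correction gains

pos_est = np.zeros(3)
vel_est = np.zeros(3)

def imu_update(accel_world):
    """Predict: integrate world-frame acceleration at IMU rate."""
    global pos_est, vel_est
    vel_est = vel_est + accel_world * IMU_DT
    pos_est = pos_est + vel_est * IMU_DT

def external_update(pos_meas):
    """Correct: pull the estimate toward an external position fix."""
    global pos_est, vel_est
    err = pos_meas - pos_est
    pos_est = pos_est + K_POS * err
    vel_est = vel_est + K_VEL * err
```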

Abate: And is the position and orientation — that’s partially IMU — Is it also using a vision system?

Sihao Sun: No, it's not using a vision system for that work. We also have a paper that was published in RAL last year, and it was also presented at ICRA 2021.

It's about using onboard vision sensors to solve this motor-failure problem. I'm also very honored that that paper was nominated as the best paper of RAL last year. That was announced yesterday. In that paper, we try to solve this problem from another perspective.

So it's not about how to recover the drone from the upside-down orientation, but about trying to use onboard sensors instead of external sensors, because that is also a realistic scenario. Imagine the motor fails in an area very close to a tall building and the GPS signal is degraded.

So you cannot rely on these external positioning sensors for state estimation; in that case, we have to use the onboard camera. But the motor-failure case for a quadcopter is very challenging for these onboard vision sensors because the drone spins very fast. It has a yaw spin of over four revolutions per second, I would say.

It's quite fast for standard vision-based estimation algorithms, because of the motion blur and because of the ill-posed conditions. So in that paper, we tried to solve that problem.
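[Editor's note: a rough estimate of why a four-revolutions-per-second yaw spin is hard for a standard frame camera; the exposure time, field of view, and resolution below are assumed values for illustration.]

```python
# Rough motion-blur estimate for a standard frame camera on a drone yawing at
# ~4 revolutions per second. Exposure time, field of view, and resolution are
# assumed values, not figures from the paper.
yaw_rate = 4.0 * 360.0            # deg/s (4 rev/s, as mentioned above)
exposure = 0.005                  # s, assumed 5 ms exposure
fov_deg, width_px = 90.0, 640.0   # assumed horizontal FOV and image width

blur_deg = yaw_rate * exposure                 # rotation during one exposure
blur_px = blur_deg / fov_deg * width_px        # approximate blur in pixels
print(f"{blur_deg:.1f} deg of rotation per exposure ~= {blur_px:.0f} px of blur")
# ~7.2 deg, roughly 50 px of smear for these numbers; an event camera has no
# fixed exposure window, so it does not suffer from this blur.
```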

We compared two types of cameras: one is the standard frame camera, for which we fine-tuned [our algorithms], and the other is the event-based camera, which is a new type of neuromorphic sensor.

It's also a very hot research topic in the computer vision community and the robotics community. That kind of sensor does not suffer from this motion blur. So we compared these two kinds of sensors in the motor-failure scenario, and we conclude that with the standard camera it is still possible to control [the quadrotor] overall, even if a motor fails.

But the event-based camera can do better, especially in lower-light conditions. That was the conclusion of that paper.

Abate: Well lower light conditions… and also I imagine if this is outdoors and it’s spinning very quickly and in and out of the sun that might cause some exposure issues with the standard camera.

So one of the big benefits of the event camera is that it can work across a larger range of brightness and then also the “frames”… it’s not really operating the same way with a “fixed frame rate”. So motion blur is much less of an issue.

Sihao Sun: Exactly.

Abate: So, there are a couple of different methods that you proposed. Is the last one that you’re talking about the one that you think is the most viable for use in actual industry?

Sihao Sun: Well, you know, each paper is only addressing a single problem, but for industry you have to combine all of them together. So what I would recommend for industry is to apply the model-predictive-control method for drones, because we have shown that that method is very useful in these motor-failure conditions compared with the cascaded structures.

But you can also use a fine-tuned onboard camera, or use an event-based camera for that, even though event cameras are quite expensive at the moment… but perhaps in the future, you can use that.

Abate: Perhaps in the future when the price goes down with mass production, but not yet for most use cases.








Abate De Mey, Podcast Leader and Robotics Founder




