Robohub.org
 

Flying high-speed drones into the unknown with AI

08 October 2021
When it comes to exploring complex and unknown environments such as forests, buildings or caves, drones are hard to beat. They are fast, agile and small, and they can carry sensors and payloads virtually everywhere. However, autonomous drones can hardly find their way through an unknown environment without a map. For now, expert human pilots are still needed to unlock the full potential of drones.

“To master autonomous agile flight, you need to understand the environment in a split second to fly the drone along collision-free paths,” says Davide Scaramuzza, who leads the Robotics and Perception Group at the University of Zurich and the NCCR Robotics Rescue Robotics Grand Challenge. “This is very difficult both for humans and for machines. Expert human pilots can reach this level after years of perseverance and training. But machines still struggle.”

The AI algorithm learns to fly in the real world from a simulated expert

In a new study, Scaramuzza and his team have trained an autonomous quadrotor to fly through previously unseen environments such as forests, buildings, ruins and trains, maintaining speeds of up to 40 km/h without crashing into trees, walls or other obstacles. All of this was achieved relying only on the quadrotor's on-board cameras and computation.

The drone’s neural network learned to fly by watching a sort of “simulated expert” – an algorithm that flew a computer-generated drone through a simulated environment full of complex obstacles. At all times, the algorithm had complete information on the state of the quadrotor and readings from its sensors, and could rely on enough time and computational power to always find the best trajectory.

Such a “simulated expert” could not be used outside of simulation, but its data were used to teach the neural network how to predict the best trajectory based only on the data from the sensors. This is a considerable advantage over existing systems, which first use sensor data to create a map of the environment and then plan trajectories within the map – two steps that require time and make it impossible to fly at high speed.
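The two paragraphs above describe what is often called privileged imitation learning: an expert with access to the full simulator state generates demonstrations, and a student is trained to reproduce the expert's actions from sensor data alone. The toy sketch below illustrates the idea with plain NumPy; the obstacle dynamics, function names and the polynomial "student" are illustrative assumptions, not the authors' actual quadrotor simulator or neural network.

```python
# Minimal, hypothetical sketch of privileged imitation learning.
# The expert sees the true state; the student only sees noisy sensor data.
import numpy as np

rng = np.random.default_rng(0)

def expert_action(obstacle_x):
    """Privileged expert: knows the exact obstacle offset and steers away.
    Smooth, saturating evasive command in [-1, 1]."""
    return -np.tanh(2.0 * obstacle_x)

# Collect demonstrations: the expert acts on ground-truth state, while the
# student only logs a noisy "sensor" reading of that same state.
obstacle_x = rng.uniform(-2.0, 2.0, size=5000)
sensor_obs = obstacle_x + rng.normal(0.0, 0.05, size=5000)  # noisy observation
actions = expert_action(obstacle_x)

# Fit the student: a tiny polynomial regressor stands in for the neural
# network that maps sensor data directly to a trajectory command.
X = np.vander(sensor_obs, 6)                      # polynomial features
w, *_ = np.linalg.lstsq(X, actions, rcond=None)   # least-squares fit

# After training, the student predicts evasive actions from sensors alone,
# with no map-building or explicit planning step in between.
test_obs = np.array([-1.5, -0.2, 0.2, 1.5])
pred = np.vander(test_obs, 6) @ w
```

The key property mirrored here is that the expert is only queried at training time; at test time the student needs nothing but its own (noisy) observations, which is why the real system can run on the drone's on-board cameras and computation.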

No exact replica of the real world needed

After being trained in simulation, the system was tested in the real world, where it was able to fly in a variety of environments without collisions at speeds of up to 40 km/h. “While humans require years to train, the AI, leveraging high-performance simulators, can reach comparable navigation abilities much faster, basically overnight,” says Antonio Loquercio, a PhD student and co-author of the paper. “Interestingly, these simulators do not need to be an exact replica of the real world. With the right approach, even simplistic simulators are sufficient,” adds Elia Kaufmann, another PhD student and co-author.

The applications are not limited to quadrotors. The researchers explain that the same approach could be useful for improving the performance of autonomous cars, or could even open the door to a new way of training AI systems for operations in domains where collecting data is difficult or impossible, for example on other planets.

According to the researchers, the next steps will be to enable the drone to improve from experience, and to develop faster sensors that can provide more information about the environment in less time – allowing drones to fly safely even at speeds above 40 km/h.

Literature

An open-source version of the paper, “Learning High-Speed Flight in the Wild” (Science Robotics, 2021), is available from the authors.

Media contacts

Prof. Dr. Davide Scaramuzza – Robotics and Perception Group
Department of Informatics
University of Zurich
Phone +41 44 635 24 09
E-mail: sdavide@ifi.uzh.ch

Antonio Loquercio – Robotics and Perception Group
Department of Informatics
University of Zurich
Phone +41 44 635 43 73
E-mail: loquercio@ifi.uzh.ch

Elia Kaufmann – Robotics and Perception Group
Department of Informatics
University of Zurich
Phone +41 44 635 43 73
E-mail: ekaufmann@ifi.uzh.ch

Media Relations University of Zurich

Phone +41 44 634 44 67
E-mail: mediarelations@kommunikation.uzh.ch





NCCR Robotics






©2021 - ROBOTS Association