Using 3D snapshots to control a small helicopter

by Sabine Hauert
30 September 2012




In the latest article in the journal Autonomous Robots, researchers from the Australian Defence Force Academy present a new control strategy for small flying robots that uses only vision and inertial sensors.

To control a flying robot, you usually need to know its attitude (roll, pitch, yaw), its position in the horizontal plane (x, y), and its height above the ground (z). Attitude measurements are provided by inertial sensors on board the robot, but most flying robots rely on GPS and additional range sensors such as ultrasound, laser or radar to determine their position and altitude. The GPS signal, however, is not always available in cluttered environments and can be jammed, and every additional sensor adds weight the robot must carry. Instead, Garratt et al. propose to replace position sensors with a single small, low-cost camera.

By comparing a snapshot taken by a downward-pointing camera with a reference snapshot taken at an earlier time, the robot can calculate its displacement in the horizontal plane. The loom of the image is used to calculate the change in altitude: image loom is the expansion or contraction of the image as the camera moves towards or away from the ground, as can be seen in the images below. By reacting to these image displacements, the robot can control its position; a rough code sketch of the idea follows the figure.

Grass as seen from altitudes of 0.25 m, 0.5 m, 1.0 m and 2.0 m (from left to right).
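
The paper describes its own pipeline in detail; purely as an illustration of the idea, here is a minimal sketch, assuming OpenCV, of how the horizontal displacement could be recovered by phase correlation and the loom by a log-polar transform, with both fed into a toy position-hold controller. All function names, gains and sign conventions below are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def snapshot_offsets(reference, current):
    """Estimate (dx, dy, scale) relating `current` to `reference`.

    dx, dy : translation in pixels (horizontal drift of the vehicle).
    scale  : image expansion factor ("loom"); > 1 suggests the camera
             has moved closer to the ground, < 1 further away.
    Sign conventions vary between OpenCV versions; treat as illustrative.
    """
    ref = np.float32(reference)   # phaseCorrelate needs float32/float64
    cur = np.float32(current)

    # 1. Horizontal displacement: phase correlation finds the shift
    #    that best aligns the two snapshots.
    (dx, dy), _ = cv2.phaseCorrelate(ref, cur)

    # 2. Loom: in log-polar coordinates a uniform scale change becomes
    #    a pure shift along the radial axis, which phase correlation
    #    can again recover.
    h, w = ref.shape
    center = (w / 2.0, h / 2.0)
    max_radius = min(center)
    ref_lp = cv2.warpPolar(ref, (w, h), center, max_radius, cv2.WARP_POLAR_LOG)
    cur_lp = cv2.warpPolar(cur, (w, h), center, max_radius, cv2.WARP_POLAR_LOG)
    (d_logr, _), _ = cv2.phaseCorrelate(ref_lp, cur_lp)

    # A radial shift of d_logr pixels corresponds to a scale factor of
    # exp(d_logr * log(max_radius) / w) in the original image.
    scale = float(np.exp(d_logr * np.log(max_radius) / w))
    return dx, dy, scale

def hold_position(dx, dy, scale, k_xy=0.005, k_z=0.5):
    """Toy proportional controller (hypothetical gains): drive the image
    displacement and log-loom back to zero to hover over the reference
    snapshot at the reference altitude."""
    vx = -k_xy * dx               # counter sideways drift
    vy = -k_xy * dy
    vz = k_z * np.log(scale)      # looming image (scale > 1) => climb
    return vx, vy, vz
```

A real system would also need the attitude estimate from the inertial sensors to compensate for image motion caused by roll and pitch, which this sketch ignores.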

Using this strategy, the researchers showed in simulation that a helicopter could perform take-off, hover, and the transition from low-speed forward flight to hover. The ability to track horizontal and vertical displacements using 3D snapshots from a single camera was then confirmed in reality using a Vario XLC gas-turbine helicopter.

In the future, the authors intend to further test the 3D snapshot control strategy in flight using their Vario XLC helicopter before moving to smaller platforms such as an AscTec Pelican quadrotor. Additional challenges include accounting for the shadow of the robot, which might change position from snapshot to snapshot.

Source: Matthew A. Garratt, Andrew J. Lambert and Hamid Teimoori (2012) Design of a 3D snapshot based visual flight control system using a single camera in hover. Autonomous Robots.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory