Robohub.org
 

Drone learns to see in zero-gravity


03 October 2016



Overview of the experiment and the internal sensor readings of the drone.

During an experiment performed on board the International Space Station (ISS), a small drone successfully taught itself to see distances using only one eye, scientists reported at the 67th International Astronautical Congress (IAC) in Guadalajara, Mexico.

Although humans effortlessly estimate distances with one eye, it is not well understood how we learn this capability, nor how robots should learn it. The experiment was designed in collaboration between the Advanced Concepts Team (ACT) of the European Space Agency (ESA), the Massachusetts Institute of Technology (MIT), and the Micro Air Vehicles lab (MAV-lab) of Delft University of Technology (TU Delft), and was the final step of a five-year research effort aimed at in-orbit testing of advanced artificial intelligence (AI) concepts.

https://www.youtube.com/watch?list=PL_KSX9GOn2P-U9mQKHf9Ul_cSteTw13CW&v=GZHE0E6AsSE

The paper, “Self-supervised learning as an enabling technology for future space exploration robots: ISS experiments”, describes how, during the experiment, a drone navigated the ISS while recording stereo vision information about its surroundings from its two ‘eyes’ (cameras). It then learned the distances to the walls and obstacles it encountered, so that when the stereo vision camera was switched off, it could continue an autonomous exploratory behaviour using only one ‘eye’ (a single camera).
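The core idea can be sketched in a few lines. The following is a hypothetical illustration, not the flight code: during the stereo phase, disparity between the two cameras provides distance labels, and a simple model then learns to predict those distances from a single monocular cue. The focal length, baseline, and the synthetic "cue" are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

FOCAL_PX = 300.0    # assumed focal length in pixels
BASELINE_M = 0.10   # assumed stereo baseline in metres

def stereo_depth(disparity_px):
    """Self-supervision signal: distance (m) from stereo disparity."""
    return FOCAL_PX * BASELINE_M / disparity_px

# Simulated flight data: true distances and a noisy monocular cue that
# happens to vary with distance (e.g. apparent texture density).
n = 500
true_depth = rng.uniform(0.5, 5.0, n)
cue = 0.2 * true_depth + 0.5 + rng.normal(0.0, 0.05, n)

# Labels come from the drone's own stereo camera, not from a human.
disparity = FOCAL_PX * BASELINE_M / true_depth
labels = stereo_depth(disparity)

# Fit depth ~ w0 * cue + w1 by least squares (a stand-in for the
# actual learning algorithm used in the experiment).
X = np.column_stack([cue, np.ones(n)])
w, *_ = np.linalg.lstsq(X, labels, rcond=None)

# "Stereo switched off": estimate distance from the single cue alone.
mono_estimate = X @ w
err = np.mean(np.abs(mono_estimate - labels))
print(f"mean abs error: {err:.2f} m")
```

The key property is that no human-provided labels are needed: one sensor (stereo) supervises another (monocular), which is exactly what makes the approach attractive for robots operating far from Earth.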

While a human can close one eye and still tell whether a particular object is far away, in robotics this is widely considered extremely hard. “It is a mathematical impossibility to extract distances to objects from one single image, as long as one has not experienced the objects before,” says Guido de Croon of Delft University of Technology, one of the principal investigators of the experiment. “But once we recognise something to be a car, we know its physical characteristics, and we may use that information to estimate its distance from us. A similar logic is what we wanted the drones to learn during the experiments.” In an environment without gravity, however, no direction is privileged, so the drone also had to overcome this additional difficulty.
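The known-size cue de Croon describes follows directly from the pinhole camera model: once we recognise an object and know its real size, its apparent size in pixels gives its distance. The focal length and car length below are assumed values for illustration.

```python
FOCAL_PX = 300.0      # assumed focal length in pixels
CAR_LENGTH_M = 4.5    # assumed typical car length in metres

def distance_from_known_size(apparent_px, real_size_m=CAR_LENGTH_M,
                             focal_px=FOCAL_PX):
    """Pinhole model: distance (m) to an object of known real size,
    given the size it subtends in the image (pixels)."""
    return focal_px * real_size_m / apparent_px

# A recognised car spanning 150 px would be about 9 m away.
print(distance_from_known_size(150.0))
```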

The self-supervised learning algorithm developed and used during the in-orbit experiment was thoroughly tested at the TU Delft CyberZoo on quadrotors, proving its value and robustness.

Drone learns to see distances with one camera on Earth.

“It was very exciting to see, for the first time, a drone in space learning using cutting edge AI methods,” added Dario Izzo who coordinated the scientific contribution from ESA’s Advanced Concepts Team.

“At ESA, and in particular here at the ACT, we have worked towards this goal for the past five years. In space applications, machine learning is not considered a reliable approach to autonomy: a ‘bad’ learning result may cause a catastrophic failure of the entire mission. Our approach, based on the self-supervised learning paradigm, has a high degree of reliability and supports the drone’s autonomy: a similar learning algorithm was successfully applied to self-driving cars, a task where reliability is also of paramount importance.”

https://youtu.be/tDS4SUR_Egs?list=PL_KSX9GOn2P-U9mQKHf9Ul_cSteTw13CW

The little drone that successfully learned “to see” was one of the SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellite) drones on board the ISS. The SPHERES are capable of rotation and translation in all directions. Twelve carbon dioxide thrusters are used for control and propulsion, and allow the satellites to manoeuvre with great precision in the zero gravity environment of the station. The MIT Space Systems Laboratory, in conjunction with NASA, DARPA, and Aurora Flight Sciences, developed and operates the SPHERES system to provide a safe and reusable zero gravity platform to test sensor, control and autonomy technologies for use in satellites. Developing these technologies is an enabler for new types of satellite systems.

The drone experiments on Earth were performed by Kevin van Hecke for his MSc thesis. He also went to the MIT Space Systems Lab to port the drone programs to the software required by the SPHERES: “It was my life-long dream to work on space technology, but that I would contribute to a learning robot in space even exceeds my wildest dreams!”

Indeed, the experiment seems to hold promise for the future: “This is a further step in our quest for truly autonomous space systems, increasingly needed for deep space exploration, complex operations, for reducing costs, and increasing capabilities and science opportunities,” comments Leopold Summerer, head of ESA’s Advanced Concepts Team.

Team involved

TU Delft MAV-Lab: Guido de Croon, Laurens van der Maaten, Kevin van Hecke.
Massachusetts Institute of Technology: Timothy P. Setterfield, Alvar Saenz-Otero.
Advanced Concepts Team: Dario Izzo, Daniel Hennes.





Guido de Croon is Full Professor at the Micro Air Vehicle lab of Delft University of Technology in the Netherlands.

