Robohub.org
 

Local visual homing


by Sabine Hauert | 24 August 2010



How can a robot, using vision, go back to a previously visited location?

Möller et al. address this research question, known as “local visual homing”, with an intuitive approach inspired by social insects returning to their nest. The idea is that a robot, when at an important location, takes a snapshot of the surrounding visual information. To return to that location later (homing), it compares its current view of the world with the stored snapshot.

A technique called “image warping” guides the robot back to the snapshot location. Simply put, the robot considers every movement it could make and simulates each movement’s effect on its current view of the world. It then selects the movement that would bring its view closest to the stored snapshot. The method outputs a homing vector for the robot to follow, along with an estimate of how much its orientation has changed.
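The search described above can be sketched in a few lines. The following is a minimal illustration, not the authors’ implementation: it treats the panoramic view as a one-dimensional ring of intensity values, assumes all landmarks lie at the same distance from the robot (a simplification commonly made in warping methods), and exhaustively scores a grid of hypothesised movements. The function names `warp` and `home_vector` and all parameter values are invented for this sketch.

```python
import numpy as np

def warp(view, alpha, rho, psi):
    """Predict a 1-D panoramic view after moving a fraction `rho` of the
    (assumed common) landmark distance in direction `alpha` and rotating
    by `psi`. All angles are in radians."""
    n = len(view)
    thetas = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Under the equal-distance assumption, a landmark seen at bearing
    # `theta` appears at this new bearing after the movement.
    new_thetas = np.arctan2(np.sin(thetas) - rho * np.sin(alpha),
                            np.cos(thetas) - rho * np.cos(alpha)) - psi
    # Resample the shifted bearings back onto the regular pixel grid.
    return np.interp(thetas, new_thetas, view, period=2.0 * np.pi)

def home_vector(current, snapshot, n_angles=36, rhos=(0.1, 0.2, 0.3)):
    """Try a grid of hypothesised movements applied to `current` and keep
    the one whose predicted view best matches `snapshot` (smallest sum of
    squared differences). Returns (movement direction, rotation)."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    best, best_alpha, best_psi = np.inf, 0.0, 0.0
    for alpha in angles:        # hypothesised movement direction
        for psi in angles:      # hypothesised change in orientation
            for rho in rhos:    # hypothesised distance fraction
                d = np.sum((warp(current, alpha, rho, psi) - snapshot) ** 2)
                if d < best:
                    best, best_alpha, best_psi = d, alpha, psi
    return best_alpha, best_psi
```

Here `best_alpha` plays the role of the homing vector’s direction and `best_psi` the estimated orientation change; a real system would use the actual camera geometry and a finer or smarter search rather than this brute-force grid.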

Using three different implementations of image warping, Möller et al. show that a robot equipped with a panoramic camera can home effectively at reasonable computational cost. Experiments were conducted on a database of real-world images taken by a robot (see example images below).

In the future, robots could use visual homing to go from snapshot to snapshot, thereby navigating through large environments.

Finally, don’t miss the author’s website for an extensive overview of visual navigation techniques.



Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




