Robohub.org

Local visual homing

24 August 2010

How can a robot, using vision, go back to a previously visited location?

Möller et al. address this research question, known as “local visual homing”, with an intuitive approach inspired by social insects returning to their nests. The idea is that a robot, when at an important location, stores a snapshot of the surrounding visual information. To return to that location later (homing), it compares its current view of the world with the stored snapshot.

A technique called “image warping” is used to guide the robot back to the snapshot location. Simply put, the robot considers all the movements it could make and simulates their effect on its current view of the world. It then selects the movement that would bring its view closest to the stored snapshot. The outcome of this method is a homing vector the robot should follow, along with an estimate of how much its orientation has changed.
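The search described above can be illustrated with a toy sketch. This is not the authors' implementation: it assumes a 1-D panoramic view, restricts the candidate movements to pure rotations (which, for a panoramic image, amount to circular pixel shifts), and scores candidates with a simple sum of squared differences.

```python
import numpy as np

def warp_view(view, shift):
    """Predict the panoramic view after a pure rotation.

    For a 1-D panoramic image, rotating the robot in place simply
    shifts the pixels circularly.
    """
    return np.roll(view, shift)

def best_movement(current, snapshot, candidate_shifts):
    """Try each candidate movement and keep the one whose predicted
    view is closest to the stored snapshot (sum of squared differences)."""
    best_shift, best_err = None, np.inf
    for s in candidate_shifts:
        predicted = warp_view(current, s)
        err = float(np.sum((predicted - snapshot) ** 2))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift, best_err
```

In the actual method, the candidate set also includes translations, whose effect on the view is predicted under assumptions about the scene's geometry; the winning candidate then directly yields the homing vector and the orientation change.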

Using three different implementations of image warping, Möller et al. show how a robot equipped with a panoramic camera can home effectively with reasonable computational effort. Experiments were conducted on a database of real-world images taken by a robot.

In the future, robots could use visual homing to go from snapshot to snapshot, thereby navigating through large environments.

Finally, don’t miss the author’s website for an extensive overview of visual navigation techniques.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




