
Local visual homing


by Sabine Hauert
24 August 2010




How can a robot, using vision, go back to a previously visited location?

Möller et al. tackle this research question, tagged “Local Visual Homing”, in an intuitive manner inspired by social insects returning to their nest. The idea is that a robot, when at an important location, takes a snapshot of the surrounding visual information. To return to that location later on (homing), it compares its current view of the world with the stored snapshot.
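As a rough illustration (not the specific measure used in the paper), comparing the current view with a stored snapshot can be as simple as a pixel-wise dissimilarity between two panoramic images. The grayscale format and array shapes below are illustrative assumptions:

```python
import numpy as np

def image_distance(snapshot, current_view):
    """Pixel-wise sum-of-squared-differences between two panoramic images.

    Both images are assumed to be grayscale arrays of identical shape,
    sampled over the same azimuth range. A smaller value means the
    current view is more similar to the stored snapshot.
    """
    snapshot = np.asarray(snapshot, dtype=float)
    current_view = np.asarray(current_view, dtype=float)
    return float(np.sum((snapshot - current_view) ** 2))
```

Near the snapshot location this dissimilarity is typically small and tends to grow as the robot moves away, which is what homing strategies exploit.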

A technique called “image warping” is used to guide the robot back to the snapshot location. Simply put, the robot imagines all the movements it could make and simulates their effect on its current view of the world. It then selects the movement that would bring its view closest to the stored snapshot. The outcome of this method is a homing vector that the robot should follow and an estimate of how much its orientation has changed since the snapshot was taken.
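As a minimal sketch of this idea (not the authors' actual implementations), the following brute-force search warps a 1-D panoramic view under candidate movements, assuming all landmarks lie at roughly the same distance, and keeps the candidate whose simulated effect best matches the snapshot. The function names, grid resolutions, and the equal-distance assumption are illustrative choices, not details from the paper:

```python
import numpy as np

def warp_view(view, alpha, v, psi):
    """Predict how a 1-D panoramic view changes after a hypothetical movement.

    view  : intensities sampled at N evenly spaced azimuths in [0, 2*pi)
    alpha : direction of the movement relative to the current heading
    v     : movement distance divided by the (assumed common) landmark distance
    psi   : change of the robot's heading during the movement
    """
    view = np.asarray(view, dtype=float)
    n = len(view)
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Bearing of each landmark after moving by v in direction alpha,
    # shifted by the robot's own rotation psi.
    new_theta = np.arctan2(np.sin(theta) - v * np.sin(alpha),
                           np.cos(theta) - v * np.cos(alpha)) - psi
    # For every azimuth of the predicted view, take the intensity of the
    # original pixel that ends up closest to that azimuth after the movement.
    diff = np.angle(np.exp(1j * (theta[:, None] - new_theta[None, :])))
    return view[np.argmin(np.abs(diff), axis=1)]

def warping_home(snapshot, current, n_alpha=16, n_v=8, n_psi=16):
    """Brute-force image-warping search over candidate movements (sketch).

    Returns the movement direction and rotation whose simulated effect on
    the current view matches the stored snapshot best: a homing direction
    and an estimate of the orientation change.
    """
    snapshot = np.asarray(snapshot, dtype=float)
    best_cost, best_alpha, best_psi = np.inf, 0.0, 0.0
    for alpha in np.linspace(0.0, 2.0 * np.pi, n_alpha, endpoint=False):
        for v in np.linspace(0.05, 0.95, n_v):
            for psi in np.linspace(0.0, 2.0 * np.pi, n_psi, endpoint=False):
                warped = warp_view(current, alpha, v, psi)
                cost = np.sum((snapshot - warped) ** 2)  # pixel-wise SSD
                if cost < best_cost:
                    best_cost, best_alpha, best_psi = cost, alpha, psi
    return best_alpha, best_psi
```

For example, given two equal-length 1-D panoramic views `snapshot` and `current`, `warping_home(snapshot, current)` returns a bearing to drive towards and an estimated heading change relative to when the snapshot was taken.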

Using three different implementations of image warping, Möller et al. show how a robot equipped with a panoramic camera can home effectively with reasonable computational effort. Experiments were conducted on a database of real-world panoramic images captured by a robot.

In the future, robots could use visual homing to go from snapshot to snapshot, thereby navigating through large environments.

Finally, don’t miss the author’s website for an extensive overview of visual navigation techniques.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




