Vision-based navigation with motion blur


by Sabine Hauert
20 July 2010




Robots often need to know where they are in the world in order to navigate efficiently. One of the cheapest ways to localize is to strap a camera on board and extract visual features from the environment. However, challenges arise when robots move fast enough to cause motion blur: blurry images degrade localization accuracy, so robots that move too fast may lose track of their position, get lost, or be forced to stop and re-localize.

Instead, Hornung et al. propose using reinforcement learning to find a policy that keeps the robot at speeds suitable for reliable visual localization while still reaching its destination as fast as possible. The navigation task is modeled as an augmented Markov decision process (MDP).
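To give a feel for the trade-off, here is a minimal, hypothetical sketch (not the authors' implementation) of value iteration over a toy speed-augmented MDP: higher speeds cover more ground per step but carry a higher assumed probability of blur-induced localization failure, which costs extra time to recover from. The state space, costs, and failure probabilities below are placeholders chosen for illustration only.

import numpy as np

# Toy MDP: state = position along a discretized path, action = speed level.
N_CELLS = 20                      # discretized positions along the path
SPEEDS = [1, 2, 3]                # cells advanced per step at each speed level
P_LOSE = [0.01, 0.10, 0.30]       # assumed blur-induced localization-loss probability
STEP_COST = 1.0                   # time cost of one step
RELOC_COST = 5.0                  # extra time cost to stop and re-localize
GAMMA = 1.0

V = np.zeros(N_CELLS + 1)         # V[cell]: expected remaining time to reach the goal
policy = np.zeros(N_CELLS, dtype=int)

for _ in range(500):              # value iteration sweeps
    for cell in reversed(range(N_CELLS)):
        costs = []
        for a, step in enumerate(SPEEDS):
            nxt = min(cell + step, N_CELLS)
            # Either the robot keeps localizing and advances, or it gets lost,
            # pays the re-localization penalty, and stays where it is.
            ok = STEP_COST + GAMMA * V[nxt]
            lost = STEP_COST + RELOC_COST + GAMMA * V[cell]
            costs.append((1 - P_LOSE[a]) * ok + P_LOSE[a] * lost)
        policy[cell] = int(np.argmin(costs))
        V[cell] = min(costs)

print("speed level chosen per cell:", policy)

Under these made-up numbers the optimal policy slows down only where the expected cost of getting lost outweighs the time saved by moving faster, which is the intuition behind choosing speeds adaptively rather than driving at a fixed velocity.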

The learned policy is then compressed using a clustering technique to reduce its memory footprint, which would otherwise be a major limitation for robots with little storage capacity.
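For intuition, the sketch below compresses a per-state policy with plain k-means, one possible clustering technique (the paper's exact method may differ): states with similar features are grouped, and only the cluster centroids plus one action per cluster are stored, so a look-up needs far less memory than a full state-action table. The state features and actions here are invented purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical state features, e.g. (distance to goal, scene texture richness),
# and a placeholder per-state optimal speed level learned beforehand.
states = rng.random((1000, 2))
actions = (states[:, 0] * 3).astype(int)

K = 8                                              # number of clusters kept in memory
centroids = states[rng.choice(len(states), K, replace=False)]
for _ in range(20):                                # plain k-means iterations
    labels = np.argmin(((states[:, None] - centroids) ** 2).sum(-1), axis=1)
    for k in range(K):
        if np.any(labels == k):
            centroids[k] = states[labels == k].mean(axis=0)

# Each cluster stores the majority action of its member states.
cluster_action = np.array([
    np.bincount(actions[labels == k]).argmax() if np.any(labels == k) else 0
    for k in range(K)
])

def compressed_policy(state):
    """Return the stored action of the nearest cluster centroid."""
    k = np.argmin(((centroids - state) ** 2).sum(-1))
    return cluster_action[k]

print(compressed_policy(np.array([0.2, 0.7])))

Instead of 1000 state-action entries, the robot only keeps 8 centroids and 8 actions, at the cost of occasionally picking a slightly suboptimal speed near cluster boundaries.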

Experiments were successfully conducted on two different robots in indoor and outdoor scenarios (see video), and the robots reached their goals faster than they would have at a constant speed. In the future, Hornung et al. hope to implement their system on fast-moving robots, such as unmanned aerial vehicles!



Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory
