Stable visual navigation

by Sabine Hauert
20 April 2011




If you’re trying to get from the couch to the fridge, you’ll probably be using vision to navigate and home in on your fresh drink.

To make your camera-equipped robot do something similar, give it an image taken from the location it is trying to reach (the target image). By comparing features between the image from its camera (the current image) and the target image, the robot can determine in which direction it should move to make the two images match.
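The paper’s actual control law is not reproduced here, but the idea of turning matched features into a steering cue can be sketched in a few lines. The snippet below is only an illustration under our own assumptions: it uses OpenCV’s ORB detector and a naive mean-horizontal-offset rule (the function name and the offset heuristic are ours, not the authors’).

```python
import cv2
import numpy as np

def heading_cue(current_bgr, target_bgr, max_matches=50):
    """Illustrative steering cue from feature matching.

    Matches ORB features between the current camera image and the
    target image, then returns the mean horizontal offset of the best
    matches: positive suggests the target view lies to the right of
    the current view, negative to the left. This is a rough sketch,
    not the controller proposed in the paper.
    """
    orb = cv2.ORB_create(nfeatures=500)
    gray_cur = cv2.cvtColor(current_bgr, cv2.COLOR_BGR2GRAY)
    gray_tgt = cv2.cvtColor(target_bgr, cv2.COLOR_BGR2GRAY)

    kp_cur, des_cur = orb.detectAndCompute(gray_cur, None)
    kp_tgt, des_tgt = orb.detectAndCompute(gray_tgt, None)
    if des_cur is None or des_tgt is None:
        return 0.0  # no features detected, no correction

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_cur, des_tgt),
                     key=lambda m: m.distance)

    # Average horizontal displacement of the best matches.
    offsets = [kp_tgt[m.trainIdx].pt[0] - kp_cur[m.queryIdx].pt[0]
               for m in matches[:max_matches]]
    return float(np.mean(offsets)) if offsets else 0.0
```

In practice such a cue would feed a turn command in a control loop, with the robot driving forward until the current and target views agree.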

However, challenges often arise as the robot nears its goal: when there is little change between the current and target images, the robot’s motion can start to oscillate. To avoid this oscillation, López-Nicolás et al. propose replacing the target image with a smartly chosen virtual image computed at the beginning of the task. Possible features in the current, target and virtual images are shown below.

Left: Initial image of an experiment with the point features detected. Right: Target image with the points matched (circles) and the computed virtual target points (squares).
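To illustrate the flavour of the idea (not the authors’ actual construction), one could push each matched target point slightly beyond its real location along the initial-to-target displacement, computed once at the start of the task, so the visual error never collapses to zero right at the goal. The overshoot factor and the linear extrapolation below are our own illustrative assumptions.

```python
import numpy as np

def virtual_target_points(initial_pts, target_pts, overshoot=0.3):
    """Illustrative virtual target points, computed once at task start.

    initial_pts, target_pts: (N, 2) arrays of matched feature
    coordinates in the initial and target images.

    Each virtual point lies a little beyond its real target location
    along the initial-to-target displacement, so the error signal
    stays informative near the goal. The extrapolation rule and the
    'overshoot' value are assumptions for illustration only, not the
    method of López-Nicolás et al.
    """
    initial_pts = np.asarray(initial_pts, dtype=float)
    target_pts = np.asarray(target_pts, dtype=float)
    return target_pts + overshoot * (target_pts - initial_pts)
```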

Experiments were run using a Pioneer P3-DX robot from ActivMedia, equipped with a forward-looking Point Grey Research Flea2 camera. Results show that the robot is able to navigate smoothly towards a target.

In the future, the authors hope to equip their robots with omnidirectional cameras, allowing them to reach targets in any direction around them.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




