
Tracking 3D objects in real-time using active stereo vision


by Sabine Hauert
10 November 2012




In a recent publication in the journal Autonomous Robots, a team of researchers from the UK, Italy and France proposes a new technique that allows an active binocular robot to fixate on and track objects while performing 3D reconstruction in real time.

Humans track objects by turning their heads towards and gazing at areas of interest. Integrating the images from both eyes provides depth information that allows us to perceive objects in 3D. Similar capabilities could prove useful in robots with comparable vision systems. POPEYE, shown in the image to the left, can independently move its head and the two cameras it uses for stereo vision.
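To see why two views yield depth, here is a minimal sketch of depth-from-disparity for an idealised, rectified stereo pair. The focal length, baseline and disparity values below are purely illustrative and are not taken from the POPEYE system.

# Minimal sketch: depth from disparity for a rectified stereo pair.
# All numbers (focal length, baseline, disparity) are illustrative only.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth (in metres) of a point seen with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm baseline, 35 px disparity -> 2.0 m
print(depth_from_disparity(700.0, 0.10, 35.0))

The closer an object is, the larger the disparity between its positions in the two images, which is what makes nearby objects easier to reconstruct accurately.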

To reconstruct object features in 3D, the robot needs to know the spatial relationship between its two cameras. For this purpose, Sapienza et al. calibrate the vision system before the experiment by placing cards with known patterns in the environment and systematically moving the camera motors to learn how these motor movements affect the captured images. After calibration, a homography-based method allows the robot to measure how far its motors have moved and relate that to the corresponding changes in image features. Reading the motor positions is very fast, which is what allows real-time 3D tracking.
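The paper describes a homography-based visuomotor update; as a rough, hypothetical illustration of the general idea (rebuild the stereo geometry from the current motor readings, then triangulate the tracked feature), here is a sketch in Python/NumPy. The intrinsics, baseline, motor angles and matched image points are all made-up values, and the simple pan-only camera model is an assumption for illustration, not the authors' actual parameterisation.

# Illustrative sketch (not the authors' code): rebuild the stereo extrinsics
# from motor encoder readings, then triangulate one tracked image point.
# All numbers below are hypothetical.
import numpy as np

K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])          # shared camera intrinsics (hypothetical)
baseline = np.array([0.10, 0.0, 0.0])     # 10 cm horizontal baseline (hypothetical)

def rot_y(theta):
    """Rotation about the vertical axis, standing in for a camera pan/vergence angle."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def projection_matrices(pan_left, pan_right):
    """Rebuild both projection matrices from the current motor angles.

    Reading the encoders and recomputing these matrices is cheap, which is
    what makes a per-frame (real-time) geometry update feasible.
    """
    R_l, R_r = rot_y(pan_left), rot_y(pan_right)
    P_l = K @ np.hstack([R_l, np.zeros((3, 1))])
    P_r = K @ np.hstack([R_r, -(R_r @ baseline).reshape(3, 1)])
    return P_l, P_r

def triangulate(P_l, P_r, x_l, x_r):
    """Linear (DLT) triangulation of one point matched in the left and right images."""
    A = np.vstack([
        x_l[0] * P_l[2] - P_l[0],
        x_l[1] * P_l[2] - P_l[1],
        x_r[0] * P_r[2] - P_r[0],
        x_r[1] * P_r[2] - P_r[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Example: both cameras verged slightly inward, tracking one matched feature.
P_l, P_r = projection_matrices(np.deg2rad(2.0), np.deg2rad(-2.0))
print(triangulate(P_l, P_r, (340.0, 240.0), (300.0, 240.0)))

Because updating the projection matrices needs only the latest encoder readings, rather than a fresh image-based recalibration, it can be repeated every frame, which is the key to the real-time behaviour described above.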

Results show that the robot can track a human face while performing 3D reconstruction. In the future, the authors hope to add zoom functionality to their method.

Source: Michael Sapienza, Miles Hansard and Radu Horaud (2012) Real-time visuomotor update of an active binocular head, Autonomous Robots.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory.








 
