Robohub.org
 

Tracking 3D objects in real-time using active stereo vision

10 November 2012




In a recent publication in the journal Autonomous Robots, a team of researchers from the UK, Italy and France proposes a new technique that allows an active binocular robot to fixate on and track objects while performing 3D reconstruction in real time.

Humans track objects by turning their heads and gazing at areas of interest. Integrating the images from both eyes provides depth information that allows us to perceive objects in 3D. Similar vision capabilities could prove useful in robotic systems. POPEYE, shown in the image to the left, can independently move its head and the two cameras it uses for stereo vision.
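The depth cue from two eyes (or two cameras) can be illustrated with the textbook stereo triangulation relation Z = f·B/d for a rectified camera pair. The sketch below is a generic illustration with made-up numbers, not a description of the POPEYE system:

```python
# Textbook stereo triangulation: depth from disparity for a rectified pair.
# The focal length, baseline and disparity below are illustrative values,
# not parameters of the POPEYE head.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Return depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 20 px between the two images, seen by cameras with a
# 500 px focal length and a 0.1 m baseline, lies 2.5 m away.
z = depth_from_disparity(focal_px=500.0, baseline_m=0.1, disparity_px=20.0)
print(z)  # 2.5
```

The same relation explains why a wider baseline (eyes further apart) gives better depth resolution at long range.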

To perform 3D reconstruction of object features, the robot needs to know the spatial relationship between its two cameras. For this purpose, Sapienza et al. calibrate the robot's vision system before the experiment by placing cards with known patterns in the environment and systematically moving the camera motors to learn how these motor commands affect the captured images. After calibration, a homography-based method lets the robot relate measured motor movements directly to changes in the image features. Because motor positions can be read very quickly, this enables real-time 3D tracking.
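The core geometric idea can be sketched as follows (my own illustration, not the authors' code): when a camera purely rotates by R, image points move through the homography H = K·R·K⁻¹, where K is the camera's intrinsic matrix. So if the motor encoders report the rotation, feature positions can be predicted with a single matrix multiply per point, with no new image matching required. The intrinsics below are hypothetical:

```python
import numpy as np

# Illustrative sketch: for a purely rotating camera with intrinsic matrix K,
# a rotation R maps pixels through the homography H = K @ R @ inv(K).
# Hypothetical intrinsics: 500 px focal length, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def rotation_about_y(angle_rad):
    """Rotation matrix for a pan (yaw) of the camera about its y-axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[ c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def predict_after_pan(point_px, angle_rad):
    """Predict where a pixel lands after the camera pans by angle_rad."""
    H = K @ rotation_about_y(angle_rad) @ np.linalg.inv(K)
    p = H @ np.array([point_px[0], point_px[1], 1.0])
    return p[:2] / p[2]

# A 1-degree pan shifts a feature at the image centre sideways by
# f * tan(1 deg), roughly 8.7 px for this focal length.
print(predict_after_pan((320.0, 240.0), np.deg2rad(1.0)))
```

This is why reading the motor encoders is enough to keep the stereo geometry up to date between frames, which is much faster than re-estimating it from the images.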

Results show that the robot can track a human face while performing 3D reconstruction. In the future, the authors hope to add zoom capability to their method.

Source: Michael Sapienza, Miles Hansard and Radu Horaud (2012) Real-time visuomotor update of an active binocular head, Autonomous Robots.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory










©2021 - ROBOTS Association


 











