 

Vision, force and touch for manipulation


by Sabine Hauert
22 July 2010




Manipulating objects is still a major challenge for robots in human-centered environments. To overcome this hurdle, Prats et al. propose to combine vision, force and tactile sensing to achieve robust and reliable manipulation with a robot arm fitted with a 3-finger hand (see video below).

Using three sensing modalities increases the robustness of the system, since each sensor taken alone has shortcomings. Vision, for example, can track the manipulated object and therefore be used to guide the manipulation, but cameras are sometimes badly calibrated or occluded. Forces applied to the robot arm can be measured to make sure efforts are directed the right way, but force sensing alone will not notice when a poor grip lets the object slip. Tactile sensing fills that gap: it lets the robot feel the object it is holding and readjust the position of the manipulator when errors occur.
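To make the complementarity concrete, here is a minimal Python sketch of how corrections from the three modalities could be blended into a single arm command. This is an illustrative assumption for the sliding-door example, not the actual controller from Prats et al.; the gains and input quantities are made up.

import numpy as np

# Illustrative gains (not values from the paper)
K_VISION, K_FORCE, K_TACTILE = 0.5, 0.02, 0.3

def fused_correction(handle_error, wrist_force, slide_dir, contact_offset):
    """Blend vision, force and tactile corrections into one Cartesian velocity.

    handle_error   -- 3D vector from the hand to the visually tracked handle
    wrist_force    -- 3D force measured at the wrist force/torque sensor
    slide_dir      -- unit vector along the door's sliding direction
    contact_offset -- offset of the tactile contact centroid from the
                      fingertip centre (signals an off-centre, slipping grasp)
    """
    # Force components that do not push along the sliding direction are
    # unwanted and should be driven to zero.
    unwanted_force = wrist_force - np.dot(wrist_force, slide_dir) * slide_dir

    return (K_VISION * handle_error          # vision pulls the hand towards the handle
            - K_FORCE * unwanted_force       # force keeps the effort along the rail
            + K_TACTILE * contact_offset)    # tactile recentres the grasp

# Example: handle slightly ahead, a sideways force building up, grasp centred.
velocity = fused_correction(np.array([0.05, 0.0, 0.0]),
                            np.array([2.0, 4.0, 0.0]),
                            np.array([1.0, 0.0, 0.0]),
                            np.zeros(3))
print(velocity)  # commanded Cartesian velocity for the arm

If any one channel degrades (the camera is occluded, or the fingertips lose contact), its correction simply shrinks or drops out while the others keep guiding the motion, which is the intuition behind combining all three.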

To prove their point, Prats et al. test different combinations of the three sensing modalities on a task that is tricky for robots: opening a sliding door. In the end, the combination of vision, force and tactile sensing saves the day.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




