Robohub.org
 

Vision, force and touch for manipulation


by Sabine Hauert
22 July 2010




Manipulating objects is still a major challenge for robots in human-centered environments. To overcome this hurdle, Prats et al. propose to combine vision, force and tactile sensing to achieve robust and reliable manipulation with a robot arm fitted with a 3-finger hand (see video below).

Using three sensing modalities increases the robustness of the system, since each sensor has shortcomings when used alone. Vision can track the manipulated object and therefore be used to control the manipulation, but it may be poorly calibrated or occluded. Force sensing measures the efforts applied through the robot arm and ensures they are directed appropriately; however, if the robot does not have a good grip on the object, the object may slip. Tactile sensing complements both: it lets the robot feel the object it is manipulating and readjust the position of the manipulator when errors occur.
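The idea of letting each modality contribute a correction to the same control command can be illustrated with a minimal Python sketch. This is not the authors' controller: the function name, the weighted-sum fusion, and the gains are all hypothetical, chosen only to show how vision, force, and tactile corrections might be combined in one loop.

```python
# Hypothetical multimodal fusion step (illustrative only, not from Prats et al.):
# each modality supplies a 6-D correction in the task frame, and the controller
# sums them with per-modality gains to produce one Cartesian velocity command.

def fused_velocity_command(vision, force, tactile):
    """Combine per-modality corrections into a single velocity command.

    vision  -- task-frame error estimated by object tracking (6-vector)
    force   -- wrench error measured at the wrist (6-vector)
    tactile -- slip correction derived from fingertip sensors (6-vector)
    """
    K_V, K_F, K_T = 0.5, 0.02, 0.3  # illustrative gains, one per modality
    return [K_V * v + K_F * f + K_T * t
            for v, f, t in zip(vision, force, tactile)]

# Example: only vision reports an error, so the command is the vision
# correction scaled by its gain.
cmd = fused_velocity_command([1.0] * 6, [0.0] * 6, [0.0] * 6)
```

In a real system the gains would be tuned (or a modality dropped entirely when its sensor is occluded or miscalibrated), which is precisely the robustness argument the paragraph above makes.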

To prove their point, Prats et al. test different combinations of the three sensing modalities on a task that is tricky for robots: opening a sliding door. In the end, the combination of vision, force and tactile sensing saves the day.




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory

©2026.02 - Association for the Understanding of Artificial Intelligence