Robohub.org
 

Data-driven grasping


12 September 2011




As robots enter our industries and homes, they will be required to manipulate a large diversity of objects of unknown shape, size and orientation. One approach is to have the robot spend time building a precise model of the object of interest and then compute an optimal grasp using inverse kinematics.

Instead, Goldfeder et al. propose data-driven grasping, a fast approach that does not require precise sensing. The idea is that the robot builds a database of possible grasps suitable for a large variety of shapes. When presented with a new object, the robot retrieves a similar shape from the database and performs the corresponding grasp. This matching phase can even be performed with only partial sensor data.

Experiments were conducted both in simulation and on HERB, the Home Exploring Robot Butler platform developed by Intel Research and CMU. HERB has a Barrett hand mounted on a Barrett WAM arm and is equipped with a 2-megapixel webcam, the only sensor used during the trials. Results can be seen in the excellent video below showing the robot grasping toy planes, gloves and even a ukulele!

Just in case you want to build your own data-driven grasper, here are the main steps taken from the publication:

Step 1: Creating a grasp database of 3D models annotated with precomputed grasps and quality scores.
Step 2: Indexing the database for retrieval using partial 3D geometry.
Step 3: Finding matches in the database using only the sensor data, which is typically incomplete.
Step 4: Aligning the object to each of the matched models from the database.
Step 5: Selecting a grasp from the candidate grasps provided by the aligned matches.
Step 6: Executing the grasp and evaluating the results.
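The retrieval core of the pipeline (steps 1, 3 and 5) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the descriptor here is a crude centroid-distance histogram standing in for the partial-view 3D shape descriptors used in the paper, and grasp candidates are represented as opaque labels with precomputed quality scores.

```python
import numpy as np

def shape_descriptor(points):
    """Crude global shape descriptor: a normalized histogram of point
    distances from the centroid. A hypothetical stand-in for the
    partial-view descriptors used in the actual system."""
    points = np.asarray(points, dtype=float)
    dists = np.linalg.norm(points - points.mean(axis=0), axis=1)
    hist, _ = np.histogram(dists, bins=16, range=(0.0, dists.max() + 1e-9))
    return hist / max(hist.sum(), 1)

class GraspDatabase:
    """Maps model shape descriptors to lists of (grasp, quality) pairs."""

    def __init__(self):
        self.entries = []  # list of (descriptor, [(grasp, quality), ...])

    def add_model(self, points, grasps):
        # Step 1: store each 3D model annotated with precomputed,
        # quality-scored grasps.
        self.entries.append((shape_descriptor(points), grasps))

    def best_grasp(self, sensed_points):
        # Step 3: match the (possibly partial) sensed geometry against
        # the database by nearest descriptor.
        query = shape_descriptor(sensed_points)
        _, grasps = min(self.entries,
                        key=lambda e: np.linalg.norm(e[0] - query))
        # Step 5: select the highest-quality candidate grasp.
        return max(grasps, key=lambda g: g[1])
```

Steps 2 and 4 (indexing for fast retrieval, and aligning the object to the matched model) are omitted here; in practice the alignment step is what turns a stored grasp into an executable pose for the real object.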




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory



