Robohub.org
 

Data-driven grasping


by Sabine Hauert | 12 September 2011




As robots enter our industries and homes, they will be required to manipulate a large diversity of objects with unknown shapes, sizes and orientations. One approach is to have the robot spend time building a precise model of the object of interest and then computing an optimal grasp using inverse kinematics.

Instead, Goldfeder et al. propose data-driven grasping, a fast approach that does not require precise sensing. The idea is that the robot builds a database of possible grasps suitable for a large variety of shapes. When a new object is presented to the robot, it retrieves the most similar shape from the database and performs the corresponding grasp. This matching phase can be performed even with partial sensor data.
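To make the matching idea concrete, here is a minimal sketch: reduce a (possibly partial) point cloud to a compact shape descriptor and retrieve the closest stored shape. The descriptor used here (a normalized histogram of point distances from the centroid) is purely illustrative and is not the indexing scheme from the paper; the function names are invented for this example.

```python
import math

def radius_descriptor(points, bins=4):
    """Histogram of point distances from the centroid, normalized so
    descriptors remain comparable even when only part of the object
    is visible to the sensor."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    cz = sum(p[2] for p in points) / len(points)
    radii = [math.dist(p, (cx, cy, cz)) for p in points]
    rmax = max(radii) or 1.0
    hist = [0] * bins
    for r in radii:
        hist[min(int(r / rmax * bins), bins - 1)] += 1
    return [h / len(points) for h in hist]

def closest_shape(database, cloud):
    """database: list of (name, descriptor) pairs.
    Returns the name of the stored shape nearest to the sensed cloud."""
    d = radius_descriptor(cloud)
    return min(database,
               key=lambda item: sum((a - b) ** 2
                                    for a, b in zip(item[1], d)))[0]
```

Because the descriptor is normalized, querying with only a fragment of an object's point cloud can still retrieve the right model, which mirrors the partial-data matching behaviour the approach relies on.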

Experiments were conducted both in simulation and using HERB, a home exploring robotic butler platform developed by Intel Research and CMU. HERB has a Barrett hand mounted on a Barrett WAM arm and is equipped with a 2 megapixel webcam, which is the only sensor used during trials. Results can be seen in the excellent video below showing the robot grasping toy planes, gloves and even a ukulele!

Just in case you want to build your own data-driven grasper, here are the main steps taken from the publication:

Step 1: Creating a grasp database of 3D models annotated with precomputed grasps and quality scores.
Step 2: Indexing the database for retrieval using partial 3D geometry.
Step 3: Finding matches in the database using only the sensor data, which is typically incomplete.
Step 4: Aligning the object to each of the matched models from the database.
Step 5: Selecting a grasp from the candidate grasps provided by the aligned matches.
Step 6: Executing the grasp and evaluating the results.
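The steps above can be sketched in code. Everything below is illustrative: the class names, the descriptor-distance metric, and the toy database are assumptions made for this sketch, not the authors' implementation, and alignment (Step 4) is omitted by assuming grasps are already expressed in the object frame.

```python
from dataclasses import dataclass, field

@dataclass
class Grasp:
    pose: tuple          # hand pose relative to the model (illustrative)
    quality: float       # precomputed quality score (Step 1)

@dataclass
class Model:
    name: str
    descriptor: tuple    # index key computed from 3D geometry (Step 2)
    grasps: list = field(default_factory=list)

def descriptor_distance(a, b):
    """Euclidean distance between shape descriptors -- a stand-in for
    the partial-geometry matching of Step 3."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def select_grasp(database, sensed_descriptor, k=3):
    # Step 3: find the k closest models using (possibly partial) sensor data.
    matches = sorted(
        database,
        key=lambda m: descriptor_distance(m.descriptor, sensed_descriptor),
    )[:k]
    # Step 4 (alignment) omitted: grasps assumed already in the object frame.
    # Step 5: pick the highest-quality candidate across all matched models.
    candidates = [g for m in matches for g in m.grasps]
    return max(candidates, key=lambda g: g.quality)

# Step 1: a toy database of models annotated with precomputed grasps.
db = [
    Model("mug",   (1.0, 0.5, 0.2), [Grasp((0, 0, 0), 0.7)]),
    Model("plane", (3.0, 0.1, 0.9), [Grasp((1, 0, 0), 0.9),
                                     Grasp((0, 1, 0), 0.4)]),
]
best = select_grasp(db, sensed_descriptor=(2.8, 0.2, 1.0))
print(best.quality)  # → 0.9
```

Step 6, executing the grasp on hardware and evaluating the result, is where the robot (HERB, in the experiments) closes the loop on this lookup-based pipeline.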




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory


Subscribe to Robohub newsletter on substack




©2026.02 - Association for the Understanding of Artificial Intelligence