Robohub.org
 

Data-driven grasping


by Sabine Hauert, 12 September 2011




As robots enter our industries and homes, they will be required to manipulate a large diversity of objects with unknown shapes, sizes and orientations. One approach would be to have the robot spend time building a precise model of each object of interest and then perform an optimal grasp computed from that model.

Instead, Goldfeder et al. propose data-driven grasping, a fast approach that does not require precise sensing. The idea is that the robot builds a database of possible grasps suitable for a large variety of shapes. When a new object is presented, the robot retrieves the most similar shape from the database and performs the corresponding precomputed grasp. This matching phase can even be performed with only partial sensor data.

Experiments were conducted both in simulation and using HERB, the Home Exploring Robot Butler platform developed by Intel Research and CMU. HERB has a Barrett hand mounted on a Barrett WAM arm and is equipped with a 2-megapixel webcam, which was the only sensor used during trials. Results can be seen in the excellent video below showing the robot grasping toy planes, gloves and even a ukulele!

Just in case you want to build your own data-driven grasper, here are the main steps taken from the publication:

Step 1: Creating a grasp database of 3D models annotated with precomputed grasps and quality scores.
Step 2: Indexing the database for retrieval using partial 3D geometry.
Step 3: Finding matches in the database using only the sensor data, which is typically incomplete.
Step 4: Aligning the object to each of the matched models from the database.
Step 5: Selecting a grasp from the candidate grasps provided by the aligned matches.
Step 6: Executing the grasp and evaluating the results.
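The steps above can be sketched in a few lines of code. This is a toy illustration, not the authors' implementation: the shape descriptors, grasp parameters and quality scores below are invented placeholders, and the real system uses far richer 3D geometry indexing and alignment.

```python
# Toy sketch of the data-driven grasping pipeline.
# All descriptors, grasps and scores are hypothetical examples.
import math

# Step 1: a grasp database -- each entry pairs a shape descriptor with
# precomputed grasps annotated with quality scores.
GRASP_DB = {
    "mug":   {"descriptor": [0.9, 0.2, 0.4],
              "grasps": [({"approach": "side"}, 0.80),
                         ({"approach": "handle"}, 0.95)]},
    "plane": {"descriptor": [0.1, 0.8, 0.6],
              "grasps": [({"approach": "wing"}, 0.70)]},
}

def descriptor_distance(a, b):
    """Euclidean distance between two shape descriptors (Steps 2-3:
    the query descriptor may come from incomplete sensor data)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_grasp(query_descriptor, db=GRASP_DB):
    """Steps 3-5: find the closest database model, then return its
    highest-quality precomputed grasp as the candidate to execute."""
    best_model = min(
        db, key=lambda m: descriptor_distance(query_descriptor,
                                              db[m]["descriptor"]))
    grasp, quality = max(db[best_model]["grasps"], key=lambda g: g[1])
    return best_model, grasp, quality

# A query descriptor computed from a partial scan of a mug-like object:
model, grasp, quality = select_grasp([0.85, 0.25, 0.35])
```

Step 6, executing the grasp on the arm and evaluating the result, is where the real robotic work happens and is of course not captured by this sketch.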




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory





 

©2025.05 - Association for the Understanding of Artificial Intelligence


 











