Robohub.org
 

Data-driven grasping

by Sabine Hauert
12 September 2011




As robots enter our industries and homes, they will need to manipulate a wide variety of objects of unknown shape, size and orientation. One approach is to have the robot spend time building a precise model of the object of interest and then compute an optimal grasp for it using inverse kinematics.

Instead, Goldfeder et al. propose data-driven grasping, a fast approach that does not require precise sensing. The idea is that the robot first builds a database of possible grasps suitable for a large variety of shapes. When presented with a new object, it retrieves the most similar shape from the database and performs the corresponding grasp. This matching phase can even be performed with partial sensor data.
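The retrieval idea can be sketched as a nearest-neighbor lookup over shape descriptors. The database entries, descriptor values and grasp parameters below are made-up placeholders for illustration, not the authors' actual data or API:

```python
import math

# Hypothetical grasp database: each entry pairs a precomputed shape
# descriptor (a fixed-length feature vector) with a stored grasp.
GRASP_DB = [
    {"model": "mug",   "descriptor": [0.9, 0.1, 0.4], "grasp": {"approach": "side", "spread": 0.2}},
    {"model": "plane", "descriptor": [0.2, 0.8, 0.5], "grasp": {"approach": "top",  "spread": 0.6}},
    {"model": "glove", "descriptor": [0.5, 0.5, 0.9], "grasp": {"approach": "top",  "spread": 0.4}},
]

def euclidean(a, b):
    """Distance between two shape descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve_grasp(sensed_descriptor):
    """Return the stored grasp of the database model whose descriptor
    is closest to the (possibly partial) sensed one."""
    best = min(GRASP_DB, key=lambda e: euclidean(e["descriptor"], sensed_descriptor))
    return best["model"], best["grasp"]

model, grasp = retrieve_grasp([0.85, 0.15, 0.35])
print(model, grasp)  # the "mug" entry is the closest match
```

The key point is that the expensive work (computing and scoring grasps) happens offline when the database is built; at run time the robot only pays for a lookup.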

Experiments were conducted both in simulation and on HERB, the Home Exploring Robotic Butler platform developed by Intel Research and CMU. HERB has a Barrett hand mounted on a Barrett WAM arm and is equipped with a 2-megapixel webcam, the only sensor used during the trials. Results can be seen in the excellent video below showing the robot grasping toy planes, gloves and even a ukulele!

Just in case you want to build your own data-driven grasper, here are the main steps, taken from the publication:

Step 1: Creating a grasp database of 3D models annotated with precomputed grasps and quality scores.
Step 2: Indexing the database for retrieval using partial 3D geometry.
Step 3: Finding matches in the database using only the sensor data, which is typically incomplete.
Step 4: Aligning the object to each of the matched models from the database.
Step 5: Selecting a grasp from the candidate grasps provided by the aligned matches.
Step 6: Executing the grasp and evaluating the results.
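The six steps above can be strung together as a small pipeline. This is a toy sketch: every class, function name, descriptor and quality score is an illustrative placeholder standing in for the authors' actual components (the alignment step in particular is stubbed out):

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    descriptor: tuple   # shape index precomputed offline (steps 1-2)
    grasps: list        # precomputed (grasp, quality score) pairs (step 1)

DATABASE = [
    Model("mug",   (0.9, 0.1), [("side-grasp", 0.8), ("rim-pinch", 0.6)]),
    Model("plane", (0.2, 0.8), [("wing-grasp", 0.7)]),
]

def find_matches(db, sensed, k=1):
    """Step 3: rank database models by descriptor distance to the
    (typically incomplete) sensed shape."""
    dist = lambda m: sum((a - b) ** 2 for a, b in zip(m.descriptor, sensed))
    return sorted(db, key=dist)[:k]

def align(model, sensed):
    """Step 4: placeholder alignment; a real system would estimate the
    pose that registers the model to the sensor data."""
    return "identity-pose"

def data_driven_grasp(sensed):
    """Steps 3-5: match, align, and select the best candidate grasp.
    Step 6 (execution and evaluation) happens on the robot."""
    candidates = []
    for model in find_matches(DATABASE, sensed):
        pose = align(model, sensed)
        candidates.extend(model.grasps)  # grasps would be transformed by pose
    return max(candidates, key=lambda g: g[1])  # highest quality score wins

print(data_driven_grasp((0.85, 0.15)))
```

Note that grasp quality is never computed online: step 5 simply reuses the scores stored in the database, which is what makes the approach fast.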




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory





