As robots enter our industries and homes, they will need to manipulate a wide diversity of objects with unknown shapes, sizes and orientations. One approach would be to have the robot spend time building a precise model of the object of interest and then perform an optimal grasp using inverse kinematics.
Instead, Goldfeder et al. propose data-driven grasping, a fast approach that does not require precise sensing. The idea is that the robot builds a database of possible grasps suitable for a large variety of shapes. When a new object is presented to the robot, it retrieves the most similar shape from the database and performs the corresponding grasp. This matching phase can even be performed with partial sensor data.
Experiments were conducted both in simulation and on HERB, the Home Exploring Robotic Butler platform developed by Intel Research and CMU. HERB has a Barrett hand mounted on a Barrett WAM arm and uses a 2-megapixel webcam as its only sensor during the trials. Results can be seen in the excellent video below, showing the robot grasping toy planes, gloves and even a ukulele!
Just in case you want to build your own data-driven grasper, here are the main steps taken from the publication:
Step 1: Creating a grasp database of 3D models annotated with precomputed grasps and quality scores.
Step 2: Indexing the database for retrieval using partial 3D geometry.
Step 3: Finding matches in the database using only the sensor data, which is typically incomplete.
Step 4: Aligning the object to each of the matched models from the database.
Step 5: Selecting a grasp from the candidate grasps provided by the aligned matches.
Step 6: Executing the grasp and evaluating the results.
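The six steps above can be sketched as a pipeline. This is only a toy skeleton under my own assumptions: the database entries, the string-based matcher, the identity alignment and the function names are all illustrative placeholders, not the authors' implementation (which uses real 3D models, shape indexing and registration):

```python
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    model_id: str
    pose: tuple      # hand pose relative to the model (toy placeholder)
    quality: float   # precomputed quality score (Step 1)

# Step 1: database of models annotated with precomputed, scored grasps.
DATABASE = {
    "mug":   [GraspCandidate("mug", (0.0, 0.1, 0.0), 0.9),
              GraspCandidate("mug", (0.1, 0.0, 0.0), 0.6)],
    "plane": [GraspCandidate("plane", (0.0, 0.0, 0.2), 0.8)],
}

def match_models(sensor_data, database):
    """Steps 2-3: retrieve candidate models from (possibly partial)
    sensor data. A toy matcher standing in for geometric indexing."""
    return [mid for mid in database if mid in sensor_data]

def align(sensor_data, model_id):
    """Step 4: align the observed object to a matched model. A toy
    identity transform standing in for real registration (e.g. ICP)."""
    return (0.0, 0.0, 0.0)

def select_grasp(sensor_data, matches, database):
    """Step 5: pick the highest-quality grasp over all aligned matches."""
    candidates = []
    for mid in matches:
        offset = align(sensor_data, mid)
        for g in database[mid]:
            pose = tuple(p + o for p, o in zip(g.pose, offset))
            candidates.append(GraspCandidate(mid, pose, g.quality))
    return max(candidates, key=lambda g: g.quality) if candidates else None

def execute(grasp):
    """Step 6: send the grasp to the robot; here we just report success."""
    return grasp is not None

sensor_data = "partial point cloud that looks like a mug"
matches = match_models(sensor_data, DATABASE)
best = select_grasp(sensor_data, matches, DATABASE)
print(best.model_id, best.quality, execute(best))
```

The key design point carried over from the paper is that grasp quality is computed offline (Step 1), so the online phase reduces to retrieval, alignment and selection, which is what makes the approach fast.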