Soft robotic gripper can pick up and identify wide array of objects

02 October 2015



Team’s silicone rubber gripper can pick up egg, CD & paper, and identify objects by touch alone

By Adam Conner-Simons, MIT CSAIL

Robots have many strong suits, but delicacy traditionally hasn’t been one of them. Rigid limbs and digits make it difficult for them to grasp, hold, and manipulate a range of everyday objects without dropping or crushing them.

Recently, CSAIL researchers have discovered that the solution may be to turn to a substance more commonly associated with new buildings and Silly Putty: silicone.

At a conference this month, researchers from CSAIL Director Daniela Rus’ Distributed Robotics Lab demonstrated a 3-D-printed robotic hand made out of silicone rubber that can lift and handle objects as delicate as an egg and as thin as a compact disc.

Just as impressively, its three fingers have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items.

“Robots are often limited in what they can do because of how hard it is to interact with objects of different sizes and materials,” Rus says. “Grasping is an important step in being able to do useful tasks; with this work we set out to develop both the soft hands and the supporting control and planning systems that make dynamic grasping possible.”

The paper, which was co-written by Rus and graduate student Bianca Homberg, PhD candidate Robert Katzschmann, and postdoc Mehmet Dogar, will be presented at this month’s International Conference on Intelligent Robots and Systems.

 

The hard science of soft robots

The gripper, which can also pick up such items as a tennis ball, a Rubik’s cube and a Beanie Baby, is part of a larger body of work out of Rus’ lab at CSAIL aimed at showing the value of so-called “soft robots” made of unconventional materials such as silicone, paper, and fiber.

Researchers say that soft robots have a number of advantages over “hard” robots, including the ability to handle irregularly shaped objects, squeeze into tight spaces, and readily recover from collisions.

“A robot with rigid hands will have much more trouble with tasks like picking up an object,” Homberg says. “This is because it has to have a good model of the object and spend a lot of time thinking about precisely how it will perform the grasp.”

Soft robots represent an intriguing new alternative. However, one downside to their extra flexibility (or “compliance”) is that they often have difficulty accurately measuring where an object is, or even if they have successfully picked it up at all.

That’s where the CSAIL team’s “bend sensors” come in. When the gripper homes in on an object, the fingers send back location data based on their curvature. Using this data, the robot can pick up an unknown object and compare it to the existing clusters of data points that represent past objects. With just three data points from a single grasp, the robot’s algorithms can distinguish between objects as similar in size as a cup and a lemonade bottle.
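To make that identification step concrete, here is a minimal sketch of how a nearest-match lookup over past grasps could work. The object names, centroid values, and the `identify` helper are hypothetical illustrations, not the team’s actual algorithm or data.

```python
import numpy as np

# Hypothetical "clusters" learned from past grasps: one stored three-value
# curvature profile per known object (one value per finger).
OBJECT_CENTROIDS = {
    "cup":             np.array([0.62, 0.60, 0.61]),
    "lemonade bottle": np.array([0.55, 0.54, 0.56]),
    "tennis ball":     np.array([0.48, 0.47, 0.49]),
}

def identify(bend_reading):
    """Return the known object whose stored curvature profile is closest
    (by Euclidean distance) to the current three-finger reading."""
    reading = np.asarray(bend_reading, dtype=float)
    return min(OBJECT_CENTROIDS,
               key=lambda name: np.linalg.norm(reading - OBJECT_CENTROIDS[name]))

# Example: a fresh grasp produces three curvature values, one per finger.
print(identify([0.56, 0.53, 0.57]))  # -> "lemonade bottle"
```

A real system would learn these clusters from many grasps and account for sensor noise, but the core idea is the same: compare a new three-value curvature reading against stored profiles and report the best match.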

“As a human, if you’re blindfolded and you pick something up, you can feel it and still understand what it is,” says Katzschmann. “We want to develop a similar skill in robots — essentially, giving them ‘sight’ without them actually being able to see.”

The team is hopeful that, with further sensor advances, the system could eventually identify dozens of distinct objects, and be programmed to interact with them differently depending on their size, shape, and function.

 

How it works

Researchers control the gripper via a series of pistons that push pressurized air through the silicone fingers. The pistons cause little bubbles to expand in the fingers, spurring them to stretch and bend.

The hand can grip using two types of grasps: “enveloping grasps,” where the object is entirely contained within the gripper, and “pinch grasps,” where the object is held by the tips of the fingers.
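As a rough illustration of how a planner might choose between the two grasp types from an estimated object size, here is a short sketch; the `choose_grasp` function and its millimeter thresholds are assumptions for illustration, not values from the paper.

```python
def choose_grasp(object_width_mm: float, object_thickness_mm: float) -> str:
    """Pick a grasp type from a rough size estimate (illustrative thresholds only)."""
    if object_thickness_mm < 5.0:    # thin, flat items such as a CD or sheet of paper
        return "pinch"
    if object_width_mm < 80.0:       # small enough to cage entirely inside the fingers
        return "enveloping"
    return "pinch"                   # too large to envelop; hold it at the fingertips

print(choose_grasp(object_width_mm=65.0, object_thickness_mm=65.0))   # tennis ball -> "enveloping"
print(choose_grasp(object_width_mm=120.0, object_thickness_mm=1.2))   # CD -> "pinch"
```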

Outfitted for the popular Baxter manufacturing robot, the gripper significantly outperformed Baxter’s default gripper, which was unable to pick up a CD or piece of paper and was prone to completely crushing items like a soda can.

As in Rus’ previous robotic arm, the fingers are made of silicone rubber, chosen because it is relatively stiff yet flexible enough to expand under pressure from the pistons. Meanwhile, the gripper’s interface and exterior finger molds are 3-D-printed, which means the system will work on virtually any robotic platform.

In the future, Rus says the team plans to put more time into improving and adding more sensors that will allow the gripper to identify a wider variety of objects.

“If we want robots in human-centered environments, they need to be more adaptive and able to interact with objects whose shape and placement are not precisely known,” Rus says. “Our dream is to develop a robot that, like a human, can approach an unknown object, big or small, determine its approximate shape and size, and figure out how to interface with it in one seamless motion.”

This work was done in the Distributed Robotics Laboratory at MIT with support from The Boeing Company and the National Science Foundation.

RELATED

Paper: “Haptic Identification of Objects using a Modular Soft Robotic Gripper”

Bianca Homberg

Daniela Rus

Distributed Robotics Lab

[article on csail.mit.edu]





CSAIL MIT: The Computer Science and Artificial Intelligence Laboratory – known as CSAIL – is the largest research laboratory at MIT and one of the world’s most important centers of information technology research.




