Soft robotic gripper can pick up and identify wide array of objects


02 October 2015
Team’s silicone rubber gripper can pick up an egg, a CD and paper, and identify objects by touch alone

By Adam Conner-Simons, MIT CSAIL

Robots have many strong suits, but delicacy traditionally hasn’t been one of them. Rigid limbs and digits make it difficult for them to grasp, hold, and manipulate a range of everyday objects without dropping or crushing them.

Recently, CSAIL researchers have discovered that the solution may be to turn to a substance more commonly associated with new buildings and Silly Putty: silicone.

At a conference this month, researchers from CSAIL Director Daniela Rus’ Distributed Robotics Lab demonstrated a 3-D-printed robotic hand made out of silicone rubber that can lift and handle objects as delicate as an egg and as thin as a compact disc.

Just as impressively, its three fingers have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items.

“Robots are often limited in what they can do because of how hard it is to interact with objects of different sizes and materials,” Rus says. “Grasping is an important step in being able to do useful tasks; with this work we set out to develop both the soft hands and the supporting control and planning systems that make dynamic grasping possible.”

The paper, which was co-written by Rus and graduate student Bianca Homberg, PhD candidate Robert Katzschmann, and postdoc Mehmet Dogar, will be presented at this month’s International Conference on Intelligent Robots and Systems.

 

The hard science of soft robots

The gripper, which can also pick up such items as a tennis ball, a Rubik’s cube and a Beanie Baby, is part of a larger body of work out of Rus’ lab at CSAIL aimed at showing the value of so-called “soft robots” made of unconventional materials such as silicone, paper, and fiber.

Researchers say that soft robots have a number of advantages over “hard” robots, including the ability to handle irregularly shaped objects, squeeze into tight spaces, and readily recover from collisions.

“A robot with rigid hands will have much more trouble with tasks like picking up an object,” Homberg says. “This is because it has to have a good model of the object and spend a lot of time thinking about precisely how it will perform the grasp.”

Soft robots represent an intriguing new alternative. However, one downside to their extra flexibility (or “compliance”) is that they often have difficulty accurately measuring where an object is, or even if they have successfully picked it up at all.

That’s where the CSAIL team’s “bend sensors” come in. When the gripper homes in on an object, the fingers send back location data based on their curvature. Using this data, the robot can pick up an unknown object and compare it to the existing clusters of data points that represent past objects. With just three data points from a single grasp, the robot’s algorithms can distinguish between objects as similar in size as a cup and a lemonade bottle.
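As a rough illustration of that idea (a minimal sketch, not the team’s actual code), the snippet below classifies a grasp by comparing its three bend-sensor readings against stored centroids of readings from past grasps; the object names and numbers are invented for the example.

```python
# Minimal sketch (not the CSAIL team's code): identify a grasped object by
# comparing the three bend-sensor readings from one grasp against stored
# centroids ("clusters") of readings from past grasps.
# Object names and sensor values below are invented for illustration.

import math

# Centroid of curvature readings (one value per finger) for each known object.
known_objects = {
    "cup":             (0.62, 0.58, 0.60),
    "lemonade bottle": (0.45, 0.47, 0.44),
    "tennis ball":     (0.70, 0.71, 0.69),
}

def identify(reading, clusters=known_objects):
    """Return the label whose centroid is closest to this grasp's reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(clusters, key=lambda label: dist(reading, clusters[label]))

# A single grasp yields three curvature values, one per finger.
print(identify((0.60, 0.59, 0.61)))  # -> "cup"
```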

“As a human, if you’re blindfolded and you pick something up, you can feel it and still understand what it is,” says Katzschmann. “We want to develop a similar skill in robots — essentially, giving them ‘sight’ without them actually being able to see.”

The team is hopeful that, with further sensor advances, the system could eventually identify dozens of distinct objects, and be programmed to interact with them differently depending on their size, shape, and function.

 

How it works

Researchers control the gripper via a series of pistons that push pressurized air through the silicone fingers. The pistons cause little bubbles to expand in the fingers, spurring them to stretch and bend.

The hand can grip using two types of grasps: “enveloping grasps,” where the object is entirely contained within the gripper, and “pinch grasps,” where the object is held by the tips of the fingers.
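The choice between those two grasps could be driven by a rough estimate of the object’s dimensions. The toy sketch below shows one way such a rule might look; it is not the planner described in the paper, and the thresholds and default finger length are invented assumptions.

```python
# Toy illustration (not the authors' grasp planner): pick one of the two grasp
# types described above from a rough estimate of the object's dimensions.
# The 10 mm flatness threshold and 100 mm finger length are invented.

from enum import Enum

class Grasp(Enum):
    ENVELOPING = "enveloping"   # object fully contained within the closed fingers
    PINCH = "pinch"             # object held by the tips of the fingers

def choose_grasp(height_mm: float, width_mm: float,
                 finger_length_mm: float = 100.0) -> Grasp:
    """Thin, flat items (a CD, a sheet of paper) get a fingertip pinch;
    objects small enough to enclose get an enveloping grasp."""
    if height_mm < 10.0:                  # flat object lying on a surface
        return Grasp.PINCH
    if width_mm < finger_length_mm:       # fits inside the finger envelope
        return Grasp.ENVELOPING
    return Grasp.PINCH                    # too wide to enclose; grip with fingertips

print(choose_grasp(height_mm=1.2, width_mm=120.0))   # CD -> Grasp.PINCH
print(choose_grasp(height_mm=65.0, width_mm=67.0))   # tennis ball -> Grasp.ENVELOPING
```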

Outfitted for the popular Baxter manufacturing robot, the gripper significantly outperformed Baxter’s default gripper, which was unable to pick up a CD or piece of paper and was prone to completely crushing items like a soda can.

As with Rus’ previous robotic arm, the fingers are made of silicone rubber, chosen because it is stiff enough to hold its shape yet flexible enough to expand under the pressure from the pistons. Meanwhile, the gripper’s interface and exterior finger-molds are 3-D-printed, which means the system will work on virtually any robotic platform.

In the future, Rus says, the team plans to improve the existing sensors and add new ones that will allow the gripper to identify a wider variety of objects.

“If we want robots in human-centered environments, they need to be more adaptive and able to interact with objects whose shape and placement are not precisely known,” Rus says. “Our dream is to develop a robot that, like a human, can approach an unknown object, big or small, determine its approximate shape and size, and figure out how to interface with it in one seamless motion.”

This work was done in the Distributed Robotics Laboratory at MIT with support from The Boeing Company and the National Science Foundation.

RELATED

Paper: “Haptic Identification of Objects using a Modular Soft Robotic Gripper”

Bianca Homberg

Daniela Rus

Distributed Robotics Lab

[article on csail.mit.edu]





CSAIL MIT: The Computer Science and Artificial Intelligence Laboratory – known as CSAIL – is the largest research laboratory at MIT and one of the world’s most important centers of information technology research.






 
