Q&A: Warehouse robots that feel by sight


27 July 2022




Ted Adelson. Photo courtesy of the Department of Brain and Cognitive Sciences.

By Kim Martineau | MIT Schwarzman College of Computing

More than a decade ago, Ted Adelson set out to create tactile sensors for robots that would give them a sense of touch. The result? A handheld imaging system powerful enough to visualize the raised print on a dollar bill. The technology was spun off into GelSight, a company formed to meet an industry need for low-cost, high-resolution imaging.

An expert in both human and machine vision, Adelson was pleased to have created something useful. But he never lost sight of his original dream: to endow robots with a sense of touch. In a new Science Hub project with Amazon, he’s back on the case. He plans to build out the GelSight system with added capabilities to sense temperature and vibrations. A professor in MIT’s Department of Brain and Cognitive Sciences, Adelson recently sat down to talk about his work.

Q: What makes the human hand so hard to recreate in a robot?

A: A human finger has soft, sensitive skin, which deforms as it touches things. The question is how to get precise sensing when the sensing surface itself is constantly moving and changing during manipulation.

Q: You’re an expert on human and computer vision. How did touch grab your interest?

A: When my daughters were babies, I was amazed by how skillfully they used their fingers and hands to explore the world. I wanted to understand the way they were gathering information through their sense of touch. Being a vision researcher, I naturally looked for a way to do it with cameras.

Q: How does the GelSight robot finger work? What are its limitations?

A: A camera captures an image of the skin from inside, and a computer vision system calculates the skin’s 3D deformation. GelSight fingers offer excellent tactile acuity, far exceeding that of human fingers. However, the need for an inner optical system limits the sizes and shapes we can achieve today.
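
The interview doesn't spell out the reconstruction pipeline, but published GelSight-style work typically recovers shape with photometric stereo: colored lights strike the membrane from known directions, each pixel's RGB values are mapped to a surface normal, and integrating the normal field yields a height map. Below is a minimal Python sketch of that idea, assuming three calibrated lights and a Lambertian membrane; the function names and inputs are illustrative, not GelSight's actual software.

```python
import numpy as np

def normals_from_rgb(image, light_dirs):
    """Per-pixel Lambertian photometric stereo.

    image: (H, W, 3) float array, one channel per colored light.
    light_dirs: (3, 3) array, row k = unit direction of light k (calibrated).
    Returns unit surface normals, shape (H, W, 3).
    """
    h, w, _ = image.shape
    flat = image.reshape(-1, 3).T                    # (3, H*W) intensities
    n, *_ = np.linalg.lstsq(light_dirs, flat, rcond=None)  # solve I = L @ n
    n = n.T.reshape(h, w, 3)
    norm = np.linalg.norm(n, axis=2, keepdims=True)
    return n / np.clip(norm, 1e-8, None)

def integrate_normals(normals):
    """Frankot-Chellappa integration of a normal field into a height map."""
    nx, ny, nz = normals[..., 0], normals[..., 1], normals[..., 2]
    nz = np.clip(nz, 1e-8, None)
    p, q = -nx / nz, -ny / nz                        # gradients dz/dx, dz/dy
    h, w = p.shape
    u, v = np.meshgrid(np.fft.fftfreq(w), np.fft.fftfreq(h))
    u, v = 2 * np.pi * u, 2 * np.pi * v
    denom = u**2 + v**2
    denom[0, 0] = 1.0                                # avoid divide-by-zero at DC
    z_hat = (-1j * u * np.fft.fft2(p) - 1j * v * np.fft.fft2(q)) / denom
    z_hat[0, 0] = 0.0                                # height is defined up to an offset
    return np.real(np.fft.ifft2(z_hat))
```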

Q: How did you come up with the idea of giving a robot finger a sense of touch by, in effect, giving it sight?

A: A camera can tell you about the geometry of the surface it is viewing. By putting a tiny camera inside the finger, we can measure how the skin geometry is changing from point to point. This tells us about tactile properties like force, shape, and texture.
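
As a hypothetical illustration of the force side (a textbook approximation, not Adelson's published method): if the gel is modeled as a bed of independent linear springs, total normal force is roughly proportional to the integrated indentation, and the contact patch is wherever indentation exceeds the noise floor.

```python
import numpy as np

def contact_summary(depth_mm, px_area_mm2, stiffness_n_per_mm3=0.5,
                    noise_floor_mm=0.02):
    """Crude tactile readout from an indentation map (bed-of-springs model).

    depth_mm: (H, W) membrane indentation in mm (0 = undeformed).
    px_area_mm2: area covered by one pixel, in mm^2.
    stiffness_n_per_mm3: hypothetical calibration constant.
    """
    contact = depth_mm > noise_floor_mm              # contact patch mask
    # Pressure ~ stiffness * local indentation, so total normal force
    # ~ stiffness * indented volume under the contact patch.
    force_n = stiffness_n_per_mm3 * depth_mm[contact].sum() * px_area_mm2
    return {"contact_area_mm2": contact.sum() * px_area_mm2,
            "normal_force_n": force_n}
```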

Q: How did your prior work on cameras figure in?

A: My prior research on the appearance of reflective materials helped me engineer the optical properties of the skin. We create a very thin matte membrane and light it with grazing illumination so all the details can be seen.

Q: Did you know there was a market for measuring 3D surfaces?

A: No. My postdoc Kimo Johnson posted a YouTube video showing GelSight’s capabilities about a decade ago. The video went viral, and we got a flood of emails suggesting interesting applications. People have since used the technology for measuring the microtexture of shark skin, packed snow, and sanded surfaces. The FBI uses it in forensics to compare spent cartridge casings.

Q: What’s GelSight’s main application?  

A: Industrial inspection. For example, an inspector can press a GelSight sensor against a scratch or bump on an airplane fuselage to measure its exact size and shape in 3D. This application may seem quite different from the original inspiration of baby fingers, but it shows that tactile sensing can have many uses. As for robotics, tactile sensing is mainly a research topic right now, but we expect it to become increasingly useful in industrial robots.
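
A rough sketch of how such a measurement might be computed from the reconstructed height map (the actual product software is not described in the interview): threshold the deviation from the surrounding plane, then report the defect's extent and peak depth. The threshold below is a made-up value.

```python
import numpy as np

def measure_defect(height_mm, px_size_mm, threshold_mm=0.05):
    """Estimate the 3D extent of a scratch or bump in a height map.

    height_mm: (H, W) surface heights; px_size_mm: pixel edge length in mm.
    """
    deviation = height_mm - np.median(height_mm)     # flat surface sits near 0
    defect = np.abs(deviation) > threshold_mm
    if not defect.any():
        return None
    rows, cols = np.where(defect)                    # bounding box of the defect
    return {"length_mm": (rows.max() - rows.min() + 1) * px_size_mm,
            "width_mm": (cols.max() - cols.min() + 1) * px_size_mm,
            "depth_mm": float(np.abs(deviation[defect]).max())}
```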

Q: You’re now building in a way to measure temperature and vibrations. How do you do that with a camera? How else will you try to emulate human touch?

A: You can convert temperature to a visual signal that a camera can read by using liquid crystals, the molecules that make mood rings and forehead thermometers change color. For vibrations we will use microphones. We also want to extend the range of shapes a finger can have. Finally, we need to understand how to use the information coming from the finger to improve robotics.
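
To make both ideas concrete: thermochromic liquid crystals shift hue with temperature, so after a per-sensor calibration a camera can read temperature straight from color, and a microphone capture can be summarized by its frequency spectrum. The sketch below uses invented calibration numbers; a real sensor would need measured hue-to-temperature points.

```python
import numpy as np

# Hypothetical per-sensor calibration: liquid-crystal hue (degrees) -> temp (C).
CAL_HUE_DEG = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
CAL_TEMP_C = np.array([25.0, 28.0, 31.0, 34.0, 37.0])

def temperature_from_hue(hue_deg):
    """Map camera-measured hue to temperature by linear interpolation."""
    return np.interp(hue_deg, CAL_HUE_DEG, CAL_TEMP_C)

def vibration_spectrum(mic_signal, sample_rate_hz):
    """Magnitude spectrum of a microphone capture, for texture analysis."""
    windowed = mic_signal * np.hanning(len(mic_signal))  # reduce spectral leakage
    mags = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(mic_signal), d=1.0 / sample_rate_hz)
    return freqs, mags
```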

Q: Why are we sensitive to temperature and vibrations, and why is that useful for robotics?

A: Identifying material properties is an important aspect of touch. Sensing temperature helps you tell whether something is metal or wood, and whether it is wet or dry. Vibrations can help you distinguish a slightly textured surface, like unvarnished wood, from a perfectly smooth surface, like wood with a glossy finish.
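
Building on the `vibration_spectrum` sketch above, one crude discriminator would compare how much vibration energy falls in a mid-to-high band while the finger slides over the surface; the band edges and threshold here are invented and would need per-sensor calibration.

```python
def looks_textured(freqs, mags, band_hz=(200.0, 2000.0), ratio_threshold=0.3):
    """Guess 'textured vs. smooth' from the share of band-limited energy."""
    band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    total = (mags ** 2).sum()
    if total == 0:
        return False
    return (mags[band] ** 2).sum() / total > ratio_threshold
```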

Q: What’s next?

A: Making a tactile sensor is the first step. Integrating it into a useful finger and hand comes next. Then you have to get the robot to use the hand to perform real-world tasks.

Q: Evolution gave us five fingers and two hands. Will robots have the same?

A: Different robots will have different kinds of hands, optimized for different situations. Big hands, small hands, hands with three fingers or six fingers, and hands we can’t even imagine today. Our goal is to provide the sensing capability, so that the robot can skillfully interact with the world.



