Robohub.org
 

Robots that can sort recycling

21 April 2019




RoCycle can detect if an object is paper, metal, or plastic. CSAIL researchers say that such a system could potentially help enable the convenience of single-stream recycling with lower contamination rates that conform to China’s new recycling standards.
Photo: Jason Dorfman


By Adam Conner-Simons

Every year trash companies sift through an estimated 68 million tons of recycling, which is the weight equivalent of more than 30 million cars.

A key step in the process happens on fast-moving conveyor belts, where workers have to sort items into categories like paper, plastic and glass. Such jobs are dull, dirty, and often unsafe, especially in facilities where workers also have to remove normal trash from the mix.

With that in mind, a team led by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a robotic system that can detect if an object is paper, metal, or plastic.

The team’s “RoCycle” system includes a soft Teflon hand that uses tactile sensors on its fingertips to detect an object’s size and stiffness. Compatible with any robotic arm, RoCycle was found to be 85 percent accurate at detecting materials when objects were stationary, and 63 percent accurate on a simulated conveyor belt. (Its most common error was identifying paper-covered metal tins as paper, which the team says would be improved by adding more sensors along the contact surface.)

“Our robot’s sensorized skin provides haptic feedback that allows it to differentiate between a wide range of objects, from the rigid to the squishy,” says MIT Professor Daniela Rus, senior author on a related paper that will be presented in April at the IEEE International Conference on Soft Robotics (RoboSoft) in Seoul, South Korea. “Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance.”

Developed in collaboration with Yale University, RoCycle directly demonstrates the limits of sight-based sorting: it can reliably distinguish between two visually similar Starbucks cups, one made of paper and one made of plastic, a distinction that would give vision systems trouble.

Incentivizing recycling

Rus says that the project is part of her larger goal to reduce the back-end cost of recycling, in order to incentivize more cities and countries to create their own programs. Today’s recycling centers aren’t particularly automated; their main kinds of machinery include optical sorters that use light of different wavelengths to distinguish between plastics, magnetic sorters that separate out iron and steel products, and aluminum sorters that use eddy currents to remove non-magnetic metals.

This is a problem for one very big reason: just last month China raised its standards for the cleanliness of recycled goods it accepts from the United States, meaning that some of the country’s single-stream recycling is now sent to landfills.

“If a system like RoCycle could be deployed on a wide scale, we’d potentially be able to have the convenience of single-stream recycling with the lower contamination rates of multi-stream recycling,” says PhD student Lillian Chin, lead author on the new paper.

It’s surprisingly hard to develop machines that can distinguish between paper, plastic, and metal, which shows how impressive a feat it is for humans. When we pick up an object, we can immediately recognize many of its qualities even with our eyes closed, like whether it’s large and stiff or small and soft. By feeling the object and understanding how that relates to the softness of our own fingertips, we are able to learn how to handle a wide range of objects without dropping or breaking them.

This kind of intuition is tough to program into robots. Traditional hard (“rigid”) robot hands have to know an object’s exact location and size to be able to calculate a precise motion path. Soft hands made of materials like rubber are much more flexible, but have a different problem: Because they’re powered by fluidic forces, they have a balloon-like structure that can puncture quite easily.

How RoCycle works

Rus’ team used a motor-driven hand made of a relatively new class of materials called “auxetics.” Most materials get narrower when pulled on, like a rubber band when you stretch it; auxetics, meanwhile, actually get wider. The MIT team took this concept and put a twist on it, quite literally: They created auxetics that, when cut, twist to either the left or right. Combining a “left-handed” and a “right-handed” auxetic for each of the hand’s two large fingers makes them interlock and oppose each other’s rotation, enabling more dynamic movement. (The team calls this “handed-shearing auxetics”, or HSA.)
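For readers who want the underlying mechanics, this “gets wider when stretched” behavior is what materials scientists call a negative Poisson’s ratio. The relation below is the standard textbook definition, not something specific to the RoCycle paper:

\[
\nu = -\frac{\varepsilon_{\text{transverse}}}{\varepsilon_{\text{axial}}}, \qquad \nu < 0 \text{ for auxetic materials.}
\]

Conventional materials like rubber have \(\nu\) between roughly 0 and 0.5, so they thin out as they are stretched; an auxetic lattice has \(\nu < 0\) and expands sideways instead.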

“In contrast to soft robots, whose fluid-driven approach requires air pumps and compressors, HSA combines twisting with extension, meaning that you’re able to use regular motors,” says Chin.

The team’s gripper first uses its “strain sensor” to estimate an object’s size, and then uses its two pressure sensors to measure the force needed to grasp the object. These measurements, along with calibration data on the size and stiffness of objects made from different materials, give the gripper a sense of what material an object is made of. (Since the tactile sensors are also conductive, they can detect metal by how much it changes the electrical signal.)

“In other words, we estimate the size and measure the pressure difference between the current closed hand and what a normal open hand should look like,” says Chin. “We use this pressure difference and size to classify the specific object based on information about different objects that we’ve already measured.”
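To make that pipeline concrete, here is a minimal sketch in Python of the idea Chin describes: estimate size, compute the pressure difference between the closed grasp and an open hand, flag metal through the conductive skin, and compare against calibration examples. The feature names, threshold, calibration values, and nearest-neighbor rule are illustrative assumptions, not the team’s actual implementation.

```python
# Illustrative sketch of tactile material classification, loosely following the
# article's description. All numbers, names, and the nearest-neighbor rule are
# assumptions made for this example, not values from the RoCycle paper.
from dataclasses import dataclass
import math


@dataclass
class Calibration:
    material: str      # "paper", "plastic", or "metal"
    size_cm: float     # object size estimated from the strain sensor
    stiffness: float   # normalized pressure difference (0 = squishy, 1 = rigid)


# Hypothetical calibration data collected by grasping known objects.
CALIBRATION = [
    Calibration("paper",   8.0, 0.2),
    Calibration("plastic", 8.0, 0.5),
    Calibration("metal",   7.5, 0.9),
]


def classify(size_cm: float, stiffness: float, conductivity_change: float) -> str:
    """Guess a material from simple tactile features of one grasp."""
    # The conductive skin reacts strongly to metal; the 0.5 threshold is assumed.
    if conductivity_change > 0.5:
        return "metal"

    # Otherwise pick the closest calibration example in (size, stiffness) space,
    # weighting stiffness more heavily since sizes overlap across materials.
    def distance(cal: Calibration) -> float:
        return math.hypot(cal.size_cm - size_cm, 10.0 * (cal.stiffness - stiffness))

    return min(CALIBRATION, key=distance).material


if __name__ == "__main__":
    # A squishy, non-conductive object about the size of a coffee cup -> "paper".
    print(classify(size_cm=8.2, stiffness=0.25, conductivity_change=0.05))
```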

RoCycle builds on a set of sensors that detect the radius of an object to within 30 percent accuracy, and tell the difference between “hard” and “soft” objects with 78 percent accuracy. The team’s hand is also almost completely puncture-resistant: it was scraped by a sharp lid and punctured by a needle more than 20 times with minimal structural damage.

As a next step, the researchers plan to build out the system so that it can combine tactile data with actual video data from a robot’s cameras. This would allow the team to further improve its accuracy and potentially allow for even more nuanced differentiation between different kinds of materials.

Chin and Rus co-wrote the RoCycle paper alongside MIT postdoc Jeffrey Lipton, as well as PhD student Michelle Yuen and Professor Rebecca Kramer-Bottiglio of Yale University.

This project was supported in part by Amazon, JD.com, the Toyota Research Institute, and the National Science Foundation.




MIT News




