Robohub.org
 

Shelf-stocking robots with independent movement


by Joost van de Loo
23 September 2022




Robots that move about by themselves must be able to adapt to the dynamic and challenging conditions in a supermarket. Researcher Carlos Hernández Corbato says: “My research focuses on using artificial intelligence to make machines smarter and more reliable by teaching them symbolic knowledge. The goal is to develop robotic ‘brains’ for intelligent robots that can be trusted to work alongside people, because they can explain their decisions.”

A supermarket is typically a place where unexpected things happen all the time. Not only are there thousands of products with different shapes and appearances, but there are also people walking in and out. How can an independently operating machine handle this safely, efficiently and intelligently? By activating symbolic knowledge that we humans also use, says Hernández. “We recognize a tray with four legs underneath as a symbol: ‘table’. We don’t need a photo for it. When we encode such ‘symbol language’ and make it suitable for robots, they can perform more complex tasks.”

Researcher Carlos Hernández Corbato of the Department of Cognitive Robotics in the retail lab at RoboHouse.

One look with its camera eyes and the robot knows it is facing an object on which plates and cups can be placed. Based on this, it can decide what to do. The speed-up this produces should enable robots to perform multiple actions at the same time: navigating, picking up and moving objects, and ultimately communicating with people.

Symbolic knowledge

For Hernández, the AI for Retail Lab research program of supermarket chain Ahold Delhaize brings together everything that fascinates him about artificial intelligence. Retail requires robots to use a broad diversity of skills: to perceive the environment, navigate around it, manipulate objects or collaborate with humans. For him, it’s all about the question of which algorithms are needed to make a machine respond just as intelligently as a human brain. As a specialist in software for autonomously operating robots, he won the Amazon Picking Challenge in 2016 with a team from TU Delft. On that occasion, a robotic arm picked products from a container and placed them in their proper place on a shelf.

“Retail requires robots to use a broad diversity of skills: to perceive the environment, navigate around it, manipulate objects or collaborate with humans.”

– Carlos Hernández Corbato, researcher

The ‘supermarket robot’ is even more challenging. It requires the leap from a static factory environment to the dynamics of a store. The traditional way, in which robots learn from the data they collect, is too cumbersome for that. The robot would get bogged down in stock management alone. Programming customized handling for every orange, bottle, soup can, milk carton or cucumber would be too much work. “We want to inject symbolic knowledge into the robot’s operating system all at once,” says Hernández. “If that knowledge is available, the robot can continuously adapt to its changing environment. For example, by downloading a different hand movement.”
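To make that idea a little more concrete, here is a minimal sketch of what such category-level symbolic knowledge could look like. It is not the lab’s actual system: the categories, grasp parameters and function names below are hypothetical, and only illustrate the principle of reusing one handling strategy per symbolic category instead of programming one per product.

```python
# Hypothetical sketch: symbolic knowledge as category-level handling strategies.
from dataclasses import dataclass

@dataclass
class GraspStrategy:
    """One reusable way of picking something up (all values are made up)."""
    gripper_width_m: float   # how far to open the gripper
    approach: str            # e.g. "top-down" or "side"
    max_force_n: float       # squeeze limit so products are not damaged

# The 'injected' knowledge base: symbols (categories), not individual products.
KNOWLEDGE = {
    "can":    GraspStrategy(0.07, "side", 15.0),
    "bottle": GraspStrategy(0.08, "side", 12.0),
    "carton": GraspStrategy(0.09, "top-down", 10.0),
    "fruit":  GraspStrategy(0.10, "top-down", 5.0),
}

def plan_grasp(category: str) -> GraspStrategy:
    """Look up the strategy for a recognized symbol; fall back to a cautious
    default ('a different hand movement') for unknown objects."""
    return KNOWLEDGE.get(category, GraspStrategy(0.10, "top-down", 4.0))

print(plan_grasp("can"))       # known symbol -> its specific strategy
print(plan_grasp("cucumber"))  # unknown object -> the cautious default
```

The point of the sketch is that knowledge attaches to symbols such as “can” or “carton”, so an object the robot has never seen can still be handled once it is recognized as an instance of a known category.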

The robot must be able to independently choose a different algorithm if it encounters a problem along the way, so that it can pick up a can that falls from its gripper, or slightly change its grip when picking up an unknown object. The technicians have already set up a test shop where the robot ‘Tiago’ can practice this. In about five years’ time, the project should deliver a machine with a mobile base, two arms and two camera eyes that independently restocks supermarket shelves 24 hours a day. And it must be able to do that under all circumstances, day and night.

Sensor broken

The latter does not only apply to the supermarket robot. In fact, every robot should have a next-generation operating system to better cope with changing circumstances. Hernández Corbato: “Beyond integrating different robot skills, cognitive skills for robots need to enable them to reason about those skills, to understand how they can use them, and what the consequences of their own actions are. In sum, we need to endow robots (or any intelligent autonomous system we build) with self-awareness so that we can trust them.”

“We need to endow robots (or any intelligent autonomous system we build) with self-awareness so that we can trust them.”

– Carlos Hernández Corbato, researcher

It is the core idea behind the European project Metacontrol for ROS2 systems (MROS) that the Cognitive Robotics department recently completed. The AI technique that Hernández used for this is called the metacontrol method. It describes the properties and skills of the robot in a structured way, so that the robot can use the knowledge to adapt and overcome problems.
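As a rough illustration of that idea, a robot can keep a structured model of which of its components provide which skills, and reconfigure itself when one of them fails. This is only a toy sketch, not the actual MROS implementation; the skill and component names are hypothetical.

```python
# Toy sketch of metacontrol-style self-adaptation (not the real MROS code):
# the robot models which components can provide each skill and switches to a
# healthy alternative when the active one fails.

COMPONENTS = {
    # skill -> components that can provide it, in order of preference
    "localization":    ["lidar", "camera"],
    "grasp_detection": ["depth_camera", "rgb_camera"],
}

status = {"lidar": "ok", "camera": "ok", "depth_camera": "ok", "rgb_camera": "ok"}
active = {skill: options[0] for skill, options in COMPONENTS.items()}

def reconfigure(skill: str) -> str | None:
    """Activate the first healthy component that still provides the skill."""
    for component in COMPONENTS[skill]:
        if status[component] == "ok":
            active[skill] = component
            return component
    return None  # no redundancy left: degrade gracefully or stop safely

# Simulate a broken sensor, like the example described below.
status["lidar"] = "failed"
print(reconfigure("localization"))  # -> "camera": the robot switches by itself
```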

As part of this research, he developed multiple prototypes of these next-generation robots together with Bosch Corporate Research, Universidad Rey Juan Carlos, Universidad Politécnica de Madrid and the IT University of Copenhagen.

Does it perform better than traditional robots? “Yes, it navigated more safely and, thanks to its symbolic knowledge, it was able to adapt to the circumstances. When one sensor broke, it switched to another independently,” says Hernández enthusiastically. “That is where we want to go: a robot with sufficient intelligence to deal with failures.”

Corrado Pezzato (middle), PhD candidate at AIRLab, and Stefan Bonhof (right), research engineer and project manager at AIRLab, in RoboHouse.

The post Shelf-stocking robots with independent movement appeared first on RoboHouse.



Joost van de Loo - Strategist at RoboHouse






 
