
Shelf-stocking robots with independent movement


by Joost van de Loo
23 September 2022




Robots that move about by themselves must be able to adapt to the dynamic and challenging conditions in a supermarket. Hernández Corbato says: “My research focuses on using artificial intelligence to make machines smarter and more reliable by teaching them symbolic knowledge. The goal is to develop robotic ‘brains’ for intelligent robots that can be trusted to work alongside people, because they can explain their decisions.”

A supermarket is typically a place where unexpected things happen all the time. Not only are there thousands of products with different shapes and appearances, but there are also people walking in and out. How can an independently operating machine handle this safely, efficiently and intelligently? By activating the symbolic knowledge that we humans also use, says Hernández. “We recognize a tray with four legs underneath as a symbol: ‘table’. We don’t need a photo for that. When we encode such ‘symbol language’ and make it suitable for robots, they can perform more complex tasks.”

Researcher Carlos Hernández Corbato of the Department of Cognitive Robotics in the retail lab at RoboHouse.

One look with its camera eyes and the robot knows it is facing an object on which plates and cups can be placed. Based on this, it can decide what to do. The speed-up this produces should enable robots to perform multiple actions at the same time: navigate, pick up and move objects, and ultimately communicate with people.
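
To make the idea of ‘symbol language’ a little more concrete, here is a minimal sketch in Python. The names and the tiny knowledge base are invented for illustration and are not the lab’s actual software; the point is only that a detected shape maps to a symbol whose affordances tell the robot what it can do, without needing an image template for every product.

```python
from dataclasses import dataclass, field

@dataclass
class Symbol:
    name: str
    affordances: set = field(default_factory=set)  # what the robot can do with the object

# A tiny hand-written knowledge base; a real system would load this from an ontology.
KNOWLEDGE = {
    "flat_surface_with_legs": Symbol("table", {"place_on", "navigate_around"}),
    "graspable_cylinder": Symbol("soup_can", {"pick_up", "place_on_shelf"}),
}

def decide_action(detected_shape: str) -> str:
    """Choose an action from the symbol's affordances rather than from raw pixels."""
    symbol = KNOWLEDGE.get(detected_shape)
    if symbol is None:
        return "ask_for_help"             # unknown object: fall back to a safe behaviour
    if "place_on_shelf" in symbol.affordances:
        return "place_on_shelf"
    return sorted(symbol.affordances)[0]  # otherwise pick any known affordance

print(decide_action("graspable_cylinder"))  # -> place_on_shelf
```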

Symbolic knowledge

For Hernández, the AI for Retail Lab research program of supermarket chain Ahold Delhaize brings together everything that fascinates him about artificial intelligence. Retail requires robots to use a broad diversity of skills: to perceive the environment, navigate around it, manipulate objects or collaborate with humans. For him, it’s all about the question of which algorithms are needed to make a machine respond just as intelligently as a human brain. As a specialist in software for autonomously operating robots, he already won the Amazon Picking Challenge in 2016 with a team from TU Delft. On that occasion, a robotic arm picked products from a container and placed them in their place on a shelf.

“Retail requires robots to use a broad diversity of skills: to perceive the environment, navigate around it, manipulate objects or collaborate with humans.”

– Carlos Hernández Corbato, researcher

The ‘supermarket robot’ is even more challenging. It requires the leap from a static factory environment to the dynamics of a store. The traditional approach, in which robots learn from the data they collect, is too cumbersome for that; the robot would get stuck on stock management alone. Programming a customized robot treatment for every orange, bottle, soup can, milk carton or cucumber would be too much work. “We want to inject symbolic knowledge into the robot’s operating system all at once,” says Hernández. “If that knowledge is available, the robot can continuously adapt to its changing environment, for example by downloading a different hand movement.”

The robot must be able to independently choose a different algorithm if it encounters a problem along the way, so that it can pick up a can that falls from its hands, or slightly change its grip when picking up an unknown object. The technicians have already set up a test shop where the robot ‘Tiago’ can practice this. In about five years’ time, the research should deliver a machine with a mobile base, two arms and two camera eyes that independently refills supermarket shelves 24 hours a day. And it must be able to do that under all circumstances, day and night.
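
Such recovery behaviour can be pictured as a simple fallback loop. The sketch below is illustrative only, with made-up strategy names rather than Tiago’s actual software stack: the robot tries a grasp, checks whether it worked, and switches to an alternative if it did not.

```python
import random

# Hypothetical grasp strategies, ordered by expected success rate.
GRASP_STRATEGIES = ["top_pinch", "side_wrap", "suction"]

def try_grasp(strategy: str, obj: str) -> bool:
    """Stand-in for real grasp execution; simulated here with a coin flip."""
    return random.random() > 0.4

def grasp_with_recovery(obj: str) -> bool:
    """Try each strategy in turn; a dropped object simply triggers the next attempt."""
    for strategy in GRASP_STRATEGIES:
        if try_grasp(strategy, obj):
            print(f"grasped {obj} using {strategy}")
            return True
        print(f"{strategy} failed on {obj}, trying an alternative")
    return False  # all strategies exhausted: ask a human for help

grasp_with_recovery("soup_can")
```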

Sensor broken

The latter does not only apply to the supermarket robot. In fact, every robot should have a next-generation operating system to cope better with changing circumstances. Hernández Corbato: “Beyond integrating different robot skills, cognitive skills need to enable robots to reason about those skills, to understand how they can use them, and what the consequences of their own actions are. In sum, we need to endow robots (or any intelligent autonomous system) with self-awareness so that we can trust them.”

“We need to endow robots (or any intelligent autonomous system) with self-awareness so that we can trust them.”

– Carlos Hernández Corbato, researcher

This is the core idea behind the European project Metacontrol for ROS2 systems (MROS), which the Cognitive Robotics department recently completed. The AI technique that Hernández used for this is called the metacontrol method. It describes the properties and skills of the robot in a structured way, so that the robot can use that knowledge to adapt and overcome problems.
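
As a rough illustration of what such a structured self-model might look like, consider the toy sketch below. It is not the MROS implementation or its ROS 2 interfaces; it only shows the underlying idea that the robot holds a model of its skills, the alternative configurations that realize them, and the components each configuration requires, so it can look up which alternative still works when something fails.

```python
from typing import Optional, Set

# Toy self-model: each skill lists alternative configurations and the components they need.
SKILL_MODEL = {
    "navigate": [
        {"name": "lidar_nav",  "requires": {"lidar", "wheels"}},
        {"name": "camera_nav", "requires": {"rgbd_camera", "wheels"}},  # fallback
    ],
}

def select_configuration(skill: str, healthy: Set[str]) -> Optional[str]:
    """Return the first configuration whose required components are all healthy."""
    for config in SKILL_MODEL[skill]:
        if config["requires"] <= healthy:  # subset check: every requirement is available
            return config["name"]
    return None                            # nothing viable: report the failure

# If the lidar breaks, the model itself tells the robot what to switch to.
print(select_configuration("navigate", {"rgbd_camera", "wheels"}))  # -> camera_nav
```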

As part of this research, he developed multiple prototypes of these next-generation robots together with Bosch Corporate Research, Universidad Rey Juan Carlos, Universidad Politecnica de Madrid and the IT University of Copenhagen.

Does it perform better than traditional robots? “Yes, it navigated more safely and, thanks to its symbolic knowledge, it was able to adapt to the circumstances. When one sensor broke, it switched to another independently,” says Hernández enthusiastically. “That is where we want to go: a robot with sufficient intelligence to deal with failures.”

Corrado Pezzato (middle), PhD candidate at AIRLab, and Stefan Bonhof (right), research engineer and project manager at AIRLab, in RoboHouse.

The post Shelf-stocking robots with independent movement appeared first on RoboHouse.



Joost van de Loo - Strategist at RoboHouse




