Robohub.org
 

Robots are being programmed to adapt in real time


17 January 2019




In trials, the ResiBot robot learned to walk again in less than two minutes after one of its legs was removed. Image credit – Antoine Cully / Sorbonne University


By Gareth Willmer

Teaching robots to recover from damage on the fly is part of a field of work that is building machines able to provide real-time help using only limited data as input. Standard machine-learning algorithms often need to process thousands of possibilities before deciding on a solution, which may be impractical in pressurised scenarios where fast adaptation is critical.

After Japan’s Fukushima nuclear disaster in 2011, for example, robots were sent into the power plant to clear up radioactive debris in conditions far too dangerous for humans. The problem, says robotics researcher Professor Jean-Baptiste Mouret, is that the robots kept breaking down or came across hazards that stopped them in their tracks.

As part of the ResiBots initiative, he is designing lower-cost robots that can last long periods without needing constant human maintenance for breakages and are better at overcoming unexpected obstacles.

The ResiBots team is using what it refers to as micro-data learning algorithms, which can help robots adapt before your eyes, much as animals react to problems. An animal will, for example, often find a way to continue moving if it gets injured, even if it doesn’t know exactly what the problem is.

In contrast, most current robots self-diagnose a problem before working out a way to overcome it, says Prof. Mouret, principal investigator at ResiBots and a senior researcher at the Inria research centre in France.

‘We’re trying to shortcut this by finding a way for them to react without necessarily having developed an understanding of what’s wrong,’ he said.

Rather than self-diagnosing, these robots are meant to learn proactively, by trial and error, which alternative actions they can take. This could help them overcome difficulties and stop them from shutting down in disaster scenarios such as Fukushima, said Prof. Mouret.

This may not be full artificial intelligence, but Prof. Mouret points out that having knowledge of everything is not essential for getting a robot to work.

‘We’re not trying to solve everything,’ he said. ‘I’m more interested in how they can adapt – and, in fact, adapting to what’s happening is some of what makes animals intelligent.’

Simulated childhood

In one of the most promising approaches developed in the ResiBots project, the robots have a simulated childhood, in which they learn different ways to move their body using an algorithm that searches ahead of time to collect examples of useful behaviours. 

This means that when seeking a way to move, the robots choose from about 13,000 behaviours rather than the estimated 10⁴⁷ options that standard algorithms would have to search. The aim is for them to try only a handful of these before finding one that works.
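As a rough sketch of the idea (this is illustrative Python, not the ResiBots code), the loop below picks behaviours from a precomputed repertoire one at a time, measures how well each performs on the damaged robot, and uses every measurement to adjust its expectations of similar behaviours. The repertoire, the behaviour descriptors and the execute_on_robot function are all invented here, and a simple similarity-weighted correction stands in for the more principled probabilistic update such methods use.

```python
import numpy as np

# Illustrative sketch of trial-and-error adaptation over a behaviour repertoire
# built ahead of time in simulation. All names and numbers are made up.

rng = np.random.default_rng(0)

N_BEHAVIOURS = 13_000                          # size of the precomputed repertoire
descriptors = rng.random((N_BEHAVIOURS, 6))    # e.g. how much each leg is used
predicted = rng.random(N_BEHAVIOURS)           # walking speed predicted in simulation

def execute_on_robot(i: int) -> float:
    """Hypothetical stand-in for running behaviour i on the damaged robot
    and measuring how fast it actually walks."""
    penalty = descriptors[i, 0]                # pretend leg 0 is broken
    return max(0.0, predicted[i] - penalty + 0.05 * rng.standard_normal())

GOOD_ENOUGH = 0.6
tried, observed = [], []

for trial in range(10):                        # only a handful of real tests
    expected = predicted.copy()
    if tried:
        # Correct the simulation's predictions using the few behaviours already
        # tested, weighting by similarity of their descriptors (a crude stand-in
        # for the more principled probabilistic update such methods use).
        d = np.linalg.norm(descriptors[:, None] - descriptors[tried][None], axis=2)
        w = np.exp(-(d ** 2) / 0.1)
        err = np.asarray(observed) - predicted[tried]
        expected += (w @ err) / (w.sum(axis=1) + 1e-6)
        expected[tried] = -np.inf              # never repeat a behaviour
    best = int(np.argmax(expected))
    speed = execute_on_robot(best)
    tried.append(best)
    observed.append(speed)
    print(f"trial {trial + 1}: behaviour {best}, measured speed {speed:.2f}")
    if speed >= GOOD_ENOUGH:
        break
```

With these toy numbers the loop usually stops after only a few trials, in line with the handful of physical tests the researchers aim for.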

Most of the ResiBots tests are currently being carried out on a six-legged robot that has to find new ways to move after having one or more legs removed. In the latest trials, Prof. Mouret said, the robots learned to walk in one to two minutes after one of their legs was taken off, meaning they generally needed to test fewer than 10 behaviours before finding one that worked.

Test robots can learn to overcome a broken leg in under two minutes. Video credit – Horizon

In total, the researchers are working on half a dozen robots at varying levels of complexity, including a child-like humanoid robot known as iCub. Though the much more complex iCub is not yet being used in many trials, the team hopes to use it more over time.

‘Humanoids have the potential of being highly versatile and adapting well to environments designed for humans,’ said Prof. Mouret. ‘For instance, nuclear power plants have doors, levers and ladders that were designed for people.’

There are, however, some big challenges still to overcome, including the fact that a robot needs to be moved back to its starting position once a limb is removed, rather than being able to carry on from the injury site towards the target.

Safety

There are also wider safety issues involving such robots – for example, ensuring that they do not harm earthquake survivors while rescuing them, particularly if the robot is learning by trial and error, said Prof. Mouret.

He believes it will be at least four or five years before such a robot could be used in the field, but is hopeful that the techniques can eventually be employed not just in robots for disaster response, but also in those for the home and other settings.

But it’s not just mechanics that can help robots navigate the real world. Robots may also adapt better if they can more strongly connect language to reality. 

Professor Gemma Boleda of the Universitat Pompeu Fabra in Spain has a background in linguistics, and her team is trying to link research in this field to artificial intelligence to help machines better understand the world around them, as part of a project called AMORE.

It’s something that could be useful for making technology such as GPS navigation more intelligent. For example, a car’s GPS system could tell you to turn right where ‘the big tree’ is, distinguishing it from several other trees.

Prof. Boleda says this has been hard to do in the past because of the difficulty of modelling the way humans link language with reality.

‘In the past, language had largely been represented out of context,’ said Prof. Boleda.

AMORE’s aim is to get computers to understand words and concepts in a real-world context rather than as individual words in isolation, she says. For instance, a robot would learn to connect the phrase ‘this dog’ with an actual dog in the room, representing both the words and the real-world entities.

‘The crux of these models is that they are able to learn their own representations from data,’ she added. ‘Before, researchers had to tell the machine what the world looked like.’
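As a toy illustration of that kind of grounding (the entity names, vectors and matching rule below are invented; in AMORE-style models the representations are learned from data rather than written by hand), a referring phrase and each candidate entity in the scene can be given vector representations, with the referent chosen as the entity whose vector best matches the phrase:

```python
import numpy as np

# Toy sketch: ground the phrase 'this dog' in a scene by comparing a vector
# for the phrase with vectors for the candidate real-world entities.
# All vectors and entity names here are hand-picked for illustration only.

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two representation vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Representations of the entities currently in the scene.
entities = {
    "dog_in_the_room":   np.array([0.9, 0.1, 0.0]),
    "cat_on_the_sofa":   np.array([0.1, 0.9, 0.0]),
    "big_tree_on_right": np.array([0.0, 0.1, 0.9]),
}

# Representation of the phrase as uttered in this situation.
phrase_vector = np.array([0.8, 0.2, 0.1])      # 'this dog'

referent = max(entities, key=lambda name: cosine(phrase_vector, entities[name]))
print("'this dog' refers to:", referent)       # -> dog_in_the_room
```

In the project itself, the crucial step is learning those representations from context rather than hand-coding them, as Prof. Boleda notes above.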

Giving machines a better understanding of the world around them will help them do ‘more with less’ in terms of the amount of data they need and get better at predicting outcomes, Prof. Boleda said.

It could also help with the problem of fitting the next wave of intelligent applications into the limited memory and storage of devices like mobile phones.

‘I am working with language, but this problem of needing a lot of data is a problem that plagues many other domains of artificial intelligence,’ said Prof. Boleda. ‘So if I develop methods that can do more with less, then these can also be applied elsewhere.’

The research in this article was funded by the EU’s European Research Council.





Horizon Magazine brings you the latest news and features about thought-provoking science and innovative research projects funded by the EU.