
Social learning

by Sabine Hauert
29 August 2010




Robots are portrayed as tomorrow's helpers, be it in schools, hospitals, workplaces or homes. Unfortunately, such robots won't be truly useful out of the box because of the complexity of real-world environments and tasks. Instead, they will need to learn how to interact with objects in their environment to produce a desired outcome (affordance learning).

For this purpose, robots can explore the world on their own while using machine learning techniques to update their knowledge. However, this exploration often floods the learning process with examples of objects, actions and effects that do nothing to help the robot achieve its purpose.
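As a rough illustration of this exploration-based affordance learning, here is a minimal sketch in which the robot logs (object, action, effect) samples and fits a classifier to predict effects. The object features, action set, classifier choice and the `world` interface are assumptions for illustration only, not the setup used in the study.

```python
# A minimal, hypothetical sketch of affordance learning by self-exploration:
# the robot logs (object, action) -> observed effect samples and fits a
# classifier to predict the effect of future actions. The object features,
# action set and classifier choice are illustrative assumptions, not the
# setup used in the study; `world` stands in for a simulator or real robot.
import random
from sklearn.tree import DecisionTreeClassifier

ACTIONS = ["poke", "grasp", "push"]

def explore(world, n_trials=200):
    """Try random actions on random objects and record what happens."""
    samples, effects = [], []
    for _ in range(n_trials):
        obj = world.random_object()        # e.g. {"size": 0.3, "round": 1}
        action = random.choice(ACTIONS)
        effect = world.apply(action, obj)  # e.g. "rolled", "lifted", "nothing"
        samples.append([obj["size"], obj["round"], ACTIONS.index(action)])
        effects.append(effect)
    return samples, effects

def learn_affordances(samples, effects):
    """Fit a model mapping (object features, action) to the expected effect."""
    model = DecisionTreeClassifier()
    model.fit(samples, effects)
    return model
```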

In these cases, humans or other social partners can help direct robot learning (social learning). Most studies have focussed on scenarios where a teacher demonstrates how to correctly do a task. The robot then imitates the teacher by reproducing the same actions to achieve the same goals.

This approach, while very efficient, typically requires the teacher to take time to train the robot, which can be burdensome. Furthermore, the robot might become so specialized for the demonstrated scenario that it struggles with tasks that differ even slightly. In addition, imitation only works when the teacher and robot have similar motion constraints and morphologies.

Luckily, humans and animals use a large variety of mechanisms to learn from social partners. Tapping into this reservoir, Cakmak et al. propose mechanisms (illustrated in the sketch after this list) where:
– robots interact with the same objects as the social partner (stimulus enhancement)
– robots try to achieve the same effect on the same object as the social partner (emulation)
– robots reproduce the same action as the social partner (mimicking)
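
To make the distinctions between these mechanisms concrete, here is a small, hypothetical sketch of how each one could bias the robot's next exploration trial given a single observed demonstration. The `demo` fields and the `world` helper are stand-ins for illustration, not the authors' implementation.

```python
# Hypothetical sketch: how each social-learning mechanism constrains the
# robot's next trial, given one observed demonstration from a social partner.
# `demo` and `world` are illustrative stand-ins, not the authors' code.
import random

ACTIONS = ["poke", "grasp", "push"]

def next_trial(mechanism, demo, world):
    """Return (object, action, target_effect) for the robot's next attempt.

    demo = {"object": ..., "action": ..., "effect": ...} as observed from
    the social partner; target_effect is None when the robot is not trying
    to reproduce a particular outcome.
    """
    if mechanism == "stimulus_enhancement":
        # Attend to the partner's object, but explore actions freely.
        return demo["object"], random.choice(ACTIONS), None
    if mechanism == "emulation":
        # Same object, any action, but aim for the partner's effect.
        return demo["object"], random.choice(ACTIONS), demo["effect"]
    if mechanism == "mimicking":
        # Reproduce the partner's action on any object, with no target effect.
        return world.random_object(), demo["action"], None
    if mechanism == "imitation":
        # Copy both object and action to achieve the same goal.
        return demo["object"], demo["action"], demo["effect"]
    # Non-social baseline: unconstrained exploration.
    return world.random_object(), random.choice(ACTIONS), None
```

Non-social learning, included here as the baseline case, corresponds to the unconstrained exploration described above.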

Experiments performed in simulation compare stimulus enhancement, emulation, mimicking, imitation and non-social learning across a large variety of situations. The results distill which mechanisms are best suited to which scenarios into a series of very useful guidelines. To validate the study, demonstrations were carried out with two robots, Jimmy and Jane. Don't miss the excellent video below for a summary of the article.

In the future, Cakmak et al. will focus on combining learning approaches to harness the full potential of this rich set of mechanisms.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory




