Robohub.org
 

Social learning


by
29 August 2010




Robots are portrayed as tomorrow's helpers, be it in schools, hospitals, workplaces or homes. Unfortunately, such robots won't be truly useful out-of-the-box because of the complexity of real-world environments and tasks. Instead, they will need to learn how to interact with objects in their environment to produce a desired outcome (affordance learning).

For this purpose, robots can explore the world while using machine learning techniques to update their knowledge. However, unguided exploration can saturate the learning process with examples of objects, actions and effects that do not help the robot achieve its goal.

In these cases, humans or other social partners can help direct robot learning (social learning). Most studies have focussed on scenarios where a teacher demonstrates how to correctly do a task. The robot then imitates the teacher by reproducing the same actions to achieve the same goals.

This approach, while very efficient, typically requires the teacher to spend time training the robot, which can be burdensome. Furthermore, the robot might become so specialized for the demonstrated scenario that it will struggle with tasks that differ even slightly. In addition, imitation only works when the teacher and robot have similar motion constraints and morphologies.

Luckily, humans and animals use a large variety of mechanisms to learn from social partners. Tapping into this reservoir, Cakmak et al. propose mechanisms where:
– robots interact with the same objects as the social partner (stimulus enhancement)
– robots try to achieve the same effect on the same object as the social partner (emulation)
– robots reproduce the same action as the social partner (mimicking)
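The differences between these mechanisms come down to which parts of a demonstration the learner copies: the object, the action, the effect, or all of them. A minimal sketch (all names, objects and actions here are hypothetical, not from the paper) of how each mechanism could bias a robot's next exploratory trial:

```python
import random

# Hypothetical demonstration the robot observed from its social partner:
# the teacher pushed the ball, and the ball rolled.
DEMO = {"object": "ball", "action": "push", "effect": "rolls"}

OBJECTS = ["ball", "box", "cup"]
ACTIONS = ["push", "lift", "tap"]

def choose_exploration(mechanism, demo, rng=random):
    """Pick the next (object, action) pair to try, biased by the mechanism."""
    if mechanism == "non-social":
        # Explore freely: any object, any action.
        return rng.choice(OBJECTS), rng.choice(ACTIONS)
    if mechanism == "stimulus enhancement":
        # Interact with the teacher's object, but try any action on it.
        return demo["object"], rng.choice(ACTIONS)
    if mechanism == "emulation":
        # Same object, searching for *an* action that reproduces the
        # teacher's effect (demo["effect"]); the action itself is free.
        return demo["object"], rng.choice(ACTIONS)
    if mechanism == "mimicking":
        # Reproduce the teacher's action, on any object.
        return rng.choice(OBJECTS), demo["action"]
    if mechanism == "imitation":
        # Copy both the object and the action, and thus the goal.
        return demo["object"], demo["action"]
    raise ValueError(f"unknown mechanism: {mechanism}")

obj, act = choose_exploration("imitation", DEMO)
print(obj, act)  # ball push
```

In this toy view, stimulus enhancement and emulation sample the same way; the difference is the success criterion (emulation judges a trial by whether it reproduces the demonstrated effect), which a fuller simulation would track separately.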

Experiments performed in simulation compare stimulus enhancement, emulation, mimicking, imitation and non-social learning in a large variety of situations. The results summarize which mechanisms are better suited for which scenarios in a series of very useful guidelines. Demonstrations with two robots, Jimmy and Jane, were done to validate the study. Don’t miss the excellent video below for a summary of the article.

In the future, Cakmak et al. will focus on combining learning approaches to harness the full potential of this rich set of mechanisms.





Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory


Subscribe to Robohub newsletter on substack






©2026.02 - Association for the Understanding of Artificial Intelligence