Robohub.org
 

Understanding social robotics

by Thosha Moodley, 24 January 2017




Pepper, Jibo and Milo make up the first generation of social robots, leading what promises to be a cohort with diverse capabilities and future applications. But what are social robots and what should they be able to do? This article gives an overview of theories that can help us understand social robotics better.



What is a social robot?

The definition I like most describes social robots as robots for which social interaction plays a key role: the robot needs social skills in order to perform its function. A survey of socially interactive robots defines some key characteristics of this type of robot. A social robot should show emotions, be able to converse at an advanced level, understand the mental models of its social partners, form social relationships, make use of natural communication cues, show personality and learn social capabilities.
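
To make that list more concrete, one could sketch these characteristics as an abstract interface that any social robot implementation would need to fill in. The class and method names below are purely my own illustration, not taken from the survey:

from abc import ABC, abstractmethod

class SocialRobot(ABC):
    """Hypothetical interface collecting the capabilities listed in the survey."""

    @abstractmethod
    def express_emotion(self, emotion: str) -> None:
        """Display an emotional state (e.g. on a face or screen)."""

    @abstractmethod
    def converse(self, utterance: str) -> str:
        """Hold a reasonably advanced dialogue with a partner."""

    @abstractmethod
    def model_partner(self, partner_id: str) -> dict:
        """Maintain a mental model of the social partner."""

    @abstractmethod
    def use_natural_cues(self) -> None:
        """Employ gaze, gesture and other natural communication cues."""

    @abstractmethod
    def learn_social_skill(self, example: object) -> None:
        """Acquire or refine social capabilities from experience."""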

Understanding Social Robots offers another interesting perspective on what a social robot is: Social robot = robot + social interface

In this definition, the robot has its own purpose outside of the social aspect. Examples include care robots, cleaning robots in our homes, service desk robots at an airport or mall information desk, or chef robots in a cafeteria. The social interface is simply a familiar protocol that makes it easy for us to communicate effectively with the robot. Social cues give us insight into the robot's intentions: shifting its gaze towards a mop, for example, signals that the robot is about to change activity, even though it might not have eyes in the classical sense.
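
One way to picture the "robot + social interface" idea is as plain composition: a task robot that pursues its own purpose, wrapped by an interface layer that translates its intentions into familiar cues such as the gaze shift mentioned above. A minimal sketch in Python, with invented names:

class CleaningRobot:
    """A task robot with its own non-social purpose."""
    def next_action(self) -> str:
        return "mop_floor"

class SocialInterface:
    """Wraps a task robot and translates its intentions into familiar cues."""
    def __init__(self, robot):
        self.robot = robot

    def announce_intention(self) -> str:
        action = self.robot.next_action()
        if action == "mop_floor":
            # Shifting "gaze" toward the mop signals the upcoming activity,
            # even if the robot has no eyes in the classical sense.
            return "turn head toward mop"
        return "idle posture"

social_robot = SocialInterface(CleaningRobot())   # social robot = robot + social interface
print(social_robot.announce_intention())          # -> "turn head toward mop"

The point of the split is that the task logic stays untouched; the social layer can be added, swapped or tuned independently.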

These indicators of social capability can be as effective as actual social abilities and drives in the robot. As studies show, children are able to project social capabilities onto simple inanimate objects like a calculator, and a puppet becomes an animated social partner during play. In the same way, robots only need the appearance of sociability to be effective communicators. An Ethical Evaluation of Human–Robot Relationships supports this idea: we have a need to belong, which leads us to form emotional connections to artificial beings and to search for meaning in these relationships.


How should social robots look?

Masahiro Mori proposed the Uncanny Valley theory in a 1970 paper. He describes the effects of a robot's appearance and movement on our affinity for it. In general, our affinity grows as robots look more human and less machine-like. But there is a point at which a robot looks both human-like and robot-like, making it hard for us to categorise. This is the Uncanny Valley: the robot's appearance is very human but still slightly 'wrong', which makes us uncomfortable. Once a robot's appearance gets past that point and looks convincingly human, likeability rises sharply again.
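
The relationship Mori describes is often drawn as affinity rising with human likeness, dipping sharply into the valley, and then recovering as the appearance becomes near-perfectly human. The curve below is purely illustrative, with invented coefficients; it is not data from Mori's paper:

import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0.0, 1.0, 200)   # 0 = clearly mechanical, 1 = fully human
# Illustrative curve: a gentle rise, a dip around likeness ~0.8, then a steep recovery.
affinity = likeness - 0.8 * np.exp(-((likeness - 0.8) ** 2) / 0.005)

plt.plot(likeness, affinity)
plt.xlabel("human likeness")
plt.ylabel("affinity (illustrative)")
plt.title("Sketch of the Uncanny Valley")
plt.show()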

Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley (2) reports a similar relationship between a robot's appearance and how trustworthy it seems, and robots that showed more positive emotions were also more likeable. So more human-looking robots would seem to invite more trust and likeability.

Up to this point, we have assumed that social robots look either humanoid or recognisably robotic. But what other forms can they take? At a minimum, a social robot should have a face, which gives it an identity and makes it an individual. A face also lets the robot indicate attention and imitate its social partner, improving communication. Most non-verbal cues are relayed through the face, and a face creates an expectation of how to engage with the robot.

A robot's appearance can also help set people's expectations of what it should be capable of, and limit those expectations to a few focused functions that are easier to achieve. A bartender robot, for example, is expected to hold a good conversation, serve drinks and take payment, but it is probably fine if it speaks only one language; it only has to fit the context it is in.

[Image: robots at CES 2017]

In Why Every Robot at CES Looks Alike, we learn that Jibo's oversized, round head is designed to mimic the proportions of a young animal or human, making it more endearing. It has a single eye to avoid triggering the Uncanny Valley effect by looking too human and too robotic at the same time. Appearing too human-like also creates the impression that the robot will respond like a human, which robots are not yet capable of.

Another interesting example is Robin, a Nao robot used to teach children with diabetes how to manage their illness (6). The children are told that Robin is a toddler, and they use this role to explain away any imperfections in Robin's speech.


Different levels of social interaction for robots

A survey of socially interactive robots offers some useful concepts for defining levels of social behaviour in robots (a simple encoding of this scale is sketched after the list):

  • Socially evocative: Do not show any social capabilities themselves, but rely on the human tendency to project social capabilities onto them.
  • Social interface: Mimic social norms, without actually being driven by them.
  • Socially receptive: Understand social input enough to learn by imitation, but do not seek social contact.
  • Sociable: Have social drives and seek social contact.
  • Socially situated: Can function in a social environment and can distinguish between social and non-social entities.
  • Socially embedded: Are aware of social norms and patterns.
  • Socially intelligent: Show human levels of social understanding and awareness based on models of human cognition.
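
Because these levels form an ordered scale, they could be encoded as an ordered enumeration, for example to tag robots in an evaluation study. The names and example below are a hypothetical sketch, not part of the survey:

from enum import IntEnum

class SocialLevel(IntEnum):
    """Ordered levels of social behaviour, following the survey's taxonomy."""
    SOCIALLY_EVOCATIVE = 1    # relies on humans projecting social capability
    SOCIAL_INTERFACE = 2      # mimics social norms without being driven by them
    SOCIALLY_RECEPTIVE = 3    # learns by imitation but does not seek contact
    SOCIABLE = 4              # has social drives and seeks contact
    SOCIALLY_SITUATED = 5     # distinguishes social from non-social entities
    SOCIALLY_EMBEDDED = 6     # aware of social norms and interaction patterns
    SOCIALLY_INTELLIGENT = 7  # human-level social understanding

# Example: a robot with only the appearance of sociability can still be an
# effective communicator, even if it sits low on this scale.
jibo_level = SocialLevel.SOCIAL_INTERFACE
print(jibo_level >= SocialLevel.SOCIABLE)   # False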

Clearly, social behaviour is nuanced and complex. But, to come back to the earlier point, social robots can still be effective without reaching the highest levels of social accomplishment.


Effect of social robots on us

To close, de Graaf poses a thought-provoking question (4):

How will we share our world with these new social technologies and how will a future robot society change who we are, how we act and interact—not only with robots but also with each other?

It seems that we will first and foremost shape robots according to our own social patterns and needs. But we cannot help being changed, as individuals and as a society, when we eventually add a more sophisticated layer of robotic social partners to our world.




Thosha Moodley is a robotics and artificial intelligence enthusiast...




