
Would you feel sorry for a simulated robot? Study shows people empathize more with the real thing

In the future, robots will be part of our daily lives: in homes, schools, offices, factories, even restaurants. The field of social robotics looks at how these robots can interact with people socially, just as people interact with each other. This makes robots easier to understand and more comfortable to be around. Cute, cuddly robots like Paro even help sick people feel better and reduce loneliness by creating empathy. But as anyone who has worked with them knows, robots are very hard to program: they fall over, their vision is terrible, and they are easy to break. They are also very expensive: US$10,000 buys a reasonably cheap robot. Because of these challenges, researchers often use software simulations of robots instead of real robots.

Using computer graphics and motion simulation, on-screen robots can be quite convincing and similar to the real thing, and researchers sometimes use them to study how people interact with robots. Clearly, simulations have limitations compared to the real thing; for one, people cannot touch a simulated robot. But what about simpler cases where people just converse with the robot or work together on a mental task? Do people interact with a simulated robot the same way they would with a real one? Would they feel empathy for a simulated robot in the same way they do for a real robot?

We conducted a study to investigate this question, according to the following scenario:

  1. A person gets to know an intelligent robot’s personality while collaboratively playing Sudoku.
  2. The robot gets a virus and starts acting increasingly strangely.
  3. The robot expresses fear of being reset and losing its memory.
  4. The robot is erased by the researcher, loses its memory (poor thing!), and re-introduces itself with a new voice.

This scenario is similar to a common movie cliché, where a protagonist’s friend is harmed by an unpopular third party (in our case, the researcher). Based on prior work in this area, we expected that people would feel sorry for the robot and empathize with it, much as they would for an animal or a pet. We validated our study design by comparing it against a case where the robot neither gets a virus nor gets reset, and found that people did indeed empathize more with the unfortunate robot that loses its memory.

We ran this study twice: once with a real robot, and once with an identical simulated robot with the same look, same voice, same actions, and same conversation. We found that people rated their empathy about 10% higher for the real robot than for the simulated one. So, if you are investigating social interactions with your robot, be wary of using simulations – the interaction may not be quite the same as the real thing.
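For readers who like numbers: below is a minimal, purely illustrative Python sketch of how a between-condition comparison like this could be run. The ratings in it are made-up placeholder values, not data from our study, and this is not the exact analysis reported in our paper.

    import numpy as np
    from scipy import stats

    # Hypothetical per-participant empathy ratings (e.g., on a 1-to-7 scale).
    # These numbers are placeholders for illustration only, not study data.
    real_robot = np.array([5.5, 6.0, 4.5, 6.5, 5.0, 5.5, 6.0])
    simulated_robot = np.array([5.0, 5.5, 4.0, 5.5, 4.5, 5.0, 5.5])

    # Welch's two-sample t-test: do the mean ratings differ between conditions?
    t_stat, p_value = stats.ttest_ind(real_robot, simulated_robot, equal_var=False)

    print(f"real: {real_robot.mean():.2f}, simulated: {simulated_robot.mean():.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")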

Like any interesting result, our work raises more questions than it answers. Why does this difference exist? Is it as simple as people knowing the difference between real and fake things? Are we simply used to seeing bad things happen in virtual worlds? Would this difference appear for positive empathy as well? Will it hold for other kinds of robots? What other social interactions are different with simulated robots? Moving forward, our team is continuing to look into these questions!

One final point – looking back, the really hard part of this work was creating a study design that is convincing, believable to participants, and actually makes people feel sorry for the robot. This was much harder than we expected, and we ended up collaborating with a local creative company, ZenFri Inc.

So – which would you feel more sorry for: a simulated robot, or a real one? If you want to learn more about our process, or any of our results, please check out our paper or visit our project homepage.


Stela H. Seo, Denise Geiskkovitch, Masayuki Nakane, Corey King, and James E. Young. “Poor Thing! Would You Feel Sorry for a Simulated Robot? A Comparison of Empathy toward a Physical and a Simulated Robot.” In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’15), Portland, Oregon, USA.



 





Stela Seo specializes in human-robot interaction at the University of Manitoba.

Denise Geiskkovitch is studying human-robot interaction at Georgia Tech.

Masayuki Nakane studies human-robot interaction at the University of Manitoba.

Corey King is co-founder of ZenFri Inc.

Jim Young is an Assistant Professor at the University of Manitoba.












 











