Robohub.org
 

Evaluating the effectiveness of robot behaviors in human-robot interactions


26 November 2014




This post is part of our ongoing efforts to make the latest papers in robotics accessible to a general audience.

Robots that interact with everyday users may need a combination of speech, gaze, and gesture behaviors to convey their message effectively. This is similar to human-human interactions except that every behavior the robot displays must be designed and programmed ahead of time. In other words, designers of robot applications must understand how each of these behaviors contributes to the robot’s effectiveness so that they can determine which behaviors must be included in the application’s design.

To this end, the latest paper by Huang and Mutlu in Autonomous Robots presents a method that designers can use to determine which behaviors should be used to produce a desired effect. They illustrate the method’s use by designing and evaluating a set of narrative behaviors for a storytelling robot that might be used in educational, informational, and entertainment settings.

[Figure: the Wakamaru human-like robot coordinating speech, gaze, and gesture while narrating a story]

As an example, the figure above shows the Wakamaru human-like robot coordinating speech, gaze, and gesture to tell a story about the process of making paper. The full narration lasted approximately six minutes. One result showed that the robot’s use of pointing gestures improved the audience’s recall of story information, and quantified the size of that improvement. The impact of different gestures on the robot’s performance is further captured in the diagram shown below. Robot designers can use such a diagram to choose appropriate behaviors from a large set, or to understand the impact each behavior has on the goals of their design.
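The kind of analysis behind such a diagram can be sketched as a regression of an outcome measure (e.g. recall) on indicators of which behaviors the robot used in each condition. The sketch below is only an illustration of that general idea, not the paper’s actual method or data: the behavior names, weights, and participant scores are all invented.

```python
import numpy as np

# Hypothetical data: each row is one participant; each column flags whether
# the robot used a given behavior in that participant's condition.
# Behavior set (assumed for illustration): pointing gesture, gaze shift, beat gesture.
rng = np.random.default_rng(0)
n = 40
X = rng.integers(0, 2, size=(n, 3)).astype(float)

# Simulated recall scores: driven mostly by pointing, slightly by gaze,
# not at all by beat gestures, plus measurement noise.
true_weights = np.array([2.0, 0.5, 0.0])
recall = 5.0 + X @ true_weights + rng.normal(0.0, 0.3, size=n)

# Fit a linear model (recall ~ intercept + behavior indicators) by least squares.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, recall, rcond=None)

# Each fitted coefficient estimates one behavior's contribution to recall.
for name, c in zip(["intercept", "pointing", "gaze", "beat"], coef):
    print(f"{name}: {c:+.2f}")
```

On this fabricated data, the fitted coefficient for pointing dominates, which is the shape of the finding reported above: the per-behavior estimates are what a contribution diagram would visualize.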

[Figure: diagram of the impact of each gesture type on the robot’s performance]
For more information, you can read the paper “Multivariate evaluation of interactive robot systems” (Chien-Ming Huang and Bilge Mutlu, Autonomous Robots, Springer US, August 2014) or ask questions below!





Autonomous Robots Blog Latest publications in the journal Autonomous Robots (Springer).




