David Robert on “Do robots need heads?”


by David Robert
15 August 2013



As a robot animator I can attest to the fact that robots don’t “need” heads to be treated as social entities. Research has shown that people will befriend a stick as long as it moves properly [1].

We have a long-standing habit of anthropomorphizing things that aren’t human, attributing to them human-level personality traits or internal motivations based on cognitive-affective architectures that just aren’t there. Animators have long relied on the audience’s willingness to suspend disbelief and, in essence, co-animate things into existence: from a sack of flour to a magic broom. It’s possible to harness the user’s willingness to bring a robot to life by setting expectations appropriately and being acutely aware of how the context of interaction shapes possible outcomes.

In humans, the head circumscribes the face, whereas on a robot a face can be placed anywhere. High degree-of-freedom (DOF) robot heads, although wonderfully complex, have facial actuators that can be challenging to orchestrate with sufficient timing precision. If your robot design supports expression through careful control of the quality of its rendered motion, a head isn’t necessary to communicate essential non-verbal cues; what matters is providing some means of revealing the robot’s internal state. A robot’s intentions can be conveyed through expressive motion and sound regardless of its form or body configuration.
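To make that idea concrete, here is a minimal, hypothetical sketch (not from the article) of how a headless robot’s internal state might be surfaced through motion quality alone. The affect model, parameter names, and numeric ranges are assumptions chosen purely for illustration, not a description of any particular system.

```python
# Illustrative sketch: map a simple (valence, arousal) affect estimate onto
# motion-quality parameters an animator might tune. All names and ranges here
# are assumptions for the sake of the example.

from dataclasses import dataclass


@dataclass
class MotionStyle:
    """Parameters that change how a gesture 'feels' when played back."""
    speed: float       # playback-rate multiplier for the gesture
    amplitude: float   # how large the movement is, 0..1
    smoothness: float  # 1.0 = fluid easing, 0.0 = abrupt, jerky motion


def style_from_affect(valence: float, arousal: float) -> MotionStyle:
    """Map valence and arousal, each in [-1, 1], to motion quality.

    Higher arousal -> faster, larger motion; lower valence -> jerkier motion.
    """
    # Clamp inputs so a noisy state estimator can't produce out-of-range styles.
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))

    speed = 1.0 + 0.5 * arousal                      # 0.5x (calm) to 1.5x (excited)
    amplitude = 0.5 + 0.5 * (arousal + 1.0) / 2.0    # 0.5 to 1.0
    smoothness = 0.5 + 0.5 * (valence + 1.0) / 2.0   # 0.5 to 1.0
    return MotionStyle(speed=speed, amplitude=amplitude, smoothness=smoothness)


if __name__ == "__main__":
    # A startled, unhappy robot moves fast, big, and jerky...
    print(style_from_affect(valence=-0.8, arousal=0.9))
    # ...while a content, relaxed one moves slowly and fluidly.
    print(style_from_affect(valence=0.7, arousal=-0.6))
```

In practice an animator would tune such mappings by eye rather than by formula, and the same style parameters could just as well modulate sound (tempo, pitch) as motion.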

[1] Harris, J., & Sharlin, E. (2011). Exploring the affect of abstract motion in social human-robot interaction. In Proceedings of the 2011 IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 441–448. IEEE.

 





David Robert






