People’s interactions with machines – from robots that throw tantrums when they lose a colour-matching game against a human opponent to bionic limbs that could give us extra abilities – are not just revealing more about how our brains are wired; they are also altering them.
Emily Cross is a professor of social robotics at the University of Glasgow in Scotland who is examining the nature of human-robot relationships and what they can tell us about human cognition.
She defines social robots as machines designed to engage with humans on a social level – from online chatbots to machines with a physical presence, such as those that check guests into hotels.
According to Prof. Cross, robots make excellent tools for shedding light on how our brains work because, unlike humans, whose behaviour varies, they can be programmed to perform and replicate specific behaviours.
‘The central tenets to my questions are, can we use human-robot interaction to better understand the flexibility and fundamental mechanisms of social cognition and the human brain,’ she said.
Brain imaging shows that a sad, happy or neutral robotic expression will engage the same parts of the brain as a human face with similar expressions.
Through their project called Social Robots, Prof. Cross and her team are using neural decoding techniques to probe the extent to which human feelings towards a robot change depending on how it behaves.
Tantrums
When the robots used in the project lose a game, they alternate between throwing tantrums and appearing dejected. ‘So far, people actually find it really funny when the robot gets angry,’ she said. ‘But people do respond to them quite strongly and that’s really interesting to see.’
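To give a flavour of what neural decoding involves (this is a generic sketch, not the Social Robots team’s actual pipeline), a pattern classifier can be trained to tell apart brain responses to the two robot behaviours; the data shapes, the tantrum/dejected class labels and the use of scikit-learn below are all illustrative assumptions.

# Minimal neural-decoding sketch (illustrative only; data shapes, class
# labels and classifier choice are assumptions, not the project's pipeline).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical data: one row of voxel activations per trial.
n_trials, n_voxels = 120, 500
X = rng.normal(size=(n_trials, n_voxels))   # simulated fMRI patterns
y = rng.integers(0, 2, size=n_trials)       # 0 = tantrum trial, 1 = dejected trial

# If the classifier predicts the robot's behaviour from the brain pattern
# at above-chance accuracy, the two responses are distinguishable in the
# measured signal.
decoder = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(decoder, X, y, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f} (chance = 0.50)")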
Having robots as colleagues has been shown to affect humans in complex ways. Researchers at the University of Washington found that when soldiers used robots in bomb disposal, they developed emotional attachments towards them and felt frustration, anger or sadness if their robot was destroyed.
Prof. Cross says that from an evolutionary perspective, this doesn’t make sense. ‘We care about people and perhaps animals that might help us or hurt us,’ she said. ‘But with machines it’s a bit more of a mystery and understanding how far we can push that (to develop social relationships with machines) is a really, really fascinating question.’
It’s important to understand these dynamics since, as she points out, robots are already working as companions in nursing homes and even as tutors in early childhood education. Home care and education are prime areas of social robotics research, with R&D efforts focusing on young children and on adults living with dementia.
Ten-hour rule
Typically, studies on such groups observe interactions over a relatively short time-span. They rarely exceed what Prof. Cross describes as a ten-hour rule, beyond which study participants tend to get bored of their robotic toys. But her team is looking at how feelings towards robots evolve over time.
As part of the project, the researchers send a palm-sized Cozmo robot home with study participants and instruct them to interact with it every day for a week by playing games or introducing it to their friends and pets. The participants’ brains are imaged at the start and end of that period to track changes.
‘If we’re going to have robots in our home environment, if they’re going to be in our schools teaching our kids across weeks, if not years, if they’re going to be peoples’ social companions, we want to know a lot more than just what happens after ten hours’ (of exposure),’ she said.
‘We want to know how people’s social bonds and relationships to robots change across many, many more hours.’
With such technologies set to become a bigger part of our future, other studies are investigating how the brain reacts to a different kind of robot – wearable robotic limbs that augment the body, providing extra abilities.
Wearables could have social and healthcare benefits. For instance, a third arm could help surgeons carry out procedures more safely without relying on human assistants, let people finish their household chores much faster, or assist construction workers.
But even as the technology’s capabilities develop apace, Dr Tamar Makin, a neuroscientist at University College London, UK, is exploring what it would take for the brain to accept and operate a robotic appendage as part of the body, through a five-year project called Embodied Tech.
Additional thumb
To understand how the brain deals with an extra body part, Dr Makin’s team asks participants to wear an additional opposable thumb for a week. Created by designer Dani Clode, the thumb is controlled by pressure sensors worn on the big toes.
[Video: product designer Dani Clode’s prosthetic opposable thumb, worn as an extra digit. Credit: Dani Clode]
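As a rough illustration of that control scheme (not a detail of Clode’s actual design – the sensor ranges, the linear mapping and the two degrees of freedom below are assumptions), toe pressure might be mapped to thumb movement like this:

# Illustrative sketch of the kind of pressure-to-actuator mapping the
# article describes; ranges, names and the linear mapping are assumptions,
# not Dani Clode's actual design.

def pressure_to_angle(pressure: float,
                      p_min: float = 0.0,
                      p_max: float = 1.0,
                      angle_range_deg: float = 90.0) -> float:
    """Map a normalised toe-pressure reading to a thumb flexion angle."""
    p = min(max(pressure, p_min), p_max)  # clamp noisy sensor readings
    return (p - p_min) / (p_max - p_min) * angle_range_deg

# Hypothetical control loop: one toe drives flexion, the other adduction,
# so pressing harder moves the thumb further.
for left_toe, right_toe in [(0.1, 0.0), (0.5, 0.3), (0.9, 0.8)]:
    print(f"flexion={pressure_to_angle(left_toe):.0f} deg, "
          f"adduction={pressure_to_angle(right_toe):.0f} deg")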
With the additional thumb, the augmented hand almost has the capabilities of two hands, giving people extra capacity to carry out actions. The question is what effect that has on the brain.
The study is still underway but preliminary results indicate that the presence of an extra thumb alters the brain’s internal map of what the biological hand looks like. Scans show that the brain represents the fingers as collapsing onto each other, away from the thumb and index finger.
This mirrors what happens in conditions such as dystonia, in which the representation of the fingers begins to merge – for instance, when musicians overuse their fingers – causing cramp-like pain. The same effect could theoretically cause pain in the wearer of an extra thumb.
‘One important interim message we have is that there are potential costs, not just benefits, to using augmentation technology,’ said Dr Makin.
She believes that because human augmentation is so new there are many unanswered questions, but that it is vital to explore the challenges of wearable robotics in order to fully realise its promises, such as multitasking or safer working conditions.
‘I feel like we have a responsibility to gain a much better understanding of how having good control of an additional body part is going to change the representation of the body parts you already have.’
The research in this article was funded by the European Research Council.