The science of human touch – and why it’s so hard to replicate in robots


24 December 2025




By Perla Maiolino, University of Oxford

Robots now see the world with an ease that once belonged only to science fiction. They can recognise objects, navigate cluttered spaces and sort thousands of parcels an hour. But ask a robot to touch something gently, safely or meaningfully, and the limits appear instantly.

As a researcher in soft robotics working on artificial skin and sensorised bodies, I’ve found that trying to give robots a sense of touch forces us to confront just how astonishingly sophisticated human touch really is.

My work began with the seemingly simple question of how robots might sense the world through their bodies. Develop tactile sensors, fully cover a machine with them, process the signals and, at first glance, you should get something like touch.

Except that human touch is nothing like a simple pressure map. Our skin contains several distinct types of mechanoreceptor, each tuned to different stimuli such as vibration, stretch or texture. Our spatial resolution is remarkably fine and, crucially, touch is active: we press, slide and adjust constantly, turning raw sensation into perception through dynamic interaction.

Engineers can sometimes mimic a fingertip-scale version of this, but reproducing it across an entire soft body, and giving a robot the ability to interpret this rich sensory flow, is a challenge of a completely different order.

Working on artificial skin also quickly reveals another insight: much of what we call “intelligence” doesn’t live solely in the brain. Biology offers striking examples – most famously, the octopus.

Octopuses distribute most of their neurons throughout their limbs. Studies of their motor behaviour show an octopus arm can generate and adapt movement patterns locally based on sensory input, with limited input from the brain.

Their soft, compliant bodies contribute directly to how they act in the world. And this kind of distributed, embodied intelligence, where behaviour emerges from the interplay of body, material and environment, is increasingly influential in robotics.

Touch also happens to be the first sense that humans develop in the womb. Developmental neuroscience shows tactile sensitivity emerging from around eight weeks of gestation, then spreading across the body during the second trimester. Long before sight or hearing function reliably, the foetus explores its surroundings through touch. This is thought to help shape how infants begin forming an understanding of weight, resistance and support – the basic physics of the world.

This distinction matters for robotics too. For decades, robots have relied heavily on cameras and lidars (a sensing method that uses pulses of light to measure distance) while avoiding physical contact. But we cannot expect machines to achieve human-level competence in the physical world if they rarely experience it through touch.

Simulation can teach a robot useful behaviour, but without real physical exploration, it risks merely deploying intelligence rather than developing it. To learn in the way humans do, robots need bodies that feel.

A ‘soft’ robot hand with tactile sensors, developed by the University of Oxford’s Soft Robotics Lab, gets to grips with an apple. Video: Oxford Robotics Institute.

One approach my group is exploring is giving robots a degree of “local intelligence” in their sensorised bodies. Humans benefit from the compliance of soft tissues: skin deforms in ways that increase grip, enhance friction and filter sensory signals before they even reach the brain. This is a form of intelligence embedded directly in the anatomy.

Research in soft robotics and morphological computation argues that the body can offload some of the brain’s workload. By building robots with soft structures and low-level processing, so they can adjust grip or posture based on tactile feedback without waiting for central commands, we hope to create machines that interact more safely and naturally with the physical world.
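The idea of low-level processing in a sensorised body can be illustrated with a toy reflex loop. This is a minimal sketch, not the lab's actual control code: the function name, thresholds and gain values are all hypothetical, chosen only to show how a fingertip-local rule could react to tactile feedback within a single control cycle, without waiting for central commands.

```python
# A minimal sketch of "local intelligence": a reflex loop that adjusts
# grip force from tactile feedback alone. All names and thresholds here
# are illustrative assumptions, not from the article or the lab's code.

def local_grip_reflex(pressure_readings, grip_force,
                      slip_threshold=0.2, max_force=10.0):
    """Nudge grip force up when average contact pressure drops
    (a slip-like signal) and ease off when pressing too hard."""
    avg = sum(pressure_readings) / len(pressure_readings)
    if avg < slip_threshold:            # contact weakening: object may slip
        grip_force = min(grip_force * 1.1, max_force)
    elif avg > 0.8 * max_force:         # pressing harder than needed
        grip_force *= 0.9
    return grip_force

# Each fingertip could run this locally, reacting to slip in one
# control cycle rather than routing everything through a central planner.
force = 2.0
for step in range(5):
    force = local_grip_reflex([0.10, 0.15, 0.12], force)
```

The point of the sketch is architectural rather than numerical: the decision is made where the sensing happens, which is one way a body can offload work from the "brain".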

Occupational therapist Ruth Alecock uses the training robot ‘Mona’. Perla Maiolino/Oxford Robotics Institute, CC BY-NC-SA

Healthcare is one area where this capability could make a profound difference. My group recently developed a robotic patient simulator for training occupational therapists (OTs). Students often practise on one another, which makes it difficult to learn the nuanced tactile skills involved in supporting someone safely. With real patients, trainees must balance functional and affective touch, respect personal boundaries and recognise subtle cues of pain or discomfort. Research on social and affective touch shows how important these cues are to human wellbeing.

To help trainees understand these interactions, our simulator, known as Mona, produces practical behavioural responses. For example, when an OT presses on a simulated pain point in the artificial skin, the robot reacts verbally and with a small physical “hitch” of the body to mimic discomfort.

Similarly, if the trainee tries to move a limb beyond what the simulated patient can tolerate, the robot tightens or resists, offering a realistic cue that the motion should stop. By capturing tactile interaction through artificial skin, our simulator provides feedback that has never previously been available in OT training.
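The reactions described above can be thought of as simple threshold rules mapping tactile input to a behavioural response. The sketch below is a hypothetical illustration of that idea only: the function, threshold values and response strings are invented for explanation and do not describe Mona's actual implementation.

```python
# Toy sketch of threshold-based behavioural responses in a patient
# simulator. All values and reaction labels are illustrative assumptions.

def simulated_patient_response(joint_angle_deg, skin_pressure_kpa,
                               tolerance_deg=45.0, pain_point_kpa=30.0):
    """Return the simulator's reactions to a trainee's manipulation:
    pressing a pain point triggers a verbal/physical cue, and moving a
    limb past its tolerance triggers resistance."""
    reactions = []
    if skin_pressure_kpa > pain_point_kpa:
        reactions.append("verbal complaint and body hitch")
    if joint_angle_deg > tolerance_deg:
        reactions.append("limb stiffens and resists")
    return reactions or ["no reaction"]

# A trainee moves a limb past its simulated tolerance:
print(simulated_patient_response(50.0, 10.0))
```

In practice such rules would be driven by signals from the artificial skin and joint encoders, but even this toy version shows how tactile interaction can be turned into trainable feedback.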

Robots that care

In the future, robots with safe, sensitive bodies could help address growing pressures in social care. As populations age, many families suddenly find themselves lifting, repositioning or supporting relatives without formal training. “Care robots” could help with such tasks, potentially allowing a family member to be cared for at home for longer.

Surprisingly, progress in developing this type of robot has been much slower than early expectations suggested – even in Japan, which introduced some of the first care robot prototypes. One of the most advanced examples is Airec, a humanoid robot developed as part of the Japanese government’s Moonshot programme to assist in nursing and elderly-care tasks. This multifaceted programme, launched in 2019, seeks “ambitious R&D based on daring ideas” in order to build a “society in which human beings can be free from limitations of body, brain, space and time by 2050”.

Japan’s Airec care robot is one of the most advanced in development. Video by Global Update.

Throughout the world, though, translating research prototypes into regulated robots remains difficult. High development costs, strict safety requirements, and the absence of a clear commercial market have all slowed progress. But while the technical and regulatory barriers are substantial, they are steadily being addressed.

Robots that can safely share close physical space with people need to feel and modulate how they touch anything that comes into contact with their bodies. This whole-body sensitivity is what will distinguish the next generation of soft robots from today’s rigid machines.

We are still far from robots that can handle these intimate tasks independently. But building touch-enabled machines is already reshaping our understanding of touch. Every step toward robotic tactile intelligence highlights the extraordinary sophistication of our own bodies – and the deep connection between sensation, movement and what we call intelligence.

This article was commissioned in conjunction with the Professors’ Programme, part of Prototypes for Humanity, a global initiative that showcases and accelerates academic innovation to solve social and environmental challenges. The Conversation is the media partner of Prototypes for Humanity 2025.

Perla Maiolino, Associate Professor of Engineering Science, member of the Oxford Robotics Institute, University of Oxford

This article is republished from The Conversation under a Creative Commons license. Read the original article.




The Conversation is an independent source of news and views, sourced from the academic and research community and delivered direct to the public.

