Robohub.org
 

Touchy subject: 3D printed fingertip ‘feels’ like human skin


07 April 2022




Robotic hand with a 3D-printed tactile fingertip on the little (pinky) finger. The rigid white back of the fingertip is covered with the flexible black 3D-printed skin.

Machines can beat the world’s best chess player, but they cannot handle a chess piece as well as an infant. This lack of robot dexterity is partly because artificial grippers lack the fine tactile sense of the human fingertip, which is used to guide our hands as we pick up and handle objects.

Two papers published in the Journal of the Royal Society Interface give the first in-depth comparison of an artificial fingertip with neural recordings of the human sense of touch. The research was led by Nathan Lepora, Professor of Robotics & Artificial Intelligence in the University of Bristol’s Department of Engineering Mathematics, based at the Bristol Robotics Laboratory.

“Our work helps uncover how the complex internal structure of human skin creates our human sense of touch. This is an exciting development in the field of soft robotics – being able to 3D-print tactile skin could create robots that are more dexterous or significantly improve the performance of prosthetic hands by giving them an in-built sense of touch,” said Professor Lepora.

Cut-through section of the 3D-printed tactile skin. The white plastic is a rigid mount for the flexible black rubber skin. Both parts are made together on an advanced 3D-printer. The ‘pins’ on the inside of the skin replicate the dermal papillae formed inside human skin.

Professor Lepora and colleagues created the sense of touch in the artificial fingertip using a 3D-printed mesh of pin-like papillae on the underside of the compliant skin, which mimic the dermal papillae found between the outer epidermal and inner dermal layers of human tactile skin. The papillae are made on advanced 3D-printers that can mix together soft and hard materials to create complicated structures like those found in biology.

“We found our 3D-printed tactile fingertip can produce artificial nerve signals that look like recordings from real, tactile neurons. Human tactile nerves transmit signals from various nerve endings called mechanoreceptors, which can signal the pressure and shape of a contact. Classic work by Phillips and Johnson in 1981 first plotted electrical recordings from these nerves to study ‘tactile spatial resolution’ using a set of standard ridged shapes used by psychologists. In our work, we tested our 3D-printed artificial fingertip as it ‘felt’ those same ridged shapes and discovered a startlingly close match to the neural data,” said Professor Lepora.
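The kind of comparison described above, matching an artificial tactile signal profile against a reference neural recording as a fingertip is swept over a ridged shape, can be illustrated with a toy sketch. Everything below is a synthetic stand-in: the ridge geometry, the "neural" profile, the Gaussian blur standing in for thicker skin, and the correlation score are illustrative assumptions, not the authors' method or data.

```python
import numpy as np

# Synthetic stand-in data only: illustrates comparing a tactile signal
# profile against a reference recording over a ridged grating.

positions = np.linspace(0, 10, 200)   # mm along a hypothetical grating
ridge_period = 2.0                    # mm between ridges (assumed)

# Stand-in "neural" firing-rate profile: peaks over ridge edges.
reference = np.abs(np.sin(np.pi * positions / ridge_period))

# Stand-in artificial-fingertip signal: the same profile, slightly
# smoothed and noisy, mimicking how a thicker skin blurs fine detail.
kernel = np.exp(-0.5 * np.linspace(-2, 2, 21) ** 2)
kernel /= kernel.sum()
artificial = np.convolve(reference, kernel, mode="same")
artificial += 0.02 * np.random.default_rng(0).normal(size=positions.size)

# Pearson correlation as a simple similarity score between the profiles.
r = np.corrcoef(reference, artificial)[0, 1]
print(f"correlation: {r:.2f}")
```

In this sketch, heavier smoothing (a wider kernel) lowers the correlation, which loosely mirrors the loss of fine spatial detail the researchers attribute to the thicker 3D-printed skin.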

“For me, the most exciting moment was when we looked at our artificial nerve recordings from the 3D-printed fingertip and they looked like the real recordings from over 40 years ago! Those recordings are very complex with hills and dips over edges and ridges, and we saw the same pattern in our artificial tactile data,” said Professor Lepora.

While the research found a remarkably close match between the artificial fingertip and human nerve signals, the fingertip was not as sensitive to fine detail. Professor Lepora suspects this is because the 3D-printed skin is thicker than real skin, and his team is now exploring how to 3D-print structures at the microscopic scale of human skin.

“Our aim is to make artificial skin as good – or even better – than real skin,” said Professor Lepora.



University of Bristol is one of the most popular and successful universities in the UK.