Robohub.org
 

Lifehand 2 prosthetic grips and senses like a real hand


12 March 2014



Amputee Dennis Aabo Sørensen wearing sensory feedback enabled prosthetic in Rome, February 2013 (Lifehand 2, Patrizia Tocci).

Roboticists and doctors working in Switzerland and Italy have together developed a bionic hand that provides sensory feedback in real time, giving an amputee back the ability to feel and modulate grip like someone with a “real” hand. Using a combination of surgically implanted electrodes (connected at one end to the nervous system and at the other to sensors) and an algorithm to convert the signals, the team has produced a hand that sends information back to the brain so detailed that the wearer could even tell the hardness of the objects he was given to hold.

In a paper published in Science Translational Medicine in Feb. 2014, the team from EPFL (Switzerland) and SSSA (Italy), headed by Prof. Micera of EPFL and NCCR-Robotics, presented an entirely new type of prosthetic hand, Lifehand 2, capable of interfacing with the wearer’s nervous system to restore the ability to grip and sense like a real hand – including the ability to feel the shape and hardness of an object. Such a life-like sensation of feeling has never before been achieved in prosthetics.

Combining the basic concepts of TMR (targeted muscle reinnervation) – a new but increasingly sophisticated technique in which severed nerves are rerouted to remaining muscles, such as those of the chest, so that stimulating the reinnervated area can evoke a sense of touch – with new robotic techniques, the team allowed an amputee to feel as though his old hand was back during a series of 700 tests.

Electrodes implanted in the arm are connected to sensors via an external processing unit. (Lifehand 2, Patrizia Tocci).

In order for the hand to function, a looped sensory feedback mechanism is employed. The first step of this mechanism involves transverse intrafascicular multichannel electrodes (TIMEs) being surgically implanted into the median and ulnar nerves in the arm – the nerves that carry sensation from the palm and fingers. The electrodes are then connected to a number of sensors distributed across the prosthetic hand in locations that mimic the locations of tendons in a real hand. The signals from the sensors are relayed to an external unit, where they are processed before being passed back to the nerves in a format that allows the brain to understand how much pressure is being exerted on the sensors, much as information is passed from a real hand to the brain.
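The feedback step described above can be pictured as a simple mapping from sensor force to stimulation amplitude. The sketch below is purely illustrative: the function names, force thresholds, current range, and linear mapping are my assumptions, not the study's actual signal-processing pipeline, which the article does not specify at this level of detail.

```python
# Hypothetical sketch of one iteration of the sensory feedback loop:
# read forces from the hand's sensors, convert each to a nerve-stimulation
# current. All thresholds and ranges are invented for illustration.

def force_to_stim_amplitude(force_n, f_min=0.1, f_max=10.0,
                            i_min_ua=40.0, i_max_ua=200.0):
    """Map a fingertip force reading (newtons) to a stimulation
    current (microamps), clipped to a safe range."""
    if force_n <= f_min:
        return 0.0                      # below threshold: no stimulation
    f = min(force_n, f_max)             # clip to the sensor's usable range
    scale = (f - f_min) / (f_max - f_min)
    return i_min_ua + scale * (i_max_ua - i_min_ua)

def feedback_step(sensor_forces):
    """One loop iteration: compute a stimulation amplitude for every
    sensor channel on the hand."""
    return {name: force_to_stim_amplitude(f)
            for name, f in sensor_forces.items()}

# Example: a firm grip on the index finger, a light brush on the thumb.
commands = feedback_step({"index_tip": 2.5, "thumb_tip": 0.05})
```

A harder grip thus produces a proportionally stronger stimulation, which is what lets the brain interpret the signal as pressure.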

Residual muscular signals are then taken from five positions on the forearm and used to decode the intentions of the user, and a power source activates the hand in one of four different grasping motions.
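One common way to decode grasp intent from a handful of muscle-signal (EMG) channels is to extract a simple amplitude feature per channel and match the resulting vector against per-grasp templates. The sketch below is an assumption-laden illustration of that idea: the grasp names, centroid profiles, and nearest-centroid approach are mine, not the decoder actually used in the study.

```python
# Illustrative nearest-centroid decoder for grasp intent from five
# forearm electrode sites. Centroids and grasp names are invented.
import math

GRASPS = ["power", "pinch", "lateral", "open"]

# Hypothetical mean-activation profile per grasp (one value per site).
CENTROIDS = {
    "power":   [0.9, 0.8, 0.7, 0.6, 0.5],
    "pinch":   [0.2, 0.9, 0.8, 0.1, 0.1],
    "lateral": [0.1, 0.2, 0.9, 0.8, 0.2],
    "open":    [0.1, 0.1, 0.1, 0.1, 0.1],
}

def rms(window):
    """Root-mean-square amplitude of one electrode's sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def decode_grasp(emg_windows):
    """Pick the grasp whose centroid is closest (squared Euclidean
    distance) to the RMS feature vector from the five sites."""
    feats = [rms(w) for w in emg_windows]
    def dist(grasp):
        return sum((a - b) ** 2 for a, b in zip(feats, CENTROIDS[grasp]))
    return min(GRASPS, key=dist)
```

For example, a recording whose per-site amplitudes resemble the "pinch" profile would be decoded as a pinch, and the hand's motor would then be driven into that grasp.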

Results of the 700 tests performed during the study were positive, showing that even when the subject was tested under audio and visual deprivation he was able to modify his grip according to what he was holding – a soft object such as a sponge or an orange was naturally held with a softer grip than a hard object such as a cricket ball. By the end of four weeks of testing, 93% accuracy was recorded for pressure control in the index finger, and it was found that the brain automatically assimilated data from multiple sensors in the palm of the hand. The subject was able to distinguish the stiffness of an object within 3 seconds, a response comparable to that of a natural hand, and said: “When I held an object, I could feel if it was soft or hard, round or square.”

For many years medical scientists have used prosthetics to replace the lost hands of amputees. Over time, these hands have become more and more sophisticated, using materials that are lighter, more flexible and that look more and more realistic. Such prosthetics are considered invaluable by some amputees but do not have a high uptake within the community due to issues with use and comfort – if a prosthetic is heavy, uncomfortable and cannot move like a real hand it might not be practical for everyday life.

While prosthetics are nothing new, the idea of a technique that allows sensors, motors and human nerves to communicate so seamlessly with each other in real time would have been unthinkable a few years ago. That this has now been developed into a method allowing the brain to assimilate impulses from two different areas – so that the subject can feel the neurologically complex action of palm closure – is something exceptional.

Lifehand 2 allows the amputee to sense the difference between hard objects, like a glass, and soft objects, like a human hand, and adjust his grip appropriately. (Lifehand 2, Patrizia Tocci).

Another particularly encouraging aspect of the study was its level of success despite eight years having passed since the chosen subject’s hand was amputated. The team were concerned that, given the long timescale involved, the nerves would be too degraded to use. The tests indicated that this was not the case, opening up the technology to a wide range of potential users.

As with any other science or technology project, the work is still ongoing. Although the functionality of the hand has been demonstrated to be lifelike, the appearance of a hand made of plastics, iron, tendons and electrical circuits is not, so the team is working on creating a sensitive polymer skin to make the appearance of the hand more realistic and more usable in everyday life situations. In a parallel line of work, in order for the hand to be made portable the electronics required for the sensory feedback system must be miniaturized so that they can be implanted into the prosthetic.

 


