
Lifehand 2 prosthetic grips and senses like a real hand


by NCCR Robotics
12 March 2014



Amputee Dennis Aabo Sørensen wearing the sensory-feedback-enabled prosthetic in Rome, February 2013 (Lifehand 2, Patrizia Tocci).

Roboticists and doctors working in Switzerland and Italy have come together to develop a bionic hand that provides sensory feedback in real time, meaning that an amputee can be given back the ability to feel and modify grip like someone with a “real” hand. Using a combination of surgically implanted electrodes (connected at one end to the nervous system, and at the other end to sensors) and an algorithm to convert signals, the team has produced a hand that sends information back to the brain that is so detailed that the wearer could even tell the hardness of objects he was given to hold.

In a paper published in Science Translational Medicine in February 2014, the team from EPFL (Switzerland) and SSSA (Italy), headed by Prof. Micera of EPFL and NCCR Robotics, presented an entirely new type of prosthetic hand, Lifehand 2, capable of interfacing with the wearer's nervous system to restore the ability to grip and sense like a real hand – including feeling the shape and hardness of an object. This lifelike sense of touch had never before been achieved in prosthetics.

The approach combines the basic concepts of targeted muscle reinnervation (TMR) – a relatively new but increasingly sophisticated technique in which nerves are rerouted to the chest and can convey a sense of touch when the reinnervated skin is stimulated – with new robotic techniques. Over a series of 700 tests, this allowed an amputee to feel as though his own hand were back.

Electrodes implanted in the arm are connected to sensors via an external processing unit. (Lifehand 2, Patrizia Tocci).

In order for the hand to function, a looped sensory feedback mechanism is employed. First, transverse intrafascicular multichannel electrodes (TIMEs) are surgically implanted into the median and ulnar nerves in the arm – the nerves that serve the sensory fields of the palm and fingers. The electrodes are then connected to a number of sensors distributed across the prosthetic hand in locations that mimic those of the tendons in a real hand. The signals from the sensors are relayed to an external unit, where they are processed before being passed back to the nerves in a form that allows the brain to understand how much pressure is being exerted on the sensors, much as information is passed from a real hand to the brain.
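To make this loop concrete, here is a minimal sketch of how such a sensor-to-nerve feedback cycle might be structured in code. The function names (read_tendon_sensors, stimulate_nerve), the stimulation range, and the update rate are illustrative assumptions, not the published Lifehand 2 implementation.

```python
import time

# Assumed safe stimulation range in microamps (illustrative values only).
MIN_CURRENT_UA, MAX_CURRENT_UA = 10.0, 200.0

def pressure_to_stimulation(pressure, p_min=0.0, p_max=1.0):
    """Map a normalised tendon-tension reading onto a stimulation amplitude."""
    p = min(max(pressure, p_min), p_max)          # clamp to the calibrated range
    scale = (p - p_min) / (p_max - p_min)
    return MIN_CURRENT_UA + scale * (MAX_CURRENT_UA - MIN_CURRENT_UA)

def feedback_loop(read_tendon_sensors, stimulate_nerve, period_s=0.01):
    """Run the sensor -> processing -> nerve stimulation loop at roughly 100 Hz.

    read_tendon_sensors: callable returning {channel: normalised pressure}
    stimulate_nerve: callable delivering a current to one electrode channel
    """
    while True:
        readings = read_tendon_sensors()
        for channel, pressure in readings.items():
            amplitude = pressure_to_stimulation(pressure)
            stimulate_nerve(channel, amplitude)   # felt by the wearer as pressure
        time.sleep(period_s)
```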

Residual muscular signals are then recorded from five positions on the forearm and used to decode the intentions of the user, and a power source drives the hand through one of four different grasping motions.
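As a rough illustration of this decoding step, the sketch below turns a short window of five-channel muscle activity into simple features and passes them to a pre-trained classifier that selects one of four grasps. The feature set, grasp labels, and classifier interface are assumptions for illustration, not the method used in the study.

```python
import numpy as np

# Four grasping motions; the labels are assumptions for illustration.
GRASPS = ["power_grip", "pinch_grip", "open_hand", "rest"]

def emg_features(window):
    """window: array of shape (n_samples, 5), one column per forearm electrode site."""
    return np.concatenate([
        np.mean(np.abs(window), axis=0),   # mean absolute value per channel
        np.std(window, axis=0),            # signal variability per channel
    ])

def decode_grasp(window, classifier):
    """Predict the intended grasp for one EMG window with a pre-trained classifier."""
    features = emg_features(window).reshape(1, -1)
    return GRASPS[int(classifier.predict(features)[0])]
```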

Results of the 700 tests performed during the study were positive, showing that even under audio and visual deprivation the subject was able to modify his grip according to what he was holding – so a soft object such as a sponge or an orange was naturally held with a softer grip than a hard object such as a cricket ball. By the end of four weeks of testing, 93% accuracy was recorded for pressure control in the index finger, and it was found that the brain automatically assimilated data from multiple sensors in the palm of the hand. The subject was able to distinguish the stiffness of an object within three seconds, a response time comparable to that of a natural hand, and said: “When I held an object, I could feel if it was soft or hard, round or square.”

For many years medical scientists have used prosthetics to replace the lost hands of amputees. Over time, these hands have become more and more sophisticated, using materials that are lighter, more flexible and that look more and more realistic. Such prosthetics are considered invaluable by some amputees but do not have a high uptake within the community due to issues with use and comfort – if a prosthetic is heavy, uncomfortable and cannot move like a real hand it might not be practical for everyday life.

While prosthetics themselves are not new, the idea of a technique that allows sensors, motors and human nerves to communicate with one another in real time would have been unthinkable only a few years ago. That this has now been developed into a method that supports the brain's ability to assimilate impulses from two different areas – letting the subject feel the neurologically complex action of palm closure – is something exceptional.

Lifehand 2 allows the amputee to sense the difference between hard objects, like a glass, and soft objects, like a human hand, and adjust his grip appropriately. (Lifehand 2, Patrizia Tocci).

Another particularly encouraging aspect of the study was the level of success despite it being 8 years since the chosen subject’s hand was amputated. The team were concerned that because of the long timescales involved, the nerves would be too degraded for use. However, the tests indicated that this was not the case, thus opening up the technology to a wide range of potential users.

As with any other science or technology project, the work is still ongoing. Although the functionality of the hand has been shown to be lifelike, the appearance of a hand made of plastics, iron, tendons and electrical circuits is not, so the team is working on a sensitive polymer skin to make the hand look more realistic and more usable in everyday life. In a parallel line of work, the electronics required for the sensory feedback system must be miniaturized so that they can be implanted into the prosthetic, making the hand fully portable.

 


