Robohub.org
 

This robotic prosthetic hand can be made for just $1000


by Joel Gibbard
11 September 2013




The Dextrus hand is a robotic hand that can be put together for well under £650 ($1000) and offers much of the functionality of a human hand. Existing prosthetic hands are magnificent devices, capable of providing a large amount of dexterity using a simple control system. The problem is that they cost somewhere between £7,000 and £70,000 ($11,000-$110,000), far too much for most people to afford, especially in developing countries. Through the Open Hand Project, an open source project with the goal of making robotic prosthetic hands more accessible to amputees, a fully functional prototype has already been developed. An Indiegogo campaign is currently underway to provide funds for refining and testing the design.

In order to broaden the reach of prosthetic devices, I decided to create a low-cost prosthetic hand while in my final year at the University of Plymouth. I ended up creating a fully functioning prototype that went on to win three awards for innovation and excellence.

[Image: The Dextrus prosthetic hand]

The Open Hand Project tackles a number of problems to bring the cost down. One of these is the customized nature of prosthetic devices. Usually they need to be custom-fitted to the user's remaining arm, which can rack up medical bills with consultations and fittings. The Dextrus hand connects directly to an NHS-fitted passive prosthesis. This means no additional custom fitting and no extra cost.

The Dextrus hand works much like a human hand. It uses electric motors instead of muscles and steel cables instead of tendons. 3D printed plastic parts work like bones and a rubber coating acts as the skin. All of these parts are controlled by electronics to give it a natural movement that can handle all sorts of different objects. It uses stick-on electrodes to read signals from the user's remaining muscles; these signals control the hand, telling it to open or close.
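The article does not describe the control firmware itself, but the open/close behaviour above can be illustrated with a minimal threshold sketch on the muscle signal. This is an illustrative sketch only, not the Open Hand Project's actual code; read_emg, set_grip and the threshold values are hypothetical placeholders.

    import time

    CLOSE_THRESHOLD = 0.6   # normalised muscle-signal level above which the hand closes (assumed value)
    RELEASE_LEVEL = 0.3     # level below which the hand re-opens; hysteresis avoids chattering

    def read_emg() -> float:
        """Placeholder for reading a rectified, smoothed electrode signal in the range 0..1."""
        raise NotImplementedError("replace with a real electrode/ADC read")

    def set_grip(closed: bool) -> None:
        """Placeholder for driving the finger motors that pull the steel-cable tendons."""
        print("closing hand" if closed else "opening hand")

    def control_loop() -> None:
        closed = False
        while True:
            level = read_emg()
            # Hysteresis: close only on a strong signal, re-open only once the signal drops well below it.
            if not closed and level > CLOSE_THRESHOLD:
                closed = True
                set_grip(closed)
            elif closed and level < RELEASE_LEVEL:
                closed = False
                set_grip(closed)
            time.sleep(0.02)  # roughly 50 Hz control loop

In practice the thresholds would be tuned per user, since the strength of the muscle signal varies from person to person.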

[Image: Dextrus prosthetic hand prototype]

Using 3D printing offers vast benefits to the project: the user can select any colour, it's easy to switch from a right hand to a left hand, and parts can be re-printed with ease should anything break.

To take this project to the next level, I need to design and prototype the rest of the electronics and build everything onto printed circuit boards. The design of the hand needs to be refined and tested to make sure that it’s robust and functional as well as aesthetically pleasing.

This is why I’m using crowd-funding to raise the money to support the project and have launched a campaign on Indiegogo. If the crowd-funding campaign is successful, the money will go towards funding the project for an entire year. This will include prototyping PCB designs, materials for additional prototypes of the whole hand and equipment for assembling the electronics. Since I’ll be working full time on this, some of it will also go towards a modest salary to keep me going for the year.





Joel Gibbard is a roboticist living in Bristol, UK.




