Learning hand motions from humans


by Sabine Hauert
08 November 2011




Although the human hand has many degrees of freedom, we use only a small fraction of the configurations they allow. For example, we rarely move the last two joints of a finger independently. The anthropomorphic robot hand below is similar: it has many degrees of freedom, and planning a motion that considers all of them typically takes a long time. To address this, Rosell et al. propose observing the motions humans actually make and using that information to restrict the motions the robot hand needs to consider.

The Stäubli TX 90 industrial robot fitted with the Schunk Anthropomorphic Hand.

To learn about human hand motion, they fitted a human subject with a sensorized glove and recorded the resulting movements, which were then mapped into robot joint coordinates. Using Principal Component Analysis (PCA), they extracted the most important motion patterns from these recordings. By combining these principal motions with a planner that keeps the arm and hand from colliding with the environment or with their own parts, the robot can perform human-like motions with little computation.
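The pipeline described above — learn the principal motion directions with PCA, then sample and plan in that reduced space — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the synthetic data, the number of components, and the planner hooks are all assumptions.

```python
import numpy as np

# Hypothetical stand-in for glove recordings already mapped to robot joint
# coordinates: an (n_samples, n_joints) array of joint angles.
rng = np.random.default_rng(0)
recordings = rng.normal(size=(500, 13))

# --- PCA: extract the principal motion directions ---
mean = recordings.mean(axis=0)
centered = recordings - mean
# The rows of Vt are the principal components of the centered data.
_, singular_values, Vt = np.linalg.svd(centered, full_matrices=False)

k = 3                      # keep the few components explaining most variance
components = Vt[:k]        # shape (k, n_joints)
variance = singular_values**2 / (singular_values**2).sum()
print(f"variance explained by {k} components: {variance[:k].sum():.2%}")

# --- Sampling in the reduced space ---
def sample_hand_configuration():
    """Draw a random point in the k-dimensional PCA space and map it
    back to the full joint space: q = mean + z @ components."""
    z = rng.normal(scale=singular_values[:k] / np.sqrt(len(recordings)))
    return mean + z @ components

def in_collision(q):
    """Placeholder: a real planner would query a collision checker for
    the arm, the hand, and the environment here."""
    return False

# A sampling-based planner (e.g. a PRM or RRT) would now explore only
# human-like configurations instead of all 13 joints independently.
samples = [q for q in (sample_hand_configuration() for _ in range(100))
           if not in_collision(q)]
```

Planning in the small PCA space rather than the full 13-joint space is what keeps the computation cheap: the planner only ever visits configurations that resemble recorded human motions.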

The approach was validated both in simulation and on a four-fingered anthropomorphic mechanical hand (17 joints, 13 of them independent degrees of freedom) mounted on an industrial robot (6 independent degrees of freedom).




Sabine Hauert is President of Robohub and Associate Professor at the Bristol Robotics Laboratory
