Robohub.org
 

Robots with tact


Picture: Adobe Stock/shurkin_son

Artificial hands, even the most sophisticated prostheses, still fall far short of human hands. What they lack are the tactile abilities crucial for dexterity. Other challenges include linking sensing to action within the robotic system, and linking it effectively to the human user. Prof. Dr. Philipp Beckerle from FAU has joined international colleagues to summarize the latest findings in this field of robotics and to establish an agenda for future research. Their piece in the research journal Science Robotics suggests a sensorimotor control framework for haptically enabled robotic hands, inspired by principles of the human central nervous system. Their aim is to link tactile sensing to movement in human-centred, haptically enabled artificial hands. According to the European and American team of researchers, this approach promises improved dexterity for humans controlling robotic hands.

Tactile sensing needs to play a bigger role

“Human manual dexterity relies critically on touch”, explains Prof. Dr. Philipp Beckerle, head of FAU’s Chair of Autonomous Systems and Mechatronics (ASM). “Humans with intact motor function but insensate fingertips can find it very difficult to grasp or manipulate things.” This, he says, indicates that tactile sensing is necessary for human dexterity. “Bioinspired design suggests that lessons from human haptics could enhance the currently limited dexterity of artificial hands. But robotic and prosthetic hands make little use of the many tactile sensors nowadays available and are hence much less dexterous.”

Beckerle, a mechatronics engineer, has just had the paper “A hierarchical sensorimotor control framework for human-in-the-loop robotic hands” published in the research journal Science Robotics. In it, he and his international colleagues outline how advanced technologies now provide not only mechatronic and computational components for anthropomorphic limbs, but also sensing ones. The scientists therefore suggest that such recently developed tactile sensing technologies could be incorporated into a general concept of “electronic skins”. “These range from dense arrays of normal-force-sensing tactile elements to fingertips with a more comprehensive force perception”, the paper reads. “This would provide a directional force-distribution map over the entire sensing surface, and complex three-dimensional architectures, mimicking the mechanical properties and multimodal sensing of human fingertips.” Tactile sensing systems mounted on mechatronic limbs could therefore provide robotic systems with the complex representations needed to characterize, identify and manipulate objects, for example.
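To give a sense of what such a directional force-distribution map might look like in software, here is a minimal sketch. It is not taken from the paper; the grid size, units and choice of summary features are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical example (not from the paper): an e-skin patch read out as a
# 16 x 16 grid of taxels, each reporting a 3D contact force vector
# (shear x, shear y, normal z) in newtons. Grid size and units are assumptions.
H, W = 16, 16
force_map = np.zeros((H, W, 3))
force_map[6:10, 6:10, 2] = 0.3    # a small contact patch pressing on the skin
force_map[6:10, 6:10, 0] = 0.05   # slight tangential (shear) load on that patch

def summarize_contact(force_map):
    """Reduce a directional force-distribution map to a few grasp-relevant features."""
    normal = force_map[..., 2]
    total_normal = normal.sum()
    # Contact centroid: where on the sensing surface the load is concentrated.
    ys, xs = np.indices(normal.shape)
    cy = (ys * normal).sum() / max(total_normal, 1e-9)
    cx = (xs * normal).sum() / max(total_normal, 1e-9)
    # Shear-to-normal ratio: a crude indicator of incipient slip.
    shear = np.linalg.norm(force_map[..., :2], axis=-1).sum()
    return {"total_normal_N": float(total_normal),
            "contact_centroid": (float(cy), float(cx)),
            "shear_to_normal": float(shear / max(total_normal, 1e-9))}

print(summarize_contact(force_map))
```

Summaries of this kind (total load, contact location, slip risk) are the sort of compact representation a grasp controller could act on, rather than the raw taxel array.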

Human principles as inspiration for future designs

To achieve haptically informed and dexterous machines, the researchers secondly propose taking inspiration from the principles of the hierarchically organised human central nervous system (CNS). The CNS controls which signals the brain receives from the tactile senses and which it sends back to the body. The authors propose a conceptual framework in which a bioinspired touch-enabled robot shares control with the human, to a degree that the human sets. Principles of the framework include parallel processing of tasks, integration of feedforward and feedback control, and a dynamic balance between subconscious and conscious processing. These could be applied not only in the design of bionic limbs, but also in that of virtual avatars or remotely navigated telerobots.
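A toy sketch in the spirit of that framework, assuming a single grip-force loop: the human sets a sharing level, a fast reflex-like feedback loop stands in for the subconscious layer, and the two commands are blended. The controller structure, parameter names and gains below are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class SharedGripController:
    alpha: float = 0.7          # human-set sharing level: 1.0 = fully human-driven, 0.0 = fully autonomous
    target_force: float = 2.0   # desired grip force in newtons (assumed)
    kp: float = 0.5             # feedback gain of the low-level reflex loop

    def command(self, human_cmd, measured_force, feedforward=0.0):
        # Low-level loop: feedforward term plus tactile feedback correction,
        # loosely analogous to fast, subconscious reflex control.
        reflex_cmd = feedforward + self.kp * (self.target_force - measured_force)
        # High-level blending: the human's conscious command is mixed with the
        # autonomous reflex according to the human-chosen sharing level.
        return self.alpha * human_cmd + (1.0 - self.alpha) * reflex_cmd

ctrl = SharedGripController(alpha=0.6)
print(ctrl.command(human_cmd=1.5, measured_force=1.2))
```

Here `alpha` plays the role of the human-set degree of shared control, while the reflex term stands in for the lower, subconscious level of the hierarchy running in parallel with conscious commands.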

Yet another challenge is to effectively interface a human user with touch-enabled robotic hands. “Enhancing haptic robots with high-density tactile sensing can substantially improve their capabilities but raises questions about how best to transmit these signals to a human controller and how to navigate shared perception and action in human-machine systems”, the paper reads. It remains largely unclear how to manage agency and task assignment so as to maximize utility and user experience in human-in-the-loop systems. “Particularly challenging is how to exploit the varied and abundant tactile data generated by haptic devices. Yet, human principles provide inspiration for the future design of mechatronic systems that can function like humans, alongside humans, and even as replacement parts for humans.”
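One naive strategy for the transmission problem, shown only as an assumption-laden sketch rather than a method from the paper, is to pool a dense taxel array down to the handful of channels a wearable haptic display (for example, a few vibrotactile actuators) can actually render:

```python
import numpy as np

def pool_to_channels(normal_force_map, n_rows=2, n_cols=2):
    """Average-pool an H x W normal-force map into an n_rows x n_cols grid of feedback channels."""
    H, W = normal_force_map.shape
    channels = normal_force_map.reshape(n_rows, H // n_rows, n_cols, W // n_cols).mean(axis=(1, 3))
    # Normalise to [0, 1] as a stand-in for actuator intensity commands.
    peak = channels.max()
    return channels / peak if peak > 0 else channels

# Hypothetical 16 x 16 taxel array with contact in the upper-right region.
taxels = np.zeros((16, 16))
taxels[2:6, 10:14] = 0.4
print(pool_to_channels(taxels))  # strongest feedback on the corresponding channel
```

Simple pooling like this preserves rough contact location but discards most of the rich tactile detail, which is exactly the trade-off the authors identify as an open question.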

Philipp Beckerle’s Chair is part of both FAU’s Department of Electrical Engineering, Electronics and Information Technology and its Department of Artificial Intelligence in Biomedical Engineering. “Our mission at ASM is to research human-centric mechatronics and robotics and to strive for solutions that combine the desired performance with user-friendly interaction properties”, Beckerle explains. “Our focus is on wearable systems such as prostheses or exoskeletons, cognitive systems such as collaborative or humanoid robots, and generally on tasks with close human-robot interaction. The human factors are crucial in such scenarios in order to meet the user’s needs and to achieve a synergetic interface as well as interaction between humans and machines.”

Apart from Prof. Dr. Beckerle, scientists from the Universities of Genoa, Pisa and Rome, Aalborg, Bangor and Pittsburgh, as well as Imperial College London and the University of Southern California, Los Angeles, contributed to the paper.




Friedrich-Alexander-Universität Erlangen-Nürnberg






 
