Robohub.org
 

Control strategies for active lower extremity prosthetics and orthotics


04 February 2015




Knee orthosis as worn by first author Mike Tucker (photo: ReLab, ETHZ and Alain Herzog).

Much has been made of the numerous advances in robotic prosthetics and orthotics (P/O) in recent years, yet how to control these devices so that they act in accordance with the user's intention remains an open challenge for roboticists.

A team from four labs within NCCR Robotics, spanning ETH Zurich and EPFL (ReLab and SMS at ETH Zurich; LSRO and CNBI at EPFL), has recently published a joint paper in the Journal of NeuroEngineering and Rehabilitation, in which 10 experts from the field review the state of the art in control approaches for active lower limb P/Os. They argue that for P/Os to be fully viable and to advance further, the control system must be treated as part of a framework in which it becomes integrated with the user's sensorimotor system.


Traditionally, the fields of orthotics and prosthetics have been viewed separately, with hardware and controllers developed for a specific part of the body (e.g. the knee, ankle, or hip). By taking a broad survey that covers research on all joints of the lower limb across the different fields, rather than just a small subset, the authors hope that future developments can blur the lines between fields and create technologies that ultimately restore walking to people with physical or neurological impairments.

This open access review pieces together the current state of the art and the work that still needs to be done, providing valuable background on the field.

The authors of the paper have been working to improve communication between research groups and to promote a more holistic approach to P/O devices. One of the co-authors is organizing next year's Cybathlon, in which teams made up of bionic technology developers and a pilot will compete in one of six races. The competition's ultimate aim is to increase discussion between academia, industry, and end users through friendly competition.





NCCR Robotics
