Robohub.org
 

Long-term control of brain-computer interfaces by users with locked-in syndrome


28 August 2017




Using brain-computer interfaces (BCIs) to give people with locked-in syndrome reliable communication and control capabilities has long been a trope of medical dramas and science fiction. A team from NCCR Robotics and CNBI, EPFL, has recently published a paper detailing work that takes a step towards bringing this technique into the everyday lives of those affected by extreme paralysis.

BCIs measure brainwaves using sensors placed on the outside of the head. With careful training and calibration, these brainwaves can be used to infer the intention of the person they are recorded from. However, one of the challenges of using BCIs in everyday life is that BCI performance varies over time. This issue is particularly important for motor-restricted end-users, who usually experience even larger fluctuations in their brain signals and, consequently, in their performance. One approach to tackling this issue is shared control, in which the machine assists the user in executing commands; so far, shared control for BCIs has mostly relied on predefined settings that provide a fixed level of assistance.

The team tackled this performance variation by developing a system that dynamically matches the level of assistance to the user's evolving capabilities. The key element of this adaptive shared control framework is that it incorporates the user's brain state and signal reliability while the user is trying to deliver a BCI command.

The team tested their novel strategy with one person with incomplete locked-in syndrome, multiple times over the course of a year. The participant was asked to imagine moving the right hand to trigger a "right" command, and the left hand for a "left" command, in order to control an avatar in a computer game. They demonstrated how adaptive shared control can exploit an estimate of BCI performance (measured as command delivery time) to adjust the level of assistance online by regulating the game's speed. Remarkably, performance remained stable over several months without recalibration of either the BCI classifier or the performance estimator.
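The adaptive loop described above can be sketched as follows. This is a hedged illustration of the general idea only, not the published system: the class name, the exponential-moving-average estimator, the target time, and the clamping range are all assumptions made for the example.

```python
# Illustrative sketch (assumed details, not the published system): adapt the
# game speed from a running estimate of how long the user needs to deliver a
# motor-imagery command. Longer estimated delivery times -> slower game
# (more assistance); shorter times -> faster game (less assistance).

class AdaptiveSpeed:
    def __init__(self, target_time: float = 2.0, smoothing: float = 0.2):
        self.target_time = target_time  # desired command delivery time (s)
        self.smoothing = smoothing      # weight given to each new observation
        self.estimate = target_time     # running delivery-time estimate

    def update(self, delivery_time: float) -> float:
        """Record one observed delivery time and return a new speed factor."""
        # Exponential moving average keeps the estimate stable across trials.
        self.estimate = ((1 - self.smoothing) * self.estimate
                         + self.smoothing * delivery_time)
        # Factor > 1 speeds the game up, < 1 slows it down.
        speed = self.target_time / self.estimate
        return max(0.5, min(2.0, speed))  # clamp to a safe range

ctrl = AdaptiveSpeed()
for t in (3.5, 3.0, 2.8):      # the user is slow in this session...
    speed = ctrl.update(t)
print(round(speed, 2))          # ...so the game slows down (factor < 1)
```

Because the speed factor is recomputed from recent evidence rather than fixed in advance, the same mechanism eases off again when the user's signals improve, which is the behaviour the study's long-term results rely on.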

This work marks the first time this design has been successfully tested with an end-user with incomplete locked-in syndrome, and it replicates the results of earlier tests with able-bodied subjects.

 

Reference:

S. Saeedi, R. Chavarriaga and J. del R. Millán, "Long-Term Stable Control of Motor-Imagery BCI by a Locked-In User Through Adaptive Assistance," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 4, pp. 380-391, 2017.



tags:


NCCR Robotics

            AUAI is supported by:



Subscribe to Robohub newsletter on substack





©2026.02 - Association for the Understanding of Artificial Intelligence