Robohub.org
 

Towards independence: A shared control BCI telepresence robot


29 June 2015



Telepresence robots give the disabled a sense of independence (Photo: Alain Herzog, EPFL)

For those with extreme mobility problems, such as paralysis following spinal cord injury or neurological disease, telepresence can greatly help to offset social isolation. However, controlling a mobile telepresence device through obstacles like doorways can be difficult when fine motor skills have been compromised. Researchers from CNBI, EPFL and NCCR Robotics this week published a cunning solution that uses brain-computer interfaces (BCIs) to enable patients to share control with the robot, making it far easier to navigate.

The idea is for patients to use a BCI to remotely control a mobile telepresence robot, through which they can move around and communicate with friends over a bidirectional audio/video connection.

The study tested nine end users and ten healthy participants using a BCI and telepresence robot. The aim was to study both the BCI configuration and the shared control of the robot when the two were used in tandem.

The system used a non-invasive BCI: a cap placed over the sensorimotor cortex recorded brain activity via 16 electroencephalogram (EEG) channels, which was transmitted to the robot over Skype. Healthy participants, working from home, were asked to remotely manoeuvre the robot through the CNBI lab along a set path with obstacles (such as doorways and tables) under four separate conditions: BCI with shared control, BCI without shared control, manual with shared control, and manual without shared control. End users were tested under the two BCI conditions (with and without shared control).
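To make the control loop concrete, here is a minimal, purely illustrative sketch (not the study's actual software; all names and constants are assumptions) of how a motor-imagery BCI's classifier output might be turned into discrete driving commands: per-window probabilities are smoothed over time, and a turn is issued only once the accumulated evidence crosses a confidence threshold.

```python
# Illustrative sketch of BCI command decoding. A classifier emits, for each
# EEG window, a probability that the user is imagining a left-hand movement
# (values near 1) versus a right-hand movement (values near 0). Evidence is
# accumulated with exponential smoothing, and a command is issued only when
# it becomes confident, which keeps spurious single-window errors from
# steering the robot.

def decode_command(probabilities, threshold=0.8):
    """Map a stream of P(left-imagery) values to 'left', 'right', or 'forward'."""
    evidence = 0.5  # start undecided
    alpha = 0.3     # smoothing factor for evidence accumulation
    for p in probabilities:
        evidence = (1 - alpha) * evidence + alpha * p
        if evidence > threshold:
            return "left"
        if evidence < 1 - threshold:
            return "right"
    return "forward"  # no confident decision: keep going straight
```

With this kind of accumulation, a run of consistent classifier outputs is needed before the robot turns, which is one reason fewer, coarser commands suffice.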

https://youtu.be/I1KvtNhv9Yc

In the shared control condition, the robot uses its sensors to avoid obstacles without instruction from the driver. This has two advantages:

Firstly, inaccuracies in the system are reduced. For example, if a participant instructs the robot to turn left, this could mean a hard turn or a gentle turn; the system alone cannot tell which. With shared control, if the user wishes to travel through a doorway to the left, they simply instruct the robot to turn in that direction, and the robot uses its sensors to avoid the doorframe and pass safely through.

The second advantage is that less information needs to be communicated from the user, thus reducing the cognitive workload. In shared control, participants were able to complete tasks in shorter time periods and with fewer commands.
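The blending described above can be sketched as follows. This is a hypothetical minimal policy, not the controller from the paper: the user supplies only a coarse steering intent, and a repulsive term computed from proximity sensor readings is added in, so nearby obstacles such as doorframes nudge the robot away without any extra user commands. All function names, parameters, and constants here are illustrative assumptions.

```python
import math

def shared_control(user_turn, sensor_readings, safe_dist=0.5, gain=1.5):
    """Return (linear_speed, angular_speed) for the robot base.

    user_turn:        coarse steering intent from the BCI, in [-1, 1]
    sensor_readings:  list of (angle_rad, distance_m) obstacle detections,
                      with positive angles on the robot's left
    """
    avoid = 0.0
    nearest = float("inf")
    for angle, dist in sensor_readings:
        nearest = min(nearest, dist)
        if dist < safe_dist:
            # Steer away from close obstacles: an obstacle on the left
            # (angle > 0) pushes the turn command to the right, and the
            # push grows as the obstacle gets closer.
            avoid -= gain * math.copysign(1.0, angle) * (safe_dist - dist)
    angular = max(-1.0, min(1.0, user_turn + avoid))
    # Slow down near obstacles so corrections happen at low speed.
    linear = 0.2 if nearest < safe_dist else 0.5
    return linear, angular
```

Because the avoidance term is computed on board, the user only needs to communicate high-level intent, which is what reduces both the command count and the cognitive workload.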

All 19 participants were able to pilot the robot after training, with no discernible difference in ability between disabled and able-bodied users. “Each of the nine subjects with disabilities managed to remotely control the robot with ease after less than 10 days of training,” said Prof. José del R. Millán, who was in charge of the study.

Where possible, patients were also allowed to control the robot through small residual movements, such as the head leans someone might make while playing a video game, or pressing a button with the head. Remarkably, these movements proved no more effective at controlling the robot than the information transmitted over the BCI alone.

Reference:

R. Leeb, L. Tonin, M. Rohm, L. Desideri, T. Carlson and J. Millán, “Towards Independence: A BCI Telepresence Robot for People with Severe Motor Disabilities,” Proceedings of the IEEE, vol. 103, no. 6, pp. 969-982, June 2015.






 

©2025.05 - Association for the Understanding of Artificial Intelligence


 











