Robohub.org
 

Virtual race: Competing in Brain Computer Interface at Cybathlon


05 October 2016




This week the world’s first Cybathlon will take place in Zurich, Switzerland. Cybathlon is the brainchild of NCCR Robotics co-director and ETH Zurich professor Robert Riener, and is designed to facilitate discussion between academics, industry and end users of assistive aids, to promote the position of people with disabilities within society, and to push the development of assistive technology towards solutions that are suitable for use all day, every day.

In our privileged position as presenting sponsor, we are also proud to have NCCR Robotics represented by two teams: in the Brain Computer Interface (BCI) race by the team EPFL Brain Tweakers, and in the Powered Arm Prosthesis Race by the team LeMano.

The BCI race is a virtual race in which the pilots use BCIs to control an avatar running through a computer game. The pilots may only use their thoughts; no other inputs (e.g. head movements) will affect the actions of the avatar. The Brain Tweakers are a team of researchers representing the Chair in Brain-Machine Interface (CNBI) laboratory, led by Prof. José del R. Millán at the Swiss Federal Institute of Technology (EPFL), Lausanne, and NCCR Robotics, so they jumped at the chance to participate in a race that plays to their experience and expertise. Indeed, the focus of their research is on the direct use of human brain signals to control devices and interact with the environment around the user.

At Cybathlon, the Brain Tweakers will race with two pilots, 30-year-old Numa Poujouly and 48-year-old Eric Anselmo, both of whom have been practicing over the summer in weekly and bi-weekly two-hour sessions. The BCI system that they will use translates brain patterns, as captured in real time by Electroencephalography (EEG, a completely safe, non-invasive and minimally obtrusive approach), into game commands.

This is possible by means of what is called a Motor Imagery (MI) BCI. EEG signals (electrical activity on the user’s scalp, which occurs naturally in everyone as part of the brain’s normal function) are monitored through 16 electrodes placed at specific locations on the pilot’s head. Distinct spatio-spectral cortical patterns (patterns of brainwaves) are known to emerge and persist when someone either makes or imagines making a movement (e.g., movements of the hands, arms and/or feet). It is this pattern of brainwaves during imagined movement that makes the technology usable by, and attractive to, people with severe disabilities. These cortical patterns are not only specific to the limb being imagined, but also tend to be fairly user-specific. The first objective of the team’s training sessions this summer has therefore been to identify the MI tasks that are “optimal” (i.e., most easily distinguishable) for each of the two pilots. The signals then feed into a set of processing modules, consisting of signal processing and machine learning algorithms, that detect online, in real time, the type of movement being imagined. Each detection is straightforwardly translated into a predetermined command to the pilot’s avatar in the game (speed-up, roll, slide).
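The article does not detail the CNBI team’s actual algorithms, but the pipeline it describes (band-limited EEG features from 16 channels, a per-pilot learned classifier, a mapping from detected MI class to a game command) can be sketched in miniature. Everything below is an illustrative assumption, not the real system: the 8–30 Hz mu/beta band, the 256 Hz sampling rate, the nearest-centroid classifier (a stand-in for the real machine-learning stage), and the synthetic calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 16   # electrodes on the pilot's scalp (per the article)
FS = 256          # sampling rate in Hz (assumed)
EPOCH_S = 1.0     # one-second analysis window (assumed)

# Two MI classes mapped to two of the game's commands (illustrative).
COMMANDS = {0: "speed-up", 1: "slide"}

def bandpower_features(epoch, fs=FS, band=(8.0, 30.0)):
    """Log band-power per channel in the mu/beta band, the
    frequency range where motor-imagery patterns typically appear."""
    n = epoch.shape[1]
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2 / n
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[:, mask].mean(axis=1) + 1e-12)

def fit_centroids(epochs, labels):
    """Learn one mean feature vector per MI class from calibration
    data -- a toy stand-in for the real classifier training."""
    feats = np.array([bandpower_features(e) for e in epochs])
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}

def decode(epoch, centroids):
    """Nearest-centroid decision, translated into a game command."""
    f = bandpower_features(epoch)
    cls = min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
    return COMMANDS[cls]

def synth_epoch(cls):
    """Synthetic EEG: class 1 adds an 11 Hz rhythm on half the
    channels, mimicking a lateralised motor-imagery pattern."""
    n = int(FS * EPOCH_S)
    x = rng.standard_normal((N_CHANNELS, n))
    if cls == 1:
        t = np.arange(n) / FS
        x[:8] += 2.0 * np.sin(2 * np.pi * 11 * t)
    return x

# Calibration session: 40 labelled one-second epochs.
labels = np.array([i % 2 for i in range(40)])
epochs = [synth_epoch(c) for c in labels]
centroids = fit_centroids(np.array(epochs), labels)

# Online phase: each fresh epoch is decoded into a command.
print(decode(synth_epoch(0), centroids))
print(decode(synth_epoch(1), centroids))
```

The two-stage shape (calibrate per pilot, then decode in real time) mirrors why the team spent the summer identifying each pilot’s “optimal” MI tasks before the competition: the patterns, and therefore the learned decision rule, differ from person to person.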

While BCI technology is not in and of itself very new, what the Brain Tweakers hope will give them the winning edge is this introduction of machine learning techniques. For the Brain Tweakers, the Cybathlon provides an excellent opportunity to rapidly advance and test their research outcomes in real-world conditions, to exchange expertise and foster collaborations with other groups, and to push BCI technology out of the lab to provide practical daily service for end users in their homes.

Attend the Cybathlon in person, or watch live on the Cybathlon website to cheer on the Brain Tweakers.







NCCR Robotics






