Robohub.org
 

Brain Computer Interface used to control the movement and actions of an android robot


13 November 2012




Researchers at the CNRS-AIST Joint Robotics Laboratory and the CNRS-LIRMM Interactive Digital Human group are working on ways to control robots via thought alone.

“Basically we would like to create devices which would allow people to feel embodied in the body of a humanoid robot. To do so, we are trying to develop techniques from Brain Computer Interfaces (BCI) so that we can read people’s thoughts, and then see how far we can go from interpreting brain wave signals to transforming them into actions to be performed by the robot.”

The interface uses flashing symbols to control where the robot moves and how it interacts with the environment around it.

“Basically what you see is how, with one pattern called the SSVEP (steady-state visually evoked potential), we exploit the ability to associate flickering things with actions. This is what we call affordance: we associate actions with objects, and then we bring an object to the attention of the user. By focusing their attention on it, the user can indicate which action they would like the robot to perform, and this is then translated.”

“He is wearing a cap embedded with electrodes, and we read the electrical activity of the brain, which is transferred to this PC. A signal processing unit then classifies what the user is thinking. As you see here, there are several icons that can be associated with tasks, or an object can be recognized and made to flicker automatically. Because each target flickers at a different frequency, we can recognize which frequency the user is focusing their attention on, select that object, and, since the object is associated with a task, easily instruct the robot which task it has to perform.”
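The selection mechanism described above can be sketched in code: each on-screen object flickers at a distinct frequency, and the classifier picks whichever candidate frequency shows the strongest response in the EEG spectrum. The following is a minimal illustrative sketch, not the researchers' actual pipeline; the sampling rate, the frequency-to-task mapping, and the simple band-power classifier are all assumptions for illustration.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz
# Assumed, illustrative mapping from flicker frequency to a robot task
FREQ_TO_TASK = {6.0: "grasp bottle", 8.0: "walk forward", 10.0: "turn left"}

def band_power(signal, fs, freq, bandwidth=0.5):
    """Spectral power of `signal` in a narrow band around `freq`."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= freq - bandwidth) & (freqs <= freq + bandwidth)
    return spectrum[mask].sum()

def classify_ssvep(signal, fs, candidates):
    """Return the candidate flicker frequency with the most spectral power."""
    return max(candidates, key=lambda f: band_power(signal, fs, f))

# Simulate 2 seconds of EEG dominated by an 8 Hz flicker response plus noise.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 8.0 * t) + 0.3 * rng.standard_normal(t.size)

detected = classify_ssvep(eeg, FS, FREQ_TO_TASK)
print(FREQ_TO_TASK[detected])  # the task bound to the attended object
```

Because the object the user attends to is already bound to a task, the detected frequency alone is enough to dispatch a command to the robot, which is what makes this interface practical despite the low bandwidth of EEG.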

“And the applications targeted are for tetraplegic or paraplegic people, who could use this technology to navigate through the robot; for instance, a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.


Subscribe to Robohub newsletter on substack








 















©2026.02 - Association for the Understanding of Artificial Intelligence