Robohub.org
 

Brain Computer Interface used to control the movement and actions of an android robot


13 November 2012




Researchers at the CNRS-AIST Joint Robotics Laboratory and the CNRS-LIRMM Interactive Digital Human group are working on ways to control robots via thought alone.

“Basically, we would like to create devices that allow people to feel embodied in the body of a humanoid robot. To do so, we are trying to develop techniques from brain-computer interfaces (BCI) so that we can read people's thoughts, and then see how far we can go in interpreting brain wave signals and transforming them into actions to be performed by the robot.”

The interface uses flashing symbols to control where the robot moves and how it interacts with the environment around it.

“Basically, what you see is how one pattern, called the SSVEP (steady-state visually evoked potential), exploits the ability to associate flickering things with actions. This is what we call affordance: we associate actions with objects, then bring an object to the attention of the user, and by focussing their attention the user is able to indicate which action they would like the robot to perform, and this is then translated.”

“He is wearing a cap embedded with electrodes, and we read the electrical activity of the brain, which is transferred to this PC. A signal processing unit then classifies what the user is thinking. As you see here, there are several icons that can be associated with tasks, or the system can recognize an object, which will flicker automatically. Since each target flickers at a different frequency, we can recognize which frequency the user is focussing their attention on and select that object, and since the object is associated with a task, it is easy to instruct the robot which task it has to perform.”
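The frequency-tagging scheme described in the quote above can be sketched in a few lines: each on-screen target flickers at its own frequency, and the attended target's frequency dominates the power spectrum of EEG recorded over visual cortex. The flicker frequencies, task names, sampling rate, and function names below are invented for illustration and are not taken from the CNRS-AIST/LIRMM system.

```python
import numpy as np

# Hypothetical mapping from flicker frequency (Hz) to a robot task.
# The actual frequencies and tasks used by the researchers are not
# given in the article; these are placeholders.
FLICKER_HZ = {6.0: "grasp bottle", 8.0: "walk forward", 10.0: "turn left"}
FS = 256  # assumed EEG sampling rate in Hz

def classify_ssvep(eeg, fs=FS, candidates=FLICKER_HZ):
    """Return the task whose flicker frequency carries the most EEG power.

    eeg: 1-D array of samples from a single occipital electrode.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def band_power(f0, bw=0.5):
        # Sum power in a narrow band around the candidate frequency.
        mask = (freqs >= f0 - bw) & (freqs <= f0 + bw)
        return spectrum[mask].sum()

    best = max(candidates, key=band_power)
    return candidates[best]

# Synthetic demo: a "user" attending a target flickering at 8 Hz,
# modelled as an 8 Hz sinusoid buried in Gaussian noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1.0 / FS)
eeg = np.sin(2 * np.pi * 8.0 * t) + 0.5 * rng.standard_normal(len(t))
print(classify_ssvep(eeg))  # → walk forward
```

Real SSVEP decoders typically use multiple electrodes and more robust statistics (e.g. canonical correlation analysis), but the core idea is the same: the selected frequency identifies the selected object, and the object's affordance identifies the task.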

“The applications targeted are for tetraplegics or paraplegics to use this technology to navigate by means of the robot. For instance, a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan.”





DigInfo TV is a Tokyo-based online video news platform dedicated to producing original coverage of cutting edge technology, research and products from Japan.






 
