Robohub.org
 

Video of human controlling a quadrotor via non-invasive brain/computer interface


04 June 2013




Researchers from the University of Minnesota have developed a non-invasive brain/computer interface that allows humans to remotely control a robot (in this case, a quadrotor) using only their thoughts. The research team, led by Bin He, Professor of Biomedical Engineering, hopes this technology can one day be used to help people with speech and mobility problems.

According to research team member Karl LaFleur, “If you imagine making a fist with your right hand, it turns the robot to the right. And if you imagine making a fist with both hands, it moves the robot up.”

The beauty of this research is that no implants are required to interface with the system. Instead, an EEG cap fitted with 64 electrodes picks up the brain's electrical activity and relays it to a computer, which decodes the signals and sends the corresponding commands to the robot over Wi-Fi. This non-invasive approach to controlling assistive robotic devices is important because, while researchers have had some success using implants to control assistive systems, neural-machine connections tend to degrade over time.
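The control loop described above, where a decoded motor-imagery class (such as imagining a right-fist clench) is translated into a flight command, can be sketched roughly as follows. Note that the class labels, command fields, and values here are illustrative assumptions for the sake of the sketch, not the Minnesota team's actual implementation:

```python
# Hypothetical sketch: decoded motor-imagery class -> quadrotor command.
# Labels and command values are illustrative assumptions, not the
# research team's actual code.

# Map each imagined movement to a simple (yaw, thrust) command,
# mirroring the mapping described in the article: right fist turns
# the robot right, both fists move it up.
COMMAND_MAP = {
    "imagine_left_fist":  {"yaw": -1.0, "thrust": 0.0},   # turn left
    "imagine_right_fist": {"yaw": +1.0, "thrust": 0.0},   # turn right
    "imagine_both_fists": {"yaw": 0.0,  "thrust": +1.0},  # move up
    "rest":               {"yaw": 0.0,  "thrust": 0.0},   # hover
}

def decode_to_command(mi_class: str) -> dict:
    """Translate a classified motor-imagery label into a command dict.

    Unknown labels fall back to hover -- a sensible fail-safe for a
    brain-computer interface, where classification is inherently noisy.
    """
    return COMMAND_MAP.get(mi_class, COMMAND_MAP["rest"])

if __name__ == "__main__":
    # e.g. the classifier reports the user imagined clenching the right fist
    print(decode_to_command("imagine_right_fist"))  # {'yaw': 1.0, 'thrust': 0.0}
```

In a real system this lookup would sit downstream of an EEG classifier and upstream of the Wi-Fi link to the quadrotor; the hover fallback for unrecognized classes is the key safety-oriented design choice.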

Says Professor He: “We envision this technology will be used to control wheelchairs, artificial limbs or other devices.”



Robohub Editors

















