Robohub.org
 

Engineers build brain-controlled music player


by Daniel Faggella
04 March 2014




Imagine if playing music were as simple as looking at your laptop screen. Now it is, thanks to Kenneth Camilleri and his team of researchers from the Department of Systems and Control Engineering and the Centre for Biomedical Cybernetics at the University of Malta, who have developed a music player that can be controlled by the human brain.

Camilleri and his team have been studying brain responses for ten years. Now they have found one that is optimal for controlling a music player using eye movements. The system was originally developed to improve the quality of life of individuals with severely impaired motor abilities such as those with motor neuron disease or cerebral palsy.

The technology works by reading two key features of the user’s nervous system: the nerves that trigger muscular movement in the eyes and the way that the brain processes vision. The user controls the music player simply by looking at a series of flickering boxes on a computer screen. Each box flickers at a distinct frequency, and as the user looks at one, their brain activity synchronizes to that frequency. This brain-pattern-reading system, developed by Rosanne Zerafa, relies on steady-state visually evoked potentials (SSVEPs).
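
To make the frequency-tagging idea concrete, here is a minimal, hypothetical sketch of SSVEP detection in Python. It is not the Malta team's actual algorithm: the sampling rate, flicker frequencies, and the simple FFT band-power scoring are illustrative assumptions, used only to show how the attended box can be identified from an EEG window.

    # Minimal SSVEP frequency-detection sketch (illustrative only, not the
    # University of Malta team's actual algorithm). Given one EEG channel and
    # the set of flicker frequencies shown on screen, pick the frequency whose
    # spectral power (fundamental plus second harmonic) is strongest.
    import numpy as np

    FS = 256.0                           # assumed EEG sampling rate, Hz
    FLICKER_HZ = [7.0, 9.0, 11.0, 13.0]  # hypothetical box flicker rates

    def detect_ssvep(eeg_window, fs=FS, candidates=FLICKER_HZ):
        """Return the candidate flicker frequency with the most spectral power."""
        spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
        freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)

        def band_power(f, width=0.5):
            mask = (freqs >= f - width) & (freqs <= f + width)
            return spectrum[mask].sum()

        scores = [band_power(f) + band_power(2 * f) for f in candidates]
        return candidates[int(np.argmax(scores))]

    # Example: a 2-second synthetic EEG window dominated by a 9 Hz response.
    t = np.arange(0, 2.0, 1.0 / FS)
    fake_eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * np.random.randn(t.size)
    print(detect_ssvep(fake_eeg))        # expected: 9.0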

Electrical signals sent by the brain are then picked up by a series of electrodes placed at specific locations on the user’s scalp. This process, known as electroencephalography (EEG), records the brain responses and converts the brain activity into a series of computer commands.

As the user looks at the boxes on the screen, the program identifies which box is being viewed and issues the corresponding command, allowing the music player to be controlled without any physical movement. To adjust the volume or change the song, the user simply looks at the corresponding box; the command takes effect within seconds.
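
The final step is mapping each detected frequency to a player action. The sketch below assumes a hypothetical box-to-command layout and a hypothetical player object; the article does not describe the actual assignment used in the Malta system.

    # Hypothetical command-mapping step: the detected SSVEP frequency tells us
    # which box the user is looking at, and each box is bound to one action.
    COMMANDS = {
        7.0: "play/pause",
        9.0: "next track",
        11.0: "volume up",
        13.0: "volume down",
    }

    def issue_command(detected_hz, player):
        """Translate the detected SSVEP frequency into a music-player action.

        `player` is a placeholder object assumed to expose toggle_play(),
        next_track(), volume_up() and volume_down() methods.
        """
        action = COMMANDS.get(detected_hz)
        if action == "play/pause":
            player.toggle_play()
        elif action == "next track":
            player.next_track()
        elif action == "volume up":
            player.volume_up()
        elif action == "volume down":
            player.volume_down()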

For people who have become paralyzed due to a spinal injury, the normal flow of brain signals through the spine to the muscles is disrupted. However, the cranial nerves are separate and link directly from the brain to certain areas of the body, bypassing the spine altogether. This particular brain-computer interface exploits one of these: the oculomotor nerve, which is responsible for eye movement. This means that even an individual with complete body paralysis can still move their eyes over images on a screen.

This cutting-edge brain-computer interface system could lead the way for the development of similar user interfaces for tablets and smartphones. The concept could also be adapted for assisted-living applications, for example.

The BCI system was presented at the 6th International IEEE/EMBS Neural Engineering Conference in San Diego, California by team member Dr. Owen Falzon.

 





Daniel Faggella is the founder of TechEmergence, an internet entrepreneur, and speaker.

TechEmergence is the only news and media site exclusively about innovation at the crossroads of technology and psychology.




