Robohub.org
 

Engineers build brain-controlled music player


04 March 2014




Imagine if playing music were as simple as looking at your laptop screen. Thanks to Kenneth Camilleri and his team of researchers from the Department of Systems and Control Engineering and the Centre for Biomedical Cybernetics at the University of Malta, it now is: they have developed a music player that can be controlled by the human brain.

Camilleri and his team have been studying brain responses for ten years. Now they have found one that is optimal for controlling a music player using eye movements. The system was originally developed to improve the quality of life of individuals with severely impaired motor abilities such as those with motor neuron disease or cerebral palsy.

The technology works by reading two key features of the user’s nervous system: the nerves that trigger muscular movement in the eyes and the way the brain processes vision. The user controls the music player simply by looking at a series of flickering boxes on a computer screen. Each box flickers at a distinct frequency, and as the user looks at one, their brain activity synchronizes to that rate. This brain-pattern-reading system, developed by Rosanne Zerafa, relies on steady-state visually evoked potentials (SSVEPs).
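The frequency-tagging idea behind SSVEP can be sketched in a few lines of code. This is a minimal illustration only, not the Malta team's published pipeline: the sampling rate, stimulus frequencies, and signal model are all assumptions made for the example.

```python
import numpy as np

FS = 256                                # sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]    # one hypothetical flicker rate per box

def detect_attended_frequency(eeg, fs=FS, candidates=STIM_FREQS):
    """Return the candidate flicker frequency with the most spectral power."""
    n = len(eeg)
    # Window the signal and compute its power spectrum.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    def band_power(f, width=0.5):
        # Sum power in a narrow band around a candidate frequency.
        mask = (freqs >= f - width) & (freqs <= f + width)
        return spectrum[mask].sum()

    return max(candidates, key=band_power)

# Simulate 2 s of EEG: a weak 12 Hz oscillation buried in noise,
# mimicking the response while the user fixates the 12 Hz box.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + rng.normal(0.0, 1.0, t.size)

print(detect_attended_frequency(eeg))  # expected: 12.0
```

Because the evoked oscillation sits at a known, narrow frequency, even a simple band-power comparison like this can separate the candidates; a real system would add filtering, artifact rejection, and a more robust classifier.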

Electrical signals sent by the brain are then picked up by a series of electrodes placed at specific locations on the user’s scalp. This process, known as electroencephalography (EEG), records the brain’s responses so that the activity can be converted into a series of computer commands.

As the user looks at the boxes on the screen, the computer program identifies the intended command, allowing the music player to be controlled without the need for any physical movement. To adjust the volume or change the song, the user simply looks at the corresponding box. The command takes effect within seconds.
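Once a flicker frequency has been detected, turning it into a player action is essentially a lookup. The mapping below is entirely hypothetical; the article does not publish the system's actual frequencies or command set.

```python
# Hypothetical command table: each flickering box's frequency maps to a
# player action. Frequencies and action names are illustrative only.
COMMANDS = {
    8.0: "previous_track",
    10.0: "next_track",
    12.0: "volume_up",
    15.0: "volume_down",
}

def dispatch(detected_freq, tolerance=0.5):
    """Map a detected SSVEP frequency to a music-player command.

    Returns None when no box lies within `tolerance` Hz of the
    detected frequency (i.e. detection failed or no box was attended).
    """
    for freq, action in COMMANDS.items():
        if abs(detected_freq - freq) <= tolerance:
            return action
    return None

print(dispatch(12.1))  # -> volume_up
print(dispatch(9.3))   # -> None (between boxes; no command issued)
```

The tolerance band matters in practice: EEG spectral estimates are noisy, so accepting a small deviation from the nominal flicker rate avoids dropping valid commands while the None fallback prevents spurious ones.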

For people who have become paralyzed due to a spinal injury, the normal flow of brain signals through the spine to the muscles is disrupted. However, the cranial nerves are separate and link directly from the brain to certain areas of the body, bypassing the spine altogether. This particular brain-computer interface exploits one of these: the oculomotor nerve, which is responsible for eye movements. This means that even an individual with complete body paralysis can still move their eyes over images on a screen.

This cutting-edge brain-computer interface system could lead the way for the development of similar user interfaces for tablets and smartphones. The concept could also be adapted for assisted-living applications, for example.

The BCI system was presented at the 6th International IEEE/EMBS Neural Engineering Conference in San Diego, California by team member Dr. Owen Falzon.

 





Daniel Faggella Daniel Faggella is the founder of TechEmergence, an internet entrepreneur, and speaker.

TechEmergence is the only news and media site exclusively about innovation at the crossroads of technology and psychology.


Subscribe to Robohub newsletter on substack







 















©2026.02 - Association for the Understanding of Artificial Intelligence