Robohub.org
 

Engineers build brain-controlled music player

04 March 2014




Imagine if playing music were as simple as looking at your laptop screen. Now it is, thanks to Kenneth Camilleri and his team of researchers from the Department of Systems and Control Engineering and the Centre for Biomedical Cybernetics at the University of Malta, who have developed a music player that can be controlled by the human brain.

Camilleri and his team have been studying brain responses for ten years. Now they have found one that is optimal for controlling a music player using eye movements. The system was originally developed to improve the quality of life of individuals with severely impaired motor abilities such as those with motor neuron disease or cerebral palsy.

The technology works by reading two key features of the user’s nervous system: the nerves that trigger muscular movement in the eyes and the way that the brain processes vision. The user controls the music player simply by looking at a series of flickering boxes on a computer screen. Each box flickers at a distinct frequency, and as the user looks at one, their brain activity synchronizes to that same rate. This brain pattern reading system, developed by Rosanne Zerafa, relies on Steady State Visually Evoked Potentials (SSVEPs).

Electrical signals sent by the brain are then picked up by a series of electrodes placed at specific locations on the user’s scalp. This process, known as electroencephalography (EEG), records the brain responses and converts the brain activity into a series of computer commands.

As the user looks at the boxes on the screen, the computer program is able to figure out the commands, allowing the music player to be controlled without the need of any physical movement. In order to adjust the volume, or change the song, the user just has to look at the corresponding box. The command takes effect in just seconds.
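The article does not publish the team’s actual decoding algorithm, but the idea of matching EEG activity to a box’s flicker frequency can be sketched in a few lines. The following is a minimal, hypothetical illustration: it compares spectral power at each candidate flicker frequency and picks the strongest. The sampling rate, flicker frequencies, and command labels are all assumptions for the example, not values from the Malta system.

```python
import numpy as np

def classify_ssvep(eeg, fs, flicker_freqs):
    """Guess which flickering box the user is looking at by comparing
    spectral power at each candidate flicker frequency."""
    spectrum = np.abs(np.fft.rfft(eeg))           # magnitude spectrum of the EEG epoch
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs) # frequency axis for each FFT bin
    # Power at the bin nearest each flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in flicker_freqs]
    return int(np.argmax(powers))                  # index of the winning box

# Simulated 2-second EEG epoch: user attends a box flickering at 10 Hz
fs = 256                                           # assumed sampling rate (Hz)
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

commands = ["play/pause", "next track", "volume up", "volume down"]
flicker_freqs = [8.0, 10.0, 12.0, 15.0]            # one per on-screen box
print(commands[classify_ssvep(eeg, fs, flicker_freqs)])  # → next track
```

Real SSVEP decoders typically use more robust methods (e.g. canonical correlation analysis across multiple electrode channels), but the core step is the same: decide which flicker frequency dominates the recorded brain response.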

For people who have become paralyzed due to a spinal injury, the normal flow of brain signals through the spine to the muscles is disrupted. However, the cranial nerves are separate and link directly from the brain to certain areas of the body, bypassing the spine altogether. This particular brain-computer interface exploits one of these: the oculomotor nerve, which is responsible for the eye’s movements. This means that even an individual with complete body paralysis can still move their eyes over images on a screen.

This cutting-edge brain-computer interface system could lead the way for the development of similar user interfaces for tablets and smartphones. The concept could also be adapted for assisted living applications, for example.

The BCI system was presented at the 6th International IEEE/EMBS Neural Engineering Conference in San Diego, California by team member Dr. Owen Falzon.

 





Daniel Faggella is the founder of TechEmergence, an internet entrepreneur, and speaker.

TechEmergence is the only news and media site exclusively about innovation at the crossroads of technology and psychology.





Related posts :



Podcast episode 340: NVIDIA and ROS Teaming Up To Accelerate Robotics Development, with Amit Goel

Amit Goel, Director of Product Management for Autonomous Machines at NVIDIA, discusses the new collaboration between Open Robotics and NVIDIA. The collaboration will dramatically improve the way ROS and NVIDIA's line of products such as Isaac SIM and the Jetson line of embedded boards operate together.
23 October 2021

One giant leap for the mini cheetah

A new control system, demonstrated using MIT’s robotic mini cheetah, enables four-legged robots to jump across uneven terrain in real-time.
23 October 2021

Robotics Today latest talks – Raia Hadsell (DeepMind), Koushil Sreenath (UC Berkeley) and Antonio Bicchi (Istituto Italiano di Tecnologia)

Robotics Today held three more online talks since we published the one from Amanda Prorok (Learning to Communicate in Multi-Agent Systems). In this post we bring you the last talks that Robotics Today...
21 October 2021

Sense Think Act Podcast: Erik Schluntz

In this episode, Audrow Nash interviews Erik Schluntz, co-founder and CTO of Cobalt Robotics, which makes a security guard robot. Erik speaks about how their robot handles elevators, how they have hum...
19 October 2021

A robot that finds lost items

Researchers at MIT have created RFusion, a robotic arm with a camera and radio frequency (RF) antenna attached to its gripper, that fuses signals from the antenna with visual input from the camera to locate and retrieve an item, even if the item is buried under a pile and completely out of view.
18 October 2021

Robohub gets a fresh look

If you visited Robohub this week, you may have spotted a big change: how this blog looks now! On Tuesday (coinciding with Ada Lovelace Day and our ‘50 women in robotics that you need to know about‘ by chance), Robohub got a massive modernisation on its look by our technical director Ioannis K. Erripis and his team.
17 October 2021





©2021 - ROBOTS Association
