Robohub.org
 

Can a brain-computer interface convert your thoughts to text?


01 November 2016




By Srividya Sundaresan, Frontiers Science Writer

Ever wonder what it would be like if a device could decode your thoughts into actual speech or written words? While such a device might enhance existing speech interfaces, it could be a potential game-changer for people with speech pathologies, and even more so for “locked-in” patients who lack any speech or motor function.

“So instead of saying ‘Siri, what is the weather like today’ or ‘Ok Google, where can I go for lunch?’ I just imagine saying these things,” explains Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.

ECoG and audio data are recorded simultaneously. Speech decoding software then determines the timing of vowels and consonants in the acoustic data. An ECoG model is trained for each phone individually by calculating the mean and covariance of all segments associated with that particular phone. Courtesy of Christian Herff
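The per-phone training step described in the caption — fitting a Gaussian model (mean and covariance) to the ECoG feature segments labelled with each phone — can be sketched as follows. This is a hypothetical illustration, not the authors' actual code; the function name, data layout, and use of NumPy are all assumptions.

```python
import numpy as np

def fit_phone_models(segments, labels):
    """Fit one Gaussian model (mean + covariance) per phone.

    segments: list of 2-D arrays, each (n_frames, n_features) of ECoG features
    labels:   list of phone labels, one per segment (from audio alignment)

    Hypothetical sketch of the training step described in the article,
    not the authors' implementation.
    """
    models = {}
    for phone in set(labels):
        # Stack every feature frame from all segments labelled with this phone
        data = np.vstack([seg for seg, lab in zip(segments, labels)
                          if lab == phone])
        models[phone] = {
            "mean": data.mean(axis=0),
            "cov": np.cov(data, rowvar=False),
        }
    return models
```

In a real pipeline, the segment labels would come from forced alignment of the simultaneously recorded audio, as the caption describes.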

While reading one’s thoughts might still belong to the realms of science fiction, scientists are already decoding speech from signals generated in our brains when we speak or listen to speech.

In their review, Herff and co-author Dr. Tanja Schultz compare the pros and cons of using various brain imaging techniques to capture neural signals from the brain and then decode them to text.

The technologies range from functional MRI and near-infrared imaging, which detect neural signals based on the metabolic activity of neurons, to methods such as EEG and magnetoencephalography (MEG), which detect the electromagnetic activity of neurons responding to speech. One method in particular, called electrocorticography or ECoG, showed promise in Herff’s study.

The study presents the Brain-to-text system, in which epilepsy patients who already had electrode grids implanted for the treatment of their condition read out texts presented on a screen in front of them while their brain activity was recorded. This formed the basis of a database of neural-signal patterns that could then be matched to speech elements, or “phones”.

When the researchers also included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy. “For the first time, we could show that brain activity can be decoded specifically enough to use ASR [automatic speech recognition] technology on brain signals,” says Herff. “However, the current need for implanted electrodes renders it far from usable in day-to-day life.”
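The role of the dictionary and language models can be illustrated with a toy decoder: given per-segment phone likelihoods from the neural models, it scores each dictionary word by combining the evidence with a word prior and keeps the best match. This is a simplified sketch with hypothetical names — real ASR decoders use Viterbi search over all possible alignments, but the scoring principle is the same.

```python
import math

def decode_word(phone_loglikes, dictionary, word_logprior):
    """Pick the dictionary word whose phone sequence best explains the data.

    phone_loglikes: list of dicts, one per segment, mapping phone -> log-likelihood
                    (e.g. from Gaussian phone models evaluated on ECoG features)
    dictionary:     dict mapping word -> list of phones (pronunciation lexicon)
    word_logprior:  dict mapping word -> log prior (a trivial language model)

    Toy illustration of combining neural evidence with dictionary and
    language models; not the authors' decoder.
    """
    best_word, best_score = None, -math.inf
    for word, phones in dictionary.items():
        if len(phones) != len(phone_loglikes):
            continue  # only consider words matching the number of segments
        # Total score = language-model prior + sum of per-phone evidence
        score = word_logprior.get(word, -math.inf)
        for frame, phone in zip(phone_loglikes, phones):
            score += frame.get(phone, -math.inf)
        if score > best_score:
            best_word, best_score = word, score
    return best_word
```

Even with noisy phone likelihoods, the lexicon constrains the output to real words, which is why adding these models improved decoding accuracy.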

So, where does the field go from here to a functioning thought-detection device? “A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that,” concedes Herff.

Their study results, while exciting, are still only a preliminary step towards this type of brain-computer interface.




Frontiers in Computational Neuroscience is devoted to promoting theoretical modeling of brain function and fostering interdisciplinary interactions between theoretical and experimental neuroscience...





