An experimental brain implant can read people’s minds, translating their inner thoughts into text.
In an early test, scientists from Stanford University used a brain-computer interface (BCI) device to decipher sentences that were thought, but not spoken aloud. The implant was correct up to 74 per cent of the time.
BCIs work by connecting a person’s nervous system to devices that can interpret their brain activity, allowing them to take action – like using a computer or moving a prosthetic hand – with only their thoughts.
They have emerged as a possible way for people with disabilities to regain some independence.
Perhaps the most famous is Elon Musk’s Neuralink implant, an experimental device in early trials testing its safety and functionality in people whose medical conditions limit their mobility.
The latest findings, published in the journal Cell, could one day make it easier for people who cannot speak to communicate, the researchers said.
“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said Erin Kunz, one of the study’s authors and a researcher at Stanford University in the United States.
Working with four study participants, the research team implanted microelectrodes – which record neural signals – into the motor cortex, the part of the brain that controls speech.
The researchers asked participants to either attempt to speak or to imagine saying a set of words. Both actions activated overlapping parts of the brain and elicited similar types of brain activity, though to different degrees.
They then trained artificial intelligence (AI) models to interpret words that the participants thought but did not say aloud. In a demonstration, the brain chip could translate the imagined sentences with an accuracy rate of up to 74 per cent.
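As a rough illustration of the idea – not the study’s actual models or data – decoding imagined words can be framed as classifying patterns of neural activity. The sketch below uses simulated firing-rate features and a toy vocabulary; every value in it is a hypothetical stand-in.

```python
# Toy sketch only: classify simulated "neural" feature vectors into words.
# The Stanford team's real models and recordings are far more sophisticated;
# the vocabulary, channel count, and data here are all invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
VOCAB = ["hello", "water", "help", "yes", "no"]  # hypothetical word set
N_CHANNELS = 96                                  # assumed microelectrode channel count

# Simulate firing-rate features: each word gets its own mean activity pattern.
X = np.vstack([rng.normal(loc=i, scale=2.0, size=(200, N_CHANNELS))
               for i in range(len(VOCAB))])
y = np.repeat(np.arange(len(VOCAB)), 200)

# Train a simple linear decoder and measure accuracy on held-out samples.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"toy decoding accuracy: {decoder.score(X_test, y_test):.2%}")
```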
In another test, the researchers set a password to prevent the BCI from decoding people’s inner speech unless they first thought of the code. The system recognised the password with around 99 per cent accuracy.
The password? “Chitty chitty bang bang”.
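The gating idea itself is conceptually simple – roughly, discard everything the decoder produces until the pass-phrase is recognised. A minimal sketch, assuming a decoder that emits phrases as strings; the function name, stream format, and matching logic are assumptions, not details from the study:

```python
# Hypothetical sketch of pass-phrase gating for an inner-speech decoder.
PASSWORD = "chitty chitty bang bang"

def gated_decode(decoded_phrases: list[str], unlocked: bool = False) -> list[str]:
    """Release decoded inner speech only after the pass-phrase is detected."""
    released = []
    for phrase in decoded_phrases:
        if not unlocked:
            if phrase.lower() == PASSWORD:
                unlocked = True   # pass-phrase thought: start releasing output
            continue              # otherwise, discard inner speech
        released.append(phrase)
    return released

# Nothing is released until the pass-phrase appears in the stream.
stream = ["i am hungry", "chitty chitty bang bang", "call the nurse"]
print(gated_decode(stream))       # ['call the nurse']
```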
For now, brain chips can interpret inner speech only with significant guardrails in place. But the researchers said more advanced models may be able to decode it more freely in the future.
Frank Willett, one of the study’s authors and an assistant professor of neurosurgery at Stanford University, said in a statement that BCIs could also be trained to ignore inner speech.
“This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech,” he said.