In neuroscience, researchers have been working with brain-computer interfaces (BCIs) for decades.
Since Hans Berger’s discovery of the brain’s electrical activity in 1924, which enabled electroencephalography (EEG), scientists have been trying to understand the language of the mind – ‘reading’ thoughts directly.
While some BCIs are invasive, relying on sensors implanted in the brain, far more promising approaches use non-invasive techniques.
These BCIs typically either use tiny sensors worn on the head to measure brain activity, or functional magnetic resonance imaging (fMRI), a special form of MRI that shows blood flow in the brain in real time.
With constant advances in this tech, scientists have reached the point where mind-reading isn’t just a parlour trick.
These systems can now detect brain activity and translate those signals into communication and control.
And that promises hope for people who’ve lost the ability to move or speak, helping them regain independence and live fuller lives.
In fact, recent innovations may allow people to talk directly with their minds, opening new possibilities for treatment and communication, especially in environments that are far too loud for normal speech.
If, while reading, you pronounce each word silently in your mind, you know exactly what subvocalisation is.
It helps us better understand the words we’re seeing, though it also slows reading speed substantially.
In a sense, when you do this, your brain is speaking silently, and your body knows it.
It’s a cool trick: your brain thinks a word and sends a signal to the muscles of your face, even though they don’t move.
And now, a team of researchers at the Massachusetts Institute of Technology has developed a system that can read these silent thoughts and transcribe them into speech.
As Arnav Kapur, the lead author of the project from MIT’s Media Lab, explains, there were a number of applications that got them thinking about the tech.
On one hand, there were the merely curious (and perhaps not so good) ideas: could they design a system that would allow people to access their mobiles unobtrusively during a conversation?
As Kapur explains, “at the moment, the use of those devices is very disruptive.
If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself.” We have enough trouble getting and keeping one another’s attention now – that sounds like the last thing we need!