Brain Waves to Words: Breakthrough Lets Scientists (Almost) Hear What You're Thinking

"Can you read my mind?" Margot Kidder whispers in her head to Christopher Reeve during one of cinema's cheesiest sequences. No, Superman can't, but U.S. neuroscientists are apparently a lot closer to a brainwave-speech algorithm after deciphering several of the brain processes associated with hearing words.

When someone says "Hello!" or "What's up?" or "How about this crazy winter weather?" your brain sorts all those sonic syllables and pitch frequencies into signals that are instantly translated as language. Now, reports Popular Science, a team of researchers at the University of California, Berkeley has figured out how some of those processes map out — call it a metaphorical decoder wheel for the brain's linguistic centers. What's more, they've been able to reproduce words someone heard simply by monitoring electrical activity in the brain region associated with the process. The study, titled "Reconstructing Speech from Human Auditory Cortex," was just published in the journal PLoS Biology.

Imagine the benefits to someone who can't speak. Doctors could fit them with implants capable of selectively monitoring brain activity and converting it into spoken language piped through a portable audio system.

"This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig's disease and can't speak," said Robert Knight, one of the study's coauthors and director of the Helen Wills Neuroscience Institute at the University of California. "If you could eventually reconstruct imagined conversations from brain activity, thousands of people could benefit."

Researchers conducted experiments with 15 volunteer epilepsy patients in the U.S. whose treatment already required electrodes to be placed directly on the brain so doctors could determine where their seizures originated. With the electrodes in place, the researchers had the volunteers listen to words while a computer recorded activity in a region of the auditory system known as the superior temporal gyrus. The result: most of the time, the computer was able to decode and play back the words the patients heard in recognizable fashion.

"Potentially, the technique could be used to develop an implantable prosthetic device to aid speaking, and for some patients that would be wonderful," said Knight. "The next step is to test whether we can decode a word when a person imagines it. That might sound spooky, but this could really help patients. Perhaps in 10 years it will be as common as grandmother getting a new hip."

Matt Peckham writes for TIME and PCWorld. You can find him on Twitter, Facebook, or Google+.