© 2025 WFAE

Brain implants that decode a person's inner voice may threaten privacy

ARI SHAPIRO, HOST:

A person can bite their tongue to avoid blurting out a secret, but a surgically implanted brain-computer interface can reveal words that were never meant to be spoken. NPR's Jon Hamilton reports on a new study that looks at the privacy concerns raised by technology that decodes signals in the brain.

JON HAMILTON, BYLINE: Brain-computer interfaces, or BCIs, are experimental devices that can restore a paralyzed person's ability to speak. Erin Kunz of Stanford University says these implanted devices monitor the brain's motor cortex, which controls the muscles involved in speech.

ERIN KUNZ: We're recording the signals as they're attempting to speak and translating those neural signals into the words that they're trying to say.

HAMILTON: Either on screen or with a synthesized voice. Relying on signals produced when a paralyzed person attempts speech makes it easy for them to mentally zip their lip and avoid oversharing. But it also means that person has to make a concerted effort to convey a word or sentence. That's tiring and time consuming. So Kunz and a team set out to find a better way with help from four people already using BCIs.

KUNZ: The first thing we did was looked at individual words, both when they're attempting to speak as well as when they're imagining to speak and even when they were listening or reading.

HAMILTON: The team found a lot of overlap between intended speech and imagined speech. Eventually, they were able to decode words and sentences that existed only in a person's imagination.

KUNZ: We were able to get up to 74% accuracy decoding sentences from a 125,000-word vocabulary.

HAMILTON: That made communication faster and easier for the participants, but Kunz says the success also raised a question.

KUNZ: If inner speech is similar enough to attempted speech, could it accidentally leak out when someone is using a BCI?

HAMILTON: It could, so the team tried two strategies to protect BCI users' privacy. One was to program the device to ignore inner speech signals. That worked, but took away the speed and ease of decoding imagined words. So Kunz says the team borrowed an approach used by virtual assistants like Alexa and Siri, which wake up only when they hear a specific phrase.

KUNZ: We picked "chitty chitty bang bang" 'cause it doesn't occur too frequently in typical conversations, I would guess, and it's highly identifiable, highly decodable.

HAMILTON: That allowed participants to control when their inner speech was being decoded. The study, which appears in the journal Cell, adds to an ongoing discussion about privacy and new technologies that decode a person's brain activity. Nita Farahany of Duke University wrote a book on the subject called "The Battle For Your Brain." She says the two privacy safeguards used in the study are a step in the right direction.
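[For readers: the wake-phrase gating Kunz describes can be sketched in a few lines of Python. This is an illustrative sketch only — the decoder, the word stream, and the function names here are hypothetical, not the study's actual implementation. The gating logic simply withholds every decoded word until the wake phrase appears in the stream.]

```python
# Minimal sketch of wake-phrase gating for a decoded word stream.
# The inner-speech decoder itself is hypothetical; only the privacy
# gate is shown here.

WAKE_PHRASE = ["chitty", "chitty", "bang", "bang"]  # rare in normal speech

def gate_inner_speech(decoded_words):
    """Yield decoded words only after the wake phrase has been heard.

    decoded_words: an iterable of lowercase words produced by a
    (hypothetical) inner-speech decoder.
    """
    recent = []      # sliding window of the most recent words
    awake = False    # stays False until the wake phrase matches
    for word in decoded_words:
        if awake:
            yield word
            continue
        recent.append(word)
        recent = recent[-len(WAKE_PHRASE):]  # keep window phrase-sized
        if recent == WAKE_PHRASE:
            awake = True  # start passing decoded words through

# Words thought before the wake phrase stay private.
stream = ["private", "thought", "chitty", "chitty", "bang", "bang",
          "hello", "world"]
print(list(gate_inner_speech(stream)))  # -> ['hello', 'world']
```

The point of the design, as the study describes it, is that the decoder runs continuously but discloses nothing until the user deliberately thinks an unmistakable trigger, much like a smart speaker idling until its wake word.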

NITA FARAHANY: But these both assume that we can control our thinking in ways that may not actually match how our minds work.

HAMILTON: For example, Farahany says participants in the study couldn't always control their inner voice.

FARAHANY: They had this counting task, and during that counting task, the BCI did pick up numbers people were thinking, which means that the boundary between private and public thought may be blurrier than we assume.

HAMILTON: Farahany says surgically implanted BCIs will be regulated by the Food and Drug Administration, which could require privacy protections. But that sort of regulation may not extend to consumer BCIs, worn as caps and used to do things like play video games. Farahany says the new study suggests that someday, those consumer devices may also be able to detect unspoken words.

FARAHANY: What this research shows is something unsettling, right? The brain patterns for thinking words and speaking them are remarkably similar.

HAMILTON: Farahany says that could allow companies like Apple, Amazon, Google and Facebook to find out what's going on in a consumer's mind even if that person doesn't intend to share.

FARAHANY: The more we push this research forward, the more transparent our brains become, and we have to recognize that this era of brain transparency really is an entirely new frontier for us.

HAMILTON: Farahany says she's encouraged, though, that researchers are already looking for ways to help people protect their mental privacy.

Jon Hamilton, NPR News. Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Tags: Morning Edition, All Things Considered
Jon Hamilton is a correspondent for NPR's Science Desk. Currently he focuses on neuroscience and health risks.