this post was submitted on 20 Aug 2025

science

Surgically implanted devices that allow paralyzed people to speak can also eavesdrop on their inner monologue.

That's the conclusion of a study of brain-computer interfaces (BCIs) in the journal Cell.

The finding could lead to BCIs that allow paralyzed users to produce synthesized speech more quickly and with less effort.

But the idea that new technology can decode a person's inner voice is "unsettling," says Nita Farahany, a professor of law and philosophy at Duke University and author of the book The Battle for Your Brain.

top 13 comments
[–] otter@lemmy.dbzer0.com 19 points 4 days ago (2 children)

They'll have ads injected soon enough, mark my words.

Or, just read "Transmetropolitan", etc.

[–] SolidShake@lemmy.world 5 points 4 days ago

Black mirror has an episode like that.

[–] Coopr8@kbin.earth 3 points 3 days ago

Yeah or "Feed", if anything needs regulation it's this tech.

[–] teft@piefed.social 11 points 4 days ago (2 children)

What's to stop a government from forcibly inserting one of these into the brain of someone they need information from? It'd be real hard to stop your thoughts from revealing the location of the classified information if it's anything like that whole "don't think of a purple elephant" thing.

[–] Admetus@sopuli.xyz 13 points 4 days ago

This probably pops up in a few sci-fi books, but Hyperion had by far the nastiest version of this, plus torture. What's to stop big gov doing it?

[–] MysteriousSophon21@lemmy.world 8 points 4 days ago (2 children)

Current BCIs require extensive training where the user actively thinks specific patterns - they can't just "read" random thoughts, and the implants are customized to specific brain regions and neural patterns, so forcing someone to use one without their cooperation would yield gibberish data at best.
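To make that concrete, here's a toy sketch (not a real BCI pipeline - the features, classes, and numbers are all invented) of why a decoder trained on one person's neural patterns produces garbage on another person's: the decoder only learns that user's mapping from brain activity to intended words.

```python
# Toy illustration: a per-user "decoder" trained on one user's neural
# feature patterns fails on a user whose brain maps the same classes
# to different patterns. All data here is simulated, not real neural data.
import numpy as np

rng = np.random.default_rng(0)

def make_user_data(class_means, n=200, noise=0.5):
    """Simulate labeled 'neural feature' trials for one user."""
    X, y = [], []
    for label, mean in enumerate(class_means):
        X.append(mean + noise * rng.standard_normal((n, len(mean))))
        y.append(np.full(n, label))
    return np.vstack(X), np.concatenate(y)

def train_centroids(X, y):
    """Nearest-centroid decoder: one template per imagined word/class."""
    return np.array([X[y == k].mean(axis=0) for k in np.unique(y)])

def decode(centroids, X):
    """Assign each trial to the class with the closest template."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# User A: consistent patterns, the product of cooperative training sessions.
Xa, ya = make_user_data(np.array([[2.0, 0.0], [0.0, 2.0]]))
decoder = train_centroids(Xa, ya)

# User B: same task, but their brain maps the classes to different patterns.
Xb, yb = make_user_data(np.array([[-2.0, 0.0], [0.0, -2.0]]))

acc_a = (decode(decoder, Xa) == ya).mean()
acc_b = (decode(decoder, Xb) == yb).mean()
print(f"trained user accuracy: {acc_a:.2f}, other user accuracy: {acc_b:.2f}")
```

With the invented numbers above, the decoder is near-perfect on the user it was trained on and consistently wrong on the other - which is the "gibberish at best" point: without cooperation during training, there's no mapping to exploit.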

[–] teft@piefed.social 6 points 4 days ago (1 children)

Today that is true. I'm thinking of ten or twenty years from now when they have enough training data from all the volunteers to make solid guesses for randoms. I just think it's something people should keep in mind for a technology like this. It could easily be abused.

Or force people to read/watch media and learn what activates.

[–] Coopr8@kbin.earth 4 points 3 days ago

Think Clockwork Orange scenario. Hard not to think words when you are shown those things in images, and especially if you're drugged.

[–] AmidFuror@fedia.io 5 points 4 days ago (1 children)

Same thing happens after being thawed from cryogenic storage.