[–] Warl0k3@lemmy.world 23 points 4 weeks ago* (last edited 4 weeks ago)

We've been dealing with this a lot in WA, too. The popular one a couple months ago was the cliché Hollywood "We've kidnapped your grandkid, here's proof (an AI copy of their grandchild, built from Facebook videos as far as we can tell, reading off that day's news article), please send $xxx in bitcoin". It's disgustingly effective. Don't post your kids on social media, folks. Don't do it.

[–] Fiivemacs@lemmy.ca 5 points 4 weeks ago

Is this the 2000s again....

No names, no numbers, no addresses, stop storing things on someone else's computer. But it's honestly too late for most people. They all got caught up in the fuckerburg give-him-your-personal-info days.

More than half the world doesn't care or never thinks about the potential consequences, and then this kinda crap happens.

[–] cobwoms@lemmy.blahaj.zone 4 points 4 weeks ago

this is literally the plot of Thelma

[–] Rai@lemmy.dbzer0.com 3 points 4 weeks ago

Everyone should make their elders watch Kitboga. I’ve learned about so many scams from him.

[–] EpeeGnome@lemm.ee 2 points 4 weeks ago

This scam has been around since long before AI voice was a thing. Say something scary enough, and people will subconsciously attribute anything off about the voice to a bad connection and the severe stress the person is presumably under, and genuinely believe the voice sounds exactly like their loved one. AI voice makes it easier to fool more people, but I bet most of these scammers aren't putting in the time to research every target and build a voice profile, and instead focus on calling as many people as possible. Of course, these days anyone taken in by this scam will assume it must have been AI voice, because otherwise how did they sound so convincing?