this post was submitted on 05 May 2025
253 points (94.7% liked)

Technology

[–] jubilationtcornpone@sh.itjust.works 34 points 15 hours ago (9 children)

Sounds like a lot of these people either have an undiagnosed mental illness or are really, reeeeaaaaalllyy gullible.

For shit's sake, it's a computer. No matter how sentient the glorified chatbot being sold as "AI" appears to be, it's essentially a bunch of rocks that humans figured out how to shoot electricity through in such a way that it can do math. Impressive? I mean, yeah. It is. But it's not a human, much less a living being of any kind. You cannot have a relationship with it beyond that of a user.

If a computer starts talking to you as though you're some sort of God incarnate, you should probably take that with a dump truck full of salt rather than just letting your crazy latch on to that fantasy and run wild.

[–] rasbora@lemm.ee 16 points 14 hours ago (2 children)

Yeah, from the article:

Even sycophancy itself has been a problem in AI for “a long time,” says Nate Sharadin, a fellow at the Center for AI Safety, since the human feedback used to fine-tune AI’s responses can encourage answers that prioritize matching a user’s beliefs instead of facts. What’s likely happening with those experiencing ecstatic visions through ChatGPT and other models, he speculates, “is that people with existing tendencies toward experiencing various psychological issues,” including what might be recognized as grandiose delusions in a clinical sense, “now have an always-on, human-level conversational partner with whom to co-experience their delusions.”

[–] A_norny_mousse@feddit.org 16 points 12 hours ago (1 children)

So it's essentially the same mechanism with which conspiracy nuts embolden each other, to the point that they completely disconnect from reality?

[–] rasbora@lemm.ee 6 points 11 hours ago (1 children)

That was my takeaway as well. With the added bonus of having your echo chamber tailor-made for you, with all the agreeing voices tuned in to your personality and saying exactly what you need to hear to maximize the effect.

It’s eerie. A propaganda machine operating at maximum efficiency. Goebbels would be jealous.

[–] A_norny_mousse@feddit.org 1 points 3 hours ago* (last edited 3 hours ago)

The time will come when we look back fondly on "organic" conspiracy nuts.

[–] CheeseNoodle@lemmy.world 1 points 7 hours ago (1 children)

Human-level? Have these people used ChatGPT?

[–] rasbora@lemm.ee 1 points 7 hours ago

I have and I find it pretty convincing.
