rasbora

joined 1 week ago
 
[–] rasbora@lemm.ee 11 points 3 hours ago (1 children)

I’m a bit rusty on the rules but before becoming a saint, don’t you first have to die?

In that case I’m all for Trump being sainted.

[–] rasbora@lemm.ee 1 points 3 hours ago

I have and I find it pretty convincing.

[–] rasbora@lemm.ee 16 points 5 hours ago (1 children)

But what if my favorite coworkers didn’t actually view me as their favorite coworker? Maybe to them I was just meh. Would they come to my reunion, since they wouldn’t consider me to belong at theirs?

I mean, a certain mutuality is implied.

[–] rasbora@lemm.ee 6 points 7 hours ago (1 children)

That was my takeaway as well. With the added bonus of having your echo chamber tailor-made for you, and all the agreeing voices tuned in to your personality, saying exactly what you need to hear to maximize the effect.

It’s eerie. A propaganda machine operating at maximum efficiency. Goebbels would be jealous.

[–] rasbora@lemm.ee 3 points 7 hours ago

I got it! What I meant to say was, next up for the tariffs will be American movies filmed on location in foreign lands. My wording was unclear.

[–] rasbora@lemm.ee 3 points 10 hours ago (2 children)

Next up: 100% tariffs on movies filmed in foreign lands.

[–] rasbora@lemm.ee 9 points 10 hours ago

>goes to sleep
>dreams of being at work

[–] rasbora@lemm.ee 16 points 10 hours ago (5 children)

Yeah, from the article:

> Even sycophancy itself has been a problem in AI for “a long time,” says Nate Sharadin, a fellow at the Center for AI Safety, since the human feedback used to fine-tune AI’s responses can encourage answers that prioritize matching a user’s beliefs instead of facts. What’s likely happening with those experiencing ecstatic visions through ChatGPT and other models, he speculates, “is that people with existing tendencies toward experiencing various psychological issues,” including what might be recognized as grandiose delusions in a clinical sense, “now have an always-on, human-level conversational partner with whom to co-experience their delusions.”

[–] rasbora@lemm.ee 53 points 10 hours ago (1 children)
[–] rasbora@lemm.ee 35 points 10 hours ago (2 children)

Don’t worry, guy, victory is sweet after all.

[–] rasbora@lemm.ee 13 points 11 hours ago

Turns out AI is really good at telling people what they want to hear, and with all the personal information users voluntarily provide while chatting with their bots, it’s tens, maybe hundreds of times more proficient at brainwashing its subjects than any human cult leader could ever hope to be.

119
[OC] Sunrise (pxlfdde.fsn1.your-objectstorage.com)
 