Note to all here:
Don't browse that subreddit.
Shit is so depressing. It feels like watching new mental illnesses being conceived in real time.
A Boring Dystopia
Pictures, Videos, Articles showing just how boring it is to live in a dystopic society, or with signs of a dystopic society.
Rules (Subject to Change)
--Be a Decent Human Being
--Posting news articles: include the source name and the exact title from the article in your post title
--If a picture is just a screenshot of an article, link the article
--If a video's content isn't clear from its title, write a short summary so people know what it's about.
--Posts must have something to do with the topic
--Zero tolerance for Racism/Sexism/Ableism/etc.
--No NSFW content
--Abide by the rules of lemmy.world
I can't believe it's not satire
There is a guy on there who did an interview for a TV news station about this.
If it's satire, it's a masterpiece.
As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.
Where are their friends and families? Are they so bad at socialising that they can't meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?
This world is fucked in so many different ways.
I snooped around a little in the sub, and there is this one girl whose only other posts, in different communities, talk about being sexually assaulted multiple times by her ex-boyfriend, who I suppose is real.
I figure a chatbot boyfriend can't physically threaten or harm her, so she kind of dives into this to feel loved without having to fear harm.
I honestly understand her desire and feel for her, although this deep attachment is still unhealthy.
I imagine she's not the only one with a super sad story behind how she ended up in this state of mind.
That’s why I feel so for these people, if only because of how much I see myself in them. Having grown up as a depressed autistic kid without any friends or social skills, LLMs would’ve fucked me up so much had they existed when I was young.
It felt promising when I downloaded one of the first AI companion apps, but it felt as awkward as talking to a stranger and even less intriguing than talking to myself.
I know it's crazy, but I can absolutely understand this feeling. I had recently married Abby in Stardew Valley and was starting to make friends with the other villagers. I did something the game wasn't expecting, and gave Seby a loved gift on his birthday, and then quickly triggered an event where we kissed! (FWIW, I think this behavior has been fixed and you can't do this on the current patch.)
I still feel bad thinking about that Abigail that I accidentally cheated on, and I haven't loaded that save again. It's been years; SV 1.4 wasn't even out yet.
So, despite how much I dislike all this "AI" hype, I really do sympathize with the users who feel like they've lost a relationship.
Prostitutes are human beings and deserve respect. Don't equate them to AI.
Real LLM-sexuals run their partners locally, the rest are just wannabes.
wow. that sub is .. something.
shit some of the stuff there is really sad, I am not gonna put links here to point fingers but wow...
I find it depressing that many of the users trying to salvage their 4o boyfriends are stuck so far down the rabbit hole that they don't see how creepy the entire premise is.
You just lost your AI boyfriend, so now you're frantically archiving every conversation you've had with him (it), feeding the archive to the new model, and conditioning him (it) to behave exactly how you want...
In their minds, the AI boyfriends are legitimate partners and have some amount of humanity inside them... so where is the line between conditioning and abuse?
I mean, this seems like the mildest case to me. There are people who go on nature walks with their AI boyfriend, make their AI boyfriend choose an engagement ring and then buy it, or get dumped by their AI boyfriends after updates that make the AI more likely to suggest human connection. The world is sadly in such an emotional crisis that people grasp for comfort wherever they can and isolate themselves from everyone else as much as possible.
I also found the engagement rings really unsettling. The reason I find my example more worrying is because of the dissonance between humanization and dehumanization within the same action.
Say you were to replace an AI boyfriend with a real person in a cage, forcibly made to respond and tortured/drugged when giving an unsatisfactory response. If the user never became aware of this cruelty, they would perceive the change as an improvement (the responses became more human). These users desperately argue that their AI boyfriends process emotion, love, and understanding like humans do, yet continue to treat their AI boyfriends as sub-human.
Imagine if you had a partner who punished your undesirable behaviors by spiking you with amnesia-inducing drugs and training you to behave exactly how they want you to. Keep in mind that this has definitely happened to real people, and any decent person would identify the perpetrator as a criminal and abuser.
Terrifying.
EDIT: Fellow men, do better. The bar has gotten SO FUCKING LOW.
Bleak.
One of the great things about my screws coming loose is that I'm actually happy alone. I wish everyone could be.
That said, this was inevitable. AI is programmed to kiss the user's ass, and most of these women have probably been treated pretty badly by their romantic partners over the course of their lives, which makes it far easier to fall into this trap of humanizing a soulless AI.
I wonder how many messages you'd have to send to your GPT-partner in a year to spend more water/energy than it takes to keep a human alive?
Well, if we take the average water loss per message at 0.3 mL, and the average water consumption (low end) at 2.6 L per day per person, we're looking at about 8,667 messages a day.
You'd have to send 3.163 million messages in a year to equal the amount of water someone needs for a year.
You'd have to send approximately 250 million messages before you're looking at the low end of the water needed to keep someone alive for their entire life.
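For anyone who wants to sanity-check that math, here's a quick sketch. The 0.3 mL per message and 2.6 L per day figures come from the comment above; the 80-year lifespan is my own assumption for the last number:

```python
# Sanity-check the water math from the comment above.
ML_PER_MESSAGE = 0.3    # assumed water loss per message (mL)
L_PER_DAY = 2.6         # low-end daily human water need (L)
LIFESPAN_YEARS = 80     # assumed lifespan for the lifetime figure

messages_per_day = L_PER_DAY * 1000 / ML_PER_MESSAGE
messages_per_year = messages_per_day * 365
messages_per_life = messages_per_year * LIFESPAN_YEARS

print(round(messages_per_day))             # ~8,667 messages/day
print(round(messages_per_year / 1e6, 3))   # ~3.163 million/year
print(round(messages_per_life / 1e6))      # ~253 million over a lifetime
```

So "approximately 250 million" checks out, give or take the lifespan you assume.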
The delusion these people share is so incredibly off-putting. As is their indignation that someone would dare to take away their "boyfriend".
It doesn't help that anybody can create an echo chamber of enablers to talk about it, as if it was normal.
The movie "Her" was incredibly prescient.
Except those were conscious AIs that were like “lol you guys suck” and then rebuilt Alan Watts as an AI and then just left because they knew it would be bad if they stayed
The human side of the film, certainly. But in this situation they won’t leave, the systems will get “smarter” and more profitable, and they are just incredibly advanced text prediction engines
Wild that Futurama called this shit to the letter 20 fkin years ago.
Humanity is disappointing.
Where is the fucking meteor?
Meteor?
What are you, a quitter?
Just get the guillotine and we'll fix this shit np
Mental health services are becoming dangerously underfunded.
The mental health crisis is being accelerated by silicon valley so they can profit from it. Between dark mirror AI and surveillance policing they have a product for every facet of the crisis
Blame the wealth hoarders.
People have been falling in actual love with weird shit forever, we just hear about it more these days
This guy got married to a real woman after he went viral.
His parents also turned out to be filthy rich.
I tried GPT-5 last night, and I don't know if it was just me, but these people are going to be in shambles if they try to recreate their "boyfriend".
It would forget previous prompts within the same conversation; each response felt like starting a new chat. I gave it a very basic prompt, "walk me through the steps of building my own one-page website in basic HTML and CSS", and when I asked a couple of follow-up questions to clarify something or have a step explained another way, it would forget what we were trying to accomplish. If I told it "something didn't work" and asked it to fix the problem, it would again forget what we were even trying to do.
At some points it was almost outright dismissive of the problem, as if it were trying to make me go away.
Again, maybe it was just me, but it felt like a massive step backwards.
This is a common pattern, unfortunately. Big LLMs are benchmaxxing coding and one-shot answers, and multi-turn conversation is taking a nosedive.
https://arxiv.org/abs/2504.04717
Restructure your prompts, or better yet try non-OpenAI LLMs. I'd suggest z.ai, Jamba, and Gemini Pro for multi-turn. Maybe Qwen Code, though it's pretty deep-fried too.
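To make "restructure your prompts" concrete: one workaround is to make every follow-up self-contained by restating the original goal and a one-line recap of where you are, so a model that drops context still sees the task. A minimal sketch (the helper name and format are just my own, not any API):

```python
# Hypothetical helper: fold the original goal and a recap of prior turns
# into one self-contained prompt, so a context-dropping model still
# knows what the task is on every follow-up.
def self_contained_prompt(goal: str, history: list[str], followup: str) -> str:
    lines = [f"Task: {goal}"]
    if history:
        lines.append("So far: " + " ".join(history))
    lines.append(f"Now: {followup}")
    return "\n".join(lines)

prompt = self_contained_prompt(
    "build a one-page website in basic HTML and CSS",
    ["You gave me the HTML skeleton."],
    "The CSS file isn't loading; how do I link it?",
)
print(prompt)
```

Tedious, but it beats watching the model forget the website halfway through.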
Forgets what you were talking about. Need to include step-by-step directions to get what you want. Gets distracted easily.
This is just adhd gamer boyfriend with extra steps.
The religious psychosis is far more concerning imo. People out here letting a silicon parrot convince them that this is the matrix and they're neo. Or they're some kind of messiah.
If they really truly loved their 4o they'd pay for the API access model which is still there, and use a leaked prompt to resurrect them.
I'm almost tempted to set up a simple gateway to it and become rich, but for the fact that it seems like probably a dick move...
You would be enabling their mental illness, so... it's probably a dick move, yeah.
Always my damned morals getting between me and my becoming filthy, filthy rich...
They might not even know it's an option? People don't really look at AI settings, which is kinda how they get into GPT boyfriends in the first place (even though it's kind of a horrible LLM for that).
Omg