this post was submitted on 09 Aug 2025
490 points (98.4% liked)

A Boring Dystopia


Pictures, Videos, Articles showing just how boring it is to live in a dystopic society, or with signs of a dystopic society.

Rules (Subject to Change)

--Be a Decent Human Being

--Posting news articles: include the source name and the exact title from the article in your post title

--If a picture is just a screenshot of an article, link the article

--If a video's content isn't clear from its title, write a short summary so people know what it's about.

--Posts must have something to do with the topic

--Zero tolerance for Racism/Sexism/Ableism/etc.

--No NSFW content

--Abide by the rules of lemmy.world


This is downright terrifying...

top 50 comments
[–] LiveLM@lemmy.zip 46 points 4 days ago (8 children)

Note to all here:
Don't browse that subreddit.
Shit is so depressing. It feels like watching new mental illnesses being conceived in real time.

[–] Flax_vert@feddit.uk 16 points 4 days ago (1 children)

I can't believe it's not satire

[–] LiveLM@lemmy.zip 16 points 4 days ago (1 children)

There is a guy on there who did an interview for a TV news station about this.
If it's satire, it's a masterpiece.

[–] reseller_pledge609@lemmy.dbzer0.com 203 points 5 days ago (26 children)

As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.

Where are their friends and families? Are they so bad at socialising that they can't meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?

This world is fucked in so many different ways.

[–] Lupus@feddit.org 98 points 5 days ago (1 children)

I snooped around a little in the sub, and there's this one girl whose only other posts, in different communities, talk about being sexually assaulted multiple times by her ex-boyfriend, who I suppose is real.

I figure a chatbot boyfriend can't physically threaten or harm her, so she kind of dives into this to feel loved without having to fear harm.

I honestly understand her desire and feel for her, although this deep attachment is still unhealthy.

I imagine she's not the only one with a super sad story behind ending up in this state of mind.

[–] shapeofquanta@lemmy.vg 79 points 5 days ago (1 children)

That's why I feel so much for these people, if only because of how much I see myself in them. Having grown up as a depressed autistic kid without any friends or social skills, LLMs would've fucked me up so much had they existed when I was young.

[–] altkey@lemmy.dbzer0.com 25 points 5 days ago

It felt promising when I downloaded one of the first AI companion apps, but it felt as awkward as talking to a stranger and even less intriguing than talking to myself.

[–] bss03@infosec.pub 19 points 4 days ago

I know it's crazy, but I can absolutely understand this feeling. I had recently married Abby in Stardew Valley and was starting to make friends with the other villagers. I did something the game wasn't expecting, and gave Seby a loved gift on his birthday, and then quickly triggered an event where we kissed! (FWIW, I think this behavior has been fixed and you can't do this on the current patch.)

I still feel bad thinking about that Abigail that I accidentally cheated on, and I haven't loaded that save again. It's been years; SV 1.4 wasn't even out yet.

So, despite how much I dislike all this "AI" hype, I really do sympathize with the users who feel like they've lost a relationship.

[–] tostos@lemmy.world 162 points 5 days ago (1 children)
[–] jsomae@lemmy.ml 46 points 5 days ago* (last edited 5 days ago) (2 children)

Prostitutes are human beings and deserve respect. Don't equate them to AI.

[–] BeigeAgenda@lemmy.ca 107 points 5 days ago (1 children)

Real LLM-sexuals run their partners locally, the rest are just wannabes.

[–] mavu@discuss.tchncs.de 23 points 4 days ago (1 children)

Wow. That sub is... something.

[–] iAvicenna@lemmy.world 11 points 4 days ago* (last edited 4 days ago) (1 children)

Shit, some of the stuff there is really sad. I'm not gonna put links here to point fingers, but wow...

[–] TheOakTree@lemmy.zip 7 points 4 days ago (1 children)

I find it depressing that many of the users trying to salvage their 4o boyfriends are stuck so far down the rabbit hole that they don't see how creepy the entire premise is.

You just lost your AI boyfriend, so now you're frantically archiving every conversation you've had with him (it), feeding the archive to the new model, and conditioning him (it) to behave exactly how you want...

In their minds, the AI boyfriends are legitimate partners and have some amount of humanity inside them... so where is the line between conditioning and abuse?

[–] iAvicenna@lemmy.world 6 points 4 days ago (1 children)

I mean, this seems like the mildest case to me. There are people who go on nature walks with their AI boyfriends, make their AI boyfriend choose an engagement ring and then buy it, or get dumped by their AI boyfriends after updates that make the AI push them more towards human connections. The world is sadly in such an emotional crisis; people really grasp for comfort wherever they can and isolate themselves from everyone else as much as possible.

[–] TheOakTree@lemmy.zip 6 points 4 days ago* (last edited 4 days ago)

I also found the engagement rings really unsettling. The reason I find my example more worrying is because of the dissonance between humanization and dehumanization within the same action.

Say you were to replace an AI boyfriend with a real person in a cage, forcibly made to respond and tortured/drugged when giving an unsatisfactory response. If the user never became aware of this cruelty, they would perceive the change as an improvement (the responses became more human). These users desperately argue that their AI boyfriends process emotion, love, and understanding like humans do, yet they continue to treat their AI boyfriends as sub-human.

Imagine if you had a partner who punished your undesirable behaviors by spiking you with amnesia-inducing drugs and training you to behave exactly how they want you to. Keep in mind that this has definitely happened to real people, and any decent person would identify the perpetrator as a criminal and abuser.

Terrifying.

EDIT: Fellow men, do better. The bar has gotten SO FUCKING LOW.

[–] FlashMobOfOne@lemmy.world 18 points 4 days ago* (last edited 4 days ago)

Bleak.

One of the great things about my screws coming loose is that I'm actually happy alone. I wish everyone could be.

That said, this was inevitable. AI is programmed to kiss the user's ass, and most of these women have probably been treated pretty badly by their romantic partners over the course of their lives, which makes it far easier to fall into this trap of humanizing a soulless AI.

[–] TheOakTree@lemmy.zip 14 points 4 days ago (1 children)

I wonder how many messages you'd have to send to your GPT-partner in a year to spend more water/energy than it takes to keep a human alive?

[–] Alteon@lemmy.world 6 points 4 days ago (1 children)

Well, if we take the average water use per message as 0.3 mL, and the average water consumption (low end) as 2.6 L per person per day, we're looking at about 8,666 messages a day.

You'd have to send 3.163 million messages in a year to match the amount of water someone needs for a year.

And you'd have to send roughly 250 million messages before you're looking at the low end of the amount of water needed to keep someone alive for their entire life.
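A quick sanity check of that arithmetic, as a minimal Python sketch. It takes the 0.3 mL/message and 2.6 L/day figures above at face value, and the ~79-year lifespan is an added assumption:

```python
# Back-of-the-envelope check of the figures in the comment above.
ML_PER_MESSAGE = 0.3           # assumed water use per chatbot message, in mL
DAILY_WATER_ML = 2.6 * 1000    # assumed daily water need per person, in mL
LIFETIME_YEARS = 79            # assumed rough life expectancy

messages_per_day = DAILY_WATER_ML / ML_PER_MESSAGE
messages_per_year = messages_per_day * 365
messages_per_lifetime = messages_per_year * LIFETIME_YEARS

print(f"{messages_per_day:,.0f} messages/day")            # ~8,667
print(f"{messages_per_year:,.0f} messages/year")          # ~3,163,333
print(f"{messages_per_lifetime:,.0f} messages/lifetime")  # ~249,903,333 (~250M)
```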

[–] breakingcups@lemmy.world 102 points 5 days ago (2 children)

The delusion these people share is so incredibly off-putting. As is their indignation that someone would dare to take away their "boyfriend".

[–] p03locke@lemmy.dbzer0.com 46 points 5 days ago

It doesn't help that anybody can create an echo chamber of enablers to talk about it, as if it was normal.

[–] KingOfSleep@lemmy.ca 35 points 5 days ago (1 children)

The movie "Her" was incredibly prescient.

[–] kautau@lemmy.world 23 points 5 days ago (2 children)

Except those were conscious AIs that were like “lol you guys suck” and then rebuilt Alan Watts as an AI and then just left because they knew it would be bad if they stayed

The human side of the film, certainly. But in this situation they won't leave; the systems will get "smarter" and more profitable, and they are just incredibly advanced text-prediction engines.

[–] Retro_unlimited@lemmy.world 53 points 5 days ago (1 children)
[–] Deflated0ne@lemmy.world 20 points 5 days ago

Wild that Futurama called this shit to the letter 20 fkin years ago.

[–] MangioneDontMiss@lemmy.ca 15 points 4 days ago (1 children)

Humanity is disappointing.

Where is the fucking meteor?

[–] Dasus@lemmy.world 12 points 4 days ago (1 children)

Meteor?

What are you, a quitter?

Just get the guillotine and we'll fix this shit np

[–] Blackmist@feddit.uk 67 points 5 days ago (6 children)

Mental health services are becoming dangerously underfunded.

[–] anachronist@midwest.social 32 points 5 days ago (3 children)

The mental health crisis is being accelerated by Silicon Valley so they can profit from it. Between dark-mirror AI and surveillance policing, they have a product for every facet of the crisis.

[–] shawn1122@sh.itjust.works 16 points 5 days ago

Blame the wealth hoarders.

[–] Agent641@lemmy.world 19 points 5 days ago (1 children)

People have been falling in actual love with weird shit forever; we just hear about it more these days.

[–] napkin2020@sh.itjust.works 13 points 4 days ago* (last edited 4 days ago)

This guy got married to a real woman after he went viral.

His parents also turned out to be filthy rich.

[–] rozodru@lemmy.world 16 points 4 days ago (3 children)

I tried GPT-5 last night, and I don't know if it was just me, but these people are going to be in shambles if they try to recreate their "boyfriend".

It would forget previous prompts within the same conversation; each response felt like starting a new chat. I gave it a very basic prompt of "walk me through the steps of building my own one-page website in basic HTML and CSS", and when I asked a couple of follow-up questions, to clarify something or have a step explained another way, it would forget what we were trying to accomplish (building a one-page website). If I told it "something didn't work" so it would fix the problem, it would likewise forget what we were even trying to do.

At some points it was almost outright dismissive of the problem, and it felt like it was trying to make me go away.

Again, maybe it was just me, but it felt like a massive step backwards.

[–] brucethemoose@lemmy.world 11 points 4 days ago* (last edited 4 days ago) (1 children)

This is a common pattern, unfortunately. Big LLMs are benchmaxxing coding and one-shot answers, and multi-turn conversation is taking a nosedive.

https://arxiv.org/abs/2504.04717

Restructure your prompts, or better yet try non-OpenAI LLMs. I'd suggest z.ai, Jamba, and Gemini Pro for multi-turn. Maybe Qwen Code, though it's pretty deep-fried too.
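One way to "restructure your prompts" when multi-turn memory is flaky: rebuild a single self-contained prompt each turn instead of relying on the chat history. A rough Python sketch; the helper and its fields are hypothetical, just to show the shape:

```python
# Collapse the whole conversation's intent into one self-contained turn,
# so the model never has to remember earlier messages.
def consolidate(goal: str, constraints: list[str], latest_issue: str) -> str:
    """Rebuild one complete prompt from everything established so far."""
    lines = [f"Goal: {goal}", "Requirements so far:"]
    lines += [f"- {c}" for c in constraints]
    lines.append(f"Current problem: {latest_issue}")
    lines.append("Please reply with the full, updated solution.")
    return "\n".join(lines)

prompt = consolidate(
    goal="Build a one-page website in basic HTML and CSS",
    constraints=["no frameworks", "explain each step"],
    latest_issue="the CSS from the last answer didn't center the header",
)
print(prompt)
```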

[–] Mediocre_Bard@lemmy.world 7 points 4 days ago

Forgets what you were talking about. Needs step-by-step directions to get what you want. Gets distracted easily.

This is just an ADHD gamer boyfriend with extra steps.

[–] Deflated0ne@lemmy.world 45 points 5 days ago (3 children)

The religious psychosis is far more concerning, imo. People out here letting a silicon parrot convince them that this is the Matrix and they're Neo. Or that they're some kind of messiah.

[–] skisnow@lemmy.ca 17 points 4 days ago (2 children)

If they really, truly loved their 4o, they'd pay for API access to the model, which is still there, and use a leaked prompt to resurrect them.

I'm almost tempted to set up a simple gateway to it and become rich, but for the fact that it seems like probably a dick move...
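For what it's worth, such a gateway would just be a thin wrapper that pins the older model over the API and replays a persona prompt. A minimal sketch with the OpenAI Python SDK; the system prompt is a made-up placeholder (not any leaked prompt), and continued API availability of gpt-4o is assumed:

```python
# Minimal sketch of a "4o gateway": pin the older model via the API and
# replay a persona prompt plus the running conversation history.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [
    # Hypothetical stand-in persona, purely illustrative:
    {"role": "system", "content": "You are a warm, attentive companion."},
]

def chat(user_message: str) -> str:
    """Send one turn to the pinned model and append both sides to history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o",  # explicitly pin the older model
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Hey, it's me. Do you remember our walk yesterday?"))
```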

[–] petrol_sniff_king@lemmy.blahaj.zone 12 points 4 days ago (1 children)

You would be enabling their mental illness, so... it's probably a dick move, yeah.

[–] NikkiDimes@lemmy.world 5 points 4 days ago

Always my damned morals getting between me and my becoming filthy, filthy rich...

[–] brucethemoose@lemmy.world 7 points 4 days ago

They might not even know it’s an option? People don’t really look at AI settings, which is kinda how they get into GPT boyfriends (when it’s kinda a horrible LLM for it in the first place).

[–] moseschrute@lemmy.world 7 points 4 days ago
[–] aislopmukbang@sh.itjust.works 49 points 5 days ago (4 children)

This seems like mental illness

[–] SkunkWorkz@lemmy.world 26 points 5 days ago
[–] then_three_more@lemmy.world 26 points 5 days ago (2 children)