this post was submitted on 15 Sep 2025
56 points (100.0% liked)

technology

[–] NuraShiny@hexbear.net 11 points 19 hours ago (1 children)

What a surprise!

If you didn't see this coming, I can't even feel sorry for you. It was obviously going to happen.

[–] SunsetFruitbat@lemmygrad.ml 12 points 18 hours ago* (last edited 18 hours ago)

I'm going to feel sorry, considering that the people using it are generally those who don't have anyone else to talk to, or who feel like they can't. They shouldn't be punished by cops showing up and possibly killing them because some people reviewing a flagged conversation decided it was worth calling the pigs. Just because they used ChatGPT to talk about something doesn't mean they should be punished for it.

I think it's worth looking into why someone would go to ChatGPT to talk about their issues. If anything, this shows a heightening contradiction in a lot of societies where mental health just isn't taken seriously, or is punished, plus a few other factors like alienation. In a way it reminds me of how people say to reach out if you're struggling, but if you start talking about suicidal thoughts, the response tends to be "go away" or "go talk to a therapist", who, much like OpenAI, will call the police depending on how specific you are with those suicidal thoughts, or whether they're thoughts of harming others. Mainly because lots of people don't know how to handle such things, or don't take them seriously, or belittle someone going through a mental health crisis. And that's not even getting into other things, like whether someone can afford to see a therapist or mental health professional in the first place.