this post was submitted on 28 Aug 2025
611 points (99.8% liked)

[–] OsrsNeedsF2P@lemmy.ml 7 points 1 week ago (3 children)

Nah, not for suicide:

But in the post warning users that the company will call the authorities if they seem like they're going to hurt someone, OpenAI also acknowledged that it is "currently not referring self-harm cases to law enforcement to respect people’s privacy given the uniquely private nature of ChatGPT interactions."

[–] synae@lemmy.sdf.org 2 points 1 week ago

Oh, so only for discussing topics the authorities consider verboten

[–] turtlesareneat@discuss.online 2 points 1 week ago* (last edited 1 week ago)

Oh thank god I was afraid some more kids might not get talked into suicide by a fucking server

[–] Soup@lemmy.world 2 points 1 week ago

Considering how the US handles those cases, that may actually be a broken-clock good thing. If they sent the cops to a suicidal person’s house, said cops would probably kill them themselves.