this post was submitted on 29 Jun 2025
507 points (95.7% liked)

Technology


A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

(page 2) 50 comments
[–] MoogleMaestro@lemmy.zip 85 points 3 days ago (4 children)

It's stupid as hell to share any personal information with a company that is interested in spying on you and feeding your data to the nearest advertiser they can find.

Like seriously -- are people using their brains or what?

[–] roofuskit@lemmy.world 59 points 3 days ago

Donald Trump was ELECTED TWICE. How is the stupidity of humanity not apparent?

[–] IAmNorRealTakeYourMeds@lemmy.world 33 points 3 days ago (2 children)

They need therapy; obviously they need help, and blaming them for not doing the most reasonable thing, which might be unaffordable, is even stupider.

Blame predatory AI instead. OpenAI could, in a single afternoon, make it so ChatGPT recommends or even helps you find a local therapist. Instead, it enables this for profit.

[–] Lost_My_Mind@lemmy.world 37 points 3 days ago (2 children)

are people using their brains or what?

What? No. Seriously, are you new here? And by here I mean Earth.

I see idiots all around me. Everybody is only interested in advancing themselves, but if we advanced the group, it would be better for EVERYBODY.

But we as a species are too stupid to build a society that benefits everybody.

So no. No brain use here.

[–] vane@lemmy.world 29 points 3 days ago* (last edited 3 days ago) (5 children)

Maybe because it's cheaper, easier, and you're not judged by another person.

[–] HugeNerd@lemmy.ca 2 points 2 days ago

Buy more. Buy more now.

[–] chunes@lemmy.world 18 points 3 days ago (1 children)

And it's awesome. Men aren't allowed by others to show weakness. AI therapy genuinely helps a lot.

[–] prof@infosec.pub 18 points 3 days ago* (last edited 3 days ago)

Or it gets them into a reinforcing feedback loop, since AI hardly ever tries to contradict you.

But yeah. At least they're opening up to someone/something.

[–] mycodesucks@lemmy.world 39 points 3 days ago (9 children)

Look, if you can afford therapy, really, fantastic for you. But the fact is, it's an extremely expensive luxury, even at poor quality, and sharing or unloading your mental strain with your friends or family, particularly when it's ongoing, is extremely taxing on relationships. Sure, your friends want to be there for you when they can, but it can put a major strain on them depending on how much support you need. If someone can alleviate that pressure and stress even a little bit by talking to a machine, it's in extremely poor taste and shortsighted to shame them for it.

Yes, they're willfully giving up their privacy, and yes, it's awful that they have to do that. But this isn't like sharing memes... in the hierarchy of needs, getting those pent-up feelings out is important enough to possibly be worth the trade-off. Is it ideal? Absolutely not. Would it be better if these systems were anonymized? Absolutely.

But humans are natural anthropomorphizers. They develop attachments and build relationships with inanimate objects all the time. And a really good therapist is mostly a mirror for you to work through things yourself anyway, guiding your thoughts toward better patterns of thinking. There's no reason the machine can't do that, and while it's not as good as a human, it's a HUGE improvement on average over nothing at all.

[–] poopkins@lemmy.world 11 points 3 days ago (7 children)

Funny, I was just reading comments in another thread where people with mental health problems were proclaiming how terrific it is. Especially concerning is how they had found value in the recommendations LLMs make and were "trying those out." One of the commenters described themselves as "neuro diverse" and was acting upon "advice" from generated LLM responses.

And for something like depression, that is a deeply bad idea. I feel somewhat qualified to weigh in as somebody who has struggled severely with depression and managed to get through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition, and understanding it requires deliberate probing, not stringing together words until they form sentences that mimic human interaction.

Let's not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions or work with other medical professionals. Another thing people often forget is that LLMs have maximum token lengths and cannot, by definition, keep a detailed "memory" of everything that's been discussed.

It's effectively self-treatment with more steps.
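The context-limit point above can be sketched as follows. This is a minimal illustration, not any vendor's actual implementation: the function name and the crude one-token-per-word counter are hypothetical stand-ins (real systems use proper tokenizers), but the mechanism is the same — a chat model only ever sees the messages that fit in its fixed context window, so older turns silently fall away.

```python
def fit_to_context(messages, max_tokens, count_tokens):
    """Keep the most recent messages whose combined token count fits
    within max_tokens; everything older is discarded."""
    kept, total = [], 0
    for msg in reversed(messages):      # walk from newest to oldest
        cost = count_tokens(msg)
        if total + cost > max_tokens:
            break                       # older turns no longer fit
        kept.append(msg)
        total += cost
    return list(reversed(kept))         # restore chronological order

# Toy tokenizer: ~1 token per whitespace-separated word.
toy_count = lambda m: len(m.split())

history = [
    "I have been feeling low for months.",  # oldest: 7 "tokens"
    "Tell me more about your sleep.",       # 6 "tokens"
    "I barely sleep four hours a night.",   # newest: 7 "tokens"
]
window = fit_to_context(history, max_tokens=13, count_tokens=toy_count)
# Only the two newest messages fit the 13-token budget;
# the original disclosure about months of low mood is gone.
```

With a 13-token budget, the model would answer the latest message without ever "remembering" the first one — which is the commenter's point about why an LLM cannot keep a detailed memory of a long-running conversation.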
