this post was submitted on 01 Oct 2025
943 points (97.5% liked)

Technology

[–] mcv@lemmy.zip 13 points 10 hours ago (1 children)

The main thing AI has shown is how much bullshit we subconsciously filter out every day without much effort. (Although clearly some people struggle a lot more with distinguishing bullshit from fact, considering how much politicized nonsense has taken hold.)

[–] Adderbox76@lemmy.ca 2 points 1 hour ago* (last edited 1 hour ago)

Exactly that.

If I were to google how to get gum out of my child's hair and be directed to that same reddit post, I'd read through it and be pretty sure which replies were jokes and which were serious; we make such distinctions, as you say, every day without much effort.

LLMs simply don't have that ability, and the number of average people who just don't get that is mind-boggling to me.

I also find it weirdly dystopian that, if you sum that up, it kind of sounds like in order for an LLM to take the next step towards A.I., it needs a sense of humour. It needs the ability to weed out whether the information it's drawing from is serious or just random jackasses on the internet.

Which is turning it into a very very Star Trek problem.