this post was submitted on 21 May 2025
960 points (97.6% liked)
Technology
That's kind of the main problem: there is no indication that it will. I know one thing: the way current LLMs work, the chances that the problems of "lying" and "hallucinations" will ever be solved are slim to none. There could be some mechanism that works in tandem with the bullshit-generator machine to keep it in check, but it doesn't exist yet.
So most likely either we collectively learn this fact and stop relying on this bullshit, which means there's a generation of kids who essentially skipped a learning phase, or we don't learn it, and we end up with a society of mindless zombies fed lies and random bullshit on a second-to-second basis.
Both cases are bleak, but the second one is nightmarish.
but we already have Fox News
It's all based on prediction; it has no concept of the physical world or any ability to verify facts. Its goal is to please the user, not to parse the entirety of human knowledge and produce an insightful, complete thought.
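To make that concrete: here's a toy sketch (a trivial bigram model, nothing like a real LLM's architecture, with a made-up three-sentence corpus) showing why pure next-token prediction optimizes for what's *statistically common*, not what's *true*:

```python
from collections import Counter, defaultdict

# Hypothetical training text: the false claim appears more often than the true one.
corpus = (
    "the moon is made of rock . "
    "the moon is made of cheese . "
    "the moon is made of cheese . "
).split()

# Count which word follows which.
followers = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    followers[a][b] += 1

def next_word(word):
    # Pick the most frequent follower -- plausibility, not truth.
    return followers[word].most_common(1)[0][0]

# Generate from a prompt.
out = ["the"]
for _ in range(6):
    out.append(next_word(out[-1]))
print(" ".join(out))  # confidently emits the more frequent falsehood
```

The model picks "cheese" over "rock" purely because it saw it twice as often; nothing in the prediction step can check the answer against reality. Real LLMs are vastly more sophisticated, but the objective is the same shape: likely continuations, not verified facts.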