this post was submitted on 17 May 2024
496 points (94.8% liked)
Technology
In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn't downplay what's happening because it's generally accepted that having a source of information hallucinate is bad.
I feel like the alternatives would downplay the problem. A "glitch" is generic and common, "lying" is just inaccurate since that implies intent to deceive, and just being "wrong" doesn't get across how elaborately wrong an LLM can be.
Hallucination fits pretty well and is also pretty evocative. I doubt that AI promoters want to effectively call their product schizophrenic, which is what most people think of when they hear "hallucination".
Ultimately, all the sciences are full of analogous names chosen to make conversations easier; it's not always marketing. It's no different from physicists saying particles have "spin" or "color", or that spacetime is a "fabric", or [insert entirety of String theory]...
After thinking about it more, I think the main issue I have with it is that it sort of anthropomorphises the AI, which is more of an issue in applications where you’re trying to convince the consumer that the product is actually intelligent. (Edit: in the human sense of intelligence rather than what we’ve seen associated with technology in the past.)
You may be right that people could have a negative view of the word “hallucination”. I don’t personally think of schizophrenia, but I don’t know what the majority think of when they hear the word.
You could invent a new word, but that doesn't help people understand the problem.
You're looking for an existing word that describes producing unintentionally incorrect output but is totally unrelated to humans. I suspect that word doesn't exist. Every word for thinking gets anthropomorphized.