this post was submitted on 16 Sep 2025
173 points (97.8% liked)
Games
The problem is that hallucination is baked into how LLMs hold a conversation, but it's destructive in a game environment. If an NPC tells you something false, the player will assume they just couldn't find the secret, or that the game is bugged, rather than that an AI just made some shit up.
No amount of training removes hallucination, because it's part of the generation process. All the model does is take your question and reverse-engineer what an answer to it looks like based on the words it knows and its data set. It doesn't have any "knowledge". Not to mention that the training data would have to be different for each NPC to represent different knowledge sets, backgrounds, upbringings, ideologies, experiences, and cultures. And then there's the issue of having to give the model broad background knowledge of the setting without it inventing new stuff or revealing hidden lore.
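For what it's worth, the usual workaround people reach for here isn't per-NPC training but per-NPC prompt scoping: filter a shared lore database down to what a given character is allowed to know, then tell the model to answer only from those facts. A minimal sketch of that idea (all the names, lore entries, and tags below are invented for illustration — and note this only narrows the prompt, it doesn't actually stop the model from hallucinating):

```python
# Hypothetical sketch: scope a shared lore database per NPC instead of
# retraining. Every entry, tag, and character name here is made up.
LORE = {
    "secret_passage": {"text": "A hidden door lies behind the tavern hearth.",
                       "tags": {"hidden"}},
    "harvest_festival": {"text": "The festival starts at the next full moon.",
                         "tags": {"common"}},
    "kings_debt": {"text": "The king owes the guild a fortune.",
                   "tags": {"court"}},
}

def npc_knowledge(npc_tags):
    """Return only the lore entries whose tags overlap this NPC's tags."""
    return [e["text"] for e in LORE.values() if e["tags"] & npc_tags]

def build_system_prompt(name, npc_tags):
    """Build a constrained system prompt from the NPC's allowed facts."""
    facts = npc_knowledge(npc_tags)
    return (
        f"You are {name}. Answer ONLY from the facts below. "
        "If asked about anything else, say you don't know.\n- "
        + "\n- ".join(facts)
    )

# A farmer knows the common gossip but not court secrets or hidden lore.
print(build_system_prompt("Alda the farmer", {"common"}))
```

The catch is exactly the one above: the "answer ONLY from the facts below" instruction is a polite request, not a guarantee, so the model can still make something up when a player pushes past the listed facts.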
That said, I wouldn't be surprised if we see this attempted, but I expect it to go horribly wrong.