this post was submitted on 16 Sep 2025
176 points (97.8% liked)

Games


Video game news oriented community. No, NanoUFO is not a bot :)

Posts.

  1. News-oriented content (general reviews, previews, or retrospectives allowed).
  2. Broad discussion posts (preferably not only about a specific game).
  3. No humor/memes, etc.
  4. No affiliate links.
  5. No advertising.
  6. No clickbait or editorialized, sensational titles. State the game in question in the title. No all caps.
  7. No self-promotion.
  8. No duplicate posts; the newer post will be deleted unless it has more discussion than the older one.
  9. No politics.

Comments.

  1. No personal attacks.
  2. Obey instance rules.
  3. No low-effort comments (one or two words, emoji, etc.).
  4. Please use spoiler tags for spoilers.

My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.

Kolanaki@pawb.social 6 points 1 day ago (last edited 1 day ago)

If they used generative AI to actually generate dialogue on the spot, it could be pretty dope, assuming it were trained properly so it stays consistent with the game's lore and you could actually have a productive conversation with the NPCs without being constantly gaslit.

But that's not how any major dev has planned to use it. They use it to cheap out on art assets and on writing the pre-scripted shit.

As it stands, even the few independent games attempting what I described in my first paragraph are pretty garbage, because they can't remain consistent enough in their own logic to make the games actually playable.

The problem is that hallucinations are part of how LLMs hold a conversation, but they're destructive in a game environment. If an NPC tells you something false, the player will assume they just couldn't find the secret, or that the game is bugged, rather than that an AI made some shit up.

No amount of training removes hallucination, because it's part of the generation process. All the model does is take your question and reverse-engineer what an answer to it looks like, based on the words it knows and its data set. It doesn't have any "knowledge." Not to mention that the training data would have to be different for each NPC to represent different knowledge sets, backgrounds, upbringings, ideologies, experiences, and cultures. And then there's the issue of having to give it broad background knowledge of the setting without it inventing new material or revealing hidden lore.
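For what it's worth, the per-NPC knowledge problem is usually tackled with prompting rather than per-character training. A minimal sketch of the idea, with all names, lore, and the prompt format invented purely for illustration (no real game or LLM API is assumed):

```python
# Hypothetical sketch: give each NPC a distinct knowledge set without
# retraining, by composing a per-NPC system prompt from shared world
# lore plus character-specific facts, while filtering out hidden lore.
# Everything here (names, lore strings, format) is made up for illustration.

WORLD_LORE = [
    "The kingdom of Eldren fell a century ago.",
    "Magic is outlawed in the capital.",
]

# Facts the designers never want an NPC to reveal, no matter the prompt.
HIDDEN_LORE = {"The king's advisor is secretly a necromancer."}

def build_npc_prompt(name: str, background: str, known_facts: list[str]) -> str:
    """Compose a system prompt that scopes the NPC to its own knowledge."""
    # Strip anything flagged as hidden lore before it ever reaches the model.
    visible = [f for f in known_facts if f not in HIDDEN_LORE]
    lines = [
        f"You are {name}, {background}.",
        "Only state facts from the list below. If asked about anything else,",
        "say you do not know. Never invent new places, people, or events.",
        "Known facts:",
    ]
    lines += [f"- {fact}" for fact in WORLD_LORE + visible]
    return "\n".join(lines)

prompt = build_npc_prompt(
    "Mira the innkeeper",
    "a cautious innkeeper who distrusts the capital",
    [
        "Travelers from the capital pay in marked coins.",
        "The king's advisor is secretly a necromancer.",  # hidden: filtered out
    ],
)
print(prompt)
```

Of course, as the comment above argues, a prompt like this only constrains what the model is *told*; it can't guarantee the model won't still hallucinate around it.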

That said, I wouldn't be surprised if we see this attempted, but I expect it to go horribly wrong.