this post was submitted on 07 Sep 2024
122 points (97.7% liked)
Gaming
Like it or not, one way or another, AI is going to keep playing a larger role in the games industry. I don't think we'll ever have anything worthwhile that generates whole games like he's saying, but it's still going to be used to generate content.
It's already taking over chat moderation in a lot of larger games.
I mean... we've had Daggerfall and Minecraft with procedural generation under the hood, and many others, for a very long time. Why would we need a model that 'learns'?
I'm asking about in-game applications, not the use of LLMs in production.
The obvious application is NPCs you can actually talk with: not just about one or two topics they have a pre-recorded voice line for, but about anything at all. And with AI speech generation on top, they could somewhat realistically talk back to you.
You could also have an LLM working as a kind of DM, coming up with new quests with stories and some content variety. A lot of games have repeatable randomized missions, but these are very formulaic and feel repetitive after you've done a few. There's usually no story, just a basic combat grind. An LLM could come up with actually interesting randomized quests, like a murder mystery where the murderer has a motive and you can legitimately question the suspects about anything they know.
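A murder mystery quest like that could be sketched as structured generation: the model fills in a fixed schema and the game validates it before use. Everything below is hypothetical for illustration; the `llm()` stub stands in for a real model call, and the field names are invented:

```python
import json

def llm(prompt):
    # Stand-in for a real model call (hypothetical). A real backend
    # would return free text that we'd still have to parse and validate.
    return json.dumps({
        "type": "murder_mystery",
        "victim": "Innkeeper Marla",
        "murderer": "Joren",
        "motive": "unpaid gambling debt",
        "suspects": ["Joren", "Barmaid Etta", "Merchant Vance"],
    })

def generate_quest(seed_context):
    prompt = (
        "Write a murder mystery quest as JSON with keys "
        "type, victim, murderer, motive, suspects.\n" + seed_context
    )
    quest = json.loads(llm(prompt))
    # The murderer must be among the questionable suspects,
    # or the mystery is unsolvable.
    assert quest["murderer"] in quest["suspects"]
    return quest

quest = generate_quest("Setting: a small harbour town.")
print(quest["motive"])
```

The interesting part is the validation step: the model gets creative freedom over victim and motive, but the game rejects any output that breaks its own rules.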
I read that sentiment about quests a lot, and I have some sympathy for it myself, but I find it questionable.
Formulaic is what makes a quest work with the system. At the level of raw code, a quest is a list of triggers for events and responses, all nailed to the systems and world that already exist. It needs to place an NPC with a question mark, give them a fetch quest that updates your map/journal when you accept it, and correctly hand out a reward when the conditions are met. That's on a basic level.
For an LLM to produce such complex and strict manipulations, it would have to be narrowly guided into generating a working fetch quest without a single mistake. We'd basically kill off most of what it's good at. And we'd have to build the pipelines for it to lay out more complex quests ourselves. At that point it's no easier than building a sophisticated procedural generation engine for quests, to the same result.
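To make concrete how narrow that target format is, here's a rough sketch (all names invented, not from any real engine): the model would only get to write the free-form flavour text, while every mechanical field has to be validated against things that already exist in the world.

```python
from dataclasses import dataclass

# Things that already exist in the game world; a generator
# may only reference these, never invent new ones.
KNOWN_NPCS = {"Hrisskar", "Fargoth"}
KNOWN_ITEMS = {"limeware_platter", "engraved_ring"}

@dataclass
class FetchQuest:
    giver: str        # the NPC with the question mark
    item: str         # what to fetch
    reward_gold: int  # paid out when conditions are met
    intro_text: str   # the only truly free-form field

def validate(q: FetchQuest) -> bool:
    return (q.giver in KNOWN_NPCS
            and q.item in KNOWN_ITEMS
            and q.reward_gold > 0)

q = FetchQuest("Fargoth", "engraved_ring", 100, "I lost my ring...")
print(validate(q))  # True
```

One free-text field out of four: that's roughly the ratio of "what the LLM is good at" to "what the quest system actually needs," which is the point being made above.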
Furthermore, it's a pain in the ass to create enough training material to teach it the world's lore alone, so that it won't occasionally say things it doesn't know and would actually speak, because to achieve the level of ChatGPT's responses they fed it petabytes of data. A model trained on LotR alone, a few megabytes of text, won't even be able to greet you back, and making an existing model speak in character about a world it's never seen is, well, complicated. In your free time, try making Bard speak like a dunmer fisherman from Huul who knows what's going on around him at every level of the worldbuilding young Bethesda put in. To get it right, you'd end up printing a whole book into its prompt window, and it would still spit nonsense.
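The prompt-window problem can be sketched as a toy retrieval step: since the whole lore book won't fit, you pick only the snippets relevant to the player's question and stuff those in. All the lore lines and names below are made up; a real system would use embeddings rather than crude keyword overlap.

```python
LORE = [
    "Huul is a small fishing village on the northern coast of Vvardenfell.",
    "Dunmer are the grey-skinned elves native to Morrowind.",
    "Kwama eggs are a staple food mined from egg caves.",
]

CONTEXT_BUDGET = 2  # how many snippets fit in the prompt

def relevant_lore(question, budget=CONTEXT_BUDGET):
    # Crude keyword overlap as a stand-in for a real retriever.
    words = set(question.lower().split())
    scored = sorted(LORE,
                    key=lambda s: len(words & set(s.lower().split())),
                    reverse=True)
    return scored[:budget]

def build_prompt(question):
    lore = "\n".join(relevant_lore(question))
    return (f"You are a dunmer fisherman from Huul.\n"
            f"Known facts:\n{lore}\n"
            f"Player asks: {question}\nAnswer in character:")

print(build_prompt("What do dunmer eat in Huul?"))
```

Even with retrieval, the model only "knows" what made it into those few snippets, which is exactly the in-character nonsense risk described above.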
Instead, I see LLMs being injected into the places they're good at, and voicing NPCs' lines, which you mentioned, is one of the things they can excel at. Quick drafts of text and quests that you'd then put into development? Okay. But making them communicate with existing systems is putting a triangle peg in a square hole, imho.
For procedural generation at its finest, you can read the saga of Boatmurdered in Dwarf Fortress: https://lparchive.org/Dwarf-Fortress-Boatmurdered/Introduction/
I don't have time right now to write a full proper response, but for quests I imagine we'd start out still using traditional random generation for the bones of the quest, and use an LLM to create the narrative and NPC dialogue for it. Games like Shadows of Doubt already do a good job with randomly generated objectives, but there's no motive for the crimes. Just taking the already existing gameplay and using an LLM to generate a reason why the crime happened would help the atmosphere a lot. Also, you can question suspects and sometimes solve the case by them telling you they saw [person] at [location] at [time], but I think an LLM could provide actual witness interrogation where you have to ask the right questions, or try to catch them in a lie.
As far as the mechanics for LLMs to actually provide dialogue, I expect some third-party AI startups to work on it: some kind of system with base language packages that provide general knowledge and dialogue ability, plus a collection of smaller models/LoRAs to specialize them. Finally, you'd have behind-the-scenes prompting that tells the NPC who their character is, any character- or quest-specific knowledge they have, their disposition toward the player, etc. I don't expect every game company to come up with this on their own; I suspect a few individual companies will offer a prebuilt solution for it starting out, before it eventually gets built into the larger game engines.
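That behind-the-scenes prompting layer might look something like this sketch: a system prompt the player never sees, assembled per NPC from persona, disposition, and whatever quest facts they're allowed to know. Every name and threshold here is invented for illustration.

```python
def npc_system_prompt(name, persona, disposition, quest_facts):
    # Hidden instructions assembled per NPC; the thresholds and
    # field names are arbitrary examples, not any real product's API.
    facts = "\n".join(f"- {f}" for f in quest_facts)
    return (
        f"You are {name}, {persona}.\n"
        f"Your disposition toward the player is {disposition}/100; "
        f"be curt below 30, warm above 70.\n"
        f"You know only these quest facts and nothing else:\n{facts}"
    )

prompt = npc_system_prompt(
    "Etta", "a barmaid at the Drowned Rat tavern",
    disposition=25,
    quest_facts=["You saw Joren near the stables at midnight."],
)
print(prompt)
```

The "and nothing else" clause is doing the heavy lifting: the whole design hinges on keeping the general-purpose model confined to the facts the quest system hands it.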
I forgot what game it was for, but some guy implemented an actual conversation system with in-game outcomes using AI.
I could also see more dynamic questing systems, character behaviors, even crafting systems built around the tech. But that requires investment and effort to make the tech work, which isn't exactly why studios are investing in AI in the first place.
There are a small handful of good uses. Content moderation and automatic translation of voice chat are examples.
Mostly, though, I think AI will be used to generate content for the game, not during the game.