this post was submitted on 27 Sep 2025
643 points (99.7% liked)
RPGMemes
13802 readers
Humor, jokes, memes about TTRPGs
founded 2 years ago
ChatGPT is a text generator. Any "information" it delivers is only correct by chance, if at all. Without the knowledge to check the answers yourself, you can't possibly tell when you're falling for a random error.
In more depth: ChatGPT has learned how likely certain word patterns are to occur in combination. Something like "1+1=" will most often be followed by "2". ChatGPT has no concept of truth or of mathematical relationships, so it doesn't "understand" why this combination occurs; it just imitates it.
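To make that concrete, here's a deliberately tiny sketch of the idea (a bigram counter over a made-up corpus, nothing like a real transformer): the "answer" to "=" is just whichever token followed it most often in training.

```python
from collections import Counter, defaultdict

# Toy "training data": the model only ever sees token sequences.
corpus = "1 + 1 = 2 . 1 + 1 = 2 . 1 + 1 = 3 .".split()

# Count which token follows which token (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(token):
    # "Prediction" is just the most frequent continuation seen so far.
    return following[token].most_common(1)[0][0]

print(predict("="))  # "2" wins only because it appeared more often than "3"
```

There's no arithmetic anywhere in there. If the corpus had contained "1 + 1 = 3" more often, the model would cheerfully "answer" 3.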
You can actually see the slight randomisation in the inconsistent way 5.18 is elsewhere rounded to 5.2. If this were correct – I'm not qualified to comment on that – and written by a human, you'd expect them to be more consistent with the precision. It's likely that ChatGPT learned these number-words from different sources using different precision and randomly picks which one to go with for each new line.
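That per-line randomness can be sketched too. The probabilities below are invented for illustration; the point is only that sampling from a distribution (rather than always taking the top choice) makes each generated line independently random, so precision drifts between lines.

```python
import random

# Hypothetical learned distribution over continuations of the same value:
# some training sources kept two decimals, some rounded to one.
continuations = [("5.18", 0.6), ("5.2", 0.4)]

def sample():
    # Standard inverse-CDF sampling from the distribution above.
    r = random.random()
    cumulative = 0.0
    for token, p in continuations:
        cumulative += p
        if r < cumulative:
            return token
    return token  # guard against floating-point rounding

random.seed(0)  # fixed seed so the run is reproducible
print([sample() for _ in range(5)])  # a mix of "5.18" and "5.2"
```

Real models use temperature-scaled softmax over thousands of tokens, but the consequence is the same: nothing ties line 3's rounding to line 2's.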
So what happens when it decides a word combination seems plausible, but it doesn't actually make sense? Well, for example, lawyers get slapped with fines for ChatGPT citing case law that doesn't exist. The citations sounded valid, because that's what ChatGPT is made for: generating plausible word combinations. It doesn't know what a legal case is, or how citing one imposes critical restrictions on what's actually valid in this context.
There's an open-access paper on the proclivity of LLMs to bullshit, available for download from Springer. The short version: an LLM is entirely indifferent to truth. It doesn't and can't care, or even know, whether the figures it spits out are correct.
Use it to generate text, if you must, but don't use it to generate facts. It's not looking them up, it's not researching, it's not doing the math – it's making them up to sound right.