this post was submitted on 18 Sep 2025
68 points (100.0% liked)


"In short, the OpenAI paper inadvertently highlights an uncomfortable truth," Xing concluded. "The business incentives driving consumer AI development remain fundamentally misaligned with reducing hallucinations."

[–] prole@hexbear.net 29 points 1 day ago (6 children)

I am begging someone to make one that doesn't confidently answer everything and instead says it doesn't know, or gives multiple possibilities, confidence levels, anything other than blind confidence. I use them to write code sometimes and it's wild how often it goes "you're totally right!" when I am, in fact, wrong. I rarely turn to an LLM for something I don't already know the answer to because of this shit.

Once again capitalism ruins everything around me
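
For what it's worth, per-token probabilities do exist inside these models; what consumer frontends lack is any incentive to surface them. A rough sketch of the behaviour being asked for, using made-up per-token log-probabilities and an arbitrary cutoff (token probability is only a crude proxy for factual confidence, not a real fix):

```python
import math

# Hypothetical per-token log-probabilities returned alongside an answer
# (toy numbers; most chat frontends simply don't expose these).
answer_tokens = [("The", -0.02), ("bug", -0.15), ("is", -0.05),
                 ("in", -0.08), ("line", -1.9), ("42", -2.6)]

CONFIDENCE_FLOOR = 0.25  # arbitrary threshold chosen for this sketch

def answer_or_abstain(tokens, floor=CONFIDENCE_FLOOR):
    """Return the answer only if every token clears a probability floor;
    otherwise admit uncertainty instead of bluffing."""
    worst = min(math.exp(logp) for _, logp in tokens)  # weakest token's probability
    if worst < floor:
        return f"Not sure (weakest token prob {worst:.2f}); treat this as a guess."
    return " ".join(tok for tok, _ in tokens)

print(answer_or_abstain(answer_tokens))
```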

[–] fox@hexbear.net 28 points 1 day ago (1 children)

Yeah, they can't do that because they're unable to discern truth from falsehood. All that's happening is matrix algebra on vectors representing how statistically likely an output is for any given input; hallucinations are produced by the exact same process as useful answers.
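
A toy sketch of that point (invented numbers, not a real model): generating the next token is just a softmax over scores followed by a sample from the resulting distribution, and nothing in that loop checks truth.

```python
import numpy as np

# Toy next-token step: the model's output is a probability distribution
# over the vocabulary, computed from logits. Nothing in this math knows
# whether any candidate token is "true".
vocab = ["Paris", "Lyon", "Berlin", "in", "the"]
logits = np.array([4.1, 2.3, 2.0, 0.5, 0.2])  # made-up scores from the network

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

probs = softmax(logits)
rng = np.random.default_rng(0)
next_token = rng.choice(vocab, p=probs)

print(dict(zip(vocab, probs.round(3))))
print("sampled:", next_token)
# A factual answer and a hallucination both come out of this exact
# sample-from-the-distribution step; only the numbers differ.
```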

[–] Bay_of_Piggies@hexbear.net 12 points 1 day ago

They're idealism machines. They have zero interaction with the real world.
