this post was submitted on 22 Nov 2024
You know asking AI is asking to be lied to, yes?
Like, Apple has gone so far as to say it's impossible for current LLMs to reason.
It's incapable of knowing what is true in its current form, according to Apple.
Don't trust AI to actually know anything.
I didn't even trust it much.
It's not that you can trust it a little, it's that you can't trust it ever. It's just saying what it thinks you want to hear, not what is true.
I posted here to see if anyone knew the reason; I couldn't even get a comment on !isitdown@infosec.pub .
I mean, by the definition of "LLM", it's impossible for the models to reason. It's literally a fancy big text generation model, like your keyboard text prediction on ~~steroids~~ GPUs.
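To illustrate what "fancy text prediction" means, here's a toy sketch (corpus and names made up for illustration; a real LLM uses a neural net over tokens, not bigram counts, but the principle of "predict the most likely next word" is the same):

```python
from collections import Counter, defaultdict

# Toy "keyboard prediction": count word bigrams in a tiny corpus, then
# always emit the most frequent follower. No notion of truth anywhere --
# just statistics over what tends to come next.
corpus = "the model predicts the next word the model predicts text".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "model" -- most frequent follower, true or not
```

The model outputs whatever followed most often in its training data, which is exactly why "sounds plausible" and "is true" come apart.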