Well, there's no understanding or reasoning behind ChatGPT, so ...
this post was submitted on 12 Mar 2024
Futurology
Is it explaining its reasoning or fabricating a plausible justification for the outputs? We'll never know
Depends a bit on perspective and nuance. GPT-4 pretty much always returns text relevant to the prompt: the neural net sees A and knows B comes next. That's a form of understanding. Not understanding would mean it's incapable of seeing A and just outputs something irrelevant.
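The "sees A, knows B comes next" idea is just next-token prediction. A toy sketch of the same principle, using bigram counts instead of a neural net (the corpus and function names here are made up for illustration):

```python
from collections import Counter, defaultdict

# Toy next-token predictor: count which word follows which in a tiny corpus.
corpus = "the cat sat on the mat the cat ran".split()

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def predict(word):
    """Return the word most often seen right after `word`."""
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" follows "the" twice, "mat" only once
```

A real model replaces the count table with a learned probability distribution over tokens, but the input-to-next-output mapping is the same shape.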
For reasoning, which I believe is actually "step by step" logic, it needs a good handholding prompt, but then it can consistently produce grade-school-level solutions to logical problems.
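A minimal sketch of the kind of handholding prompt meant here (often called chain-of-thought prompting). The function is hypothetical; the string it builds is what matters, and any chat-completion API could consume it:

```python
def build_prompt(problem: str) -> str:
    """Wrap a problem in a 'step by step' handholding instruction
    (hypothetical helper, for illustration only)."""
    return (
        "Solve the problem below. Think step by step: "
        "restate what is given, work through each step in order, "
        "then state the final answer on its own line.\n\n"
        f"Problem: {problem}"
    )

print(build_prompt("A train leaves at 3pm and travels for 2 hours. When does it arrive?"))
```

Without the instruction the model tends to jump straight to an answer; with it, the intermediate steps appear in the output, which is what makes the grade-school-level logic consistent.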
Neither is what humans would call true understanding and true reasoning, but it's way too early to judge AI by human standards.
This timeline is doomed.