this post was submitted on 10 Oct 2025
719 points (99.0% liked)
Funny
They're confusing knowledge with function calling. If you ask "what's my nearest thingy?", the model emits a function call into the app, which grabs your GPS coordinates, finds a "thingy" near them, and posts the result back to the LLM so it can serve you the answer in human language. The LLM doesn't know anything that isn't in the training / fine-tuning data, the context data, or function call results.
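To make the distinction concrete, here's a minimal sketch of that round trip. Everything here is illustrative (the function name, the coordinates, the result), not any real assistant's API; the point is that the location lookup lives in app code, and the model only sees whatever text gets posted back into its context.

```python
import json

# Hypothetical tool the app exposes to the model; name and fields are
# made up for illustration.
def find_nearest_thingy(lat, lon):
    # A real app would take the phone's GPS fix and query a geo service.
    return {"name": "Thingy Mart", "distance_km": 1.2}

TOOLS = {"find_nearest_thingy": find_nearest_thingy}

def handle_tool_call(call):
    """Run the function the model asked for and return the result as a
    string, which the app appends to the conversation context."""
    fn = TOOLS[call["name"]]
    result = fn(**call["arguments"])
    return json.dumps(result)

# The model never "knows" your location; it just emits a structured
# request like this, and the app does the knowing.
call = {"name": "find_nearest_thingy",
        "arguments": {"lat": 51.5, "lon": -0.12}}
print(handle_tool_call(call))
```

So "knowledge" ends at the training data and the context window; everything else arrives through calls like the one above.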
For those who believe AGI is right around the corner, this is just splitting hairs. The end result for an AGI would be that it can discover your location easily by making the function call, while still adamantly claiming it doesn't "know" it.
"I don't know. I really don't know, but THAT guy knows, and he'll tell me if I ask. You didn't ask if I can find out, only whether I know."
That's not AGI. It's just a series of APIs. Something we've had for ages.
Well, if we consider AGI to be mimicking the human brain, then AGI is just a clever, self-forming series of APIs.