this post was submitted on 26 Jan 2024
6 points (80.0% liked)
Futurology
They may indeed develop linguistic skills at deeper levels, but LLMs are still only playing with words. Imagine a kid who grew up confined in a library with unlimited books but no experience of the real world outside — no experiments with moving about, bouncing balls, eating, smelling, seeing, hearing, or interacting with others, only reading. That kid might write eloquently but have no "common sense" of reality. Training a real AI with physical senses and capabilities would be like raising a kid: messy, hard to automate, and it takes a long time.
I mean, I've never seen a toucan, or to my knowledge even been close to one, but I feel like I understand them pretty well, at least at a basic level. And the birds I have encountered, I've only ever known through smell, sound, touch, sight, and so on.
They do still suck at physical tasks, and probably will until we find a totally different approach to the problem than brute-force gradient descent, but I'm not so sure that makes everything else they (appear to) know useless. I'm not convinced kinetic knowledge is the only kind.