It understands relationships between concepts, which is something that can be learned from reading text even without firsthand experience of the world. "Tariffs" is associated with "recession" and "recession" is associated with "bad".
Sort of. It understands that "0.0023" is associated with "0.0037" and "0.0037" is associated with "0.15532".
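For anyone curious what that numeric "association" actually looks like, here's a rough sketch. The words and vectors below are made up for illustration; real models learn far longer vectors from co-occurrence patterns in their training text, but the mechanics are the same: relatedness is just geometry over lists of floats.

```python
# Toy illustration: "association" between concepts is similarity between vectors of numbers.
# These embeddings are invented for the example; real ones have thousands of dimensions.
import math

embeddings = {
    "tariffs":   [0.82, 0.10, 0.55],
    "recession": [0.79, 0.15, 0.60],
    "bad":       [0.70, 0.20, 0.65],
    "picnic":    [0.05, 0.95, 0.10],
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

print(cosine(embeddings["tariffs"], embeddings["recession"]))  # high, ~0.997
print(cosine(embeddings["tariffs"], embeddings["picnic"]))     # low,  ~0.20
```

Whether you call that "understanding relationships" or "just numbers next to numbers" is pretty much the disagreement in this thread.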
Yes, but I don't see that as particularly significant in this context. Information, including the knowledge of economic theory stored in a human brain, can be represented digitally. The fact that a present-day AI presumably can't actually experience what it's like to be unhappy as prices rise and incomes fall doesn't affect its ability to reason about economics.
We should probably just agree to disagree. I think the strides made in AI are at the very least impressive and have made some things (text-to-speech, for example) better - if not enormously then at least noticeably.
But there isn’t a true analog to be had between calculated probabilities and conscious thought. The former is a mimic of varying competence, but has no logic inherent to it. It requires human maintenance, and its only path to “growth”, if we want to call it that, is a black box of probabilities it calculates at incredible speed.
It’s a super-magic-8-ball that we choose to pretend has agency of some sort. But it does not.
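To make the "calculated probabilities" part concrete, here's a minimal sketch of the next-token step, with an invented vocabulary and invented scores: the model assigns a score to every candidate token, softmax turns the scores into probabilities, and one token is sampled.

```python
# Minimal sketch of next-token prediction. Vocabulary and logits are made up for illustration.
import math
import random

vocab  = ["bad", "good", "uncertain", "banana"]
logits = [3.1,   0.4,    1.2,         -2.0]   # raw scores the network would output

def softmax(xs):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
for tok, p in zip(vocab, probs):
    print(f"{tok:>10}: {p:.3f}")

# Sample the next token in proportion to its probability -- the "magic 8-ball" step.
print("next token:", random.choices(vocab, weights=probs, k=1)[0])
```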
Nailed it. ChatGPT gave a pretty balanced definition, but at least it popped out "bad".
And if you put in Smoot-Hawley:
These people responding think you think AI is thinking. See, because they're smarter than you! This place fucking annoys the hell out of me sometimes, just like old reddit. At least we're not overrun with bots and fascists.