this post was submitted on 23 Sep 2023
18 points (90.9% liked)

Futurology

[–] pennomi@lemmy.world 6 points 1 year ago (2 children)

Weird, because you can already run some LLMs locally, even on a phone. That’s not a very impressive claim.
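
For example, here's a minimal sketch of running a small model locally with the Hugging Face transformers library (distilgpt2 is just a placeholder for any model small enough for modest hardware):

```python
# Minimal sketch: run a small text-generation model entirely on local hardware.
# "distilgpt2" is only an example of a model small enough for a laptop or phone-class device.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "The future of on-device AI is",
    max_new_tokens=30,       # keep generation short for a quick local test
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```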

[–] DScratch@sh.itjust.works 6 points 1 year ago (1 child)

You can add huge numbers! Want to know what a million plus a million is? Intel has the answer!

[–] morrowind@lemmy.ml 1 points 1 year ago

Not very well, though. The idea is to make them more efficient.