this post was submitted on 23 Sep 2023
18 points (90.9% liked)

Futurology

all 5 comments
[–] pennomi@lemmy.world 6 points 1 year ago (2 children)

Weird, because you can already run some LLMs locally, even on your phone. That’s not a very impressive claim.
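(For context, a minimal sketch of what running an LLM locally can look like, assuming the llama-cpp-python bindings and a small quantized GGUF model already downloaded to disk; the model filename below is a placeholder, not a specific recommendation.)

```python
# Minimal local-inference sketch using llama-cpp-python
# (assumed installed via `pip install llama-cpp-python`).
from llama_cpp import Llama

# Load a small quantized model entirely from local disk (CPU by default);
# the path is a placeholder for whatever GGUF model you have available.
llm = Llama(model_path="./models/tinyllama-q4.gguf", n_ctx=2048)

# Run a single completion; no network access is involved.
result = llm("Q: What is a language model? A:", max_tokens=64, stop=["Q:"])
print(result["choices"][0]["text"].strip())
```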

[–] DScratch@sh.itjust.works 6 points 1 year ago (1 children)

You can add huge numbers! Want to know what a million plus a million is? Intel has the answer!

[–] morrowind@lemmy.ml 1 point 1 year ago

Not very well, though; the idea is to make them more efficient.

[–] kool_newt@lemm.ee 3 points 1 year ago

Why would I want to run a local BS generator?