
Fuck AI

 

Source (Via Xcancel)

[–] renzev@lemmy.world -4 points 2 days ago* (last edited 2 days ago) (1 children)

Just run your AI models locally? Problem solved lol.
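For anyone wondering what "locally" actually looks like, here's a minimal sketch using llama-cpp-python with a quantized model. The file path and model are placeholders for illustration, not anything named in the thread.

```python
# Minimal local-inference sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and a quantized GGUF model has been
# downloaded; the path below is a hypothetical example.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder file
    n_ctx=2048,    # context window size
    n_threads=8,   # CPU threads; tune for your machine
)

out = llm("Q: Name one upside of local inference. A:", max_tokens=64)
print(out["choices"][0]["text"])
```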

[–] Luffy879@lemmy.ml 7 points 2 days ago* (last edited 2 days ago) (3 children)

So you want to buy me and about 10 million other people a server that can run a halfway decent LLM, and pay the electric bills?

Also, it doesn't change any point except #2

[–] Trainguyrom@reddthat.com -3 points 2 days ago* (last edited 2 days ago)

There are actually quite a few models that run easily on a mid-range gaming computer (yes, even for image generation, and yes, they run reasonably well). That same energy would probably have gone into gaming anyway, so it's not really a huge difference.
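As a rough illustration of the image-generation case, something like the following runs on a mid-range gaming GPU. The diffusers library and the checkpoint ID are assumptions, since the comment names neither.

```python
# Hedged sketch of local image generation, assuming the Hugging Face
# diffusers library; the checkpoint ID is an example, not one named above.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # example open checkpoint
    torch_dtype=torch.float16,           # halves VRAM use on consumer cards
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()          # lowers peak VRAM at some speed cost

image = pipe("a cozy cabin in a snowy forest, oil painting").images[0]
image.save("cabin.png")
```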

Microsoft even just released an open-weights LLM that runs entirely on CPU and is comparable to the big hosted models people are paying for.
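The comment doesn't name the model, so here's a generic CPU-only sketch with Hugging Face transformers; the model ID is a small stand-in chosen for illustration, not the Microsoft release.

```python
# Generic CPU-only text generation via transformers; the model ID below is
# a stand-in for illustration, not the model referenced above.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example small model
    device=-1,                                    # -1 forces CPU inference
)

print(generate("Running an LLM on a plain CPU", max_new_tokens=40)[0]["generated_text"])
```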

Edit: just saw which community I'm in, so my comment was probably not appropriate here

[–] jsomae@lemmy.ml -1 points 2 days ago* (last edited 2 days ago)

The electric bills would not be high (per person), jsyk. It'd be comparable to playing a video game while in use.
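A quick back-of-the-envelope check of that claim; the wattage, daily usage, and electricity price are all assumptions, not measurements.

```python
# Rough per-person electricity estimate; every number here is an assumption.
gpu_watts = 300        # assumed draw of a gaming-class GPU under load
hours_per_day = 1.0    # assumed daily inference time
price_per_kwh = 0.15   # assumed electricity price in USD

kwh_per_month = gpu_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh/month, about ${cost_per_month:.2f}/month")
# -> 9.0 kWh/month, about $1.35/month
```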