You can use local models for free; it's just slower.
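(For anyone curious what "using a local model" looks like in practice: a minimal sketch of calling a locally hosted model through Ollama's HTTP API. The model name, port, and prompt here are just example values for whatever your local setup uses.)

```python
import json
import urllib.request

# Build a chat request for a locally hosted model. Ollama listens on
# localhost:11434 by default; "llama3" is an example model name.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Explain this stack trace."}],
    "stream": False,  # ask for one complete reply instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would return the model's reply as JSON.
# Nothing leaves your machine, and it costs nothing but electricity.
```

Everything runs on your own hardware, which is the whole point: no API bill, no rate limits, just slower responses on smaller models.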
this post was submitted on 06 May 2025
1201 points (98.2% liked)
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
Why would you not want to use all the tools available to be as efficient as possible?
And local models usually have fewer parameters. Reasoning on a local model is very poor.
I still think that local models in places without internet are better than offline documentation.
WTF is "vibe coding"? I'm not even wasting the electricity to google that one.
You can always tell you're on a new bug when you ask about the error "exception when calling…" and the AI returns your exact implementation back as the solution.
Not really intelligent