Let me preface this by saying I despise corpo LLM use and slop creation. I hate it.

However, it does seem like it could be an interesting, helpful tool if run locally in the CLI. I've seen quite a few people doing this. Again, it personally makes me feel like a lazy asshole when I use it, but it's not much different from web searching commands every minute (other than that the data used to train it is obtained by pure theft).
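For reference, the setup I've seen most people use is Ollama, which wraps the whole thing in a couple of shell commands. A rough sketch (the model tag here is just an example, not a recommendation):

```sh
# install ollama with the official script (read it before piping to sh)
curl -fsSL https://ollama.com/install.sh | sh

# download a small model, then ask it something straight from the terminal
ollama pull llama3.2
ollama run llama3.2 "how do I find all files over 1GB in my home directory?"
```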

Have any of you tried this out?

[–] alecsargent@lemmy.zip 11 points 2 days ago* (last edited 2 days ago) (2 children)

I've run several LLMs with Ollama (locally) and I have to say that it was fun, but it is not worth it at all. It does get many answers right, but that doesn't come close to compensating for the amount of time spent generating bad answers and troubleshooting them. Not to mention the amount of energy the computer uses.

In the end I'd rather spend my time actually learning the thing I'm supposed to solve, or just skim through the documentation if I just want the answer.

[–] possiblylinux127@lemmy.zip 2 points 1 day ago (1 children)

I have had really good luck with Alpaca, which uses Ollama.

Gemma3 has been great
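If you want to poke at the same model outside the GUI: Alpaca is just talking to a local Ollama server, which listens on localhost:11434 by default. Something like this should work, assuming you've already pulled gemma3:

```sh
# ask the local Ollama server directly over its HTTP API
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3",
  "prompt": "why is the sky blue?",
  "stream": false
}'
```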

[–] alecsargent@lemmy.zip 1 points 1 day ago

Alpaca is the GTK client for Ollama, right? I used it for a while to let my family have a go at local LLMs. It was very nice for them, but on my computer it ran significantly slower than they expected, so that's that.
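For anyone who wants to try it, I believe it's on Flathub as com.jeffser.Alpaca (double-check the ID), so installing it is roughly:

```sh
# install the Alpaca GTK client from Flathub (ID from memory, verify on flathub.org)
flatpak install flathub com.jeffser.Alpaca
```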

[–] bridgeenjoyer@sh.itjust.works 5 points 2 days ago (1 children)

This has been my experience with LLMs in my day-to-day job. Thank you for the comment.

[–] alecsargent@lemmy.zip 2 points 2 days ago

thank you as well