this post was submitted on 03 Jun 2024
1289 points (96.4% liked)

Technology

60123 readers
2688 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] grue@lemmy.world 99 points 6 months ago (3 children)

I think it's important to note that Linux can be a way to avoid AI, but it doesn't have to be. Flipping the headline around almost implies that people who do want AI would be missing out by using Linux, but that's not true at all. In reality, Linux is still better for them too: you can install the same kind of functionality if you want, but it's wholly under your control, not Microsoft's.

[–] Lem453@lemmy.ca 34 points 6 months ago* (last edited 6 months ago) (1 children)

Self-hosted AI seems like an intriguing option for those capable of running it. Naturally, it will always be more complex than paying someone else to host it for you, but it seems like that's the only way if you care about privacy.

https://github.com/mudler/LocalAI
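As a concrete sketch of what "self-hosted" looks like: LocalAI can run as a Docker container that exposes an OpenAI-compatible API on localhost. The image tag and the model alias below are assumptions based on its documentation and may change between releases:

```shell
# Run LocalAI on CPU only; the "aio" image bundles a set of default models
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

# Query it with the same request shape the OpenAI chat API uses
# (in the aio image, "gpt-4" is an alias mapped to a bundled local model)
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}'
```

Because the API is OpenAI-compatible, most existing client libraries work by just pointing their base URL at `http://localhost:8080/v1`.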

[–] Churbleyimyam@lemm.ee 3 points 6 months ago

Check out Jan AI. It's open source and extremely easy to install and run. I run it locally on a 2017 laptop without a dedicated GPU and it works; it just takes longer to generate responses than something like ChatGPT does.

[–] werefreeatlast@lemmy.world 8 points 6 months ago

Beautifully stated. Owning the AI personally, just as I own my personal computer (if not more so), is the key.

[–] SOB_Van_Owen@lemm.ee 3 points 6 months ago (2 children)

That sounds very cool. I'm totally ignorant of the hardware requirements. What sort of minimum setup would such an install take?

[–] Avatar_of_Self@lemmy.world 5 points 6 months ago

It really depends on which model you want to run and how large it is. You can run pretty much any model if you have enough disk space, but a GPU with enough VRAM is preferred for ChatGPT-like response speed. Otherwise, running on an older CPU and system RAM will be noticeably slower, especially with larger, more complex models.

There are some pretty lightweight models out there, but the responses will be more barebones and will probably seem 'less informed'.
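A rough rule of thumb (my own back-of-the-envelope estimate, not from any particular vendor) is that loading a model takes about its parameter count times the bytes per weight, plus some overhead for the runtime and context cache. A small Python sketch, where the 1.2x overhead factor is an assumption:

```python
def approx_model_memory_gb(n_params_billion: float,
                           bits_per_weight: int,
                           overhead: float = 1.2) -> float:
    """Rough estimate of the RAM/VRAM needed to load an LLM.

    n_params_billion: model size in billions of parameters (e.g. 7 for a 7B model)
    bits_per_weight:  16 for fp16 weights, 4 for a 4-bit quantized model
    overhead:         fudge factor for context cache and runtime buffers (assumed)
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9  # decimal gigabytes

# A 7B model: full fp16 vs. 4-bit quantized
print(round(approx_model_memory_gb(7, 16), 1))  # 16.8
print(round(approx_model_memory_gb(7, 4), 1))   # 4.2
```

This is why quantized models are the usual choice for older hardware: the same 7B model drops from roughly 17 GB to roughly 4 GB, which fits in the system RAM of many ordinary laptops.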

Give GPT4All a try for your first time. It makes installation, configuration and usage point-and-click while staying fairly straightforward. For its featured models it shows a short summary and the recommended VRAM, and many other models are available from inside the UI.

[–] nexussapphire@lemm.ee 3 points 6 months ago

Will my Dell Latitude from 2006 work?