this post was submitted on 15 Apr 2025
410 points (97.5% liked)

Privacy

A chart titled "What Kind of Data Do AI Chatbots Collect?" compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—by the types and number of data points they collect as of February 2025. The ten data categories are: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, and Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
[–] exothermic@lemmy.world 17 points 1 week ago (13 children)

Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.

[–] Kiuyn@lemmy.ml 23 points 1 week ago* (last edited 1 week ago) (5 children)

I recommend GPT4all if you want to run it locally on your PC. It is super easy.

If you want to run it on a separate server, Ollama + some kind of web UI is the best option.

Ollama can also be run locally, but IMO it takes more learning than a GUI app like GPT4all.
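
If you'd rather script it than click around, GPT4all also ships Python bindings. A minimal sketch, assuming the `gpt4all` pip package is installed; the model filename is just an example from their catalog and gets downloaded on first use:

```python
# Minimal local-generation sketch using the gpt4all Python bindings.
# Assumes: pip install gpt4all; the model file below is an example name from
# GPT4all's catalog and is downloaded automatically the first time it's used.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("Why run an LLM locally instead of in the cloud?", max_tokens=200)
    print(reply)
```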

[–] codexarcanum@lemmy.dbzer0.com 11 points 1 week ago (4 children)

If by more learning you mean learning

ollama run deepseek-r1:7b

Then yeah, it's a pretty steep curve!
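
Once the daemon is running, Ollama also exposes a small HTTP API on localhost (port 11434 by default), so you can talk to it from scripts with no extra libraries. A minimal sketch, assuming the default port and that deepseek-r1:7b has already been pulled:

```python
# Query a locally running Ollama server via its REST API (default port 11434).
# Assumes `ollama serve` is running and `ollama pull deepseek-r1:7b` was done already.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "deepseek-r1:7b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_ollama("Summarize why local models help with privacy."))
```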

If you're a developer then you can also search "$MyFavDevEnv use local ai ollama" to find guides on setting it up. I'm using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.

The main problem is managing your expectations. The full Deepseek is a 671b model (that's billions of parameters), and the model weights (the thing you download when you pull a model) are 404GB in size. You need an enormous amount of RAM available to run one of those.

They make distilled models though, which are much smaller but still useful. The 14b is 9GB and runs fine with only 16GB of RAM. They obviously aren't as impressive as the cloud-hosted big versions though.
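
If you want to sanity-check whether a model will fit before downloading it, the back-of-the-envelope math is just parameter count times bits per weight. A rough sketch; the ~4.8 bits/weight figure is an assumed quantization level, not an exact spec for any particular Deepseek build:

```python
# Rough estimate of model weight size from parameter count and quantization width.
# The bits-per-weight value is an assumption (roughly a Q4/Q5 quant), not an exact figure.
def weight_size_gb(params_billion: float, bits_per_weight: float = 4.8) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(round(weight_size_gb(671)))  # ~403 GB -- in line with the 404GB full model above
print(round(weight_size_gb(14)))   # ~8 GB  -- close to the 9GB distilled 14b quoted above
```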

[–] smee@poeng.link 4 points 1 week ago

Or if using flatpak, it's an add-on for Alpaca. One-click install, GUI management.

Windows users? By the time you understand how to locally install AI, you're probably knowledgeable enough to migrate to Linux. What the heck is the point of using local AI for privacy while running Windows?
