this post was submitted on 21 Jun 2024
[–] 31337@sh.itjust.works 3 points 3 months ago* (last edited 3 months ago) (1 children)

It's also trained on data people reasonably expected would be private (private GitHub repos, Adobe Creative Cloud files, etc.). Even if it were only public data, it could still be dangerous. For example, it might be possible to give an LLM a prompt like "give me a list of climate activists, their addresses, and their employers" if it was trained on that data or is good at "browsing" on its own. That's currently not possible due to the guardrails on most models, and I'm guessing providers try to avoid training on personal data even when it's public, but a government agency could build an LLM without those guardrails. All of that data might be public, yet tracking it down manually would take a person quite a bit of work compared to the ease and efficiency of just asking an LLM.

[–] possiblylinux127@lemmy.zip 2 points 3 months ago

What you are describing is highly specific to a particular AI model.