this post was submitted on 10 Oct 2025
718 points (99.0% liked)
Funny
you are viewing a single comment's thread
Does it, though, if you know?
IMO, even letting location and private data into a digital ecosystem that includes a centralized LLM is a very unwise thing to do.
We both know that LLMs can and will spit out ANYTHING in their training data, regardless of how many roadblocks are put up and protective instructions are given.
While they're not necessarily feeding outright personal info (of the general public, anyway) into their LLMs' training data, we should also both know how shamelessly greedy these cunt corpos are. It'll only be a matter of time before they're feeding in everything they clearly already have.
At that point, it won't just be a creep factor, but a legitimate doxxing problem.
This made me think of that "I, Robot" movie starring the Fresh Prince of Bel-Air, where he had that hologram of the guy whose murder he was trying to solve, and it only answered him when he asked the right question. Definitely a tool call.
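For the nerds in the thread: the hologram really does behave like a tool call gated on its input. A toy sketch (the trigger phrase and function name here are made up for the bit, not from the movie script verbatim):

```python
def lanning_hologram(question: str) -> str:
    """Toy 'tool' that only yields its payload for the right query,
    like the dead scientist's hologram in I, Robot."""
    # Hypothetical trigger phrase -- the hologram ignores everything else.
    right_question = "why would you want to die?"
    if question.strip().lower() == right_question:
        return "That, detective, is the right question."
    return "My responses are limited. You must ask the right questions."


print(lanning_hologram("who killed you?"))
print(lanning_hologram("Why would you want to die?"))
```

Same deal as an LLM tool call: the model (or detective) gets nothing until the input matches what the tool is willing to answer.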
Also, there was an AI in that movie that drove a bulldozer at the Fresh Prince and made the robots glow red angrily (and it probably had location data accessible to it, too).
I'm not trying to say it's art imitating life or anything, because even my elastic definition of art can't be stretched that far, but it's sure something!