ubergeek@lemmy.today · 1 point · 8 hours ago

> yeah but like, legally, is this even a valid argument?

Personally, "legal" is only what the law allows the wealthy to do, and the punishments it provides for the working class.

Morally, that's what you're doing when you use AI to generate CSAM. It's the same reason we ban all previously created CSAM as well: you are victimizing the person every single time.

> I don't see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim.

It makes them a victim.

> But I don't know of any laws that prevent you from doing that, unless it's explicitly to do with something like blackmail, extortion, or harassment.

The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

> Does a facial structure recognition model use the likeness of other people?

Yes.

> Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to break down at what point that person's likeness begins and at what point it ends. It's simply an impossible task.

Exactly. So, without consent, it shouldn't be used. Periodt.