
Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads previously, many ads that explicitly invited users to create nudes were still live, and some ad buyers were still active, until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

[–] BreakDecks@lemmy.ml 2 points 7 months ago (1 children)

The pedos out there are using AI to nudify pictures of real kids. That's just going to drive up the demand for creep shots and child model photosets to exploit.

There may be a small percentage of offending pedophiles who switch to pure GenAI over pictures of real kids, but I don't see GenAI ever playing a role in harm reduction, given the harm it ultimately enables.

One of the current sickening trends is for a predator to convince a kid to send underwear or swimsuit pics, and then blackmail them into more hardcore photos with nudified versions of the original pics. There's already an influx of that kind of CSAM online, which involves abusing real kids on social media.

I just wish America was less puritanical and taught kids about sex and boundaries to protect them, and that we had a functioning mental healthcare system that directly helps people who experience inappropriate sexual attractions like pedophilia before they go down these dark paths.

[–] Jarix@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

Look, we don't know for sure. I'm grasping at silver linings made of straws. I don't care how unlikely it is to be true, but there is a chance.

A chance that some day, months, years, or decades from now, we will find out whether or not it worked out in the best way it could have, given what's already happening. Whatever the answer is, it will be pretty hard to disagree with.

And I won't be surprised if it isn't what I'm hoping it might be. I won't be devastated or have my worldview shattered.

I'm not naive, I'm just hoping that we are wrong, even if it's a bit ridiculous and there's only evidence to the contrary along the way.

What we know today may not be what we understand next year.

Truth is stranger than fiction. We have so many problems right now that it's fine if we WANT an easy win we won't be able to KNOW the answer to for AN amount of time yet. But only if we are honest with ourselves that just because we want something doesn't mean it has to happen. I also know that today

But holy hell, my guy, I'm fucking grasping at that straw. This shit is too bleak, and we need something to keep us from taking vigilante action. What we don't need is to stir the pot of fear, worry, and horror before it's time to take action.

If we can't see paths to better places, we will have one hell of a hard time recognizing the things that will help us get onto that path. And if you don't agree that we need a new path, it might be too late for you.