this post was submitted on 07 Oct 2023
928 points (96.9% liked)

Technology


A lawsuit filed by more victims of the sex trafficking operation claims that Pornhub’s moderation staff ignored reports of their abuse videos.


Sixty-one additional women are suing Pornhub’s parent company, claiming that the company failed to take down videos of their abuse as part of the sex trafficking operation Girls Do Porn. They’re suing the company and its sites for sex trafficking, racketeering, conspiracy to commit racketeering, and human trafficking.

The complaint, filed on Tuesday, includes what it claims are internal emails obtained by the plaintiffs, represented by Holm Law Group, between Pornhub moderation staff. The emails allegedly show that Pornhub had only one moderator to review 700,000 potentially abusive videos, and that the company intentionally ignored repeated reports from victims in those videos.

The damages and restitution they seek amount to more than $311,100,000. They demand a jury trial and seek damages of $5 million per plaintiff, as well as restitution of all the money Aylo, the new name for Pornhub’s parent company, earned “marketing, selling and exploiting Plaintiffs’ videos in an amount that exceeds one hundred thousand dollars for each plaintiff.”

The plaintiffs are 61 more unnamed “Jane Doe” victims of Girls Do Porn, adding to the 60 that sued Pornhub in 2020 for similar claims.
Girls Do Porn was a federally-convicted sex trafficking ring that coerced young women into filming pornographic videos under the pretense of “modeling” gigs. In some cases, the women were violently abused. The operators told them that the videos would never appear online, so that their home communities wouldn’t find out, but they uploaded the footage to sites like Pornhub, where the videos went viral—and in many instances, destroyed their lives. Girls Do Porn was an official Pornhub content partner, with its videos frequently appearing on the front page, where they gathered millions of views.

read more: https://www.404media.co/girls-do-porn-victims-sue-pornhub-for-300-million/

archive: https://archive.ph/zQWt3#selection-593.0-609.599

[–] Damage@slrpnk.net 143 points 1 year ago (5 children)

It's quite simple, honestly: if you profit off something, you have the responsibility to make sure it's legal. We all like platforms like YouTube, where you can find anything you want, but the truth is that they're currently unsustainable when forced to comply with the law.

With the advent of AI there's hope for improved systems for detecting violations, but it doesn't seem to be there yet.

[–] hellothere@sh.itjust.works 82 points 1 year ago (4 children)

I agree that Pornhub et al. should be liable for abuse their platforms distribute, but how on earth is AI meant to help with sex trafficking?

[–] BreakDecks@lemmy.ml 67 points 1 year ago (2 children)

A lot of people have this very naive view that if we just build AI overlords to monitor all human activity, we can somehow automate good behavior and make the world a better place.

Really we'll just end up with RoboCop.

[–] Telodzrum@lemmy.world 9 points 1 year ago (1 children)
[–] leftzero@lemmy.ml 10 points 1 year ago

That seems like an excellent idea; we should all do everything possible to make sure such AI overlords are built.

Please don't hurt me, or an eventual future indistinguishable facsimile of myself...?

[–] jaybone@lemmy.world 3 points 1 year ago

But RoboCop was the good guy.

ED-209 was the bad guy.

He looked much cooler, but he was kind of a dick. And bad at stairs.

[–] riskable@programming.dev 1 points 1 year ago (1 children)

AI will help with sex trafficking by generating all the porn so humans won't need to be involved at all.

In the future the equivalent lawsuit will be from the victims of hackers who used people's PCs to generate porn.

[–] hellothere@sh.itjust.works 13 points 1 year ago

That's like saying professional porn got rid of amateur / "real" sex porn. It didn't.

There will always be a demand for real humans actually doing the thing depicted. While I'm sure there will be very popular AI production houses, similar to hentai, etc., if you think AI-generated porn will completely remove the desire for human performers, then you do not understand why people watch porn.

[–] elbarto777@lemmy.world -5 points 1 year ago* (last edited 1 year ago) (3 children)

Edit: I said "ideally," as in utopian. In practice, corporations, governments and overall greed are in the way.


Ideally, sci-fi style, an effective AI could sift through all the reports and take down the videos that are clearly suspicious (as opposed to popular, well-known videos of porn stars that could be found elsewhere, for example on DVD). It could message the reporter asking for more information, and it could message an actual human about the videos it isn't confident enough to deem abusive.

It may even try to contact the victims and offer them options to report the perpetrators to the authorities. Or lead them to a safe house, etc.

It could do all this without ever being tired, hungry, or shocked.

In practice, we're not there yet. Close, but not there.
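The flow described above is essentially a thresholded triage with a human-in-the-loop band. Here's a minimal sketch of that idea; the score, thresholds, and action names are all hypothetical illustration, not any real moderation system:

```python
# Hypothetical report-triage sketch: a classifier score per reported video
# is routed to one of three actions. Thresholds are made-up examples.

def triage(abuse_score: float, low: float = 0.2, high: float = 0.9) -> str:
    """Route a reported video by the model's confidence that it is abusive.

    - Above `high`: clearly suspicious, take it down automatically.
    - Below `low`: ask the reporter for more information before acting.
    - In between: the model isn't confident either way, so escalate to a human.
    """
    if abuse_score >= high:
        return "take_down"
    if abuse_score <= low:
        return "request_more_info"
    return "escalate_to_human"

# A queue of (video_id, score) reports gets sorted into action buckets.
reports = [("vid1", 0.97), ("vid2", 0.55), ("vid3", 0.05)]
actions = {vid: triage(score) for vid, score in reports}
print(actions)
# {'vid1': 'take_down', 'vid2': 'escalate_to_human', 'vid3': 'request_more_info'}
```

The hard part, of course, isn't this routing logic; it's producing a score reliable enough that the auto-takedown band doesn't censor legitimate content and the "not confident" band doesn't swamp the human reviewers.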

[–] hellothere@sh.itjust.works 14 points 1 year ago (2 children)

Close? Pull the other one.

And that's long before we get to the ethical quandaries of sourcing training data, and implicit biases.

[–] Jakdracula@lemmy.world 4 points 1 year ago

Not a hot dog.

[–] elbarto777@lemmy.world 1 points 1 year ago

True. I guess close was a bit of a stretch.

[–] johnnyb@discuss.tchncs.de 7 points 1 year ago (1 children)
[–] elbarto777@lemmy.world 4 points 1 year ago

I know what you mean.

My scenario was ideal, from the point of view of my 80s kid self looking forward to a promising future.

That future is now, and I hate it, because governments and big corporations ruined it for all of us.

[–] ArcaneSlime@lemmy.dbzer0.com 0 points 1 year ago (1 children)

This is why I can't jive with idealists. They put forth a proposal because "ideally...", and get people thinking "yeah, he's right," while conveniently leaving out that, due to human nature, it's basically an impossible pipe dream; you're more likely to find true gnosis than to see that become reality.

[–] elbarto777@lemmy.world 0 points 1 year ago

The funny thing is that I'm a realist. But of course I like to think about what the ideal scenario would be.

[–] NuPNuA@lemm.ee 44 points 1 year ago

As soon as you open anything to user generated content you run the risk that they're going to do something dodgy with that access. More than a decade ago I remember they added the emblem creator to Call of Duty and people were making swastikas within minutes.

[–] assassin_aragorn@lemmy.world 7 points 1 year ago (1 children)

Well, here's the question: is AI detection software legal if it's trained to identify this material? Strictly speaking, unless it's 100% free, selling the AI software would be profiting off the illegal material you used to train it.

[–] barsoap@lemm.ee 7 points 1 year ago (1 children)

It’s quite simple, honestly: if you profit off something, you have the responsibility to make sure it’s legal.

Morally, yes. In practice, that's not how our economy generally works; it's a gigantic can of worms, from cobalt mines to work safety in Asian textile factories and a gazillion places in between. Germany has recent legislation about this, but AFAIK it's the only such legislation in the world.

[–] Damage@slrpnk.net 1 points 1 year ago (1 children)

Well, countries' laws, with some exceptions, only have authority within their own borders.

[–] barsoap@lemm.ee 1 points 1 year ago* (last edited 1 year ago)

The German version is kinda just a proof of concept, a way of saying to the rest of the EU, "we're serious about this shit."

The actual goal is an EU-wide version, which is in the pipeline and actually stricter (because Parliament wills it). It will apply to any company with more than 250 employees and a net turnover of €40M in the EU (or worldwide, for EU companies). And it's very hard to ignore the EU when you want to make money at scale; see the Brussels effect.

[–] ColeSloth@discuss.tchncs.de 5 points 1 year ago (1 children)

Yes, but how far does due diligence go in a matter like this? If company A is buying things from company B, and company B gives company A all the proper paperwork, is it company A's responsibility to make sure company B didn't do anything illegal to obtain what it had? Was there a reason for company A to suspect company B was illegally obtaining something that many other companies legally and legitimately acquire?

I don't think so. I think in that case it would be completely company B that is at fault 100%.

I think it also starts to become company A's fault when it can be shown that they were aware company B might be obtaining things illegally, or when company A starts getting complaints about what company B was doing. That's more like what Pornhub has done here: they seem to have purposefully understaffed the review and complaints department in order to more or less ignore complaints. Up until that point, I don't think PH would be responsible.

[–] Damage@slrpnk.net 0 points 1 year ago (1 children)

If you re-sell, for example, stolen goods, your proceeds from those sales may be taken from you, together with whatever stolen goods you have in stock; and if you are found to have been aware of the illegal origin of those goods, then you are an accomplice and are charged accordingly.

That's why buyers of used items have the difficult task of ascertaining whether those items are stolen or not.

[–] ColeSloth@discuss.tchncs.de 4 points 1 year ago (1 children)

In the case of pawn shops, if the item is already gone, the money made does not get claimed back. If the stolen item is still there, it is returned to the owner, and the store is out whatever it paid the thief.

However, this is a perfect example of what I've said. In the above scenario, the pawn shop is in no legal trouble at all unless it's discovered that they were knowingly buying stolen goods. It's the thief who stole the items who will be in legal trouble.

[–] Damage@slrpnk.net 0 points 1 year ago

Maybe that's how it works in your country, but not in mine