this post was submitted on 07 Jan 2024
1183 points (92.1% liked)

A Boring Dystopia

9777 readers

Pictures, Videos, Articles showing just how boring it is to live in a dystopic society, or with signs of a dystopic society.

Rules (Subject to Change)

--Be a Decent Human Being

--Posting news articles: include the source name and exact title from article in your post title

--Posts must have something to do with the topic

--Zero tolerance for Racism/Sexism/Ableism/etc.

--No NSFW content

--Abide by the rules of lemmy.world

founded 1 year ago
 
top 50 comments
[–] RainfallSonata@lemmy.world 195 points 10 months ago (10 children)

I never understood how they were useful in the first place. But that's kind of beside the point. I assume this is referencing AI, but since you've only posted one photo out of apparently four, I don't really have any idea what you're posting about.

[–] Hildegarde@lemmy.world 246 points 10 months ago (10 children)

The point of verification photos is to ensure that NSFW subreddits only host posts made with consent. Many posts were just random nudes someone found, where the subject never agreed to have them posted.

The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. Subs often require the paper to be crumpled, which makes it infeasible to photoshop.

If an AI can generate a photorealistic verification picture, then such pictures can't be used to verify anything.

[–] RainfallSonata@lemmy.world 68 points 10 months ago* (last edited 10 months ago) (2 children)

I didn't realize they originated with verifying nsfw content. I'd only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn't necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you're matching the verification against other photos, that makes more sense.

[–] RedditWanderer@lemmy.world 69 points 10 months ago

They were used well before the NSFW stuff and the advent of AI.

Back in the day, if you were doing an AMA with a celeb, the picture proof was the celeb telling us this was the account they were using. It didn't need to be their own account, and it was only useful for people with an identifiable face. If you were doing an AMA because you were some specialist or professional, showing your face and username didn't prove anything; you needed to provide paperwork to the mods.

This is a poor way to police fake nudes though, I wouldn't have trusted it even before AI.

[–] oce@jlai.lu 28 points 10 months ago (10 children)

Was it really that hard to photoshop well enough to get past mods who aren't experts in photo forensics?

[–] can@sh.itjust.works 53 points 10 months ago

Probably not, but it would still reduce the amount considerably.

[–] AnneBonny@lemmy.dbzer0.com 31 points 10 months ago (4 children)

I think it takes a considerable amount of work to photoshop something written on a sheet of paper that has been crumpled up and flattened back out.

[–] psmgx@lemmy.world 30 points 10 months ago (1 children)

It's mostly about filtering the low-hanging fruit, aka the low effort trolls, repost bots, and random idiots posting revenge porn.


I found this singular screenshot floating around elsewhere, but yes r/stablediffusion is for AI images.

[–] Coasting0942@reddthat.com 120 points 10 months ago (5 children)

I’m pretty sure we can just switch to a verification video chat which will buy us a year.

[–] thallamabond@lemmy.world 46 points 10 months ago (5 children)

One year? I'm guessing six months, what a time to be alive!?!

[–] MargotRobbie@lemmy.world 118 points 10 months ago (20 children)

Due to having so many people trying to impersonate me on the internet, I've become somewhat of an expert on verification pictures.

You can still easily tell that this is fake because, if you look closely, the details, especially the background clutter, are utterly nonsensical.

  1. The object over her right shoulder (your left), for example, looks as if someone blended a webcam, a TV, and a nightstand together.
  2. Over her left shoulder (your right), her chair is only on that one side, and it blends into the counter in the background.
  3. Is it a table lamp or a wall-mounted light?
  4. The doorframe in the background behind her head isn't even aligned.
  5. Her clavicles are asymmetrical, never seen that on a real person.
  6. Her wispy hair strands. Real hair doesn't appear out of thin air in loops.
[–] Honytawk@lemmy.zip 41 points 10 months ago (4 children)

The point isn't that you can spot it.

The point is that the automated system can't spot it.

Or are you telling me there is a person looking at every verification photo who would thoroughly scan it for imperfections?

[–] MargotRobbie@lemmy.world 20 points 10 months ago (4 children)

The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system would be something like telling you to perform a random gesture on camera on the spot, like "turn your head slowly" or "open your mouth slowly", which would be trivial for a human to perform but near impossible for AI generators.
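
For what it's worth, a bare-bones sketch of that kind of challenge-response check is below. The gesture list, function names, and time limit are made up purely for illustration; this isn't any real platform's API.

```python
# Rough sketch of a random-gesture liveness challenge: the server picks an
# unpredictable gesture and a short deadline, so a clip can't be generated
# ahead of time. A human mod (or a liveness model) then checks the submitted
# video against that exact prompt. Names and values here are illustrative.
import secrets
import time

GESTURES = [
    "turn your head slowly to the left",
    "open your mouth slowly",
    "hold the handwritten note up and tilt it toward the camera",
]

def issue_challenge(user_id: str, ttl_seconds: int = 120) -> dict:
    """Pick a random gesture and give the user a short window to record it."""
    return {
        "user_id": user_id,
        "gesture": secrets.choice(GESTURES),
        "expires_at": time.time() + ttl_seconds,
    }

def challenge_still_valid(challenge: dict) -> bool:
    """A clip submitted after the deadline shouldn't count."""
    return time.time() < challenge["expires_at"]

challenge = issue_challenge("user123")
print(f"Record a short video and {challenge['gesture']}.")
```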

[–] curiousPJ@lemmy.world 23 points 10 months ago (2 children)

but near impossible for AI generators.

...I feel like this isn't the first time I've heard that statement.

[–] Smoogs@lemmy.world 33 points 10 months ago (1 children)

Margot Robbie

Due to having so many people trying to impersonate me on the internet

Uh huh.

[–] MargotRobbie@lemmy.world 21 points 10 months ago* (last edited 10 months ago) (6 children)

That's esteemed Academy Award-nominated verification picture expert/character actress Margot Robbie to you!

Now watch me win my Golden Globe tonight. (Still no best actress... sigh)

[–] Aganim@lemmy.world 24 points 10 months ago* (last edited 10 months ago) (4 children)

Her clavicles are asymmetrical, never seen that on a real person.

Shit, are you telling me that every time I see myself in the mirror I'm actually looking at a string of AI generated images, generated in real-time? The matrix is real. 😱

It's either that, or my clavicles are actually very asymmetric. ☹️

[–] HiddenLayer5@lemmy.ml 92 points 10 months ago* (last edited 10 months ago) (15 children)

At some point the only way to verify someone will be to do what the Klingons did to rule out changelings: Cut them and see if they bleed.

[–] chemical_cutthroat@lemmy.world 25 points 10 months ago

Don't worry, companies like 23andMe and Ancestry have been banking DNA records, so mimicking blood won't be too hard, either.

[–] yamanii@lemmy.world 86 points 10 months ago (5 children)

Can confirm. I made some random Korean dude on DALL-E to send to Instagram after it threatened to close my fake account, and it passed.

[–] uriel238@lemmy.blahaj.zone 47 points 10 months ago* (last edited 10 months ago) (3 children)

Once again everyone on the internet is a cute girl if they want to be.

Or a cute cat.

Or Elvis.

[–] Bonsoir@lemmy.ca 23 points 10 months ago* (last edited 10 months ago) (1 children)

And then there is me. I'm all of the above.

[–] Fredselfish@lemmy.world 46 points 10 months ago (1 children)

What am I looking at here?

[–] forksandspoons@lemmy.world 89 points 10 months ago* (last edited 10 months ago) (7 children)

A GenAI-made image of a verification post. The point, I guess, is that with genAI photos anyone can easily make a fake verification post, which makes them less useful as a means of verifying identity.

The post is originally from Reddit (https://www.reddit.com/r/StableDiffusion/s/fEle6uaiR7).

[–] wick@lemm.ee 39 points 10 months ago (3 children)

I can finally realise my dream of commenting on r/blackpeopletwitter

[–] psmgx@lemmy.world 37 points 10 months ago (1 children)

Very rapidly the basis of truth in any discussion is going to get eroded.

[–] shasta@lemm.ee 34 points 10 months ago

They were always useless

[–] STRIKINGdebate2@lemmy.world 33 points 10 months ago (7 children)

Isn't there a trick where you can ask someone to do a specific hand gesture to get photos verified? That'll still work, especially because AI makes fingers look wonky.

[–] fidodo@lemmy.world 65 points 10 months ago (14 children)

AI has been able to do fingers for months now. It's moving very rapidly so it's hard to keep up. It doesn't do them perfectly 100% of the time, but that doesn't matter since you can just regenerate it until it gets it right.

[–] IgnatiusJReilly@lemmy.wtf 26 points 10 months ago (1 children)

"Can you hold up 7 fingers in front of the camera?"

Photo with one hand up

[–] qaz@lemmy.world 26 points 10 months ago (5 children)

That’s why you need a video with movement. AI still can’t do video right.

[–] oce@jlai.lu 40 points 10 months ago (3 children)

It's getting close. Now you can provide a picture of someone and an animated skeleton, and it outputs the person moving according to the reference.

[–] Reverendender@sh.itjust.works 28 points 10 months ago (4 children)

Where do I get an animated skeleton?

Home Depot sells them around October

[–] Kolanaki@yiffit.net 25 points 10 months ago* (last edited 10 months ago) (1 children)

My Discord friends had some easy ways to defeat this.

You could require multiple photos; it's pretty hard to get AI to consistently generate photos that are 100% perfect. There are bound to be flaws when you try to get AI to generate multiple photos of the same (non-celeb) person, flaws that would make it obvious the set is fake (a rough sketch of such a consistency check is below).

Another idea was to make it a short video instead of a still photo. For now, at least, AI absolutely sucks balls at making video.
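
To illustrate the multiple-photo idea above: a moderation bot could compare face embeddings across the submitted pictures and flag sets that don't appear to show the same person. This is only a sketch assuming the open-source face_recognition library and its usual 0.6 distance threshold; it isn't anything the commenter actually described.

```python
# Sketch of the "multiple photos" consistency check: compare face embeddings
# across the submitted verification photos and flag sets that don't look like
# the same person. Assumes the open-source `face_recognition` library; the
# 0.6 threshold is that library's usual default, used here only as an example.
import face_recognition

def photos_look_consistent(paths, threshold=0.6):
    encodings = []
    for path in paths:
        image = face_recognition.load_image_file(path)
        faces = face_recognition.face_encodings(image)
        if len(faces) != 1:  # reject photos with zero faces or multiple faces
            return False
        encodings.append(faces[0])
    distances = face_recognition.face_distance(encodings[1:], encodings[0])
    return all(d < threshold for d in distances)

# Flag the submission for manual review if the photos don't seem to match.
if not photos_look_consistent(["verify_1.jpg", "verify_2.jpg", "verify_3.jpg"]):
    print("Photos may not show the same person - escalate to a human moderator.")
```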

[–] GTKashi@lemmy.world 25 points 10 months ago (3 children)

Never trust your eyes or ears again in this modern digital hellscape! https://youtube.com/shorts/55hr7Tx_7So?si=db5hROJWYjdQRMTD

[–] Willer@lemmy.world 22 points 10 months ago (3 children)

AI pictures are like a reverse uncanny valley: they feel right, but you will shit brix upon further inspection.
