[–] Blackmist@feddit.uk 26 points 3 months ago (3 children)

We've had fake photos for over 100 years at this point.

https://en.wikipedia.org/wiki/Cottingley_Fairies

Maybe it's time to do something about confirming authenticity, rather than just accepting any old nonsense as evidence of anything.

At this point anything can be presented as evidence, and just as easily dismissed as an AI fabrication.

We need a new generation of secure cameras with internal signing of images and video (to prevent manipulation), built-in LIDAR (to make sure they're not just filming a screen), periodic external timestamping of the data (so nothing can be changed after the claimed capture date), etc.
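
As a hedged sketch of what the signing-plus-timestamping piece could look like, here is a minimal example using Python's `cryptography` library. The key handling, record fields, and timestamp step are illustrative assumptions, not any existing camera standard:

```python
# Illustrative sketch: sign a captured frame with a device key, producing a
# record whose hash could later be anchored with an external timestamping service.
import hashlib
import json
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In a real camera this key would live in a secure element and never leave it;
# generating it in software here is only for demonstration.
device_key = Ed25519PrivateKey.generate()

def sign_capture(image_bytes: bytes) -> dict:
    """Bind an image hash and a capture time to this device's key."""
    payload = json.dumps(
        {
            "sha256": hashlib.sha256(image_bytes).hexdigest(),
            "captured_at": int(time.time()),  # would come from a trusted clock
        },
        sort_keys=True,
    ).encode()
    return {
        "payload": payload.decode(),
        "signature": device_key.sign(payload).hex(),
        # The payload hash is what you would periodically submit to an external
        # timestamping service so records can't be backdated later.
    }

record = sign_capture(b"...raw sensor data...")
print(record["signature"][:32], "...")
```

Verification would simply re-hash the image and check the signature against the device's published public key.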

[–] ricdeh@lemmy.world 22 points 3 months ago (2 children)

I am very opposed to this. It means surrendering all trust in pictures to Big Tech. If at some point only photos signed by Sony, Samsung, etc. are considered genuine, then photos taken with other equipment, e.g., independently manufactured cameras or image sensors, will be dismissed out of hand. If, however, you were to accept photos signed by the operating system on those devices regardless of vendor, that would defeat the entire purpose, because everyone could just self-sign their pictures. This means that the only way to effectively enforce your approach is to surrender user freedom, and that runs contrary to the Free Software Movement and the many people around the world aligned with it. It would be a very dystopian world.

[–] Blackmist@feddit.uk 9 points 3 months ago

It would also involve trusting those corporations not to fudge evidence themselves.

I mean, not everything photo-related would have to be like this.

But if you wanted your photos to be able to document things, to provide evidence that could send people to prison or get them executed...

The other choice is that we no longer accept photographic, audio or video evidence in court at all. If it can no longer be trusted and even a complete novice can convincingly fake things, I don't see how it can be used.

[–] hobovision@lemm.ee 6 points 3 months ago

There's no need for Big Tech to control these things, so if that's why you're opposed to it, reconsider what you're actually opposed to. This could be implemented as FOSS or as an open standard.

So do you not trust HTTPS because you'd have to trust Big Tech? Certificate authorities trusted by Microsoft, Google, and others sign the certificates you rely on to know you're sending your password to your bank and not to a phisher. Just as any browser can inspect and validate those certificates, any camera could have a certificate system in place to prove that the data came straight from an unmodified, validated camera sensor.
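
As a rough illustration of that chain-of-trust analogy (all keys and names below are invented for the sketch; real provenance standards define their own certificate formats), a manufacturer root key vouches for a per-device key, and a verifier only needs to trust the root:

```python
# Illustrative sketch: a manufacturer "root" key certifies a device key,
# the device key signs the image, and a verifier checks both steps,
# much like a browser walking an HTTPS certificate chain.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

manufacturer_root = Ed25519PrivateKey.generate()  # trusted like a CA root
device_key = Ed25519PrivateKey.generate()         # lives inside one camera

# "Issue" the device credential: the root signs the device's public key.
device_pub = device_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
device_credential = manufacturer_root.sign(device_pub)

# The camera signs a frame with its own key.
image = b"...sensor data..."
image_signature = device_key.sign(image)

def verify(image: bytes, image_sig: bytes, dev_pub: bytes, credential: bytes) -> bool:
    """Accept only if the root vouches for the device key AND the image signature checks out."""
    try:
        manufacturer_root.public_key().verify(credential, dev_pub)            # chain step
        Ed25519PublicKey.from_public_bytes(dev_pub).verify(image_sig, image)  # leaf step
        return True
    except InvalidSignature:
        return False

print(verify(image, image_signature, device_pub, device_credential))  # True
```

The open-standard question then comes down to who operates the roots and how independent camera makers get their device keys certified.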

[–] feedum_sneedson@lemmy.world 1 points 3 months ago

I was thinking about those pictures! The garden is magic enough without there needing to be fairies at the bottom of it. I'm not sure if the saying is linked to these forgeries, but I always kind of thought it was.

[–] PrimeMinisterKeyes@lemmy.world 1 points 3 months ago

What in the world is going on with Elsie's hand in the "second of the five photographs"?