this post was submitted on 05 Jan 2025
150 points (96.3% liked)

I have never liked Apple, and lately even less so. F.... US monopolies

[–] Retro_unlimited@lemmy.world 39 points 2 days ago (4 children)

Just disabled it.

Settings > Apps > Photos > scroll all the way down.

[–] smee@sosial.link 33 points 2 days ago (1 children)

Being able to partially opt out of a privacy nightmare still leaves a lot to be desired. 😔

[–] billbasher@lemmy.world 5 points 1 day ago

I want anything AI-related to be forced to be opt-in. I’m sick of having to find out about these things from the news/Lemmy/wherever, when it’s likely too late and they’ve already used the data.

[–] foremanguy92_@lemmy.ml 7 points 2 days ago

These days some settings can be disabled, but is that really an improvement? Sure, you won't see the "enhanced search" results, but how do you know that Apple isn't going to analyse your photos the same way anyway?

[–] Jessica@discuss.tchncs.de 2 points 1 day ago

Better still, don't update

[–] LEVI@feddit.org 20 points 2 days ago (2 children)

So what happens on your phone doesn't stay on your phone? 😐

[–] smee@sosial.link 15 points 2 days ago

Hasn't been like that for years unless you're running a custom setup. The default is no privacy.

[–] phoneymouse@lemmy.world 11 points 2 days ago* (last edited 2 days ago) (1 children)

It’s done on device, for the most part.

Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a "region of interest" that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding – an array of numbers – representing that portion of the image.
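
That on-device step is easy to picture in code. Below is a minimal sketch, not Apple's pipeline: the model (an off-the-shelf ResNet-18 from torchvision), the crop coordinates, and the file name are all stand-ins for whatever Apple's local model actually does.

```python
# Hypothetical sketch: crop a candidate "region of interest" from a photo
# and turn it into a fixed-length embedding vector (an array of numbers).
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Identity()  # drop the classifier head; keep the 512-dim features
model.eval()

photo = Image.open("vacation.jpg").convert("RGB")  # hypothetical photo
region = photo.crop((100, 50, 400, 350))           # hypothetical region of interest

with torch.no_grad():
    embedding = model(preprocess(region).unsqueeze(0)).squeeze(0)

print(embedding.shape)  # torch.Size([512]) -- the "vector embedding"
```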

The device then uses homomorphic encryption to scramble the embedding in such a way that it can be run through carefully designed algorithms that produce an equally encrypted output. The goal is that the encrypted data can be sent to a remote system for analysis without whoever operates that system knowing the contents of that data; they can only perform computations on it, the results of which remain encrypted. The input and output are end-to-end encrypted and are not decrypted during the mathematical operations, or so it's claimed.
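
To make the "compute on data you can't read" idea concrete, here's a toy sketch using the open-source TenSEAL library and its CKKS scheme. This is not Apple's implementation (Apple uses its own homomorphic-encryption stack), and the vectors are made up; it just shows a server computing a similarity score on ciphertext it cannot decrypt.

```python
# Toy homomorphic-encryption round trip with TenSEAL (CKKS scheme).
import tenseal as ts

# Client side: generate keys and encrypt the photo's embedding.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2**40
context.generate_galois_keys()  # needed for the rotations inside dot products

photo_embedding = [0.12, -0.53, 0.88, 0.07]  # hypothetical 4-dim embedding
encrypted_query = ts.ckks_vector(context, photo_embedding)

# "Server" side: compare against a landmark embedding without ever decrypting.
landmark_embedding = [0.10, -0.50, 0.90, 0.05]  # hypothetical database entry
encrypted_score = encrypted_query.dot(landmark_embedding)  # ciphertext in, ciphertext out

# Client side again: only the secret-key holder can read the result.
print(encrypted_score.decrypt())  # similarity score, roughly [1.07]
```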

The dimension and precision of the embedding are adjusted to reduce the high computational demands of this homomorphic encryption (presumably at the cost of labeling accuracy) "to meet the latency and cost requirements of large-scale production services." That is to say, Apple wants to minimize its cloud compute costs and mobile device resource usage for this free feature.
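
The trade-off it describes is just shrinking and coarsening the vector before encrypting it. A rough sketch (the dimensions, projection, and bit width here are invented for illustration):

```python
# Invented example: reduce embedding dimension and precision to make the
# homomorphic math cheaper, at some cost in matching accuracy.
import numpy as np

rng = np.random.default_rng(0)
embedding = rng.standard_normal(512)  # stand-in for a full-size embedding

# Reduce dimension, e.g. 512 -> 64, with a fixed random projection.
projection = rng.standard_normal((64, 512)) / np.sqrt(512)
reduced = projection @ embedding

# Reduce precision, e.g. quantize to 8-bit integers.
scale = np.abs(reduced).max() / 127
quantized = np.round(reduced / scale).astype(np.int8)

print(embedding.nbytes, "->", quantized.nbytes, "bytes")  # 4096 -> 64
```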

[–] LWD@lemm.ee 6 points 2 days ago (1 children)

October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted.

I love it when surprise features get silently added, then discovered a couple months later. Makes you wonder exactly how easy it would be for Apple to start scanning for tons of other stuff in addition to landmarks, now that they've built out the infrastructure for it.

It's really kind of them, too. Collecting data for all of your photos for free, holding a database of public places for free, scanning your photos against them for free, returning that day to you for free... They're so generous!

[–] WhatAmLemmy@lemmy.world 1 points 2 days ago

I love big brother