this post was submitted on 04 Sep 2023
160 points (97.6% liked)

Technology

all 27 comments
[–] [email protected] 83 points 2 years ago (2 children)

Man, I remember the controversy when this initiative launched. Can't please anyone, it seems.

[–] [email protected] 36 points 2 years ago (2 children)

I never supported it, since it ran on-device, and given this is the US, hashes to spot "extremism" could be added; Apple doesn't know what the hashes represent.

[–] [email protected] 28 points 2 years ago* (last edited 2 years ago)

No, you're wrong.

They are not cryptographic hashes. They are "perceptual" hashes, or "fuzzy" hashes; essentially a low-resolution copy of the original image. It's trivial for an attacker to maliciously send innocent-seeming images that are a hash collision. This is, by the way, a feature, not a bug: perceptual hashes are deliberately designed to match inexact copies, not to perform a perfect match.

There are plenty of free white-papers on how perceptual hashes work, and Facebook's implementation is even open source.
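To make "a low-resolution copy of the image" concrete, here is a minimal sketch of one of the simplest perceptual hashes, the "average hash" (aHash). It assumes the image has already been downscaled to an 8x8 grayscale grid; real systems like Apple's NeuralHash or Facebook's open-source PDQ are far more sophisticated, but the principle is the same: reduce the image to a tiny fingerprint and compare fingerprints by Hamming distance instead of exact equality.

```python
def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int.

    Each bit records whether that pixel is brighter than the image's
    average brightness -- a crude, blur-tolerant fingerprint.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two slightly different synthetic "images": the second brightens every
# pixel a little. A cryptographic hash would differ completely; the
# perceptual hash barely moves -- which is also why crafted collisions
# against innocent-looking images are feasible.
img1 = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img2 = [[(r * 8 + c) * 4 + 3 for c in range(8)] for r in range(8)]

h1, h2 = average_hash(img1), average_hash(img2)
print(hamming_distance(h1, h2))  # → 0: the two images "match"
```

The example images and the 8x8 size are my own illustration, not anything from Apple's system; the point is only that matching is deliberately tolerant.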

Apple said they tested 100 million perfectly legal images and three collided with a CSAM perceptual hash. When you consider how many photos Apple was proposing to scan (hundreds of trillions), that rate works out to millions of false positives even if nobody maliciously abused the system.

Because of all that, Apple was planning human review of every match. They would therefore have seen every true match and every false positive; it couldn't have been hidden from Apple.

[–] [email protected] -3 points 2 years ago (1 children)

What makes you say Apple didn't know what they are? Is this a thing that happened that I'm not aware of?

[–] [email protected] 12 points 2 years ago* (last edited 2 years ago) (1 children)

If Apple is only supplied the hashes, it can't tell why the underlying files are bad.

[–] [email protected] 78 points 2 years ago (1 children)

There are two types of people in support of this scanning: technologically illiterate or malicious.

Either way, keep your invasive scanners off of my shit.

[–] [email protected] 20 points 2 years ago

Some of this outrage comes from people who want protection for children who had monsters do a terrible thing to them, and who then exacerbated it by uploading the material to the cloud, where sharing it is easier. But those people aren't seeing the bigger implications. I don't think many of the people who are against CSAM scanning are against protecting children, or against preventing the very thing this is designed to prevent; myself included. What people object to is the scanning of material on your phone, which is what Apple proposed. People don't want pictures scanned on their phones, even if it only happens as those photos are uploaded to the cloud. Several other companies scan content after it has been placed on the cloud, and many people who oppose on-device scanning are fine with that. Apple, which is not in favor of scanning your cloud data, was against that approach, which I think is admirable.

The fact of the matter is that scanning data for any purpose is at odds with the protection of your privacy. I, for one, am in favor of privacy protection. And although at times it may seem like people are against things like the protection for children, the fact is we’re actually in favor of protection for everyone.
