Apple to scan images for child abuse

  • Thread starter: LiE
Surely this is only a step away from people having their doors booted in at silly o'clock in the morning because some div at Apple deemed the holiday photos of their kids playing on the beach to be 'dodgy'.

Apple don’t decide on the photos. The database of hash codes is provided by child protection agencies.
 
So someone could maliciously plant a photo on your device without your knowledge, using a virus or similar, and your phone would literally report you to the authorities.

Or 1000s of them. There have already been exploits/malware like this which fill the disk with images.

Make it into a QR code and stick it somewhere, maybe a fake NHS app one. Watch the clueless masses scan it and go.
 
I love how people here are missing the difference between an image and image hash.

They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.

Some people need to lay off the tinfoil.
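To illustrate the distinction the poster is making, here is a toy sketch of the difference between a cryptographic hash (any change to the bytes produces a completely different digest) and a perceptual hash (similar images produce similar hashes). This uses a simple difference hash on a flat list of pixel values; it is not Apple's actual NeuralHash, just an illustration of the general idea:

```python
# Toy comparison of a cryptographic hash vs a perceptual hash.
# NOT Apple's NeuralHash -- just an illustrative sketch.
import hashlib

def crypto_hash(pixels):
    # Any change to the input bytes yields a completely different digest.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def dhash(pixels, width=9, height=8):
    # Difference hash: one bit per comparison, set when a pixel is
    # brighter than its right-hand neighbour. A small edit to the
    # image flips only a few bits of the hash.
    bits = 0
    for row in range(height):
        for col in range(width - 1):
            left = pixels[row * width + col]
            right = pixels[row * width + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    # Number of differing bits between two perceptual hashes.
    return bin(a ^ b).count("1")

original = [(r * 9 + c) % 256 for r in range(8) for c in range(9)]
tweaked = original.copy()
tweaked[0] = (tweaked[0] + 10) % 256  # one tiny edit to one pixel

print(crypto_hash(original) == crypto_hash(tweaked))  # False: digests differ completely
print(hamming(dhash(original), dhash(tweaked)))       # 1: perceptual hashes nearly identical
```

The point being debated in the thread follows from this: a system matching perceptual hashes against a fixed list of known images is not "looking at" your photos in any human sense, but nor is it the all-or-nothing exact matching that "hash" might suggest.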

Apple's own statements make it clear that's not all that will be done. The intent is to scan everything, evaluate everything, categorise everything. The intent is far wider than you stated. What you stated is their public facing weapon to suppress dissent.

Then, of course, there will be function creep. There always is. There will also be incorrect results. There always are.
 
I love how people here are missing the difference between an image and image hash.

They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.

Some people need to lay off the tinfoil.



So how long before hashes of copyrighted works are added?

Apple has a major vested interest in disabling privacy on its devices, to direct customers to its subscription/on-demand services


iTunes, Apple TV etc
 
Apple's own statements make it clear that's not all that will be done. The intent is to scan everything, evaluate everything, categorise everything. The intent is far wider than you stated. What you stated is their public facing weapon to suppress dissent.

Then, of course, there will be function creep. There always is. There will also be incorrect results. There always are.

Apple confirmed its plans in a blog post, saying the scanning technology is part of a new suite of child protection systems that would “evolve and expand over time”.

Expand? Expand?!

https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f5411f

The problem is when people start believing what they're told as the truth and anything else as "tinfoil hat". It's autocannibalism, just as planned.
 
If a strong enough match is flagged, then Apple staff will be able to manually review the reported images

I think the term 'strong enough match' makes it clear they will be using some kind of AI, if they were solely comparing hashes then either it's a match or it isn't. Seems like a privacy (and security) nightmare that will do absolutely nothing to catch child predators now that they announced it to the world. Imagine an Apple employee browsing through your files because you took a picture of your toddler in the bath. Not that I was particularly interested in their overpriced phones before, but I certainly wouldn't go anywhere near them now.
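The "strong enough match" point can be made concrete: with perceptual hashes, matching means checking whether a candidate hash is within some bit-distance threshold of any entry in the database, not testing for exact equality. The hashes and threshold below are made up for illustration; the real system's values are unknown:

```python
# Hypothetical sketch of "strong enough match": flag an image whose
# perceptual hash is within a bit-distance threshold of any known hash.
# The 16-bit hashes and the threshold here are invented for illustration.
KNOWN_HASHES = {0b1011_0110_1100_0011, 0b0001_1111_0000_1010}
THRESHOLD = 3  # max differing bits still counted as a match (assumed)

def hamming(a, b):
    # Number of bits that differ between two hashes.
    return bin(a ^ b).count("1")

def is_flagged(image_hash):
    # "Close enough" counts as a match -- which is why near-duplicates
    # are caught, and also why false positives are possible at all.
    return any(hamming(image_hash, known) <= THRESHOLD for known in KNOWN_HASHES)

print(is_flagged(0b1011_0110_1100_0011))  # True: exact match
print(is_flagged(0b1011_0110_1100_0111))  # True: only 1 bit off
print(is_flagged(0b0100_1001_0011_1100))  # False: far from both known hashes
```

This is exactly the trade-off the post describes: a threshold of zero would miss trivially re-encoded copies of known images, while any non-zero threshold admits some rate of false matches that then need human review.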
 
Imagine typing that and being scared of every "what if" scenario imaginable.

I don't even have an iPhone, I use Android but if Google want to match hashes, to find people with child abuse images as well, they should go for it.


Well, from the Snowden leaks we know the largest single producer of child pornography is the US/UK security forces, through mass collection of video chats.

And they’re who you want involved in this?
 
If a strong enough match is flagged, then Apple staff will be able to manually review the reported images

I think the term 'strong enough match' makes it clear they will be using some kind of AI, if they were solely comparing hashes then either it's a match or it isn't. Seems like a privacy (and security) nightmare that will do absolutely nothing to catch child predators now that they announced it to the world. Imagine an Apple employee browsing through your files because you took a picture of your toddler in the bath. Not that I was particularly interested in their overpriced phones before, but I certainly wouldn't go anywhere near them now.


How long before some Apple employee gets outed as a paedophile, I wonder.
 
I wonder how long till security forces add pictures of various people into this hash list?


As we’ve just seen, people aren’t happy about an Apple employee reviewing matches, so it should be a police officer, right?


Well, as it’s all hashes, no one at Apple would know whether the image flagged is child abuse or, say, the face of some target of the security forces, right?

Just that there’s a match, and they pass it on to the officer, who views it in private.


An amazing tracking ability for blackmail etc
 
These things are always initially introduced with good intentions and in a way that no-one can object. Then the scope expands. In this case I expect something along the lines of...

* Scanning cloud images for child abuse. No-one can object to that, right?
* Then extend it to scan for terrorism related issues. No-one can object to that, right?
* Extend functionality to scan images on the phone itself, because it prevents paedophiles and terrorists from avoiding the tool by not uploading to the cloud. If you have nothing to hide, right?
* Extend it to scan for images for other crimes. It's only a small change. Some people might object but shame on them if they do.
* All images scanned for whatever is deemed public benefit. It's only a small extra step and you're already being scanned so why object?

Last step: use AI to predict who might become a bad person and punish them before they think wrong.
 
The problem as I see it is that it sets the precedent that you don't actually own or have any expectations of privacy of the data you store on your electronic devices or cloud services.

If we allow governments/companies free rein to scan and analyse all personal electronically stored information, I can see that quickly being expanded well beyond the original stated remit of protecting children...

Though one of Apple's main marketing points is that it respects the privacy of its users, and I think it would jeopardise sales if they were seen to compromise that, so it's in their own interest to ensure privacy isn't compromised. Time will tell, though!
 
Of course Apple will abuse it. People are sleepwalking into living in the PRC under the guise of safety; the data will be abused by Apple, and child abusers will just use a different platform, if they aren't already.

I wonder if Hollywood celebrities and the so-called political elite will now change over to Android?

Also, how many families with pictures of their children playing in the garden or in a pool will be flagged up?
 
Also, how many families with pictures of their children playing in the garden or in a pool will be flagged up?

None, as that’s not how it works. It’s been covered a lot in this thread already.
 
So Apple employees will be reviewing images of kiddie porn now, and not the police? That sounds like a crazy idea.

You make it sound like some regular employee will be reviewing them. That definitely won’t be the case.
 