Apple to scan images for child abuse

  • Thread starter: LiE
Apple bashing is a hobby for many.
OTOH, you come across as their #1 fan. Your whole argument seems to be, "Apple would never do anything bad. I trust them implicitly to do exactly what they said they'd do and nothing else!"

Also, "Other companies do it!" is not actually a justification for doing something shady. Even tho it's likely true, it's still just as bad when Apple does the bad thing.
 
OTOH, you come across as their #1 fan. Your whole argument seems to be, "Apple would never do anything bad. I trust them implicitly to do exactly what they said they'd do and nothing else!"

I have openly criticised Apple on this forum.

My entire argument is based on essentially the information we have to go off, not conjecture.
 
I have openly criticised Apple on this forum.

My entire argument is based on essentially the information we have to go off, not conjecture.
You mean Apple's FAQ, where - oddly enough - they claim to not be doing anything wrong? I mean I'm not sure I'd just take them at their word.
 
You mean Apple's FAQ, where - oddly enough - they claim to not be doing anything wrong? I mean I'm not sure I'd just take them at their word.

I mean the technical doc, FAQ and all the related public statements Apple have made. That is what I’m basing my opinion on. I’m not as cynical as many here.
 
I mean the technical doc, FAQ and all the related public statements Apple have made. That is what I’m basing my opinion on. I’m not as cynical as many here.
I'm unashamedly cynical to the max when it comes to corporate ethics.

Corporate ethics. An oxymoron if ever there was one.
 
I mean the technical doc, FAQ and all the related public statements Apple have made. That is what I’m basing my opinion on. I’m not as cynical as many here.

No one in IT security trusts companies like Apple. Believe the security people, not Apple themselves or some random blogger/journalist who doesn't really know anything.
 
Also, Apple, while they love to larp on about privacy, really just want your data to themselves. AND China! They are good mates with the PRC.

I mean, Apple don't care that much about your data because they charge you for everything. Either you pay for a product or you are the product. You don't pay for facebook, google maps, mail, docs etc. because they sell your info.
 
They aren't just comparing hashes.

They are also scanning all images for nudity.

So let's say you've taken photos of your young son/daughter shirtless at the beach and you have them on your Apple device... they will be flagged as suspected images, and an employee with no training, who doesn't know the context of your photos, will call the cops to be on the safe side.
 
They aren't just comparing hashes.

They are also scanning all images for nudity.

So let's say you've taken photos of your young son/daughter shirtless at the beach and you have them on your Apple device... they will be flagged as suspected images, and an employee with no training, who doesn't know the context of your photos, will call the cops to be on the safe side.

No, there are two different things being implemented.

Neither is just scanning all images on your phone.
 
No, there are two different things being implemented.

Neither is just scanning all images on your phone.

The Apple documents, at least what I've read, don't make that clear - you'd hope it was part of the sync/upload process rather than general scanning.
 
They aren't just comparing hashes.

They are also scanning all images for nudity.

So let's say you've taken photos of your young son/daughter shirtless at the beach and you have them on your Apple device... they will be flagged as suspected images, and an employee with no training, who doesn't know the context of your photos, will call the cops to be on the safe side.

The only reason I can think of that has brought you to this conclusion is that you have misunderstood what Apple is actually doing. There are two different features:

The first covers children on family accounts and uses on-device (important given the context) analysis which attempts to prevent said children from sending nudes to each other via iMessage. It only covers iMessage and seems like a perfectly reasonable parental control to implement. The image never goes to Apple or the police, nor is there any kind of report except to the parent for kids under 12.

The second hashes images uploaded to iCloud to see if they match hashes of known/verified CSAM. You have to upload an undisclosed threshold of verified CSAM to iCloud to be flagged. It's fair to say it's more than one, given it's supposedly a 1 in a trillion false positive rate per year. That said, 1 in a trillion could still mean hundreds of users per year mismatched, given how many uploads there are to iCloud each year just in America, but that's the point of the manual review.
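To make the threshold idea concrete, here is a minimal sketch assuming exact SHA-256 hashes, a plain in-memory set and toy data; Apple's real system uses a perceptual hash (NeuralHash), a blinded database and private set intersection, and none of the names below are Apple's API:

```swift
import Foundation
import CryptoKit

// Toy stand-in for the matching step: a cryptographic hash instead of NeuralHash.
func digest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Nothing is surfaced for review until the number of matching uploads
/// crosses the (undisclosed) threshold.
func accountExceedsThreshold(uploads: [Data], knownHashes: Set<String>, threshold: Int) -> Bool {
    uploads.filter { knownHashes.contains(digest($0)) }.count >= threshold
}

// Toy data: two "known" images and one innocent holiday photo.
let known = [Data("known-image-a".utf8), Data("known-image-b".utf8)]
let beachPhoto = Data("kids-at-the-beach".utf8)
let database = Set(known.map(digest))

print(accountExceedsThreshold(uploads: [beachPhoto], knownHashes: database, threshold: 2))         // false
print(accountExceedsThreshold(uploads: known + [beachPhoto], knownHashes: database, threshold: 2)) // true
```

The point being that one match on its own, or a photo that simply isn't in the database, never gets an account flagged.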

Realistically, neither of those is going to result in you being 'shopped' to the FBI for taking pictures of your own children, because the pics you've taken of your kids won't be in the CSAM database. The other feature wouldn't impact you as an adult.

The Apple documents, at least what I've read, don't make that clear - you'd hope it was part of the sync/upload process rather than general scanning.

This is what the Apple FAQ says:

Does this mean Apple is going to scan all the photos stored on my iPhone? No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

So it doesn't explicitly say when it does the matching, but does it really matter? Matches only leave the device if that individual photo is uploaded to iCloud. Turn iCloud off, or set iCloud not to upload certain albums, and those albums will not be uploaded.
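If it helps, this is how I picture that gating, as a tiny sketch; the types and function names here are hypothetical, not anything from Apple's documentation:

```swift
import Foundation

// Hypothetical model of the photo library, just to illustrate the gating:
// the matcher only ever sees photos the user has chosen to sync to iCloud Photos.
struct Photo {
    let name: String
    let inSyncedAlbum: Bool   // albums excluded from iCloud Photos are never uploaded
}

func photosEligibleForMatching(library: [Photo], iCloudPhotosEnabled: Bool) -> [Photo] {
    // With iCloud Photos switched off, nothing is matched or uploaded at all.
    guard iCloudPhotosEnabled else { return [] }
    return library.filter { $0.inSyncedAlbum }
}

let library = [Photo(name: "holiday.jpg", inSyncedAlbum: true),
               Photo(name: "private.jpg", inSyncedAlbum: false)]

print(photosEligibleForMatching(library: library, iCloudPhotosEnabled: false).count)  // 0
print(photosEligibleForMatching(library: library, iCloudPhotosEnabled: true).count)   // 1
```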

I suppose once you factor in that most people have iCloud enabled and just leave it on its default settings, it is not unreasonable to surmise that most photos added to the built-in photo library will end up on iCloud at some point.

There is some debate here as to whether the second feature is acceptable but all of these services need to do it to comply with the law in the USA.
 
I've tried having the discussion regarding 'circumcision pics' of the aforementioned religious practice on yet another free speech site.

The general consensus is that the site owners consider such pictures to be 'explicit involving minors', regardless of there being zero legal or censorship issues regarding it anywhere else, and every media outlet is free to print non-graphic images depicting said practice.

Obviously with these types of websites, a lot of users use them to whinge about a certain religion. Many end up finding themselves banned when calling out the practice of 'metzitzah b'peh' with a non-graphic image from Google searching, not realizing that the site owners are going to brand this as CP and ban them, because to most people circumcision is not the same thing as porn. Yet there are countless memes calling the practice out using the same pictures, to be found all over reddit / twitter / facebook / imgur etc.

People who find themselves banned for posting anti circumcision posts with such pictures are left baffled as to how that was ever CP.

There is the belief that because webhosts hate such websites by default, they will use any such excuse to deplatform them, even if such content is allowed anywhere else by Google and whichever web host.

Such sites start off promising 'Anything that doesn't break US law is fine', and then less than 6 months later begin needing to implement more posting restrictions than normal social media due to how much they are being reported and threatened with dehosting.
 
The first covers children on family accounts and uses on-device (important given the context) analysis which attempts to prevent said children from sending nudes to each other via iMessage. It only covers iMessage and seems like a perfectly reasonable parental control to implement. The image never goes to Apple or the police, nor is there any kind of report except to the parent for kids under 12.

From this paragraph it sounds like it's restricted to images between children. Actually, any image that a children's account receives will be scanned.

https://www.apple.com/child-safety/

The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

If little Johnie has done some sexting (I believe that's what the kids call it) and Apple have scanned the image and decided it needs a notification, why wouldn't the police be informed? It's a crime.

https://childlawadvice.org.uk/information-pages/sexting/

It'd be interesting to know how it determines that an image not in the database is sexually explicit. The term itself can be quite expansive, especially in very conservative America.

The second hashes images uploaded to iCloud to see if they match hashes of known/verified CSAM. You have to upload an undisclosed threshold of verified CSAM to iCloud to be flagged. It's fair to say it's more than one, given it's supposedly a 1 in a trillion false positive rate per year. That said, 1 in a trillion could still mean hundreds of users per year mismatched, given how many uploads there are to iCloud each year just in America, but that's the point of the manual review.

You don't have to upload the image. It's being scanned locally on the device:

https://www.apple.com/child-safety/

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.

I would be very surprised if all a user would need to do is turn off the iCloud link to avoid all this. If it was that simple I'm sure all the pedos would be doing that now.
 
From this paragraph it sounds like it's restricted to images between children. Actually, any image that a children's account receives will be scanned.

https://www.apple.com/child-safety/



If little Johnie has done some sexting (I believe that's what the kids call it) and Apple have scanned the image and decided it needs a notification, why wouldn't the police be informed? It's a crime.

https://childlawadvice.org.uk/information-pages/sexting/

It covers images received/sent via the Apple Messages app only, not the rest of the device or the other apps the user may have access to. That said, I'm sure it will be expanded to 3rd party apps in time via an API, but I'm not sure you'll find many objections from parents.

Lobby Apple to change it then if you think it should tell the police, but that isn’t how it works and at no point does the documentation point to it notifying the police. It’s a parental control feature. For under 12s it notifies the parent. For over 12s it just prompts the child.
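To put my reading of the documentation in one place, the decision looks something like the sketch below; the names and the 0.9 cut-off are entirely made up, and the only "report" that exists is the one to the parent:

```swift
import Foundation

// The three outcomes described in the docs; none of them involve Apple or the police.
enum SafetyAction {
    case deliverNormally
    case warnChild                   // image blurred, child shown a warning
    case warnChildAndNotifyParent    // only possible for the youngest accounts
}

/// `confidence` stands in for whatever score the on-device classifier produces.
func action(forConfidence confidence: Double, childAge: Int, featureEnabledByParent: Bool) -> SafetyAction {
    guard featureEnabledByParent, confidence > 0.9 else { return .deliverNormally }
    return childAge <= 12 ? .warnChildAndNotifyParent : .warnChild
}

print(action(forConfidence: 0.95, childAge: 11, featureEnabledByParent: true))   // warnChildAndNotifyParent
print(action(forConfidence: 0.95, childAge: 15, featureEnabledByParent: true))   // warnChild
print(action(forConfidence: 0.95, childAge: 11, featureEnabledByParent: false))  // deliverNormally
```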


It'd be interesting to know how it determines that an image not in the database is sexually explicit. The term itself can be quite expansive, especially in very conservative America.

Probably the same way they can tell a dog is a dog. I'm sure Apple's machine learning is good enough to work out what is explicit and what isn't. YouTube has been doing it for years, so it isn't exactly new. What's new is that Apple can do it on device instead of relying on a remote server, which is a good thing in this context.
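For anyone curious what on-device classification looks like in practice, Apple's Vision framework already ships a general-purpose image classifier that runs locally. To be clear, this is not the classifier the Messages feature uses; it's only an illustration of running a model on the device rather than on a server:

```swift
import Foundation
import Vision

/// Returns the labels the built-in classifier is most confident about, computed entirely on-device.
func topLabels(forImageAt url: URL, minimumConfidence: Float = 0.8) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])   // no network involved
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }
}

// Usage (hypothetical path): let labels = try topLabels(forImageAt: URL(fileURLWithPath: "/tmp/photo.jpg"))
```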


I would be very surprised if all a user would need to do is turn off the iCloud link to avoid all this. If it was that simple I'm sure all the pedos would be doing that now.

This is not the same feature as above; this is to do with the known CSAM matching, not parental controls. I don't know why you are questioning this, it's covered in Apple's documentation within the same link you posted. Turn off iCloud and it turns off the 'feature'; it only affects images which are uploaded to iCloud.
 
I mean, Apple don't care that much about your data because they charge you for everything. Either you pay for a product or you are the product. You don't pay for facebook, google maps, mail, docs etc. because they sell your info.

Yes, they have all your data
 
If little Johnie has done some sexting (I believe that's what the kids call it) and Apple have scanned the image and decided it needs a notification, why wouldn't the police be informed? It's a crime.
If little Jonny is sending texts to little Jenny and vice versa, then the solution is probably not to make them both sex criminals. But that does seem to be the way the law is right now, rightly or wrongly.
 
If little Jonny is sending texts to little Jenny and vice versa, then the solution is probably not to make them both sex criminals. But that does seem to be the way the law is right now, rightly or wrongly.

I seem to remember reading about a case a while back where a bloke was jailed and put on the register because he was found to have home videos of him and his wife's kinky time. She was 17, he was 20 and they were legally and happily married.
 
They aren't just comparing hashes.

They are also scanning all images for nudity.

So let's say you've taken photos of your young son/daughter shirtless at the beach and you have them on your Apple device... they will be flagged as suspected images, and an employee with no training, who doesn't know the context of your photos, will call the cops to be on the safe side.


That would be my first concern, followed by the mission creep, and later by AI profiling to catch people who might be bad or think wrong.

And there will be cases where kids take selfies that are explicit and get imprisoned 5 years later.
 
If little Jonny is sending texts to little Jenny and vice versa, then the solution is probably not to make them both sex criminals. But that does seem to be the way the law is right now, rightly or wrongly.

I agree that they shouldn't be sex criminals. But sadly, over the last 20+ years, the use of discretion when it comes to the law has been removed.

I don't know if you remember this, but in the days when we didn't have phones for photos we had to go and get them developed at a photo shop. There was a story about the TV presenter Julia Somerville, who had taken some photos of her family to be developed. One of them was of her 7-year-old daughter in the bath. When her partner went back to the shop to pick up the photos he was met by the police, who arrested him and Julia and had their house searched. Eventually the allegations were dropped. But it shows that people will call in the cops even when they know the images are probably innocent, to be on the safe side. It seems in areas like this, context goes out the window.
 