Apple to scan images for child abuse

Very timely. :D :p

And what makes them out of scope of the same law Apple et al. are subject to, where it’s an offence to even process the content?

They can encrypt the data all they like but what happens when the authorities in the US find them processing CSAM?

I’m not saying the US authorities are snooping on the provider; what happens is that when these people are eventually caught, all their electronic devices are forensically examined and the content is likely to be found.
 
And what makes them out of scope of the same law Apple et al. are subject to, where it’s an offence to even process the content?

They can encrypt the data all they like but what happens when the authorities in the US find them processing CSAM?

Who says they're US based? They're not processing CSAM or even photos of me at the beach in a mankini - they're storing random strings of numbers and letters, i.e. ciphertext. You seem overly concerned with what the authorities in the US think or want. One could rearrange the 1s and 0s on your very own PC into anything they wished - does that mean you're processing CSAM, or does it mean that random numbers and letters can be rearranged into anything you want?

Edit: To illustrate - here's the contents of a file in my cloud storage provider:

Code:
5243 4c4f 4e45 0000 c201 c74c 5fe4 80fe
b260 9bb5 8092 6891 348c 5eca e629 18d4
4cf7 a11c 2808 f9f5 98bd 2717 3431 ab23
c20a 82b0 b209 fe4d f1e8 c42a 8bdd 8799
49a5 f38a 0871 df48 5780 c9c8 611a 0794
3aad 0072 4498 cb9b ef26 183d 1352 dee1
1609 a18a 27b2 e83f 414d 041c abb9 5711
bb22 d2a6 3942 d64d 2dd8 410e fd81 3c06
6f64 dad3 6f94 ef72 b0eb 7a63 0975 6f37
c265 02f8 846d b643 8fcd aa20 aff9 c56c
2338 d203 6068 8995 15e0 1b8f dc46 54c5
6530 eee8 4c48 1efe ffd7 4ed8 5a0e 8a7a
3deb fff3 642e b874 06ca 18a5 fd6c 2dbd
5dbd 50

Where's the... anything? What offence is committed there? None. It actually says 'hello', by the way (plus padding plus encryption overhead).
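A toy sketch of why a dump like the one above reveals nothing: below, 'hello' goes through a throwaway XOR stream cipher whose keystream is derived from SHA-256. This is illustration only, not real cryptography - Stingle's actual scheme isn't reproduced here, and the key is a made-up placeholder.

```python
import hashlib

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher: keystream block i = SHA-256(key || i).
    Illustration only -- NOT a real or secure cipher."""
    keystream = bytearray()
    counter = 0
    while len(keystream) < len(plaintext):
        keystream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    # XOR each plaintext byte with the matching keystream byte
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

key = b"user-held secret"          # hypothetical key the provider never sees
ct = toy_encrypt(b"hello", key)
print(ct.hex())                    # opaque hex, nothing recognisable
print(toy_encrypt(ct, key))        # XOR is symmetric: prints b'hello'
```

Without the key, the provider holds (and 'processes') only the opaque bytes on the first line - which is the point being made about ciphertext.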
 
Who says they're US based? They're not processing CSAM or even photos of me at the beach in a mankini - they're storing random strings of numbers and letters, i.e. ciphertext. You seem overly concerned with what the authorities in the US think or want. One could rearrange the 1s and 0s on your very own PC into anything they wished - does that mean you're processing CSAM, or does it mean that random numbers and letters can be rearranged into anything you want?

In the broad view of the legal system, it’s irrelevant whether the files are encrypted. The provider still processed the image and stored it on their servers.

The point I was making is that Apple are making these changes in response to a law which makes it an offence to process CSAM.

What makes you think any other cloud storage provider isn’t going to be in scope if they are based in, or even just operate in, the US?
 
In the broad view of the legal system, it’s irrelevant whether the files are encrypted. The provider still processed the image and stored it on their servers.

The point I was making is that Apple are making these changes in response to a law which makes it an offence to process CSAM.

What makes you think any other cloud storage provider isn’t going to be in scope if they are based in, or even just operate in, the US?

You're missing the point, along with bypassing every point I made in my prior post, seizing instead on the simple link to the article about Stingle. They might have 'processed' the entire works of Shakespeare or documents on how to build a WMD. But who'll ever know? Not Stingle, not the government, not LEO, nobody. So it's a moot point.

Making it an offence to even process things as part of being a common carrier is disingenuous at best, and outright thin wedging at worst. Browsers and search engines enable access to everything from drugs to human assassination to abuse to... well, anything. Do we monitor and regulate those next? Are Mozilla and Google et al. going to have to implement monitoring and alert capabilities into browsers and search engines and routers and switches (and so on ad nauseam) to make sure people aren't illegally processing/carrying/relaying/conveying/storing illegal matter, wrongthink or wrongspeak? It's pointless, but it does open up possibilities for further oppression, subversion and control of a populace.
 
I’m not missing the point at all, you deflected and didn’t actually answer the question.

What makes other cloud service providers (US-based, or with servers/presence in the US) that process images out of scope of this law?
 
I’m not missing the point at all, you deflected and didn’t actually answer the question.

What makes other cloud service providers (US-based, or with servers/presence in the US) that process images out of scope of this law?

I know exactly what you're trying to say: Apple are only acting in response to this law, and are trying to do it in as sane and privacy-sensitive a manner as possible. It is not Apple specific; all US companies are subject to it.

As I thought I'd said: yes, US-based entities will be (shock) subject to US law. But e2ee makes it a moot point; contrary to your assertion, it is not 'irrelevant if they are encrypted', because encryption means there's nothing to see and nothing to act upon. As with many ham-fisted and knee-jerk pieces of technology legislation (often with an ulterior agenda), the law itself is flawed and does basically nothing to address the supposed target. It does, however, result in many potential breaches of privacy or worse... but is a useful tool for scope creep.
 
Perhaps I should have rephrased. You posted a link to another service implying that you should put your photos there instead of in iCloud because it encrypts the images before physically uploading them.

The point I was making is that in the broad eyes of the law, it’s irrelevant whether the service encrypts the files or not (in the case of the one you linked). The service still processed the images, and it would take a strange interpretation of the spirit of the law to say they were out of scope.

Now if you uploaded an encrypted file which the service didn’t create, then it would be difficult for them to make any reasonable effort to prevent CSAM from being stored on their network unless the government also banned encryption. And no, I’m not suggesting the latter is a good thing or in any remote way a reasonable response.

The broad point I have been making in this thread is that it isn’t an Apple thing; most if not all providers have the same duty and go far further than Apple is going here to analyse, review and even moderate the content stored on their servers.

I don’t see why this wouldn’t apply to non-image-oriented services either, e.g. Dropbox.
 
Why is there not a thread on here spewing nonsense hyperbolic outrage about Google and MS who already do this?

Do Google and Microsoft currently deploy client-side CSAM scanning on their OS's, or is it only limited to online (allegedly Apple already do this with iCloud)?

Where are all the people going to go if they think Apple are suddenly the big villain here? Get an android phone? Oh wait…

Why not? You can use Android without Google services and there's plenty of other OS's available :confused:
 
Do Google and Microsoft currently deploy client-side CSAM scanning on their OS's, or is it only limited to online (allegedly Apple already do this with iCloud)?



Why not? You can use Android without Google services and there's plenty of other OS's available :confused:

I’m not sure what your point is; all of this only applies to images uploaded to iCloud. What Apple is proposing doesn’t apply to images stored only on the device.

It doesn’t really matter if the server ‘scans’ it or the device ‘scans’ it, it’s still getting ‘scanned’.

There really aren’t other relevant, useable, working, user friendly smart phone operating systems out there. Sure some tech bro could get away with using a Linux phone but in reality ‘normal’ people just wouldn’t have a clue. They have the choice of iOS or Android.
 
It doesn’t really matter if the server ‘scans’ it or the device ‘scans’ it, it’s still getting ‘scanned’.

If that's the case and it doesn't matter which point they do the scanning, why doesn't Apple just continue with iCloud scanning rather than bringing it client-side?

There really aren’t other relevant, useable, working, user friendly smart phone operating systems out there. Sure some tech bro could get away with using a Linux phone but in reality ‘normal’ people just wouldn’t have a clue. They have the choice of iOS or Android.

Sure but Android doesn't mean Google services and there are hardened and non-privacy invading flavours around if need be.
 
Increased privacy by doing it on device.

How is privacy increased for the end user if only files that are marked for uploading to iCloud are scanned?
They're ending up on iCloud regardless, where Apple have vastly more processing power available to them and can continue to CSAM scan them.
 
They're ending up on iCloud regardless, where Apple have vastly more processing power available to them and can continue to share them with GCHQ, the NSA, CIA, DHS et al., thanks to having purposefully backed up users' "secure" iMessage encryption keys to iCloud; as well as weakening iCloud security to 'encrypted at rest' (for which Apple hold the key) rather than having user-held e2ee keys - at the request of NSA.

FTFY. :p
 
I don't know if someone can answer this question, but by all accounts they are taking hashes of known bad images and comparing them to people's photos on iCloud. They are not "scanning" your photos for content; they are simply using an algorithm with a database of hashes for known bad images and comparing the hashes of your images against those. They are not doing any image processing per se, or learning/guessing.
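The matching described above amounts to a set lookup: fingerprint the image, check the fingerprint against a database. A minimal sketch, with the caveat that real systems (Apple's NeuralHash, Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding; plain SHA-256 is used here only to keep the sketch self-contained, and the "database" contents are made up.

```python
import hashlib

# Hypothetical database of hex digests of known bad images.
# In the real systems this would be perceptual hashes supplied by NCMEC et al.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_known_bad(image_bytes: bytes) -> bool:
    """Compare the image's fingerprint against the database.
    The image content is never inspected or 'understood' -- only its hash."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(matches_known_bad(b"known-bad-image-bytes"))  # True: exact match
print(matches_known_bad(b"my holiday photo"))       # False: unknown image, ignored
```

The design consequence is exactly what the post says: a novel photo, however private, can never match; only images already in the curated database can flag - which is also why the question of who curates that database matters so much.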

As someone else pointed out, however, this is in theory a fine idea. You can have whatever smut you want on your phone as long as it's not a known dangerous image. Those photos of yourself or your partner naked won't show up. The problem is that, depending on the country, that database could apparently contain anything. If China decided they wanted to add all photos of Winnie the Pooh to the database, then they could. Anyone with that image in their iCloud would be flagged up and potentially arrested.

If that's the case, it doesn't matter what technology they are using; this should be illegal. Giving anyone the power to decide that any arbitrary image is "dangerous" is very, very bad. I don't know who I would trust to curate that database. Certainly not any government.

A lot of people are caught up on the wording, "expanding over time" etc. It's a slippery slope; soon it's ML image scanning running from a black-box chip, and we all know how reliable that can be!

Also, Apple, while they love to larp on about privacy, really just want your data to themselves. AND China! They are good mates with the PRC.

"Won't somebody think of the children!" is a common cry used to install new legislation that is later used out of context.
 