Apple to scan images for child abuse

  • Thread starter: LiE
Not exactly, you’re missing a step in that Apple will eyeball the image before making any report, so if it isn’t the flagged image it isn’t reported.
 
I don't know if someone can answer this question, but by all accounts they are taking a hash of known bad images and comparing them to people's photos on iCloud. They are not "scanning" your photos for content, they are simply using an algorithm that has a database of hashes for known bad images and compares the hashes of your images to those. They are not doing any image processing per se or learning / guessing.
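The matching described above amounts to a set lookup rather than content analysis. Here is a toy sketch in Python using an exact cryptographic hash for simplicity (Apple's actual NeuralHash is a perceptual hash designed to survive resizing and re-encoding, which this does not model; the database values below are made up for illustration):

```python
import hashlib

# Illustrative database of SHA-256 hashes of known bad images, as might be
# supplied by a child-protection organisation. Values here are made up for
# this sketch; this entry is the hash of the bytes b"test".
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    # Hash the raw bytes. A real perceptual hash would also match resized
    # or re-encoded copies; a cryptographic hash only matches identical bytes.
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    # A pure set lookup: no analysis or guessing about what the image
    # depicts, only comparison against the list of known hashes.
    return image_hash(data) in KNOWN_BAD_HASHES

print(is_flagged(b"holiday photo"))  # False: hash not in the database
print(is_flagged(b"test"))           # True: matches the listed hash
```

The key property the posters are debating follows directly from this structure: an image that is not already in the database can never match, no matter what it depicts.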

I think there's some confusion because they're proposing to introduce two different things and they're being discussed as the same thing.

'Communication Safety' which will scan images in iMessage (or Messages, not sure if my terminology is out of date for Apple these days), attempt to identify 'sexually explicit' images and filter/blur them out.

They're also introducing something which will look for hash matches on photos uploaded to iCloud to try and find known 'Child Sexual Abuse Material'.
 
I can’t see any parents who deploy parental controls on their kids’ devices disabling the new Communication Safety feature. It seems entirely sensible.
 
Not exactly, you’re missing a step in that Apple will eyeball the image before making any report, so if it isn’t the flagged image it isn’t reported.

If the image matches a database of images given to them by whoever, though, why wouldn't they send it on to the authorities? What is stopping the remit of this extending beyond child porn?
 
Not exactly, you’re missing a step in that Apple will eyeball the image before making any report, so if it isn’t the flagged image it isn’t reported.

So what happens when I don't want anyone at Apple, or anywhere else, 'eyeballing' my personal photos? They're mine. The whole stance seems to be 'Well some random person in another country will trawl any flagged pics so it's fine'. Er, no thank you. Where's your warrant? Who made Apple LEOs? The whole thing stinks and it's the thin end of the wedge.
 
Not exactly, you’re missing a step in that Apple will eyeball the image before making any report, so if it isn’t the flagged image it isn’t reported.
If that's correct then Apple employees are going to see a lot of people's private snaps of their partners. I really would be moving away from Apple on principle if I actually had one.
 
If that's correct then Apple employees are going to see a lot of people's private snaps of their partners. I really would be moving away from Apple on principle if I actually had one.

Again, I think this is where the two different things are being conflated slightly.

I've only seen suggestion of 'review' of flagged images in relation to the hash matching of known CSAM, not of images being identified as sexually explicit in the messaging app.

I doubt 'private' type pictures of people's partners are all that likely to end up getting hash matched to known child porn.
 
If the image matches a database of images given to them by whoever, though, why wouldn't they send it on to the authorities? What is stopping the remit of this extending beyond child porn?

Nothing but what’s stopping the Conservative party from getting rid of benefits or deporting anyone they like?

All these speculative ‘what ifs’ are just that and they completely ignore the checks and balances that exist in western societies.

I get that doesn’t apply in China but they don’t have a western society with checks and balances provided by its citizens.

Do you really think Apple would allow states to secretly inject other images into the system? ‘Reputation damage’ is an understatement, they’d be done if that happened.
So what happens when I don't want anyone at Apple, or anywhere else, 'eyeballing' my personal photos? They're mine. The whole stance seems to be 'Well some random person in another country will trawl any flagged pics so it's fine'. Er, no thank you. Where's your warrant? Who made Apple LEOs? The whole thing stinks and it's the thin end of the wedge.

No, you’d need to have multiple images on your phone which match those in child protection databases and you uploaded them to iCloud. Do you meet those criteria?
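On the 'multiple images' point: as described, a single hash match triggers nothing; only an account crossing a match threshold gets surfaced for human review at all. A minimal sketch of that gate (the threshold value and names are illustrative assumptions; the real design uses threshold secret sharing, so the server cannot even count matches below the limit):

```python
# Illustrative threshold gate: nothing is surfaced for human review
# until an account accumulates enough distinct known-hash matches.
MATCH_THRESHOLD = 30  # illustrative figure, not an official constant

def should_escalate(distinct_matches: int,
                    threshold: int = MATCH_THRESHOLD) -> bool:
    # Return True only when the number of distinct matches reaches the
    # threshold; a single match (or a false positive) triggers nothing.
    return distinct_matches >= threshold

print(should_escalate(1))   # False: one match alone is never reported
print(should_escalate(30))  # True: threshold reached, escalate to review
```

The point of the threshold is to make the expected false-positive rate per account negligible, since a report requires many independent matches rather than one.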


If that's correct then Apple employees are going to see a lot of people's private snaps of their partners. I really would be moving away from Apple on principle if I actually had one.

You know that’s not how it works, as has been said multiple times. As noted above, private snaps of your partner are not in child protection databases…



Just out of interest, how does Google handle images uploaded to Google Photos? We know they analyse the entire image and not just a hash. Do they scan for known CSAM?
 
Just out of interest, how does Google handle images uploaded to Google Photos? We know they analyse the entire image and not just a hash. Do they scan for known CSAM?

Google, MS etc. are already doing this with their cloud services. MS provide an API for it called PhotoDNA.

Apple have taken their time and implemented a number of additional layers to protect user privacy.
 
Google, MS etc. are already doing this with their cloud services. MS provide an API for it called PhotoDNA.

Apple have taken their time and implemented a number of additional layers to protect user privacy.

I thought that would be the case.

Where are all the people going to go if they think Apple are suddenly the big villain here? Get an android phone? Oh wait…
 
I thought that would be the case.

Where are all the people going to go if they think Apple are suddenly the big villain here? Get an android phone? Oh wait…

I believe it is law (certainly in the US) that any photos stored in the cloud have to be checked for CSAM.
 
Nothing but what’s stopping the Conservative party from getting rid of benefits or deporting anyone they like?

All these speculative ‘what ifs’ are just that and they completely ignore the checks and balances that exist in western societies.

That's not really an argument against. If you want to go down that route then nothing is off limits. That's how most of these things work. They start small and their scope is extended.

The government is already spying on us to the maximum capability of its technology. Was that all done out in the open and approved by the citizens? Of course not.

Do you really think Apple would allow states to secretly inject other images into the system? ‘Reputation damage’ is an understatement, they’d be done if that happened.

I'm not suggesting they would "inject" them on the sly at all. They would just require Apple to use whatever database they are given. Different countries have different ideas about what are harmful images. I'm not sure why you think Apple might not comply with these countries' wishes to have their own databases at some point. Apple are fundamentally a business, and they are not one of the most wealthy and valuable companies in the world because they are stupid.
 
Because it is illegal to own child porn, what other images are illegal to own?


Anything copyrighted you haven't paid for?


Tv shows
Movies
Music
Photos
Art work
Books
Audiobooks
Porn
Any clip from a movie or film that breaches the Obscene Publications Act, because it's only legal with the BBFC rating, which only covers the entire film (any clip of violence or threat to life, e.g. the sex scene where they're shooting in Mr & Mrs Smith, or The Babysitter or whatever it was)
Video or picture of any illegal activity, drugs say
Gay imagery depending on your country
Anti government imagery depending on your country
Memes that breach any of the above



The list of what is illegal is extensive.

What do you think might be illegal in, say, Saudi Arabia, China or Iran?

How might this technology lead to problems for normal citizens there?
 
I'm not suggesting they would "inject" them on the sly at all. They would just require Apple to use whatever database they are given. Different countries have different ideas about what are harmful images. I'm not sure why you think Apple might not comply with these countries' wishes to have their own databases at some point. Apple are fundamentally a business, and they are not one of the most wealthy and valuable companies in the world because they are stupid.

But again, it’s complete speculation based on zero evidence.

Why is there not a thread on here spewing nonsense hyperbolic outrage about Google and MS who already do this?

Holding copyrighted content without a licence isn’t even a criminal matter. It’s a civil matter dealt with by compensation and damages. Creation and distribution is a completely different conversation and you can’t detect that activity with the tool.
 
But again, it’s complete speculation based on zero evidence.

Of course it is, everything is a hypothetical until it happens. What isn't hypothetical is the government's desire to have complete access to everyone's information and digital data. Anything that could lead to more access to our private data should be well scrutinised and questioned.

Why is there not a thread on here spewing nonsense hyperbolic outrage about Google and MS who already do this?

Because it's Apple, and people expect better from a company that sells itself as a privacy-first company. I know Google and MS (to a lesser extent) will sell my data to anyone and will do plenty more shady things. Apple have always been, up to a point, the gatekeepers of what is acceptable intrusion when it comes to tech. The further they compromise this, the more normalised it will become. Stories about Apple refusing to do X are big news and draw attention to issues that people would otherwise not even know about. If they become another Google then I fear it really will become open season.

Who knows, it might be fine but I trust big tech about as far as I can throw them and people should be asking the "what ifs" when it comes to things like this.
 
Of course it is, everything is a hypothetical until it happens. What isn't hypothetical is the government's desire to have complete access to everyone's information and digital data. Anything that could lead to more access to our private data should be well scrutinised and questioned.

I agree, but that conversation has very little to do with Apple and everything to do with holding governments to account.


Because it's Apple, and people expect better from a company that sells itself as a privacy-first company. I know Google and MS (to a lesser extent) will sell my data to anyone and will do plenty more shady things. Apple have always been, up to a point, the gatekeepers of what is acceptable intrusion when it comes to tech. The further they compromise this, the more normalised it will become. Stories about Apple refusing to do X are big news and draw attention to issues that people would otherwise not even know about. If they become another Google then I fear it really will become open season.

But that’s a double standard that makes very little sense. You either accept the practice or you don’t. If you don’t, then shouldn’t you want to hold all of them to account equally?


Who knows, it might be fine but I trust big tech about as far as I can throw them and people should be asking the "what ifs" when it comes to things like this.

You are not going to find me saying you shouldn’t scrutinise what Apple and others are doing but that isn’t what’s happening in this thread.

You should absolutely question their motives and the scope of the ‘feature’, but there is no evidence that the answers given shouldn’t be taken in good faith.

What’s happening in this thread is unsubstantiated conjecture (often utilising hyperbolic statements) or just a fundamental misunderstanding of what is actually happening.
 
No, you’d need to have multiple images on your phone which match those in child protection databases and you uploaded them to iCloud. Do you meet those criteria?

I was responding to a post that postulated that might or would be the case. I wasn't answering you, or Apple. Conflating me with a paedophile, or suggesting I shouldn't care if I'm not one, because I disagree with something is not constructive to the discussion. Just because I have objections to an invasive, or potentially invasive, practice with scope for serious mistake or abuse does not automatically mean I'm a criminal or a child abuser.

If I was some type of criminal, I'd do what thousands or millions of innocent people also already do (including me). I'd upload my photos and files to a vendor agnostic encrypted store (eg with rclone, Cryptomator or similar) and nobody would be any the wiser. So what happens then? Suddenly Apple finds 0% CSAM and lauds the clean living of their user base? No, they say oh people just aren't using iCloud Photos for this material any more, so we'll expand the search to on-device photos to catch them before they're shared outside of iCloud, or uploaded to third party encrypted services. And so the feature creep - and expansion of the scope of operation - begins. Well it could begin, except that already happened years ago (Snoopers' Charter, RIPA, S5 POA etc).

Apple have pinky promised never to do that, you say? The same Apple who, being bound by NSA/CIA/DHS/FISA to spy every which way to Sunday, already intentionally weakened iMessage and iCloud to carry out all manner of spying on users? Definitely believe that one. That being the case, once government says 'Oh but you're allowing paedophiles/terrorists/whatever to hide on your platform because you only scan this way not that way', the horse has already bolted. The tech is in place, it's in use, and objecting to its expansion is just enabling terrorism/child abuse/insert anti-government behaviour here, and governments 'must take action' to ensure the children/citizens/country is safe. A bit like the current escalation of argument over enabling a 'back door in encryption for messages' (lol), with the EU huffing and puffing about sanctioning Facebook if it doesn't back down.

Exercising your right to privacy (enshrined in the UN and EU constitutions) does not make you a criminal, and the fact that criminals, including child abusers, can use tools to obfuscate their vile behaviour does not mean everyone else should be denied sovereignty over their own possessions and lives, or their privacy. Criminals also drink water and live in houses, should we outlaw and oversee those activities too, just to be safe? We're kind of back to 'if you ban (item X) then only criminals will own/do/have (item X)'.

As for the usual 'Apple haters gonna hate' comments I'm seeing scattered here and about, I have a MacBook Pro, iPad Pro, iPad Mini, several Apple watches, and seven iPhones in the house (including mine). I just keep anything confidential on-prem and out of easy grasp of corporations or their governmental overlords. That is to say, by using BSD, Linux, Tor, VPN, Matrix/Signal, e2e encryption, and so on; ironically, as recommended to the public by intelligence services on both sides of the pond.

This discussion is not as straightforward as 'LOL you can trust Apple because they said it's OK - you're a paedo if you disagree' or 'Apple evil, burn the empire'.

Do you really think Apple would allow states to secretly inject other images into the system? ‘Reputation damage’ is an understatement, they’d be done if that happened.

Like Apple, Microsoft/Windows, Facebook, Yahoo, and all the others were 'done' after the Snowden leaks? Most people don't care so long as they can still swipe right and buy a latte while they count their likes and followers.
 