Apple to scan images for child abuse

LiE

Caporegime
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/

What are people's thoughts on this?

Mac Rumours is very biased towards a certain privacy view, being US-centric. Snowden doesn't like it and thinks Apple is going to abuse it.

Personally I'm OK with this. The intentions are good, the technical implementation looks to maintain privacy and I trust Apple to make it work.

More details about Apple's plan:

https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/
 
Associate
Joined
29 Jan 2003
Posts
1,103
I don't think anyone could disagree with the notion of preventing child abuse (which is probably why they presented it using this example) but the question presents itself: once that door is open, what else will be scanned for? Terrorism? Saying things that a ruling party doesn't like?

What will other governments demand from Apple as the cost of doing business in their country? Since they're only comparing images to an existing database of illegal images, aren't there already mechanisms in place to detect illegal content while it's being transmitted?
 
Soldato
Joined
23 Feb 2009
Posts
4,978
Location
South Wirral
Details seem sketchy. I'd want to know how they define the boundaries between child porn and just sending pics of your kids to family / friends.

I would hazard a guess there's going to be some kind of AI-based photo recognition going on. How did they train the model, and what happens when something it doesn't like shows up? Are there human moderators for review? At what stage do they involve law enforcement?
 
Soldato
Joined
10 May 2012
Posts
10,062
Location
Leeds
Of course Apple will abuse it. People are sleepwalking into living in the PRC under the guise of safety; the data will be abused by Apple, and child abusers will just use a different platform if they aren't already.
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
Details seem sketchy. I'd want to know how they define the boundaries between child porn and just sending pics of your kids to family / friends.

I would hazard a guess there's going to be some kind of AI-based photo recognition going on. How did they train the model, and what happens when something it doesn't like shows up? Are there human moderators for review? At what stage do they involve law enforcement?

It's not AI, it uses CSAM hashes. Have a look at the 2nd link in the OP for some more details.
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
Of course Apple will abuse it. People are sleepwalking into living in the PRC under the guise of safety; the data will be abused by Apple, and child abusers will just use a different platform if they aren't already.

If you take that stance, you could quite easily say they are already abusing it. If you set the trust level that low, then you must assume they are already doing shady stuff behind closed doors.

Apple is all about privacy right now, it's a huge part of their business model.
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
It's just an excuse to scan and monitor everything.

"Child abuse" is a get out of jail free card for the companies and government to clamp down the monitoring.

It's not really scanning though, is it? It's checking against specific known image hashes.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.
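
If it helps, the matching step as described boils down to checking set membership on hashes rather than looking at the photo itself. Here's a minimal toy sketch, assuming an ordinary SHA-256 file hash and a made-up hash list; Apple's real system uses a perceptual hash plus cryptographic blinding, which this doesn't attempt to reproduce:

```python
# Toy illustration only: plain file hashing with hashlib and set membership.
# Apple's actual system uses a perceptual hash and a blinded database,
# none of which is reproduced here.
import hashlib
from pathlib import Path

# Hypothetical stand-in for the hash database shipped to the device.
KNOWN_IMAGE_HASHES = {
    "placeholder_hex_digest_1",
    "placeholder_hex_digest_2",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_database(path: Path) -> bool:
    """True only if the photo's hash appears in the known-hash set;
    the photo content itself never leaves the device in this check."""
    return file_hash(path) in KNOWN_IMAGE_HASHES
```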
 
Soldato
Joined
30 Apr 2006
Posts
17,964
Location
London
Of course people will accept the sacrifice of privacy for the big things, but then they'll have given up that right for the smaller things that aren't even illegal in the future.
 
Soldato
Joined
17 Jan 2016
Posts
8,784
Location
Oldham
Why is the image being sent to an employee? That person is likely going to be looking at CP daily. That isn't a healthy situation, and how is it even legal? There is no need for an employee to be involved (and we know the type of person that would be aiming to get that job!).

How does this system work for nude images? Another source I read said they will be looking at images and sending notifications out.

Apple's backside has gone, privacy is over.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,542
Location
Surrey
These things are always initially introduced with good intentions and in a way that no-one can object. Then the scope expands. In this case I expect something along the lines of...

* Scanning cloud images for child abuse. No-one can object to that, right?
* Then extend it to scan for terrorism related issues. No-one can object to that, right?
* Extend functionality to scan images at the phone level itself, because it prevents paedophiles and terrorists avoiding the tool by not uploading to the cloud. If you have nothing to hide, right?
* Extend it to scan for images for other crimes. It's only a small change. Some people might object but shame on them if they do.
* All images scanned for whatever is deemed public benefit. It's only a small extra step and you're already being scanned so why object?
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
How does this system work for nude images? Another source I read said they will be looking at images and sending notifications out.

I think you are referring to this; it's on-device and only applies to children's accounts.

Communication Safety
First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.

When a child attempts to view a photo flagged as sensitive in the Messages app, they will be alerted that the photo may contain private body parts, and that the photo may be hurtful. Depending on the age of the child, there will also be an option for parents to receive a notification if their child proceeds to view the sensitive photo or if they choose to send a sexually explicit photo to another contact after being warned.

Apple said the new Communication Safety feature will be coming in updates to iOS 15, iPadOS 15 and macOS Monterey later this year for accounts set up as families in iCloud. Apple ensured that iMessage conversations will remain protected with end-to-end encryption, making private communications unreadable by Apple.
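
So the flow, as far as I can tell, is: on-device classifier flags the photo, the photo gets blurred, the child gets a warning, and parents only get notified depending on age and their settings. A rough sketch of that decision logic, with the classifier result stubbed out (Apple hasn't published its model) and an age threshold picked purely for illustration:

```python
from dataclasses import dataclass

# Assumption for illustration: the age at or below which parents can opt in to alerts.
PARENT_ALERT_MAX_AGE = 12

@dataclass
class ChildAccount:
    age: int
    parents_opted_in: bool

def handle_flagged_photo(classifier_says_explicit: bool, child: ChildAccount) -> dict:
    """What the Messages UI does with an attachment, per the description above.
    `classifier_says_explicit` stands in for Apple's unpublished on-device model."""
    if not classifier_says_explicit:
        return {"blur": False, "warn": False, "parents_may_be_notified": False}
    # Explicit photo: blur it and warn the child before they can view it.
    # Parents are only in the loop if they opted in, the child is young enough,
    # and the child then chooses to view/send anyway (that last step isn't modelled).
    notify = child.parents_opted_in and child.age <= PARENT_ALERT_MAX_AGE
    return {"blur": True, "warn": True, "parents_may_be_notified": notify}

# Example: a 10-year-old whose parents opted in
print(handle_flagged_photo(True, ChildAccount(age=10, parents_opted_in=True)))
# -> {'blur': True, 'warn': True, 'parents_may_be_notified': True}
```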
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
These things are always initially introduced with good intentions and in a way that no-one can object. Then the scope expands. In this case I expect something along the lines of...

* Scanning cloud images for child abuse. No-one can object to that, right?
* Then extend it to scan for terrorism related issues. No-one can object to that, right?
* Extend functionality to scan images at the phone level itself, because it prevents paedophiles and terrorists avoiding the tool by not uploading to the cloud. If you have nothing to hide, right?
* Extend it to scan for images for other crimes. It's only a small change. Some people might object but shame on them if they do.
* All images scanned for whatever is deemed public benefit. It's only a small extra step and you're already being scanned so why object?

Apple could do that today, they have the machine learning already.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,542
Location
Surrey
Apple could do that today, they have the machine learning already. The intent to use CSAM hashes to flag known images is what they are proposing.
Yes but it would be publicly unacceptable today. That's why they boil the frog slowly.
 
Associate
Joined
20 Aug 2020
Posts
2,041
Location
South Wales
This is a very slippery slope and Apple should not have a backdoor to scan any of my photos. I have nothing to hide but still don't like it.

I thought Apple liked privacy, but I guess not. If Apple has the means to identify child abuse images on a user's device then that means they have the ability to track the images back to the original device/account. This is one hell of a big backdoor that can very easily be abused by governments if they force Apple to give them the ability to use it.
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,692
Location
Milton Keynes
This is a very slippery slope and Apple should not have a backdoor to scan any of my photos. I have nothing to hide but still don't like it.

I thought Apple liked privacy, but I guess not. If Apple has the means to identify child abuse images on a user's device then that means they have the ability to track the images back to the original device/account. This is one hell of a big backdoor that can very easily be abused by governments if they force Apple to give them the ability to use it.

As mentioned above, Apple already have the tech to scan images, yet they don't.

CSAM hash matching isn't scanning images, it's literally comparing hash codes, very low-level stuff.
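
To illustrate the hash point: with an ordinary cryptographic hash, changing a single byte of an image gives a completely different digest, so a database of known hashes can only ever match copies of images already in it. (Apple's perceptual hash is deliberately more tolerant of resizing and recompression, but the same principle applies: it's designed to match derivatives of known images, not to judge image content in general.) Quick demo:

```python
import hashlib

original = b"pretend these bytes are a known image"
tweaked  = b"Pretend these bytes are a known image"  # a single byte differs

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
# The two digests have nothing in common: a hash is a fingerprint of a
# specific image, not a general detector of what a photo contains.
```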
 
Soldato
Joined
13 Apr 2013
Posts
12,452
Location
La France
These things are always initially introduced with good intentions and in a way that no-one can object. Then the scope expands. In this case I expect something along the lines of...

* Scanning cloud images for child abuse. No-one can object to that, right?
* Then extend it to scan for terrorism related issues. No-one can object to that, right?
* Extend functionality to scan images at the phone level itself, because it prevents paedophiles and terrorists avoiding the tool by not uploading to the cloud. If you have nothing to hide, right?
* Extend it to scan for images for other crimes. It's only a small change. Some people might object but shame on them if they do.
* All images scanned for whatever is deemed public benefit. It's only a small extra step and you're already being scanned so why object?

Once upon a time, I’d have thought that you were some sort of tinfoil hat wearer. In 2021, I think you’re highly likely to be proved correct.
 