Apple to scan images for child abuse

Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Yeah hash-matching would seem to be a nonce sense way of going about this.

All it would take is for somebody to change a single pixel and boom, your file hash is different.

It seems to me that they'd want to leverage AI (as everybody does these days) or some other way of spreading the net more widely.
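
(To illustrate the exact-hash point: with a plain cryptographic hash like SHA-256, used here as a stand-in rather than whatever Apple actually use, flipping a single bit gives a totally unrelated digest.)

Code:
import hashlib

original = bytearray(b"pretend these are the raw image bytes")
modified = bytearray(original)
modified[0] ^= 1  # flip a single bit, the equivalent of a one-pixel tweak

# The two digests share nothing in common: exact hashing has zero tolerance.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())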
 
Soldato
Joined
3 Oct 2007
Posts
12,090
Location
London, UK
Sure, but the majority of folk know where they stand with those providers. Apple on the other hand have made themselves the banner for privacy, but are implementing (arguably not for the first time) a tool that goes against that, hence the uproar.

And the uproar isn't unjust considering previous history with tools developed for "good" but which have been used for "bad", e.g. NSO Pegasus.

That was always going to be abused. They might have marketed it as a tool to catch terrorists, but they could hardly market it as a tool for governments to monitor their own populations and foreign citizens they don't like because of what those people say about said government.

Or partially matched, and then, from what's being reported, the uploaded material is reviewed by an Apple employee once a "threshold" is reached, before being forwarded to the authorities or rejected (at which point, where's the privacy if an Apple employee has seen your partner in all their glory?).

Well, unless your partner has the body and face of a child, you should be fine.
 
Man of Honour
Joined
29 Nov 2008
Posts
12,852
Location
London
Yeah hash-matching would seem to be a nonce sense way of going about this.

All it would take is for somebody to change a single pixel and boom, your file hash is different.

It seems to me that they'd want to leverage AI (as everybody does these days) or some other way of spreading the net more widely.

Read the article: the tech accounts for this, so you could blur the photo slightly, change it from colour to B&W, crop it, etc., and the AI still picks it up. There's a tolerance.
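
Roughly how perceptual hashing behaves. As an analogy only, not Apple's actual NeuralHash, the open-source imagehash library produces hashes that move just a few bits under small edits, so matching becomes a distance-under-tolerance test:

Code:
# Rough analogy using the open-source imagehash library (pip install imagehash),
# not Apple's NeuralHash: small edits only nudge the hash a few bits.
from PIL import Image, ImageFilter
import imagehash

img = Image.open("photo.jpg")  # hypothetical example file
edited = img.convert("L").filter(ImageFilter.GaussianBlur(2))  # B&W + slight blur

h_orig = imagehash.average_hash(img)
h_edit = imagehash.average_hash(edited)

# Subtracting two hashes gives the Hamming distance between the 64-bit hashes;
# a "match" is declared when the distance falls under some tolerance.
print(h_orig - h_edit)          # small for near-duplicates
print((h_orig - h_edit) <= 10)  # 10 is an arbitrary illustrative tolerance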
 
Soldato
Joined
22 Nov 2006
Posts
23,376
Sure, but the majority of folk know where they stand with those providers. Apple on the other hand have made themselves the banner for privacy, but are implementing (arguably not for the first time) a tool that goes against that, hence the uproar.

And the uproar isn't unjust considering previous history with tools developed for "good" but which have been used for "bad", e.g. NSO Pegasus.



Or partially matched, and then, from what's being reported, the uploaded material is reviewed by an Apple employee once a "threshold" is reached, before being forwarded to the authorities or rejected (at which point, where's the privacy if an Apple employee has seen your partner in all their glory?).



Got a source to suggest that type of material wouldn't be part of the "database"?

Apple builds their devices in China and makes privacy compromises there for the CCP. They are not even remotely a "banner for privacy". They just pretend to be.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Read the article: the tech accounts for this, so you could blur the photo slightly, change it from colour to B&W, crop it, etc., and the AI still picks it up. There's a tolerance.
Well you/they can't have it both ways.

There are either going to be no false positives because it uses exact hash matching, or it can account for deviations as you say, and then you can guarantee there will be false positives.
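
Back-of-envelope on that trade-off, under the unrealistic simplifying assumption that unrelated images gave independent, uniformly random 64-bit hashes: every bit of tolerance you add widens the accidental-match window.

Code:
# Probability two independent random 64-bit hashes land within Hamming
# distance t of each other: sum over k = 0..t of C(64, k) / 2^64.
# Real perceptual hashes aren't uniform, so treat this as illustrative only.
from math import comb

def collision_prob(t: int, bits: int = 64) -> float:
    return sum(comb(bits, k) for k in range(t + 1)) / 2**bits

for t in (0, 4, 10, 16):
    print(f"tolerance {t:2d}: {collision_prob(t):.3e}")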
 
Soldato
Joined
26 Feb 2007
Posts
8,519
It seems odd to me how much Apple fought the government to stop them being able to access phones, but then decided to run their own scans. I guess it boils down to whether you trust the elected government or the super-corporations more.
 
Soldato
Joined
3 Oct 2007
Posts
12,090
Location
London, UK
Yeah hash-matching would seem to be a nonce sense way of going about this.

All it would take is for somebody to change a single pixel and boom, your file hash is different.

It seems to me that they'd want to leverage AI (as everybody does these days) or some other way of spreading the net more widely.

Anyone with any sense wouldn't have such material on their phone/tablet, full stop. They'd keep it on a device that never goes online. Fortunately a lot of these people are idiots who aren't clever enough to avoid being caught, and every one of them that is caught is one fewer offender on the streets unknown to the authorities.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
Doesn't bother me if they automatically scan cloud data on the server side, but as soon as you start enabling such activity on the client side it's a slippery slope: it becomes hard to argue against increasingly wide measures, and once you let that become normal it will eventually be abused, even with good intentions.
 
Soldato
Joined
22 Nov 2006
Posts
23,376
It seems odd to me how much Apple fought the government to stop them being able to access phones, but then decided to run their own scans. I guess it boils down to whether you trust the elected government or the super-corporations more.

The corporations are WAY worse. Governments are mostly interested in the big-picture stuff and crime; corporations want all of your data for their own gain.

Apple will pass this off as helping the government catch criminals, but will also be spying on people for themselves at the same time.
 

LiE

Caporegime
OP
Joined
2 Aug 2005
Posts
25,640
Location
Milton Keynes
Doesn't bother me if they automatically scan cloud data on the server side, but as soon as you start enabling such activity on the client side it's a slippery slope: it becomes hard to argue against increasingly wide measures, and once you let that become normal it will eventually be abused, even with good intentions.

Does it though? I see many people stating that if they do A then it will lead to XYZ, but where has this happened in Apple's history?

They are implementing a feature that they themselves have publicised; everything else that it may lead to is conjecture.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Doesn't bother me if they automatically scan cloud data on the server side, but as soon as you start enabling such activity on the client side it's a slippery slope: it becomes hard to argue against increasingly wide measures, and once you let that become normal it will eventually be abused, even with good intentions.
It seems we're heading for a more monitored/controlled/authoritarian future whether we like it or not.

I wonder if that's simply an inevitable consequence of the growing human population. We'll be reaching towards 10 billion pretty soon.

Certainly won't be enough govt workers to keep track of everybody, and not keeping track of everybody doesn't seem to be an option. So we're all being conditioned to accept an ever-increasing monitoring function, through all our devices, through cameras, satellites, and everything else.

The future is probably going to have a lot, lot more of this. Half of us don't even seem to care a jot. The conditioning has already worked.
 
Soldato
Joined
3 Jun 2005
Posts
3,066
Location
The South
All it would take is for somebody to change a single pixel and boom, your file hash is different.

Hence fuzzy matching: in essence it's looking for partial matches to pick up "modifications", but that doesn't make it black and white; rather it's (or can be) very grey. And that's why people are questioning it, as what's currently being reported suggests that if a match is made, the material is handed over to Apple, at which point privacy comes into play because it could be entirely innocent.

That was always going to be abused.

Perhaps, but that's not to say tools like this couldn't be abused as well, or their scope increased.

Well, unless your partner has the body and face of a child, you should be fine.

Yes, no, maybe; depends how it works exactly and at which point the material gets fired over to Apple for them to gawp at.

There are either going to be no false positives because it uses exact hash matching, or it can account for deviations as you say, and then you can guarantee there will be false positives.

Exactly. You've got to go fuzzy, otherwise you're "peeing in the wind", but that means an increased number of false positives, at which point an Apple employee may have seen innocent private material, breaching privacy.
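
For what it's worth, the reports say nothing reaches a human until an account racks up a threshold number of matches (Apple reportedly talked about roughly 30). A toy sketch of that gating, with made-up names and numbers:

Code:
# Toy sketch of the reported gating; names and numbers are illustrative.
MATCH_THRESHOLD = 30  # the figure Apple reportedly mentioned

def accounts_due_human_review(match_counts: dict) -> list:
    # Only accounts whose running match count crosses the threshold are
    # ever surfaced to a reviewer; everything below stays unseen.
    return [acct for acct, n in match_counts.items() if n >= MATCH_THRESHOLD]

print(accounts_due_human_review({"alice": 0, "bob": 2, "mallory": 31}))
# ['mallory']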

It seems odd to me how much Apple fought the government...

Whilst also backing down in other areas of privacy, i.e. iCloud encryption.
 
Soldato
Joined
3 Oct 2007
Posts
12,090
Location
London, UK
It seems odd to me how much Apple fought the government to stop them being able to access phones, but then decided to run their own scans. I guess it boils down to whether you trust the elected government or the super-corporations more.

They wanted Apple to put a backdoor in so the authorities could gain access to your devices. That would have been abused by governments worldwide.
 
Soldato
Joined
15 Sep 2008
Posts
2,510
I love how people here are missing the difference between an image and an image hash.

They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.

Some people need to lay off the tinfoil.

Only if the hashes are already in the database.

If this means the complete eradication of all CP material off the face of the internet then it would be a good thing. However, as evil as these pedos are, it may drive them to make new material for which Apple and their associates don't have the file hashes. Heaven forbid. Then you introduce the case for the image scanning etc. that's been discussed.
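
A minimal sketch of that limitation, with made-up entries: the lookup can only ever hit hashes already in the curated database, so genuinely new material has nothing to match against.

Code:
# Made-up entries; the real system matches perceptual hashes, not plain strings.
KNOWN_HASHES = {"9f2c77aa01", "4b8e12cd90"}

def is_known_material(image_hash: str) -> bool:
    return image_hash in KNOWN_HASHES

print(is_known_material("9f2c77aa01"))  # True: already in the database
print(is_known_material("deadbeef00"))  # False: new material, nothing to match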

Apple's AI already scans the images on your device: open your Photos app, select Search and type "dog", and it will show you any images its AI thinks contain a dog.
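
(Under the hood that kind of search is just an on-device image classifier. Apple's actual Photos models are private, so as a rough stand-in, here's the same idea with torchvision's pretrained MobileNet:)

Code:
# Rough stand-in for on-device photo search using torchvision's pretrained
# MobileNet; this is just the general idea, not Apple's implementation.
import torch
from PIL import Image
from torchvision.models import mobilenet_v3_small, MobileNet_V3_Small_Weights

weights = MobileNet_V3_Small_Weights.DEFAULT
model = mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()

batch = preprocess(Image.open("photo.jpg")).unsqueeze(0)  # hypothetical file
with torch.no_grad():
    scores = model(batch).softmax(dim=1)

top = scores.topk(5)
labels = [weights.meta["categories"][int(i)] for i in top.indices[0]]
print(labels)  # e.g. various dog breeds if the photo is of a dog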
 
Soldato
Joined
22 Nov 2006
Posts
23,376
Apple have deliberately made their products inferior in many areas due to not mining users for data.

I don't think leaving out an SD slot and a USB socket, using a super-fragile screen, a non-replaceable battery and a crappy OS is the reason for that :p
 