Apple to scan images for child abuse

  • Thread starter: LiE
Surely the only people worried about this are people with child abuse images on their phones?

I really have no issue with it at all.

Nope, the most vocal are those who think Apple are going to use this as a springboard to scan their entire library and report them to the police for not shopping at government-approved supermarkets... you get the idea.

They forget they could do that today already if they wanted.
 
Surely the only people worried about this are people with child abuse images on their phones?

I really have no issue with it at all.

Not at all. The people raising concerns aren't doing so because they're uploading child abuse pictures to Apple's servers.

You could easily see this kind of technology being abused. A meme on your phone that could be interpreted as a hate crime?

I think this must have come about because Apple have found content on their servers that could ultimately put them in a difficult legal and moral position. From their perspective, this is obviously the legal and moral choice to ensure they remain compliant and are seen to do the right thing (which in this instance obviously is the ‘right’ thing). But it poses serious long-term questions regarding privacy, private ‘property’ even when it’s digital, and access to it by companies, governments and law enforcement.
 
The intentions are good...

Said something similar in the Apple thread, but it's much the same as exploits developed to help governments/agencies "fight crime" that have ultimately led to the snooping on other governments, heads of state, charity workers and journalists, and to attacks on infrastructure.

This is another extremely powerful tool, one that others will copy and that will, unfortunately, get abused.

Apple is all about privacy right now...

If that were truly the case, why don't Apple encrypt iCloud backups, refuse to hand them over to agencies, and stop (allegedly) scanning those backups for illicit material?

And if material is flagged, rightly or wrongly, then it appears Apple has the ability to remotely access it, which doesn't align with the whole "Apple is all about privacy right now..." image.

...it uses CSAM hashes.

It's fuzzy hashing, so the scope isn't as tight as you would think - hence there's a real chance of innocent folk getting caught up in this.
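For anyone wondering what "fuzzy" means in practice, here's a rough sketch using a generic perceptual hash (pHash from the Python imagehash library) rather than Apple's actual NeuralHash - the stored hash value, file name and threshold below are made up for illustration:

```python
# Toy illustration of fuzzy (perceptual) hash matching - not Apple's NeuralHash.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical database entry: the perceptual hash of a known image.
known_hashes = {imagehash.hex_to_hash("d1c48f0a3b5e7290")}

# Hash a photo on the device (file name is just an example).
photo_hash = imagehash.phash(Image.open("photo.jpg"))

# Perceptual hashes are compared by Hamming distance, not exact equality,
# so resized, re-saved or lightly edited copies still land within a few bits.
THRESHOLD = 5  # made-up tolerance; the real system's cut-off isn't public
for known in known_hashes:
    if photo_hash - known <= THRESHOLD:  # subtraction gives the bit difference
        print("possible match - would be queued for further checks")
```

That tolerance is exactly the trade-off being argued about: it's what lets edited copies still match, and it's also where the false-positive worry comes from.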

....yet they don't.

Supposedly it's already deployed on iCloud backups.

Surely the only people worried about this are people with child abuse images on their phones?

Common sense would suggest such material would be stored outside the scope of this "scanner", and/or that they're using other "non-invasive" platforms - much as criminals already do, which is why backdooring encryption is all a bit pointless.
 
I love how people here are missing the difference between an image and an image hash.

They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.
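To make that concrete, here's a toy sketch of what "comparing hashes" actually involves. It uses a plain SHA-256 file digest purely for illustration - the real systems use perceptual hashes so near-duplicates also match, and the digest in the set below is a made-up placeholder:

```python
# Toy illustration: matching against known hashes compares fingerprints,
# never the pictures themselves. (Real systems use perceptual hashes,
# not SHA-256; this just shows the idea.)
import hashlib

# Hypothetical database of digests of known, already-identified images.
known_digests = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# A photo that isn't in the database produces a digest that matches nothing
# and reveals nothing about its content.
if file_digest("photo.jpg") in known_digests:
    print("matches a known image")
```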

Some people need to lay off the tinfoil.
 
They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.
That's a fair point and I have no objection to that (my point about vulnerabilities notwithstanding).

If you're a paedo muppet then how likely would you be to upload those images? Surely you'd just store them offline somewhere though.
 
I love how people here are missing the difference between an image and an image hash.

They know the hashes of KNOWN, distributed child abuse images and are using AI to compare hashes of your own images against those.

Some people need to lay off the tinfoil.

That's just the start. The tech is called NeuralMatch, and they're not going to be discussing everything it can do - they've mentioned hashing for now, but it won't take much to widen what it can do. And when they say "known" it's not quite right: a match means an image's hash is similar to one of those hashes, not identical. It then, apparently, gets reviewed by a human. Imagine if they found some naked celebrity pictures; it doesn't take much imagination to realise that they could be leaked.
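To be fair to Apple, their published description does say nothing reaches a human reviewer until an account crosses a threshold number of matches. A toy sketch of that idea - the threshold, distance cut-off and function names below are mine, not Apple's:

```python
# Toy sketch of "flag for human review only after enough matches".
# All names and numbers here are illustrative, not Apple's implementation.
REVIEW_THRESHOLD = 30   # hypothetical; Apple only said "a threshold" at launch
MAX_DISTANCE = 5        # hypothetical fuzzy-match tolerance in bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hash values."""
    return bin(a ^ b).count("1")

def count_matches(photo_hashes, known_hashes):
    """Count photos whose hash is within MAX_DISTANCE of any known hash."""
    return sum(
        1
        for ph in photo_hashes
        if any(hamming_distance(ph, kh) <= MAX_DISTANCE for kh in known_hashes)
    )

def should_flag_for_review(photo_hashes, known_hashes) -> bool:
    # Nothing is surfaced to a reviewer until the account crosses the
    # threshold - the safeguard meant to keep one-off false positives
    # from ever being looked at.
    return count_matches(photo_hashes, known_hashes) >= REVIEW_THRESHOLD
```

Whether that safeguard survives the kind of scope creep you're describing is, of course, the whole argument.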

First it will scan pictures for nudity, etc. under the guise of stopping paedophiles. It won't take much to extend it to scan documents, emails, notes, etc. in the name of protecting us from terrorism.

In most places, to go through my data or even search my house, you need a warrant. If that is taken away, with no accountability, then we're losing part of our rights. For example, we don't even know if this is admissible in court.

While it might not affect sales too much, it's certainly made me think about my next device. I imagine a lot of people will turn off updates.

M.
 
That's a fair point and I have no objection to that (my point about vulnerabilities notwithstanding).

If you're a paedo muppet then how likely would you be to upload those images? Surely you'd just store them offline somewhere though.

And how did you get them offline?

Don't forget, deleted isn't deleted (in most cases).
 
That's a fair point and I have no objection to that (my point about vulnerabilities notwithstanding).

If you're a paedo muppet then how likely would you be to upload those images? Surely you'd just store them offline somewhere though.

It's done on device, so not in the cloud.
 
A few years ago there was a terrorist incident in the USA - I think it was the San Bernardino shooting. The attacker had an iPhone with data that was encrypted, and a court ordered Apple to build a back door that would open up the phone. Apple refused, citing privacy concerns: once it had been done on the attacker's phone, it would set a dangerous precedent that could then be used on other people's Apple devices too.

Agreed that it's done with good intentions, but I disagree with what Apple are doing now because it contradicts what I wrote here ^^ about privacy. The only difference is that the cloud is now involved. Apple are contradicting themselves.
 
And how did you get them offline?

Don't forget, deleted isn't deleted (in most cases).
I would expect they get sent them, or download them from somewhere, and store them on a USB stick or external hard drive. As long as they avoid Apple's iCloud then they are "safe". They could even use a more secure, encrypted cloud provider.

Delete is indeed delete if you do it properly and don't just rely on your phone/PC delete option. But I would not expect most paedos to know how to do that.
 