Apple to scan images for child abuse

  • Thread starter: LiE
That would be my first concern, followed by the mission creep, and later by AI profiling to catch people who might be bad or think the wrong things.

And there will be cases where kids take selfies that are explicit and get imprisoned for it 5 years later.

Or you take a photo but there is something going on in the background which you didn't notice, because it's like 100 meters away. The artificial unintelligence picks it up and reports you.
 
I did exactly that when I was 15; no teen knows that sexting at that age is illegal.
From what I understand there has been a fairly big push to inform teens of the risks (both legal and blackmail) of "sexting" as part of sex education etc.

If your age is what I think it is, there is little chance of you sexting when you were 15, given the phones at the time were big, bulky, very expensive, and didn't IIRC do texts.
 
From what I understand there has been a fairly big push to inform teens of the risks (both legal and blackmail) of "sexting" as part of sex education etc.

If your age is what I think it is, there is little chance of you sexting when you were 15, given the phones at the time were big, bulky, very expensive, and didn't IIRC do texts.

I’m not even 35, and you were considered a mad baller at school if your phone had a camera that produced a blurry, pixelated mess of a photo.
 
From what I understand there has been a fairly big push to inform teens of the risks (both legal and blackmail) of "sexting" as part of sex education etc.

If your age is what I think it is, there is little chance of you sexting when you were 15, given the phones at the time were big, bulky, very expensive, and didn't IIRC do texts.

I picked up a Philips 535 around 2003, about the first time phones really started doing cameras at sort of reasonable prices. IIRC it was a £140 phone then; there were a couple of models in the year or two before that, but they were crazy expensive and not good.

I wasn't in a rush to get a mobile phone even when other people started getting them. It wasn't until a mixture of my mum being seriously ill in hospital (so I wanted to be contactable) and phones becoming usable as a camera, MP3 player and basic web browser that I came around to getting one.
 
If your age is what I think it is, there is little chance of you sexting when you were 15, given the phones at the time were big, bulky, very expensive, and didn't IIRC do texts.

Of course he didn't. It's like everything else he posts, from his pretending to suffer from all manner of maladies to his imaginary degree, complete and utter bullflop.
 
If little Jonny is sending texts to little Jenny and vice versa, then the solution is probably not to make them both sex criminals. But that does seem to be the way the law is right now, rightly or wrongly.

It's crazy: the law is supposed to protect kids, but it actually punishes the victim of the crime!
 
Can't wait until all of the whistleblowers start getting rounded up for investigative journalism, as opposed to being a mouthpiece for increasingly authoritarian governments.
 
it obviously won't stop here.

I bet loads of pics get falsely flagged by people without kids.....

I have pics on my phone of a baby covered in red blotches almost all over his body; to an idiot it looks like bruising.....

In reality it was a skin condition, and we were trying different milk formulas, special washing detergents and softeners etc., as well as getting medical help from our family doctor for months.....


Anyway, next it will be photos of anything against the law being scanned; in 50-100 years they will be scanning your brains.


Any attractive people will suddenly be getting randos asking if they want to go on a date, because that's the level of professionalism you can expect these days.


It won't prevent child abuse anyway; people will avoid it, and the law-abiding citizens will be the ones being monitored.
 
It's crazy: the law is supposed to protect kids, but it actually punishes the victim of the crime!
Well, there aren't really any victims if no coercion is involved and both were willing participants in sending and receiving those images.

And ironically if the images simply remained on their devices, there would be no victims from the unwanted distribution of the images. But if Apple decides to upload them all to the cloud for processing and criminal investigation...
 
This is not going to end well.

Do I want sick ***** to be caught and dealt with? yes. Am I seriously concerned with dumb ass AI deeming something on my phone as a problem? Hell yes.

I already see crap flagged on FB for stuff it isn't, so why should I believe Apple will be any different? Also... context. For example, a paintball prop is a paintball prop, but to some dumb ass without context it may look like something suspicious. And I have pictures of my kid, or of MYSELF as a kid, on my phone; heck, what's it going to do with those?

As a parent, I want all kiddy fiddlers to burn in hell, but this is dangerous.
 
They are not scanning your images, only comparing hashes against known CSAM when uploading to iCloud. You need to have a collection of known CSAM to get flagged. Known CSAM is material collected by law enforcement from those involved in distributing it or caught in possession of it.

Pretty much all the big tech firms already do this when you upload images to their services, which you probably do already.
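To put the idea into a rough sketch (illustrative only: the real system uses a perceptual hash, Apple call theirs NeuralHash, rather than a plain SHA-256, the hash list and threshold below are made up, and the actual matching goes through blinded safety vouchers rather than a simple lookup):

```python
# Minimal sketch of hash-list matching, assuming a made-up hash database
# and threshold. The point is that images are compared against a fixed list
# of known hashes, not "looked at" for content.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known CSAM supplied by law enforcement.
KNOWN_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}
MATCH_THRESHOLD = 30  # illustrative: an account is only flagged after many matches

def image_hash(path: Path) -> str:
    """Hash the raw bytes of an image queued for upload."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(queued: list[Path]) -> int:
    """Count how many queued images match the known-hash list."""
    return sum(1 for p in queued if p.exists() and image_hash(p) in KNOWN_HASHES)

queued_for_icloud = [Path("holiday.jpg"), Path("baby_rash.jpg")]
if count_matches(queued_for_icloud) >= MATCH_THRESHOLD:
    print("Account flagged for human review")
else:
    print("Nothing flagged - no collection of known material present")
```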

I just don’t get how people continue to misunderstand the fundamentals even after they’ve been explained multiple times.
 
No, I get the argument; I’m not completely naive. But until they actually do something beyond this feature, it’s speculation at best, through to complete FUD or conspiracy theories at worst.

Unlike some people I seem to be able to tell the difference between reality and what might happen and don’t try and conflate the two into a false narrative.

I’m also not so buried in some false reality that I don’t realise all the big tech companies already scan any image you put into the cloud for CSAM.

There is a reason why I can type "dog" into Google Photos and any picture I have taken of a dog shows up. No one seems to bat an eyelid over that, but it goes well beyond what Apple are doing here.
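For what it's worth, that kind of keyword search only works because every photo is run through a classifier and the predicted labels are indexed, which is far more content analysis than a hash lookup. A minimal sketch of the idea, with classify_image() as a made-up stand-in for whatever model the provider actually runs:

```python
# Toy sketch of label-based photo search: each image gets classified and the
# resulting labels are indexed so they can be searched by keyword.
from collections import defaultdict

def classify_image(path: str) -> list[str]:
    # Placeholder: a real service runs a neural network here and returns
    # predicted labels such as ["dog", "beach", "person"].
    return ["dog"] if "dog" in path else ["unknown"]

def build_search_index(photo_paths: list[str]) -> dict[str, list[str]]:
    index: dict[str, list[str]] = defaultdict(list)
    for path in photo_paths:
        for label in classify_image(path):
            index[label].append(path)
    return index

index = build_search_index(["dog_park.jpg", "receipt.jpg"])
print(index["dog"])  # -> ['dog_park.jpg']
```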
 
I just don’t get how people continue to misunderstand the fundamentals even after they’ve been explained multiple times.
There is a reason why I can type "dog" into Google Photos and any picture I have taken of a dog shows up. No one seems to bat an eyelid over that, but it goes well beyond what Apple are doing here.

Yet people are still ignoring the fact Apple are bringing this client-side rather than keeping it server-side, hence the uproar around privacy rather than "because Apple whaa whaa".

Again, if this "scanning" only occurs (as Apple have said themselves) when images/data are marked for upload to iCloud, then why can't Apple continue with CSAM scanning in iCloud? Why do they need to bring it onto the device?

Similarly, Apple said in the past (2016) that they wouldn't build a tool for agencies to gain access to Apple devices, as they deemed it too dangerous: "...while the government may argue that its use would be limited to this case, there is no way to guarantee such control." (https://www.apple.com/customer-letter/)
Yet they're quite happy to deploy a powerful tool onto users' devices that arguably could be abused by governments ¯\_(ツ)_/¯

And on top of that, Apple have had to bow down and forgo their privacy beliefs to get their devices sold in certain countries and regimes, e.g. China: https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html

Unlike some people I seem to be able to tell the difference between reality and what might happen and don’t try and conflate the two into a false narrative.

I think it's pretty clear, and rightly so, why a lot of the industry are questioning this addition. But sure, you ignore the facts and Apple's questionable side and you carry on guzzling down on Apple's "Kool-Aid"....

 
Am I seriously concerned with dumb ass AI deeming something on my phone as a problem? Hell yes.
Yeah, the AI is still a long way from being usable for something like this, given you can put a bit of text in front of an object and the AI will prioritise the text in its analysis.
If you tell it the apple is a banana, it takes it as fact.
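The apple/banana point echoes the typographic attacks that have been reported against image models like CLIP: stick a handwritten label on an object and the model reads the label instead of the object. A deliberately silly sketch of that failure mode, with both helper functions as made-up stand-ins rather than a real vision model:

```python
# Toy illustration of the "text in front of an object" failure mode.
def detect_text(image: dict) -> str | None:
    return image.get("sticker")  # e.g. a handwritten "banana" label

def visual_features(image: dict) -> str:
    return image.get("object", "unknown")

def classify(image: dict) -> str:
    text = detect_text(image)
    # A model that has learned to trust written words over pixels will
    # report the label instead of the object it is stuck to.
    return text if text is not None else visual_features(image)

print(classify({"object": "apple", "sticker": "banana"}))  # -> banana
```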
 
This is not going to end well.

Do I want sick ***** to be caught and dealt with? yes. Am I seriously concerned with dumb ass AI deeming something on my phone as a problem? Hell yes.

I already see crap flagged on FB for stuff it isn't, so why should I believe Apple will be any different? Also... context. For example, a paintball prop is a paintball prop, but to some dumb ass without context it may look like something suspicious. And I have pictures of my kid, or of MYSELF as a kid, on my phone; heck, what's it going to do with those?

As a parent, I want all kiddy fiddlers to burn in hell, but this is dangerous.
Just to add to this, all you need to do is look at YouTube. The AI frequently gets things wrong and causes issues for users and creators.
 