Apple to scan images for child abuse

  • Thread starter: LiE
No, the government created a law for "extreme pornography", and lumped stuff like fisting in with bestiality and necrophilia, so they basically criminalised every gay man because that sort of stuff is common on porn websites.

It's an example of how homophobic laws are put in place under the guise of protecting the public.


I think a more reasonable argument you could have used, instead of implying gay men are necrophiliac animal ******s into fisting, would have been “female ejaculation” or squirting, which was banned in pornography as extreme.
 
Context - there are hundreds of images on Google Images of ... Never mind, I don't think I'd be allowed to discuss it here.

I take an image from Google of said practice and say 'this is bad and needs to be banned'.

And apparently I get called a pedo.

The images that would flag are those provided by the NCMEC and their CSAM database. I would be surprised if images found on Google were in this database, especially since the majority of images are already being scanned using this technique (PhotoDNA).
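For anyone wondering what that kind of matching actually looks like, here's a rough sketch in Python. PhotoDNA itself is proprietary, so this substitutes a toy "average hash"; the database entry and threshold below are invented purely for illustration.

```python
# Toy sketch of perceptual-hash matching, in the spirit of PhotoDNA.
# The real algorithm is proprietary; the database entry and threshold
# here are invented for illustration only.
from PIL import Image

def average_hash(path: str) -> int:
    """Shrink to 8x8 greyscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image mean, else 0."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """How many of the 64 bits differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical stand-in for NCMEC's hash database.
known_hashes = {0x8F3A72C10B5E94D6}

def flags_as_known(path: str, threshold: int = 5) -> bool:
    """A photo flags if its hash sits within a few bits of a known
    hash, which lets the scheme survive resizing/re-encoding."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

The key property is that nothing here "looks at" the photo in any human sense; it only asks whether the fingerprint lands close to one already in the database.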
 
I think a more reasonable argument you could have used, instead of implying gay men are necrophiliac animal ******s into fisting, would have been “female ejaculation” or squirting, which was banned in pornography as extreme.

Oddly enough, I remember back when this ban happened.

The reasoning behind it was 100% 'Women against porn'. All the arguments were about 'sexualizing / normalizing extreme sexual practice against women'.

Many of the comments that were used in passing the laws were that 'this only happens to women in porn and never to men'.

Basically, all the people involved in this only looked at examples of straight porn. Far worse 'dehumanization'-level crap already exists in gay porn, except none of the people involved in passing this law were gay, and none had ever seen any such examples before deciding to ban 90% of gay porn based on the hurt feelings of a few extreme feminists.
 
I have a question on consent.


A parent takes a photo of their underage child; it triggers a hash match and is reviewed by an Apple employee.

The employee quickly goes “nope, just a family photo, not abuse” and dismisses it.

Does that child, when they become an adult, have the right to sue Apple and the employee for viewing them naked without their consent?




Snowden showed that NSA employees regularly abused their powers to spy on their partners.


If, in future, a former Apple employee who worked on this system is found to be an abuser too, what system will be in place for compensating every last person who may have had their images viewed by this person without their consent?
 
The images that would flag are those provided by the NCMEC and their CSAM database. I would be surprised if images found on Google were in this database, especially since the majority of images are already being scanned using this technique (PhotoDNA).

But the images in question are simply 'borderline' and do not actually contain 'genitalia', but depict what is happening. This has been branded as still being 'pedophilia' and banned by the vast majority of websites, even though Google actually allows the images to remain hosted in the first place.

Which raises the question: if such a practice is legal and routinely carried out daily across all medical institutions, why is discussion and opposition of that practice shut down with accusations of pedophilia?

If curious, look up how infant boys catch STDs during circumcision.
 
The NCMEC aren't law enforcement.

Yes, two non-law-enforcement agencies makes it better, I suppose?


Oddly enough, I remember back when this ban happened.

The reasoning behind it was 100% 'Women against porn'. All the arguments were about 'sexualizing / normalizing extreme sexual practice against women'.

Many of the comments that were used in passing the laws were that 'this only happens to women in porn and never to men'.

Basically, all the people involved in this only looked at examples of straight porn. Far worse 'dehumanization'-level crap already exists in gay porn, except none of the people involved in passing this law were gay, and none had ever seen any such examples before deciding to ban 90% of gay porn based on the hurt feelings of a few extreme feminists.


Not really, straight porn is way weirder than gay porn. Don't worry, I've checked.
 
A parent takes a photo of their underage child; it triggers a hash match and is reviewed by an Apple employee.

The employee quickly goes “nope, just a family photo, not abuse” and dismisses it.

Does that child, when they become an adult, have the right to sue Apple and the employee for viewing them naked without their consent?

That isn't how the technology works though. The hash is a unique digital fingerprint for an exact photo that NCMEC have added to their hash database.
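On that reading, the check would be nothing more than a set lookup. A minimal sketch, with hashlib standing in for whatever (proprietary) fingerprint function the database actually uses:

```python
# What *exact* matching would look like: a straight set lookup.
# hashlib.sha256 stands in for the real fingerprint function.
import hashlib

known_fingerprints: set[str] = set()  # would be loaded from the NCMEC database

def is_exact_match(image_bytes: bytes) -> bool:
    # Changing even one byte of the file produces a completely
    # different digest, so this only ever catches identical files.
    return hashlib.sha256(image_bytes).hexdigest() in known_fingerprints
```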
 
Yes, two non-law-enforcement agencies makes it better, I suppose?

Not really, straight porn is way weirder than gay porn. Don't worry, I've checked.

You haven't checked hard enough. Back on Reddit, before all the slightly useful subreddits were removed, we had proper comparisons being made.

Obviously I can't say anything in detail about it on a family-friendly forum.
 
That isn't how the technology works though. The hash is a unique digital fingerprint for an exact photo that NCMEC have added to their hash database.


But Apple have repeatedly said close matches, not exact matches.

If it's exact, why have a person check?

And again, everything posted says the hashes are not truly unique, as they can account for edited pictures.
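A quick way to see why "exact fingerprint" and "survives edits" can't both be true (the byte strings below are stand-ins, not real image data):

```python
# Why "close match, not exact match": a one-byte edit gives a
# completely unrelated SHA-256 digest, so exact hashing cannot
# survive edited pictures. (Byte strings stand in for photos.)
import hashlib

original = b"...the same photo..."
edited   = b"...the same photo.!."   # one byte tweaked

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())
# The two digests share essentially nothing. A perceptual hash of two
# such near-identical images would differ by only a bit or two, which
# is why the system works on distance thresholds, and why borderline
# scores end up in front of a human reviewer.
```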


Houses don't get knocked down just because a pedo used to live there. Kids are pretty similar looking: new child, same bathtub, etc.?

Apple also probably didn't think there would be a government-funded hack of their OS that just required you to receive an iMessage either, did they?
 
Personal view... the idea is good in theory (protect children and all that), but the entire thing is the start of an incredibly slippery slope in terms of invasion of privacy and who actually controls what is on your phone/PC, etc.

Also, the irony isn't lost on me regarding their current stance on 'privacy' and their 1984 advert about 'big brother', etc.
 
Thoroughly vetted members of law enforcement should be doing it.

You still need an internal team to deal with that material in the first instance though, otherwise how else are police or other authorities/organisations supposed to take action?
 
You still need an internal team to deal with that material in the first instance though, otherwise how else are police or other authorities/organisations supposed to take action?


Any positive result sent via automation to the authorities' team?

Otherwise, there really needs to be a system in place for every child to be able to sue for compensation for their images being viewed by a civilian.

There needs to be a massive deterrent for the company to avoid abuse. The cost of failure needs to be orders of magnitude greater than the cost of enforcing correct controls, so the equation is never 'it's cheaper to pay off and have lax enforcement'.
 
Any positive result sent via automation to the authorities' team?

That's a terrible idea; you'd just be shifting all responsibility for dealing with the issue away from the company/platform hosting it. They have a responsibility to develop systems and processes to identify IIOC (indecent images of children) and other material and to investigate where and how it's appearing on their platform, and they can't do that if all they do is forward everything to the police or another organisation.


Otherwise, there really needs to be a system in place for every child to be able to sue for compensation for their images being viewed by a civilian.

I'd imagine the terms of use of these systems/services have already anticipated that.
 
I think a more reasonable argument you could have used, instead of implying gay men are necrophiliac animal ******s into fisting, would have been “female ejaculation” or squirting, which was banned in pornography as extreme.

How on earth did you get that from my post! :eek:

The point was, the government are putting mainstream sexual activity between adults on the same level as bestiality, and there seems to be some homophobic intent with regard to fisting.
 
How on earth did you get that from my post! :eek:

The point was, the government are putting mainstream sexual activity between adults on the same level as bestiality, and there seems to be some homophobic intent with regard to fisting.


The association comes from the fact that, in gay men, it's more likely to be anal fisting than vaginal fisting.

It carries increased infection risks due to perforation; that's why it's asked about at sexual health screenings for gay men but not for straight men and women.

Straight people probably fist each other in higher absolute numbers than gay people, due to the percentage differences.

Like how white people are more likely to be “anything” in the UK than black people.
 
That's a terrible idea; you'd just be shifting all responsibility for dealing with the issue away from the company/platform hosting it. They have a responsibility to develop systems and processes to identify IIOC (indecent images of children) and other material and to investigate where and how it's appearing on their platform, and they can't do that if all they do is forward everything to the police or another organisation.




I'd imagine the terms of use of these systems/services have already anticipated that.


So, again, you want private employees looking at innocent pictures of people's children.

Yes, I'm absolutely sure they absolve themselves of all liability completely.

That is the problem.

And this is ignoring the hash list being abused: security forces have regularly forced companies to produce dodgy RNGs for the purposes of making back doors, etc.

What are the chances suspected intelligence officers' faces end up on this list in future?
 
If a strong enough match is flagged, then Apple staff will be able to manually review the reported images

I think the term 'strong enough match' makes it clear they will be using some kind of AI; if they were solely comparing hashes, then either it's a match or it isn't. Seems like a privacy (and security) nightmare that will do absolutely nothing to catch child predators now that they've announced it to the world. Imagine an Apple employee browsing through your files because you took a picture of your toddler in the bath. Not that I was particularly interested in their overpriced phones before, but I certainly wouldn't go anywhere near them now.

Example from Apple's explanation of the technology:

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit.

So definitely not just hashing, and definitely a slippery slope. There's been many an oppressive instrument introduced under 'think of the children', 'terrorism' or similar. Most of the rest of my arguments/objections have been raised already. If the Snowden leaks didn't open eyes and minds I don't know what will. Seriously reconsidering my new M1 MacBook Pro and iPhone 13 Pro Max (or equivalent) if this goes ahead.
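As for what 'strong enough match' plausibly means in a neural system, here's a sketch; Apple haven't published technical details at this point, so the embedding model, database and threshold below are all invented:

```python
# Sketch of threshold-based "neural matching". Everything here is
# invented: Apple's actual NeuralHash details aren't public.
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Stand-in for a neural network mapping an image to a unit
    vector; a real system would run a trained CNN here."""
    rng = np.random.default_rng(int(image.sum()) % 2**32)
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

def is_strong_match(image: np.ndarray,
                    db_embeddings: np.ndarray,
                    threshold: float = 0.95) -> bool:
    """'Strong enough match' = cosine similarity above a cutoff
    against any vector derived from the known-CSAM corpus."""
    v = embed(image)
    sims = db_embeddings @ v  # rows are unit vectors, so these are cosines
    return bool(sims.max() >= threshold)
```

The threshold is exactly where the toddler-in-the-bath worry lives: set it loose and innocent photos get pushed to a human reviewer; set it tight and edited copies slip straight through.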

As Jordan Peterson said (in regard to stop and search/knife crime, paraphrased), it's such a low-probability event that the powers exercised in the so-called prevention/detection of it far exceed the possible benefit of implementing such an invasion of privacy. Anyone actually wanting to download, view or trade in such sick filth will no doubt continue using a throwaway Android device with a custom ROM. Meanwhile, privacy continues to circle the drain and authoritarian regimes (including the UK) wet themselves with glee at the potential for widening the scope of the implementation for 'security'.

'They' are already trying to ban E2EE, private messaging and cryptocurrency, and now we have on-device surveillance (above and beyond the mass surveillance that already takes place). Soon people will be forming queues, happy to get their state-approved device with malware pre-installed for your convenience. They have nothing to hide, after all... :rolleyes: Meanwhile, ministers are squirrelled away on Matrix/Element, Signal and such to keep their own perversions and corruptions 'private'.

Edit: The Register have an article on it now.

El_Reg said:
Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, its potential for misuse is vast based on details entering the public domain.

The neural network-based tool will scan individual users' iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today.

Rather than using age-old hash-matching technology, however, Apple's new tool – due to be announced today along with a technical whitepaper, we are told – will use machine learning techniques to identify images of abused children.

"What I know is that it involves a new 'neural matching function' and this will be trained on [the US National Centre for Missing and Exploited Children]'s corpus of child sexual abuse images. So I was incorrect in saying that it's a hash function. It's much more powerful," said Green, who tweeted at length about the new initiative overnight.

"I don't know exactly what the neural network does: can it find entirely new content that "looks" like sexual abuse material, or just recognize exact matches?" the US Johns Hopkins University academic told El Reg.

Thinking on, will this even be allowed in Europe due to GDPR et al.?
 