Apple to scan images for child abuse

  • Thread starter: LiE
That sounds like you are hiding homophobic images?

No, the government created a law for "extreme pornography", and lumped stuff like fisting in with bestiality and necrophilia, so they basically criminalised every gay man because that sort of stuff is common on porn websites.

It's an example of how homophobic laws are put in place under the guise of protecting the public.
 
I read somewhere that retention rates and staff wellbeing are awful within the departments that manually review abuse, beheadings and hate crimes on social media.

Pay is high but most people don't last more than a few months. Plus they have to do things like mandatory counselling. I know someone who did it for the police for a while.

He made just enough to move house, pay off his mortgage and then quit.
 
I'm just watching the Barnacules Nerdgasm techtalk and he's talking with others about this and the potential for exploits.

I predict innocent people are going to get caught up in this.

Here is a link to where they start talking about the subject: https://youtu.be/kHchl9_ggEQ?t=3640
 
Pay is high but most people don't last more than a few months. Plus they have to do things like mandatory counselling. I know someone who did it for the police for a while.

He made just enough to move house, pay off his mortgage and then quit.

Sounds like a great job for me actually, but how much does it pay?
 
Sounds like a great job for me actually, but how much does it pay?

I think he was getting about £60k around 5-6 years ago.

Apparently it's much more automated now, and after all the filtering they don't end up having to check many images manually, but someone still has to do it.
 
£60k for doing nothing but sitting on your arse looking at images and watching video? Where do I sign up?

Most people are not well adjusted to dealing with that kind of thing - either they're just unprepared to deal with it or, at the other extreme, they get off on it :(

Some people think they can handle it but never get over what they've had to deal with, to the point that some even commit suicide.
 
£60k for doing nothing but sitting on your arse looking at images and watching video? Where do I sign up?

I knew someone who did it for the police. For six months. That was as much as they could take, even with psychological support. The police often see things that shouldn't happen, but this is a different level. You'd have to be a bona fide sociopath to not be affected. Granted, they had to study the material in detail to find clues for an investigation (e.g. who, where, etc) and to be able to testify in a trial. So it was worse. But even just seeing enough to be able to classify it as child porn or not means seeing things that shouldn't be and which you can't unsee. I wouldn't do it for £600K a year, let alone £60K.
 
No private company should be searching through your private property, regardless of the intentions.

If they want to search their own online cloud storage then I care less, but someone's phone content, no. It's something that should never be available to them to use.
 
How easy is it to fake an image hash?

Like, can I make a picture of something (or just random static) that has a similar hash to a known abuse image?

Could be an ace way to get access to anyone's phone or to discredit them - just send them an innocuous image that flags the system.
 
How easy is it to fake an image hash?

Like, can I make a picture of something (or just random static) that has a similar hash to a known abuse image?

Could be an ace way to get access to anyone's phone or to discredit them - just send them an innocuous image that flags the system.

Have a read of this: https://inhope.org/EN/articles/what-is-image-hashing

Image hashing is the process of using an algorithm to assign a unique hash value to an image. Duplicate copies of the image all have the exact same hash value. For this reason, it is sometimes referred to as a 'digital fingerprint'.


How is image hashing used in the fight against Child Sexual Abuse Material?
Hashing is a powerful tool used by hotlines, Law Enforcement, Industry and other child protection organisations in the removal of Child Sexual Abuse Material (CSAM). This is because it enables known items of CSAM to be detected and removed without requiring them to be assessed again by an analyst.

Because we know that once CSAM exists online it is often shared thousands of times, using hashing technology has an enormous impact. It reduces the workload and emotional stress for analysts and law enforcement of reviewing the same content repeatedly, and reduces the harm to the victim by minimizing the number of people who witness the abuse.


What about if the image is edited?
In earlier versions of hashing technology, if an image underwent very minor alterations, such as being cropped or changed to black and white, then each edited version of the image would be assigned a different hash value. This made using the technology to help remove known CSAM much less effective.

However, in 2009, Microsoft in collaboration with Dartmouth College developed PhotoDNA. PhotoDNA uses hash technology but with the added ability that it 'recognises' when an image has been edited so still assigns it the same hash value. Learn more about PhotoDNA here.


Does image hashing affect my privacy?
No. Many platforms use hash technology to detect known CSAM in order to remove it from their platforms. This does not violate users' privacy because the technology only detects matching hashes and does not 'see' any images which don't match the hash. Hash values are also not reversible, so cannot be used to recreate an image.

Learn more about how Artificial Intelligence is used in the fight against CSAM here.
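
To make the article's distinction concrete, here's a rough Python sketch. The "exact duplicate" half uses SHA-256 purely as an illustration (the article doesn't name an algorithm), and the perceptual half is a toy average hash rather than PhotoDNA, which is proprietary - it just shows why a lightly edited copy can still match while an exact byte-level hash cannot.

```python
import hashlib

# --- Exact "digital fingerprint" hashing (the first kind the article describes) ---

def exact_hash(data: bytes) -> str:
    """Byte-for-byte fingerprint: identical copies match, any edit changes everything."""
    return hashlib.sha256(data).hexdigest()

# --- Toy perceptual hash (a stand-in for the PhotoDNA idea, not the real algorithm) ---

def average_hash(pixels):
    """One bit per pixel: set if the pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two perceptual hashes (0 = identical)."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 greyscale "image" and a slightly brightened copy of it (a minor edit).
image      = [[ 10,  20, 200, 210],
              [ 15,  25, 190, 220],
              [200, 210,  10,  20],
              [190, 220,  15,  25]]
brightened = [[min(255, p + 30) for p in row] for row in image]

raw        = bytes(p for row in image for p in row)
raw_edited = bytes(p for row in brightened for p in row)

# Exact hashing: the edit produces a completely different fingerprint.
print(exact_hash(raw) == exact_hash(raw_edited))               # False

# Perceptual hashing: the edited copy still lands on (or very near) the same hash.
print(hamming(average_hash(image), average_hash(brightened)))  # 0
```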
 


Yes, I know what it is. But what I'm saying is: can you fake it if you have a known hash?

Hash "collisions" exist, so it is possible to have two images with the same (or a similar enough) hash, right?

I'm just curious whether there is a way to spoof the hash that gets checked. I.e. is it a tag or something editable, or will the AI be constantly analysing the full image in detail? Like how you can hide text or a second image inside a digital image?

In which case, how much of my device's memory and processing power is going to be taken up?

I'm not talking about accidentally. I mean: with the full force of knowledge, skill and malice, can someone make an image that will reliably trigger this AI hashing system?

Then, say, pop it in a banner advert and mess with millions of people, or target a specific individual with a message?

Pegasus just showed us that iOS can be hacked without user intervention, and it was targeted against political figures, human rights activists and journalists.

I can't think of a better way of discrediting a reporter investigating you than having them labelled a pedophile.
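
To put what I mean in concrete terms: as I understand it, a perceptual-hash matcher doesn't look for a byte-for-byte match, it checks whether a photo's hash falls within some distance of an entry on a list of known hashes, roughly like the sketch below. The hash width, the threshold and the hash values themselves are made-up assumptions, not anything Apple has published - but it shows why a deliberately crafted image would only need to land inside that threshold to trigger a match.

```python
# Rough sketch of the kind of on-device matching being discussed: compare a
# perceptual hash of each local photo against a list of known hashes and flag
# anything within a small Hamming distance. The 16-bit hashes, the threshold
# and the values below are illustrative assumptions, not Apple's actual system.

KNOWN_BAD_HASHES = {0b1011_0110_0101_1100}   # placeholder value, not a real hash
MATCH_THRESHOLD = 2                          # assumed: allow a couple of differing bits

def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def is_flagged(photo_hash: int) -> bool:
    """True if the photo's hash is 'close enough' to any known hash."""
    return any(hamming(photo_hash, bad) <= MATCH_THRESHOLD for bad in KNOWN_BAD_HASHES)

print(is_flagged(0b1011_0110_0101_1100))  # True  - exact match
print(is_flagged(0b1011_0110_0101_1110))  # True  - one bit off, still within threshold
print(is_flagged(0b0100_1001_1010_0011))  # False - unrelated hash
```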
 