Oh wow! Really?!
No, he just likes to make **** up in an attempt to turn every thread around to be about him. He's not right in the head.
Oh wow! Really?!
So definitely not just hashing
The association comes from the fact that in gay men it’s more likely to be anal fisting than vaginal.
It carries increased infection risks due to the chance of perforation, which is why it’s asked about at sexual health screenings for gay men but not for straight men and women.
Straight people probably fist each other in higher numbers than gay people do, simply because of the percentage differences.
Like how white people are more likely to be “anything” in the UK than black people.
So again, you want private employees looking at innocent pictures of people’s children?
That’s referring to the other feature under Communication Safety, which is separate to the CSAM hashing.
https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/
Oh dear, ok.
No, he just likes to make **** up in an attempt to turn every thread around to be about him. He's not right in the head.
Oh wow! Really?!
That isn't how the technology works though. The hash is a unique digital fingerprint for an exact photo that NCMEC have added to their hash database.
Matching a "unique digital fingerprint for an exact photo" is not what's being proposed.
That’s what Apple are proposing. Can you point me to where they are lying and actually doing something different?
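For what it's worth, the thing actually being argued here is cryptographic vs perceptual hashing. A cryptographic hash only matches a byte-identical file, while a perceptual hash (Apple's is called NeuralHash) is meant to survive resizing and re-encoding so it still matches "the same photo". Here's a rough toy sketch of the difference, with a crude average-hash standing in for the real thing, which isn't public:

```python
# Toy contrast between an exact-match fingerprint and a perceptual hash.
# The "average hash" below is NOT Apple's NeuralHash, just the simplest
# possible perceptual-style hash so the idea is concrete.
import hashlib

def sha256_fingerprint(pixels):
    """Cryptographic hash: identical only for byte-identical input."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Crude perceptual hash: 1 bit per pixel, set if brightness >= mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

original = [200, 198, 60, 58, 199, 201, 61, 59]   # pretend 8-pixel image
tweaked  = [200, 197, 60, 58, 199, 201, 61, 59]   # one pixel nudged by 1

print(sha256_fingerprint(original) == sha256_fingerprint(tweaked))  # False
print(average_hash(original) == average_hash(tweaked))              # True
```

So "unique digital fingerprint for an exact photo" is a reasonable plain-English gloss, but the match is on perceptual similarity to the known image rather than on the exact file bytes.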
The association comes from the fact that in gay men it’s more likely to be anal fisting than vaginal.
It carries increased infection risks due to the chance of perforation, which is why it’s asked about at sexual health screenings for gay men but not for straight men and women.
Wow!
Not like legally.
I posted 'religious' circumcision pics, found via a Google search, on one of those 'free speech' sites, calling it out as a bad thing (i.e. the type that causes babies to die from STDs). The pics were taken from normal news articles calling out the practice, and the whole thing was advocating stopping it.
So I get banned for posting 'pedophilia'.
Wow!
Can someone hold my surprise? It's not heavy.
BallistixOnZ490 said: <more inflammatory guff> Also while I was banned on one such site ...
I know, the true child abusers are the Jews and Muslims practicing such horrific acts! How can such religious fanatic groups expect to be accepted into modern western society when they brutalise children?
Shows you the priorities of these corporations.
https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf
CSAM detection
Does this mean Apple is going to scan all the photos stored on my iPhone?
No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.
Will this download CSAM images to my iPhone to compare against my photos?
No. CSAM images are not stored on or sent to the device. Instead of actual images, Apple uses unreadable hashes that are stored on device. These hashes are strings of numbers that represent known CSAM images, but it isn’t possible to read or convert those hashes into the CSAM images they are based on. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos.
Why is Apple doing this now?
One of the significant challenges in this space is protecting children while also preserving the privacy of users. With this new technology, Apple will learn about known CSAM photos being stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not learn anything about other data stored solely on device.
Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.
Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?
No. The system is designed to be very accurate, and the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year. In addition, any time an account is flagged by the system, Apple conducts human review before making a report to NCMEC. As a result, system errors or attacks will not result in innocent people being reported to NCMEC.
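To make the mechanics in that FAQ a bit more concrete, here's a stripped-down sketch of the flow it describes: the device holds a set of opaque hashes of known CSAM, hashes each photo headed to iCloud Photos, and nothing reaches human review until an account crosses a match threshold. It skips all the real cryptography (NeuralHash, blinded hashes, private set intersection, threshold secret sharing), and every name and number in it is made up for illustration:

```python
# Cartoon of the on-device matching flow described in Apple's FAQ above.
# All values are illustrative; the real system uses encrypted "safety
# vouchers" rather than exposing a plain match count below the threshold.
import hashlib

def image_hash(photo_bytes: bytes) -> str:
    # Stand-in so the sketch runs; the real system uses a perceptual hash
    # (NeuralHash), not a cryptographic one like SHA-256.
    return hashlib.sha256(photo_bytes).hexdigest()

# Per the FAQ, the same hash set ships inside the OS on every device, so it
# can't be tailored to a single user. These "known" images are just stand-ins.
KNOWN_HASHES = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

MATCH_THRESHOLD = 3   # demo value only; the FAQ just says "a collection"

def matches_in_upload(photos: list[bytes]) -> int:
    """Count photos whose hash is in the known set."""
    return sum(image_hash(p) in KNOWN_HASHES for p in photos)

def flagged_for_human_review(photos: list[bytes]) -> bool:
    """Only accounts with at least MATCH_THRESHOLD matches are surfaced."""
    return matches_in_upload(photos) >= MATCH_THRESHOLD

library = [b"holiday", b"cat", b"known-image-1", b"dog"]
print(matches_in_upload(library), flagged_for_human_review(library))  # 1 False
```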
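And the "one in one trillion per year" figure is basically what you get when a tiny per-photo false-match rate is combined with an account-level threshold. Back-of-envelope with made-up numbers (the FAQ gives neither the per-photo rate nor the threshold):

```python
# Rough shape of the maths behind an account-level false-positive claim.
# p_false_match and the threshold below are guesses for illustration only.
from math import exp, factorial

def flag_probability(n_photos, p_false_match, threshold, extra_terms=50):
    """Poisson approximation to P(at least `threshold` false matches among
    n_photos); terms beyond threshold + extra_terms are negligible here."""
    lam = n_photos * p_false_match          # expected number of false matches
    return sum(exp(-lam) * lam ** k / factorial(k)
               for k in range(threshold, threshold + extra_terms))

# e.g. 20,000 photos uploaded in a year, a 1-in-a-million per-photo false
# match rate, and a hypothetical threshold of 30 matches before flagging:
print(flag_probability(20_000, 1e-6, 30))   # ~4e-84, far below 1e-12
```

Even with deliberately pessimistic guesses the account-level number collapses to something absurdly small, which is presumably how Apple arrives at a figure like one in a trillion before human review is even considered.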
Maybe don't suggest bringing back Hitler to random people?