I don't know if anyone can answer this definitively, but by all accounts they are taking hashes of known bad images and comparing them against hashes of people's photos on iCloud. They are not "scanning" your photos for content; they are simply running an algorithm that compares the hash of each of your images against a database of hashes of known bad images. They are not doing any image processing per se, or any learning or guessing about what a photo depicts.
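For anyone unfamiliar with how hash matching works, here's a rough sketch in Python. It's just an illustration, not Apple's actual code: it uses an ordinary cryptographic hash (SHA-256) for simplicity, whereas the real system reportedly uses a perceptual hash so that resized or re-encoded copies of an image still match. The database contents and file names here are made up.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known bad images.
# In the real system this list is supplied by an outside
# authority; here it's a hard-coded set for illustration.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flagged_photos(photo_dir: Path) -> list[Path]:
    """Return photos whose hash matches a known bad hash.

    Only fingerprints are compared; the image content itself
    is never inspected or classified.
    """
    return [
        p for p in photo_dir.glob("*.jpg")
        if file_hash(p) in KNOWN_BAD_HASHES
    ]

if __name__ == "__main__":
    for photo in flagged_photos(Path("icloud_photos")):
        print(f"match found: {photo}")
```

Note that with a cryptographic hash like the one above, changing a single pixel produces a completely different digest, which is exactly why a perceptual hash is used in practice. Either way, the matching logic is completely agnostic about what's actually in the database, which is the crux of the problem below.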
As someone else pointed out, however, this is fine in theory. You can have whatever smut you want on your phone as long as it's not a known dangerous image; those photos of yourself or your partner naked won't show up. The problem is that, depending on the country, that database could contain anything. If China decided they wanted to add all photos of Winnie the Pooh to the database, they could, and anyone with such an image in their iCloud would be flagged and potentially arrested.
If that's the case, it doesn't matter what technology they are using; this should be illegal. Giving anyone the power to declare an arbitrary image "dangerous" is very, very bad. I don't know who I would trust to curate that database. Certainly not any government.