The proliferation of end-to-end encryption in the civilian sector causes the secret police of the world deep frustration. Possessing at least the ability to read the contents of every email has been the stated policy of the American state since the Clinton administration. The FBI's inability to crack into the contents of a dead person's iPhone was put on display by former FBI director James Comey in 2016. After Comey's last round on the White House season of The Apprentice, where he at least beat out the Scaramucci in tenure before he got Fandangoed, the FBI began implementing the Amy Hess approach to spying on you by building its own in-house offensive cybersecurity infrastructure. It is not clear how well this has developed since its inception.
A certain unity in the two approaches has revealed itself recently in Apple's iOS15. Prior to the San Bernardino shooting, Comey was harping on the child pornography angle as his rhetorical truncheon to demand that industry simply build a back door into all its products. With the end of the Afghanistan war, the constant surveillance of all of the content produced, read, heard, shared and stored by all Americans needs a new rationale, and CSAM (Child Sexual Abuse Material) is going to be the one. As a working reporter, I have written about exactly three pedophiles. Two were Columbus, Ohio police sergeants and one was Jeffrey Epstein, a CIA asset. With visual evidence of Epstein's crimes in the FBI's possession for years before his eventual federal arrest, perhaps they need to scan their own phones and then work their way out to ranking members of large police departments and officials of the Catholic Church.
Four years ago, Apple was prepared to go to court to protect user data. Apple has now reversed course and offered up a massive back door in iOS15 for government use. That feature failed, but Apple has added other security weaknesses that the federal government is legally entitled to make use of without notice or a warrant.
Apple's plan was to scan every picture on every phone BEFORE that picture is uploaded to iCloud. The scanning would use a one-way hash function and compare the hash of each picture to the hashes of known images of child pornography. A hash is a digital fingerprint of a file: a (mostly, most often) unique number generated from the data in that file. Nobody asked or mentioned where or how Apple got the source material from which it generated its hashes.
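Apple's actual NeuralHash algorithm is proprietary; the general hash-and-compare idea, though, can be sketched in a few lines with an ordinary cryptographic hash. The `known_hashes` database and the image bytes below are placeholders for illustration only:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """One-way hash: a digital fingerprint of the file's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known contraband images.
# In Apple's scheme, only the hashes are distributed, never the images.
known_hashes = {file_hash(b"known-bad-image-bytes")}

def matches_known(data: bytes) -> bool:
    """Compare a picture's hash against the known-hash database."""
    return file_hash(data) in known_hashes

print(matches_known(b"known-bad-image-bytes"))  # True: an exact copy matches
print(matches_known(b"any-other-picture"))      # False: anything else does not
```

Note that the scanner never has to look at the picture itself; possession of the hash list is enough to test for an exact match.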
The security problem for the user, and the surveillance boon for the secret police, is that every hash result is still uploaded and can be compared against any picture for which a hash has been generated. Thus facial recognition can be done without the face ever being seen. Religious or political images can be scanned, hashed and compared.
Because Apple has done the scan itself, the uploaded hash is now Apple's business record. Thus it is subject to seizure without warrant or notice by the federal government. The contents of every picture in your new iPhone, regardless of whether they were taken by you or sent to you or downloaded by you, are now the property of the FBI should they so desire.
Where Apple failed was in implementation, which is why this “feature” will be added later; it also delayed the release of iOS15 by a couple of weeks, until today, September 20th. The hash function did not work as well as Apple had advertised or planned.
A hash function, mathematically speaking, is a proper function: any given value in the domain, in this case a picture, corresponds to one and only one value in the range, the hash. That does not preclude, as happens with Apple's hash function and some others, two different pictures having the same hash result. This would generate a false positive. Further, with many hash functions applied to pictures, a small change in the data, like cropping an eighth of an inch off the side, will change the data and thus change the hash. This would generate a false negative. A person wanting to change the hash of a picture could crop, rotate, apply a filter, draw a line, add some text, or make a similar small change that need not even be visible, or at least apparent, to the naked eye.
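The false-negative problem is easy to demonstrate with a standard cryptographic hash, which is deliberately designed so that the smallest possible edit produces a completely different result (the byte string below stands in for real image data):

```python
import hashlib

original = b"picture-data" * 100      # stand-in for an image file's bytes
tweaked = bytearray(original)
tweaked[0] ^= 1                       # flip a single bit: the digital
                                      # equivalent of a sliver of cropping

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

print(h1 == h2)  # False: one flipped bit yields an unrelated hash
```

This is exactly why a plain cryptographic hash is useless for this kind of scanning, and why Apple needed something tolerant of small edits.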
Apple sought to combat false positives and false negatives with an AI-enabled hash method called NeuralHash. External testers have tried it on cat pictures, and the results were better; from what little this reporter has seen, much better. That said, hash collisions, the technical term for two pictures having the same hash value, and false negatives were still generated. One tester added the right amount of digital noise to a picture of Grumpy Cat and got the same hash as a picture of Doge, the dog who serves as the model for Dogecoin.
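NeuralHash itself is a neural network, but the trade-off it makes can be illustrated with the simplest perceptual hash, the "average hash": it survives small edits precisely because it throws away most of the picture's information, which is also why unrelated pictures can collide. This is a toy sketch, not Apple's algorithm, and the pixel grids are made-up data:

```python
def average_hash(pixels):
    """Toy perceptual hash (aHash), NOT NeuralHash: each pixel becomes
    1 if brighter than the image's mean brightness, else 0."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# A simulated 4x4 grayscale image, flattened to 16 brightness values.
image = [10, 200, 30, 220, 15, 210, 25, 230,
         12, 205, 35, 215, 18, 225, 28, 240]
# The "same" image, uniformly brightened, as if a filter were applied.
brighter = [p + 5 for p in image]

print(average_hash(image) == average_hash(brighter))  # True: the edit survives
```

Because only the pattern of bright-versus-dark regions is kept, any two pictures with the same coarse light pattern hash identically; that is the collision problem the Grumpy Cat and Doge testers exploited against NeuralHash.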
Various civil liberties groups have been protesting with the usual level of impotence since the announcement of this feature, but after three weeks Apple has released iOS15 and will add the feature in an update at some unspecified time. It is not clear whether Apple has a path to measurably improving its method, or whether the unspecified time is just Apple waiting until the toothless dogs of the privacy lobby go back to skulking outside.
Meanwhile, another feature to protect your children and, allegedly, your privacy has been implemented immediately in the text messaging portion of iOS15. That feature will automatically scan, presumably using NeuralHash, any incoming pictures for ANY adult content, block the image and alert the user. It is assumed that adults can shut the blocking off. It is not clear whether this stops the scanning and hashing of every picture sent via the Messages app. Again, this would make the hash a business record, not subject to even the minute protections that your text messages supposedly have on them.
Thus, your naked selfies belong to Apple, not to whomever you sent them, and if they belong to Apple they belong to the federal government as well. Mockingbird Publishing gives iOS15 a negative 10 out of 10 for privacy in its product reviews and acknowledges that Tim Cook finally knuckled under.