Apple has responded to the controversy generated by its new feature, which can scan files on the iPhone for photographs of child abuse.
From the moment this feature was leaked, rejection among cybersecurity experts has been widespread, as it could open the door to abuses by governments or even by Apple itself. These experts, along with privacy organizations, have published an open letter to Apple in which they ask the company not to implement this technology.
Far from backing down, Apple's first reaction has been the opposite: it has published a document (PDF) defending the benefits of this development and trying to calm the fears it has generated. To begin with, the company claims to have received a great deal of support from child-protection organizations, as well as from private organizations and interested parties. However, it also acknowledges that it has received "many questions" and therefore wants to clarify them; it is no coincidence that these "questions" are the same criticisms the system has received in just a few days.
Apple's detection system works by comparing "hashes", strings of characters generated by an algorithm from a file's contents; the logic is that if two files have the same hash, they are identical, bit for bit. The photos are therefore not compared visually; the system only checks whether they are exactly the same files as those being searched for. The problem is that this mechanism could be used to find files other than those containing child abuse. Apple claims its process is designed to prevent that: the hashes of the images come from NCMEC (the US National Center for Missing & Exploited Children) and other child-advocacy organizations, so, according to Apple, only child abuse images (whose possession is illegal in the US) will be detected.
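The matching idea described above can be sketched with an ordinary cryptographic hash such as SHA-256; this is a simplification for illustration only (Apple's real system uses its own image-hashing scheme), and the sample bytes and database below are invented:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known images, standing in for
# the list Apple says it receives from NCMEC and other organizations.
known_hashes = {file_hash(b"example-known-image-bytes")}

def matches_database(data: bytes) -> bool:
    """A file matches only if it is bit-for-bit identical to a known file."""
    return file_hash(data) in known_hashes

print(matches_database(b"example-known-image-bytes"))  # True: identical bytes
print(matches_database(b"example-known-image-byteZ"))  # False: one changed byte
```

Note how a single changed byte produces a completely different hash, which is why this kind of comparison detects only exact copies of the listed files.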
However, the great danger experts denounce is that this list of hashes could be modified by governments, introducing files that have nothing to do with child abuse, such as those related to persecuted religions, LGBT organizations, or political movements. In the hands of an authoritarian government, this would be a powerful tool to identify any dissident or user opposed to its policies.
In response, Apple says only one thing: "Apple will reject those demands." It promises that the system has been created solely to detect child abuse images identified by experts, and that it will not be used for anything else. Apple confirms that in the past it has received orders from governments to implement changes that degrade privacy, and that it has rejected them and will continue to reject them. The company has also pointed to the human role in the system, noting that it will not automatically report users who generate false positives to the police, since all reports will have to be approved by employees who will see the detected image. At the same time, it says such cases will be "unlikely."
It is an important statement: it shows that Apple knows very well what the main problem with this system is, and that it risks being labeled an ally of authoritarian governments in persecuting minorities. We are talking about a company that has built an image of privacy and respect, whether by blocking trackers in apps or by selling rainbow bracelets for Pride Day. This controversy could seriously damage that image.
The problem for Apple is its track record. The company collaborates with the governments of all the countries in which it operates. China is the best-known case: there it has removed apps critical of the government, or that could have been used to organize protests. In Saudi Arabia, it does not include the FaceTime app on the iPhone because the app uses end-to-end encryption, which is prohibited by the government.
For now, Apple has pointed out that this measure only affects users who have chosen to back up their photos to iCloud, and that it is in operation only in the US.