Apple’s recent announcement of new technology to detect child sexual abuse material (CSAM) in images uploaded to iCloud, to be rolled out with iOS 15, has raised serious concerns regarding privacy, surveillance, and tech-enabled crime. The detection tool, called NeuralHash, generates a hash, effectively a digital fingerprint, of each image on the device. That hash is then matched against a database of hashes of known child abuse imagery provided by the National Center for Missing and Exploited Children (NCMEC).
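To make the mechanics concrete, the sketch below illustrates this hash-and-match workflow in Python. NeuralHash itself is a proprietary, neural-network-based perceptual hash whose details are not fully public, so the example substitutes a toy “average hash” and invented database values; a perceptual hash is designed so that visually similar images produce the same or very close fingerprints, unlike a cryptographic hash. Only the overall flow, not the algorithm, reflects Apple’s system.

```python
# A minimal sketch of the hash-and-match workflow, assuming a toy
# "average hash" in place of Apple's proprietary NeuralHash. The
# database values below are invented stand-ins, not real NCMEC hashes.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of fingerprints of known abuse imagery.
known_hashes = {0b1010_1000, 0b0110_0011}

# A grayscale image already downscaled to 2 x 4 pixels (illustrative only).
device_image = [[200, 30, 210, 40],
                [220, 50, 10, 20]]

fingerprint = average_hash(device_image)
if fingerprint in known_hashes:
    print("hash matches the database")   # in Apple's design, this raises a flag
else:
    print("no match")
```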
The matching process relies on a cryptographic technique known as private set intersection, which detects a hash match without revealing the image itself. Such scanning is not new: big tech companies including Google, Microsoft, and Dropbox already scan material stored on their servers for child sexual abuse material. Where Apple’s move differs is that it gives the company the power to scan for and detect such material on the iPhone itself, which is a serious infringement of users’ right to privacy.
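The sketch below shows the intuition behind private set intersection using a simplified Diffie–Hellman-style construction in Python: each side blinds its hashed items with a secret exponent, and only doubly blinded values are compared, so neither party learns anything about items outside the intersection. Apple’s actual protocol is considerably more elaborate (it also layers in threshold secret sharing, and it is arranged so that the device, rather than the server, learns nothing about match outcomes); the modulus, the items, and the two-round exchange here are illustrative assumptions only.

```python
import hashlib
import secrets

# Toy Diffie-Hellman-based private set intersection (PSI). The modulus is a
# small Mersenne prime, far too small for real security; it only keeps the
# arithmetic readable.
P = 2**127 - 1

def hash_to_group(item):
    """Map a byte string into the group by hashing and squaring mod P."""
    h = int.from_bytes(hashlib.sha256(item).digest(), "big")
    return pow(h, 2, P)

def blind(items, secret):
    """Raise each hashed item to a party's secret exponent."""
    return [pow(hash_to_group(i), secret, P) for i in items]

# The server holds the database of known hashes; the device holds its own.
server_items = [b"hash-A", b"hash-B", b"hash-C"]
device_items = [b"hash-B", b"hash-D"]

a = secrets.randbelow(P - 2) + 1   # device's secret exponent
b = secrets.randbelow(P - 2) + 1   # server's secret exponent

# Round 1: the device sends its blinded items to the server.
device_blinded = blind(device_items, a)

# Round 2: the server re-blinds them and returns its own blinded set.
device_double = [pow(v, b, P) for v in device_blinded]
server_blinded = blind(server_items, b)

# The device finishes blinding the server's values and intersects the
# doubly blinded sets; only the overlap is revealed.
server_double = {pow(v, a, P) for v in server_blinded}
matches = [x for x, d in zip(device_items, device_double) if d in server_double]

print(matches)   # [b'hash-B']
```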
This move by Apple has led to concerns that user privacy will be compromised on a huge scale. It goes against the very ethos of the right to privacy enshrined in international and regional human rights instruments, including the Universal Declaration of Human Rights (UDHR), the International Covenant on Civil and Political Rights (ICCPR), and the EU Charter of Fundamental Rights. Privacy experts have pointed to previous attempts by law enforcement and intelligence agencies to force tech giants, including Apple, to build backdoors into their systems to catch “violent criminal elements”.
Privacy concerns are exacerbated by Apple’s concession to the FBI in the wake of the San Bernardino legal dispute, after which the company decided not to proceed with end-to-end encryption of iCloud backups. These efforts to snoop into people’s devices are complemented by states dedicated to a global “collect it all” surveillance mission, whose agencies have issued memos urging their governments to demand that tech giants build backdoors for access to encrypted data. And that is only for data that is encrypted: since user backup data on iCloud is not end-to-end encrypted, it would be fairly easy for states and law enforcement agencies to pressurise Apple into accessing and scanning iPhones without users’ consent, which would lead to further security concerns.
In earlier instances too, law enforcement officials around the world have pressured Apple to weaken its encryption of iCloud and iMessage in order to investigate child abuse material, terrorism, and other criminal activity. Such calls for encryption backdoors have recently resumed in light of pending arrangements such as the U.S.-U.K. Bilateral Data Access Agreement, the first executive agreement concluded under the CLOUD Act. Apple’s new move is a way of addressing these demands at the cost of individuals’ privacy.
According to a draft European Commission report published in 2020, an encryption backdoor of any kind that gives third parties access to certain kinds of information and data can prove disastrous. With Apple’s new scanning technology and its move to detect child abuse material on users’ devices, the long-run repercussions for user privacy are easy to foresee, given the dangerous precedent it sets. Creating such backdoor access would have the opposite effect of protecting children’s rights: it would instead allow criminals, authoritarian governments, and other malicious actors to exploit the situation in their favour through surveillance techniques. Moreover, it would severely weaken the security of the whole system and put users at risk.
Bitthal is a fourth-year B.A. LL.B. student at Rajiv Gandhi National University of Law. His areas of interest are Human Rights Law, Technology Law, and International Law. He is extremely keen on exploring various cultures and traditions around the world.