Safety first or an Orwellian move? Apple to scan American iPhones for child abuse imagery

September 5, 2021

3 min read


What's going on here?

Apple’s soon-to-be-launched software – which detects child abuse material on American iPhone users’ devices – has prompted concerns surrounding privacy and surveillance.

What does this mean?

Apple has created a system called ‘neuralMatch’, designed to scan the images on American iPhone users’ devices for child abuse material. Each image is converted into a ‘hash’ (a string of letters and numbers), which is compared against a database of hashes; the system was trained on 200,000 images featuring child abuse. Only when a specific number of images has been flagged will those images be decrypted and assessed by a team of human reviewers, and, if deemed illegal, passed to law enforcement. A separate setting also alerts parents when a child receives explicit imagery.
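To make this process more concrete, below is a minimal, hypothetical sketch of threshold-based hash matching in Python. It is not Apple’s actual implementation, which relies on a perceptual hashing system and cryptographic safeguards not shown here; the hash function, database contents and threshold value are all illustrative assumptions.

```python
# Minimal, hypothetical sketch of threshold-based hash matching.
# Not Apple's actual implementation: the hash function, database contents
# and threshold below are illustrative assumptions only.

import hashlib

# Stand-in database of hashes derived from known abuse imagery (placeholder).
KNOWN_ABUSE_HASHES: set[str] = set()

# Illustrative threshold: nothing is surfaced for human review below this count.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system would use a hash robust to
    resizing and re-encoding; SHA-256 is used here purely for illustration."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_images(images: list[bytes]) -> list[bytes]:
    """Return images whose hashes match the database, but only once the number
    of matches reaches the threshold described above."""
    matches = [img for img in images if image_hash(img) in KNOWN_ABUSE_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        # Only at this point would flagged material be decrypted and passed to
        # a team of human reviewers, and then to law enforcement if illegal.
        return matches
    return []
```

The key design point the sketch tries to capture is that individual matches are never acted upon in isolation: nothing is escalated for human review until the count of matching images crosses the threshold.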

The software will be released on American iPhones in the iOS update due in September 2021. 

Similar scanning already takes place on many cloud-based platforms (such as Google and Dropbox). However, Apple’s software will be the first to scan images stored solely on a user’s personal device.

The spread of technology has exacerbated a serious problem with abuse imagery. In 1998, there were 3,000 reports of child abuse imagery; by 2018, that number had surged to 18.4m. The importance of finding ways to protect children therefore cannot be ignored in the debate over privacy.

What's the big picture effect?

The debate surrounding Apple’s neuralMatch software reflects a broader tension facing Big Tech companies: keeping the privacy promises they have made to customers by expanding end-to-end encryption, while remaining accountable to governments that are pressuring them to provide access to encrypted information as a means of protecting children.

Critics of Apple’s neuralMatch accept the need to prevent child abuse material, but warn that this type of software could be exploited by governments to surveil citizens, for example to find protest-related content, or by authoritarian regimes as a way to spy on their populations. Furthermore, critics argue that the software contradicts Apple’s former policy of never undermining the security features of its products. Apple fought to protect this policy in 2016 during the ‘FBI-Apple encryption dispute’, in which the US government attempted to compel Apple to create a new operating system to bypass a built-in security feature.

An open letter imploring Apple to halt the ‘deployment of its proposed content monitoring technology’ has so far been signed by just under 7,500 individuals. Furthermore, the head of WhatsApp, Will Cathcart, has criticised the technology for the risk of abuse of power it poses and has stated that the messaging app will not adopt anything similar. Finally, critics note that other companies (such as spyware firms) could exploit the technology Apple has created.

Apple states that its approach not only protects children but also protects users’ privacy through the inclusion of multiple safeguards. Furthermore, Apple has committed to not expanding the software, despite any pressure it may receive from governments. It claims its software is more decentralised than existing techniques, meaning that only the material most likely to be deemed illegal will lose its encrypted status.

On top of identifying child abuse material, human reviewers have to spend countless hours sifting through distressing material ranging from violent imagery to hate speech. If more Big Tech companies launch software like Apple’s, there may be a rise in cases such as Facebook’s $52m payout to content moderators who developed post-traumatic stress disorder as a direct result of their role.

Ultimately, there remains a balancing act between the responsibilities of ever-growing Big Tech companies to their global user base and government pressure to gain access to encrypted material as a means of keeping children safe.

Report written by Edie Essex Barrett
