Apple faces backlash against a new CSAM detection system (2021)

Summary:

Apple announced on August 5, 2021, that it would introduce three new child safety features in its upcoming iOS 15 update in the United States. These features are intended to help protect children online and curb the spread of child sexual abuse material (CSAM). The new features will warn children and their parents when sexually explicit images are sent or received in Messages, expand Siri and Search to provide resources and support in unsafe situations, and detect CSAM images stored in iCloud.

The most controversial feature is an on-device machine learning mechanism that scans a user’s photos, as they are being uploaded to Apple’s cloud storage service (iCloud), for CSAM by comparing perceptual hashes of the images against hashes of known CSAM from the National Center for Missing and Exploited Children (NCMEC) database. If the number of matched photos exceeds a threshold, Apple conducts a human review of the flagged images. If the reviewers confirm that the images match known CSAM, the images and the user’s account are reported to NCMEC, as required by U.S. law.
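
To make the flow above concrete, the sketch below shows a simplified, threshold-based hash-matching check in Python. It is an illustrative sketch only, not Apple’s actual system: per Apple’s technical summary, the real design uses NeuralHash, a blinded hash database, private set intersection, and threshold secret sharing, so that neither the device nor Apple learns about individual matches below the threshold. The hash values, distance cutoff, threshold, and function names below are all hypothetical assumptions.

    # Illustrative sketch only: a simplified threshold-based hash-matching check.
    # This is NOT Apple's NeuralHash / private-set-intersection protocol; the
    # hash values, cutoff, threshold, and names here are hypothetical.

    MATCH_THRESHOLD = 30        # assumed review threshold, for illustration only
    MAX_HAMMING_DISTANCE = 4    # assumed "near-duplicate" cutoff, for illustration only

    # Hypothetical database of 64-bit perceptual hashes of known CSAM images.
    KNOWN_HASHES = {0x9F3A2C44D1B0E7A5, 0x0123456789ABCDEF}

    def hamming_distance(a: int, b: int) -> int:
        """Count the differing bits between two 64-bit perceptual hashes."""
        return bin(a ^ b).count("1")

    def is_match(photo_hash: int) -> bool:
        """A photo 'matches' if its hash is close to any known hash."""
        return any(hamming_distance(photo_hash, h) <= MAX_HAMMING_DISTANCE
                   for h in KNOWN_HASHES)

    def needs_human_review(uploaded_photo_hashes: list[int]) -> bool:
        """Flag an account for human review only once the number of matched
        photos reaches the threshold, mirroring the threshold step above."""
        matches = sum(1 for h in uploaded_photo_hashes if is_match(h))
        return matches >= MATCH_THRESHOLD

Unlike this sketch, Apple’s published design keeps match information encrypted in per-image “safety vouchers” that Apple cannot read until an account crosses the threshold; only then does human review become possible.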

[Flow chart from Apple’s CSAM Detection Technical Summary]

Apple’s plans were a response to criticism it faced in 2019 for making very few reports of CSAM to NCMEC, and an anticipation of proposed EU legislation that would require tech companies to proactively identify and report CSAM shared on their products and platforms.

This move by Apple appears to contradict its strong philosophical stance on privacy and its promise to users that it does not share their data with anyone, including governments. Critics argue that Apple is creating an on-device surveillance system that would violate the trust users have placed in Apple to protect on-device privacy. Apple was also accused of hypocrisy, having taken a strong public stance on privacy in 2016 when it refused U.S. law enforcement requests to unlock the iPhone used by one of the perpetrators of the San Bernardino mass shooting.

Other critics called the new features a “slippery slope to backdoors that suppress speech” and warned that the parental notification feature could be harmful to LGBTQI adolescents. The Electronic Frontier Foundation warned that “though these capabilities are intended to protect children… we are concerned that it will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

A major concern voiced by many is that Apple could come under pressure from governments to use its new features to gain access to their citizens’ personal data, such as anti-government messages. Edward Snowden criticized the plans as a “tragedy” and warned that the new features would “permanently redefine what belongs to you, and what belongs to them.”

On the other hand, child protection advocates say that Apple’s move to detect CSAM is long overdue. 

“The amount of CSAM being shared through Apple’s services is a mystery.… Child Rescue Coalition thinks it is right to presume the CSAM is there but is not being found and reported – because nobody at Apple is detecting it.” — Glen Pounder, Child Rescue Coalition.

Company considerations

  • How could scanning user content on a device affect iPhone sales and user trust?  
  • Is there a difference in privacy expectations when companies scan for content on a user’s device versus content that lives on the company’s servers?
  • How can companies communicate complex trust and safety changes such as these to the public, particularly when the public is already questioning the security of end-to-end encrypted messaging (for example, WhatsApp’s controversial terms of service change)?
  • How should the company leverage community/external stakeholders or industry alliances to craft and effectively communicate policy changes and approaches? 
  • How can the company assure users that it will not use this technology in the future for purposes beyond its original intent?
  • How should the company approach obtaining sufficient user consent when changing the mechanism by which it manages users’ data for the purposes of trust and safety?
  • How should the company plan for the number of human reviewers required to process the scale of affected content?
  • How can companies balance user privacy whilst also ensuring that harmful content, such as CSAM, is detected?

Issue considerations

  • Does creating monitoring features such as Apple’s CSAM detection system increase the likelihood of pressure from regulators or governments to use them for other purposes? How can a company resist or hold the line against this type of pressure? When might a company decide to allow the use of such features beyond their intended purpose?
  • Could these features increase pressure on other tech companies to use similar techniques?
  • What can be done to restore the trust of users who may fear that their privacy is being compromised by on-device surveillance?
  • Is scanning for CSAM against the NCMEC database and other comparable databases effective in combating child sexual abuse?
  • What threats do these types of detection systems pose to end-to-end encrypted messaging systems?

Resolution 

In response to the intense media storm, Apple announced on September 3, 2021, that it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”


Written by Tasneem Akhtar, September 2021