Apple accused of underreporting suspected CSAM on its platforms

A UK watchdog says Apple is behind many of its peers in tackling the issue.

Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in the UK, says that Apple reported just 267 worldwide cases of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) last year.

That pales in comparison to the 1.47 million potential cases that Google reported and the 30.6 million reports from Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony Interactive Entertainment (3,974). Every US-based tech company is required to pass along any possible CSAM cases detected on its platforms to NCMEC, which directs those cases to relevant law enforcement agencies worldwide.

The NSPCC also said Apple was implicated in more CSAM cases in England and Wales alone (337, between April 2022 and March 2023) than it reported worldwide in a full year. The charity gathered that data from police forces through freedom of information requests.

As The Guardian, which first reported on the NSPCC’s claim, points out, Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption, which stops the company from viewing the contents of what users share on them. However, WhatsApp has E2EE as well, and that service reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

“There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities,” Richard Collard, the NSPCC’s head of child safety online policy, said. “Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK.”

In 2021, Apple announced plans to deploy a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM images from NCMEC and other organizations. But following a backlash from privacy and digital rights advocates, Apple delayed the rollout of its CSAM detection tools before ultimately killing the project in 2022.
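For illustration only, here is a minimal sketch of the general hash-matching approach Apple described: an image's hash is checked against a database of hashes of known abuse material supplied by organizations such as NCMEC. This is a hypothetical example, not Apple's shelved NeuralHash system; a real deployment would use a perceptual hash so resized or re-encoded copies still match, and the hash value and function names below are invented.

```python
# Hypothetical sketch of hash-matching against a known-image database.
# Not Apple's NeuralHash: a real system would use a perceptual hash so
# near-duplicate images still match. SHA-256 is used here only to show
# the lookup step.
import hashlib

# Invented placeholder entry standing in for a database of hashes of
# known CSAM provided by an organization such as NCMEC.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES
```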

Apple declined to comment on the NSPCC’s accusation, instead pointing The Guardian to a statement it made when it shelved the CSAM scanning plan. Apple said it opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company told Wired in August 2022 that “children can be protected without companies combing through personal data.”
