Apple has delayed plans to roll out detection technology that would scan US users’ iPhones for child sexual abuse material. The decision follows widespread criticism from privacy groups and others that on-device scanning sets a dangerous precedent. Apple said it is listening to the negative feedback and reevaluating its approach. Critics were concerned that the system could be abused by authoritarian states.
The so-called NeuralHash technology would scan images just before they were uploaded to iCloud Photos. Each image’s hash would then be compared against a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children. Any match would be manually reviewed and, if warranted, the user’s account would be disabled and a report made to law enforcement. The technology had been scheduled for release later this year.
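The matching step described above can be sketched roughly as follows. This is a simplified illustration, not Apple’s actual NeuralHash algorithm: NeuralHash is a perceptual hash designed to match visually similar images, whereas this sketch stands in an exact SHA-256 digest and an invented hash database purely to show the flag-before-upload flow.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of known abuse
# material (the real database is maintained by NCMEC; these bytes
# are invented placeholders for illustration only).
KNOWN_HASHES = {hashlib.sha256(b"known-flagged-image-bytes").hexdigest()}

def check_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches the known-material database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# In the announced design, a match triggers human review rather than
# any automatic action against the account.
print(check_before_upload(b"known-flagged-image-bytes"))   # match
print(check_before_upload(b"ordinary-holiday-photo"))      # no match
```

The key design point is that the comparison happens on the device before upload, and only hashes of known material are checked; no new classification of unseen images is performed.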
“Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Privacy campaigners have expressed concern that the technology could be expanded and used by authoritarian governments to spy on citizens. The Electronic Frontier Foundation (EFF) said that while child abuse is a serious problem, Apple’s attempt to “build a backdoor” into its data storage and messaging systems is fraught with problems. “To say we’re disappointed with Apple’s plan is an understatement,” the group said, adding that it had collected more than 25,000 signatures from consumers opposing the move.