Edward Snowden: The capabilities announced by Apple yesterday (5) to combat child sexual abuse material (CSAM) through iCloud photo scanning were sharply criticized by the well-known whistleblower and privacy activist Edward Snowden. In a series of Twitter posts, the former NSA contractor called the tool “mass surveillance for the whole world.”
Apple announced three main features to strengthen child safety on its platforms. A communication-safety tool would use on-device machine learning to scan iMessage for sexually explicit images sent to or from children. In addition, updates to Siri and Search would help kids and parents in “unsafe situations”.
But the most impactful change, and the one that proved most controversial, was the plan to use new cryptographic techniques to detect known CSAM images in personal photo libraries stored in Apple’s iCloud. Once a detection was made, the company would report the occurrences directly to the National Center for Missing and Exploited Children (NCMEC).
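The general idea behind this kind of detection is perceptual hashing: each image is reduced to a compact fingerprint that survives minor edits, and that fingerprint is compared against a database of hashes of known abuse material. The sketch below is purely illustrative and is not Apple’s system — Apple uses a neural perceptual hash (NeuralHash) combined with private set intersection and a match threshold, none of which is reproduced here. This toy uses a simple “average hash” over an 8×8 grayscale grid, with made-up function names, just to show the matching mechanic:

```python
# Toy illustration of perceptual-hash matching. NOT Apple's NeuralHash:
# all names and thresholds here are hypothetical, for explanation only.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grid of grayscale values.

    Each bit is 1 if the corresponding pixel is brighter than the
    image's mean brightness, so small global edits leave it unchanged.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known_database(image_hash, known_hashes, threshold=4):
    """True if the hash is within `threshold` bits of any known hash.

    A small tolerance absorbs re-compression or slight edits while
    keeping accidental matches rare.
    """
    return any(hamming_distance(image_hash, h) <= threshold
               for h in known_hashes)

# Demo with synthetic 8x8 "images":
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
slightly_edited = [[min(255, v + 3) for v in row] for row in original]
unrelated = [[255 - (r * 8 + c) * 4 for c in range(8)] for r in range(8)]

known = {average_hash(original)}
print(matches_known_database(average_hash(slightly_edited), known))  # True
print(matches_known_database(average_hash(unrelated), known))        # False
```

The slightly edited copy still matches because brightening every pixel shifts the mean by the same amount, leaving the bit pattern intact; the unrelated image lands far away in Hamming distance. The privacy debate centers on exactly this mechanism: the matching code is agnostic about *what* is in the known-hash database.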
What do experts say?
“Don’t be fooled,” Snowden wrote in a post on Twitter last night, concluding: “if they can scan child pornography today, they can scan anything tomorrow.” For the activist, “no matter how well-intentioned Apple may be”, accepting these intrusions could set a precedent that would allow the company to arbitrarily inspect any other content.
In an interview with the Financial Times, Ross Anderson, professor of security engineering at the University of Cambridge, called the American multinational’s initiative “absolutely terrible”, as it could lead to mass surveillance of our phones and laptops.
Apple has defended the system, maintaining that it is designed to protect privacy. According to the company, the chance of the tool incorrectly flagging a user’s account is less than one in a trillion. Amid the controversy, some people have also pointed out that Apple’s solution is not so different from technologies many applications already use in daily life.