Apple: The controversial Child Sexual Abuse Material (CSAM) detection feature, announced by Apple in August, may have been abandoned. This Wednesday (15), Apple removed references to the tool from its page dedicated to child safety, according to MacRumors, suggesting a change of direction regarding the technology.
This possible withdrawal of the feature is yet another chapter in the saga of the system, which would scan photos stored in iCloud for images depicting the sexual abuse of children. Criticism of the big tech company has not let up since the announcement.
One of the critics of the photo-scanning plan was former US National Security Agency (NSA) contractor Edward Snowden, who called the function an attempt at mass surveillance. Security researchers, politicians and even Apple employees have also taken a stand against the tool.
In most cases, critics argued that this type of technology poses risks to users' privacy and could be repurposed by authoritarian governments. Many also pointed out that there is no evidence of the system's effectiveness in detecting images of children being abused.
Explanations were not accepted
Trying to allay concerns about the feature for detecting abusive material involving children, the Cupertino giant released documents and interviews with executives detailing how the system would work. Publishing FAQs was another step taken to reassure iCloud users.
However, the explanations did not have the desired effect, and in September the company decided to postpone the launch. At the time, Apple said the decision was based on feedback from researchers, users and advocacy groups, and that it would further develop the system before releasing it to the general public.
That statement has now disappeared from the website, along with all other information about CSAM detection. Is the company giving up on the project because of the controversies of recent months, or was the content removed for some other reason?
For now, the company has not made any statement regarding the case.