Apple Details How iCloud Photo Scanning Works

Apple released a document this Friday (13) providing more details about the new child-safety features announced earlier this month. In it, the company explains how the detection system for child sexual abuse material (CSAM), which has raised privacy concerns, will work.

According to Apple, the image database used to detect photos possibly related to child exploitation in the device’s gallery will be generated by at least two different organizations, which operate in separate jurisdictions and are not under the control of the same government.
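
Apple has not published the exact merging mechanism, but the guarantee described amounts to a set intersection: only hash entries present in both independently sourced databases end up in the version shipped to devices. A minimal Swift sketch of that idea, using hypothetical hash values:

```swift
// Illustrative sketch only; not Apple's published code. The shipped database
// keeps just the hashes vetted by both organizations, so no single
// organization (or the government behind it) can unilaterally add an entry.
let orgA: Set<String> = ["hashA", "hashB", "hashC"]  // hypothetical entries
let orgB: Set<String> = ["hashB", "hashC", "hashD"]  // second jurisdiction

let shippedDatabase = orgA.intersection(orgB)
print(shippedDatabase)  // contains only "hashB" and "hashC"
```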

The Cupertino giant also said it will publish a Knowledge Base article on its website containing the root hash of the encrypted database included in each version of the supported operating systems. This way, users will be able to inspect the root hash on their own device and compare it against the published value.
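
Apple has not detailed how the root hash is computed, so the sketch below only illustrates the comparison a user or auditor would perform, with SHA-256 standing in for the actual digest; the function name, the blob, and the published value are all placeholders:

```swift
import CryptoKit
import Foundation

// Sketch of the verification step, not Apple's actual tooling: digest the
// local database blob and compare the hex string with the value published
// in the Knowledge Base article.
func databaseMatchesPublishedHash(blob: Data, published: String) -> Bool {
    let digest = SHA256.hash(data: blob)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return hex == published.lowercased()
}

let blob = Data("placeholder database contents".utf8)   // stand-in blob
let published = "hash-from-knowledge-base-article"      // placeholder value
print(databaseMatchesPublishedHash(blob: blob, published: published) ? "match" : "mismatch")
```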

This measure, the iPhone maker explains, will make it easier for independent parties to audit the tool and verify that it works correctly. The full document, including information on the manual review that takes place after an account reaches the CSAM image threshold, can be viewed on the Apple website.
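
As a rough model of that threshold gating: in Apple’s published design the limit is enforced cryptographically (via threshold secret sharing), so the server learns nothing about an account’s matches below it. The plain counter below only illustrates when manual review becomes possible, and the threshold value is an assumption for illustration:

```swift
// Models the decision point only; the real enforcement is cryptographic.
struct AccountMatchState {
    let threshold = 30                  // assumed value for illustration
    private(set) var matchedImages = 0

    // Returns true once the account becomes eligible for manual review.
    mutating func recordMatch() -> Bool {
        matchedImages += 1
        return matchedImages >= threshold
    }
}

var state = AccountMatchState()
for _ in 1..<30 { _ = state.recordMatch() }   // 29 matches: nothing revealed
print(state.recordMatch())                    // 30th match crosses the threshold
```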

Release confirmed

iCloud’s photo-scanning system for images that might indicate child abuse has been the subject of much discussion since it was announced. Specialists and advocacy organizations even published an open letter to Apple, with more than 6,000 signatures, asking for the immediate suspension of the feature.

Despite the criticism, reinforced by former NSA contractor Edward Snowden and cryptographer Nadim Kobeissi, the tech giant is keeping to its plan to roll out the new child-safety features. They will first reach iPhone, iPad and Mac users in the United States by the end of this year.
