Apple has delayed its controversial child safety feature!

Apple has faced criticism over the child safety feature it announced last month. The company now says it will take more time to refine the option before releasing it.

Last month, Apple announced several child safety features, including CSAM scanning for iCloud Photos. The announcement sparked widespread debate, and Apple CEO Tim Cook made several statements on the subject. Now the company has announced that it wants to spend more time improving the feature before making it public.

Apple will make various improvements before releasing the feature

In a statement to 9to5Mac, Apple recalled that last month it announced a new feature to help protect children from abusers who use communication tools to recruit and exploit them. The company added that it values feedback from customers, advocacy groups, researchers, and others.

Finally, the company announced that it has decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Child safety features coming later this year

Apple’s new child safety features are still planned for this year as part of the updates to iOS 15, iPadOS 15, and macOS Monterey. The company has not yet given a clear date for when the options will become available, and the statement offers little indication of what changes will be made to improve the system.

How does the CSAM detection method work?

According to Apple’s statements, the CSAM detection method was designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of CSAM image hashes provided by NCMEC (the National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
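Conceptually, the on-device check is a set-membership test against that stored hash database. The Python sketch below illustrates only the idea and is not Apple’s implementation: Apple uses a perceptual hash called NeuralHash and a blinded, unreadable database, while this sketch substitutes SHA-256 and a plain set purely to stay self-contained and runnable.

import hashlib

# Stand-in for Apple's NeuralHash perceptual hash: SHA-256 is used here only
# to keep the sketch runnable; a real perceptual hash would also match
# slightly altered versions of the same image.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# In Apple's design this set is stored on the device in an unreadable
# (blinded) form; a plain Python set is used here purely for illustration.
known_csam_hashes = {
    image_hash(b"placeholder-known-image"),  # placeholder entry
}

def matches_known_hash(image_bytes: bytes) -> bool:
    # Check a photo against the on-device hash database before upload.
    return image_hash(image_bytes) in known_csam_hashes

print(matches_known_hash(b"placeholder-known-image"))  # True
print(matches_known_hash(b"unrelated-photo-bytes"))    # False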

Before an image is stored in iCloud Photos, it is matched on the device against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device then creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image, and this voucher is uploaded to iCloud Photos together with the image.
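To make the voucher idea concrete, here is a hedged Python sketch of what "encrypt the match result plus image data and upload it alongside the photo" could look like. The field names and the Fernet cipher are assumptions chosen for illustration only; Apple’s actual scheme relies on private set intersection and threshold secret sharing, so vouchers can only be opened server-side after a threshold number of matches is crossed.

import json
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Hypothetical key: in Apple's scheme the device cannot decrypt its own
# vouchers, and the server can only open them past a match threshold.
voucher_key = Fernet.generate_key()
cipher = Fernet(voucher_key)

def make_safety_voucher(image_id: str, match_result: bool) -> bytes:
    # Bundle the match result with additional data about the image,
    # then encrypt the whole payload into an opaque voucher.
    payload = json.dumps({
        "image_id": image_id,    # illustrative field name
        "match": match_result,   # result of the on-device matching step
    }).encode()
    return cipher.encrypt(payload)

# The voucher is uploaded to iCloud Photos together with the image itself.
voucher = make_safety_voucher("IMG_0001", match_result=False)
print(len(voucher), "byte voucher ready for upload")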

What do you think about Apple’s child safety feature? Do you believe it is really safe? Don’t forget to share your thoughts in the comments section.
