(Photo: Getty Images/CHRISTOF STACHE) Apple logo

Apple announced it would delay its child protection feature following intense criticism from the public and security experts. The child protection feature would scan users' photos for child sexual abuse material, or CSAM.

However, security experts pointed out that it could affect user privacy and that the system might wrongfully accuse users of saving child pornography even when they are not. The feature was supposed to roll out later this year.

Apple Halts the Release of CSAM

According to The Verge, Apple had announced its plans for the scanning feature, which is intended to help protect children from predators who use their devices to search for pornographic material.

Apple added that based on feedback from its customers, researchers, and advocacy groups, it has decided to take more time over the coming months to collect input and make the necessary improvements before releasing the feature.

Apple's original press release about the changes, which were intended to reduce the proliferation of CSAM, now carries the same statement as the newest announcement.

Also Read: Apple CSAM Detection: How to Stop it from Scanning Your iPhone, iPad Photos

The original release detailed three changes in the works, one of them being that Search and Siri would point users to resources for preventing CSAM if they searched for any information related to it.

The two other changes came under severe scrutiny. One change would immediately alert parents when their children receive or send sexually explicit images, and it would also blur sexually explicit photos for child users.

The other change would scan the pictures stored in a user's iCloud account for CSAM. If any is detected, the system would report it to the company's moderators.

The moderators would then forward the reports to the National Center for Missing and Exploited Children or NCMEC.

The Backlash Over CSAM

Apple had detailed the scanning system at length over the past few weeks to ease the worries of users and advocacy groups, but it did not change their minds.

Whistleblower Edward Snowden called Apple's CSAM plan a "disaster in the making."

Even Apple employees expressed concerns about the CSAM tool, flooding the company's internal Slack channels with messages about it.

The scanning system would check the pictures stored in iCloud Photos on a user's device, comparing them against a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
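
To illustrate the general idea of matching photos against a known-hash list, here is a minimal sketch. It is a simplified, hypothetical example: Apple's actual system relies on NeuralHash perceptual hashes, on-device matching with cryptographic safety vouchers, and a match threshold, not the plain SHA-256 file hashes, placeholder values, or folder names shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known image hashes (placeholder values, not real data).
# Apple's real system matches NeuralHash perceptual hashes on-device; this sketch
# uses plain SHA-256 file hashes purely for illustration.
KNOWN_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

# Assumed threshold: an account would only be flagged after multiple matches.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Return a content hash for an image file (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose hashes appear in the known-hash database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_HASHES
    ]


if __name__ == "__main__":
    flagged = scan_library(Path("Photos"))  # hypothetical photo folder
    if len(flagged) >= MATCH_THRESHOLD:
        print(f"{len(flagged)} matches exceed the review threshold")
    else:
        print("No report generated")
```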

Still, privacy and security experts criticized Apple for the new scanning system, arguing that it would create an on-device surveillance system and violate the privacy of users who store their photos on their devices.

On Aug. 5, the Electronic Frontier Foundation said in a statement that the new scanning system would break the company's promises about its messenger's encryption and could open the door to broader abuses, no matter how well-intentioned the system is.

Ben Thompson of Stratechery wrote that Apple is compromising the devices that people own and operate, and that users have no say in the matter.

According to Ars Technica, more than 90 policy groups from the United States and around the world signed an open letter urging the tech company to abandon its plan to have iOS devices scan photos for CSAM.

The letter was addressed to Apple CEO Tim Cook. The organizations stated that even though the feature is intended to protect children and reduce the spread of CSAM, they are concerned it will be used to censor speech and threaten the privacy and security of users.

Related Article: Apple's CSAM Catches San Francisco Doctor With Child Exploitative Images on His iCloud

This article is owned by Tech Times

Written by Sophie Webster
