Apple's plan to scan iCloud photos for child sexual abuse material (CSAM) is no longer moving forward.

The iPhone maker confirmed that it has scrapped its plans to roll out the security feature, which drew massive controversy to the renowned tech giant. As such, the detection tool will no longer see the light of day.

(Photo: Illustration by Bruno Vincent/Getty Images) The Apple logo reflected in the eye of a man looking at a computer screen on May 8, 2006 in London, England.

Apple's CSAM iCloud Photo Scanning Feature Is No Longer Rolling Out

As per the latest report by MacRumors, Apple previously announced that it was working on new security features to address child safety.

The proposed safety features included CSAM detection for iCloud Photos, which would scan users' iCloud photos for potentially abusive images of children. The iPhone maker planned to release it with iOS 15 and iPadOS 15.

However, iOS 16 has since rolled out, and the feature has yet to arrive.

(Photo: Justin Sullivan/Getty Images) Attendees walk by a sign for the new iCloud during the 2011 Apple Worldwide Developers Conference at the Moscone Center on June 6, 2011 in San Francisco, California.

It is worth noting that the feature was heavily criticized as soon as it was announced. Security researchers and even some of the tech giant's own employees warned against it.

Due to the feedback from its customers, the Cupertino-based tech giant postponed the rollout. The firm initially planned to release the detection feature before the end of 2021.

Since then, the tech behemoth has kept mum about it for almost a year. Now, Apple has confirmed in a recent statement to Wired that it no longer plans to move forward with the feature.


Why Is Apple Killing Its CSAM iCloud Photo Detection Feature?

The tech giant says that it has "decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos."

Apple now says that "children can be protected without companies combing through personal data."

Despite that, the iPhone maker says it still has child safety in mind.

"We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all, Apple says.

Wired notes in its report that the iPhone maker confirmed the death of the CSAM detection feature shortly after announcing expanded end-to-end encryption for iCloud.

The iOS 16.2 update brings end-to-end encryption to more iCloud data, including backups and photos, which should further enhance user privacy.

Critics argued that the CSAM tool would have done the opposite. Some cybersecurity experts warned that the detection feature could be used as a backdoor for law enforcement to surveil users.

But now, Apple has completely abandoned its plans to roll out the scanning feature.


