Apple's CSAM detection tool plan is receiving another swipe. This time, it comes from Edward Snowden, the whistleblower and former intelligence contractor, now a journalist.

Apple CSAM Detection Tool a “Disaster-in-the-Making,” Edward Snowden Says
(Photo: Alexander Koerner/Getty Images)
Whistleblower Edward Snowden is broadcast live from Russia at the Sakura Stage of the CeBIT 2017 Technology Trade Fair on March 21, 2017 in Hanover, Germany.

Snowden called Apple's plan to scan photos on iPhones and iPads a "disaster-in-the-making," further dubbing the move a "tragedy."

As per AppleInsider, the whistleblower made these pronouncements in an editorial on Substack entitled "The All-Seeing 'i': Apple Just Declared War on Your Privacy."

Apple CSAM Detection Tool

As most of you may know by now, Apple is controversially planning to add a feature to iOS 15 that tackles child sexual abuse material, or CSAM. The function is set to roll out initially in the United States alongside the debut of the upcoming operating system for iPhones.

The upcoming detection tool seeks to scan photos in the device's library that are queued for upload to iCloud.

It is worth noting that Apple is touting this feature as specifically targeting child abuse images on an iPhone or iPad. The Cupertino giant will then inform the authorities upon verifying that someone is indeed storing CSAM.

However, Snowden sees the detection tool as a means to "erase the boundary dividing which devices work for you, and which devices work for them."

The former whistleblower further opined that the solution that Apple plans to introduce redefines "what belongs to you, and what belongs to them."

He also pointed out that Apple is known as the "pro-privacy" tech company, yet despite that, the Cupertino behemoth is allegedly betraying its users with the CSAM tool.

Snowden: Apple's Branding and CSAM Detection

On top of that, Snowden noted that Apple users could easily avoid the CSAM detection tool by simply disabling the iCloud Photos uploads.

That said, the former intelligence contractor went on to conclude that the move is a mere marketing scheme rather than a genuine tool to go after child abusers.


CSAM Tool and Government Abuse

Earlier critics of Apple's CSAM tool, like security expert and cryptographer Matthew Green, raised the alarming possibility of oppressive, authoritarian governments abusing the photo-scanning feature.

Green further noted that it could pave the way for law enforcement to override end-to-end encryption.

Snowden echoed similar fears, adding that the government might force Apple to require users to turn on the detection tool at all times. He added that the limits of the tool are loosely based on "Apple's all-too-flexible company policy, something governments understand all too well."

However, Apple has already addressed this concern, saying that it, too, opposes such abuse and that the tech giant will refuse any such orders from governments.

Elsewhere, a watchdog claimed that Apple is not doing enough to protect children from adult content on its app marketplace.


This article is owned by Tech Times

Written by Teejay Boris

© 2021 All rights reserved. Do not reproduce without permission.