Apple employees are reportedly voicing their concerns over the iPhone maker's plan to detect CSAM, or child sexual abuse material, in users' photo libraries, flooding the company's internal Slack channel with messages.

Apple Employees Voice Concerns Over CSAM Detection Tool, Flooding the Company Slack Channel
(Photo : NOAH BERGER/AFP via Getty Images)
Attendees gather for a product launch event at Apple's Steve Jobs Theater on September 12, 2018, in Cupertino, California.

On Aug. 5, a security expert warned that Apple is planning to start scanning iPhone photo libraries to detect media depicting child sexual abuse.

Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute, noted that the Cupertino giant will be using sophisticated cryptographic techniques, including homomorphic encryption, that could check the contents of images without Apple actually viewing them.
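In broad strokes, the reported approach comes down to deriving a fingerprint of each photo on the device and comparing it against a list of fingerprints of already-known abusive images, with nothing flagged until a threshold of matches is reached. The Swift sketch below illustrates only that matching logic, under stated assumptions: SHA-256 from CryptoKit stands in for the perceptual hash (NeuralHash) described in Apple's technical summary, and the fingerprint list, threshold value, helper names, and sample data are all hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device fingerprinting step. SHA-256 is an
// exact hash, so unlike a perceptual hash it only matches byte-identical
// images; it is used here purely to keep the sketch self-contained.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical list of fingerprints of known abusive images. In the reported
// design the device only ever receives a blinded form of this database.
let knownFingerprints: Set<String> = [
    // SHA-256 of empty Data, used below to simulate a single match.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Nothing is reportedly flagged until the number of matches crosses a
// threshold; the value here is illustrative, not Apple's.
let matchThreshold = 30

// Count how many photos in a library match the known fingerprints.
func countMatches(in photoLibrary: [Data]) -> Int {
    photoLibrary.filter { knownFingerprints.contains(fingerprint(of: $0)) }.count
}

// Toy "photo library": one ordinary photo and one item that matches the list.
let library: [Data] = [Data("holiday photo".utf8), Data()]
let matches = countMatches(in: library)

if matches >= matchThreshold {
    print("Threshold reached: \(matches) matches would be escalated for review.")
} else {
    print("\(matches) match(es) found; below the threshold, so nothing is reported.")
}
```

The point of the design, as described, is that this comparison can happen without exposing either the user's photos or the fingerprint list in the clear; that cryptographic layer, which the techniques Green mentioned would provide, is not modeled in the sketch above.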

Apple confirmed that it is planning to roll out such a feature on the same day that Green hinted at it. Since then, the Cupertino giant has been receiving strong backlash over the potential abuse of the upcoming detection tool.

Apple Employees Voice Concerns Over CSAM Detection Tool

As per MacRumors, even some Apple employees have joined the critics of CSAM detection, airing their security concerns on the internal Slack channel of the Cupertino giant.

Most of the concerns the employees raised centered on the likelihood of an authoritarian regime forcing Apple to hand over information from the tool to carry out its own agenda.

According to Reuters, the worker who tipped the outlet off about the internal clamor said it is the first time that employees have raised concerns to this extent, at least in terms of volume and duration.

However, it is worth noting that Apple employees working in user security are not part of the group that vehemently protested the new photo-scanning function.

Apple CSAM Detection Tool

The CSAM detection feature is supposed to roll out with both iOS 15 and iPadOS 15 later this fall. However, even before its release, many have expressed their concerns about it.

Green, who previously shared how law enforcement found a way to break into the iPhones of suspected criminals, also warned about the possible exploitation of the CSAM detection tool.

The security expert further raised the possibility of the CSAM scanning tool being used for surveillance of iPhone data, adding that law enforcement could use the system to circumvent end-to-end encryption.

Read Also: Apple XProtect Antivirus Now Breached by New AdLoad Malware - Your MacBook Could Be At Risk

Apple's Response to Governments Forcing It to Use the Detection Tool

Nevertheless, the Cupertino giant is also against the use of its upcoming detection tool in the interest of any repressive government. Apple addressed the concern in the FAQ document for the feature.

Apple said that if a government ever attempts to force the company to add images beyond CSAM to the tool, the Cupertino giant "will refuse any demands," noting that the system was designed solely to detect child sexual abuse content.

Related Article: Apple Faces Patent Infringement Lawsuit by Bell Northern Research for Basic Mobile Wireless Technology

This article is owned by Tech Times

Written by Teejay Boris
