Apple is reportedly introducing a way to detect sensitive content in iPhone photo libraries via an image identification tool that pinpoints child abuse images a user keeps.

Apple to Detect Sensitive Content on iPhone Photo Libraries—But Security Expert Has a Warning
(Photo : by LOIC VENANCE/AFP via Getty Images)
An employee works on smartphone reconditioning, mainly iPhones, at the headquarters of Largo, a subcontractor for the refurbishing company Back Market, in Sainte-Luce-sur-Loire, outside Nantes, on January 26, 2021.

9to5Mac reported that the new photo identification system could check the photos in an iPhone user's camera roll, allowing the algorithm to detect any suspected sensitive content. Flagged images would then be subjected to further review by a real person.

Previously, on February 11, 2020, a warrant issued by Homeland Security revealed that Apple accesses photos uploaded to iCloud, and even those sent through the servers of its email client, to look for child abuse content.

The warrant further detailed the process that the Cupertino giant employs.

Apple to Detect Sensitive Content on iPhone Photo Libraries

This time around, a security expert disclosed that Apple is expanding its photo identification to the camera rolls of iOS users, according to AppleInsider.

Apple is reportedly announcing a new detection tool that uses photo hashing, enabling iPhones to detect Child Sexual Abuse Material, also known as CSAM, sitting in their libraries.
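Apple has not published how its hashing works, but the general idea behind photo hashing can be sketched with a generic perceptual-hashing technique such as a "difference hash" (dHash): visually similar images produce hashes that differ in only a few bits, so a photo can be compared against a database of hashes of known flagged images without comparing the images pixel by pixel. The minimal sketch below is purely illustrative; the hash function, threshold, and database are assumptions, not Apple's actual system.

```python
# Illustrative sketch only -- Apple's real algorithm is not public.
# dHash: each bit records whether a pixel is brighter than its right
# neighbor, so near-duplicate images yield nearly identical hashes.

def dhash(pixels):
    """Build a 64-bit hash from a 9x8 grid of grayscale values (0-255)."""
    bits = 0
    for row in pixels:                         # 8 rows
        for left, right in zip(row, row[1:]):  # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known flagged images.
known_image = [[10 * (c + r) % 256 for c in range(9)] for r in range(8)]
flagged_hashes = {dhash(known_image)}

# A near-duplicate: the same image with one pixel altered.
candidate = [row[:] for row in known_image]
candidate[0][0] = 255

# Match if the candidate's hash is within a small Hamming distance.
is_match = any(hamming(dhash(candidate), h) <= 5 for h in flagged_hashes)
```

In a matching scheme like this, the threshold trades off false positives against robustness to small edits such as resizing or recompression, which is one reason reports say flagged images would go to human review rather than being acted on automatically.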

The sophisticated algorithm that the iPhone maker will be using is similar to the system the Cupertino giant uses to detect objects, scenes, and even people in Apple Photos, a feature that lets users conveniently search for images on their devices.

Nevertheless, it is worth noting that Apple has yet to confirm the rumored upcoming feature. For now, the only known source is a security expert, Matthew Green.

Read Also: Altpass LLC Sues Apple with Patent Lawsuit for Security Features Like Passcodes, Password Creation, Face ID, and More 

Apple Sensitive Content Detection Tool: Security Expert Issues Warning

Green, also a cryptographer and an associate professor at Johns Hopkins Information Security Institute, wrote on Twitter that he "had independent confirmation from multiple people" about the matter.


The security expert went on to dub the CSAM scanning tool "a really bad idea," noting that the process could further allow surveillance of the data sent to and received from an iPhone.

He further said that law enforcers from various parts of the world have been asking for this kind of system to override end-to-end encryption so that authorities can access a criminal's messages.

Although Green acknowledged that the scanning tool helps detect sensitive images on a person's iPhone, he worries about what would happen if an authoritarian government got access to the said feature.

Notably, Green previously exposed how law enforcement breaks into the iPhones of suspected criminals. 

Not just that, the security expert, together with Johns Hopkins University, also found a way to fix a security bug in iMessage.

Related Article: Apple iPhones Hackable Even WITHOUT Victim Clicking a Link, Amnesty International Says Amid Pegasus Malware

This article is owned by Tech Times

Written by Teejay Boris

ⓒ 2021 All rights reserved. Do not reproduce without permission.