Meta announced that it is funding a new tool designed to help remove explicit images of minors from the internet, in partnership with the non-profit child-protection group National Center for Missing & Exploited Children (NCMEC).

(Photo: Justin Sullivan/Getty Images - the Facebook and Instagram apps on an iPhone screen)

Take It Down Tool

NCMEC and Meta announced a partnership to offer minors a tool that helps them remove nude, partially nude, or sexually explicit images of themselves from the internet. According to The Verge, "Take It Down" will be a free service, with Meta providing the initial funding to build it.

NCMEC President and Chief Executive Officer Michelle DeLaune said the tool is aimed at children facing desperate situations. "Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down. NCMEC is here to help," she added.

Meta's platforms Facebook and Instagram have signed on to integrate the new tool, along with OnlyFans, Pornhub, and Yubo. Adults who appeared in such content while they were still under 18 can also use the service, and parents or guardians can make a report on a child's behalf.

Users need to have the content on their device, as they submit it to a web-based tool that converts each image into a digital fingerprint known as a "hash." The hashes are sent to NCMEC and shared with participating platforms, which use hash-matching technology to find and flag any attempts to upload the original images.
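To make the hashing step concrete, here is a minimal sketch of how an image could be fingerprinted entirely in the browser. The article does not specify which hashing algorithm Take It Down actually uses, so SHA-256 via the standard Web Crypto API stands in purely for illustration, and the fingerprintImage helper is hypothetical.

```typescript
// Illustrative sketch only: computes a digital fingerprint ("hash") of a
// locally selected image entirely in the browser, so the image itself is
// never uploaded. The real service may use a different (e.g. perceptual)
// hashing algorithm; SHA-256 here is just a stand-in.
async function fingerprintImage(file: File): Promise<string> {
  const bytes = await file.arrayBuffer(); // read the image locally
  const digest = await crypto.subtle.digest("SHA-256", bytes); // hash in-browser
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join(""); // this hex string, not the image, is what gets submitted
}

// Hypothetical usage: only the hash leaves the device.
// const input = document.querySelector<HTMLInputElement>("#image-input")!;
// input.addEventListener("change", async () => {
//   const file = input.files?.[0];
//   if (file) console.log("hash to submit:", await fingerprintImage(file));
// });
```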

TechCrunch reported that Take It Down hashes the content in the browser, meaning the images themselves never leave the child's or parent's device. When someone attempts to upload one of the original images, the hash-matching technology detects a match and sends the newly uploaded material to a content moderator for review.

This review step helps ensure the tool is not misused to remove images that are not explicit or sexual. Meta added that it would ingest new hashes multiple times a day to block content quickly.
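The platform-side flow described above could look roughly like the sketch below, assuming a hash list that is refreshed periodically and matches that are routed to human review. Names such as refreshHashList, screenUpload, and moderationQueue are hypothetical and not part of any API mentioned in the reporting.

```typescript
// Illustrative sketch only: how a participating platform might check new
// uploads against the shared hash list. All identifiers are hypothetical.
const knownHashes = new Set<string>();

// Participating platforms reportedly ingest new hashes multiple times a day;
// modelled here as a periodic refresh of an in-memory set.
async function refreshHashList(fetchLatestHashes: () => Promise<string[]>) {
  for (const hash of await fetchLatestHashes()) {
    knownHashes.add(hash);
  }
}

// On every upload, hash the incoming image and compare against the list.
// A match does not remove the file outright; it is flagged for human review.
async function screenUpload(
  imageBytes: ArrayBuffer,
  moderationQueue: { enqueue: (hash: string, image: ArrayBuffer) => void },
): Promise<"flagged" | "clean"> {
  const digest = await crypto.subtle.digest("SHA-256", imageBytes);
  const hash = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
  if (knownHashes.has(hash)) {
    moderationQueue.enqueue(hash, imageBytes); // send to a content moderator
    return "flagged";
  }
  return "clean";
}
```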

Also read: FTC Withdraws Antitrust Complaint Against Meta for Within Unlimited Acquisition

Sextortion of Minors

The tool is part of an effort to combat the rising problem of sextortion of minors. According to NCMEC, reports of sextortion doubled from 2019 to 2021, and Bloomberg reported that teenage boys have become the most common targets.

Children are deceived into sharing intimate and explicit images online. Some are then blackmailed with the threat of having those images posted publicly unless they pay money.

Meta's Global Head of Safety, Antigone Davis, said the tool gives some control back to people, especially those in desperate and helpless situations. "Being threatened in this way puts people in a very vulnerable position and can have devastating consequences for them," Davis added.

Related Article: Meta's AI is Expanding as Mark Zuckerberg Forms New Team for Social Media Platforms

Written by Inno Flores

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.