WEDNESDAY, NOV. 23 - A majority of EU lawmakers agreed to draft rules requiring tech companies like Meta, Google, and other online services to report and take down online child pornography, Reuters reported.

The proposed legislation requires messaging services, application stores, and internet service providers to report and remove both known and newly identified photos and videos depicting child sexual abuse, as well as grooming incidents.

These rules are part of a law proposed by the European Commission in May 2022, dubbed by a separate report as the Child Sexual Abuse Regulation (CSAR).


The law aims to prevent and combat online sexual abuse of children and child sexual abuse material (CSAM).

The proposed regulation aims to establish a new independent EU Centre on Child Sexual Abuse (EU Centre), as well as clear obligations for service providers to detect, report, remove, and block access to online child sexual abuse material and specific prevention, prosecution, and protection responsibilities for national authorities, according to the commission's website.

In line with this aim, Meta, Alphabet's Google, and other big tech companies will now be permitted to select the technology used to detect such violations, as long as it is submitted to an independent, public audit.

Previously, the CSAR proposed the use of artificial intelligence systems for identifying child sexual abuse material.


Criticisms Surrounding EU's New Legislation

Online safety advocates, specialists, and privacy campaigners have continuously criticized the EU's proposed law, arguing that, despite its good intentions, equipping authorities to effectively combat the proliferation of CSAM could result in disproportionate violations of fundamental rights and online privacy for all EU citizens.

In response to these concerns, EU legislators approved a number of revisions to the draft rules last Tuesday (Nov. 20), as per ABC News. Under the revisions, end-to-end encrypted content is excluded from detection; however, time-limited detection warrants issued by courts can be used to track down illicit material where mitigating steps prove insufficient.

The move has been praised by the Computer and Communications Industry Association, a big tech lobbying group, as detection orders will now only be granted by a competent legal body in a targeted and limited manner, serving as a last resort.

The European Liberal Youth (LYMEC) echoed this sentiment, lauding the new measures as they remove "indiscriminate chat control."

Increasing Child Sexual Abuse Material

ABC News states that the number of reports of online child sexual abuse in the EU rose from 23,000 in 2010 to more than 1 million by 2020. Globally, reports of child abuse on the internet increased from 1 million to over 22 million between 2014 and 2020, with over 65 million photos and videos depicting child sexual abuse detected.

Years after this increase, the European Commission proposed the new legislation on the grounds that the present voluntary detection regime fails to sufficiently safeguard children, as many companies do not perform such detection at all.


Written by Aldohn Domingo

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.