Meta and Snap have received formal requests for information (RFIs) from the European Commission concerning the safety measures they have in place to protect minors on their platforms.

This action aligns with the requirements stipulated in the European Union's Digital Services Act (DSA). The European Commission had previously issued similar RFIs to TikTok and YouTube, underscoring the increasing importance of child protection in the oversight of the DSA.

In April, the Commission designated 19 entities as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).

These include Meta's social networks Facebook and Instagram, as well as Snap's messaging app Snapchat.

(Photo: LIONEL BONAVENTURE/AFP via Getty Images) A screen displaying the Meta logo and the European flag, pictured in Toulouse, southwestern France, on April 27, 2023.

EU's Formal Notice

The European Commission said in a formal statement:

"The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to the protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors."

Meta and Snap must provide the requested information to the Commission by December 1. The Commission will then evaluate their responses and consider next steps, which could include the formal initiation of proceedings under Article 66 of the DSA.

Under Article 74(2) of the DSA, the Commission has the authority to impose fines for incorrect, incomplete, or misleading responses to a request for information.

Failure to respond at all could prompt the Commission to demand the information through a formal decision, with continued non-compliance leading to periodic penalty payments.


Risk Assessments

Following their designation as Very Large Online Platforms, Meta's platforms and Snapchat are obligated to adhere to the comprehensive provisions introduced by the DSA.

This includes assessing and mitigating risks related to the dissemination of illegal and harmful content, potential negative effects on fundamental rights, including those of children, and the protection of minors. 

Meta had previously received a request for information on October 19, related to the spread of terrorist content, violent content, hate speech, and alleged disinformation.

The European Commission had also recently requested information from TikTok and YouTube, opening inquiries under the EU's newly enacted digital content law. The Commission is seeking detailed information on those companies' efforts to meet their DSA obligations concerning the protection of minors.

This encompasses conducting risk assessments and implementing mitigation measures aimed at safeguarding minors online, particularly against potential threats to their mental and physical health.


ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.