The European Union (EU) has opened investigations into the measures YouTube and TikTok have implemented to protect minors.

The European Commission has taken this step by formally requesting information from TikTok and YouTube, marking the initial phase of investigations conducted under the EU's recently enacted law on digital content.

(Photo: SEBASTIEN BOZON/AFP via Getty Images) This illustration photograph, taken on October 30, 2023, shows the logo of TikTok, a short-form video hosting service owned by ByteDance, on a smartphone in Mulhouse, eastern France.

EU's DSA

Acting under the Digital Services Act (DSA), a key piece of the EU's toolkit for regulating major tech companies, the European Commission wants to assess how the video-sharing platforms are addressing the spread of illegal and harmful content.

The primary focus of the inquiry is to understand the measures these platforms have adopted to comply with the DSA, particularly concerning the potential risks posed to the mental and physical health of children.

Under the DSA, digital platforms can face fines of up to six percent of their global annual turnover for violations. The EU's move reflects its commitment to holding influential tech companies accountable for safeguarding users, especially minors, and ensuring the responsible dissemination of online content.

In an official statement, the European Commission outlined the specifics of the information sought from TikTok and YouTube. The requests include detailed information on the companies' efforts to fulfill their obligations under the DSA related to the protection of minors. 

This encompasses the execution of risk assessments and the implementation of mitigation measures aimed at safeguarding minors online, particularly concerning the potential threats to mental and physical health. The investigation also delves into how minors utilize these platforms.

Both TikTok and YouTube are required to provide the requested information to the European Commission by November 30. Subsequent actions will be determined based on the evaluation of the provided responses. Possible outcomes include formally initiating proceedings under Article 66 of the DSA.

Article 74 (2) of the DSA grants the Commission the authority to impose fines for inaccurate, incomplete, or deceptive information provided in response to information requests. 

In cases where platforms fail to respond, the Commission holds the prerogative to demand the required information through a decision. Non-compliance within the stipulated timeframe may result in the imposition of periodic penalty payments.

Very Large Online Platforms

Given their designation as Very Large Online Platforms, both TikTok and YouTube are obligated to adhere to the comprehensive provisions outlined in the DSA. 

This includes the assessment and mitigation of risks associated with the dissemination of illegal and harmful content, potential negative impacts on fundamental rights, including children's rights, and the overall protection of minors. 

According to the Commission, TikTok had previously received a request for information on October 19, specifically addressing concerns related to the spread of terrorist and violent content, hate speech, alleged disinformation, and broader aspects concerning the protection of minors online.
