YouTube Recommends Violent Gun Videos to Child Accounts, Study Claims

A simulated nine-year-old account was flooded with graphic videos, according to the study.

A new study by a nonprofit that researches social media claims that YouTube is recommending violent gun videos to child accounts, in violation of the platform's own policies, as first reported by AP.

The Tech Transparency Project created two accounts mimicking typical nine-year-old boys in the US, both of whom liked video games, especially first-person shooter games.


One account clicked on videos recommended by YouTube, while the other ignored the platform's suggestions.

The account that engaged with the recommendations was quickly inundated with graphic videos depicting school shootings, tactical gun training, and instructions on how to convert firearms to automatic weapons.

Many of these videos clearly violate YouTube's policies against violent and gory content, according to the researchers.

"It's the Algorithms"

Despite YouTube's content moderation efforts and its own rules, the study claims the platform is failing to stop the spread of disturbing videos that could traumatize vulnerable children or push them toward extremism and violence.

Within a single month, the account that followed YouTube's recommended videos received a staggering 382 different videos related to firearms, while the account that disregarded the suggestions received a mere 34 such videos.

In addition, the researchers established accounts resembling the profiles of 14-year-old boys who had a keen interest in video games.

These accounts, much like the others, were shown a similar volume of content related to firearms and violence.

"Video games are one of the most popular activities for kids. You can play a game like 'Call of Duty' without ending up at a gun shop, but YouTube is taking them there," Katie Paul, director of the Tech Transparency Project, said in a statement.

"It's not the video games, it's not the kids. It's the algorithms."

Social Media Concerns

YouTube and TikTok, among the most widely used platforms among children and teenagers, have faced criticism for their role in hosting and, at times, allegedly promoting content that encourages gun violence, eating disorders, and self-harm.

Social media critics have also highlighted the connections between social media usage, radicalization, and instances of real-world violence, as noted by AP.

Many prominent tech companies heavily rely on automated systems to detect and remove content that violates their policies.

However, the findings in the Tech Transparency Project's report suggest that greater investment in content moderation is needed.

Shelby Knox, campaign director of the advocacy group Parents Together, said social media companies can deliberately target young users with potentially risky content in order to keep them coming back to the platform.

Parents Together criticized platforms like YouTube, Instagram, and TikTok for their role in allegedly enabling children and teenagers to easily discover content related to self-harm, firearms, violence, and drugs.

YouTube requires users under 17 to get a parent's permission before using the platform, and accounts for users younger than 13 are linked to a parental account, providing a degree of supervision and oversight.

Further details can be found in the Tech Transparency Project's report.

Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.