Extremist Communities Are Still Relying on YouTube for Hosting: Study

A recent study by Christo Wilson, a computer scientist at Northeastern University, illuminates the persistence of extremist communities on YouTube despite efforts to curb their presence.

This picture, taken in Moscow on October 12, 2021, shows the YouTube logo on a smartphone screen. (Photo: KIRILL KUDRYAVTSEV/AFP via Getty Images)

Wilson's research indicates that while extremist content remains on the platform, disaffected users are drawn to it primarily through channel subscriptions and referrals from external websites rather than through YouTube's recommendation algorithm.

According to Wilson, the study did not uncover a significant "rabbit-holing effect" on YouTube itself, suggesting that users were not radicalized solely by the platform. Instead, extremist communities heavily rely on YouTube for video hosting, with the process of radicalization often starting off-site.

Wilson's findings, presented at the ACM Web Science Conference, highlight the interconnected nature of online radicalization, where users navigate from politically partisan websites to YouTube channels and videos that align with their ideologies.

This cross-site pathway, a form of "rabbit-holing" that begins off-platform, perpetuates exposure to problematic content and leads users deeper into extremist communities, according to the study.

The study claims that despite YouTube's efforts to address its role in hosting fringe content, including changes to its recommendation algorithm in 2019, extremist content persists on the platform. 

Wilson emphasizes that YouTube serves as more than a standalone platform. Videos can be embedded into external websites, amplifying their reach beyond YouTube's domain.
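
The mechanics here are straightforward: any YouTube video can be dropped into a third-party page with a standard iframe whose source URL follows the pattern youtube.com/embed/<video ID>. As a minimal illustration (not the study's actual methodology), the Python sketch below shows how embedded YouTube videos on an external page could be identified by matching that URL pattern; the page snippet at the end is a hypothetical example.

    import re

    # Standard YouTube iframe embeds point at youtube.com/embed/<id> (or the
    # privacy-enhanced youtube-nocookie.com variant); video IDs are 11
    # characters drawn from [A-Za-z0-9_-].
    EMBED_PATTERN = re.compile(
        r'https?://(?:www\.)?youtube(?:-nocookie)?\.com/embed/([A-Za-z0-9_-]{11})'
    )

    def embedded_video_ids(html: str) -> list[str]:
        """Return the IDs of YouTube videos embedded in the given HTML."""
        return EMBED_PATTERN.findall(html)

    # Hypothetical external page embedding a single YouTube video.
    page = '<iframe src="https://www.youtube.com/embed/dQw4w9WgXcQ"></iframe>'
    print(embedded_video_ids(page))  # ['dQw4w9WgXcQ']

Matching embed URLs like this is only a sketch; a real crawler would also have to handle older object-style embeds and lazy-loaded players.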

"Problematic YouTube Channels"

The study, which analyzed the browsing behavior of over 1,000 U.S. residents across different user cohorts, revealed that users encountered more YouTube videos on external websites than on YouTube itself. 

Additionally, politically right-leaning websites were found to embed more videos from "problematic" YouTube channels compared to centrist or left-leaning websites.

Wilson distinguishes between alternative channels, characterized by intellectual openness, and extremist channels, which promote hate speech and misinformation. 

However, users exposed to videos from channels deemed problematic off-platform are more likely to gravitate towards similar content on YouTube itself, blurring the line between off-platform and on-platform activity.

"We find that politically right-leaning websites tend to embed more videos from problematic YouTube channels than centrist or left-leaning websites, and that participants exposed to off-platform videos from problematic channels are significantly more inclined to browse towards on-platform videos from problematic channels," the study reads.

While acknowledging YouTube's limited control over users' behavior on external sites, Wilson suggests implementing stronger content moderation policies. He proposes that YouTube should monitor where its videos are embedded and scrutinize channels associated with misinformation or extremism.

Moreover, Wilson argues that YouTube's role as a hosting platform for extremist content implicates the company in facilitating fringe communities' activities. He advocates for stricter content moderation to curb the spread of harmful content and mitigate YouTube's contribution to online radicalization. 
