YouTube's New Advertiser-Friendly Guidelines Spark Backlash Over Demonetized Gaming Videos

The gaming community has spoken out this week in response to the sudden demonetization of certain gaming videos on YouTube.

According to TechCrunch, YouTube's new policy is to blame. It was implemented in November 2022 to make certain types of content more advertiser-friendly. The company is also attempting to make its vast library of videos more age-appropriate amid a wave of laws targeting how social media platforms interact with underage users, per the report.

The video-sharing and social media platform substantially revised its rules on offensive language and graphic violence when it updated its advertiser-friendly content guidelines.

While it is too early to tell what YouTube will do, it seems the firm is taking the creators' complaints seriously.

Policy Modifications

The YouTube team has heard from several creators and influencers about this update in recent weeks, a representative told TechCrunch.

"That feedback is important to us, and we are in the process of making some adjustments to this policy to address their concerns. We will follow up shortly with our creator community as soon as we have more to share," said YouTube spokesperson Michael Aciman.

YouTube's definition of violent content was updated last November to cover in-game violence aimed at a real, identified person, as well as content designed to create shocking experiences, such as gruesome mass killings. The company also warned that standard gameplay may include graphic violence only after the first eight seconds.

Apparently, this part of the policy left a lot of wiggle room for interpretation, for better or worse.

More significant revisions were made to its profanity policy.

YouTube has said that it will no longer consider the words "hell" and "damn" to be profanity, but that all other profanity will be grouped together regardless of severity. For example, "sh*t" and "f*ck" will now be treated the same.

Videos that include foul language in the title, in the thumbnail, or within the first seven seconds, or that use it regularly throughout the video, as the new policy puts it, may not receive ad revenue.

Videos that introduce profanity after the first eight seconds are still eligible. Still, the changes may have affected a huge number of videos, including those created before the announcement was made.

See Also: YouTube to Start Allowing Creators to Monetize on Shorts Starting February 1st

Online Backlash

Around the end of December 2022, creators began to notice the new rules being enforced, seeing certain videos hit with additional restrictions that limited their reach and ad eligibility.

YouTuber Daniel Condren, who runs the channel RTGame, analyzed how the new guidelines might affect his channel in a video that has received over a million views this week. Condren has been struggling with the policy changes in recent weeks after having a dozen or so videos demonetized and his appeals denied.

Condren took to Twitter to share his sentiments. He said that if this keeps up, he thinks he is going to lose his whole livelihood. He added that there seems to be no way out of this predicament, which is really upsetting him.

TechCrunch asked for additional details, especially on how YouTube intends to change the policy, but the company has not yet responded.

See Also: YouTube Tests Free Streaming in a Hub of Ad-Supported Channels

Trisha Andrada

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.