YouTube is upgrading its comments system to give creators more control over moderating the comments on their videos.

The company is rolling out the upgrade to creators today, giving them better ways to engage with their audiences. Creators can now pin specific comments to the top of the feed and put hateful or inappropriate comments on hold, submitting them to YouTube for review.

To make creators more visible in the feed, their usernames will now also be highlighted when they respond to comments, and they can show appreciation to their fans by marking a comment as a "favorite."

YouTube says the new comments system is aimed at fostering more personal connections between creators and fans, giving creators and the community a better environment for online interaction.

"We realize that comments play a key role in growing this connection and we're dedicated to making your conversations with your community easier and more personal," writes YouTube in a blog post.

YouTube is clearly steering toward a video service with built-in social media elements, as evidenced by the mid-September launch of YouTube Community, a space where creators can publish text, images, GIFs and other content to hold more personal conversations with viewers and fans.

YouTube's New Comments Tools For Creators

As previously mentioned, the new tools let creators pin comments to the top of the feed, give "hearts" to specific comments and gain more visibility through highlighted usernames.

The bulk of interaction on YouTube happens in the comments section, which all too often proves to be a toxic space rife with hate-fueled attacks and harsh, unsubstantiated criticism directed at the creator or at other users.

To mitigate this long-standing problem, YouTube launched its "Heroes" initiative in September, enlisting the community to moderate comments and videos and to flag content deemed offensive or inappropriate under YouTube's community guidelines.

Now, YouTube is offering new tools for creators to sort out inappropriate comments. Most notably, the company is testing an algorithm-backed comment moderation system that detects and automatically holds potentially inappropriate comments. The creator can then read these and choose to approve, hide or report them as needed.

Of course, no algorithm is perfect out of the gate, but this one is expected to grow more adept at detecting offensive and hateful comments over time as creators keep using the tool.
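To make the described flow concrete, here is a minimal sketch in Python of how such a hold-and-review pipeline could work. This is purely illustrative: the classifier, threshold, class names and review actions are all assumptions, not YouTube's actual implementation or API.

```python
# Hypothetical sketch of an algorithm-backed comment moderation queue.
# The scoring model, threshold and names are illustrative assumptions.

from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, List, Tuple


class ReviewAction(Enum):
    APPROVE = "approve"
    HIDE = "hide"
    REPORT = "report"


@dataclass
class Comment:
    author: str
    text: str


@dataclass
class ModerationQueue:
    """Holds flagged comments until the creator reviews them."""
    score_fn: Callable[[str], float]  # stand-in for the real classifier
    threshold: float = 0.8
    held: List[Comment] = field(default_factory=list)
    feedback: List[Tuple[Comment, ReviewAction]] = field(default_factory=list)

    def submit(self, comment: Comment) -> bool:
        """Publish the comment, or hold it if its score crosses the threshold.
        Returns True when the comment goes live immediately."""
        if self.score_fn(comment.text) >= self.threshold:
            self.held.append(comment)
            return False
        return True

    def review(self, comment: Comment, action: ReviewAction) -> None:
        """Record the creator's decision; the (comment, action) pairs could
        later serve as training signal to improve the classifier."""
        self.held.remove(comment)
        self.feedback.append((comment, action))


# Toy keyword scorer standing in for an unpublished learned model.
def toy_score(text: str) -> float:
    return 1.0 if "hateful" in text.lower() else 0.0


queue = ModerationQueue(score_fn=toy_score)
queue.submit(Comment("fan1", "Great video!"))       # published immediately
queue.submit(Comment("troll", "hateful nonsense"))  # held for review
queue.review(queue.held[0], ReviewAction.HIDE)      # creator hides it
```

The feedback list captures why the system could improve with use: each creator decision on a held comment is a labeled example that a real classifier could be retrained on.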

These new tools fit neatly into YouTube's ongoing efforts to sustain a safe environment for voicing thoughts and opinions on videos. Not all of its recent changes, however, were received warmly: in early September, YouTube began demonetizing certain videos without giving creators specific policies detailing the reasons behind the demonetization.

For those of you who moonlight as YouTube creators, what do you think of the new comment moderation tools? Feel free to sound off in the comments section below!
