Thousands of videos and channels have been removed from YouTube since a stricter anti-hate speech policy was introduced earlier this year.
YouTube Takes A Stand Against Hate Speech
According to the Google-owned video platform, between April and June, over 17,000 channels and 100,000 videos were removed from the site for hateful content — a fivefold increase over the first three months of 2019.
YouTube also said that over 500 million comments were removed during the same period.
A blog post published on Tuesday, Sept. 3, explained that the spike in hateful-content removals was due in part to the inclusion of older videos and channels that had been permitted to stay on the platform under the previous policy. However, in June, YouTube introduced a new rule banning videos that promote extremist views such as white supremacy or that deny events such as the Holocaust.
In total, over 9 million videos and more than 4 million channels were removed from YouTube in the second quarter of 2019. The majority of removed videos (around 66 percent) were taken down for spam, scams, or misleading content. Hate speech and harassment accounted for 1.2 percent of video removals and 0.4 percent of channel removals, respectively.
The site also noted that over 87 percent of the videos removed in the same period were first flagged by automated systems before being reviewed by human moderators.
"[I]mprovements in our automated flagging systems have helped us detect and review content even before it's flagged by our community, and consequently more than 80% of those auto-flagged videos were removed before they received a single view in the second quarter of 2019," the post read.
YouTube Announced New Policy Against Hate Speech
In June, YouTube announced a new policy that attempts to address the proliferation of hateful content on the popular video platform. The new rule now prohibits videos that push the idea that "a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status."
The policy change is a response to mounting criticism of the video platform for allowing misinformation, conspiracy theories, and extremist views to proliferate. In January, YouTube said that it would reduce recommendations of videos that may misinform users in potentially harmful ways, including videos featuring fake miracle cures and false claims about historical events.