Google sent a letter to a U.S. House panel detailing its ongoing efforts to track down terrorism and extremist videos on YouTube.
Alphabet Inc., Google's parent company, told the House Committee on Homeland Security on April 24 that it spends hundreds of millions of dollars on a yearly basis just to carefully review uploaded videos on YouTube.
In the letter, Google said it had reviewed more than one million YouTube videos suspected of containing terrorism content. Of those videos, all uploaded in the first three months of 2019, about 90,000 were found to have violated its terrorism policy. The letter was made available for public viewing on Thursday.
Working Double Time to Find Extremist Videos
In March, the House committee urged Facebook, YouTube, Microsoft, and Twitter to redouble their efforts against terrorism content following the mass shooting at a New Zealand mosque. During the attack, videos and images of the shooting itself were posted online.
A few weeks later, Rep. Max Rose, together with other Democrats, urged the companies to disclose their respective budgets for fighting terrorism content. So far, only Google and Twitter have complied.
"The fact that some of the largest corporations in the world are unable to tell us what they are specifically doing to stop terrorist and extremist content is not acceptable," said Rose in a statement.
Not Easily Calculated
Neither Microsoft nor Facebook has responded to the House committee. Google, for its part, said in its letter that it has more than 10,000 employees responsible for content review.
Meanwhile, Twitter said in its own letter that it is difficult to place a definite monetary figure on its efforts to protect the social networking site from terrorism content. However, it said it had banned 1.4 million accounts between Aug. 1, 2015 and June 30, 2018 for promoting terrorism.
In addition, the company said a substantial portion of its 4,100-person global workforce is responsible for content review.