Facebook strongly condemned Saturday's London Bridge terrorist attacks, promising to "aggressively remove terrorist content" from the social media platform.
Similarly, Britain's Prime Minister Theresa May said Sunday that Britain must work with allied democratic governments to strengthen internet regulation, arguing that tighter rules would prevent terrorists from using the internet to plan and organize attacks and to spread extremism.
Facebook Condemns Terrorism
May's comments come at a difficult moment for striking a balance between free speech and counterterrorism. In a way, though, Facebook's own condemnation of the attacks seemed to amplify May's statements.
Facebook aims to "provide a service where people feel safe," according to Simon Milner, the company's director of policy.
"[W]e do not allow groups or people that engage in terrorist activity, or posts that express support for terrorism. We want Facebook to be a hostile environment for terrorists."
Facebook's Treatment Of Harmful Content
Facebook still receives criticism for its slow response to a series of violent attacks, some of them depicting murder and sexual assault, that were broadcast in recent months using the site's live streaming functionality. Many also point to Facebook's failure to curb terrorist propaganda on the site, not to mention the troubling spread of fake news.
To bolster its efforts, Facebook chief Mark Zuckerberg said last month that he'll add 3,000 people to monitor, filter, and manage harmful content shared on the site. In late May, The Guardian published a series of documents detailing Facebook's moderation process. The gist: the site receives an enormous volume of reports, and moderators often find it difficult to determine what's harmful because of Facebook's perplexing — and sometimes conflicting — guidelines.
The problem, according to Hany Farid, chair of Dartmouth's computer science department, is that social media encourages users to like and share content rather than report it, presumably because sites have a financial incentive to get users sharing as often as possible.
Farid helped Microsoft develop PhotoDNA, a program that targets child pornography shared on the internet. On last month's episode of the radio program On the Media, he said he approached Facebook offering similar technology for detecting terrorist content, which Facebook rejected.
The way social media sites, and internet companies in general, respond to attacks and controversy has left people largely underwhelmed. A troubling situation or accusation surfaces, a press release condemns it, and then, more often than not — nothing follows but "commitment."
That said, Milner did note that Facebook alerts authorities when it becomes aware of a potential terrorist attack.
"If we become aware of an emergency involving imminent harm to someone's safety, we notify law enforcement."
London Bridge Attacks
At least seven people were killed and 48 injured Saturday night in London when three men drove a van into pedestrians on London Bridge, then stabbed people in nearby establishments.
The attack occurred less than a month after 22 people were killed in an incident linked to terrorism during an Ariana Grande concert in Manchester.
Thoughts about the London Bridge attacks? What about Facebook's statements regarding the incident? Feel free to sound off in the comments section below!