Facebook responds to critics of its policies against terrorism
In a statement posted in response to a Change.org petition that amassed more than 135,000 signatures, Facebook's head of global product policy, Monika Bickert, said "there is no place on Facebook for terrorists, terrorist propaganda or the praising of terror."
The petition, titled "Dear Facebook, thanks for the 'Safety-Check,' but on fighting ISIS, you can do much better!", slammed Facebook for not responding fast enough to "sick jihadi accounts" that posted messages of support for the terrorists behind the attacks in Paris in the hours immediately following the attacks.
"Every 5 minutes, ISIS bot accounts sent the statement claiming responsibility for the attacks, along with links to pro-ISIS accounts with a bloody picture of the Bataclan massacre as their header image," the petition reads. "As for the messages' content, they sneered at us, gave praise to their 'brave lions' (the suicide bombers), kept threatening us and posting jihadi propaganda videos."

Facebook eventually removed these messages, but the petition's author, Julie Guilbault, said the social network took too long to do so. She said the company uses sophisticated technology to quickly detect pornography, but "when it comes to advocating terrorism and publishing decapitation videos: no worries, they enjoy a comfortable delay before the content or their account will be deleted."
In a message posted on Change.org Tuesday, Bickert said "we work aggressively to ensure that we do not have terrorists or terror groups using the site" and noted that the company relies on its users to report terrorist content.
"When content is reported to us, it is reviewed by a highly trained global team with expertise in dozens of languages," Bickert wrote. "The team reviews reports around the clock, and prioritizes any terrorism-related reports for immediate review."

Bickert also noted that there are times when Facebook users share "upsetting content" for good reasons, such as to raise awareness of an issue, and that Facebook doesn't block content shared in this context.
"We remove anyone or any group who has a violent mission or who has engaged in acts of terrorism," she wrote. "We also remove any content that expresses support for these groups or their actions. And we don't stop there. When we find terrorist-related material, we look for and remove associated violating content as well."

"When a crisis happens anywhere in the world, we organize our employees and, if necessary, shift resources to ensure that we are able to respond quickly to any violating content on the site. For instance, in the wake of the recent attacks in Paris, we also reached out immediately to NGOs, media, and government officials to get the latest information so that we were prepared to act quickly. Many of our employees, especially our French speakers, worked around the clock to respond to the spike in reports from our community."
"Many people in volatile regions are suffering unspeakable horrors that fall outside the reach of media cameras. Facebook provides these people a voice, and we want to protect that voice. For this reason, we allow people to discuss these events and share some types of violent images but only if they are clearly doing so to raise awareness or condemn violence."
You can read the message in its entirety here. It was also posted in French.