San Francisco – Social media companies have long struggled with what to do about extremist content that advocates for or celebrates terrorism and violence. But the dominant current approach, which relies on overbroad and vague policies and practices for removing content, is already decimating human rights content online, according to a new report from the Electronic Frontier Foundation (EFF), Syrian Archive, and WITNESS. The report confirms that the reality of faulty content moderation must be reckoned with in ongoing efforts to curb extremist content.
The pressure on platforms like Facebook, Twitter, and YouTube to moderate extremist content only increased after the mosque shootings in Christchurch, New Zealand earlier this year. In the wake of the Christchurch Call to Action Summit held last month, EFF teamed up with Syrian Archive and WITNESS to show how faulty moderation inadvertently captures and censors vital content, including activism, counter-speech, satire, and even evidence of war crimes.
“It’s hard to tell criticism of extremism from extremism itself when you are moderating thousands of pieces of content a day,” said EFF Director for International Freedom of Expression Jillian York. “Automated tools often make everything worse, since context is critical when making these decisions. Marginalized people speaking out on tricky political and human rights issues are too often the ones who are silenced.”
The examples cited in the report include a Facebook group advocating for the independence of the Chechen Republic of Ichkeria that was mistakenly removed in its entirety for “terrorist activity or organized criminal activity.” Groups advocating for an independent Kurdistan are also frequent targets of overbroad content moderation, even though only one such group is considered a terrorist organization by governments. In another example of political content being wrongly censored, Facebook removed an image of a Hezbollah leader with a rainbow Pride flag overlaid on it. The image was intended as satire, yet the mere fact that it included the face of a Hezbollah leader led to its removal.
Read more: Caught in the Net: The Impact of ‘Extremist’ Speech Regulations on Human Rights Content