Facebook’s Independent Oversight Board has requested public comment regarding its upcoming decision about restoring former President Trump’s accounts. The following presents the core issues at stake and Shireen Mitchell’s responses as the founder of Stop Online Violence Against Women, Inc., and the principal of The Stop Digital Voter Suppression™ Project.

Whether Facebook’s decision to suspend President Trump’s accounts for an indefinite period complied with the company’s responsibilities to respect freedom of expression and human rights, if alternative measures should have been taken, and what measures should be taken for these accounts going forward. 

In response to the recent events at the Capitol ahead of President Biden’s inauguration, social media platforms reacted to former President Trump’s attempts to incite his base by suspending and removing his accounts. Facebook’s Independent Oversight Board should not rescind this action. Not only was this response entirely too late, but it was only taken after at least six people died and countless others were injured. We find this is a pattern with Facebook, including its responses to what happened in Kenosha and the genocide in Myanmar. Incitement to violence and death threats are not freedoms protected by the First Amendment.

The First Amendment reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Facebook is not a government entity, nor is it Congress; it is a private company. At the time Donald Trump was removed, he was a government representative holding the highest office in this country and had been targeting the press and other Americans with incitements to violence for over four years. His content is not protected under the First Amendment.

C1: How Facebook should assess off-Facebook context in enforcing its Community Standards, particularly where Facebook seeks to determine whether content may incite violence. 

Facebook should not allow content that targets and dehumanizes specific communities, or that incites violence, to be shared on its platform, period.

C2: How Facebook should treat the expression of political candidates.

Facebook continues to operate within a framework that gives political candidates and politicians unaccountable freedom of speech over its users. First, they are simply users like any other on the platform sharing content. Facebook has also weighed safety against freedom of speech, a tradeoff that should never have existed in the first place. If Facebook has the algorithm to remove hate speech, it should not matter whether the user whose content is removed or banned is a political group or an average American. Hate speech is hate speech. Enforcement should not be based on status, wealth, or ethnic background. Those in a protected class should be protected. The Oversight Board’s concerns should focus on the most marginalized, not the most privileged (which is where political candidates reside).

C3: The accessibility of Facebook’s rules for account-level enforcement (e.g. disabling accounts or account functions) and appeals against that enforcement. 

Facebook needs to be not only transparent about its rules of enforcement but also consistent in applying them. It should not (as it has always done) weigh these decisions based on status, wealth, or ethnic background. What’s more, appeals fall under these same guidelines. Community Standards should be enforced in a uniform fashion.

C4: Considerations for the consistent global enforcement of Facebook’s content policies against political leaders.

“Newsworthiness” should never trump safety, and Facebook should not allow white supremacists or conspiracy theories to spread and be amplified on the platform. This model of weighing censorship and newsworthiness against human rights is deeply flawed. Concerns about censorship are not, and should never be, more important than human lives. Facebook should not be removing content or disabling accounts only after massacres and genocides; preventing these atrocities from happening in the first place should be a top priority, not a secondary or reactionary thought.

A new report from MoveOn highlights that “in the wake of anti-racism protests across the country, right-wing Facebook groups have become hubs for racist and violent rhetoric. Fueled by conspiracy theories, dehumanizing speech, and misinformation, users on right-wing Facebook pages have encouraged police, military, and armed citizens to kill protesters. These groups, historically dominated by racist misinformation and an array of conspiracy theories, have become notorious hubs for right-wing extremists.” Rahna Epting, Executive Director of MoveOn, also states that “what we’re seeing on Facebook is shocking, though hardly surprising when you consider the extent to which racists have been emboldened and encouraged by the Trump administration.”

Like my colleagues on the Real FaceBook Oversight Board, I strongly believe that no politician or government official should be given a free pass to spread disinformation, peddle hate speech, or incite violence. This concern is even more pressing in Trump’s case, and so his reinstatement must be weighed carefully. Frankly, removing him when Facebook did was four years and a day late.