Facebook employees are confused and angry after CEO Mark Zuckerberg told them in a meeting Thursday that Steve Bannon’s video urging people to behead FBI Director Christopher Wray and government infectious diseases expert Dr. Anthony Fauci “didn’t cross the line” to warrant an account suspension, according to a transcript of the conversation leaked to VICE News.
Bannon posted the video over a week ago, but in the days since, Facebook employees have been expressing their anger and frustration at the company’s decision not to suspend his account, according to leaked screenshots from Facebook’s internal social network, Workplace.
On Thursday, during a weekly question-and-answer session between the CEO and staff, Zuckerberg was forced to confront the issue head-on after more than 500 employees submitted a question on the topic ahead of the meeting, according to a post on Workplace.
The controversy started last Thursday when Bannon posted a video recording of his War Room podcast to Facebook, Twitter, and YouTube, in which he called for the beheadings.
“I’d put the heads on pikes,” the former White House advisor said in the video. “Right. I’d put them at the two corners of the White House as a warning to federal bureaucrats. You either get with the program or you are gone.”
Bannon says the comments were “clearly meant metaphorically” but Twitter took down the video and permanently banned his account. YouTube also removed the video, but Facebook allowed it to remain on the platform for 10 hours before removing it. In that time, it racked up over 200,000 views.
During the question and answer session, Zuckerberg was asked why Bannon’s account was not disabled, and why the former White House advisor’s call to behead Fauci and Wray didn’t violate Facebook’s policies.
“It did violate our policy, we took down that content,” Zuckerberg said, according to the transcript. “We have specific rules around how many times you need to violate certain policies before we will deactivate your account completely.”
But Facebook’s own Community Standards make no mention of the need for repeated offenses to trigger an account suspension:
“We remove content, disable accounts and work with law enforcement when we believe that there is a genuine risk of physical harm or direct threats to public safety,” the policy related to violence and incitement states.
The Facebook CEO said that disabling a person’s account was not a decision that should be taken lightly, no matter what the person has said or done.
“You know obviously, saying something that crosses a line, it's one thing to hide or make it so that piece of content is not shown, it's another to revoke someone's access to service, and private messages that people want to send them, or their ability to communicate at all, so we naturally and I think rightfully have a higher bar for that.”
He added: “And while the offenses here I think came close to crossing that line, they clearly did not cross the line so the account is up, the content is down, and that's what our policies clearly said that we should do.”
Facebook employees were not happy with Zuckerberg’s response, and hit out at the lack of transparency around the decision.
“[Saying] ‘It didn't meet the threshold’ is an increasingly insufficient excuse and we should aim to do better as a company,” one employee posted on the company’s internal network, Workplace.
“At the very least, we should be transparent about what these thresholds are and where certain actors stand so we can introspect and improve our policies. I'm really quite unsure why we can't do this.”
A number of other employees flagged that days after the video was posted, Bannon was once again at the center of controversy when Facebook removed a network of misinformation pages he was operating.
Throughout this week employees at the company have been criticizing the decision not to remove his account, with many of them pointing to Twitter’s permanent ban as an example of what Facebook should be doing.
“If Twitter permanently deletes the account you know it's serious,” a current Facebook moderator, who was granted anonymity to speak freely, told VICE News. The moderator added that calling for someone to be beheaded would typically get an account banned under Facebook’s incitement-to-violence rules.
Employees said that even they didn’t fully understand Facebook’s policies around how many violations it requires before an account is suspended — and that lack of clarity is allowing bad actors to continue to spread hate speech.
“Why was his account not taken down?” one employee asked. “Some transparency would be helpful. If we are expressing as a company that the only consequences for calling for two people's deaths is content being taken down, that's a pretty low cost for a bad actor to absorb, and I'm not at all surprised that people continue to use our platforms to incite violence.”
For many, Bannon’s call for the beheading of top U.S. government officials should have triggered an automatic ban.
“It's deeply troubling that an account is still up hours after calling for executions of public figures. Is there nuance I'm missing? Is this not a clear-cut thing?” one asked last week.
Another asked: “Is Steve Bannon banned from Facebook as Twitter did? This should clearly violate our policy on inciting violence.”
On Thursday night, several employees posted links to a Reuters article that first reported some of Zuckerberg’s comments, with one employee saying “the optics of this is not great.”
Facebook’s own policy team has also been criticized by colleagues for allowing Bannon’s video to remain on the site for so long, and for not suspending his account. While some Facebook employees defended the policy team, others pointed out that criticism “comes with the territory” of holding such a “high-stakes job,” and that the decisions being made were having an impact on real people’s lives.
“There are real human beings who suffer violence and oppression because of what happens on our platform, many of whom are very close to me,” one employee posted on Workplace.
“Every time Facebook makes a bad decision around policy or content, it hurts people. It can even get them killed.”