
Facebook's First 'Civil Rights Audit' Is the First Step in Climbing Everest

Facebook's Civil Rights Audit, published on Sunday, recommends the platform also ban implicit forms of white nationalism.
Facebook office. Image: Jason Koebler

On Sunday, Facebook published its first-ever Civil Rights Audit, which aimed to gauge the areas Facebook needs to focus on to strengthen civil rights and liberties on the platform. Over 90 civil rights organizations contributed to the audit, according to Facebook.

The report also contains a number of recommendations, including extending Facebook's current ban on explicit white nationalism to also cover content that supports the ideology even if the terms "white nationalism" or "white separatism" are not used.


"When you scale a sizable part of Mount Everest, are you making progress? Yes, but have you reached the summit? No. So we haven't reached the summit by any means, but we have really put a few stakes in the ground, and gotten the company to understand that these are platform wide concerns, and that civil rights values and principles and laws should be applied across the platform," Laura Murphy, the civil rights and civil liberties advocate Facebook brought on to lead the audit, told Motherboard in a phone call.

The report itself is split into four sections: content moderation and enforcement; advertising targeting practices; elections and census; and the civil rights accountability structure. On content moderation, the audit focused on harassment on Facebook; the under-enforcement of policies, which leaves hate speech on the platform; and over-enforcement of hate speech policies, in which users have content removed that actually condemned or spoke out against hate speech. The audit was conducted with civil rights law firm Relman, Dane & Colfax and Megan Cacace, one of the firm's partners.

Do you work for Facebook? Did you used to? We'd love to hear from you. Using a non-work phone or computer, you can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

"Without an active commitment by tech companies to understand these impacts, patterns of discrimination and bias can spread online and become more entrenched," Murphy writes in an introduction to the report. "When you consider all of the groups that are subject to discrimination, it is clear that civil rights issues don't concern one set of people or one community but they affect everyone and speak to our shared values. As such, being responsive to civil rights concerns is not just a moral imperative but a business imperative for all online platforms."


The report announces that Facebook will create a searchable database for credit and employment adverts nationwide; this comes after a lawsuit that followed investigations which found advertisers could target housing adverts based on race. For content moderation, the report recommends that Facebook go beyond its recent ban on white nationalism.

"The narrow scope of the policy leaves up content that expressly espouses white nationalist ideology without using the term 'white nationalist'. As a result, content that would cause the same harm is permitted to remain on the platform," the report reads. (Facebook banned white nationalism after a Motherboard investigation found Facebook banned white supremacy and not nationalism, despite them being two sides of the same ideological coin, according to experts.)

Activists Motherboard spoke to applauded Facebook's commitment to carrying out the audit.

"It's a public record of accountability; it's Facebook being open and transparent about what they've done, and it allows us, the public, advocacy organizations, and people who do public interest work to examine the space between what they said they've done, and what the experience is for everyday people. That is really important," Rashad Robinson, president of racial justice organization Color Of Change, one of the groups that was in contact with Facebook during this process, told Motherboard. "This is why we led the call for the audit in the first place."


The report also provides a concrete document that people who are being negatively impacted on Facebook can point to as proof that their concerns are legitimate.

"When people say that something is happening to them, people don't believe them. If we're saying, hey, we're getting disproportionately banned by Facebook, people are like: no you're not; that's not true, everybody gets banned," said Carolyn Wysinger, a black activist and member of Color Of Change who has repeatedly had her posts removed by Facebook, particularly when discussing the issue of racism on the platform.

"But it's like trying to turn around an ocean liner."

Facebook has historically allowed civil rights and other issues to creep up on it. The audit also provides an opportunity for civil rights to become more ingrained in everyday decisions, especially among employees who may not ordinarily take those factors into account in their work.

"With Sheryl [Sandberg's, Facebook chief operating officer] leadership, I have seen a notable change where this has trickled down to all teams, and people are really taking that civil rights mentality, ensuring that the products, the policies, the protocols are all being watched, being considered," Neil Potts, public policy director at Facebook, told Motherboard in a phone call.

As Murphy suggested, this is not the end, but really the start.

"I feel because I've spent my life as a civil rights advocate and a civil liberties advocate, I feel a difference; I feel change. But it's like trying to turn around an ocean liner. There have to be a lot of forces working in concert. I do feel the movement, I do see the change. But I don't think we're there yet, but we're getting there," she said.

Robinson added, "We want it to be clear, that there is no sort of 'Mission Accomplished' flag that is going to be planted on Sunday when the audit comes out, and no one should expect that. And the people at Facebook don't expect us to say that."

Jason Koebler provided additional reporting.
