
Facebook Is Ignoring Moderators’ Trauma: ‘They Suggest Karaoke and Painting’

“You don't always feel like singing, frankly, after you've seen someone being battered to bits.”
Life-sized cutouts of Facebook CEO Mark Zuckerberg wearing 'Fix Fakebook' t-shirts are displayed by advocacy group, Avaaz, on the South East Lawn of the Capitol on Capitol Hill in Washington, Tuesday, April 10, 2018. (AP Photo/Carolyn Kaster)

Isabella Plunkett, a content moderator for Facebook, loves her job.

She loves it despite having to view some of the most horrific content on the internet, including images of child abuse, graphic violence, and self-harm. She loves it despite the “climate of fear” she feels in her workplace, where she worries about missing her targets and speaking out about the difficult conditions. And she loves it despite being paid a fraction of the money and benefits that full-time Facebook employees receive.


But in September last year, in the middle of the pandemic, she finally reached a breaking point and decided it was time to speak out.

“I was working at home for maybe six months, and then I was brought back to the office on the 22nd of September, not wanting to, purely because my mother is high risk; she's had cancer twice. I didn't want to put her at risk,” Plunkett told VICE News ahead of her giving evidence to the Irish Parliament Wednesday morning.

When she approached her manager about the situation, she says she was told to simply suck it up.

“We had a meeting with the regional manager at the time, and I spoke to her about my concerns about [working from home] and that I had raised the concerns with HR. And her answer to me at the time was: You should be able to go to work and go home and not see your family, you know, go to your room, go to sleep, and then come back to work.”

Being forced to come back to the office to work had an immediate impact on Plunkett’s health.

“I had no choice. I was back in the office, they weren't allowing me to work from home. I had terrible anxiety. I was put on antidepressants for seven months. It was really just ignored. It was as if it was nothing.”

When she spoke to the company doctor about her situation, explaining that she wanted to work from home on days when her anxiety flared up, Plunkett was told she’d have to give the doctor a timeline of how long her anxiety would last before he could sign off on anything.


Following years of revelations about the traumatic conditions that content moderators have faced, Facebook claims it has taken steps to improve them. Central to these claims are the wellness support sessions available to all content moderators.

But those sessions are limited to 90 minutes a week, and Plunkett says the people she deals with can only do so much, because they aren’t trained psychiatrists.

“These people mean really well, but they're not doctors,” Plunkett told the Committee on Enterprise, Trade and Employment on Wednesday morning. “They suggest karaoke and painting. But sometimes you don't always feel like singing, frankly, after you've seen someone being battered to bits.”

Last week Facebook’s Content Oversight Board made headlines when it basically punted on whether the platform should permanently ban former President Donald Trump. The panel of highly paid academics, journalists, and privacy experts took months to consider that single decision.

But away from the headlines, tens of thousands of poorly-paid content moderators are forced to wade through thousands of posts featuring some of the internet’s most disturbing content all day. They’re expected to make decisions in seconds, while constantly fearful of being penalized for making the wrong decision. And employees are required to sign non-disclosure agreements preventing them from speaking to anyone—including family and friends—about what they’re dealing with, leading to the “climate of fear” among content moderators who are worried about losing their jobs if they speak out.


Facebook is utterly dependent on content moderation; without it, the social network would shut down within hours. And yet, despite it being so crucial to Facebook’s operations, the company has chosen to outsource content moderation to third-party companies that employ tens of thousands of low-paid staff around the globe. And all without Facebook facing any threat of sanction. Now activists and content moderators want things to change.

"What you've seen now is 15-plus years of self-regulation by Facebook,” Cori Crider, the director of Foxglove, a U.K.-based group that advocates for moderators, told VICE News. 

“You've basically had no regulation of something that is at least as big and as important as broadcast media. And it's been shown not to work, not just for the conditions of this workforce but frankly for the health of the public square as a whole.”

Plunkett works for Covalen (formerly CPL), a Dublin-based company that’s one of dozens around the world to which Facebook outsources the work of content moderation.

A number of former Facebook moderators have filed lawsuits against Covalen and Facebook over the post-traumatic stress disorder (PTSD) they say the work caused. This follows lawsuits filed by content moderators in the U.S. and a string of highly critical news articles revealing the horrific conditions moderators operate under around the world.


And yet, despite all these revelations, the conditions for content moderators have not improved.

“It's as if they want it to look like they're doing something, like they're being proactive about it, that they're on top of an issue. But they're not actually doing anything about it; it's brushed under the rug all the time,” Plunkett said.

Covalen didn’t deny Plunkett’s claim about being forced to return to the office, but in responses identical to those provided by a Facebook spokesperson, the company told VICE News that some work undertaken by moderators had to be done in the office.

Plunkett questioned this claim, however, pointing out that some full-time Facebook employees who review some of the same content are still allowed to work from home. She also noted that during the Christmas season in 2019, she and her colleagues were permitted to do so.

Plunkett currently works on a team called Ground Truth, which works on training Facebook’s algorithms. The company describes it as “a quality measurement system” that “measures how effectively our enforcement processes produce results that align to our policies.”

But Plunkett says that she feels she is effectively training the algorithm that will one day replace her—though she says the technology is “a long way off” at the moment.


And because some members of her roughly 10-person team are currently working from home due to being at risk, it falls to her and a few other team members to review the most horrific content, despite the fact that she is still on antidepressants.

Plunkett knows that speaking out publicly like this will likely mean some form of retaliation from her employer, though she doesn’t know what form it will take. “Let's hope not for the worst,” she told VICE News.

“I just had the feeling that I needed to do it. I needed to speak up for the people that are too afraid to, who feel that they have too many responsibilities and they can't afford to take any risks,” Plunkett told the committee.

Big Tech companies like Facebook are facing increased scrutiny from governments and regulators globally. Plunkett is the first content moderator to give evidence on the record to any government anywhere in the world, and her testimony could prompt a renewed push by the Irish government to regulate all the tech companies that have based their international operations in Ireland.

Earlier this year, some Facebook moderators met with the Irish deputy prime minister, Leo Varadkar, to express their concerns about the work they are doing. In a written response to questions posed by Varadkar, and seen by VICE News, Facebook Ireland’s head of public policy, Dualta Ó Broin, didn’t admit any shortcomings in the ways moderators are treated; he simply echoed the talking points the company has rehashed after any criticism.

But Ó Broin could soon be forced to answer questions more directly, as a number of committee members said Facebook should be called before the group to address the issues Plunkett raised. Facebook didn’t immediately respond to a VICE News request for comment about whether it would be willing to appear before the committee.

But a government crackdown isn’t the only threat to Facebook’s content moderation operation. Plunkett said that during the pandemic, a huge number of her colleagues left Covalen to moderate for TikTok, the Chinese video-sharing app.

Unlike Facebook, TikTok offers its content moderators direct contracts with the company, including all the benefits that its other full-time employees receive. For Plunkett and her fellow content moderators, being treated the same as full-time employees is key to their job performance.

“We should not be treated as disposable,” Plunkett said. “If we are core enough to risk our lives, we should be core enough to be Facebook employees.”