
Bestiality, Stabbings, and Child Porn: Why Facebook Moderators Are Suing the Company for Trauma

“My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed.”

DUBLIN — Sean Burke spent years working for big tech companies like Cisco and SAP. But nothing could prepare him for the work he did for Facebook.

"My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed. Day two was the first time for me seeing bestiality on video — and it all escalated from there.”

It took only two weeks before he “started seeing actual child porn.”


Burke, a native of New Jersey, is one of the thousands of low-paid content moderators worldwide who don’t work directly for Facebook but are employed by third-party companies that provide content moderation services to the tech giant — in this case, CPL Resources in Dublin.

Burke, along with other CPL employees, is gearing up to sue Facebook and CPL in Ireland’s High Court, saying they suffered “psychological trauma” as a result of poor working conditions and a lack of proper training to prepare employees for viewing some of the most horrific content seen anywhere online.

"I've had to go on antidepressants because of working on the job," Burke told VICE News. “At times I was solving my problems with alcohol to get to sleep because at least I wasn't dreaming when I slept after having a few drinks on me.”


Facebook is already facing a lawsuit in California, brought by two former moderators who are seeking class-action status, but the lawsuit being filed in Dublin this week poses a much bigger risk for the company.

Not only are Europe’s workplace-safety rules much stricter than California’s, but among the dozen Dublin plaintiffs is one of Facebook’s own highly paid employees.

The Facebook employee, who is seeking to remain anonymous during the court proceedings, worked on content flagged by moderators at CPL. They say that even this limited exposure to harrowing material led to a diagnosis of Type 2 PTSD, which can cause symptoms ranging from irritability and panic attacks to suicidal feelings.


And this is likely just the beginning of a much bigger problem for Facebook. The lawyers and investigators involved in preparing the lawsuit say it is “going to snowball” with many more moderators set to join once the writ is filed this week for the lead plaintiffs.


They have good reason for such expectations. Burke is one of about 15,000 low-paid moderators working for third-party companies that Facebook relies on to keep its platform from turning into a pedophilic and porn-infested hellhole. These moderators form a crucial line of defense for Facebook, despite the tech company’s much-touted work on artificial intelligence.

Now, those who’ve long borne the brunt of such disturbing material want Facebook to pay for their suffering and change its policies to better protect future moderators. And they’re hoping the European courts agree.

“Obviously, there are thousands of moderators employed throughout Europe. We have been liaising with moderators in Barcelona and Berlin and have had inquiries from Sweden,” Diane Treanor, the Dublin-based lawyer leading the lawsuit, told VICE News.

Treanor says many of the workers she's spoken to fear coming forward publicly because they signed NDAs that prevent them from talking about their work with anyone — even members of their families. They also fear being blacklisted in the tech industry if they talk about their trauma.


It’s all about the data

The lawsuit seeks more than just monetary compensation for the trauma inflicted on former moderators; the plaintiffs also hope to shine a light on the lack of care Facebook takes in selecting suitable employees for these roles, and to force the company to improve conditions for those still working as moderators.

“Facebook needs to address the failure to protect its employees and provide a safe place and system of work,” Treanor said. “We are seeking to ensure future moderators will have access to counselors and healthcare professionals while working for Facebook and after they leave the employment.”

In order to achieve this goal, Treanor and her colleagues will need to show a systemic failure by Facebook and CPL to assess how the volume of graphic images their employees were exposed to impacted their mental health.

And the key to proving that will be the detailed data the companies hold on what exactly their employees were exposed to, and for how long.

“They are going to need to disclose just how much toxic content people are exposed to on a daily basis,” said Cori Crider, the director of Foxglove, a U.K.-based advocacy group assisting the case with investigation, strategy, and campaigning. “How many beheadings? How much child pornography? How much animal torture? It's awful, but this is real stuff moderators are dealing with all the time.”

Chris Gray, 50, a former moderator who lives in Dublin, was diagnosed with PTSD as a result of his 10 months working with CPL. He’s spearheading the lawsuit, and believes getting hold of this data will allow his legal team to “quantify with incredible detail what exactly I've been exposed to.”


Chris Gray, 50, was diagnosed with PTSD as a result of 10 months working as a Facebook moderator.

But he also believes the data can be used to help Facebook make moderators’ lives better. As well as screening new employees for susceptibility to trauma — which does not currently happen — Gray believes the data gathered by Facebook could be used to assess moderators’ ongoing wellbeing.

Gray believes that making the data public would lay bare the scale of the task faced by these moderators and force Facebook into taking action. But he also believes that Facebook itself could use the data it already collects on each action taken by its moderators to spot signs that a person is struggling to cope with their workload or with a particular set of images and videos.

And such an approach could help the tens of thousands of other moderators working on other platforms.

“There are 40,000 people doing this shit”

“There are 40,000 people doing this shit,” Gray said, a figure that includes both contractors and Facebook employees. “If I can get them better working conditions, better care, then that also improves the quality of the content moderation decisions and the impact on society.”

Facebook has typically pushed back against any effort to obtain detailed data about how it operates, but the legal team bringing this lawsuit believes that in the Irish courts, the social network will have no choice but to hand over the information.

“The litigation process allows us to request the discovery of all documents held in relation to each moderator,” Treanor said.


Exposed to ‘true evil’

Daniel Valdermasson, who worked as a Facebook moderator at CPL for 14 months, from October 2017 to December 2018, took the job partly because he was applying for a mortgage, but also because he was interested in seeing the inner workings of Facebook’s content moderation process.

He felt that because he “grew up on the internet,” there was nothing he couldn’t handle.

“I have always seen terrible videos,” Valdermasson told VICE News. “In the early 1990s, we had rotten.com [showing] beheadings by Russian soldiers. I have seen everything.”

But he was wrong.

“I really didn't think it was going to be affecting me, but then you get exposed to true evil,” Valdermasson said. “For most people I have spoken to, the problem is not seeing a grown man beheaded, it's seeing a little six-month-old baby being raped.”

The impact was not immediately obvious, but in September 2018, 11 months into working at CPL, Valdermasson found himself having a panic attack on the way to work. He was walking across the River Liffey in Dublin when it first struck.

“I would have to stop and sit down and try to breathe and try to tell myself that I can continue breathing and that my heart won’t stop,” he said. “You lose some vision around the peripherals as it feels like it’s getting darker somehow. It’s a feeling of impending doom.”

No screening

The lawyers say that Facebook has failed its employees and those contracted to review content posted to its platform. All the employees VICE News spoke to said there was no training provided in how to adequately handle the deluge of hate speech and graphic content they would be exposed to.

There is also no psychological screening of new contract employees, who are paid just over minimum wage. In a number of cases, people are employed solely based on meeting the criteria of being able to speak a specific language.


“On one of the teams there was a 50-year-old lady who had never used a computer”

“On one of the teams there was a 50-year-old lady who had never used a computer and her previous job was a masseuse — now she is moderating content,” Gray said.

Following previous criticism of its failure to care for employees, CPL now provides up to 45 minutes of counseling to its employees each week.

Facebook told VICE News that it recognizes that this work “can be difficult at times” but claims that “their well-being is a top priority for us.” The company pointed out that it provides resiliency training, wellness breaks and “open access to psychological support” as well as services like yoga, fitness and mindfulness.

But the moderators who spoke to VICE News, and the lawyers representing others, say Facebook’s claims don’t match up with the situation on the ground.

Having to view traumatizing material is not the only challenge these contractors have to deal with. They also have to view it unceasingly in order to meet impossible quality and accuracy targets set by Facebook, meaning the mental health resources that are made available go largely unused.

“The problem is that if you are in trouble and you are not hitting your numbers and you are under stress, you are not going to take 45 minutes out of your working schedule to go and sit and talk to somebody if you need that job,” Valdermasson said.

Secrecy

Concerns about the toll of moderating the internet’s most barbaric content are not new. The New York Times reported in 2010 that an industry group established by Congress recommended that the federal government provide financial incentives for companies to “address the psychological impact on employees of exposure to these disturbing images.”

But in the decade since, the problem of content moderation has grown exponentially as the amount of user-generated content uploaded to sites like Facebook and YouTube has exploded.


The problems facing Facebook’s moderators were first brought to the public’s attention two years ago in the documentary “The Cleaners,” which tracks a handful of people based in Manila who appear entirely unsuited to the jobs they are assigned, including one moderator whose job is to remove pornography from the platform but who admits he knows nothing about porn.

Then, earlier this year, moderators working for a third-party company in the U.S. spoke to The Verge about the conditions they faced and the trauma they endured.

Throughout all of this, Facebook has promised to do more and improve conditions for moderators, including on-site yoga, weekly counseling sessions, and wellness breaks. But moderators say little has changed.

Now, the Irish lawsuit could expose much more than Facebook has been willing to admit.

“The real kicker is going to be when they have to disclose who knew this system of work was causing PTSD in a significant number of workers, and when,” Crider said.

CPL failed to respond to detailed questions about the lawsuit and the allegations made by Gray, Burke and Valdermasson, saying it doesn’t respond to “client-related questions.”

Cover: A visitor enters the Facebook Inc. European headquarters in Dublin, Ireland, on Thursday, Nov. 24, 2016. (Photo: Chris Ratcliffe/Bloomberg via Getty Images)