
YouTube Moderator Training Included Videos of Beheading and Rape, Lawsuit Alleges

A former contracted content moderator at YouTube is bringing a class action lawsuit against the platform, claiming the job gave her severe PTSD, anxiety and depression.
Image: A woman walking by the YouTube logo. Getty Images

A former YouTube content moderation contractor is suing the company for negligence after she developed severe anxiety and post-traumatic stress disorder from viewing extremely graphic and violent videos in her role, including child abuse, rape, bestiality, beheadings, suicide, and torture.

Her exposure to traumatic images began in training, she claims, which included horrific content.

"For example, during training, Plaintiff witnessed a video of a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person's head being run over by a tank; beastiality; suicides; self-harm; children being rapped [sic]; births and abortions," the complaint alleges. "As the example was being presented, Content Moderators were told that they could step out of the room. But Content Moderators were concerned that leaving the room would mean they might lose their job because at the end of the training new Content Moderators were required to pass a test applying the Community Guidelines to the content."


YouTube did not immediately respond to a request for comment. 

The complaint, filed in a California superior court on Monday, details the experiences of an anonymous plaintiff who claims to have endured workplace negligence during roughly a year and a half working as a contractor with staffing agency Collabera in one of YouTube's Austin, Texas offices.

"She has trouble sleeping and when she does sleep, she has horrific nightmares," the lawsuit claims. "She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks. She has lost many friends because of her anxiety around people. She has trouble interacting and being around kids and is now scared to have children." 


The complaint claims YouTube failed to provide a safe workplace and neglected to give her and other moderators safe equipment to do their jobs. She's seeking medical treatment and compensation for her trauma and medical expenses, and is demanding YouTube establish a fund for a "medical monitoring program" for the prevention, screening, diagnosis, and treatment of content moderators who are similarly exposed to harmful content. The complaint is brought as a class-action lawsuit.


The lawsuit details a three-week training that new content moderators go through, which the plaintiff claims involves little to no resilience training or wellness resources.

Also according to the complaint, once they'd passed training and stepped into the roles of content moderators, YouTube "imposed strict quantity and accuracy quotas on content moderators," which could mean reviewing between 100 and 400 pieces of content per day with an error rate of two to five percent; if those quotas weren't met, moderators were threatened with termination.

YouTube is "chronically understaffed" due to high turnover in moderation roles, the lawsuit claims, and remaining moderators are forced to pick up the slack. The lawsuit alleges that YouTube has "best practices" that include limiting moderation to four hours a day—but that employees spend more than that to get backlogs of content reduced, and that the platform failed to implement its own safety measures. 

The lawsuit points to Microsoft's policies for altering content before moderators see it to make it less traumatic: in 2017, Microsoft said it limits the harm and stress moderators experience by blurring photos and rendering them in black and white, without audio, or with smaller thumbnails.

"YouTube controls how the videos are displayed (e.g., full screen versus thumbnails, blurred versus unblurred, etc.), how the accompanying audio is broadcast, and whether videos begin automatically upon completion of the prior video or whether the Content Moderator can catch his or her breath by controlling the start of the ensuing video," the lawsuit says. 


In 2017, when requests and comments were posted to YouTube's internal reporting system to implement safety changes such as blur tools and warning labels for ultra-graphic violence, an engineer allegedly responded that it would be a quick fix. "Suzanne French—Head of Global Vendor Operations at YouTube—commented that this tooling change was not a high priority and refused to implement the change," the lawsuit claims.

Unfortunately, this lawsuit doesn't reveal new information about the harm that companies like Google, Facebook, and YouTube inflict on content moderators in their offices around the world, workers who are often contractors without the same health benefits or pay as direct employees of these multi-billion-dollar tech companies.

In 2018, a woman in California sued Facebook, claiming that she developed post-traumatic stress disorder as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace” in her role as a contract content moderator. In 2019, contracted content moderators in Facebook's Dublin office brought their case against the social media platform to Ireland’s High Court, similarly citing exposure to child sexual abuse materials and brutal murders.

These labor abuses have been documented at YouTube as well: a 2019 investigation by The Verge revealed that many content moderators at the company suffered from mental health issues similar to those the plaintiff in this week's complaint describes: depression, anxiety, night terrors, and insomnia. Four days after that report, YouTube started requiring its content moderators to sign a document acknowledging that performing the job can cause PTSD, the lawsuit states.

But the complaint does reveal a system that neglects to help workers even when they speak out through internal channels about what they're enduring, and it claims to show YouTube leadership ignoring their pleas for better safety measures.