
Patreon Denies Viral TikTok Accusations That It Hosts Child Sexual Abuse Material

Videos and tweets went viral this week that insinuated that Patreon laid off its security team to cover up illegal activity.

Patreon has released a public statement denying viral and unproven allegations on social media that its platform is hosting child sexual abuse material.

The allegations are based on a handful of Patreon accounts that sold pictures of what appear to be minors in bathing suits. One viral TikTok, which called on the Department of Justice to investigate Patreon, suggested that the company knowingly allowed this material to stay on its platform, citing a Glassdoor review that Patreon says is fake and recent news of layoffs on Patreon's security team.


What social media users appear to have spotted on Patreon is a mix of "family vloggers" who shared pictures of their kids and an adult "age play" account, meaning an adult model who presents as a minor. The latter creator was banned from Patreon over the summer, and has also been banned from YouTube and Instagram.

While the accusations against Patreon started circulating on TikTok in March, they went viral again this week after the company laid off five members of its security team. 

In August, someone claiming to be on the company's Trust and Safety team posted a Glassdoor review for Patreon claiming that the company is ignoring reports of accounts "selling lewd photographs" of children. Following the layoffs last week, a rumor began spreading through viral posts on Twitter and TikTok that the layoffs were tied to the claims in the Glassdoor review.

“Patreon has zero tolerance for the sexualization of children or teenagers and will remove any user and all associated content found to be in violation of this policy,” Ellen Satterwhite, US Policy Head at Patreon, told Motherboard.

Patreon also denied the claims in a blog post on Tuesday: “Dangerous and conspiratorial disinformation began circulating on social media recently, alleging that Patreon has hosted child sexual abuse material (CSAM),” the statement says. “We want to let all of our creators and patrons know that these claims are unequivocally false and set the record straight.”


Patreon says the Glassdoor review is fraudulent; in the company's official Discord, a Patreon staffer called it "fake" and said that "Patreon is working to have it corrected."

Regarding the layoffs, a Patreon spokesperson told Motherboard that “a majority of Patreon’s internal engineers working on security were not laid off.”

Several TikTok videos about these claims went viral this week, showing specific (but blurred-out) Patreon accounts that featured seemingly young girls in bikinis and doing "modeling." One of these accounts, which Motherboard viewed, sold photosets featuring what appeared to be a young girl posing in swimsuits; the account's content was labeled as 18+. Patreon requires documentation from 18+ creators that the account holder and everyone in the content is over 18. Patreon removed that account this summer for repeatedly violating its rule against sexualizing children by representing oneself as under the age of 18.

Motherboard’s reporting found that the content creator behind that account was banned across multiple social media platforms, including Instagram and YouTube. Age play is a genre of porn in which consenting adults pretend to be ages they aren’t, often by roleplaying as a young child or infantilizing oneself or one’s sexual partners. Many porn sites, including OnlyFans, don’t allow users to upload age play content or pretend to be children, even if they’re adults roleplaying as underage. Moderating age play content is often too risky for platforms to even attempt; keeping minors off platforms is of utmost importance to any legitimate adult website.


Another account featured in viral TikToks accusing Patreon of hosting content that exploits children shows a young girl in swimsuits and sells photosets of the girl at the beach. Patreon does allow content featuring minors that is not intentionally sexualized or distributed for sexual gratification, and minors are allowed on the platform if they have consent from an adult. This account operates a “family channel” on YouTube, part of a common genre of vloggers who document things like family vacations and the mundane parts of everyday life, children included. Much has been written about the generally exploitative nature of family channels over the years, and the genre remains controversial.

Content about “human trafficking” frequently goes viral on TikTok, especially when it concerns children. Conspiracy theories—about being “almost trafficked in a Target parking lot” or traffickers leaving kids outside as bait, or that furniture companies are shipping children in dressers—are endless on the platform, and frequently go massively viral. When it’s misplaced, trafficking panic not only spreads hysteria and misinformation that directs attention and resources away from actual victims of abuse, but is often used by conservative groups that want to wipe sexual content from the internet entirely—risking demolishing platforms and income sources for adult creators, and exposing more people to more financial precarity and exploitation.

The Glassdoor review “has led to a conspiracy that Patreon knowingly hosts illegal and child-exploitative material,” Patreon said in its statement. “First, let us be crystal clear: Patreon has zero tolerance for the sexualization of children or teenagers. We strive to keep our community safe on all fronts. We unequivocally forbid creators from funding content dedicated to non-consensual or illegal sexual themes and regularly review creators’ accounts to ensure creators behind adult campaigns are over the age of 18.” The company goes on to say it works with several organizations that combat child abuse, including the National Center for Missing and Exploited Children.

Patreon also noted that the security team and the Trust & Safety team are separate and are being conflated in these allegations; the security team “focuses on ensuring the safety of things like user and payment data on the platform,” while the Trust & Safety team monitors for illegal content and compliance with the community guidelines.

Nudity is allowed on Patreon, but pornography, which the platform defines as “real people engaging in sexual acts such as masturbation or sexual intercourse on camera,” is not. If someone represents themselves as a minor on Patreon but posts sexually explicit content on other platforms like Instagram or OnlyFans, Patreon issues warnings and eventually removes their account. Patreon has a history of both cracking down hard on NSFW creators and letting abuses like harassment and stalking imagery remain on the platform for too long.