It’s only been a day since Tumblr announced it was banning adult content on the platform, but already its automated content moderation algorithm has been hard at work flagging nudity and porn. The trouble is, the algorithm has a really broad idea of what “adult” content looks like, causing it to inadvertently flag vomiting unicorns, raw chicken, and boot cleaners as adult content.
On Monday, Tumblr announced that it was making changes to its community standards and, as of December 17, would no longer allow “adult content, including explicit sexual content and nudity (with some exceptions).” The decision was criticized because the number of sites that support the distribution of adult content is dwindling, and Tumblr was a safe haven for many in the online kink and queer communities.
But the algorithm deployed to start identifying the offending content has made many mistakes already. In response, users have been sharing some of the innocuous posts that were flagged using the hashtag #TooSexyForTumblr with some pretty ridiculous results.
Motherboard contacted Tumblr to find out more about what has gone wrong, but a spokesperson would only direct us to a Tumblr support post on adult content censoring and CEO Jeff D’Onofrio’s announcement of the changes to the site’s standards. The support post says that the flagging is done through “a mix of machine-learning classification and human moderation by our Trust & Safety team—the group of individuals who help moderate Tumblr,” adding that the site has been expanding that team due to the daunting task of wiping the site of any and all nudity and pornography.
In his blog post, D’Onofrio acknowledged that the algorithm was far from perfect and would certainly make mistakes in identifying adult content.
“Filtering this type of content versus say, a political protest with nudity or the statue of David, is not simple at scale,” D’Onofrio wrote. “We’re relying on automated tools to identify adult content and humans to help train and keep our systems in check. We know there will be mistakes, but we’ve done our best to create and enforce a policy that acknowledges the breadth of expression we see in the community.”
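The pipeline the support post and D’Onofrio describe—an automated classifier doing the first pass, with humans training it and catching its errors—can be sketched roughly as follows. This is an illustrative toy, not Tumblr’s actual system: the `Post` class, the score values, and both thresholds are invented for the example. The point it shows is that a score-based classifier must draw arbitrary cutoffs, and anything above the auto-flag line (a statue, raw chicken) gets hidden with no human ever looking at it.

```python
# Hypothetical sketch of a classifier-plus-human-review moderation pipeline.
# Nothing here reflects Tumblr's real thresholds or implementation.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    score: float  # classifier's estimated probability the image is adult content

FLAG_THRESHOLD = 0.8    # assumed: auto-flag above this score
REVIEW_THRESHOLD = 0.5  # assumed: queue for human review above this score

def triage(post: Post) -> str:
    """Route a post based on its classifier score."""
    if post.score >= FLAG_THRESHOLD:
        return "flagged"        # hidden automatically, without human review
    if post.score >= REVIEW_THRESHOLD:
        return "human_review"   # sent to the Trust & Safety queue
    return "allowed"

# An image of a marble statue or raw chicken can score high on skin-tone
# and shape features, landing in "flagged" -- exactly the false positives
# users have been posting under #TooSexyForTumblr.
print(triage(Post(1, 0.92)))  # flagged
print(triage(Post(2, 0.61)))  # human_review
print(triage(Post(3, 0.10)))  # allowed
```

The trade-off is baked into the thresholds: lowering them catches more real adult content but floods the human queue and auto-flags more innocuous posts; raising them lets more porn through. “Not simple at scale,” as D’Onofrio puts it.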
Still, many users expressed surprise at what the algorithm has flagged, since it often had little or nothing to do with sex, nudity, or other adult themes. Jillian York, the Director for International Freedom of Expression at the Electronic Frontier Foundation, said she’s seen similar bungled rollouts of automated moderation before.
“Microsoft Bing, when it first launched, used such a blunt tool that terms like ‘chicken breast’ and ‘breast cancer’ were filtered out in SafeSearch,” York wrote in an email. “Tumblr's announcement is unique in that this is perhaps the first time a company has relied fully on automated tools, and I think that these mishaps demonstrate how not-ready-for-primetime such tools are.”
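The kind of blunt filtering York describes is easy to reproduce: a naive substring blocklist flags any text that merely *contains* a blocked term, with no sense of context. The blocklist terms and function below are invented for illustration; they are not drawn from any real product’s filter.

```python
# Toy demonstration of a blunt substring blocklist, the failure mode York
# attributes to early Bing SafeSearch. Terms are illustrative only.

BLOCKLIST = {"breast", "nude"}

def is_flagged(text: str) -> bool:
    """Return True if any blocked term appears anywhere in the text."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

print(is_flagged("chicken breast recipe"))    # True  -- false positive
print(is_flagged("breast cancer awareness"))  # True  -- false positive
print(is_flagged("vegetable soup recipe"))    # False
```

Because the check ignores context entirely, medical and culinary phrases are filtered alongside genuinely adult content—the same category of error Tumblr’s image classifier is now making with pictures.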
I guess Tumblr’s algorithm is even more of a prude than the platform itself.
This article originally appeared on Motherboard.