TikTok Is Removing Educational Hacking Videos

Multiple creators showed that TikTok is particularly aggressive at removing videos that are designed to educate viewers about hacking.
Image: art-skvortsova, remix by Emily Lipstein

"Okay, it looks like we found a match for admin," Serena DiPenti said in one TikTok video. DiPenti, who goes by shenetworks on the platform, creates educational hacking content to raise awareness of certain cybersecurity issues and to get a younger audience interested in security.

"And we're in," DiPenti's TikTok continues. She's demoing a tool called Hydra that churns through a list of words and then uses them to try to log into a target website. In this case, the site isn't a real target; DiPenti is using tryhackme.com, a service that provides controlled environments for people to test their hacking abilities.

TikTok, however, removed the video under its harassment and bullying policy. TikTok then removed a second, similar clip that DiPenti uploaded with a more explicit educational disclaimer, and temporarily banned her from posting. For comparison, DiPenti told Motherboard that Instagram did not remove the original clip when she uploaded it as a Reel.

There are plenty of reasons someone may want to educate themselves about hacking techniques. Perhaps they're interested in becoming a penetration tester, someone who probes a company's systems to find vulnerabilities so they can be fixed. Or perhaps they want to better understand how hackers actually work so they can protect themselves. The idea that all hacking is done by nefarious people is antiquated at this point; the challenge for social networks is effectively and reliably enforcing against material that is designed to cause harm rather than to educate.

"I think if people aren’t aware of the potential security threats they face, they have no way of protecting themselves," DiPenti told Motherboard in an online chat. "TikTok removing these videos and punishing the security creators who make them is pretty bad. It’s also very discouraging. They’re going to lose the creators altogether when people get fed up from taking the time to create all this content for it to get removed immediately."

Do you have access to internal TikTok moderation documents? We'd love to hear from you. Using a non-work phone or computer, you can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, or email joseph.cox@vice.com.

DiPenti is not the only one. TikTok has removed content from a range of other security creators too, highlighting the grey area this sort of material occupies on the platform. Some hacking content on TikTok will be maliciously driven, but here, the creators are focused on education.

"It's really anyone that posts security content at all," DiPenti said.

Jonathan Boring, a creator on TikTok and other platforms, told Motherboard, "The videos with ANYTHING close[ly] related to hacking or cybersecurity get taken down."

"Almost always," he added.

Boring said the content is often restored. But that still creates a roadblock: creators have to appeal the decisions, and they face the prospect of being banned if too many of their videos are deemed to have violated the site's community guidelines. A suspension lasts three days, and an appeal can take over 48 hours, Boring said.

Boring suggested he has ended up self-censoring in an effort to avoid triggering enforcement from TikTok.

"I have to avoid words like 'hack,' 'hacking,' 'hacker,' etc," he said. He added that he can't do any videos on the HackRF Portapack H2, an add-on for the HackRF software defined radio product.

"The name in itself gets the video removed. So I've just called it Portapack H2," he added.

Marcus Hutchins, better known as MalwareTech and for his role in stopping the WannaCry ransomware attack, said TikTok has also removed his videos.

"A lot of us had the same issues on the early days of YouTube and other platforms. Seems to be a consistent problem when platforms are newer," he told Motherboard. "I think the over moderation could lead to the same issues it does everywhere: making the platform unusable."

TikTok told Motherboard that while the platform does not allow criminal activities or harassment, it does make exceptions for some educational content. When Motherboard sent an example of one of DiPenti's removed videos, TikTok told Motherboard the video had been removed in error and restored it.

A day later, TikTok removed it, again.

"TikTok removed the same video again. And I'm unable to appeal it," DiPenti told Motherboard.
