TikTok Users Are Inventing Wild Theories to Explain Its Mysterious Algorithm

No one knows how TikTok’s For You page algorithm works, so users have taken it upon themselves to construct their own theories.
Image: TikTok logo and illuminati symbol from Wikimedia Commons. 

Probably half of the videos I see on TikTok include one of the following hashtags: #fyp, #foryou, or #foryoupage.

The hashtaggers’ theory is that if they use these tags in their captions, their posts are more likely to surface on more people’s For You pages. The For You page is TikTok’s recommendation feed, which is personalized to each user based on how that user interacts with videos on TikTok, according to the company.

There’s no proof that using these hashtags does anything, but the trick looks like it works: because so many people use the tags, videos carrying them inevitably show up on people’s For You pages. Also, the top Google search result for “How to get on TikTok For You page” is a link to a blog that promotes this very method. (The blog offers other plausible but unverified tips, like using high-quality footage, trending footage, and trending hashtags.)

No one outside TikTok knows how its For You page algorithm works (like many social media recommendation algorithms, it's likely the company itself doesn't know all of its quirks). The company has repeatedly declined to answer Motherboard’s specific questions about it, including questions asked for this article.

Without verifiable information from TikTok, users are aggressively postulating their own theories about how the For You page actually works. Speculation about the For You page has become so prevalent that it has practically achieved meme status on the platform. If users aren’t theorizing about it, they’re making irreverent jokes about it.

One of the most popular theories on TikTok about how the For You page works is the “batch” theory. The idea is that TikTok shows each post to a small batch of users, and that the ratio of likes to views, or engagement to views, determines whether that post gets released to a larger batch of users, and so on with each successive batch. Videos promoting this theory have gotten 46k and 17k likes, respectively.
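
The theory describes a simple feedback loop, and a rough sketch makes the claimed logic easier to follow. Everything below is invented for illustration: the 10 percent like-to-view threshold, the example batch sizes, and the function name are assumptions, not anything TikTok has confirmed.

```python
# Purely illustrative sketch of the user-theorized "batch" system.
# The threshold and batch sizes are invented; TikTok has not confirmed
# that anything like this exists.

ENGAGEMENT_THRESHOLD = 0.10  # assumed cutoff: 10% of viewers like the post

def batches_cleared(likes_per_batch, views_per_batch):
    """Count how many hypothetical batches a video would clear, given the
    likes and views it earned in each batch, under the theory."""
    cleared = 0
    for likes, views in zip(likes_per_batch, views_per_batch):
        ratio = likes / views if views else 0.0
        if ratio < ENGAGEMENT_THRESHOLD:
            break  # theory: weak engagement stops the video from spreading
        cleared += 1
    return cleared

# A video liked by 30 of its first 200 viewers (15%) would clear batch one;
# if only 50 of the next 2,000 viewers like it (2.5%), it would stall there.
print(batches_cleared([30, 50], [200, 2000]))  # -> 1
```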

Some users have tried to test this theory by asking viewers to comment with how many likes the video had when they saw it. The thinking goes that if like counts increase in spurts, it’s possible that videos are shown in batches. But the results from this “test” are inconclusive, and there’s no way to prove the theory without confirmation from TikTok.

According to the TikTok listing in the iOS App Store, a user’s For You page is based on an unclear mix of engagement metrics.

“A personalized video feed specifically for you based on what you watch, like, and share,” the page reads. “TikTok will quickly adapt to your taste to offer the most relevant, interesting, fun, quirky, head-turning videos that you’ll never want to stop watching.”

We don’t know which engagement metrics (re-watches, likes, or shares) have a more sizable influence on people’s For You pages than others. For instance, we don’t know whether likes matter more than re-watches.

Do you work at TikTok and have a tip about how For You recommendations work? Contact Caroline Haskins securely via email at caroline.haskins@vice.com or via Signal at +1 785-813-1084.

Becca Lewis, a digital culture researcher at the non-profit research organization Data and Society, told Motherboard in a phone call that when platforms aren't transparent about how content gets algorithmically surfaced, people start to look for patterns explaining why some things are seen and some things are not.

“It’s incredibly difficult to tell whether their observations are accurate or not, because it’s difficult from the outside to see what’s happening,” Lewis said. “That is one unintended consequence of these platforms keeping their algorithms so under wraps. You start to get these folk stories or urban legends about how these things work.”

Lewis added that there can be broader consequences when platforms don’t share this kind of information with users. “I think the lack of transparency around algorithmic surfacing of content can lead to a culture of distrust and even a conspiratorial culture,” Lewis said.

There are many more theories about how TikTok’s For You page works, some less sensible than others. For instance, one user proposed that TikTok assigns a score to each account based on how much engagement its first five posts get, and uses that score to determine how often future posts get surfaced on the For You page. There’s no evidence that this is true. From TikTok’s perspective, the theory also doesn’t make much sense, because such a system would dissuade people from staying on the platform and experimenting with content creation over the long term.

Other users have theorized that TikTok has no intelligent algorithm at all and simply surfaces content randomly to each user. One user says that even when her videos have a good “interaction per view” ratio (that is, the ratio of likes, comments, and shares to total views) they still top out with very few views.
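
The ratio she’s referring to is simple arithmetic, so a short sketch covers it. The function name and the example numbers below are made up for illustration; the formula is just what users describe, not an official TikTok metric.

```python
# The "interaction per view" ratio as TikTok users describe it:
# (likes + comments + shares) / views. Not an official TikTok metric.

def interaction_per_view(likes, comments, shares, views):
    return (likes + comments + shares) / views if views else 0.0

# 40 likes, 6 comments, and 4 shares on 250 views -> 0.2, a "good" ratio by
# the theory, yet users report such videos still topping out at low view counts.
print(interaction_per_view(40, 6, 4, 250))  # -> 0.2
```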

Some people see the opacity of the For You page algorithm as a call to action. For instance, one user asks his viewers if they also tap TikTok’s “share” button in order to try and make other people’s videos go viral on the app. The thinking is that by tapping share—even if users don’t actually share the video with a friend—users improve the engagement metrics of another video. It goes a step further than simply liking a video.

Another user told her viewers that TikTok users have a “duty” to like videos with only a handful of likes. She argues that the “batch” theory is correct, meaning that videos with high view counts are distributed to more people’s For You pages. For this reason, she says, it’s very difficult for videos with a small view count to get over the hump and reach a larger group of users.

“If you scroll past these little TikToks with like five likes, they’re probably going to die soon,” she says, with ironic melodrama. “So in conclusion, it is your civic duty as a member of the TikTok community to like and comment on this video, because if you don’t do it, no one else will.”

Doing this would surely affect one’s own For You page. However, it’s unclear how much it would affect other people’s For You pages.

Of course, TikTok isn’t the only platform with a secret content-surfacing algorithm. Lewis said that tech companies often keep these algorithms secret because they’re expensive proprietary information, and because more public information can empower media manipulators.

For example, if everyone had a list of attributes that move tweets to the top of people’s Twitter home feeds, everyone would just use those attributes.

“Since the early days of Google, for example, there’s always been a cat and mouse game where Google specifically tries to keep their search algorithm under wraps to a certain degree so it doesn’t get manipulated,” Lewis said. “But at the same time, you have people who want to appear highly in searches, and so there’s been the entire industry of search engine optimization that has emerged out of that.”

Some tech employees have resisted the principle of secrecy. YouTube’s union, for instance, is demanding more transparency about how the company operates. However, platforms like YouTube or TikTok don’t have an incentive to be transparent because maximizing engagement is a core part of their business models. Engagement often means people spend more time on a platform, and therefore, more time engaging with ads that make the company money. When platforms aren’t clear about what maximizes engagement, this money-making model stays safe.

And of course, all major algorithmically powered feeds look different to each user because they’re influenced by how people behave. Twitter home feeds, Facebook News Feeds, and Instagram home feeds display content in personalized orders, depending on how each user has behaved on the platform before. This isn’t unique to TikTok.

On one hand, there’s a positive side to personalized feeds: users are more likely to see things they want to see. But personalized feeds can also create echo chambers that don’t just subconsciously construct people’s realities, but push them toward more radical content.

In any case, when users ask “why does this video have 43 likes and my video has zero,” they aren’t unjustified.