It is looking increasingly like TikTok, the massively popular video-based social media application owned by the Chinese firm ByteDance, will be “banned” in some manner in the United States, or sold at a fire sale price to a U.S.-based company (likely Microsoft). This comes after a month of speculation and vague statements from President Trump and members of his administration about the app.
Let's be clear: given Trump's overwhelming vanity, combined with his lack of sophistication on economic and technological issues, it is overwhelmingly likely that this decision is more about avoiding personal embarrassment or scoring petty points, rather than legitimate diplomatic concerns.
But that doesn't mean that there aren't actually meaningful concerns with the governance of an emergent, hugely popular social media network. After all, the TikTok affair came just as four major tech CEOs were called before Congress to testify about antitrust concerns (notably, Microsoft was absent). Virtually every centralized corporate tech platform, and certainly those represented in that hearing, has weathered a series of scandals about privacy and data usage, or algorithmic bias or discrimination, or content moderation, censorship, and user radicalization.
Granting that the U.S. government either “banning” a social media platform or forcing its sale—while collecting some kind of finder's fee (!)—is an absurd outcome, we can still recognize that there are three or so constellations of TikTok issues that merit discussion, none of which are likely to be solved by transferring it to an American tech giant. First, there's the fact that teens use the service to dunk on elected officials and, perhaps most relevant to this round of news, coordinate registration for political rallies to inflate expectations of attendance. Second, there are questions of the software itself harvesting sensitive user data. Finally, there's the issue of how content is presented to users, and what is algorithmically offered up to them.
Trump can't solve the first problem by banning TikTok, or forcing it to change hands. When it comes to teens dunking on buffoonish authority figures, life finds a way. Even in countries with strict laws against insulting political leaders, there is a long history of circumventing that censorship. Notably, for instance, Xi Jinping has not been able to rid the Chinese internet of comparisons to Winnie the Pooh.
It's clear that the latter two problems aren't unique to TikTok, either. The app may collect too much data about users, but that's true of every other app users are likely to have on their phones. That data may end up in government hands, but we've known since at least the earliest Snowden revelations in 2013 that data stored with major American tech companies was also vulnerable to government capture. And while it's possible that TikTok could subtly shape its users' feeds to push some secret agenda, we also know that YouTube's and Facebook's algorithms have been doing the same, intentionally or otherwise, for years. TikTok may censor some valuable speech or cut users off without due process or a clear appeal, but so does Amazon.
Ultimately, arguments that TikTok is “worse” than the major U.S.-based social media networks assume that users are at the mercy of tech firms no matter what. The only question is whether the invisible hands shaping the code you run and the content you can see are based in San Francisco or Beijing.
That's too limited a view. Once you realize that TikTok suffers from the same kinds of problems as the other social media platforms (along with a dash of presidential ego-bruising and a scoop of xenophobia), it's clear that a real solution lies not in banning the software or transferring its ownership to Microsoft, but in more broadly reevaluating the relationship between social media users and the platforms we create.
There are frameworks for this reevaluation. The field of free software, for example, has focused on user empowerment and freedom for decades now. Applying free software principles to social networks is not easy—and lord knows, it's been tried a few times—but until it takes root, we'll face the same problems over and over, whether the platforms are domestic or not.
Similarly, people who have spent years researching the drawbacks and limitations of YouTube's and Facebook's algorithms have outlined principles of transparency that could be applied there. Algorithmic transparency could empower users and auditors to determine whether content feeds are pushing people toward state propaganda, political candidates, or right-wing radicalization, and would be an important step forward for all of our social media sites.
When you strip away the vague invocations of “the Chinese” and a general distaste for Gen Z politics, the criticisms of TikTok that remain are the ones that apply to Facebook, to YouTube, to Twitter, even to Amazon and Google Search and others. The way out is not by changing the name of the service or the country of its operator, but by empowering users to avoid that kind of platform subjugation in the first place.