Clubhouse – the fast-growing audio-only social media app – came along at just the right time. Ever since the pandemic forced most of us indoors, the chance it offers to talk with complete strangers has made it increasingly popular. But the platform now faces one of the defining challenges of modern times: misinformation.
Right now, the app is still in beta – it's invite-only and available only to iOS users, but that hasn't stopped people joining the waiting list or begging for invites. Celebrities like Drake, Oprah Winfrey and Kevin Hart have even popped up on the app, offering people the rare chance of digital proximity to the wealthy and famous.
But users are also warning that outlandish coronavirus conspiracy theories – everything from false rumours that the vaccine is made from fetal cells to 5G satellites controlling people through social distancing – are spreading fast on the app. Those who call out these false claims can even face harassment and abuse.
On 12th January, Chakabars Clarke – an entrepreneur and charity founder with a million Instagram followers – embarked on a heated debate with doctors and healthcare professionals about COVID-19 vaccines.
“Who paid for the research?” Clarke asked in a Clubhouse discussion, in audio recordings obtained by VICE. “Who owns the media platforms that that information is spread upon? And who is creating the curriculum to train the doctors that are telling us what we are supposed to put in our bodies and what we are not supposed to put in our bodies?”
He continued: “I’m being presented information by a particular ethnic group, mainly Europeans and all of that information comes from that group and all of the research and all of the funding and all of the development and financial benefit.”
In fact, people of colour from all over the world have been part of the vaccine research effort. The Pfizer-BioNTech vaccine was spearheaded by BioNTech founders Dr Ugur Sahin and Dr Özlem Türeci – two scientists of Turkish heritage – and Dr Kizzmekia Corbett, a 34-year-old Black female scientist, is at the forefront of the Moderna vaccine's development. The UK's very own Deputy Chief Medical Officer Jonathan Van-Tam is of Vietnamese descent – his grandfather, Nguyễn Văn Tâm, was briefly prime minister of South Vietnam.
The conversation on Clubhouse soon escalated and turned personal. While people were mass-reporting Clarke for allegedly discouraging people from getting vaccinated, he believed he was the one being bullied. Clarke and those defending him, including comedian Tiffany Haddish, were accused of harassing medical professionals trying to correct the misinformation. Haddish denied that the exchange amounted to bullying, later tweeting: "Now people on Clubhouse saying I am bullying because I just told the truth."
Clarke has also rejected these claims, telling VICE that his words were misrepresented: “The Clubhouse room about the COVID vaccine was problematic because I wasn’t speaking on the COVID vaccine, I was speaking about the harm that pharmaceutical companies had created in Africa. It got lumped in with that – I’m not sure why the conversation of misinformation, or information is focused so heavily around the personal opinions of celebrities.”
This incident wasn’t the first time the app has been criticised for misinformation. There have been previous reports of anti-Semitic conspiracy theories, false claims about health conditions, hate speech towards the LGBTQ community and many others.
Chante Joseph, a 24-year-old journalist and presenter who has written about COVID-19 conspiracy theories for VICE, has been on Clubhouse since November 2020. “I do hear people say how the vaccine has dead foetus cells in it or that the government is trying to make us stand two metres away from each other because they want to use satellites to control us,” she tells me.
She worries that the app hasn’t been taking misinformation seriously. “There's just no written evidence there. Everything is said and unless you screen record it, you can't really go back over things.”
Clubhouse user and journalist Nicolas-Tyrell Scott has also spotted misinformation about COVID-19 and vaccinations. “There’s just so much revisionism and false equivalence,” he says, “but because it’s real time audio, it’s a completely different and unique experience. Fallacies can be spread at a much faster rate without people really processing what’s going on because they are trying to keep up.”
“One of the jokes that people say is that on Clubhouse, you can hear people speaking the typos,” says Jere, a law student from London, who has also been on the platform since mid-November. “British musicians and celebrities on Clubhouse will often try to address and tackle subjects beyond music. So with that, [they’ll dissect topics like] misogyny, queerness or race relations within the Black community. But some of the stuff said – you can’t even just call that misinformation, man, people just be outright lying.”
Jere adds that controversial conversations often end up feeling like a public performance that draws outsiders in to spectate. "Some people in the audience are listening for the spectacle and for the back and forth, but some people get their very real and genuine fears affirmed." To make matters worse, the most popular rooms appear to be pushed onto a user's homepage based on whether those they follow are interacting with them.
Clubhouse faces a unique battle by virtue of its design. The app allows users to join live chat rooms that simply disappear when they end. Those who start the rooms are moderators by default and can nominate others to oversee the panel. People must send a request to mods in order to speak – everyone else in the room is a passive listener – and there are currently no internal Clubhouse-employed moderators.
Unlike Twitter, Facebook or Instagram, where users leave a digital footprint in the form of text, images or videos, the conversation is wiped once a room closes, making it almost impossible to hold people accountable for their words. Recording the chat rooms is against the guidelines and can get you kicked off the app. With individual rooms currently capped at a maximum capacity of 5,000 people, the debates often get messy.
“Reddit is a text-based medium and it had a real devil of a time trying to moderate things that are asynchronous [not occurring in real time], while Clubhouse would have to moderate things that are synchronous,” says Dr Bernie Hogan, a senior research fellow at Oxford University’s Internet Institute.
He adds that the conflation of the speaker and the moderator is an issue that doesn't necessarily happen on other social media platforms. “It’s a lot of responsibility, but also really ensnares someone in such a way that makes it really hard to have a conversation. Using audio in this way does not seem like it’s going to be well set up for [negotiating] conflict.”
Hogan also believes the app reinforces what he describes as the “fast brain”, an emotional state where people react immediately and don’t use logic as much. Just like YouTube, Clubhouse amplifies the delivery of a message more than its content, reinforcing the words of those with emotive and persuasive delivery, rather than those whose content is logical, fair and factual.
“With audio, you can really communicate a lot of emotion and expression, but in doing so, it doesn't really lend itself to the same level of reflection that might be relevant. This seems to me like a magnifier of a pulpit, and a pulpit is meant to persuade you through charisma. Frankly, I think that's dangerous,” he adds.
Most social media platforms have never properly grappled with audio as a medium, which may explain Clubhouse’s free-for-all approach to moderation. “We have live streaming, live-blogging but audio has stayed in the remit of podcasters and traditional radio makers,” says Laura Garcia, a journalist specialising in spotting disinformation at First Draft News. “Moderating live audio is hard because we haven’t got the same AI or machine learning structures or automation processes that we have already developed to moderate text content or even images and videos.”
According to Garcia, there is work being done to help monitor audio content, but it’s a difficult medium – people have different accents, speeds and can talk in a mixture of languages.
“We know that misinformation is really sticky,” she explains. “Our brains remember it easily, especially when we hear it over and over again. So if you tune into three or five rooms on Clubhouse and hear the same misleading information, in a fragmented consumption pattern, moderation after the fact doesn’t matter anymore and that’s one of the problems.”
When Clubhouse founders Paul Davison and Rohan Seth created the app, it is unlikely they saw the platform as a petri dish for rumour and misinformation. Though the company did not respond to repeated requests for comment, a spokesperson issued a statement just last month that said: "The company unequivocally condemns all forms of racism, hate speech, and abuse, as noted in our Community Guidelines and Terms of Service, and has trust and safety procedures in place to investigate and address any violation of these rules."
They added that Clubhouse also plans to prioritise and offer new moderation resources, though no further detail was provided on what this support entailed.
So who's at fault? How much responsibility can Clubhouse bear when misinformation exists outside of the platform, and the conversations that take place on it reflect wider prejudice and conspiratorial thinking? Nicolas-Tyrell believes one solution might be for Clubhouse to employ moderators who are specifically trained in de-escalating conflict.
But that still doesn’t solve the fundamental problem: “Misinformation is sexy, right?” Jere says. “It’s salacious, attractive and you can kind of compare it to things that are outside of the status quo. People feel like they’re speaking something that’s ‘different’, and people are attracted to that.”
Update 10/2/21: An earlier version of the illustration in this article used a logo from a different company also named Clubhouse. We regret the error.