- As Clubhouse downloads doubled in February, experts asked how it planned to moderate content.
- Users have hosted rooms questioning coronavirus vaccines, the Holocaust, and other topics.
- A year after its launch, the invite-only app has more than 10 million weekly users.
- Visit the Business section of Insider for more stories.
In a recent Clubhouse discussion about COVID-19 vaccines, a woman digitally raised her hand, entered the conversation, and spoke at length about how the virus could be treated more effectively with herbal and natural remedies than with vaccines.
She told dozens of listeners: “A pharmaceutical company is an industry, a business, just like anything else and everyone else, who is devoted specifically and exclusively to making sure their shareholders have profits, quarter over quarter. It is not about your health. It is not about your wellness.”
Clubhouse, which launched as an invite-only app last March, has in recent weeks surged in popularity to become one of the world’s most-downloaded iPhone apps. As of March 1, it had been downloaded about 11.4 million times, according to App Annie, a mobile data tracker. That was up from just 3.5 million a month earlier.
The company said in late February that it had more than 10 million active users each week.
As its growth skyrocketed this year, some technologists and academics began asking questions about how it moderates conversations. Outsiders were wondering about bots and the spread of misinformation – the same types of questions that have long been asked about Facebook, Twitter, and other social networks.
While vaccine discussions on Clubhouse may simply go against company guidelines – along with those from the Centers for Disease Control and Prevention – other conversations were more incendiary.
In one high-profile instance, a Twitter user shared a screenshot of a Clubhouse room called: “Were 6 million Jews really killed?” After users reported that room, the company said on Twitter: “This has no place on Clubhouse. Actions have been taken. We unequivocally condemn Anti-Semitism and all other forms of racism and hate speech.”
But some observers questioned whether less inflammatory misinformation had slipped through the cracks.
“Thus far, the creators of the app have been less concerned with misinformation, and more so with the growing number of users on the platform,” said Heinrich Long, a privacy expert at Restore Privacy.
By design, Clubhouse encourages users to explore, and jump in and out of discussions. At any given moment, there are hundreds or thousands of conversations in many different languages, making moderation a daunting task.
The company has been building a Trust & Safety team for the past year, growing it alongside the platform. As of Saturday, it had two public job postings for that team on its website.
Clubhouse declined an interview request for this story, but a spokesperson sent a statement saying “racism, hate speech and abuse are prohibited on Clubhouse.” Such speech would violate the company’s guidelines and terms.
“The spreading or sharing of misinformation is strictly prohibited on Clubhouse. Clubhouse strongly encourages people to report any violations of our Terms of Service or Community Guidelines,” the spokesperson said via email.
They added: “If it is determined that a violation has taken place, Clubhouse may warn, suspend, or remove the user from the platform, based on the severity of the violation.”
Everything said on Clubhouse is recorded while a conversation is live, according to the app’s guidelines. During the discussion, the company keeps that recording encrypted. Once a conversation ends, the recording is destroyed. The only time a recording is kept longer is when a listener flags the conversation to the company.
That moderation model is similar to the one used by Reddit, which largely relies on crowdsourced moderation, said Paul Bischoff, a privacy advocate at Comparitech. Unlike text-based Reddit, however, Clubhouse keeps no permanent record of its audio interactions.
“That could lead to insulated echo chambers where misinformation is amplified without any outside viewpoints,” Bischoff said. “The live-ness could prevent people from being able to report bad behavior on the app, but it could also stem the spread of misinformation beyond the app.”
In the conversation about vaccines, for example, one user asked the woman touting herbal COVID-19 remedies to share her contact information, so listeners could reach out offline to learn more about why she believed vaccines weren’t the best solution for the coronavirus.
There’s also a question of how bots or large groups of coordinated users could affect conversations on the app, said Sam Crowther, founder and chief executive at Kasada, a company that identifies bot activity.
Crowther said he’s already seen some chatter on bot-related message boards about how Clubhouse could be exploited.
“One of the underlying truths with internet businesses is that if you build it, they’ll make a bot to exploit it,” Crowther said, adding, “Removing fake accounts after they’re live is too late – companies need to take control and seize bad bots at registration.”
So how can Clubhouse effectively moderate thousands of conversations between millions of users, many of whom are speaking local languages?
Like Facebook and other social networks, Clubhouse would do best with some form of artificial intelligence or voice pattern recognition system, said Stephen Hunnewell, executive director at the ADALA Project, a nonprofit that advocates for free speech around the world.
But, Hunnewell said, the real danger of audio conversations is that the content can’t be unheard.
Take the conversation about curing COVID-19 with herbal remedies. The dozens of people listening had already digested the information. Even if the conversation were flagged in real time, Clubhouse couldn’t guarantee that false information wouldn’t be spread further by those who had already heard it.
“The real danger is in the cross-pollination that seed has planted within whatever audience heard it and their further amplification,” Hunnewell said.
With a new platform like Clubhouse, which has scaled to millions of users in a short space of time, every new user counts, said Nir Kshetri, a professor at the University of North Carolina-Greensboro. That’s why a young company like Clubhouse may choose to prioritize growth over everything else.
Kshetri compared Clubhouse with bigger competitors, like Microsoft, which owns LinkedIn. Microsoft has been around for decades and employs some 3,500 experts focused on cybercrime, artificial intelligence, and machine learning, he said.
For a small company like Clubhouse, it may take years to build similarly robust misinformation-tracking systems. In the end, it’s a decision for management more than a technical hurdle, he added.
“The question of whether social network sites should play the role of gatekeeper for the news and information their users consume is more philosophical than technological,” Kshetri said.
Even now, some users are fighting back against what they see as misinformation on Clubhouse. In the chat about vaccines, where a woman spoke in favor of herbal remedies for COVID-19, a doctor was responding in real time to claims made in the room. A few times during the hourslong conversation, he popped in to express his opinions.
“I agree with some of what you’re saying, but I don’t agree with all of it,” he said, before finally exiting the room.