Clubhouse Rooms: Legal Pitfalls To Avoid in ‘Safe Spaces’

On June 9, 2021, actor Kusha Kapila shared stories of the rape threats she received after an article allegedly twisted the narrative of a stray comment on consensual ‘hate-sex’ that had been illegally recorded on Clubhouse (CH).

In March, TMZ published a recording of a CH chat in which comedian Tiffany Haddish disapproved of Nicki Minaj’s apparently disrespectful behaviour.

One hopes CH users are now cautious about the ‘safe space’ for free speech that CH seeks to espouse.

For those living under a rock, CH, the invite-only, audio-only social network app, has taken the internet by storm. From Oprah Winfrey and Elon Musk to Anupam Kher and Tanmay Bhat, the who’s who of Blue Town have flocked to this new ‘it’ app. The app’s recent release on Android led to an explosion in the number of users in India. Name any topical genre, and you will find an influencer moderating an engrossed group on CH discussing it. Discussion rooms range from literature, music, politics, sports, science and history to the bad habits of mothers-in-law and boss-haters. There was even a group on why tea is better than Bournvita.

Users can join any room of their choice, listen silently to the ongoing live discussions, or add their own two cents to the conversation with permission from the moderator of the room. Users can effortlessly switch between discussion rooms, or even create their own. The popularity of Clubhouse has to do with its adoption of conversation, and rejection of photographs, as the medium of communication. It allows users to express themselves freely and connect with like-minded folks, and in the COVID-19-induced lockdown, CH rooms have substituted for interpersonal interactions. Users freely discuss topics such as the persecution of minorities, the rise of militant nationalism and OTT censorship – discussions which would typically be deleted or trolled heavily on other platforms.

While CH broadcasts are a refreshing breath of candidness, they aren’t liability-free. Since CH is arguably the first popular social app of an ‘audio’ nature, it is important to highlight the legal risks its users, especially group moderators, expose themselves to.

Civil rights actions 

Even though CH does not permit recording of conversations without the express consent of all speakers in the room, it would not take a genius to record them on a third-party app. Therefore, users should not think that they are immune from defamation-related laws just because they are making purportedly ephemeral audio statements on the app. They have to be conscious of the digital footprint left behind, which could be used as evidence in legal actions for compensation if someone perceives their statements to be defamatory.

In addition to unsanctioned recordings, users should also note that CH itself makes internal recordings of all voice communications as per its policy. It retains the audio if a participant in a room reports a “Trust and Safety violation”, for subsequent investigation and action. If the room ends without incident, the recording is deleted. Many articles have reported privacy concerns over this policy, which reads:

“Solely for the purpose of supporting incident investigations, we temporarily record the audio in a room while the room is live. If a user reports a Trust and Safety violation while the room is active, we retain the audio for the purposes of investigating the incident, and then delete it when the investigation is complete. If no incident is reported in a room, we delete the temporary audio recording when the room ends.”

Therefore, the ‘anything goes’ attitude rampant in CH rooms will have to be tempered with caution. In fact, unscrupulous parties can easily record a CH conversation to distort the context of the communication. This would leave the moderators in the lurch, with no recording of their own to refute any malicious insinuation. Kusha Kapila is unfortunately only the first victim of such hate attacks.

Hurting religious sentiments

Indians are no strangers to hyperactive, mala fide complaints under the garb of Section 295A IPC (hurting religious sentiments) or Section 124A (sedition). Rooms discussing sensitive topics need to be wary that the statements exchanged in the group aren’t considered seditious, racist or even casteist. Culpability won’t be restricted to the creator/speaker of the alleged criminal content, for the complaint can be twisted to include the whole CH room engaged in the discussion under Section 120A IPC (conspiracy). Thailand, in fact, has issued warnings of criminal action to dissidents who are flocking to discuss important issues on CH.

The Hindu has already reported that at least a dozen Central agencies – including the Intelligence Bureau, Research and Analysis Wing, National Investigation Agency, Enforcement Directorate, Central Bureau of Investigation and Narcotics Control Bureau – have been authorised to track discussions in CH rooms. Many of these agencies are empowered to monitor communications under Section 69(1) of the IT Act, 2000, read with the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009, and users should therefore be vigilant of the prying ear of the authorities.

Illegal use

Users also have to stay vigilant that they do not end up being part of CH rooms that indulge in distorting information, spreading fake news or selling illegal items. CH has already faced criticism in the US over reports of COVID-19 misinformation, white supremacist rooms and misogyny, despite its rules against hate speech, abuse and false information. The abuse of a platform like CH for the repeated circulation of provocative or motivated content is a matter of deep concern. Audio messages pose an entirely different set of challenges for law enforcement, as they are harder to track and act on; private CH groups therefore have enormous potential for misuse.

Further, CH, being an invite-only app, will attract unscrupulous sellers who will sell their invitation quota to others. Twitter and Instagram are already tackling the growing evil of selling blue ticks and followers. Considering that the name of the person who invites you is eternally engraved on your profile, users should be cautious while accepting invitations from unknown entities.

Breach of privacy

Considering CH’s audio recording policy, it is clear that the audio conversations are not strictly end-to-end encrypted. Further, CH uses information from your contact list to build profiles of non-members, which is evident from the way it lists out your contacts in the order of the number of friends each non-member has on CH. Therefore, people who have no interest in being part of CH invariably find their data in the hands of CH, with no mechanism to delete their personal information. In fact, CH users themselves can’t delete their accounts on the app without sending an email to a CH support address.

Liability of moderators

Recently, the Bombay High Court in Kishor v State of Maharashtra held that the admin of a WhatsApp group cannot be held liable if one of the members of the group uses filthy language. While a welcome decision, it has not prevented WhatsApp admins from being arrested for the acts of the members of their groups. For instance, in 2017, a WhatsApp group admin was arrested in Karnataka because one of the members of the group posted an ‘obscene and ugly’ image of the Prime Minister.

In the same vein, moderators of rooms on CH also open themselves to such coercive actions by a disgruntled listener. What may be worse is that a moderator can be seen as having a much greater influence on the discourse in a CH room than a WhatsApp group admin, as the moderator actively curates the conversation being exchanged in real time in the room. Therefore, it would not be surprising if moderators, like WhatsApp admins, receive unreasonable threats over random statements made in the room, due to the perception that they are more powerful than the listeners.

Status as an intermediary

Despite the seemingly draconian new IT Rules, social media apps like Facebook and Twitter still enjoy safe harbour by virtue of being intermediaries under the IT Act. But if CH’s policy permits it to record conversations in rooms for investigative purposes, it might be interpreted to mean that it isn’t just a passive carrier or simple conduit of conversation playing no active role, as envisaged under Section 2(1)(w) of the IT Act, 2000. CH might not be a ‘significant social media intermediary’ (SSMI) under the new IT Rules, but it might have a hard time claiming intermediary immunity under Section 79 of the IT Act, 2000 if it records a conversation to be an ‘arbiter of truth’, and consequently ‘judge and jury’, when it takes necessary action at the end of an internal investigation.

What might be worse is that CH may find itself on the receiving end of requests from government authorities for audio files relating to any complaint made against rooms or room moderators. If CH does not enjoy strict immunity as a neutral intermediary, the authorities might not even be required to adhere to the formal notice requirements under Section 79(3)(b) (takedown notice) read with Section 69(3) of the IT Act, 2000 (access or monitoring notice).

Violation of copyright laws

There are countless CH rooms celebrating music or hosting DJ nights. The moderators of these groups also admit on Instagram that CH is a business place and that they use CH to attract followers. These moderators should be wary that singing or using songs for commercial purposes without the necessary copyright permissions or declarations can invite severe penalties.


It appears that the world is assuming that CH has created a more robust sphere of free speech by removing the written word, and its permanence, from the equation. But the fact of the matter is that even if CH enables a shift from a ‘written’ to an ‘audio’ model of communication, the dangers that plague Twitter or Facebook will infect CH in ways far worse than the written model, especially because users will not have their own audio recording to defend themselves or explain the context in which they uttered a particular statement.

As such, if a claim of defamation, sedition or hurting religious sentiments is brought against a CH user, the consequences – financial, physical and psychological – could be extremely expensive for what they might have considered a throwaway comment in a transient environment. We all know that even if such claims ultimately don’t succeed, the harassment caused to the victim is victory enough for spiteful litigators.

Gourav Mohanty is a lawyer practicing in the Bombay High Court. He has five years of experience in dispute resolution, and is a gold medalist from Symbiosis Law School.

Featured image credit: Dmitry Mashkin/Unsplash