India has introduced draconian changes to its rules for online intermediaries, tightening government control over the information ecosystem and what can be said online. It has created rules that seek to prevent social media companies and other content hosts from setting their own moderation policies, including policies framed to comply with international human rights obligations. The new “Intermediary Guidelines and Digital Media Ethics Code” (2021 Rules) have already been used in an attempt to censor speech about the government. Within days of being published, the rules were invoked by a state government run by the ruling Bharatiya Janata Party to issue a legal notice to an online news platform that has been critical of the government. The notice was withdrawn almost immediately after public outcry, but it served as a warning of how the rules can be used.

The 2021 Rules, ostensibly created to combat misinformation and illegal content, substantially revise India’s intermediary liability scheme. They were notified as rules under the Information Technology Act 2000, replacing the 2011 Intermediary Rules.

New Categories of Intermediaries

The 2021 Rules create two new subsets of intermediaries: “social media intermediaries” and “significant social media intermediaries,” the latter of which are subject to more onerous regulations. The due diligence requirements for these companies include proactive monitoring of speech, compliance personnel who reside in India, and the ability to trace and identify the originator of a post or message.

“Social media intermediaries” are defined broadly, as entities which primarily or solely “enable online interaction between two or more users and allow them to create, upload, share, disseminate, modify or access information using its services.” Obvious examples include Facebook, Twitter, and YouTube, but the definition could also include search engines and cloud service providers, which are not social media in a strict sense.

“Significant social media intermediaries” are those with more than 5 million registered users in India. But the 2021 Rules also allow the government to deem any “intermediary” - including telecom and internet service providers, web-hosting services, and payment gateways - a ‘significant’ social media intermediary if it creates a “material risk of harm” to the sovereignty, integrity, or security of the state, friendly relations with Foreign States, or public order. For example, a private messaging app can be deemed “significant” if the government decides that the app allows the “transmission of information” in a way that could create such a risk. The power to deem ordinary intermediaries ‘significant’ also extends to ‘parts’ of services that are “in the nature of an intermediary” - such as Microsoft Teams and other messaging applications built into larger products.

New ‘Due Diligence’ Obligations

The 2021 Rules, like their predecessor 2011 Rules, enact a conditional immunity standard. They lay out an expanded list of due diligence obligations that intermediaries must comply with in order to avoid being held liable for content hosted on their platforms.

Intermediaries are required to incorporate content rules—designed by the Indian government itself—into their policies, terms of service, and user agreements. The 2011 Rules contained eight categories of speech that intermediaries had to notify their users not to “host, display, upload, modify, publish, transmit, store, update or share”; the 2021 Rules retain and expand this list. The prohibited categories include content that violates Indian law, but also many vague categories that could lead to censorship of legitimate user speech. Forced to comply with these government-imposed restrictions, companies cannot live up to their responsibility to respect international human rights, in particular freedom of expression, in their daily business conduct.

Strict Turnaround for Content Removal

The 2021 Rules require all intermediaries to remove restricted content within 36 hours of obtaining actual knowledge of its existence, where “actual knowledge” means receipt of a court order or a notification from a government agency. This gives non-judicial government bodies broad authority to compel intermediaries to take down content. Platforms that disagree with or challenge government orders face penal consequences under the Information Technology Act and criminal law if they fail to comply.

The Rules impose strict turnaround timelines for responding to government orders and requests for data. Intermediaries must provide information within their control or possession, or ‘assistance,’ to government agencies within 72 hours, for a broad range of purposes: identity verification; the prevention, detection, investigation, or prosecution of offenses; and cybersecurity incidents. In addition, intermediaries are required to remove or disable, within 24 hours of receiving a complaint, non-consensual sexually explicit material or material in the “nature of impersonation in an electronic form, including artificially morphed images of such individuals.” These deadlines do not provide sufficient time to properly assess complaints or government orders. To meet them, platforms will be compelled to use automated filters to identify and remove content. Such error-prone systems sweep in legitimate speech and threaten users’ right to freedom of expression.

Failure to comply with these rules could lead to severe penalties, including jail terms of up to seven years. The Indian government has threatened company executives with prosecution before - for instance, when it served a legal notice on Twitter asking the company to explain why recent territorial changes in the region of Kashmir were not reflected accurately on the platform’s services. The notice threatened to block Twitter or imprison its executives if a “satisfactory” explanation was not furnished. Similarly, the government threatened Twitter executives with imprisonment when the company reinstated content about farmer protests that the government had ordered it to take down.

Additional Obligations for Significant Social Media Intermediaries

On a positive note, the Rules require significant social media intermediaries to have transparency and due process rules in place for content takedowns. Companies must notify users when their content is removed, explain why it was taken down, and provide an appeals process.

On the other hand, the 2021 Rules compel providers to appoint an Indian resident “Chief Compliance Officer,” who will be held personally liable in any proceedings relating to non-compliance with the rules, and a “Resident Grievance Officer” responsible for responding to users’ complaints and government and court orders. Companies must also appoint a resident employee to serve as a contact person for coordination with law enforcement agencies. With more executives residing in India, where they could face prosecution, intermediaries may find it difficult to challenge or resist arbitrary and disproportionate government orders.

Proactive Monitoring

Significant social media intermediaries are called on to “endeavour to deploy technology-based measures,” including automated tools or other mechanisms, to proactively identify certain types of content. This includes information depicting rape or child sexual abuse, as well as content that has previously been removed for violating the rules. The stringent provisions of the 2021 Rules already encourage over-removal of content; requiring intermediaries to deploy automated filters will likely result in even more takedowns.

Encryption and Traceability Requirements

The Indian government has been wrangling with messaging app companies—most famously WhatsApp—for several years, demanding “traceability” of the originators of forwarded messages. The demand first emerged after a series of mob lynchings in India triggered by rumors that went viral on WhatsApp. Subsequently, petitions were filed in Indian courts seeking to link social networking accounts with users’ biometric identity (Aadhaar) numbers. Although the court ruled against that proposal, expert opinions supplied by a member of the Prime Minister’s scientific advisory committee suggested technical measures to enable traceability on end-to-end encrypted platforms.

Because of their privacy and security features, some messaging systems don’t learn or record who first created a particular piece of content that was then forwarded by others, a state of affairs that the Indian government and others have found objectionable. The 2021 Rules escalate this conflict further, requiring private messaging intermediaries to “enable the identification of the first originator of the information” upon a court order or a decryption request issued under the 2009 Decryption Rules. (The Decryption Rules allow authorities to request the interception, monitoring, or decryption of any information generated, transmitted, received, or stored in any computer resource.) If the first originator of a message is located outside the territory of India, the private messaging app must treat the first person to have shared it within India as the first originator.
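To illustrate why “first originator” information often simply does not exist on such systems, consider a minimal, purely hypothetical Python sketch of an end-to-end encrypted relay. The `toy_encrypt` cipher and `RelayServer` below are illustrative stand-ins, not the design of WhatsApp or any real messenger (real apps use the Signal protocol and similar schemes); the sketch only shows what a provider in this position can actually see.

```python
import os
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy stand-in for end-to-end encryption (real messengers use the
    Signal protocol). XORs the plaintext (up to 32 bytes) with a
    key-and-nonce-derived stream, so each encryption looks different."""
    nonce = os.urandom(16)
    stream = hashlib.sha256(key + nonce).digest()
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

class RelayServer:
    """Models what an E2EE provider sees: envelopes of
    (sender, recipient, ciphertext) with no plaintext and no field
    linking a forwarded copy to its original author."""
    def __init__(self):
        self.log = []

    def deliver(self, sender: str, recipient: str, ciphertext: bytes):
        self.log.append((sender, recipient, ciphertext))

server = RelayServer()
keys = {("alice", "bob"): os.urandom(32), ("bob", "carol"): os.urandom(32)}
rumor = b"viral rumor"

# Alice originates the message to Bob...
server.deliver("alice", "bob", toy_encrypt(keys[("alice", "bob")], rumor))
# ...and Bob forwards it to Carol by re-encrypting the same plaintext.
server.deliver("bob", "carol", toy_encrypt(keys[("bob", "carol")], rumor))

# The two log entries are unlinkable ciphertexts: nothing marks Bob's
# message as a forward of Alice's, so the "first originator" cannot be
# recovered from what the server holds.
for sender, recipient, ct in server.log:
    print(f"{sender} -> {recipient}: {ct.hex()[:24]}...")
```

In this toy model, the server’s log contains two ciphertexts with no field connecting Bob’s forwarded copy to Alice’s original; complying with a traceability order would require redesigning the protocol to record exactly that linkage.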

The 2021 Rules place some limits on these court orders - namely, that they can be issued only for serious crimes. But such limits do not solve the core problem with this proposal: a technical mandate that companies re-engineer or redesign their messaging services to comply with the government’s demand to identify the originator of a message.

Conclusion

The 2021 Rules were fast-tracked without public, pre-legislative consultation - the transparent process through which the government seeks recommendations from stakeholders. They will have profound implications for the privacy and freedom of expression of Indian users. They restrict companies’ discretion in moderating their own platforms and create new possibilities for government surveillance of citizens. These rules threaten the idea of a free and open internet built on a bedrock of international human rights standards.
