Something has been brewing in cyberspace and it affects us all.
On 24 December 2018, the eve of ‘good governance day’ (also known as Christmas), the Ministry of Electronics and IT (MeitY) issued the Draft Information Technology [Intermediaries Guidelines (Amendment)] Rules, 2018 (“the Draft Rules”).
These proposed amendments, drafted without any prior public consultation, would require messaging apps, social networks, search engines, internet service providers and cyber cafes, among others, to follow a content policing and filtering regime.
While the ministry aims to weed out misinformation and curb potential violence arising from the circulation of rumours, in effect, the rules could heavily curb free speech and privacy, and enforce surveillance of user content.
MeitY is expected to finalise the amendments to the intermediary liability rules around 25 July, and could retain the demand for traceability of messages on texting apps like WhatsApp.
These rules flow from their parent provision, Section 79 of the Information Technology Act, which deals with intermediary liability.
The amendments would empower the government to sweep up more personal data of citizens, monitor our online content and even enforce traceability of messages on texting apps by breaking encryption.
What is an Intermediary?
The word “intermediaries”, for whom these rules are made, appears 27 times in the IT Act (as amended in 2008). So, what is an intermediary in the world of the internet?
Intermediaries are entities that transmit, host or publish content generated by us, the users, but do not exercise editorial control over it. Think of Facebook, YouTube or WordPress.
An intermediary would include our internet service provider (ISP) like Airtel or BSNL, a search engine like Google, e-mail service like Gmail, a video platform like YouTube and even the neighbourhood cyber cafe. Section 79 of the IT Act provides immunity to intermediaries for the content that is published by the end user.
Therefore, Twitter cannot be held liable for hate speech published by an individual on its platform. Now, in order to continue enjoying this immunity, the government wants intermediaries to monitor content far more closely, an act that would amount to surveillance.
What is the Purpose of these Draft Intermediaries Rules?
Intermediaries perform the vital function of disseminating information by providing the tools to access the internet and to generate and share content.
Many jurisdictions, including the US, EU member states and India, provide legal protection to intermediaries from liability for illegal content posted by users on their platforms.
In India, section 79 of the IT Act provides this legal protection known as ‘safe harbour’.
“But principal legislations such as Section 79 leave the details to subordinate rules. This is exactly the Intermediary Rules, 2011 which were made after public consultation around March, 2011.”
– Internet Freedom Foundation (IFF)
Who Came Up With These Draft Rules?
Good question. According to a report in The Indian Express, these rules were conceived in a closed-door meeting among ministry officials, social media companies and industry representatives. There were no public discussions or consultations prior to the drafting of these proposed amendments to the intermediary liability rules.
It was only after these proposed amendments were issued that a window for public consultation was announced, which closed on 1 February.
Let us now go through some of the most problematic parts of the proposed amendments.
The Nanny Requirement - Rule 3(4)
Draft Rule 3(4) requires intermediaries to inform users, at least once every month, of the consequences of not complying with their rules and regulations. The IFF has aptly christened this ‘a nanny requirement’.
According to the IFF’s submission to the ministry, this requirement changes the online environment “from a public platform to a guarded schoolyard in which you are constantly reminded that you are under watch and you better behave yourself.”
Traceability Requirement and Breaking Encryption - Rule 3(5)
The requirement of “tracing out of such originator of information on its platform” constitutes a direct attack on the privacy of users by requiring encryption to be broken by messaging platforms such as WhatsApp.
End-to-end encryption is the backbone of messaging platforms and ensures that texts between two users can only be seen by them. This rule would enable the government to access our most personal data and also compromise the security of such apps.
In the absence of any substantial parliamentary or judicial oversight on surveillance, this would only serve to expand the surveillance dragnet of the state.
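The end-to-end property described above can be illustrated with a toy sketch (a simplified one-time-pad demonstration for illustration only, not the actual Signal protocol that apps like WhatsApp use): the intermediary relays only ciphertext, and without the key held on the two users’ devices it cannot recover the message.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

message = b"meet at the town hall at 6pm"

# The key exists only on the two users' devices, never on the server.
shared_key = os.urandom(len(message))

# What the intermediary actually relays: unreadable ciphertext.
ciphertext = xor_bytes(message, shared_key)

# Only the recipient, who holds the key, can recover the text.
decrypted = xor_bytes(ciphertext, shared_key)
assert decrypted == message
```

Tracing a message’s “originator”, as the draft rule demands, would require the platform to weaken or bypass exactly this property.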
Indefinite Retention of Our Data - Rule 3(8)
The amendment proposes increasing the retention period for our data from 90 to 180 days upon notification by a court or government agency. It must be noted that the rule does not define ‘government agencies’, but says data and records must be preserved for ‘investigation purposes’.
The amendment also provides for retention of our data for an indefinite period by stating “or for such longer period as may be required”. This not only goes against the proportionality benchmark in the right to privacy, but also contains no provision to inform users that their data is being retained.
Proactive Filtering & Censorship of Content - Rule 3(9)
This rule, among the most severe blows to freedom of speech and expression, requires intermediaries to proactively monitor and remove content. “The intermediary should not be mandated to determine on its own whether any given content is legal or not,” SFLC.in has stated in its submission to MeitY.
This would result in large-scale takedown of legitimate speech without any legal due diligence. This is also prone to severe discrimination as this policing is expected to be carried out by “technology based automated tools”. Such tools have been widely reported to have algorithmic biases and are known to target minorities disproportionately.
What do International Principles Say About Intermediaries?
In 2015, civil society groups from across the world jointly drafted the Manila Principles on Intermediary Liability, a set of safeguards and best practices to limit intermediary liability. Its six principles, designed to protect freedom of expression and promote online innovation, are based on international human rights instruments and other international legal frameworks.
The Santa Clara Principles on Transparency and Accountability in Content Moderation set out three principles for social media platforms to provide transparency and accountability in their content moderation practices. These were based on extensive surveys and consultations on best practices.
“We urge that all six principles in this framework should form an informed basis of any informed rule making on Intermediary liability.”
– Internet Freedom Foundation, in its submission to MeitY
Absence of Clear Rationale for Proposed Amendments
Several concerns arise from the absence of a clear rationale for amending the existing intermediary liability rules. The sweeping provisions, over-broad definitions and vague language can cause unintended harm. Moreover, the invasive attacks on the fundamental rights of speech, expression and privacy reflect a severe breach of proportionality.
“It is a matter of concern that the proposed changes in legal form have been proposed without explaining the rationale and true authority behind them,” the IFF submission states.
“While being cognisant of national security interests, we appeal for a less-invasive and proportional means of regulation of the internet.”
– SFLC.in, in its submission to MeitY
Where Do We Go From Here?
The rules have already been drafted. Is it too late, then?
A number of organisations, including the IFF, Mozilla, GitHub, the Wikimedia Foundation and MediaNama, have appealed for the draft rules to be abandoned and the present consultation to be recalled.
A fresh process, beginning with a white paper and a transparent consultation with all stakeholders, could lead to a more meaningful solution to the grave problems of misinformation and hate speech.