10 Bolly Memes Explain Internet Censorship Rules Coming This Month
Something has been brewing in cyberspace and it affects us all.
On 24 December 2018, the eve of ‘good governance day’ (also known as Christmas), the Ministry of Electronics and IT (MeitY) issued the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018 (“the Draft Rules”).
These amendments, drafted without any prior public consultation, would require messaging apps, social networks, search engines, internet service providers and cyber cafes, among others, to follow a content policing and filtering system.
MeitY is expected to finalise the amendments to the intermediary liability rules around 25 July and could retain the demand for traceability of messages in texting apps like WhatsApp.
These rules flow from their parent provision, Section 79 of the Information Technology Act, which deals with intermediary liability.
1. What is an Intermediary?
The word “intermediaries”, for whom these rules are made, is mentioned 27 times in the IT Act (as amended in 2008). So, what is an intermediary in the world of the internet?
An intermediary would include our internet service provider (ISP) like Airtel or BSNL, a search engine like Google, e-mail service like Gmail, a video platform like YouTube and even the neighbourhood cyber cafe. Section 79 of the IT Act provides immunity to intermediaries for the content that is published by the end user.
Therefore, Twitter cannot be held liable for hate speech published by an individual on its platform. Now, in order to enjoy this immunity, the government wants intermediaries to monitor content even more closely, an act that would amount to surveillance.
2. What is the Purpose of these Draft Intermediaries Rules?
Intermediaries perform the vital function of dissemination of information by providing the tools for accessing the internet and generating and sharing content.
Many jurisdictions, including the US, EU member states and India, provide legal protection to intermediaries from liability for illegal content posted by users on their platforms.
In India, section 79 of the IT Act provides this legal protection known as ‘safe harbour’.
3. Who Came up With These Draft Rules?
Good question. According to a report in The Indian Express, these rules were conceived in a closed-door meeting among ministry officials, social media companies and industry representatives. There were no public discussions or consultations prior to the drafting of these proposed amendments to the intermediary liability rules.
It was only after the amendments were issued that a window of public consultation was announced, which ended on 1 February.
Let us now go through some of the most problematic parts of the proposed amendments.
4. The Nanny Requirement - Rule 3(4)
The Internet Freedom Foundation (IFF) has aptly christened this ‘a nanny requirement’.
According to the IFF’s submission to the ministry, this requirement changes the online environment “from public platforms to a guarded schoolyard in which you are constantly reminded that you are under watch and you better behave yourself.”
5. Traceability Requirement and Breaking Encryption - Rule 3(5)
The requirement of “tracing out of such originator of information on its platform” constitutes a direct attack on the privacy of users by requiring encryption to be broken by messaging platforms such as WhatsApp.
In the absence of any substantial parliamentary or judicial oversight on surveillance, this would only serve to expand the surveillance dragnet of the state.
6. Indefinite Retention of Our Data - Rule 3(8)
The amendment proposes increasing the retention period for our data from 90 to 180 days upon notification by a court or government agency. It must be noted that the rule does not define ‘government agencies’ but says data and records must be preserved for ‘investigation purposes’.
The amendment also allows our data to be retained indefinitely by stating “or for such longer period as may be required”. This not only fails the proportionality benchmark laid down for the right to privacy but also contains no provision to inform users that their data is being retained.
7. Proactive Filtering & Censorship of Content - Rule 3(9)
Among the most severe blows to freedom of speech and expression, this rule requires intermediaries to proactively monitor and remove content. “The intermediary should not be mandated to determine on its own whether any given content is legal or not,” SFLC.in has stated in its submission to MeitY.
This would result in large-scale takedown of legitimate speech without any legal due diligence. This is also prone to severe discrimination as this policing is expected to be carried out by “technology based automated tools”. Such tools have been widely reported to have algorithmic biases and are known to target minorities disproportionately.
8. What do International Principles Say About Intermediaries?
In 2015, civil society groups from across the world jointly drafted the Manila Principles on Intermediary Liability, a set of safeguards and best practices to limit intermediary liability. The six principles were framed to protect freedom of expression and promote online innovation, and are grounded in international human rights instruments and other international legal frameworks.
The Santa Clara Principles on Transparency and Accountability in Content Moderation present three principles for social media platforms to provide transparency and accountability in their content moderation practices. These were based on extensive surveys and consultations on best practices.
9. Absence of Clear Rationale for Proposed Amendments
Several concerns arise from the rationale provided for amending the existing intermediary liability rules. The sweeping provisions, over-broad definitions and vague language can cause unintended harm. Moreover, the invasive attacks on fundamental rights of speech, expression and privacy reflect a severe breach of proportionality.
“It is a matter of concern that the proposed changes in legal form have been proposed without explaining the rationale and true authority behind them,” the IFF submission states.
10. Where do We Go From Here?
The big question then is where do we go from here?
The rules have already been drafted. Is it too late then?
A number of organisations, including the IFF, Mozilla, GitHub, the Wikimedia Foundation and Medianama, among many others, have appealed for the draft rules to be abandoned and the present consultation to be recalled.
A process initiated anew, with a white paper and a transparent consultation with all stakeholders, could lead to a more meaningful solution to the grave problems of misinformation and hate speech.