What India’s IT Rule Amendments Mean for Anyone Who Speaks Online

The law can fundamentally reshape how people speak, and are heard, online, writes DevRupa Rakshit.

DevRupa Rakshit
Opinion

What the present amendments signal is a consolidation of control over digital spaces that mirrors older forms of state regulation of speech, where the objective is to create an environment in which self-censorship becomes inevitable.

(Photo: Kamran Akhter/The Quint)


On 30 March, the Transgender Persons (Protection of Rights) Amendment Bill, 2026, received the President’s assent to become law, despite weeks of unrelenting protests led by the country’s trans community, whom it claims to ‘protect.’

On the same day, the Ministry of Electronics and Information Technology (MeitY) proposed amendments to the IT Rules, 2021, with a 15-day window for public consultation. Fifteen days for changes that, if implemented, could significantly alter how online speech is governed in India.

Framed as “clarificatory and procedural”, the amendments—as the Internet Freedom Foundation (IFF) pointed out—instead mark a substantial expansion of executive power over digital content.

Whether these rules were published at this moment in the hope that the news would be buried under the sea of posts lamenting the passage of the trans law, one may never know. What we do know, though, is that for a law that can fundamentally reshape how people speak, and are heard, online, 15 days is hardly enough time to even register what is being altered, let alone respond to it meaningfully.

Who Gets to Speak, and at What Cost

That is, perhaps, one of the reasons the discourse around the proposed amendments has been muted.

But for journalists working in India today—especially those of us working independently, across formats and platforms—social media is one of the most important tools of distribution.

And while the bubble of social media being a truly democratic space has long been burst, it remains relatively more democratic than established media. That makes it one of the few places where people’s ideas have the freedom to breathe, and where the final call about what to put out, and when, rests with the person posting it.

Which is also what makes the prospect of regulating it in the proposed ways somewhat frightening. Dressed in the language of procedure, the changes may, prima facie, seem technical. Their implications, though, are anything but.

Rule 3(4) of the proposed amendments, for instance, effectively allows MeitY to issue advisories, directions, and guidelines that platforms must comply with to retain safe harbour protections. In practice, this means intermediaries are incentivised to err on the side of caution, because the cost of getting it wrong is too high.

The rational response, then, is over-compliance, and by extension, over-censorship. Similarly, the expansion of the Inter-Departmental Committee’s scope from hearing ‘complaints’ to examining vague ‘matters’ opens the door to discretionary scrutiny of content, including user-generated political speech.

And when intermediaries are made responsible for interpreting and acting on loosely defined categories, the safest interpretation becomes the narrowest one because the cost of getting it wrong is no longer abstract.

“The ambiguity in the proposed amendments is what worries me the most. When rules around ‘harmful’ or ‘misleading’ content aren’t clearly defined, it creates a chilling effect. As someone who engages with political and social issues, I’m constantly navigating nuance, and the fear is that nuance may not be protected. There’s a real risk of over-censorship, either by platforms or by creators themselves trying to avoid trouble.”
Saumya Rastogi, an independent journalist based in New Delhi

Surveillance and Control

Add to that the possibility of extended data retention requirements, and what begins to emerge is a framework for surveillance and control.

“I want to make content highlighting the government's negligence and malpractices. I want to question them and hold them accountable. I want to amplify ground reality using my platform,” says Archana Das, a content creator.

Citing examples of political dissenters like Sonam Wangchuk, who was imprisoned for nearly six months, and Umar Khalid, whose bail pleas have been rejected since he was imprisoned in September 2020, Das adds,

“They might not convict me of anything, but they can definitely punish me by trapping me in months’, maybe, years’ long legal processes. That sounds harrowing.”
Archana Das

What emerges, then, isn’t just the fear of violating any law per se, but the fear of simply being accused of doing so, since an accusation alone can draw one into processes that are punitive regardless of outcome.

Rastogi echoes this too, saying,

“I find myself second-guessing posts that I wouldn’t have hesitated to share earlier. Not because the content is irresponsible, but because the consequences feel unpredictable. It pushes creators towards safer, less critical content, which ultimately dilutes meaningful discourse. The danger isn’t just in what gets taken down, but in what never gets said.”
Saumya Rastogi

The Quiet Workings of Self-Censorship

The idea that the only thing keeping one’s work safe might be its relative obscurity becomes a working assumption, because to grow is to be seen, to be seen is to be legible, and to be legible is to be acted upon. This doesn’t produce silence, at least not immediately. What it produces, instead, is calibration.

“The only thing that has potentially kept my content under the radar is its limited reach and modest following. And, therefore, I have also avoided actively investing more time and effort in growing the reach of my content,” says Maansi Verma, lawyer and founder of the civic engagement initiative Maadhyam.

“It’s a trade-off between wanting to reach a wider audience, and thus, exposing yourself to more risk and wanting to exist even if for a smaller audience.”
Maansi Verma

Over time, the recalibration may feel almost imperceptible—a caption softened here, a post abandoned there, a joke that feels easier to just not make—until the version of what one would have preferred to say begins to feel increasingly out of reach.

In time, the trade-off Verma alluded to starts to reshape what growth even means since visibility, in this context, is as much an opportunity as it is exposure. And with the consequences it carries, the latter isn’t something everyone can afford.

But for creators whose work is political—especially those who sit outside or against dominant narratives—this isn’t entirely new.

“We have been living under the shadow of online censorship, and blocking of inconvenient content, or even of accounts, has been normalised. In fact, as we saw during the farmers’ protests and the anti-CAA movement, at times, the government has suspended access to the internet altogether. But now more targeted censorship is possible.”
Maansi Verma

It Doesn't Feel Personal, Until It Does

What can change, then, is less the existence of that risk, and more how it is distributed, normalised, and, crucially, automated: what was once sporadic now feels systemic.

But as one must, creators are figuring out how to adapt to this shift without necessarily conceding to it.

“Will this amendment affect my content? It’s designed to. That’s the point. But here’s the thing... I run Goa’s first woman-led independent digital news platform, and I didn’t start it because it was easy. I started it because someone had to do honest journalism without waiting for permission from Delhi, or from a businessman,” says Misbah Quadri, founder and editor of MQM24x7.

“The rules might slow me down, add compliance hoops, and make lawyers richer. But change what I publish? No. It will just make me triple check my sources, which I already do. And, maybe, add one extra layer of sarcasm in my edits. For old times’ sake.”
Misbah Quadri

But the fact remains that all of this is still unfolding against a backdrop where attention itself is unstable. Over the past few years, public discourse has moved from one flashpoint to another—each urgent, each overwhelming, each demanding to be the centre of focus.

In 2026 alone, in addition to the world maybe, maybe not being on the brink of another war, and an LPG shortage, Indian news cycles have covered debates around caste-based reservation, the erasure of trans folks, state elections, and now, the amendments to the IT Rules.

Obviously, most of these are pertinent human rights concerns deserving of national attention and widespread agitation. But together, their simultaneity makes it harder to hold on to any one shift long enough to even fully register what it changes, let alone grasp the cumulative magnitude of the changes. And that’s probably by design.

The Cost of Being Seen

It’s also how this has always worked — as a series of targeted incursions, each affecting a group that can be isolated, debated, discredited, or simply ignored, until the circle of who is ‘at risk’ expands almost stealthily to include more and more of us, echoing what Martin Niemöller wrote decades ago in ‘First They Came.’

They come first for those who can be othered more easily because of existing socio-political biases, and because it doesn’t feel immediate or personal, it is easy to look away… until it isn’t.

What cannot be easily named or isolated is also far harder to resist, even if, right before our eyes, it steadily redraws the boundaries of permissible speech overwhelmingly in favour of those already in power, while disciplining, marginalising, and eroding the visibility of those who seek to question it.

And by the time the shift becomes visible — if it is ever allowed to become visible in ways that cannot be dismissed, that is — it has already settled in, as a state-enabled narrowing of the public sphere itself, where the terms of participation are no longer negotiated collectively but dictated through policy, platform compliance, and the threat of punitive process.

The Algorithm of Caution

What the present amendments signal — beyond their technical language and procedural framing — is a consolidation of control over digital spaces that mirrors older forms of state regulation of speech, where the objective is to create an environment in which self-censorship becomes inevitable.

And by rendering self-censorship inevitable, what these moments begin to reveal is how the language of protection can become a convenient alibi for expanding control — whether over the lives of trans people in law or over the terms of public expression online — until dissent itself begins to disappear, as it did in George Orwell’s 1984.

(DevRupa Rakshit is a queer, autistic individual, ARTivist and independent multimedia journalist based in Bangalore. This is an opinion piece. All views expressed are the author’s own. The Quint neither endorses nor is responsible for them.)

