28 October 2025

2025 Amendment To India's Intermediary Rules: Well Begun But Half Done

DSK Legal

Contributor



Introduction

On October 22, 2025, the Ministry of Electronics and Information Technology ("MeitY") notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2025 ("2025 Amendment"), amending Rule 3(1)(d) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 ("2021 IT Rules"). Rule 3(1)(d) sets out the procedure by which government authorities can require "intermediaries" — online platforms that host, transmit or store third-party information, such as social media companies, messaging services, and search engines — to remove or disable access to unlawful content. The 2025 Amendment, effective November 15, 2025, seeks to strengthen these due diligence obligations under the Information Technology Act, 2000 ("IT Act") by introducing additional safeguards to make content removal decisions more transparent, proportionate and accountable.

Notably, MeitY notified the 2025 Amendment without conducting any public stakeholder consultation, contrary to the Government of India's Pre-Legislative Consultation Policy, which mandates public engagement before the introduction of subordinate legislation of broad regulatory consequence. Its timing is equally significant, following closely after the decision of the Karnataka High Court in X Corp. (Formerly Twitter) v. Union of India, which upheld the constitutionality of the unamended Rule 3(1)(d) and the Government's Sahyog portal, designed to streamline the issuance of content removal and user data requests to intermediaries, despite arguments that it created a parallel takedown mechanism under the IT Act without procedural safeguards.

This explainer outlines the pre-amendment framework of Rule 3(1)(d), analyses the 2025 Amendment and its positive contributions, and examines the persisting gaps in procedural fairness and transparency, along with the new challenges it raises for platform autonomy.

The Unamended and Amended Rule 3(1)(d): A Comparative Overview

Under the unamended Rule 3(1)(d), intermediaries were required to remove or disable access to unlawful information within 36 hours of receiving "actual knowledge" through a court order or a notification from the "Appropriate Government". As defined in Section 2(1)(e) of the IT Act, the term includes both Central and State Governments, enabling authorities at either level to issue takedown directions. The third proviso also included a Good Samaritan-style protection ("Good Samaritan Proviso") – similar to Section 230(c)(2) of the U.S. Communications Decency Act albeit narrower in scope – that allowed intermediaries to remove prohibited content under Rule 3(1)(b) or in response to user grievances, without losing their safe-harbour protection under Section 79 of the IT Act.

The 2025 Amendment retains the 36-hour compliance period for intermediaries but restructures the procedure for content takedown. Only senior officers not below the rank of Joint Secretary or equivalent, or, where such rank is not appointed, a Director or officer of equivalent rank, may issue takedown directions. In the case of police authorities, this power is limited to officers not below the rank of Deputy Inspector General. Each Government or authorised agency must act through a designated officer, creating a single point of contact for issuing takedown directions. All directions must now take the form of reasoned intimations, setting out the legal basis, statutory provision and the specific URL or digital identifier of the impugned content. A monthly review, led by a Secretary-level officer, will evaluate the necessity and proportionality of such directions. The Good Samaritan Proviso, however, has been removed.

Key Improvements Introduced by the 2025 Amendment

The 2025 Amendment introduces several clarifications that enhance procedural certainty under Rule 3(1)(d). It makes explicit what previously required interpretative effort: that State Governments and law enforcement agencies are also competent to issue takedown directions under the IT Act. It also specifies the designation of authorised officers, allowing intermediaries to verify the authenticity of takedown requests.

Replacing the vague term "notification" with the requirement of a reasoned intimation is a significant improvement. Each intimation must state the legal and statutory basis and the alleged unlawful act, reaffirming that takedown orders are administrative decisions that demand justification. Further, the requirement to identify content through specific URLs or digital identifiers implicitly recognises that intermediaries cannot be compelled to proactively monitor user content, a principle affirmed by the Supreme Court in Shreya Singhal v. Union of India (2015) and Google India Pvt. Ltd. v. Visaka Industries Ltd. (2020).

While these refinements improve procedural clarity, the Amendment leaves unresolved questions of due process, natural justice and transparency, and introduces new challenges relating to platform autonomy.

Persisting Concerns Remaining Unaddressed

A. Unresolved Issues of Transparency

The 2025 Amendment fails to remedy the most enduring criticism of Rule 3(1)(d) – its opacity. It imposes no duty to publish takedown directions or notify affected users. 'Reasoned intimations' are to be transmitted directly between government authorities and intermediaries, leaving impacted users and the public unaware of restrictions imposed on online content. This secrecy undermines natural justice and the constitutional right to receive information.

Unlike the 2009 Blocking Rules, which required authorities to attempt notification of affected users or intermediaries, the current mechanism directs orders only to intermediaries, deepening opacity. Such confidentiality is contrary to the Supreme Court's judgment in Anuradha Bhasin v. Union of India (2020), which held that state orders restricting speech must be published to permit judicial review. The Karnataka High Court's ruling in X Corp. further complicates matters by holding that intermediaries cannot themselves claim free speech rights under Article 19(1)(a) of the Constitution, even though they are the sole recipients of takedown directions under Rule 3(1)(d), and therefore, best placed to challenge them.

The only redeeming feature is that, unlike Rule 16 of the 2009 Blocking Rules, which mandates confidentiality of blocking requests and complaints, the Rule 3(1)(d) framework under the 2021 IT Rules does not prohibit intermediaries from publishing takedown orders or notifying affected users. While disclosure remains voluntary, intermediaries may choose to act transparently in the public interest.

B. Gaps in Procedural Safeguards and Oversight

Criticisms of the Rule 3(1)(d) framework's lack of procedural safeguards persist. Section 69A of the IT Act, read with the 2009 Blocking Rules – which remain operational – already provides a more robust process that requires inter-departmental scrutiny, notice and hearing to affected parties and reasoned orders. By contrast, the amended Rule 3(1)(d) entails no pre-decisional hearing and confines oversight to a monthly review by a Secretary within the same authority that issued the order. This arrangement collapses the separation between requester and reviewer and provides no explicit power to restore wrongfully removed content. The procedural and accountability deficits of Rule 3(1)(d) remain largely unaddressed, resulting in a framework where oversight of takedown decisions is concentrated within the same administrative structures that implement them.

Evolving Questions Around Platform Autonomy and Good Samaritan Protection

The removal of the Good Samaritan Proviso poses a new regulatory challenge. The provision had protected intermediaries that voluntarily removed unlawful content under Rule 3(1)(b), recognising that intermediaries must be free to moderate harmful content to maintain civility on their platforms without forfeiting their safe-harbour protection under Section 79. Its deletion raises uncertainty over whether voluntary moderation could now expose intermediaries to liability under Section 79(2).

This outcome likely does not reflect regulatory intent, as MeitY's draft amendments to the 2021 IT Rules addressing deepfake content, released for public consultation contemporaneously with the 2025 Amendment, retain a Good Samaritan-style protection for intermediaries acting in good faith to remove synthetically generated information. The coexistence of these approaches suggests an inadvertent omission rather than a policy reversal. Nevertheless, the inconsistency creates regulatory uncertainty that may discourage intermediaries from proactively moderating content on their platforms. Clarification from MeitY in this regard may be necessary to preserve platform autonomy and enable responsible content governance.

Conclusion

The 2025 Amendment marks an important step forward, clarifying authority, requiring reasoned intimations, and linking takedowns to specific, identifiable content. At the same time, certain gaps in transparency and due process persist, while new questions have emerged around platform autonomy. Strengthening mechanisms for disclosure, independent oversight, and good-faith moderation would help ensure that the framework continues to evolve in line with the principles of procedural fairness and accountability that underpin India's broader digital governance architecture.

Originally published by Legal Era, Legal500, Chambers & Partners.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
