8 July 2021

Intermediary Liability In India - Moving Goalposts

BTG Advaya

Contributor

BTG Legal is an Indian law firm with particular focus on: defence; industrials; digital business; energy (renewables and nuclear); retail; transport (railways and electric vehicles); and financial services. Practices include corporate transactions, commercial contracting, public procurement, private equity, regulatory compliance, employment, disputes and white-collar crime.

The internet has revolutionized the way we interact; however, it has also brought with it a host of problems such as hate speech, terrorist recruitment, fake news, illegal lobbying, and personal data theft. Many of these issues involve the new gatekeepers of the internet: online social media platforms. Their regulation will be at the forefront of legal development in the near future.

The concept that an intermediary is only a neutral pipeline for information is no longer sacrosanct. Germany's new social media law, the Netzwerkdurchsetzungsgesetz, makes platforms liable for the content they carry. In India, the Supreme Court and the government have repeatedly called for the regulation of intermediaries. The Supreme Court has in the past made intermediaries responsible for actively monitoring content to ensure compliance with laws protecting children and women.

There are two questions in the context of growing calls for regulation in India:

  • Are we moving from a "did-not-know" standard to an "ought-to-have-known" standard, and to what extent is this practical?
  • Do we need a new framework of intermediary liability, one that remains limited but varies with the potential severity of the harm?

Section 79 of the Information Technology Act, 2000, provides intermediaries with qualified immunity, as long as they follow the prescribed due diligence requirements and do not conspire in, abet or aid an unlawful act. However, the protection lapses if an intermediary, on receiving "actual knowledge" that content is being used to commit an unlawful act, or on being notified of such content, fails to expeditiously remove or disable access to it.

The demand for a further dilution of the intermediary defence has come from two sources in India: copyright protection law, and the law on public order and heinous offences.

The Delhi High Court in MySpace Inc v Super Cassettes Industries Ltd appears to hold that, in cases of copyright infringement, a court order is not necessary and an intermediary must remove content on receiving knowledge of the infringing works from the content owner. The intermediary protection applied in that case is considerably narrower than the "actual knowledge" requirement under section 79 of the IT Act, as read by the Supreme Court in the Shreya Singhal case.

Incidents of sexual abuse, lynching and mob violence have been linked to videos and messages circulated on social media platforms. The government has taken up these matters with intermediaries on at least two occasions, asking them to find effective solutions to the misuse of their platforms. It has indicated that if they do not find solutions, they are "liable to be treated as abettors" and "face consequent legal action". This may mean that intermediaries are prosecuted as abettors under the Indian Penal Code.

It is likely that a regulatory alternative will emerge that waters down the overarching protections available to intermediaries. Intermediaries may therefore have to take a proactive role in policing content. As long as there is broad consensus on what "high-risk" content is, intermediaries should be allowed to evolve a self-regulatory mechanism to address it. Obvious examples are content that harms children, and material that incites violence, religious intolerance or enmity. Following the approach of the German regulation, such content should be blocked or removed within 12–24 hours. For other objectionable content, such as copyright violations, a longer process of adjudication or discussion can be specified.

It may be useful for intermediaries to come together and design a cross-platform format for users to report illegal content, which can then become the basis for guidelines on self-regulation. Such proactive diligence should be recognized in any future law as a sufficient criterion to preserve the safe harbour defence. One-off failures to remove high-risk content should not attract liability if intermediaries can demonstrate that a process was in place.

Admittedly, this will be a subjective determination, but as we have seen in the case of the General Data Protection Regulation, some level of subjectivity has become unavoidable in legislation governing online behaviour.

In the absence of a solution from the industry, governments and regulators may opt for an extreme "banning" approach, or try to affix criminal liability to intermediaries.

The Supreme Court has, in the Prajwala case, shown willingness to work with intermediaries to find solutions. The choice for intermediaries may come down to whether they work with regulators to evolve a standard of intermediary liability, or take a reactive, defensive stance on regulation.

Originally published in India Business Law Journal, 21 February 2019.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
