The Ministry of Electronics and Information Technology (MeitY) issued an advisory dated 3 September 2024 (Advisory) to 'intermediaries', emphasizing their responsibility to take prompt action to remove prohibited content from their platforms in order to meet the due diligence obligations under the Information Technology Act, 2000 (IT Act) read with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Intermediary Guidelines).
Under the IT Act read with Rule 3(1) of the Intermediary Guidelines, an 'intermediary' (which typically includes players such as telecom operators, social media companies and cloud-hosting providers) is required to ensure that no illegal content (such as information infringing the intellectual property or other proprietary rights of another person) is hosted on its platform. Additionally, the intermediary is required to remove or disable access to such information as early as possible, and in any event not later than 36 hours of receiving a court order or an intimation from the Government or its agency in that regard. An intermediary may also remove or disable access to information, data or a communication link voluntarily or on the basis of grievances received. Compliance with these due diligence steps is critical for preserving safe harbour against liability relating to third-party content under the IT Act and the Intermediary Guidelines.
Key highlights of the Advisory
- Reference to Bombay High Court order: MeitY has referred to a recent interim order of the Bombay High Court directing a social media platform to remove all false information, including morphed videos, related to the National Stock Exchange (NSE) within 10 hours of receiving a complaint from NSE. In that case, deepfake videos circulated on the platform as advertisements featured the NSE Managing Director and CEO appearing to encourage users to join stock-tip related WhatsApp groups, prompting NSE to lodge a grievance with the platform. NSE informed the court that while the platform ultimately removed the deceptive content, the process was "extremely time consuming and impracticable".
- Need for prompt action by intermediaries to remove illegal content: MeitY has highlighted that intermediaries are not removing illegal content from their platforms promptly, which can cause irreparable harm to individuals in cases of cyber fraud, particularly given the misuse of emerging technologies on the internet. The Advisory is aimed at keeping the internet in India open, safe, trusted and accountable, in line with several directions and advisories issued by the Government over the past couple of years.
- Reference to previous advisories: MeitY has cited its advisory dated 26 December 2023 on the use of artificial intelligence (AI) based deepfake technology and its advisory dated 15 March 2024 on the use of AI models, large language models, generative AI, etc., and has indicated that the current Advisory is in continuation of these. This reflects MeitY's intention that intermediaries proactively comply with their due diligence requirements under Rule 3(1) of the Intermediary Guidelines by identifying and promptly removing illegal content created using emerging technologies such as AI, which could be misleading or used in cyber fraud activities.
- Timelines for removing prohibited content to be seen as an upper threshold: The Advisory further clarifies that the 36-hour timeline provided under the Intermediary Guidelines for removing illegal content should be treated only as an upper threshold, and that intermediaries should endeavour to delete such content from their platforms as early as possible.
Comments
In light of the Advisory, it is clear that the Government is intensifying its efforts to combat the spread of illegal content online. The Advisory emphasizes the need for intermediary platforms to act swiftly, treating the timelines in the Intermediary Guidelines only as a maximum threshold. This proactive stance is significant as emerging technologies increasingly facilitate the creation and dissemination of fake and prohibited content. As digital threats evolve, platforms must strengthen their content moderation mechanisms and respond more effectively to mitigate risks and protect users. The Advisory underscores the importance of regulatory compliance as well as the need for ongoing adaptation to technological advances in the fight against online misinformation and harmful content. That said, it remains to be seen how intermediaries will receive the Advisory, given the complex jurisprudence that has evolved in this area over the past few years and, in particular, the concerns many players have expressed about assuming an adjudicatory or censorial function when identifying content that infringes a party's intellectual property rights.
The content of this document does not necessarily reflect the views/position of Khaitan & Co and remains solely that of the author(s). For any further queries or follow-up, please contact Khaitan & Co at editors@khaitanco.com.