KEY HIGHLIGHTS
- Precautions: The Advisory reminds intermediaries of their duty under Rule 3[1][b] to make “reasonable efforts” to ensure that users on their platforms do not host, display, upload, or share any information that is “obscene, pornographic, paedophilic, harmful to children, or otherwise unlawful.” To implement this, the intermediaries must deploy accessible reporting and grievance redressal systems.
- Takedown Orders and Grievance Redressal: Where intermediaries receive takedown orders from the Courts, or from the Appropriate Government or its authorised agency, they must expeditiously remove such content or disable access to it within the timelines prescribed under Rule 3[1][d]. The Advisory also reiterates the Rule 3[2] requirement to promptly remove or disable access to prima facie objectionable sexual content within twenty-four hours of receiving a complaint from the victim or anyone on their behalf.
- Consequences of Non-Compliance: The Advisory cautions intermediaries that non-compliance with any of these provisions can lead to the loss of the safe harbour protection under Section 79 of the IT Act, as well as possible criminal action under the Bharatiya Nyaya Sanhita, 2023 [“BNS”]. It also notes that the dissemination of such sensitive content is a punishable offence under specialised statutes such as the Protection of Children from Sexual Offences Act, 2012 and the Indecent Representation of Women [Prohibition] Act, 1986.
- Actionable Advice: Intermediaries are advised to conduct an immediate review of their compliance frameworks, content moderation arrangements, and user-facing enforcement processes, and to maintain continuous compliance with the requirements of the IT Act and the IT Rules.
- Additional Measures by SSMIs: The Advisory also prescribes additional measures required of Significant Social Media Intermediaries [“SSMIs”], i.e., social media intermediaries with more than fifty lakh registered users in India, given the vast amounts of personal data they handle. Specifically, SSMIs are required to deploy automated technology for timely compliance, in addition to the general precautions.
LEGAL AND REGULATORY OBLIGATIONS
Adherence to these strict timelines and proactive measures is a prerequisite for retaining the safe harbour protection under Section 79 of the IT Act. Failure to comply with these guidelines not only revokes this protection but can also expose intermediaries to criminal action and penalties.
Failure to discharge the due diligence obligations under Section 79 of the IT Act and the IT Rules, 2021 results in the loss of this immunity, making intermediaries directly liable for third-party content hosted on their websites and portals. In turn, these websites and portals and their responsible officers can face criminal liability under the IT Act, the BNS, and other applicable laws and regulations.
LOOKING FORWARD
Given MeitY’s recent and consistent emphasis on compliance with the law, this Advisory serves as an important reminder for intermediaries to make appropriate investments in content moderation systems and automation tools. Adequate measures enabling the time-bound identification and removal of sensitive content must be strictly implemented; failure to do so puts the continued availability of the safe harbour protection at risk.
ACTIONABLE ADVICE
- Ensure that users are clearly informed of prohibited categories of content through platform policies, terms of service, and user notifications. These documents must be easily accessible on the platform.
- Conduct an internal assessment of compliance frameworks, content moderation practices and user enforcement mechanisms, ensuring that they strictly adhere to the IT Act, the 2021 Rules thereunder and other relevant laws and regulations.
- Regularly review and update the user terms of service and the community and platform guidelines to prevent the upload or dissemination of any obscene, indecent, pornographic, sexually explicit, or paedophilic content, and promptly remove any such material if uploaded.
- Ensure the presence of a notice and takedown system in accordance with Rule 3[1][d] of the 2021 Rules, respecting the required timelines.
- Have an easily accessible grievance redressal mechanism with the complaint being acknowledged and resolved in terms of Rule 3[2] of the 2021 Rules.
- Establish processes to remove content depicting individuals in sexual acts within 24 hours of the receipt of a valid complaint, in accordance with Rule 3[2][b] of the 2021 Rules.
- Where necessary, deploy technology-based and automated processes to proactively identify and prevent the uploading and dissemination of prohibited content, as specified under Rule 4 of the 2021 Rules for SSMIs.
- SSMIs must also ensure the appointment of a Chief Compliance Officer, Nodal Contact Person, and Resident Grievance Officer, and the publishing of their contact details on the platform as required under Rule 4 of the IT Rules, 2021.
- Ensure that content moderation, grievance redressal and other policies comply with applicable privacy and data protection obligations.
- For more holistic protection, update the “User Terms” to mandate the identification/labelling of AI-generated or AI-modified content, and implement a “Report AI/Deepfake” category in the grievance redressal portal.
- Routinely train internal staff on content moderation, compliance, and grievance handling.
- Preserve/retain all relevant information and maintain all necessary records and audit trails related to the receipt of take-down notifications, grievance procedures, etc., for reporting purposes in case of any queries from the concerned authorities, as required by law.