The Union government has issued a strong warning to social media companies and digital intermediaries, cautioning that continued inaction against obscene and unlawful content could invite serious legal consequences.
In an advisory dated December 29, 2025, the Ministry of Electronics and Information Technology (MeitY) directed online platforms to urgently reassess their content governance practices. The ministry made it clear that the legal protection available to intermediaries under Section 79 of the Information Technology Act, 2000 is conditional and applies only when platforms demonstrate adequate “due diligence” in moderating third-party content.
Compliance Failures Flagged
According to MeitY, a recent assessment revealed that several platforms are failing to respond effectively to content that is obscene, vulgar, or otherwise illegal. The ministry noted that such lapses undermine user safety and violate existing legal obligations.
Obligations Under IT Rules, 2021
The advisory reiterates that the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 require platforms to actively prevent users from creating or circulating content that is:
• Pornographic or sexually explicit
• Pedophilic or exploitative of children
• Harmful to minors in any form
Platforms are expected to take reasonable and proactive measures to ensure such material does not remain accessible.
Mandatory Action Timelines
MeitY has also reminded intermediaries of strict content takedown deadlines:
• Sexual or impersonation-related content must be removed within 24 hours of receiving a complaint.
• Content flagged through court orders or government directions must be disabled within the timelines prescribed under the IT Rules.
For major social media platforms, the government stressed the need to move beyond reactive moderation and deploy automated and technology-driven tools to detect and curb unlawful content at scale.
Legal Risks for Non-Compliance
The advisory leaves little room for ambiguity. Platforms that fail to comply may face:
• Withdrawal of safe-harbour protection, exposing them to direct liability for user-generated content.
• Criminal proceedings under the IT Act, the Bharatiya Nyaya Sanhita (BNS), and other applicable laws, potentially affecting companies, their executives, and even users.
Government’s Message
Through this advisory, the Centre has signaled a tougher enforcement stance, emphasizing that content moderation is no longer optional or complaint-driven but a core legal responsibility for digital platforms operating in India.
By Aaradhay Sharma