by Admin
23 February 2026 5:55 AM
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, as updated up to 10.02.2026, mark a significant expansion of India’s regulatory framework governing intermediaries, digital media publishers, and online gaming platforms. Notified under Section 87 of the Information Technology Act, 2000, the Rules replace the 2011 Intermediary Guidelines and introduce a layered compliance architecture covering social media intermediaries, significant social media intermediaries, online gaming intermediaries, publishers of news and current affairs content, and publishers of online curated content.
The most striking updates include express regulation of “synthetically generated information” (deepfakes and AI-generated content), a compressed three-hour takedown timeline upon actual knowledge, mandatory labelling and metadata requirements for AI-generated content, and a comprehensive framework for verification of online real money games.
Expanded Definitions: Deepfakes, Gaming and Digital Media Clarified
The amended Rules significantly widen definitional scope. “Synthetically generated information” is defined to include audio, visual or audio-visual information artificially or algorithmically created, generated, modified or altered in a manner that appears real or authentic. However, the proviso clarifies that routine editing, formatting, accessibility improvements, or good-faith design activities that do not materially misrepresent substance are excluded.
The definition is carefully crafted to capture deepfakes and deceptive AI-generated media while shielding legitimate editing and accessibility enhancements. Rule 2(1A) further clarifies that any reference to “information” used to commit an unlawful act includes synthetically generated information.
Online gaming has been separately defined with nuanced categories such as “online real money game,” “permissible online real money game,” and “online gaming self-regulatory body,” creating a formal verification ecosystem.
Due Diligence Under Rule 3: Compressed Timelines and Enhanced Liability
Rule 3 mandates extensive due diligence by intermediaries, including social media intermediaries and online gaming intermediaries.
A key amendment now requires removal or disabling of access to unlawful information within three hours of receiving “actual knowledge” under Section 79(3)(b) of the IT Act. Actual knowledge arises only through a court order or a reasoned written intimation from an authorised government officer not below specified rank, subject to monthly review for proportionality and legality.
This drastically reduces the earlier 36-hour compliance window and heightens operational pressure on intermediaries.
Rule 3(1)(c) and (ca) impose enhanced user-notification obligations, including mandatory communication regarding consequences of creating or disseminating unlawful or synthetically generated information.
Intermediaries enabling AI-based creation or modification of content must deploy “reasonable and appropriate technical measures” to prevent generation of unlawful synthetically generated information, particularly content involving child sexual abuse material, non-consensual intimate imagery, false electronic records, or misrepresentation likely to deceive.
All synthetically generated information that is not unlawful must be prominently labelled and embedded with permanent metadata or technical provenance markers identifying the generating resource. Suppression or removal of such labels or metadata is expressly prohibited.
The Rules thereby introduce an enforceable provenance and watermarking obligation in India’s digital regulatory framework.
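By way of illustration only, a labelling-plus-provenance obligation of this kind could be satisfied by attaching a visible label and a tamper-evident metadata record to each piece of generated content. The structure below is a hypothetical sketch: the Rules require permanent metadata or provenance markers but do not prescribe any particular format, and the helper name and field names here are the author's own.

```python
import hashlib
import json
from datetime import datetime, timezone

# User-facing label; the Rules require prominence, not this exact wording.
LABEL = "This content is synthetically generated."

def make_provenance_record(content: bytes, generator_id: str) -> dict:
    """Build an illustrative provenance record for generated content.

    Hypothetical structure: identifies the generating resource and binds
    the record to this exact content via a cryptographic hash, so that
    stripping or altering the content breaks the record.
    """
    return {
        "label": LABEL,
        "generator": generator_id,  # the computer resource that generated the content
        "sha256": hashlib.sha256(content).hexdigest(),
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }

record = make_provenance_record(b"<generated-media-bytes>", "example-ai-model-v1")
print(json.dumps(record, indent=2))
```

In practice, platforms are more likely to adopt an industry provenance standard than an ad hoc record like this; the sketch only shows the kind of binding between label, content, and generating resource that the Rules appear to contemplate.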
Significant Social Media Intermediaries: Additional Obligations and Deepfake Declaration
Significant social media intermediaries are subject to enhanced obligations under Rule 4. They must appoint a Chief Compliance Officer, a Nodal Contact Person for 24×7 coordination with law enforcement, and a Resident Grievance Officer.
Under newly inserted Rule 4(1A), such intermediaries must require users to declare whether information is synthetically generated and must deploy technical tools to verify such declarations. If confirmed, the content must be clearly labelled as synthetically generated.
Failure to exercise due diligence in relation to synthetically generated content may expose the intermediary to loss of safe harbour under Section 79.
Traceability obligations under Rule 4(2) continue to apply to messaging platforms, requiring identification of the first originator upon judicial or Section 69 orders for specified serious offences.
Additionally, proactive monitoring through automated tools is mandated for content depicting rape, child sexual abuse, or identical re-uploads of previously removed unlawful content, subject to proportionality and human oversight safeguards.
Grievance Redressal and Appellate Oversight
Intermediaries must acknowledge complaints within 24 hours and resolve them within seven days, with certain categories such as non-consensual intimate imagery requiring removal within two hours.
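The timelines above can be expressed as simple deadline arithmetic. The helper below is a hypothetical sketch for illustration, not a compliance tool prescribed by the Rules:

```python
from datetime import datetime, timedelta

def compliance_deadlines(received_at: datetime, is_ncii: bool = False) -> dict:
    """Compute grievance deadlines from the complaint timestamp.

    Illustrative sketch of the timelines described above: acknowledge
    within 24 hours, resolve within 7 days, and remove non-consensual
    intimate imagery (NCII) within 2 hours.
    """
    deadlines = {
        "acknowledge_by": received_at + timedelta(hours=24),
        "resolve_by": received_at + timedelta(days=7),
    }
    if is_ncii:
        deadlines["remove_by"] = received_at + timedelta(hours=2)
    return deadlines

t0 = datetime(2026, 2, 23, 9, 0)
for name, due in compliance_deadlines(t0, is_ncii=True).items():
    print(name, due.isoformat())
```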
The Grievance Appellate Committee mechanism under Rule 3A allows users to appeal against decisions of Grievance Officers within thirty days. The Committee must endeavour to resolve appeals within thirty calendar days, operating entirely through digital mode.
Orders of the Grievance Appellate Committee are binding and must be complied with by intermediaries or online gaming self-regulatory bodies.
Online Gaming Framework: Verification, Compliance and Financial Safeguards
Rules 4A and 4B introduce a structured verification mechanism for online real money games. Only games verified by designated online gaming self-regulatory bodies may qualify as “permissible online real money games.”
Self-regulatory bodies must be companies incorporated under Section 8 of the Companies Act, 2013, with independent boards comprising experts in gaming, psychology, ICT, child rights and public policy. Verification requires confirmation that the game does not involve wagering on outcomes and complies with user-protection safeguards.
Online gaming intermediaries must implement know-your-customer (KYC) procedures equivalent to RBI-regulated customer verification standards before accepting deposits. Financing of gaming activity by way of credit is prohibited.
Permissible online real money games must display a visible verification mark issued by the self-regulatory body. Monthly compliance reporting and grievance redressal obligations apply.
The Ministry retains power to suspend or revoke designation of self-regulatory bodies and to issue interim directions in public interest.
Digital Media Regulation: Three-Tier Code of Ethics
Part III of the Rules establishes a three-tier regulatory structure for publishers of news and current affairs content and publishers of online curated content.
Level I requires publishers to appoint a Grievance Officer and resolve complaints within fifteen days.
Level II mandates membership in a self-regulating body headed by a retired judge or eminent independent person, empowered to issue warnings, require apologies, mandate reclassification of content, or recommend deletion for public order concerns.
Level III vests oversight in the Ministry of Information and Broadcasting through an Inter-Departmental Committee chaired by an Authorised Officer. In emergency cases, the Secretary, Ministry of Information and Broadcasting may direct blocking of content without prior hearing, subject to subsequent review.
The Review Committee constituted under the Indian Telegraph Rules must examine blocking directions at least once every two months.
Content Classification and Access Control
Publishers of online curated content must classify content into U, U/A 7+, U/A 13+, U/A 16+, and A categories, guided by the Schedule’s content descriptors including violence, nudity, language, discrimination and substance abuse.
For U/A 13+ and above content, parental controls must be implemented. For “A” rated content, reliable age verification mechanisms are mandatory.
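The tiered access-control requirements above amount to a simple mapping from rating category to required safeguards. The sketch below is hypothetical (the category names follow the Rules' Schedule, but the gating logic is the author's illustration):

```python
# Rating categories in ascending order of restriction, per the Schedule.
RATINGS = ["U", "U/A 7+", "U/A 13+", "U/A 16+", "A"]

def required_controls(rating: str) -> set:
    """Return the access controls required for a given rating.

    Illustrative sketch: parental controls for U/A 13+ and above,
    plus reliable age verification for "A" rated content.
    """
    controls = set()
    if RATINGS.index(rating) >= RATINGS.index("U/A 13+"):
        controls.add("parental_controls")
    if rating == "A":
        controls.add("age_verification")
    return controls

for rating in RATINGS:
    print(rating, sorted(required_controls(rating)))
```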
Accessibility measures for persons with disabilities must be implemented “to the extent feasible.”
Safe Harbour Conditionality and Liability
Rule 7 explicitly provides that failure to observe these Rules results in loss of safe harbour protection under Section 79(1) of the IT Act, exposing intermediaries to liability under the IT Act and the Bharatiya Nyaya Sanhita, 2023.
Compliance is therefore not merely procedural; it is directly linked to the availability of statutory immunity.
Conclusion
The updated IT Rules, 2021 reflect India’s shift toward a hybrid regulatory model combining statutory oversight, algorithmic compliance obligations, traceability requirements, content labelling mandates and industry self-regulation mechanisms.
With the introduction of a strict three-hour takedown rule, explicit regulation of synthetically generated information, mandatory provenance mechanisms for AI content, and a detailed verification ecosystem for online real money gaming, the Rules substantially recalibrate intermediary obligations.
The framework balances content moderation, national security interests, digital innovation and user rights, but simultaneously places heightened compliance burdens on digital platforms operating in India.
Legislation: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (Updated as on 10.02.2026)
Parent Act: Information Technology Act, 2000