The government of India has introduced a new Standard Operating Procedure (SOP) targeting non-consensual intimate content online, mandating that intermediaries must remove or disable access to such content within 24 hours of a valid complaint. This move marks a significant step in protecting digital dignity and privacy.
What the Mandate Covers
Under the SOP issued by the Ministry of Electronics & Information Technology (MeitY), online platforms and intermediaries must act within 24 hours of receiving a complaint about non-consensual intimate imagery (NCII).
Key provisions:
- The SOP is issued under Rule 3(2)(b) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
- The content category includes intimate images/videos shared without consent, morphed intimate imagery, or nudity/sexual acts involving the individual without consent.
- Multiple channels for victims to report such content:
  - One-Stop Centres (OSCs) for legal & psychological aid.
  - In-app reporting/grievance officers of platforms.
  - The National Cybercrime Reporting Portal (NCRP) or dialing 1930.
  - Local law-enforcement agencies.
- Intermediaries must not only act on takedowns within 24 hours but also deploy preventive technology such as hash-matching and crawlers; this applies especially to Significant Social Media Intermediaries (SSMIs). A minimal sketch of hash-matching appears after this list.
- Search engines must de-index flagged URLs, and domains must render the content inaccessible, within the same time frame.
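To make the preventive-technology requirement concrete, here is a minimal sketch of exact hash-matching in Python, assuming a platform keeps a store of SHA-256 digests for content it has already removed after valid complaints. The class and function names are illustrative only; the SOP does not prescribe any particular implementation.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


class RemovedContentIndex:
    """In-memory stand-in for a platform's store of removed-content hashes."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register_removed(self, path: str) -> None:
        # Record the hash of content taken down after a valid complaint.
        self._hashes.add(sha256_of_file(path))

    def is_known_reupload(self, path: str) -> bool:
        # Check an incoming upload against previously removed content.
        return sha256_of_file(path) in self._hashes
```

In practice such an index would be backed by a database and, for SSMIs, could be reconciled with hashes shared via the Sahyog Portal; exact digests only catch byte-identical re-uploads, a limitation discussed under "Challenges and Considerations" below.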
Why It Matters
Victim Protection & Digital Dignity
This SOP emphasises victims’ ability to reclaim their online identity and dignity. Many people affected by NCII face long-term psychological and social harm, and the 24-hour rule enables much faster relief.
Platform Accountability & Technology Use
By mandating hash-matching and crawler technology to catch repeat uploads, the government is pushing platforms to be proactive rather than merely reactive in managing NCII content.
Legal & Regulatory Clarity
While the underlying law (IT Rules 2021) already applied, the SOP provides a clearer, victim-centric framework for how reporting and takedown should function in practice. It also follows a directive from the Madras High Court after a high-profile case of image-sharing without consent.
Background: How We Got Here
In July 2025, the Madras High Court, in a case involving a woman advocate whose intimate images were shared online without consent, directed the government to establish a “prototype” procedure for victims of non-consensual intimate imagery.
Following this, MeitY developed the SOP and released it in November 2025.
This marks the next phase of India’s digital-safety regulation, with the focus expanding from general content moderation to specialised categories such as NCII, backed by specific timelines and mechanisms.
Implementation: What Will Platforms & Victims Need to Do?
For Platform / Intermediary
- Set up a grievance redress channel aligned with the SOP.
- On receiving a complaint regarding NCII, remove or disable access to the content within 24 hours (a simple deadline-tracking sketch follows this list).
- For SSMIs: use hash-matching/crawler tech to detect re-uploads.
- Coordinate with the Indian Cybercrime Coordination Centre (I4C) via the Sahyog Portal to share hashes and track repeat content.
- Inform the complainant of the action taken and provide updates if the content reappears.
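As a rough illustration of what tracking the 24-hour window might look like inside a grievance system, the sketch below models a complaint record with a deadline check. All class and field names are hypothetical and not drawn from the SOP; a real system would persist these records and tie them to notification and escalation workflows.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=24)


@dataclass
class NCIIComplaint:
    complaint_id: str
    received_at: datetime                 # when the valid complaint was received
    resolved_at: datetime | None = None   # when access was removed or disabled

    def deadline(self) -> datetime:
        # The 24-hour window runs from receipt of the complaint.
        return self.received_at + TAKEDOWN_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now > self.deadline()


def overdue_complaints(complaints: list[NCIIComplaint]) -> list[NCIIComplaint]:
    """Complaints that have breached the 24-hour window and need escalation."""
    return [c for c in complaints if c.is_overdue()]
```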
For Victims
- File a complaint via OSC / platform grievance officer / NCRP / local police as appropriate.
- Keep records of complaint submission and responses from the platform.
- If dissatisfied with platform action, seek recourse via grievance appellate mechanism (as provided in the SOP).
Challenges and Considerations
- Scope of definition: Some analysts note that the SOP defines NCII narrowly (nudity, sexual acts) and may exclude suggestive imagery or victims other than women and girls.
- Encrypted and unregulated platforms: Content shared via messaging apps or file-sharing networks may fall outside the practical reach of platforms and intermediaries.
- Effective prevention of re-uploads: Exact hash-matching helps, but mirror sites, edited copies, and new uploads can circumvent it (see the perceptual-hashing sketch after this list).
- Awareness and victim access: For victims to benefit, public awareness of the SOP is key; the court has already asked state and union governments to publicise it.
- Resource and enforcement capacity: Ensuring that all intermediaries comply, and that the government has adequate oversight and monitoring mechanisms, will be important.
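One common mitigation for edited re-uploads is perceptual hashing. The sketch below implements a simple "average hash", assuming the third-party Pillow imaging library is installed; unlike a cryptographic digest, it yields similar fingerprints for re-compressed or lightly edited copies, though heavy cropping or re-editing can still defeat it. It is illustrative only and not part of the SOP.

```python
from PIL import Image  # third-party Pillow package, assumed installed


def average_hash(path: str, hash_size: int = 8) -> int:
    """Grayscale, shrink to hash_size x hash_size, then threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


# Fingerprints within a small Hamming distance (say, 5 of the default 64 bits)
# are usually treated as near-duplicates, catching copies that would evade
# exact hash-matching.
```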
Looking Ahead
- The SOP is described as an “evolving document”: MeitY may update it in response to technological change and implementation feedback.
- Monitoring of compliance: How quickly platforms meet the 24-hour target, how many complaints result in removal, and how effectively re-uploads are prevented will be key metrics.
- Expanded scope: Future iterations may broaden the definition (for example, to cover deepfake imagery and victims beyond women and girls) and will likely address encrypted and private-messaging scenarios.
- Awareness campaigns: Governments and NGOs may run outreach so that victims know their rights and the channels available.
- Global comparison: India’s timeline of 24 hours places it among stricter regimes globally for NCII takedown.
Conclusion
With the new SOP mandating a 24-hour takedown of non-consensual intimate content online, India has taken an important step in safeguarding digital dignity and privacy. The measure strengthens victim-centric mechanisms, tightens platform accountability, and sets a clearer standard for timely action.
While challenges remain — in definitions, technological enforcement, and awareness — this framework gives victims a more structured remedy and sends a strong signal about digital safety priorities.


