Privacy in a ‘fishbowl society’


Source: This post is based on the article “Privacy in a ‘fishbowl society’” published in “The Hindu” on 3rd December 2025.

UPSC Syllabus: GS Paper-3: Technology

Context: India today faces a “fishbowl society” where pervasive technological surveillance and AI-based harms challenge the traditional understanding of privacy. Despite the Puttaswamy judgment (2017), the IT Act (2000), and the Digital Personal Data Protection Act (2023), the lived experience of privacy violations—especially through deepfakes and non-consensual intimate imagery (NCII)—remains inadequately addressed.

Society’s over-reliance on technology, as highlighted by Meredith Broussard in Artificial Unintelligence, has left individuals unprepared to cope with AI-driven risks. Deepfake algorithms create pornographic images without consent, pushing individuals into forced visibility and loss of autonomy. Harms arising from NCII extend far beyond privacy loss and include psychological distress, fear, stigma, and long-term damage to dignity and bodily integrity.

About Standard Operating Procedure (SOP) to Curtail Dissemination of Non-Consensual Intimate Imagery (NCII)

  1. The Ministry of Electronics and IT (MeitY) issued a Standard Operating Procedure to strengthen mechanisms for removing and preventing the spread of NCII content online.
  2. The SOP was developed following directions of the Madras High Court and aims to ensure swift, uniform and victim-centric action across platforms and agencies.
  3. It provides clear guidance to victims, intermediaries and law enforcement agencies for reporting and promptly removing intimate or morphed images shared without consent.
  4. The SOP mandates that all intermediaries take down or disable access to reported NCII content within 24 hours of receiving a complaint.
  5. Victims can report incidents through multiple channels, including One Stop Centres, the National Cybercrime Reporting Portal (NCRP), in-app grievance mechanisms, and local police stations.
  6. Significant Social Media Intermediaries are required to use hash-matching and crawler tools to prevent re-uploads of the same or similar NCII content.
  7. The SOP strengthens inter-agency coordination by involving I4C as the central aggregator of complaints, DoT for URL blocking, and MeitY for monitoring compliance.
  8. Overall, the SOP aims to empower individuals, especially women, to regain control over their digital identities, and reinforces the government’s commitment to ensuring privacy, dignity and safety in cyberspace.

Limitations

  1. The SOP lacks a gender-neutral framework and fails to recognise the vulnerabilities of transgender persons, despite Supreme Court recognition of the third gender.
  2. It does not clearly define the accountability of platforms or AI developers, nor does it specify penalties or enforcement mechanisms.
  3. It lacks detailed regulations on deepfake generation, dissemination, traceability, and investigation, limiting its effectiveness.
  4. The SOP remains merely a starting point and cannot replace the need for comprehensive legislation.

Key Challenges

  1. Lack of data and under-reporting: The NCRB does not collect or publish disaggregated statistics on NCII or cyberbullying, making the scale of the problem invisible.
    1. An RTI filed in October 2025 revealed that the Union government lacks specific data and places the responsibility on States, showing systemic data gaps.
    2. Social stigma and fear of victim-blaming discourage many survivors, especially women and transgender persons, from reporting incidents.
  2. Limited public awareness: Many young users remain unaware of what crimes such as voyeurism, deepfake pornography, or revenge porn legally constitute. Digital illiteracy, combined with societal shaming, further prevents victims from seeking legal remedies.
  3. Weak institutional capacity: Police officials often lack adequate training and sensitivity to handle NCII cases effectively. Cyber-investigative capacity remains limited, leading to slow or ineffective responses. Conviction rates remain low despite the filing of thousands of complaints across the country.

Why Laws Alone Are Not Enough

  1. Existing legal provisions remain ineffective without awareness, accessibility, and societal acceptance.
  2. Victims often avoid reporting due to shame, fear, and lack of trust in investigative systems.
  3. Rapid technological advancements have outpaced current legal and institutional capacities, leaving victims unprotected.

Way Forward

  1. Dedicated NCII legislation: A comprehensive, gender-neutral law should be enacted to specifically address NCII and deepfake harms. Such a law must define the duties of platforms, intermediaries, and AI developers and incorporate strict traceability and takedown norms.
  2. Strengthening institutional capacity: Police forces need specialised training in cyber investigations and gender-sensitive handling of NCII cases. Governments must invest in advanced digital forensic infrastructure to expedite evidence gathering.
  3. Victim-centric mechanisms: Anonymous reporting systems, confidential complaint processes, and psychological support services should be institutionalised. Fast-track mechanisms for content removal, legal assistance, and compensation must be ensured.
  4. Platform and AI developer accountability: Online platforms must conduct mandatory risk assessments and adopt watermarking and detection tools for AI-generated images. Clear obligations must be imposed on platforms to promptly remove harmful content and cooperate with law enforcement.
  5. Public awareness and education: Government and civil society must launch nationwide campaigns to improve digital literacy and awareness of cyber rights. Educational institutions should incorporate modules on consent, online safety, and gender justice.
  6. Independent oversight: A specialised regulatory body for AI harms and digital safety should be established to audit platforms and enforce compliance.

Conclusion: Deepfakes and NCII have transformed privacy from a purely legal right into a domain increasingly shaped by technological vulnerabilities. While the 2025 SOP is a crucial step, it remains insufficient without comprehensive, gender-neutral laws, capable institutions, platform accountability, and robust support mechanisms for victims. A holistic and proactive approach is essential to safeguard dignity, autonomy and digital safety in an increasingly transparent and technology-driven society.
Question: “India today lives in a ‘fishbowl society’ where AI-driven harms have outpaced existing legal protections.” Discuss in the context of rising deepfake and NCII crimes.
