India's Digital Regulatory Landscape to Safeguard Young Users

  • 31 Mar 2026
  • 12 min read

For Prelims: Artificial Intelligence, Big Data, Body Dysmorphic Disorder, Digital Literacy, Personal Data, Data Fiduciaries, Synthetically Generated Information (SGI), Dark Patterns, Deepfake, Juvenile Justice Act, 2015, POCSO Act, 2012.                            

For Mains: Key facts regarding social media and associated concerns for young users, India's regulatory framework governing social media use for young users and measures needed to mitigate the negative impact of social media on young users.

Source: BS

Why in News?

A Los Angeles court (United States) found the social media platforms Meta (Instagram) and YouTube negligent in platform design and in failing to adequately warn young users about associated risks; the companies were ordered to pay a collective USD 6 million in damages.

  • The case highlighted features like infinite scrolling, algorithm-led recommendations, and autoplay videos as deliberate tools used to ensure children "never put down the phone."

Summary

  • Global regulations are tightening as courts hold social media giants accountable for addictive designs like infinite scrolling. 
  • India utilizes the DPDP Act and IT Rules to mandate parental consent and age-gating. 
  • Balancing innovation with child safety, these frameworks aim to mitigate mental health risks and digital exploitation.

What Does the Meta-YouTube Ruling Mean for Social Media Platforms?

  • Removal of the "Neutral Pipe" Defense: Historically, platforms argued they were just "neutral pipes" (intermediaries) and weren't responsible for the content flowing through them under Section 230 of the U.S. Communications Decency Act, 1996 or Section 79 (safe harbour clause) of the Information Technology (IT) Act, 2000 (India).
  • Algorithmic Transparency: Platforms will now likely be forced to conduct Design Risk Assessments. If an internal document (like the "Facebook Files") shows they knew a feature caused body dysmorphia or addiction but launched it anyway, it constitutes "malice" or "reckless disregard."
  • Radical Redesign of User Experience: Platforms may be forced to disable or modify core features for minors, such as replacing endless feeds with "stop" points or "you're all caught up" messages, and to eliminate design tricks that make it difficult for users to log off or delete accounts.
  • Global Regulatory Shift: The ruling coincides with international efforts to protect minors, including Australia’s restrictions on social media use for under-16s and a U.K. pilot program testing age-based bans.
  • Impact on India’s Regulatory Landscape: The Digital India Act, 2023 may move away from "safe harbour" protections and toward product liability.
    • The IT Rules, 2026 already mandate stricter removal timelines (3 hours) for harmful AI content. This verdict adds a new layer: platforms could be sued for the mental health impact of their recommendation engines.

What is India's Regulatory Framework Governing Social Media Use for Young Users?

  • Digital Personal Data Protection (DPDP) Act, 2023: Platforms are strictly prohibited from processing any personal data of a child (defined as anyone under 18) without the verifiable consent of a parent or lawful guardian.
    • Data fiduciaries (platforms) cannot engage in tracking, behavioral monitoring, or targeted advertising directed at children.
    • Non-compliance regarding children’s data can attract fines of up to Rs 250 crore.
  • Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: OTT and digital media platforms must classify content into categories (U, U/A 7+, U/A 13+, U/A 16+, and A (Adult)). Platforms are mandated to provide parental locks for content rated U/A 13+ and above, and reliable age-verification mechanisms for content classified as “A”.
    • Intermediaries must remove content harmful to children within 3 hours (2 hours for non-consensual sexual/intimate content) of a government or court order.
    • The 2026 amendments require platforms to label Synthetically Generated Information (SGI) to prevent children from being misled by AI-generated misinformation or non-consensual morphed imagery.
  • Protection of Children from Sexual Offences (POCSO) Act, 2012: The Act criminalizes "online grooming" (befriending young people for sexual abuse) and the storage or distribution of Child Sexual Abuse Material (CSAM).
    • Social media intermediaries are legally obligated to report any instances of sexual offences against children on their platforms to law enforcement agencies.
  • State-Level Interventions: Karnataka announced a ban on social media use for children under 16 to prevent digital addiction, while Andhra Pradesh proposed a ban for children under 13, citing mental health concerns and "dark patterns" in app design.
  • Juvenile Justice (Care and Protection of Children) Act, 2015: This Act specifically addresses the online facilitation of child exploitation, including human trafficking and the luring of children through digital platforms. 
  • Information Technology Act, 2000: It provides the overarching legal framework for intermediary liability and content regulation, with provisions on obscenity, privacy, and cybercrimes addressing content harmful to children.

Social Media

  • About: Social media refers to interactive, computer-mediated technologies that facilitate the creation and sharing of information, ideas, and interests through virtual communities and networks. 
    • Unlike traditional media (like TV or newspapers) which is a "one-to-many" broadcast, social media is defined by user-generated content and "many-to-many" interaction.
  • Types of Social Media Platforms:

    • Social Networking (e.g., Facebook, LinkedIn): Connecting with friends or professional networking.
    • Microblogging (e.g., X (formerly Twitter), Threads): Short-form updates and real-time news.
    • Media Sharing (e.g., Instagram, YouTube, TikTok): Visual storytelling through photos and video.
    • Discussion Forums (e.g., Reddit, Quora): Community-based knowledge sharing and debates.

How does Social Media Pose Concerns for Young Users?

  • Engineering of Addiction: Platforms use persuasive design, including infinite scroll and intermittent rewards (likes/notifications), to trigger dopamine releases similar to gambling, making it difficult for minors to self-regulate.
  • Mental Health & Body Image: Constant exposure to curated, filtered lives often leads to "Social Comparison." This is a primary driver for Body Dysmorphic Disorder, anxiety, and depression, particularly among teenage girls.
  • Cyberbullying and Harassment: The anonymity and reach of the internet allow for persistent bullying that follows a child home, leading to severe emotional distress and, in extreme cases, self-harm or suicidal ideation.
  • Data Privacy and Exploitation: Young users often lack the "digital literacy" to understand how their personal data is harvested. Concerns also include predatory behavior and sexual exploitation facilitated by recommendation algorithms.
  • Impact on Brain Development: Excessive screen time can interfere with sleep patterns and physical activity, potentially affecting the prefrontal cortex, which is responsible for impulse control and executive function.
  • The "Filter Bubble" & Radicalization: Algorithms prioritize engagement over truth, often pushing young users toward extreme content or misinformation, which can warp their social and political worldview at a formative age.

What Measures can Mitigate the Negative Impact of Social Media on Young Users?

  • Parental & Home-Based Interventions: Move beyond "blocking" to "co-viewing." Discussing content with children helps them develop a critical lens toward unrealistic beauty standards or misinformation.
    • Children often mimic their parents. Setting a good example by limiting one's own "doomscrolling" is crucial for establishing healthy norms.
  • Educational & School-Based Measures: Teach students to identify "Dark Patterns" (design tricks that drive addiction) and verify information to avoid falling for deepfakes.
    • Implement "lockers for phones" in schools to ensure a focused learning environment and encourage face-to-face social interaction during breaks.
  • Technical & Design Solutions: Implement privacy-preserving age-gating (e.g., via AI face estimation) to ensure children aren't exposed to adult content.
    • Remove features that drive compulsive use, such as infinite scroll and autoplay, for minor accounts. Include system-level prompts that encourage users to take a break after a certain period of continuous use.
  • Utilize the "Right to Be Forgotten": Providing young users with a simplified mechanism to delete their historical digital footprint ensures that mistakes made during their formative years do not permanently haunt their professional or personal futures.
  • Leverage Legal Protections against Exploitation: Strict enforcement of the Juvenile Justice Act, 2015 and the POCSO Act, 2012 is essential to prevent social media from being used as a tool for online grooming or trafficking. Complementary schemes such as Mission Shakti promote the safety, security, and empowerment of women.

Conclusion

The US court ruling underscores a global reckoning with social media's addictive design and its harm to young users. India's regulatory framework, including the DPDP Act and the IT Rules, addresses these concerns but requires stringent enforcement. A multi-stakeholder approach combining parental guidance, education, and platform accountability is crucial.

Drishti Mains Question:

Q. How do social media algorithms, by valuing engagement over accuracy, impact young users?

Frequently Asked Questions (FAQs)

1. What is the definition of a 'child' and the consent requirement under the DPDP Act, 2023?

Under the DPDP Act, a child is an individual below 18 years; platforms must obtain verifiable parental consent before processing their personal data.

2. How do the IT Rules, 2021, address age-appropriate content for minors?

The rules mandate content classification (U/A 7+, 13+, 16+) and require intermediaries to provide parental locks and reliable age-verification mechanisms.

3. What is 'Synthetically Generated Information' (SGI) under the 2026 IT amendments?

SGI refers to AI-generated content or deepfakes; platforms must label them to prevent misleading minors through morphed imagery or misinformation.

UPSC Civil Services Examination Previous Year Question: 

Mains

Q. What are social networking sites and what security implications do these sites present? (2013)

Q. Child cuddling is now being replaced by mobile phones. Discuss its impact on the socialization of children. (2023)
