The Madras High Court’s suggestion that India consider an Australia-style framework to restrict children’s access to social media could have far-reaching implications for digital advertising, platform targeting, and content moderation across the country’s online ecosystem.
A bench comprising Justices G. Jayachandran and K. K. Ramakrishnan made the observation on 23 December while disposing of a public interest litigation on the easy availability of pornographic content on the internet. The court urged the Union government to explore legislation similar to Australia’s and, in the interim, called on authorities to intensify public awareness campaigns.
“Union of India may explore the possibility of passing legislation like Australia. Till such legislation is passed, the authorities concerned shall accelerate their awareness campaign more effectively. They shall take the message to the vulnerable group through all available media,” the court said.
Why the ruling matters to platforms
The reference to Australia is particularly significant for social media platforms operating in India. Australia's Online Safety Amendment (Social Media Minimum Age) Act 2024 places the onus on platforms to take reasonable steps to prevent users under 16 from creating or holding accounts, rather than on parents or children themselves.
If a comparable regime is introduced in India, platforms may need to bolster age-verification systems, redesign onboarding processes, and impose stricter controls on youth access. Such changes could affect platform scale, especially for services where teenagers form a sizeable share of daily active users and engagement.
From an advertising standpoint, curbs on underage access would also reduce the addressable audience for categories that rely on mass reach, including gaming, entertainment, edtech, FMCG, and mobile-first brands.
Impact on ad targeting and media planning
Restrictions on children’s access to social media could reshape audience segmentation and targeting strategies. Advertisers may increasingly depend on age-gated environments, contextual advertising, and first-party data as regulatory scrutiny around minors grows.
Brands focused on younger audiences could also pivot towards alternative channels such as connected TV, gaming platforms, influencer-led family content, and offline activations, depending on how enforcement evolves.
For platforms, clearer separation between adult and minor audiences could influence ad inventory pricing, brand safety positioning, and compliance-driven product changes.
The public interest litigation was filed in 2018 by S. Vijayakumar, a resident of Madurai, who highlighted concerns over children’s exposure to pornographic material online. The petition sought directions for child rights commissions and internet service providers to implement parental control—or “parental window”—systems, alongside broader awareness initiatives.
During the hearing, the petitioner’s counsel cited Australia’s social media legislation, arguing that platform-level controls could limit children’s exposure to inappropriate content without penalising parents or minors.
Internet service providers countered that intermediaries already operate under existing legal frameworks. They told the court that under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, objectionable websites are reviewed and blocked when violations are reported.
The bench also noted that child rights commissions have statutory obligations under the Commissions for Protection of Child Rights Act, 2005, to promote awareness of child rights and safeguards.
“The Commission has a statutory duty and responsibility to spread child right literacy among various sections of the society and promote awareness of the safeguards available for protection of these rights,” the court said, adding that current efforts were inadequate.
While stopping short of issuing binding directions, the court observed that child sexual abuse material continues to be accessible online, underscoring the need for device-level parental controls and stronger parental awareness.
“Ultimately, it is the individual choice and right to access such obnoxious material or to avoid it. As far as children are concerned, the vulnerability is high, so the parents’ responsibility is higher,” the court said.