Social media platforms blocked for U16s in Australia from Dec. 10, 2025

Australia’s new social media minimum age law comes into force on 10 December 2025, requiring age-restricted social media platforms to take active steps to prevent anyone under 16 from creating or keeping an account. The law is among the strictest of its kind globally, targeting major services such as Facebook, Instagram, TikTok, Snapchat, X, and YouTube, among others.

The Minister for Communications and Sport, Anika Wells, said in a media statement that social media platforms must implement the law in an effective, private and fair way. The onus will be on platforms to ensure under-16s do not have accounts on their services, and failure to take reasonable steps to comply with the new obligations can attract fines of up to AUD $49.5 million.

“eSafety’s guidance makes clear that platforms must comply with the law, and also provide transparent and accessible information to their users about their age assurance systems.

“We know there won’t be a one-size-fits-all approach to implementing the minimum age, but there are many effective solutions available – many of which are already being used by industry.

“There is no excuse for social media platforms to fail to meet their obligations under the new laws – and from 10 December, there will be significant fines for non-compliance,” the Minister says.

Law Overview

  • The minimum age for social media accounts will be 16, applying to both new and existing accounts.

  • The obligation is solely on social media companies, not children, parents, or guardians—no penalties exist for under-16s themselves or their families.

  • “Age-restricted social media platforms” are defined as services whose significant purpose is to enable online social interaction, that allow users to link to and interact with other users, and that allow users to post material. Messaging-only apps, online games, and platforms with educational, health, or professional networking purposes are excluded.

Enforcement and Penalties

  • Platforms must take “reasonable steps” to verify user age and prevent under-16s from accessing or keeping accounts.

  • Platforms cannot require government-issued ID (such as Digital ID) as the only means of age verification and must offer privacy-conscious alternatives.

  • There are strict privacy protections—any personal information collected to meet the age law must be destroyed after use and cannot be repurposed.

  • Failure to comply can lead to fines of up to 150,000 penalty units for corporations – at the current penalty unit value of AUD $330, roughly AUD $49.5 million.

  • Oversight comes from Australia’s eSafety Commissioner and the Information Commissioner, who can seek compliance information and issue public notices regarding breaches.

Context and Debates

  • The law is a response to mounting concerns about the mental health risks and online harms social media poses to young Australians.

  • Age verification may involve various techniques; no single method is deemed universally effective, and data privacy concerns are a central point of discussion.

  • The new law amends the Online Safety Act 2021 and has bipartisan support, following public debate about youth digital wellbeing and parent advocacy.

This policy places Australia at the vanguard of digital child protection regulation, balancing child safety, privacy, and realistic enforcement demands.

Regulatory Guidance

https://www.esafety.gov.au/industry/regulatory-guidance

By SAT News Desk
