Australia issues guidance on enforcing social media minimum age rules

On 16 September 2025, the eSafety Commissioner formally adopted the Social Media Minimum Age Regulatory Guidance under the Online Safety Act 2021. The document provides a framework for how providers of age-restricted social media platforms must meet their legal obligations to prevent children under 16 from holding accounts. These obligations will take effect from 10 December 2025.
The guidance requires platforms to take ‘reasonable steps’ to detect and deactivate existing underage accounts, prevent immediate re-registration, and ensure affected users are informed about what happens to their accounts. This includes offering clear options for downloading data, signposting support services for those who may be distressed, and providing appeal or review mechanisms when users contest an age determination. Importantly, platforms must also demonstrate ongoing improvements in their age assurance systems, reflecting the regulator’s position that compliance cannot remain static but should adapt to evolving technologies and risks.
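To make these obligations concrete, here is a minimal, purely illustrative Python sketch of how a platform might handle a detected under-16 account. Every name in it (the Account type, the endpoints, the 30-day re-registration window) is a hypothetical assumption introduced for the example; the guidance sets the required outcomes, not the implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class Account:
    user_id: str
    active: bool = True
    # Hypothetical field used to block immediate re-registration.
    reregistration_blocked_until: Optional[datetime] = None

def deactivate_underage_account(account: Account, block_days: int = 30) -> dict:
    """Deactivate a detected under-16 account and return the user-facing
    information the guidance expects platforms to provide. The 30-day
    re-registration block is an illustrative value; the guidance does
    not specify a duration."""
    account.active = False
    account.reregistration_blocked_until = (
        datetime.now(timezone.utc) + timedelta(days=block_days)
    )
    return {
        # Clear option to download account data (hypothetical endpoint).
        "data_download_url": f"/users/{account.user_id}/export",
        # Signposting to support services for users who may be distressed.
        "support_services_url": "/help/support-services",
        # Review mechanism for users who contest the age determination.
        "age_review_url": f"/users/{account.user_id}/age-review",
    }

if __name__ == "__main__":
    notice = deactivate_underage_account(Account(user_id="u123"))
    print(notice)
```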
Rather than prescribing a single method, the guidance takes a principles-based approach. Providers are expected to implement layered measures that are reliable, accurate, privacy-preserving, inclusive, transparent, proportionate to the risk posed by their services, and evidence-based. Acceptable methods may include age estimation, age inference, and age verification, often applied in combination. Reliance solely on self-declared dates of birth is explicitly ruled out, and the collection of government-issued identification cannot be the only option offered to users. Platforms must also ensure compliance with privacy standards under the Privacy Act 1988 and maintain records to show adherence.
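As an illustration of what such a layered approach could look like in practice, the sketch below models a waterfall-style age-assurance check in Python. The signal names, the 0.9 confidence threshold, and the escalation logic are all assumptions made for the example, since the guidance deliberately leaves the choice and combination of methods to providers.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignals:
    """Hypothetical signals a provider might combine (all names illustrative)."""
    self_declared_age: Optional[int]     # never sufficient on its own
    estimated_age: Optional[float]       # e.g. from facial age estimation
    estimation_confidence: float = 0.0   # 0.0 (no signal) to 1.0 (certain)
    inferred_likely_minor: bool = False  # e.g. from behavioural signals

def assess_age(signals: AgeSignals, minimum_age: int = 16) -> str:
    """Layered waterfall: privacy-preserving, low-friction checks first,
    escalating to stronger verification only when uncertainty remains.
    Returns 'allow', 'restrict', or 'needs_verification'."""
    # A self-declared age below the minimum can plausibly be acted on;
    # what is ruled out is relying on self-declaration alone to admit a user.
    if signals.self_declared_age is not None and signals.self_declared_age < minimum_age:
        return "restrict"
    # Age inference: behavioural or contextual signals that the user is
    # likely under the minimum age trigger restriction (and, in a real
    # system, the notification and review flow described above).
    if signals.inferred_likely_minor:
        return "restrict"
    # Age estimation: act on it only when confidence is high. The 0.9
    # threshold is an arbitrary illustrative value.
    if signals.estimated_age is not None and signals.estimation_confidence >= 0.9:
        return "allow" if signals.estimated_age >= minimum_age else "restrict"
    # Unresolved cases escalate to a stronger age-assurance step, which
    # itself must offer alternatives to government-issued ID.
    return "needs_verification"
```

The key design point the example tries to capture is that no branch admits a user on self-declaration alone, and the escalation path mirrors the guidance’s insistence that ID-based verification be one option among several rather than the only one.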
Civil society played a significant role in shaping this outcome. Between June and August 2025, the eSafety Commissioner consulted over 160 organisations and individuals, including representatives from community groups, children and young people, parents, Aboriginal and Torres Strait Islander communities, people with disabilities, culturally and linguistically diverse groups, and those in regional and remote areas. This engagement highlighted concerns about privacy, inclusion, risks of circumvention, and unintended consequences of strict enforcement. Civil society input helped ensure that the final guidance emphasises proportionality, transparency, and accessibility – balancing the need to protect children from online harms with safeguarding fundamental rights, such as privacy, equality, and freedom of expression.