Children in Australia face growing online risks, new report finds
The report provides a detailed snapshot of the online realities shaping the development, well-being, and safety of Australian children.

A new report by the eSafety Commissioner, titled ‘Digital Use and Risk: Online Platform Engagement among Children Aged 10 to 15’, offers a comprehensive overview of how Australian children engage with digital platforms and the risks they face in doing so. Based on a nationally representative survey of 2,629 children conducted between December 2024 and February 2025, the report presents detailed findings on social media usage, online harms, and differences across age and gender.
Nearly all children aged 10 to 15 have used social media or communication platforms. YouTube, TikTok, Snapchat, and Instagram emerged as the most commonly used platforms. Platform engagement increases with age and shows marked differences between boys, girls, and trans or gender-diverse youth. Boys more frequently use Reddit, Steam, or gaming chats, while girls are more active on TikTok, Instagram, and FaceTime. Trans and gender-diverse youth show even broader usage patterns across multiple platforms.
However, the report also reveals widespread exposure to online harm. Over half of children reported having experienced cyberbullying, and one in four had encountered online hate targeted at their identity. Girls and gender-diverse youth were more likely to report being targets of cyberbullying, particularly through platforms like Snapchat and Messenger Kids. Boys were more often bullied through online games and YouTube.
The findings highlight further risks such as online grooming-type behaviours, which were reported by 14% of children. These include inappropriate contact from adults or much older individuals, requests for sexual content, and offers of money or gifts. Most of these incidents occurred via social media, particularly Snapchat and Instagram.
Sexual harassment online was reported by 24% of children, and image-based abuse by 8%. Exposure to harmful content, such as sexist, violent, or self-harm-related material, was experienced by 71% of respondents. Older children were more frequently affected, and platforms like YouTube and TikTok were the most common sources of exposure.
Importantly, the report disaggregates data by age, gender, and platform, offering valuable insight into where and how harms occur. It reveals that specific online environments are more closely associated with particular types of risks, such as communication apps for cyberbullying among younger users, or gaming environments for boys.
Why does this matter for civil society?
The report captures the online realities shaping the development, well-being, and safety of Australian children. For civil society organisations working in child protection, digital rights, education, and public health, the findings underscore the urgent need for coordinated responses. They point to gaps in platform accountability, the need for stronger protective regulation, and the importance of digital literacy and harm prevention programs tailored to children’s diverse experiences.
Equipping families, educators, and communities with the tools and awareness to support children in navigating digital spaces is no longer optional. As civil society continues to advocate for human rights in online environments, this evidence forms a critical foundation for policy development, advocacy, and service delivery that prioritises safety, equity, and inclusion for all young users.