Child safety online

AI and child online protection

AI provides a powerful tool for protecting children online, but it is also being misused by criminals to harm and exploit children.

The misuse of AI harms children

AI algorithms can help criminals identify weaknesses in online security measures, making it easier to access children’s personal data. AI can also be used to create fake social media accounts, which are then used by criminals to target vulnerable children. AI-enabled chatbots are being used as part of elaborate schemes to lure unsuspecting children into dangerous situations or activities. Furthermore, deepfakes generated with the help of AI technology can be used for various malicious purposes, such as blackmail, extortion, or spreading false information.

AI as a tool to protect children online

AI-based algorithms can detect inappropriate content on social media sites and automatically remove it. AI can also help prevent cyberbullying by using natural language processing (NLP) to analyse text for threatening language or toxic behaviour, allowing moderators to intervene quickly when necessary. In addition, AI can analyse content uploaded by a child’s contacts and by the child themselves, helping parents identify potential risk factors associated with their child’s digital companions and the child’s own behaviour.
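
As a rough illustration of how such text screening might work in principle, the short Python sketch below flags messages containing phrases from a watchlist and marks them for human review. It is a minimal sketch only: real moderation systems use trained NLP classifiers rather than keyword lists, and the phrase list and the screen_message helper are hypothetical, not any platform’s actual implementation.

```python
# Minimal, illustrative sketch of keyword-based message screening.
# Real platforms rely on trained NLP classifiers; this example only
# demonstrates the flag-and-escalate workflow described above.

THREAT_PHRASES = {  # hypothetical watchlist, not a real dataset
    "hurt you", "kill yourself", "nobody likes you", "you deserve",
}

def screen_message(message: str) -> dict:
    """Return a simple moderation verdict for a single chat message."""
    lowered = message.lower()
    matches = [phrase for phrase in THREAT_PHRASES if phrase in lowered]
    return {
        "message": message,
        "flagged": bool(matches),      # True -> route to a human moderator
        "matched_phrases": matches,
    }

if __name__ == "__main__":
    for msg in ["See you at practice!", "Nobody likes you, just quit"]:
        verdict = screen_message(msg)
        status = "ESCALATE" if verdict["flagged"] else "ok"
        print(f"{status}: {msg}")
```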

In addition, AI can detect, filter, and remove child sexual abuse material (CSAM) on the internet. AI-based algorithms, often combined with hash-matching technologies, are trained to recognise images, videos, and language patterns that correspond to CSAM, allowing them to quickly identify suspicious content. By scanning websites and social media sites for this content, AI can help prevent children from being exposed to it. Face recognition can also help law enforcement agencies identify victims more quickly. AI can moderate live-streaming events, chatrooms, and other real-time online conversations, where inappropriate interactions between minors and adults can occur. AI tools can also analyse uploaded photos or videos for signs of potential exploitation of children, allowing parents or moderators to intervene if necessary.
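
The hash-matching approach mentioned above can be sketched in a few lines: a file’s fingerprint is computed and compared against a database of fingerprints of known abusive material. The Python example below is a simplified, hypothetical illustration that uses an ordinary cryptographic hash (SHA-256); production systems rely on perceptual hashing (PhotoDNA-style), which also matches re-encoded or resized copies, and the KNOWN_HASHES set shown here is a placeholder, not a real database.

```python
# Illustrative sketch of hash-based matching against a database of
# fingerprints of known material. Real systems use perceptual hashing,
# which matches visually similar images even after resizing or
# re-encoding; SHA-256 below only matches exact copies.

import hashlib
from pathlib import Path

# Hypothetical placeholder: in practice such hash sets are maintained
# by hotlines and industry bodies, not hard-coded.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 fingerprint of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(path: Path) -> bool:
    """Return True if the file's fingerprint matches the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES
```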


Did you know that one in three internet users in the world is a child? Worldwide, young people are 1.24 times more likely to be connected than the rest of the population. Children’s use of the internet and digital technology has increased significantly, mainly due to the shift from television to online viewing, the accessibility and popularity of mobile devices, and the use of technology as part of the educational system. While, for many children, the distinction between the online and offline world is no longer clear, there is a worrying digital divide worldwide. Globally, two-thirds of children and young people aged 25 and under do not have an internet connection at home. In high-income countries, 87% of children and young people have an internet connection at home, compared to only 6% in low-income countries.

Children around the world today use digital technologies to access the internet for learning and entertainment, to communicate with their friends and the world around them, to acquire diverse information, and to create and expand their opportunities. Yet, an increasingly complex online environment also presents risks for the safety of children online. Children are especially vulnerable to the risks of the internet, which include age-inappropriate, illegal and harmful content, interactions and activities, privacy violations, and overuse.

When it comes to promoting the benefits of technology for children while, at the same time, fostering a safe and secure online environment, stakeholders need to strike a careful balance between the need to safeguard children and the need to respect children’s digital rights. The sections below tackle the security aspect of children’s internet use.


Multiple online risks

Any child can potentially be affected by harmful online content and activities. Keeping children safe online also helps create safer environments for children offline. Recent research shows that children who are vulnerable offline are more likely to be vulnerable online, and children’s overall well-being appears to influence their use of the internet. For example, exposure to negative online content, cyber hate, discrimination, and violent extremism is associated with lower levels of happiness and life satisfaction. In addition, children who are exposed to one type of online risk are more likely to be exposed to other types as well. Online and offline perpetration and victimisation are also highly correlated: a child can be both a victim and a perpetrator of cyberbullying, and perpetrators of cyberbullying and sextortion are also more likely to be victims themselves.

Online privacy risk

Children and young people are vulnerable where privacy risks are concerned. Children who do not have sufficient digital literacy may not know how to protect their own personal data and that of others online, what to share and with whom, how to use privacy settings, or how personal data is collected and used by companies or institutions. Children’s data, including geolocation, biometrics, and other sensitive information, is often collected and processed without genuine informed consent. This can lead to misuse, identity theft, invasion of privacy, inappropriate advertising and spam, and hidden costs.

Harmful content and activities online

Children and young people are also likely to be in contact with harmful content and activities online, which may include:

  • content and activities that depict illegal or psychologically maladaptive behaviour, including content that instructs or encourages young people to engage in health-endangering behaviour, such as harming themselves or others, or promoting eating disorders
  • content and activities that portray discrimination, prejudice, hatred, harassment, cyberbullying, gambling, or violence, or that are specifically targeted at a marginalised group on the basis of race, ethnicity, gender, sexual orientation, religion, identity, or ability status
  • content and activities that exploit social comparisons related to physical appearance, in particular beauty or appearance-related material, which can have a serious negative impact on children’s mental health
  • content and activities that could lead to harmful online sexual abuse and exploitation of children, including pornographic or unwanted sexual material, user-generated sexual images or videos, sexting, sexual harassment, etc. 

Children and adolescents who lack mature self-control skills or effective guidance by caregivers may be at risk of becoming addicted to social media and digital games, leading to more serious psychological harm over time, including gaming disorder.

While the issue of child sexual abuse is not new, the internet has exacerbated the problem. One of the main reasons is that the internet – particularly the darknet (online spaces whose content is not normally indexed by search engines) – makes it easy to access child sexual abuse content and to make contact with vulnerable children and young people.

The online risks that children face may put them in situations in which they experience some form of sexual violence. Online child sexual abuse and exploitation includes all acts of child sexual abuse, child prostitution, making or distributing child sexual abuse material, child corruption (intentionally causing a child to witness sexual abuse or sexual activity for sexual purposes), and child solicitation for sexual purposes. These acts may occur virtually, but may be preceded or accompanied by, or may evolve into, exploitation occurring offline. Children can be exposed to predators, leading to grooming and online and/or offline abuse or exploitation. Children themselves can also be coerced into becoming perpetrators of illegal activities, including situations in which they are persuaded to create and share sexual content of themselves, which others may then use to harass them.

Child sexual abuse materials have proliferated at ever-increasing rates. According to the Internet Watch Foundation (IWF), more Category A child sexual abuse material was identified online in 2022 than ever before, more than double the amount identified in 2020. In 2022, the IWF investigated more than 375,000 reports suspected to contain child sexual abuse imagery and confirmed nearly 200,000 URLs as containing child sexual abuse material in the form of images and videos made and/or shared via an internet-connected device with a camera, increases of 4% and 9%, respectively, compared with 2021. Much of the content circulating openly on the internet is recirculated, older content; new content, which could indicate a new victim, is mostly found on the darknet.

A more recent trend in child exploitation is the commercialisation of child sexual exploitation through live distant child abuse (LDCA) – also referred to as on-demand child sexual abuse, or cybersex trafficking – through which perpetrators can direct abuse in real time. Payment is generally made through online payment methods; cryptocurrencies are so far used only to a small extent.

When content depicting a child being sexually abused is discovered online, there are two clear priorities: to remove the content from public view and to find the victim. This is not always straightforward: criminals have started using artificial intelligence (AI) to create ‘deepfakes’, which allow them to swap victims’ faces in moving imagery, making victim identification more complex.

There is no single solution that can mitigate the risks children face when using the internet. Rather, a multi-faceted approach is needed to tackle the wide array of risks. Such an approach combines:

  • international treaties, guidance, and collaborative actions, guided by the UN Convention on the Rights of the Child
  • national legislation, policies, and regulations, with efforts from civil society
  • industry and business principles and self-regulation 
  • technological progress towards more effective protection of children from online harm
  • education
  • awareness building
  • peer support and community engagement

A safer ecosystem

All stakeholders have a responsibility to ensure child safety online. Parents and educators have a responsibility to guide and support young people, especially younger children, in using services that promote positive behaviours. They play an important role in education and awareness raising, which are considered the first line of defence in mitigating risks.

International organisations, governments, and industry have a responsibility to ensure that the online environment is safe and secure. The proposed Global Digital Compact, to be agreed upon at the Summit of the Future in September 2024, should outline principles for an open, free, and secure digital future for all, especially for children and future generations. Although the industry has deployed many tools, such as filters and reporting mechanisms, it has recently come under public and regulatory pressure to step up its efforts to counter abusive content and activities and to protect children’s data privacy and safety. This includes more restrictive regulation of algorithm-based automatic content recommendation systems, end-to-end encryption of messages, generative AI, and other services powered by emerging technologies when targeted at children.

Many experts in the field are active in civil society, including non-profit organisations and research communities, and can provide invaluable input through their knowledge and experience. Child-focused NGOs, child helplines, and healthcare professionals are also important stakeholders in the fight against child sexual abuse and exploitation – both online and offline – and are valuable partners in understanding the scale and nature of the problem and in providing counselling and support for victims of abuse.
