Canadian privacy watchdogs find TikTok failing to protect children’s data

A joint investigation by Canada’s federal and provincial privacy authorities has found that TikTok’s safeguards to protect children from having their personal information collected and used were not adequate. The inquiry was carried out by the Office of the Privacy Commissioner of Canada along with counterparts in Quebec, British Columbia, and Alberta.
Although TikTok’s rules state that the platform is not intended for children under 13, the investigation revealed that hundreds of thousands of Canadian children use the app each year. Their personal data was being collected and sometimes used for profiling and targeted advertising, practices that can influence children’s development and expose them to inappropriate marketing.
The probe also highlighted broader concerns: TikTok was not transparent enough about how it processes user data, including the data of teenagers and adults, and it did not always obtain meaningful consent as required under Canadian privacy law. Investigators warned that children, in particular, are unable to fully understand or consent to such practices.
In response, TikTok has pledged to make several changes. These include improving age-verification systems, providing clearer explanations of its data practices, expanding privacy information in both English and French, and ensuring that users under 18 are not targeted with personalised ads beyond broad categories like language and location.
Why does it matter?
This investigation matters for civil society because it shows how digital power is shaping children’s lives in invisible ways. When a child’s online behaviour is tracked, profiled, and fed into algorithms, it is not only a privacy issue; it becomes a question of who gets to influence their identity, their choices, and even their well-being. Civil society groups can play a decisive role by pushing for stronger laws, demanding accountability from tech companies, and educating families about how to protect children’s data. They can also create space for young people’s voices in policy debates, ensuring that the very users affected by these practices are not left out of the conversation. In this way, civil society helps turn abstract privacy concerns into collective action to safeguard children’s rights in the digital world.