EU considers bloc-wide social media age limits as member states introduce national rules
The European Commission is examining whether to introduce common age restrictions for social media across the EU, as countries such as France, Spain, and Denmark move forward with their own measures to protect children online.
The European Union is assessing whether to introduce age limits for access to social media platforms across all member states, following growing concerns about children’s safety online and the emergence of national restrictions. The issue was outlined in the European Commission’s recent action plan addressing cyberbullying, which highlights the need for coordinated measures to protect minors in digital environments.
Social media platforms such as TikTok, Instagram, and Snapchat allow users to create profiles, communicate, and share content. Most platforms already set a minimum age, typically 13, but enforcement varies and often relies on users self-declaring their age. Some EU countries, including France, Spain, and Denmark, are now considering or implementing national laws that would impose stricter and more enforceable age limits. These moves have increased pressure on the EU to examine whether a common framework is needed.
To support this process, the European Commission has established an expert panel focused on child protection in the digital environment. The panel, expected to provide recommendations by summer 2026, will assess possible options for EU-level measures. These may include legislative proposals, as well as measures aimed at helping parents and guardians better understand and manage children’s use of digital services.
One of the key concerns identified by the Commission is the risk of fragmentation. If individual countries adopt different age requirements or enforcement rules, children across the EU may receive different levels of protection depending on where they live. A common EU-wide approach could create more consistent standards and reduce regulatory differences that platforms must navigate when operating across multiple countries.
So far, the EU has relied primarily on the Digital Services Act, which requires online platforms to take steps to protect minors but does not impose a specific minimum age across the bloc. Instead, the law sets out general obligations and guidance, giving platforms some flexibility in how they implement safeguards. The growing number of national initiatives, however, suggests that the EU may move towards more formal and harmonised regulation.
