Center for Democracy & Technology study reveals gaps in child-safety policies

Based on qualitative research with 45 teens and parents, the study finds that several common policy proposals, including age-verification mandates, strict screen-time controls, parental access tools, and chronological feeds, often conflict with users' privacy concerns, stated preferences, and daily realities.

A new report from the Center for Democracy & Technology (CDT) examines the growing disconnect between government child-safety proposals for social media and how young people and their parents perceive these measures in practice. As child-safety regulation becomes a global policy priority, CDT's research suggests that many proposed interventions remain untested, risk creating unintended harms, and do not align with the needs and preferences of the groups they are designed to protect.

The study draws on qualitative research with 45 parents and teens and evaluates four prominent policy categories: age-verification systems, screen-time restrictions, algorithmic feed controls, and parental access tools. Participants consistently raised concerns about privacy, intrusiveness, and the practicality of current policy ideas.

For age verification, ID checks and facial-scanning systems were described as invasive and unreliable. Teens and parents instead favoured parent-centred methods that allow caregivers to declare age and approve app downloads, a model they viewed as more flexible and less privacy-invasive. Views on algorithmic feeds diverged between teens and policymakers: while policy debates often call for chronological feeds, teens in this study generally trusted algorithmic recommendations and preferred minimal, unobtrusive controls such as “not interested” functions. Parents expressed interest in tools that help them set content boundaries without disrupting their children’s experience.

Screen-time features were seen as helpful when they involved reminders or parent-led limits, but respondents rejected strict, app-enforced curfews or content-based time blocks as intrusive and unrealistic. On parental access, participants supported visibility and oversight for major actions, such as downloading new social-media apps, but opposed parental ability to delete posts or apps without the teen's consent. Both groups indicated that excessive approvals for routine activities would be counterproductive.

Across all themes, CDT’s research underscores the need for flexible, transparent, and family-aware approaches. The findings suggest that one-size-fits-all statutory requirements risk creating resistance, workarounds, or unintended harm. The report argues that centring user experience is essential for designing child-safety interventions that are both effective and respectful of privacy, autonomy, and family dynamics.
