New York opens consultation on SAFE for Kids Act rules

On 15 September 2025, the Office of the New York State Attorney General (OAG) opened a public consultation on proposed rules for the Stop Addictive Feeds Exploitation (SAFE) for Kids Act. This law, passed in 2024, responds to mounting evidence that algorithmically personalised ‘addictive feeds’ and nighttime notifications are worsening the mental health crisis among minors. Research shows that such features encourage compulsive scrolling, disrupt sleep, and contribute to depression, anxiety, and other harms. The new rules seek to turn legislative intent into enforceable obligations for social media platforms.
What the rules propose
The draft regulations cover several key areas. First, they define and prohibit addictive feeds – algorithm-driven streams based on a user’s personal data – unless verifiable parental consent is secured. Chronological feeds, direct requests, and private communications are not considered addictive. Second, platforms are barred from sending nighttime notifications about these feeds between midnight and 6 AM, again with parental consent as the only exception.
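To make the nighttime restriction concrete, the sketch below shows how a platform might gate feed notifications on the midnight-to-6 AM window and a parental-consent flag. It is a minimal illustration, not language from the draft rules: the function and variable names (may_send_feed_notification, QUIET_START, and so on) are hypothetical, and a real system would also need to resolve each user's local time zone.

```python
from datetime import time

# Hypothetical names; the draft rules define the obligation, not an API.
QUIET_START = time(0, 0)  # midnight
QUIET_END = time(6, 0)    # 6 AM

def may_send_feed_notification(local_time: time, has_parental_consent: bool) -> bool:
    """Return True if a notification about an addictive feed may be sent.

    The proposed rules bar such notifications between midnight and 6 AM
    unless verifiable parental consent has been secured.
    """
    in_quiet_window = QUIET_START <= local_time < QUIET_END
    return has_parental_consent or not in_quiet_window

# A 2:30 AM notification is blocked without consent but allowed with it.
assert not may_send_feed_notification(time(2, 30), has_parental_consent=False)
assert may_send_feed_notification(time(2, 30), has_parental_consent=True)
assert may_send_feed_notification(time(9, 0), has_parental_consent=False)
```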
A central feature of the rules is age assurance. Platforms must deploy ‘commercially reasonable and technically feasible’ methods, certified by accredited third parties, to determine whether a user is a minor. Standards specify maximum error rates, safeguards against circumvention, and privacy-protective options such as zero-knowledge proof verification. Alongside this, platforms must offer verifiable parental consent mechanisms that are clear, easy to withdraw, and do not necessarily require government ID or account creation.
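The interplay between age assurance quality and parental consent can likewise be expressed as a simple decision rule. The following sketch assumes a certified age-assurance method with a published error rate; the threshold value and all identifiers (AgeAssuranceResult, MAX_ALLOWED_ERROR_RATE) are placeholders, since the rules themselves set the actual standards.

```python
from dataclasses import dataclass

# All names and the threshold below are illustrative placeholders,
# not terms defined in the draft regulations.
MAX_ALLOWED_ERROR_RATE = 0.05

@dataclass
class AgeAssuranceResult:
    is_minor: bool
    error_rate: float                    # measured error rate of the method
    certified_by_accredited_party: bool  # third-party certification status

def may_serve_personalised_feed(result: AgeAssuranceResult,
                                verifiable_parental_consent: bool) -> bool:
    """Gate an algorithmically personalised feed on age assurance.

    An uncertified method, or one exceeding the maximum error rate, cannot
    be relied on, so the user is treated as a minor by default.
    """
    reliable = (result.certified_by_accredited_party
                and result.error_rate <= MAX_ALLOWED_ERROR_RATE)
    treated_as_minor = result.is_minor or not reliable
    return not treated_as_minor or verifiable_parental_consent

# A certified, accurate check identifies a minor: personalisation is
# allowed only once verifiable parental consent is in place.
minor = AgeAssuranceResult(is_minor=True, error_rate=0.02,
                           certified_by_accredited_party=True)
assert not may_serve_personalised_feed(minor, verifiable_parental_consent=False)
assert may_serve_personalised_feed(minor, verifiable_parental_consent=True)
```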
Other provisions address data protection: limiting what can be collected, mandating strong encryption, and requiring deletion once data is no longer needed. Platforms must also maintain compliance records, provide appeal processes for users misclassified as minors, and face enforcement action and remedies from the Attorney General if they fail to comply.
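The deletion requirement can be pictured as a routine retention sweep. The sketch below assumes a fixed retention window, which neither the article nor the draft rules specify; RETENTION and the record layout are invented for illustration.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; the rules require deletion once data is
# no longer needed but do not prescribe a fixed period.
RETENTION = timedelta(days=30)

def expired_records(records: list[dict], now: datetime) -> list[dict]:
    """Select verification records whose retention window has lapsed."""
    return [r for r in records if now - r["collected_at"] > RETENTION]

records = [
    {"id": "a1", "collected_at": datetime(2025, 8, 1, tzinfo=timezone.utc)},
    {"id": "b2", "collected_at": datetime(2025, 9, 20, tzinfo=timezone.utc)},
]
now = datetime(2025, 9, 25, tzinfo=timezone.utc)
print([r["id"] for r in expired_records(records, now)])  # ['a1']
```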
Significance for civil society
This consultation is an opportunity for civil society to shape a first-of-its-kind framework regulating engagement algorithms at the state level. Civil society groups, parents, educators, and youth advocates have been among the loudest voices warning about the dangers of algorithmic design choices. Their participation is essential to ensure that the rules remain centred on protecting children while also safeguarding privacy, equity, and accessibility.
Civil society actors can contribute in several ways. Parents and caregivers can provide testimonies about the challenges of managing children’s screen time, especially around sleep disruption and compulsive use. Educators can share classroom-level impacts, such as declining attention spans or academic performance linked to late-night notifications. Advocacy groups can bring research and data on mental health harms, propose stronger safeguards against algorithmic manipulation, and highlight best practices in parental consent design. Privacy and digital rights organisations can argue for data minimisation, transparency in age verification methods, and protections against function creep.
The OAG has explicitly requested personal experiences, research evidence, and technology standards to inform its decision-making. Civil society’s submissions can ensure that the final rules are not only workable for industry but also firmly grounded in the lived realities of children and families. Participation at this stage helps prevent weak enforcement mechanisms, closes potential loopholes, and signals public demand for meaningful protections against exploitative digital design.