Kenya issues comprehensive industry guidelines for child online protection and safety

The Communications Authority now requires all ICT providers to implement safety tools, age verification, privacy-by-design measures, and complaint systems.

The Communications Authority of Kenya (CA) has released the Industry Guidelines for Child Online Protection and Safety, a framework designed to safeguard children in the digital environment. Rooted in the Kenya Information and Communications (Consumer Protection) Regulations of 2010, these guidelines mandate a broad set of organisational and technical standards aimed at curbing online harms and ensuring safe digital experiences for all children under the age of 18.

The guidelines, which apply to all ICT licensees, service providers, vendors, and content developers in Kenya, are based on the principle that child protection in cyberspace is a collective responsibility. They call on the entire digital ecosystem – industry, parents, educators, and government – to play a role in shielding children from exposure to online sexual exploitation, cyberbullying, grooming, radicalisation, and other digital risks.

Under the framework, all ICT industry actors are required to adopt and implement detailed internal child online protection policies. These must demonstrate executive commitment, include educational strategies for children and guardians, and promote the development of culturally appropriate, creative, and learning-focused content. Organisations must also designate a focal point for child protection matters and align their practices with existing legislation, including the Children Act, 2022 and the Data Protection Act, 2019.

The guidelines stress the importance of integrating technical safety mechanisms at every level – device, network, and service. Industry players are required to deploy parental controls, enforce age-verification protocols, activate heightened privacy settings by default, and establish robust systems for handling and reporting harmful or illegal content. Service providers must also clearly outline their terms of use, prohibiting the dissemination of child sexual abuse material and supporting law enforcement investigations when such content is discovered.

Special provisions have been issued for various sectors within the ICT industry. Broadcasters must comply with specific content standards under the Programming Code and the Broadcasting Regulations, while mobile operators are obligated to register SIM cards for minors in line with national legislation. Application and content service providers, including those serving public Wi-Fi areas and learning institutions, must ensure their services integrate both organisational and technical protections. Hardware manufacturers and device vendors are instructed to enable safety features before distribution and guide users on activating protective settings.

To ensure accountability, all service providers must establish clear complaint reporting mechanisms, inform users of their right to redress, and submit quarterly reports to the Authority. Full compliance is expected within six months of the guidelines’ publication, and the Authority will actively monitor adherence and publish regular status updates.
