Kazakhstan introduces audit requirement for high-risk AI systems

Kazakhstan has adopted new rules requiring audits for high-risk AI systems before their inclusion in official registries. The framework aims to improve oversight, transparency, and accountability in the use of AI across sectors.

Kazakhstan has introduced a regulatory framework mandating audits of high-risk artificial intelligence systems prior to their inclusion in official government lists. The measures establish procedures for identifying, assessing, and publishing AI systems considered suitable for deployment.

Under the new rules, sectoral authorities are responsible for compiling and maintaining lists of high-risk AI systems. These registries will be made publicly available on government websites, with the stated aim of increasing transparency and public trust in the use of AI technologies.

Developers and system owners seeking inclusion must submit formal applications supported by documentation. This includes proof of intellectual property rights and a positive audit outcome confirming that the system meets the required criteria.

Authorities will assess applications within ten working days, examining the system’s purpose, functionality, and compliance with documentation requirements. Systems that meet the criteria will be added to the official list and published within five working days.

Where inconsistencies or gaps are identified, applicants will be notified and allowed to revise and resubmit their documentation within a shorter timeframe. The framework also provides for regular updates to the lists as systems evolve or new applications are submitted.
