GNI policy brief urges stronger civil society role in shaping AI governance

The report, Policy Brief on Government Interventions in AI, provides a detailed framework for assessing how state regulation, funding, and deployment of AI systems can influence privacy, freedom of expression, and equality.

As governments around the world rush to regulate, fund, and deploy artificial intelligence, a new policy brief from the Global Network Initiative (GNI) warns that the future of AI governance cannot be left to state and corporate actors alone. The report, "Policy Brief on Government Interventions in AI," makes a clear case: without strong, informed, and vocal civil society engagement, AI risks becoming a tool of control rather than empowerment.

At the heart of the GNI’s message is a simple premise with profound implications: how states choose to shape the AI ecosystem will determine whether technology serves the public interest or undermines it. The brief maps the full range of government influence across the AI value chain, from investment and procurement to regulation and diplomacy, and identifies where interventions, if unchecked, can erode privacy, silence dissent, or deepen inequality.

For civil society organisations (CSOs), the publication outlines a clear set of responsibilities. It calls on them to monitor government AI policies, advocate for transparency in the public use of AI systems, and ensure that new technologies respect established human-rights principles. The GNI also encourages CSOs to build technical expertise, engage in public education, and participate in international and national discussions on AI policy.

The GNI argues that civil society’s presence in AI policymaking is not optional – it is essential. By grounding advocacy in international human rights law, CSOs can challenge overreach, demand transparency, and ensure accountability in the design and deployment of AI systems. The brief also stresses the need for collaboration between civil society, researchers, and ethical technologists to translate complex technical systems into public-interest terms.

What makes this report particularly relevant is its global perspective. It recognises that in both developed and developing contexts, governments are building AI infrastructures faster than civic capacity to monitor them. The result is a widening gap between policy ambition and democratic oversight.

For civil society, the GNI brief is a call to action, not only to react to policy decisions but to shape the very frameworks that define how AI interacts with rights and justice. In a policy arena often dominated by technical jargon and industry lobbying, this brief re-centres the debate on people, power, and accountability – reminding everyone that the question is not just what AI can do, but who decides what it should do.
