Oxford Internet Institute study finds ChatGPT reinforces global inequalities in how places are portrayed
New academic research suggests that ChatGPT systematically favours wealthier regions when describing people and places, reflecting structural biases in the data used to train large language models.
The study, conducted by researchers at the Oxford Internet Institute at the University of Oxford and the University of Kentucky, finds that ChatGPT consistently reinforces global inequalities in its responses.
The researchers analysed more than 20 million ChatGPT queries that compared countries, cities, and neighbourhoods on attributes such as intelligence, safety, happiness, and innovation. Across these comparisons, the system tended to rank higher-income regions, including the United States, Western Europe, and parts of East Asia, more favourably. Lower-income regions, particularly in Africa, the Middle East, Latin America, and parts of Asia, were far more likely to appear at the bottom of rankings.
According to the study, these patterns appeared not only in subjective questions, such as perceptions of beauty, but also in prompts that seemed more objective. Neighbourhood-level rankings in cities such as London, New York, and Rio de Janeiro closely mirrored existing social and racial divides rather than measurable community characteristics.
The findings are detailed in a paper published in Platforms and Society. The authors argue that the observed biases are structural features of large language models, rooted in uneven global patterns of data production. Regions with extensive English-language content and stronger digital visibility are more likely to be represented positively, while others remain underrepresented or stereotyped.
As generative AI systems are increasingly used in education, public services, and everyday decision-making, the researchers caution against treating their outputs as neutral or authoritative. They call for greater transparency from AI developers, independent auditing of model behaviour, and a more critical public understanding of how these systems produce knowledge about people and places.
