Drivers: what’s happening
Urban infrastructure and residents are digitally connected and produce vast amounts of data that can be traded between governments, corporations, and citizens. IoT devices, smart cities, digital urban twins, and now AI are adopted on the promise of efficiency. However, algorithms are not neutral, and they increasingly run service delivery and decision-making processes in cities. As artificial intelligence and big data analytics replace human decision-making, “questions about algorithmic ethics become more pressing”, say Robert Brauneis of George Washington University Law School and Ellen P. Goodman, Professor of Law at Rutgers Law School.
The numbers
There are approximately 18.8 billion connected IoT devices worldwide. Their number grew by 13% in 2024 and is expected to reach 41.1 billion by 2030.
Source: IoT Analytics
New York City's municipal government currently uses at least 46 different algorithmic tools across 14 departments. The Department of Health and Mental Hygiene is the most reliant on these tools.
Source: Government Technology
Urban shifts
Algorithmic bias
Datasets often reflect societal prejudices, and AI trained on them will inadvertently learn and perpetuate existing biases. In 2024, the UK's Department for Work and Pensions (DWP) used an AI system to detect welfare fraud; the algorithm disproportionately targeted people based on age, disability status, marital status, and nationality, the Guardian revealed. In Los Angeles, a pilot program launched in 2024 uses machine learning to improve housing allocation for homeless residents, but without proper oversight the system could reinforce the systemic inequalities present in earlier versions of the allocation process. SafeRent, an AI-powered tenant screening tool, disproportionately scored Black and Hispanic renters lower than white applicants; a class action lawsuit against SafeRent was settled on November 20, 2024.
The numbers
Commercial facial analysis systems have error rates of up to 34.7% for darker-skinned women, compared to 0.8% for lighter-skinned men.
Source: Joy Buolamwini, MIT
The VI-SPDAT system (a vulnerability assessment used to prioritize homeless individuals for housing) falsely assessed Black individuals as being 6 times less vulnerable, and Latino individuals as being 3 times less vulnerable, than white individuals.
Source: Vox Media
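
Disparities like the ones above can be surfaced with a routine per-group audit of any classification tool a city deploys. The sketch below is a minimal, illustrative Python example; the records, group labels, and figures are hypothetical and are not drawn from the studies cited here.

```python
# Minimal per-group error-rate audit (illustrative only; all records are hypothetical).
# Given a tool's predictions and the ground truth, compare error rates across groups.
from collections import defaultdict

# Hypothetical audit records: (group, ground_truth, model_prediction)
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 1),
    ("group_b", 1, 0), ("group_b", 0, 1), ("group_b", 1, 0), ("group_b", 0, 0),
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, truth, prediction in records:
    totals[group] += 1
    errors[group] += int(truth != prediction)

rates = {group: errors[group] / totals[group] for group in totals}
for group, rate in sorted(rates.items()):
    print(f"{group}: error rate {rate:.1%} ({errors[group]}/{totals[group]})")

# A large gap between the best- and worst-served groups is a red flag that
# warrants deeper review before the tool is used in service delivery.
worst, best = max(rates.values()), min(rates.values())
if best > 0:
    print(f"worst/best error-rate ratio: {worst / best:.1f}x")
```

The specific numbers matter less than the habit: error rates reported only in aggregate can hide exactly the gaps documented above.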

Digital divide
Algorithms control how information is distributed to users, and bias acts as a gatekeeper within AI-mediated communication. Content filters reinforce homogeneous information at the individual level, while the ‘algorithmic digital divide’ reinforces information inequality at the group level. The video recommendation algorithm of Douyin (mainland China's counterpart to TikTok) has restricted socioeconomically disadvantaged groups' access to authenticated health information.
The numbers
Douyin’s algorithm systematically favours high-income users, showing them 38.06% authenticated health-related content, compared with 29.64% for users with cheap phones.
92 million low-income individuals are exposed to AI decision-making in the US.
Source: The Guardian
What does it mean for your city?
A digital divide separates those who can afford human evaluation from those subjected to algorithmic judgments. A group of MIT researchers is working on a method to identify and remove biased training data from machine-learning datasets. On October 15, 2024, 15 human rights groups filed a legal challenge against the French government over the algorithm it uses to detect welfare fraud, the first time a public algorithm has been legally challenged in France. From February 2025, the EU will ban “social scoring” AI systems that subject people to detrimental treatment based on algorithmic assessments.
Strategic opportunity
When developing governance frameworks, consider how you will audit for and mitigate algorithmic bias. Before relying on software to make or inform decisions, ask yourself, “How are decisions being made?”
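
One concrete form such an audit can take is the widely used four-fifths (80%) rule, which compares selection or approval rates across groups. The sketch below is illustrative only; the groups, counts, and the 0.8 threshold are assumptions, not a reference to any specific tool named in this brief.

```python
# Illustrative four-fifths (80%) rule check on approval rates by group.
# All counts are hypothetical; adapt the groups and threshold to local policy.

approvals = {            # group -> (approved applicants, total applicants)
    "group_a": (90, 120),
    "group_b": (45, 100),
}

selection_rates = {g: approved / total for g, (approved, total) in approvals.items()}
reference_rate = max(selection_rates.values())  # best-served group as the benchmark

for group, rate in sorted(selection_rates.items()):
    impact_ratio = rate / reference_rate
    flag = "review" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.1%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

An impact ratio below 0.8 is a signal for further human review, not proof of discrimination; which groups to compare and which threshold to apply is a governance choice for each city.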

Subscribe to Urban Futures Lab and stay ahead of urban change. Get in-depth analysis, urban foresight, and strategies that help future-proof your city.
Thank you!
Urban Futures Lab is a strategic urban foresight think tank. We decode the forces shaping our cities and industries. We help urban decision-makers, developers, researchers, and anyone curious about the future of urban living discover the interconnected systems driving urban change. Our insights offer a holistic perspective on the future of urban ecosystems, helping you make informed, future-proof decisions.
Subscribe for exclusive access to deep dives, global drivers, and strategies to stay ahead in an ever-changing world.