Data Bias in Human Mobility is a Universal Phenomenon but is Highly Location-specific

Katinka den Nijs, Elisa Omodei, V. Sekara

2025 · DOI: 10.48550/arXiv.2508.00149
arXiv.org · 0 Citations

TLDR

Bias is found to be a universal phenomenon, occurring in all cities, yet each city exhibits its own manifestation of it, so location-specific models are required to model bias for each city.

Abstract

Large-scale human mobility datasets play increasingly critical roles in many algorithmic systems, business processes, and policy decisions. Unfortunately, there has been little focus on understanding bias and other fundamental shortcomings of these datasets and how they impact downstream analyses and prediction tasks. In this work, we study "data production", quantifying not only whether individuals are represented in big digital datasets, but also how they are represented in terms of how much data they produce. We study GPS mobility data collected from anonymized smartphones for ten major US cities and find that data points can be more unequally distributed between users than wealth. We build models to predict the number of data points we can expect to be produced by the composition of demographic groups living in census tracts, and find strong effects of wealth, ethnicity, and education on data production. While we find that bias is a universal phenomenon, occurring in all cities, we further find that each city suffers from its own manifestation of it, and that location-specific models are required to model bias for each city. This work raises serious questions about general approaches to debiasing human mobility data and urges further research.
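The abstract's claim that data points can be "more unequally distributed between users than wealth" is naturally quantified with a Gini coefficient over per-user data-point counts. The sketch below is illustrative only and not taken from the paper; the function name and example counts are hypothetical.

```python
def gini(counts):
    """Gini coefficient of non-negative counts (0 = perfect equality, ~1 = maximal inequality)."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard closed form on sorted data: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

# Hypothetical per-user counts: one heavy producer dominates the dataset.
print(round(gini([1, 1, 2, 3, 100]), 3))  # → 0.748
print(round(gini([5, 5, 5, 5]), 3))       # → 0.0 (everyone produces equally)
```

For reference, income Gini coefficients for US metro areas are typically around 0.4–0.5, so per-user values above that would support the paper's comparison with wealth inequality.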
