Rooting out bias in data will be a major challenge for smart cities, which rely on data-driven decision making. A new study shows how that bias rears its head.

Researchers from New York University studied 311 service requests in Kansas City, Missouri, and found that some neighborhoods over-report, others under-report, and the types of complaints vary by area. These reporting differences lead to disparities in the delivery of city services.

More from the study:

Governance and decision-making in “smart” cities increasingly rely on resident-reported data and data-driven methods to improve the efficiency of city operations and planning. However, the issue of bias in these data and the fairness of outcomes in smart cities has received relatively limited attention. This is a troubling and significant omission, as social equity should be a critical aspect of smart cities and needs to be addressed and accounted for in the use of new technologies and data tools.

[…]

Despite greater objective and subjective needs, low-income and minority neighborhoods are less likely to report street condition or “nuisance” issues, while prioritizing more serious problems. Our findings form the basis for acknowledging and accounting for data bias in self-reported data and contribute to the more equitable delivery of city services through bias-aware data-driven processes.
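To make the idea of "bias-aware data-driven processes" concrete, here is a toy sketch of one possible adjustment: scaling raw 311 counts by an estimated per-neighborhood reporting rate to approximate underlying need. The neighborhood names, counts, and rates below are all hypothetical, and real rate estimates would have to come from audits or objective condition surveys, not invented numbers.

```python
# Toy illustration of a bias-aware adjustment for self-reported 311 data.
# All figures are hypothetical, not taken from the study.

# Raw 311 complaint counts per neighborhood (hypothetical).
raw_reports = {"A": 120, "B": 50, "C": 80}

# Estimated probability that residents report an issue they observe
# (hypothetical; in practice this must be estimated from external data
# such as street-condition audits).
reporting_rate = {"A": 0.8, "B": 0.25, "C": 0.6}

def adjusted_need(reports, rates):
    """Scale raw counts by the inverse reporting rate to approximate
    the underlying number of issues in each neighborhood."""
    return {n: reports[n] / rates[n] for n in reports}

need = adjusted_need(raw_reports, reporting_rate)

# Ranking by raw reports puts A first; after adjustment, the
# under-reporting neighborhood B shows the greatest estimated need.
for name, value in sorted(need.items(), key=lambda kv: -kv[1]):
    print(name, round(value, 1))
```

The point of the sketch is only that allocating services by raw report volume rewards neighborhoods that report more readily; any real adjustment would need carefully validated reporting-rate estimates.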