This Algorithm Could Ruin Your Life
WIRED
Ref: SRC-003-NLD-002
Accessed: 26 March 2026
Summary
Joint WIRED/Lighthouse Reports investigation documenting how Rotterdam's Accenture-built algorithm assigned welfare-fraud risk scores based on 315 factors, including age, gender, language skills, and subjective caseworker assessments. Reports that the city paused the system in 2021 after government-backed auditors found that citizens could not tell whether they had been flagged and that the underlying data risked producing biased outputs. Contrasts Accenture's marketing claim of 'unbiased citizen outcomes' with the reality of systematic discrimination.
Constantaras, E., Geiger, G., Braun, J.A. and Mehrotra, D. (2023) 'This Algorithm Could Ruin Your Life', WIRED, 6 March. Available at: https://www.wired.com/story/welfare-algorithms-discrimination/ (Accessed: 26 March 2026).