Racism and Technology Center (2023) News article / media

Racist Technology in Action: Rotterdam's welfare fraud prediction algorithm was biased

Racism and Technology Center

Ref: SRC-005-NLD-002

Accessed: 26 March 2026

Summary

Analysis characterising Rotterdam's algorithm as 'racist technology in action', noting the systemic pattern of Dutch government agencies deploying discriminatory automated systems against vulnerable populations. Draws connections to the Dutch childcare benefits scandal (toeslagenaffaire).

Harvard reference

Racism and Technology Center (2023) 'Racist Technology in Action: Rotterdam's welfare fraud prediction algorithm was biased', Racism and Technology Center, 17 March. Available at: https://racismandtechnology.center/2023/03/17/racist-technology-in-action-rotterdams-welfare-fraud-prediction-algorithm-was-biased/ (Accessed: 26 March 2026).

Attached File

SRC-005-NLD-002_Racist_Technology_in_Action_Rotterdams_welfare_fraud_prediction_algorithm_was_bi.html (47 KB)