SyRI -- System Risk Indication (Systeem Risico Indicatie)
Overview
The System Risk Indication (Systeem Risico Indicatie, or SyRI) was a risk-scoring system developed by the Dutch Ministry of Social Affairs and Employment to detect potential fraud across social security benefits, tax allowances, and labour law compliance. Enacted into law in 2014 through amendments to the SUWI Act (Wet Structuur Uitvoeringsorganisatie Werk en Inkomen), specifically Article 64 (authorising cross-agency data linkage) and Article 65 (authorising the Minister to process data through a risk model), SyRI gave central and local government authorities sweeping powers to share and link personal data that had previously been held in separate administrative silos. The system was designed to identify so-called 'unlikely citizen profiles' -- individuals whose data patterns across multiple government databases suggested an elevated probability of benefits fraud -- and to flag them for intensive investigation.
SyRI processed up to 17 broadly defined categories of personal data as specified in Article 5a.1(3) of the SUWI Decree (Besluit SUWI). These categories included employment records, data on administrative sanctions and penalties, fiscal and tax data, real estate and property information, address data, identification data, trade and business data, data related to the integration of foreigners, historical compliance data, educational records, pension data, reintegration data, debt information, data on social security benefit receipt, data on permits and exemptions, childcare allowance data, and health insurance data. The Dutch Council of State observed that these categories were so broad that 'hardly any personal data' could not be processed under the framework. Data was gathered from agencies including the tax authority, municipal social services, the Employee Insurance Agency (UWV), the Social Insurance Bank (SVB), and other public bodies.
The technical operation of SyRI involved two phases. In the first phase, the Inlichtingenbureau -- a private foundation established by the Association of Netherlands Municipalities (VNG) to facilitate data exchange between government bodies -- acted as data processor. The Inlichtingenbureau collected data from participating administrative organs, pseudonymised it by replacing citizen names with unique identifiers, and linked the data across sources. In the second phase, the combined pseudonymised dataset was automatically checked against a risk model containing undisclosed risk indicators. The analysis generated a list of identifiers representing individuals with a heightened risk indication. These identifiers were then de-pseudonymised back to real names, producing risk reports that could be retained for up to two years. Critically, the risk model itself -- including the specific indicators, weightings, and algorithmic logic -- was never disclosed to the public, to affected individuals, or even to the court during litigation.
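The two-phase flow described above can be sketched in code. This is a hypothetical illustration only: the real risk model, its indicators, and its weightings were never disclosed, so the field names, the single example indicator, and the threshold below are all invented for demonstration.

```python
import hashlib
from collections import defaultdict


def pseudonymise(records, salt="demo-salt"):
    """Phase 1: replace citizen names with stable pseudonymous identifiers."""
    out = []
    for rec in records:
        pid = hashlib.sha256((salt + rec["name"]).encode()).hexdigest()[:12]
        out.append({**{k: v for k, v in rec.items() if k != "name"}, "pid": pid})
    return out


def link_sources(*sources):
    """Phase 1: merge pseudonymised records from multiple agencies by pid."""
    linked = defaultdict(dict)
    for source in sources:
        for rec in source:
            linked[rec["pid"]].update(rec)
    return dict(linked)


def apply_risk_model(linked, threshold=1):
    """Phase 2: flag pids whose combined data trips (invented) indicators."""
    flagged = []
    for pid, data in linked.items():
        score = 0
        # Invented example indicator: benefit receipt combined with
        # registered business income. The actual indicators are unknown.
        if data.get("receives_benefit") and data.get("business_income", 0) > 0:
            score += 1
        if score >= threshold:
            flagged.append(pid)
    return flagged


def depseudonymise(flagged, records, salt="demo-salt"):
    """Phase 2: map flagged pids back to real names for the risk report."""
    lookup = {
        hashlib.sha256((salt + r["name"]).encode()).hexdigest()[:12]: r["name"]
        for r in records
    }
    return [lookup[pid] for pid in flagged]
```

Because the same salted hash is used for every source, records about the same person link on the same identifier without the names travelling through the analysis step; only the final, flagged identifiers are mapped back to names. It is precisely this last step, combined with the undisclosed scoring logic, that the court found lacked transparency and verifiability.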
SyRI was deployed using a 'neighbourhood-oriented approach', meaning it was applied to specific geographic areas rather than the population at large. Between 2008 and 2014, there were 22 projects using SyRI or its precursor systems. From 2015 onward, five additional SyRI projects were conducted. Deployments targeted low-income neighbourhoods including Capelle aan den IJssel, Eindhoven (project G.A.L.O.P. II), the Afrikaanderwijk in Rotterdam, Rotterdam Bloemhof and Hillesluis, and Schalkwijk in Haarlem. Nineteen of the 22 original projects used this neighbourhood-based targeting approach. A planned deployment in Rotterdam-Zuid in early 2019 was halted by Mayor Ahmed Aboutaleb due to unresolved disagreements with the Ministry about the system's legal basis. Notably, despite years of operation, SyRI had not detected a single new fraudster, according to reporting by de Volkskrant in June 2019.
In early 2018, a coalition of civil society organisations led by the Public Interest Litigation Project of the Netherlands Committee of Jurists for Human Rights (PILP-NJCM) and the Platform Bescherming Burgerrechten (Platform for Civil Rights Protection), along with the Dutch trade union federation FNV, Privacy First, and two individual citizens including authors Tommy Wieringa and Maxim Februari, filed a lawsuit against the Dutch state. The coalition launched a public campaign called 'Bij Voorbaat Verdacht' (Suspected from the Outset). In October 2019, the UN Special Rapporteur on extreme poverty and human rights, Philip Alston, submitted an amicus curiae brief to the court, criticising SyRI as posing 'significant potential threats to human rights, in particular for the poorest in society' and noting the broader trend of digital welfare states disproportionately affecting vulnerable populations.
On 5 February 2020, the District Court of The Hague (case C-09-550982-HA ZA 18-388) ruled that the SyRI legislation was unlawful under Article 8 of the European Convention on Human Rights (right to respect for private and family life). The court accepted the state's argument that fraud detection constituted a 'pressing social need' but concluded that the legislation failed to strike a 'fair balance' between the objectives of fraud prevention and the invasion of citizens' privacy rights. Key deficiencies identified by the court included the system's fundamental lack of transparency and verifiability, the excessive breadth of data categories that could be processed, the absence of any duty to inform individuals that their data had been processed, the risk of discrimination against people in lower-income neighbourhoods and those with migrant backgrounds, and the insufficiency of existing safeguards against privacy violations. The government announced on 23 April 2020 that it would not appeal, making the judgment final. The SyRI ruling is widely regarded as one of the first court decisions in Europe to strike down an algorithmic risk-scoring system used in social protection on human rights grounds, and it served as a significant precedent in the broader debate about algorithmic accountability and the digital welfare state. The case is closely linked to the subsequent Dutch childcare benefits scandal (Toeslagenaffaire), in which the Tax and Customs Administration was found to have used algorithms that racially profiled families, ultimately leading to the resignation of the Rutte government in January 2021.
Classification
AI Capabilities
Use Cases
Social Protection Functions
| SP Pillar (Primary) | Social assistance |
| SP Pillar (Secondary) | Social insurance |
Programme Details
| Programme Name | SyRI -- System Risk Indication (Systeem Risico Indicatie) |
| Programme Type | Other |
| System Level | Implementation/delivery chain |
Cross-cutting fraud detection system applied across multiple Dutch social security, benefits, tax, and labour law compliance programmes. Not a benefits programme itself but a risk-scoring tool designed to identify potential fraud across the full spectrum of Dutch social protection and fiscal systems.
Implementation Details
| Implementation Type | Classical ML |
| Lifecycle Stage | Monitoring, Maintenance and Decommissioning |
| Model Provenance | Not documented |
| Compute Environment | Not documented |
| Sovereignty Quadrant | Not assessed |
| Data Residency | Not documented |
| Cross-Border Transfer | Not documented |
Risk & Oversight
| Decision Criticality | High |
| Human Oversight | HOTL |
| Development Process | Mix of in-house and third-party |
| Highest Risk Category | Governance and institutional oversight risks |
| Risk Assessment Status | Independent audit completed |
Documented Risk Events
Court ruled SyRI legislation unlawful under Article 8 ECHR (5 February 2020, case C-09-550982-HA ZA 18-388). System failed to detect a single new fraud case despite years of operation (Volkskrant, June 2019). Targeted exclusively low-income and ethnically diverse neighbourhoods. Risk model and indicators never disclosed to public, affected individuals, or the court. UN Special Rapporteur Philip Alston submitted amicus brief criticising the system. Closely linked to subsequent Toeslagenaffaire childcare benefits scandal involving racial profiling by Dutch tax authority algorithms.
Risk Dimensions
Data-related risks
Governance and institutional oversight risks
Market, sovereignty and industry structure risks
Model-related risks
Operational and system integration risks
Impact Dimensions
Accountability, transparency and redress
Autonomy, human dignity and due process
Equality, non-discrimination, fairness and inclusion
Privacy and data security
Systemic and societal
Safeguards
Deployment & Outcomes
| Deployment Status | Suspended / Halted |
| Year Initiated | 2014 |
| Scale / Coverage | Deployed in selected low-income neighbourhoods across 5 municipalities (Capelle aan den IJssel, Eindhoven, Rotterdam, Haarlem); 27 projects total between 2008 and 2019 using SyRI or precursor systems |
| Funding Source | Dutch government budget (Ministry of Social Affairs and Employment) |
| Technical Partners | Inlichtingenbureau (private foundation under VNG, acted as data processor); system developed under Dutch government direction |
Outcomes / Results
System discontinued following the 5 February 2020 District Court of The Hague judgment. SyRI had not detected a single new fraudster despite years of operation. The court found the system disproportionate and lacking fair balance between privacy rights and fraud detection objectives. The ruling became a landmark precedent for algorithmic accountability in social protection systems across Europe.
Challenges
Fundamental lack of transparency: risk model, indicators, and algorithmic logic were never disclosed. Neighbourhood-based targeting created inherent discrimination risk against low-income and migrant communities. No mechanism to inform individuals that their data had been processed or to contest risk indications. Excessively broad data categories enabled near-total surveillance of citizens' administrative records. The system's complete failure to detect fraud undermined its stated justification.
Sources
- SRC-004-NLD-001 AlgorithmWatch (2020) 'How Dutch activists got an invasive fraud detection algorithm banned', Automating Society Report 2020. https://algorithmwatch.org/en/syri-netherlands-algorithm/
- SRC-002-NLD-001 Constantinou, A. (2022) 'Human Rights Implications of the Use of AI in the Digital Welfare State: Lessons Learned from the Dutch SyRI Case', Human Rights Law Review, 22(2), ngac010. https://academic.oup.com/hrlr/article/22/2/ngac010/6568079
- SRC-001-NLD-001 District Court of The Hague (2020) Judgment in case C-09-550982-HA ZA 18-388 (NJCM v. the State of the Netherlands), 5 February 2020. https://uitspraken.rechtspraak.nl/details?id=ECLI:NL:RBDHA:2020:1878
- SRC-005-NLD-001 PILP-NJCM (n.d.) 'System Risk Indication (SyRI)', dossier page. https://pilp.nu/en/dossier/system-risk-indication-syri/
- SRC-006-NLD-001 Privacy International (2020) 'The SyRI case: a landmark ruling for benefits claimants around the world', 5 February 2020. https://privacyinternational.org/news-analysis/3363/syri-case-landmark-ruling-benefits-claimants-around-world
- SRC-003-NLD-001 OHCHR (2020) 'Landmark ruling by Dutch court stops government attempts to spy on the poor -- UN expert', press release, 5 February 2020. https://www.ohchr.org/en/press-releases/2020/02/landmark-ruling-dutch-court-stops-government-attempts-spy-poor-un-expert
How to Cite
DCI AI Hub (2026). 'SyRI -- System Risk Indication (Systeem Risico Indicatie)', AI Hub AI Tracker, case NLD-001. Digital Convergence Initiative. Available at: https://socialprotectionai.org/use-case/NLD-001