IRL-001

Cogent Facial Image Matching Software for Welfare Fraud Detection

Ireland · Europe & Central Asia · High income · Suspended / Halted · Confirmed

Department of Social Protection (DSP)

At a Glance

What it does Perception and extraction from unstructured inputs — Identification, verification and record linkage
Who runs it Department of Social Protection (DSP)
Programme Cogent Facial Image Matching Software for Welfare Fraud Detection
Confidence Confirmed
Deployment Status Suspended / Halted
Key Risks Data-related risks
Key Outcomes 155 suspected fraud cases referred by February 2017 (up from 62 in 2015); 22 cases finalised in court with 17 custodial sentences; largest case involved EUR 478,000 in fraudulent claims using 12 false identities (five-year sentence); total fraud overpayments assessed at EUR 1,590,500; EUR 461,470 in savings from ceased payments; total savings exceeded EUR 4 million by 2018; DPC fined DSP EUR 550,000 in June 2025 for GDPR violations.
Source Quality 4 sources — News article / media, Legal document / regulation

Ireland's Department of Social Protection (DSP) deployed Cogent facial image matching software to detect identity fraud across its welfare benefits system. The system was operational by 2015 and was designed to identify individuals who had registered for welfare benefits under multiple false identities, enabling them to claim jobseeker's allowance, rent supplement and other social assistance payments multiple times. The software compares a photograph of each new applicant, captured during the welfare registration process, against the entire database of photographs held by the Department of Social Protection. When a potential match is found, indicating that an applicant may already be registered under a different identity, the case is immediately flagged and referred to the Department's special investigation unit for prioritised review.

The system uses deep learning-based facial recognition technology. Modern facial image matching software capable of operating at scale against a national database, as described in the Department's implementation, relies on convolutional neural networks (CNNs) to encode facial features into high-dimensional vector representations and then performs similarity matching across millions of stored images. The vendor, Cogent Systems, is a well-known provider of biometric identification technologies to government agencies worldwide; Cogent was acquired by 3M in 2010, and 3M's identity management business was in turn sold to Gemalto in 2017. The software performs one-to-many (1:N) matching, meaning each new registrant's photograph is compared against every existing photograph in the DSP database, rather than simply verifying a claimed identity against a single stored record (1:1 verification). The original system cost EUR 213,000 when deployed in 2012. In 2018, Gemalto was contracted for EUR 383,000 to upgrade the facial verification software, as algorithm standards had improved substantially since the original deployment.
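The 1:N search described above can be sketched in a few lines. A CNN (not shown here) maps each photograph to a fixed-length embedding vector; duplicate-identity candidates are then the gallery entries whose cosine similarity to the probe embedding exceeds a threshold. The function name, the 128-dimensional embeddings, the random stand-in vectors and the 0.6 threshold below are illustrative assumptions for a minimal sketch, not details of the DSP deployment or of Cogent's software.

```python
import numpy as np

def one_to_many_match(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6):
    """Compare one probe embedding against every gallery embedding (1:N search).

    probe:   (d,) L2-normalised face embedding of the new applicant's photo
    gallery: (n, d) L2-normalised embeddings of all photos already on file
    Returns the indices of gallery entries whose cosine similarity to the
    probe meets the threshold, sorted best-first, plus all raw scores.
    """
    scores = gallery @ probe                  # cosine similarity (unit vectors)
    hits = np.flatnonzero(scores >= threshold)
    return hits[np.argsort(scores[hits])[::-1]], scores

def normalise(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

rng = np.random.default_rng(0)

# Stand-in "embeddings": in a real system a CNN produces these from photos.
gallery = normalise(rng.normal(size=(1000, 128)))

# Simulate a re-registration under a new identity: the probe is a slightly
# perturbed copy of gallery entry 42 (same face, new photograph).
probe = normalise(gallery[42] + 0.05 * rng.normal(size=128))

hits, scores = one_to_many_match(probe, gallery, threshold=0.6)
print(hits)  # entry 42 should be flagged as the top candidate
```

In this sketch a flagged index would correspond to an existing registration to be passed to human investigators, mirroring the human-in-the-loop referral process described below; unrelated faces (random unit vectors here) score near zero and fall well under the threshold.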

By September 2015, the system had referred 62 cases of suspected fraud to An Garda Síochána (Ireland's national police service) or the Department's own investigations unit. By February 2017, the number of fraud cases had risen to 155, with 22 cases finalised in court and 17 resulting in custodial sentences. A further 18 cases were before the Director of Public Prosecutions and 100 remained under investigation. The most significant case involved an individual who used 12 false identities to fraudulently claim approximately EUR 478,000 in welfare payments, including jobseeker's allowance, rent supplement and other benefits; this individual received a five-year custodial sentence at Dublin Circuit Court. Another notable case involved Adrian Vaduva, who used multiple identities to claim approximately EUR 280,000. Other cases finalised in court included an individual who received a two-year custodial sentence for fraudulently claiming EUR 62,000, another who received a three-year suspended sentence at Longford Circuit Court for obtaining EUR 59,000 through use of multiple identities, and a further individual who received a three-year custodial sentence with 18 months suspended for using a false identity to obtain EUR 17,000 in welfare benefits.

Total fraud overpayments assessed reached EUR 1,590,500, with EUR 461,470 in savings from ceased payments. The system saved more than EUR 4 million in total by 2018, including EUR 1.734 million in 2016 and EUR 894,000 in 2017. In the first half of 2018, 28 additional fraud cases were identified. These results were set against Ireland's EUR 19 billion annual welfare budget, with research indicating that fraud was involved in approximately 3 per cent of benefit payments.

The decision criticality of this system is high because a positive match from the facial recognition software triggers a formal fraud investigation, referral to the national police, and can result in criminal prosecution, imprisonment and recovery of overpayments. All matches identified by the software are sent to the Department's special investigation unit and prioritised for investigation, indicating a human-in-the-loop process in which investigators review and act on the system's automated flagging. Identity fraud cases are referred to An Garda Síochána under criminal justice legislation.

The system raises significant privacy and civil liberties concerns typical of mass biometric surveillance in social protection contexts. Every new welfare applicant's photograph is compared against the entire DSP database, constituting a form of mass biometric screening. The facial recognition system was linked to the Public Services Card photo database, with biometric templates held for approximately 70 per cent of Ireland's population, including more than 13,000 children.

On 12 June 2025, the Data Protection Commission (DPC) fined the DSP EUR 550,000 for GDPR violations in connection with the use of facial matching technology linked to the Public Services Card. The DPC found no valid lawful basis for the collection and processing of biometric data. The decision identified four categories of infringement: lawful basis deficiency (Articles 5(1)(a), 6(1), 9(1)), retention issues (Article 5(1)(e)), transparency failures (Articles 13(1)(c), 13(2)(a)), and gaps in the Data Protection Impact Assessment (Article 35(7)(b),(c)). The DSP was given a nine-month deadline to either cease biometric processing or identify a valid legal basis. This regulatory action makes it one of the most significant cases of AI governance failure in European social protection, demonstrating how a system that delivered measurable fraud detection results can nonetheless operate in fundamental breach of data protection law.

Classifications follow the DCI AI Hub Taxonomy.

Social Protection Functions

Implementation/delivery chain
Registration (primary); Accountability mechanisms
SP Pillar (Primary) The social protection branch: social assistance, social insurance, or labour market programmes. Social assistance
Programme Name Cogent Facial Image Matching Software for Welfare Fraud Detection
Programme Type The type of social protection programme, classified under social assistance, social insurance, or labour market programmes. Other
System Level Where in the social protection system the AI is applied: policy level, programme design, or implementation/delivery chain. Implementation/delivery chain
Programme Description Facial recognition software deployed by Ireland's Department of Social Protection to compare applicant photographs against the entire DSP database during welfare registration, detecting individuals registered under multiple false identities.
Implementation Type How the AI output is produced: Classical ML, Deep learning, Foundation model, or Hybrid. Affects validation, compute requirements, and governance profile. Deep learning
Lifecycle Stage Current stage in the AI lifecycle, from problem identification through to monitoring, maintenance and decommissioning. Monitoring, Maintenance and Decommissioning
Model Provenance Origin of the AI model: developed in-house, adapted from open-source, commercial/proprietary, or accessed via third-party API. Commercial/proprietary
Compute Environment Where the AI system runs: on-premise, government cloud, commercial cloud, or edge/device. Not documented
Sovereignty Quadrant Classification of data and compute sovereignty: I (Sovereign), II (Federated/Hybrid), III (Cloud with safeguards), or IV (Shared Innovation Zone). Not assessed
Data Residency Where the data used by the AI system is stored: domestic, regional, or international. Not documented
Cross-Border Transfer Whether data crosses national borders, and if so, whether documented safeguards are in place. Not documented
Decision Criticality The rights impact of the decision the AI supports. High criticality requires HITL oversight; moderate requires HOTL; low may operate HOOTL. High
Human Oversight Type Level of human involvement: Human-in-the-Loop (active review), Human-on-the-Loop (monitoring), or Human-out-of-the-Loop (periodic audit). HITL
Development Process Whether the AI system was developed fully in-house, through a mix of in-house and third-party, or fully by an external provider. Not documented
Highest Risk Category The most significant structural risk source identified: data, model, operational, governance, or market/sovereignty risks. Data-related risks
Risk Assessment Status Whether a formal risk assessment, informal assessment, or independent audit has been conducted for this system. Formal assessment
Documented Risk Events On 12 June 2025, Ireland's Data Protection Commission fined the DSP EUR 550,000 for GDPR violations related to the facial matching system. The DPC found no valid lawful basis for biometric data collection, with templates held for 70% of the population including 13,000+ children. Four categories of infringement identified: lawful basis deficiency (Articles 5(1)(a), 6(1), 9(1)), retention issues (Article 5(1)(e)), transparency failures (Articles 13(1)(c), 13(2)(a)), and incomplete DPIA (Article 35(7)(b),(c)). DSP given 9-month deadline to cease biometric processing or identify valid legal basis.

Risk Dimensions

Data-related risks
Operational and system integration risks

Impact Dimensions

Equality, non-discrimination, fairness and inclusion
  • DPIA/AIA conducted
  • Human oversight protocol
Category Beneficiary registries and MIS
Sensitivity Sensitive
Cross-System Linkage Single source (no linkage)
Availability Currently available and used
Key Constraints Personal Public Services Number (PPSN) registration records used alongside facial matching to verify identity uniqueness

Category National ID and biometric databases
Sensitivity Special category
Cross-System Linkage Single source (no linkage)
Availability Currently available and used
Key Constraints Facial photographs captured during welfare registration and stored in DSP database; biometric data is special category under data protection law; 1:N matching against entire database constitutes mass biometric screening

Biometric Update (2018) 'Gemalto to upgrade Irish welfare fraud detection facial recognition system', Biometric Update, September. Available at: https://www.biometricupdate.com/201809/gemalto-to-upgrade-irish-welfare-fraud-detection-facial-recognition-system (Accessed: 30 March 2026).

Source type: News article / media

O'Brien, C. (2015) 'Facial recognition helps detect welfare benefits fraud', The Irish Times, 5 September. Available at: https://www.irishtimes.com/news/ireland/irish-news/facial-recognition-helps-detect-welfare-benefits-fraud-1.2340900 (Accessed: 27 March 2026).

Source type: News article / media

Kane, C. (2017) 'Facial recognition tool used to expose 155 cases of welfare fraud', The Irish Times, 14 February. Available at: https://www.irishtimes.com/news/crime-and-law/facial-recognition-tool-used-to-expose-155-cases-of-welfare-fraud-1.2967045 (Accessed: 30 March 2026).

Source type: News article / media

Data Protection Commission (2025) 'DPC announces conclusion of investigation into use of facial matching technology in connection with the Public Services Card by the Department of Social Protection', Data Protection Commission, Ireland, 12 June. Available at: https://www.dataprotection.ie/en/news-media/press-releases/dpc-announces-conclusion-investigation-use-facial-matching-technology-connection-public-services (Accessed: 30 March 2026).

Source type: Legal document / regulation
Deployment Status How far the system has progressed into real-world operational use, from concept/exploration through to scaled and institutionalised. Suspended / Halted
Year Initiated The year the AI system was first initiated or development began. 2015
Scale / Coverage The scale and geographic or population coverage of the deployment. National
Funding Source The source(s) of funding for the AI system development and deployment. Unknown
Technical Partners External technology vendors, academic partners, or development partners involved. Cogent Systems (biometric vendor), Gemalto (2018 upgrade)
Outcomes / Results 155 suspected fraud cases referred by February 2017 (up from 62 in 2015); 22 cases finalised in court with 17 custodial sentences; largest case involved EUR 478,000 in fraudulent claims using 12 false identities (five-year sentence); total fraud overpayments assessed at EUR 1,590,500; EUR 461,470 in savings from ceased payments; total savings exceeded EUR 4 million by 2018; DPC fined DSP EUR 550,000 in June 2025 for GDPR violations

How to Cite

DCI AI Hub (2026). 'Cogent Facial Image Matching Software for Welfare Fraud Detection', AI Hub AI Tracker, case IRL-001. Digital Convergence Initiative. Available at: https://socialprotectionai.org/use-case/IRL-001 [Accessed: 1 April 2026].

Change History

Updated 31 Mar 2026, 06:35
by system (system)
Created 30 Mar 2026, 08:40
by v2-import (import)