DCI AI Hub — AI Tracker socialprotectionai.org/use-case/IND-005
IND-005 Exported 1 April 2026

Samagra Vedika – AI-Enabled Entity Resolution for Welfare Eligibility (Telangana, India)

Country India
Deployment Status Scaled & Institutionalised
Confidence Confirmed
Implementing Agency Department of Information Technology, Electronics & Communications (ITE&C), Government of Telangana

Overview

Samagra Vedika is an integrated data platform deployed by the Government of Telangana, India, that uses machine-learning-based entity resolution to create unified digital profiles of citizens by linking fragmented records across approximately 30 separate government databases. The system was developed by Posidex Technologies Private Limited, a proprietary entity resolution software vendor, and is operated by the Department of Information Technology, Electronics and Communications (ITE&C) of the Government of Telangana. Originally built in 2016 for the Hyderabad City Police to identify persons of interest, the platform was subsequently expanded to determine eligibility for welfare programmes including food security ration cards, Aasara state pensions, and 2BHK housing schemes (Al Jazeera, 2024).

The core technical approach employs probabilistic record linkage using a graph database architecture. The system processes over 380 million individual records drawn from integrated databases including pensions, land registries, electricity connections, ration cards, and vehicle registrations (Amnesty International, 2024). Entity resolution operates through three phases: preprocessing and blocking, where data is cleaned and grouped to reduce computational comparisons; comparison and matching, where similarity scores are computed between record pairs using machine-learning approaches; and clustering, where records determined to represent the same entity are grouped together (Amnesty International, 2024). The system uses name, address, date of birth, phone number, and father's name as primary linking attributes, and was specifically designed to operate without relying on Aadhaar biometric identification as the primary key, due to legal restrictions arising from the Justice K.S. Puttaswamy v Union of India Supreme Court ruling on privacy (Al Jazeera, 2024).
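The three-phase pipeline described above can be sketched in Python. This is a minimal illustration, not the vendor's implementation: the attributes, the similarity measure (the standard library's SequenceMatcher rather than a trained ML model), the weights, and the threshold are all assumptions for demonstration.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Toy records; names echo the kind of near-match the system handles.
records = [
    {"id": 1, "name": "Syed Ali",       "dob": "1951-03-12", "district": "Hyderabad"},
    {"id": 2, "name": "Syed Hyder Ali", "dob": "1951-03-12", "district": "Hyderabad"},
    {"id": 3, "name": "Lakshmi Devi",   "dob": "1968-07-01", "district": "Warangal"},
]

def block(records):
    """Phase 1 (preprocessing and blocking): group records by a cheap key
    (here, district) so only records within a block are compared."""
    blocks = {}
    for r in records:
        blocks.setdefault(r["district"], []).append(r)
    return blocks

def similarity(a, b):
    """Phase 2 (comparison and matching): a weighted similarity over name
    and date of birth. Weights are illustrative only."""
    name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
    dob_sim = 1.0 if a["dob"] == b["dob"] else 0.0
    return 0.6 * name_sim + 0.4 * dob_sim

def cluster(records, threshold=0.8):
    """Phase 3 (clustering): union-find over pairs scoring at or above
    the tolerance threshold; each resulting cluster is one 'entity'."""
    parent = {r["id"]: r["id"] for r in records}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for blk in block(records).values():
        for a, b in combinations(blk, 2):
            if similarity(a, b) >= threshold:
                parent[find(a["id"])] = find(b["id"])
    clusters = {}
    for r in records:
        clusters.setdefault(find(r["id"]), []).append(r["id"])
    return list(clusters.values())

# At this threshold, the first two records merge into one entity.
print(cluster(records))
```

Note how the near-identical names merge here: that is the intended behaviour for deduplication, but the same mechanism produces the wrongful attributions documented below when two distinct people share similar identifiers.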

The platform generates what officials describe as a '360-degree view' of individuals, triangulating identity matches across multiple records to assess eligibility against defined criteria. For food security cards in Telangana, families must have annual income below 150,000 rupees in rural areas or 200,000 rupees in urban areas, and exclusion triggers include possession of four-wheelers, government or private employment of family members, and ownership of businesses such as shops, petrol pumps, or rice mills (Al Jazeera, 2024). A critical design parameter is the tolerance threshold—the cutoff similarity score determining what constitutes a match—which creates an inherent trade-off between false positives (legitimate claims wrongly flagged as fraudulent) and false negatives (actual fraud undetected). Amnesty International's analysis found that the system's design embeds financial incentives to reduce false negatives, ensuring the system finds sufficient fraudulent or duplicative applications to demonstrate cost savings, at the expense of increasing false positive rates that wrongly exclude eligible families (Amnesty International, 2024).
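The threshold trade-off can be made concrete with a toy sketch. The similarity scores and match labels below are invented for illustration; sweeping the cutoff shows how lowering it inflates false positives (distinct people wrongly merged, which can trigger wrongful exclusion) while raising it inflates false negatives (true duplicates missed).

```python
# Toy labelled record pairs: (similarity_score, is_true_match)
pairs = [
    (0.95, True), (0.90, True), (0.85, False), (0.80, True),
    (0.75, False), (0.70, True), (0.60, False), (0.50, False),
]

def error_rates(pairs, threshold):
    """A pair scoring >= threshold is flagged as the same entity.
    Returns (false positives, false negatives) at that cutoff."""
    fp = sum(1 for score, match in pairs if score >= threshold and not match)
    fn = sum(1 for score, match in pairs if score < threshold and match)
    return fp, fn

for t in (0.6, 0.8, 0.9):
    fp, fn = error_rates(pairs, t)
    print(f"threshold={t}: false positives={fp}, false negatives={fn}")
```

On this toy data the pattern Amnesty describes is visible directly: dropping the threshold from 0.9 to 0.6 eliminates false negatives but triples false positives.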

Between 2014 and 2019, the Telangana government cancelled 1.86 million existing food security cards and rejected 142,086 fresh applications, in many cases without providing notice to affected beneficiaries (Al Jazeera, 2024). Documented errors include the system incorrectly attributing car ownership to individuals—for example, tagging the deceased husband of a 67-year-old widow named Bismillah Bee as car owner 'Syed Hyder Ali' when his actual name was 'Syed Ali', resulting in denial of subsidised rations during the COVID-19 pandemic while her husband was battling mouth cancer (Al Jazeera, 2024). Another documented case involved a family whose husband has polio-induced paralysis being tagged as possessing a four-wheeler; the Telangana High Court confirmed their eligibility in November 2024, but rations had still not been issued at the time of reporting (Al Jazeera, 2024). A Supreme Court-ordered re-verification of 491,899 applications found that of the 205,734 processed by July 2022, 15,471 were approved—suggesting approximately 7.5 percent of exclusions were wrongful, contradicting the government's claim that error rates were below 5 percent (Al Jazeera, 2024).
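The 7.5 percent figure follows directly from the reported re-verification counts:

```python
processed = 205_734  # applications re-verified by July 2022
approved = 15_471    # found eligible on re-verification, i.e. wrongly excluded
print(f"{approved / processed:.1%}")  # prints "7.5%", above the claimed sub-5% error rate
```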

The system operates with stated human-in-the-loop oversight, where district officials are mandated to verify and approve actions after algorithmic triage. However, investigations by Al Jazeera and the Pulitzer Center documented that officials routinely deferred to algorithmic decisions, accepted the algorithm's determinations even when presented with contradictory evidence, and required beneficiaries to file formal grievances to trigger reviews (Al Jazeera, 2024). Officials reported that they 'just don't know' how to override algorithmic decisions, effectively reversing the burden of proof onto vulnerable beneficiaries who are 'shunted from one office to another' when attempting to correct erroneous data (Al Jazeera, 2024). Under public pressure following media exposure, the government reinstated 14,000 cancelled ration cards through an appeals process.

Transparency and accountability are severely constrained. The state IT department denied Right to Information Act requests for source code and data format specifications, citing the vendor's proprietary rights (Al Jazeera, 2024; Amnesty International, 2024). Posidex Technologies declined interview requests from both Al Jazeera and Amnesty International. Amnesty International spent a year designing and attempting to conduct an independent audit of Samagra Vedika but was unable to complete it due to inability to access proprietary source code, high procurement costs for testing access, and non-disclosure agreement requirements (Amnesty International, 2024). No public algorithmic audit, human rights impact assessment, or independent transparency review has been documented.

The Samagra Vedika model is being replicated across other Indian states through similar platforms including Haryana's Parivaar Pehchan Patra, Tamil Nadu's Makkal ID, Jammu and Kashmir's Family ID, and Karnataka's Kutumba, significantly expanding the scale of algorithmic welfare administration across the country (Amnesty International, 2024). The case represents one of the most scrutinised AI-enabled welfare implementations globally, with extensive corroboration from investigative journalism, human rights organisations, Right to Information responses, and Supreme Court proceedings.

Classification

AI Capabilities

Clustering (similarity and grouping) (primary); Anomaly and change detection; Classification

Use Cases

Identification, verification and record linkage (primary); Compliance and integrity; Decision support for eligibility and benefits

Social Protection Functions

Implementation/delivery chain: Assessment of needs/conditions + enrolment (primary); Implementation/delivery chain: Accountability mechanisms; Implementation/delivery chain: Registration
SP Pillar (Primary): Social assistance

Programme Details

Programme Name: Samagra Vedika – Integrated Data Platform (Telangana)
Programme Type: Fee waivers and targeted subsidies
System Level: Implementation/delivery chain

Cross-cutting data integration platform used by the Government of Telangana to determine eligibility across multiple social assistance programmes including food security ration cards (Public Distribution System), Aasara state pensions, and 2BHK housing schemes. The platform consolidates approximately 30 government databases to create unified citizen profiles for eligibility verification and duplicate detection.

Implementation Details

Implementation Type: Classical ML
Lifecycle Stage: Monitoring, Maintenance and Decommissioning
Model Provenance: Commercial/proprietary
Compute Environment: Not documented
Sovereignty Quadrant: Not assessed
Data Residency: Not documented
Cross-Border Transfer: Not documented

Risk & Oversight

Decision Criticality: High
Human Oversight: HITL
Development Process: Fully third-party developed
Highest Risk Category: Governance and institutional oversight risks
Risk Assessment Status: Not assessed

Documented Risk Events

1.86 million food security cards cancelled and 142,086 fresh applications rejected between 2014 and 2019, many without notice. Supreme Court-ordered re-verification found an approximately 7.5% wrongful exclusion rate (15,471 approved of 205,734 processed). Documented individual cases of incorrect car ownership attribution and wrongful income inflation. Government reinstated 14,000 cancelled ration cards under public pressure. RTI requests for source code and data formats were denied, citing vendor proprietary rights. Amnesty International was unable to complete a year-long audit attempt due to access barriers.

Risk Dimensions

Data-related risks

Consent or lawful basis gap; Cross-dataset inconsistency; Data quality failure; Representation bias; Weak provenance or lineage

Governance and institutional oversight risks

Inadequate grievance or redress; Insufficient human oversight; Purpose limitation failure; Unclear accountability; Weak documentation or auditability

Market, sovereignty and industry structure risks

LMIC power asymmetry; Opaque supply chain; Restricted audit access; Vendor lock-in

Model-related risks

Opacity or limited explainability; Reliability or generalisation failure; Subgroup bias

Operational and system integration risks

Automation complacency; Inadequate real-world validation; Monitoring gap; Threshold or rule misconfiguration

Impact Dimensions

Accountability, transparency and redress

No accessible or effective remedy; No identifiable decision owner; Untraceable decision pathway

Autonomy, human dignity and due process

Inability to contest or appeal outcome; Loss of individual agency or autonomy; Opaque or unexplained decision; Psychological stress, stigma or dignity harm

Equality, non-discrimination, fairness and inclusion

Discriminatory outcome; Disparate error rates across groups; Systematic exclusion from benefits or services

Privacy and data security

Disproportionate surveillance or profiling; Loss of individual control over personal data; Privacy violation or data breach

Systemic and societal

Erosion of public trust in SP system; Increased administrative burden on frontline staff; Political backlash, litigation or controversy

Safeguards

Grievance mechanism; Human oversight protocol

Deployment & Outcomes

Deployment Status: Scaled & Institutionalised
Year Initiated: 2016
Scale / Coverage: State-wide deployment across Telangana (approximately 30 million residents); processes over 380 million individual records from approximately 30 government databases; deployed across most state welfare schemes by 2018; model being replicated in other Indian states
Funding Source: Government of Telangana (state budget)
Technical Partners: Posidex Technologies Private Limited provides the entity resolution software. Neither the state government nor the company has placed source code or verifiable performance data in the public domain; the vendor retains proprietary rights over the system architecture. Reports indicate that other states have hired Posidex to develop similar platforms.

Outcomes / Results

Government-reported outcomes: Aasara pensions: 65,693 new applications processed; 6,625 (10.1%) flagged ineligible (approximately 16 crore rupees, i.e. 160 million rupees, per year in claimed savings). Hyderabad PDS pilot (August 2016): approximately 86,000 ration cards removed; approximately 4.6 crore rupees (46 million rupees) per month in reported savings. The vendor claims the system 'can save a few hundred crores every year' by identifying leakages. No independent verification of these savings claims has been conducted.

Challenges

Probabilistic matching errors cause wrongful exclusions of eligible beneficiaries from food security and pension programmes. Officials routinely defer to algorithmic decisions rather than exercising mandated human oversight. Burden of proof effectively reversed onto vulnerable beneficiaries who must navigate complex grievance processes. Complete lack of transparency due to proprietary vendor system and RTI denials. No independent audit completed despite year-long attempt by Amnesty International. System design incentivises minimising false negatives (missed fraud) at the expense of increased false positives (wrongful exclusions). Replication across multiple Indian states amplifies systemic risks without addressing documented failures.

Sources

  1. SRC-001-IND-005 Al Jazeera (2024) 'How an algorithm denied food to thousands of poor in India's Telangana', Al Jazeera, 24 January. Available at: https://www.aljazeera.com/economy/2024/1/24/how-an-algorithm-denied-food-to-thousands-of-poor-in-indias-telangana (Accessed: 24 March 2026).
  2. SRC-002-IND-005 Amnesty International (2024) 'Entity Resolution in India's Welfare Digitalization' (Technical Explainer). London: Amnesty International. Available at: https://www.amnesty.org/en/latest/research/2024/04/entity-resolution-in-indias-welfare-digitalization/ (Accessed: 24 March 2026).
  3. SRC-003-IND-005 ITE&C Department, Government of Telangana (2019) Samagra Vedika — Telangana's Integrated Platform. World Bank-hosted presentation. Available at: https://thedocs.worldbank.org/en/doc/945071576869997489-0310022019/original/GTVenkateshwarRaoPresentationonSamagraVedikatoWordlBankseminatDec19.pdf (Accessed: 31 October 2025).
  4. SRC-004-IND-005 Pulitzer Center (2024) 'How an Algorithm Denied Food to Thousands of Poor in India's Telangana', Pulitzer Center. Available at: https://pulitzercenter.org/stories/how-algorithm-denied-food-thousands-poor-indias-telangana (Accessed: 31 October 2025).

How to Cite

DCI AI Hub (2026). 'Samagra Vedika – AI-Enabled Entity Resolution for Welfare Eligibility (Telangana, India)', AI Hub AI Tracker, case IND-005. Digital Convergence Initiative. Available at: https://socialprotectionai.org/use-case/IND-005


Digital Convergence Initiative - AI Hub

Responsible, ethical use of AI in social protection

MarketImpact Platform developed by MarketImpact Digital Solutions
Co-funded by European Union and German Cooperation. Coordinated by GIZ, ILO, The World Bank, Expertise France, and FIAP.