Samagra Vedika is an integrated data platform deployed by the Government of Telangana, India, that uses machine-learning-based entity resolution to create unified digital profiles of citizens by linking fragmented records across approximately 30 separate government databases. The system was developed by Posidex Technologies Private Limited, a proprietary entity resolution software vendor, and is operated by the Department of Information Technology, Electronics and Communications (ITE&C) of the Government of Telangana. Originally built in 2016 for the Hyderabad City Police to identify persons of interest, the platform was subsequently expanded to determine eligibility for welfare programmes including food security ration cards, Aasara state pensions, and 2BHK housing schemes (Al Jazeera, 2024).
The core technical approach employs probabilistic record linkage using a graph database architecture. The system processes over 380 million individual records drawn from integrated databases including pensions, land registries, electricity connections, ration cards, and vehicle registrations (Amnesty International, 2024). Entity resolution operates through three phases: preprocessing and blocking, where data is cleaned and grouped to reduce computational comparisons; comparison and matching, where similarity scores are computed between record pairs using machine-learning approaches; and clustering, where records determined to represent the same entity are grouped together (Amnesty International, 2024). The system uses name, address, date of birth, phone number, and father's name as primary linking attributes, and was specifically designed to operate without relying on Aadhaar biometric identification as the primary key, due to legal restrictions arising from the Justice K.S. Puttaswamy v Union of India Supreme Court ruling on privacy (Al Jazeera, 2024).
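The three phases described above can be illustrated with a minimal, self-contained sketch. The field names, similarity weights, blocking key, and threshold below are invented for illustration; the actual Posidex matching logic is proprietary and has not been independently audited.

```python
from collections import defaultdict
from difflib import SequenceMatcher

# Toy records; schema and values are assumptions for illustration only.
records = [
    {"id": 1, "name": "Syed Ali", "dob": "1948-03-01", "phone": "9000000001"},
    {"id": 2, "name": "Syed Hyder Ali", "dob": "1948-03-01", "phone": "9000000001"},
    {"id": 3, "name": "Bismillah Bee", "dob": "1957-06-12", "phone": "9000000002"},
]

def block_key(rec):
    # Phase 1: preprocessing and blocking -- group records by a coarse key
    # (here, first letter of name plus birth year) so that pairwise
    # comparisons happen only within a block, not across all records.
    return (rec["name"][0].lower(), rec["dob"][:4])

def similarity(a, b):
    # Phase 2: comparison and matching -- combine per-field similarity
    # scores into a single weighted score (weights are invented).
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    dob_sim = 1.0 if a["dob"] == b["dob"] else 0.0
    phone_sim = 1.0 if a["phone"] == b["phone"] else 0.0
    return 0.5 * name_sim + 0.25 * dob_sim + 0.25 * phone_sim

def cluster(records, threshold=0.8):
    # Phase 3: clustering -- union-find over pairs whose score clears the
    # tolerance threshold; each resulting group is treated as one entity.
    parent = {r["id"]: r["id"] for r in records}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    blocks = defaultdict(list)
    for r in records:
        blocks[block_key(r)].append(r)
    for block in blocks.values():
        for i in range(len(block)):
            for j in range(i + 1, len(block)):
                if similarity(block[i], block[j]) >= threshold:
                    parent[find(block[i]["id"])] = find(block[j]["id"])
    groups = defaultdict(set)
    for r in records:
        groups[find(r["id"])].add(r["id"])
    return sorted(tuple(sorted(g)) for g in groups.values())

print(cluster(records))
```

Note that with these invented weights, 'Syed Ali' and 'Syed Hyder Ali' are merged into one entity on the strength of the shared birth date and phone number despite the name mismatch, which mirrors how such systems can conflate distinct but similar identities.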
The platform generates what officials describe as a '360-degree view' of individuals, triangulating identity matches across multiple records to assess eligibility against defined criteria. For food security cards in Telangana, families must have annual income below 150,000 rupees in rural areas or 200,000 rupees in urban areas, and exclusion triggers include possession of four-wheelers, government or private employment of family members, and ownership of businesses such as shops, petrol pumps, or rice mills (Al Jazeera, 2024). A critical design parameter is the tolerance threshold—the cutoff similarity score determining what constitutes a match—which creates an inherent trade-off between false positives (legitimate claims wrongly flagged as fraudulent) and false negatives (actual fraud undetected). Amnesty International's analysis found that the system's design embeds financial incentives to reduce false negatives, ensuring the system finds sufficient fraudulent or duplicative applications to demonstrate cost savings, at the expense of increasing false positive rates that wrongly exclude eligible families (Amnesty International, 2024).
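The threshold trade-off described above can be made concrete with a small sketch. The scores and ground-truth labels below are invented; the point is only that, for a fixed scorer, raising the cutoff trades false positives for false negatives and vice versa.

```python
# Hypothetical scored record pairs: (similarity score, truly the same person?).
# All numbers and labels are invented for illustration.
pairs = [
    (0.95, True), (0.90, True), (0.82, False), (0.78, True),
    (0.70, False), (0.60, True), (0.55, False), (0.40, False),
]

def error_rates(pairs, threshold):
    # A pair scoring >= threshold is flagged as the same person.
    fp = sum(1 for s, same in pairs if s >= threshold and not same)  # wrongly merged
    fn = sum(1 for s, same in pairs if s < threshold and same)       # missed match
    return fp, fn

for t in (0.6, 0.75, 0.9):
    fp, fn = error_rates(pairs, t)
    print(f"threshold={t}: false positives={fp}, false negatives={fn}")
```

On this toy data the counts move monotonically: lowering the threshold to 0.6 yields two false positives and no false negatives, while raising it to 0.9 yields none and two. A deployment incentivised to maximise detected "fraud" would push the operating point toward the false-positive end, which is precisely the dynamic Amnesty International identified.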
Between 2014 and 2019, the Telangana government cancelled 1.86 million existing food security cards and rejected 142,086 fresh applications, in many cases without providing notice to affected beneficiaries (Al Jazeera, 2024). Documented errors include the system incorrectly attributing car ownership to individuals: for example, it tagged the deceased husband of a 67-year-old widow named Bismillah Bee as car owner 'Syed Hyder Ali' when his actual name was 'Syed Ali', resulting in the denial of subsidised rations during the COVID-19 pandemic while her husband was battling mouth cancer (Al Jazeera, 2024). In another documented case, a family whose husband is paralysed from polio was wrongly tagged as possessing a four-wheeler; the Telangana High Court confirmed the family's eligibility in November 2024, but rations had still not been issued at the time of reporting (Al Jazeera, 2024). A Supreme Court-ordered re-verification of 491,899 applications found that, of the 205,734 processed by July 2022, 15,471 were approved, suggesting that approximately 7.5 percent of exclusions were wrongful and contradicting the government's claim that error rates were below 5 percent (Al Jazeera, 2024).
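The 7.5 percent figure follows directly from the reported re-verification counts:

```python
# Figures as reported (Al Jazeera, 2024).
approved = 15_471    # exclusions reversed on Supreme Court-ordered re-verification
processed = 205_734  # applications processed by July 2022

rate = approved / processed * 100
print(f"wrongful-exclusion rate: {rate:.1f}%")  # prints 7.5%
```

This exceeds the government's claimed sub-5-percent error rate, and the re-verification covered only 205,734 of the 491,899 applications ordered reviewed, so the figure is a lower-bound snapshot rather than a complete accounting.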
The system operates with stated human-in-the-loop oversight, where district officials are mandated to verify and approve actions after algorithmic triage. However, investigations by Al Jazeera and the Pulitzer Center documented that officials routinely deferred to algorithmic decisions, accepted the algorithm's determinations even when presented with contradictory evidence, and required beneficiaries to file formal grievances to trigger reviews (Al Jazeera, 2024). Officials reported that they 'just don't know' how to override algorithmic decisions, effectively reversing the burden of proof onto vulnerable beneficiaries, who are 'shunted from one office to another' when attempting to correct erroneous data (Al Jazeera, 2024). Under public pressure following media exposure, the government reinstated 14,000 cancelled ration cards through an appeals process.
Transparency and accountability are severely constrained. The state IT department denied Right to Information Act requests for source code and data format specifications, citing the vendor's proprietary rights (Al Jazeera, 2024; Amnesty International, 2024). Posidex Technologies declined interview requests from both Al Jazeera and Amnesty International. Amnesty International spent a year designing and attempting to conduct an independent audit of Samagra Vedika but was unable to complete it due to inability to access proprietary source code, high procurement costs for testing access, and non-disclosure agreement requirements (Amnesty International, 2024). No public algorithmic audit, human rights impact assessment, or independent transparency review has been documented.
The Samagra Vedika model is being replicated across other Indian states through similar platforms including Haryana's Parivaar Pehchan Patra, Tamil Nadu's Makkal ID, Jammu and Kashmir's Family ID, and Karnataka's Kutumba, significantly expanding the scale of algorithmic welfare administration across the country (Amnesty International, 2024). The case represents one of the most scrutinised AI-enabled welfare implementations globally, with extensive corroboration from investigative journalism, human rights organisations, Right to Information responses, and Supreme Court proceedings.