Editor: Varun Sharma and Vishal Kumar
Camera: Abhishek Ranjan
On Friday, when the nation’s attention was fixed upon the Union budget, a small advertisement tucked away within newspapers called for applicants to bid for the implementation of a nationwide Automated Facial Recognition System.
Unlike the 122-page long Finance Bill, 2019, the 172-page Request for Proposal (RFP) released by the National Crime Records Bureau (NCRB), hidden within the folders of its website, generated nearly no attention even though its implications are wide and severe.
NCRB, which has not published statistics on crimes committed in India since 2016, writes in the proposal that it has “conceptualised the Automated Facial Recognition System (AFRS)” as an effort towards “modernising the police force, information gathering, criminal identification.”
According to the document, stakeholders of this project are the NCRB, the Ministry of Home Affairs and all state police forces.
A digital system for facial recognition of citizens has legitimate benefits in identifying criminals, tracing missing persons and aiding investigations.
However, at the same time, the move raises pertinent concerns as worldwide instances of surveillance, privacy abuse, inaccurate results and most importantly, disproportionate impact on minorities have surfaced repeatedly.
In May, San Francisco, at the heart of Silicon Valley’s technology revolution, became the first American city to ban the use of facial recognition technology by the police.
The ban came amid growing fears that the technology would be abused by the government and push the city towards overt surveillance.
In India, two specific concerns arise:
- The absence of a data protection law which defines how the state can collect, store, process and share citizens’ data.
- The need for comprehensive surveillance reform, since surveillance in India currently operates without judicial oversight and has been criticised for being opaque and ambiguous.
What Is an Automated Facial Recognition System?
Automatic facial recognition (AFR) is an advanced way of recognising people by using computers to scan their faces. According to a research paper, “It aims to identify people in images or videos using sophisticated pattern recognition techniques.”
Automated facial recognition is widely used in applications ranging from social media to advanced authentication systems. It is used to identify our faces in a group photograph on Facebook, to unlock our phones, and to identify faces from CCTV footage.
The NCRB is seeking to implement a system where a police officer can take a photograph of an individual, say at a protest or a crime scene, and match it against photographs in their database to identify the person and get basic information about them.
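For readers curious how such matching works under the hood, here is a minimal, purely illustrative Python sketch. Real systems first convert each face image into a numeric feature vector (an "embedding") and then compare vectors by distance; the names, vectors and threshold below are invented for illustration and have nothing to do with the NCRB's actual system.

```python
import math

def euclidean_distance(a, b):
    """Distance between two face embeddings (smaller = more similar)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_match(probe, database, threshold=0.6):
    """Return the name of the closest database entry, if within threshold."""
    best_name, best_dist = None, float("inf")
    for name, embedding in database.items():
        dist = euclidean_distance(probe, embedding)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Toy database of pre-computed embeddings (real embeddings have 100+ dimensions)
database = {
    "person_a": [0.1, 0.2, 0.3],
    "person_b": [0.9, 0.8, 0.7],
}

probe = [0.12, 0.21, 0.29]  # embedding extracted from a new photograph
print(find_match(probe, database))  # prints "person_a"
```

The threshold is the crux: set it too loose and innocent people are flagged as matches (the false positives discussed later in this article); set it too tight and genuine matches are missed.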
Some intended objectives of the AFRS:
- The AFRS will be a centralised web application hosted at the NCRB Data Centre in Delhi
- The repository shall act as a foundation for a national level searchable platform of facial images
- Capture face images from CCTV feed and generate alerts if a blacklist match is found
- The system should have the option to upload bulk images of an individual
- The solution should be compatible with other biometric solutions such as iris and fingerprint for generation of comprehensive biometric authentication reports
- The database should be able to store 1.5 crore images and accommodate 2,500 simultaneous users
What Could Go Wrong?
The question to ask is – why is this a worry for Indians? Isn’t it a good thing that law enforcement agencies will be better equipped to apprehend criminals?
While that is the stated aim of this, or any, AFRS project, there are a number of major concerns:
No Data Protection Law: Among the biggest problems in launching such a massive project is the absence of a law to supervise and inform how a government agency can go about using and processing our images.
The draft data protection bill submitted to the government by the Justice BN Srikrishna Committee in July 2018 has NOT been tabled in Parliament yet.
Electronics and IT Minister Ravi Shankar Prasad, when asked about the bill on Thursday during a debate on Aadhaar, said it is “a work in progress”.
Little Supervision on Surveillance: As discussed earlier, experts have been fervently calling for a comprehensive surveillance reform. Currently, electronic surveillance is authorised under section 69 of the Information Technology Act, 2000.
The section is not only vague, listing broad grounds for surveillance such as “sovereignty or integrity of India”, “friendly relations with other states”, “security of the state” and “public order”, but there is also almost no publicly available information about the grounds on which surveillance decisions are granted.
Merging with Iris and Fingerprint Databases: Under section 2.2 – Functional Requirements of the AFRS System – point 21 states that the solution “should be compatible with other biometric solutions such as iris and fingerprints for generating comprehensive biometric authentication reports.”
This does raise serious concerns as it is unclear whether, in the absence of specific laws, it can be linked to Aadhaar’s CIDR database.
360 Degree Profiling: Point 31 states that the AFRS will be integrated with the existing AFRS systems (and any other AFRS system established before signing of the contract) of some advanced states. The Andhra Pradesh and Telangana governments do possess advanced databases like the controversial State Resident Data Hubs (SRDH).
The SRDHs are well-documented examples of 360-degree profiling of residents. A Huffington Post report had investigated how citizens could be looked up by religion or caste using Aadhaar numbers to connect disparate strands of data about them.
Who Can Bid For This Project?
The RFP sets down a list of criteria that a bidder must fulfil in order to be eligible. The last date for submission of bids is 16 August and they will be opened on 19 August. Below are some of the conditions stated.
- Earnest Money Deposit (EMD) of Rs 40 lakh along with submission
- Should have an annual turnover of at least Rs 100 crore in each of the last 3 financial years
- Should have a positive net worth in each of the last 3 financial years
- Must have successfully executed and completed, in the last ten financial years, at least 3 AFRS installations (each with a database of at least 10 lakh entries) for law enforcement agencies across the world
A Pattern of Invasive Surveillance
This is not the first attempt at a project that raises serious privacy and surveillance concerns. In just the past year, several similar plans have been unveiled.
1. Social Media Communication Hub: In April 2018, the Information & Broadcasting Ministry had issued a similar RFP for Social Media Communication Hub to monitor social media activity.
Following a legal challenge by TMC’s Mahua Moitra, the plan had to be dropped after the Supreme Court observed that it would be akin to “creating a surveillance state”.
2. 10 Govt Agencies Authorised to Snoop: In December 2018, the government authorised 10 intelligence and investigating agencies and the Delhi Police to intercept, monitor and decrypt "any information" generated, transmitted, received or stored in "any computer".
3. Intermediary Liability Rules: Just five days later, on 24 December, the Ministry of Electronics and IT (MeitY) issued the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018.
These proposed amendments, drafted without any prior consultation with the public, propose that messaging apps, social networks, search engines, internet service providers and cyber cafes, among others, follow a content policing and filtering system.
Hard Lessons From China, US, UK
Before India embarks on a project with such severe implications, it is wise to take a look at how other countries that have already implemented it are faring.
Impact on Minorities/Govt Abuse: A New York Times report revealed how the Chinese government was using a vast, ‘secret’ system of advanced facial recognition technology to track and control the Uighurs, a Muslim minority community.
“The facial recognition system looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review,” the report states.
Bias Against Women: At least one study carried out at the Massachusetts Institute of Technology has revealed that FRS from giants like IBM and Microsoft is less accurate when identifying women. In the US, many reports have discussed how such software is particularly poor at accurately recognising African-American women.
Dangerous Inaccuracies: Even Amazon cannot get it right. Yes, Amazon! In a major embarrassment to the company, a test of its software, called “Rekognition”, incorrectly identified 28 members of the US Congress as people who had been arrested for crimes.
Disproportionately Harmful: In a scathing editorial in June, UK news publication The Guardian denounced facial recognition as a “danger to democracy”.
The disproportionality draws from all the previous points and busts the myth that government abuse happens only in autocratic or authoritarian countries.