The Delhi Police is now using Automated Facial Recognition Systems (AFRS) to screen alleged 'rabble-rousers and miscreants' after the farmers' tractor march in the national capital turned violent on 26 January.
However, the accuracy of Facial Recognition Technology (FRT) has been a primary concern in its deployment for security and law enforcement purposes.
The FRT now deployed in Delhi was originally acquired by the Delhi Police to identify missing children. In 2019, the Delhi Police told the Delhi High Court that the system had an accuracy rate of just two percent.
“FRT even failed to distinguish between boys and girls,” said the New Delhi-based digital rights advocacy group Internet Freedom Foundation, which also questioned the sophistication of the system. Poor accuracy is a worldwide trend, with similarly low rates reported in the UK and the US.
The same technology was used to identify rioters in 2020, when more than 1,100 individuals allegedly took part in the communal violence in northeast Delhi on 24 and 25 February. Commenting on the exercise, Home Minister Amit Shah said, “We have used facial recognition software to initiate the process of identifying faces.”
According to a report by Hindustan Times, 140 CCTV cameras were installed by the Delhi Police along the Republic Day parade route, covering 30 spots on Rajpath. The facial recognition system was reportedly fed a database of suspected terrorists, criminals, and anti-social elements totalling about 50,000 people.
“We have the video footage of those who indulged in the violence and we are analysing them. They are being identified with the help of the face recognition system, they will be arrested, and legal action will be taken against them,” said SN Srivastava, Delhi Police Commissioner.
Key Concerns of FRT
- Accuracy: Facial recognition is not free from error and cannot be 100 percent accurate, which could lead to people being implicated in crimes they have not committed. Even a slight change in camera angle, or a change in appearance such as a new hairstyle, can produce an error. In 2018, Amazon's facial recognition technology falsely identified 28 members of the US Congress as people who had been arrested for crimes, Newsweek reported. Such errors compound when faces are screened against a large watchlist, as the sketch below illustrates.
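To see why even a seemingly small error rate matters at this scale, consider a back-of-the-envelope calculation. This is a minimal sketch, not a model of the Delhi system: the 50,000 figure is the reported watchlist size, but the per-comparison false positive rates below are assumptions chosen purely for illustration.

```python
# Hypothetical illustration: why a small per-comparison error rate still
# produces many false matches when one face is screened against a large
# watchlist. The 50,000 figure is the reported database size; the error
# rates below are assumed values, not measured ones.

WATCHLIST_SIZE = 50_000  # suspects reportedly loaded into the system

def expected_false_matches(false_positive_rate: float, watchlist_size: int) -> float:
    """Expected number of wrong watchlist hits for ONE innocent face,
    assuming each comparison errs independently at the given rate."""
    return false_positive_rate * watchlist_size

for fpr in (0.01, 0.001, 0.0001):  # assumed per-comparison error rates
    hits = expected_false_matches(fpr, WATCHLIST_SIZE)
    print(f"FPR {fpr:.2%}: ~{hits:.0f} false matches per innocent face scanned")
```

Under these assumptions, even a one-in-ten-thousand error rate yields around five false watchlist hits for every innocent face scanned, and the numbers grow linearly with the size of the database.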
- Bias: FRT algorithms in a number of commercial software packages have shown racial, ethnic, and gender biases. A study carried out at the Massachusetts Institute of Technology (MIT) found that FRT from giants like IBM and Microsoft is less accurate when identifying women, and many reports in the US have found such systems particularly poor at accurately recognising African-American women.
According to Vidushi Marda, a lawyer and researcher at the human rights organisation Article 19, “Multiple studies across the world have demonstrated that the accuracy rates of facial recognition algorithms are particularly low in the case of minorities, women and children, and the use of such technology in a criminal justice system, where vulnerable groups are over-represented, makes them susceptible to being subjected to false positives (being wrongly identified as a criminal).” A sketch of how such group-wise disparities are measured appears at the end of this article.
- Legality: Without legal safeguards, FRT risks undermining democratic values and rights. Mass surveillance threatens privacy and freedom of speech, and diminishes our ability to be anonymous in public, according to Privacy International.
Apar Gupta, executive director of the Internet Freedom Foundation, which has raised concerns over the dangers of facial recognition systems, said, “All of this is being done without any clear underlying legal authority and is in clear violation of the Right to Privacy judgment.”
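To make the bias concern above concrete: audits in the spirit of the MIT study compare error rates across demographic groups rather than reporting a single overall accuracy figure. The following is a minimal sketch of that comparison; the group names and counts are entirely hypothetical and do not come from any real evaluation.

```python
# Hypothetical illustration of a demographic bias audit. The counts are
# invented for illustration only; the point is that a single overall
# accuracy number can hide large differences between groups.

from typing import NamedTuple

class GroupResult(NamedTuple):
    group: str
    false_positives: int  # non-matches wrongly flagged as matches
    negatives: int        # total faces that should NOT have matched

results = [
    GroupResult("Group A", false_positives=5, negatives=1000),
    GroupResult("Group B", false_positives=40, negatives=1000),
]

for r in results:
    fpr = r.false_positives / r.negatives
    print(f"{r.group}: false positive rate = {fpr:.1%}")

# If Group B is also over-represented in police databases, its members face
# a double penalty: more comparisons, each with a higher chance of error.
```

This is the mechanism behind Marda's warning: when a group faces both a higher per-comparison error rate and over-representation in the databases being searched, false identifications concentrate on that group.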