Facial Recognition Is Out of Control in India

A screen showing views from four surveillance cameras in an office building, with a list of identified faces on the right

In May 2021, during India’s second wave of COVID-19, SQ Masood and his father-in-law were returning home on his silver-colored motor scooter in Hyderabad, in the Indian state of Telangana. While riding through the busy lanes of Shahran, a Muslim-dominated neighborhood, they were pulled over by two police officers.

Masood wasn’t concerned at first, assuming he would be asked to show his driver’s license or the vehicle’s registration. So he and his father-in-law cooperated and got off the motor scooter.

But instead, they were asked to remove their masks so the police constables could photograph their faces with a handheld tablet.

“As a minority and as a Muslim in India in the current political scenario, I was disturbed because I didn’t know where my photograph had been stored, which department has my photograph and how they will use or misuse my data,” Masood told Motherboard.

In December 2021, after inquiring about the incident and receiving no response, Masood, a social activist and cofounder of the Muslim community nonprofit ASEEM, filed a lawsuit against the Indian state of Telangana.

Privacy advocates have sounded the alarm about police use of facial recognition technology (FRT) in Telangana, with Amnesty International warning that the capital city of Hyderabad is “on the brink of becoming a total surveillance city.”

The Hyderabad city police department is known for employing facial recognition for a variety of purposes, including questionable cordon-and-search operations, profiling people for narcotics, and unlawful phone searches. The department claims that facial recognition technology has worked as a “deterrent” and helped it apprehend criminals.

“We don’t infringe upon the privacy of any individual, as we are not barging into anybody’s house to take pictures,” C.V. Anand, Hyderabad’s police commissioner, told Reuters in January. “The technology is being used only to keep surveillance on criminals or suspected criminals.”

In Telangana, numerous facial recognition datasets are being integrated into a “smart governance” program called Samagram, which gives the state government a full picture of every resident’s life, including their employment status and other personal information. The goal isn’t only to track down criminals, but to build a “360-degree view” of every single person.

There isn’t much information about the program on Telangana’s official websites or in its publications. The project, formerly known as the Integrated People Information Hub, would use data from police records as well as “phone/water/electricity connections, tax payments, passports, voter IDs, RTA license and registration data, e-challans and even terrorist records,” as stated by then-Police Commissioner Mahendar Reddy in a now-deleted article in Telangana Today (regarded as a mouthpiece for the TRS government). A complete profile is created by combining this data with other identifying characteristics such as name, address, and phone number.

In 2019, Telangana’s IT Secretary described the use of these digital footprints: “We have created a best algorithm through which this machine learning capabilities has become so robust that today we have reached a level of almost 96-97 percent accuracy. So if you tell me one person’s name I can give his entire digital footprint at about 96 percent accuracy to them… this tool throws up the results in a matter of seconds and the tool also is very useful in doing what is called family tree analysis or relationship analysis.”
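As a rough illustration of the kind of record linkage such a program implies (not the actual Samagram implementation, whose design is not public), the sketch below joins hypothetical records from separate databases on a shared identifier to assemble a single profile; all names, numbers, and fields are invented.

```python
from collections import defaultdict

# Entirely hypothetical rows standing in for separate government databases,
# keyed on a shared identifier (here, a phone number).
police_records  = [{"phone": "9000000001", "name": "A. Kumar", "entry": "e-challan: red-light violation"}]
utility_records = [{"phone": "9000000001", "name": "A. Kumar", "entry": "electricity connection, Hyderabad"}]
tax_records     = [{"phone": "9000000001", "name": "A. Kumar", "entry": "property tax paid, 2020"}]

def build_profiles(*datasets):
    """Merge rows that share an identifier into one consolidated profile."""
    profiles = defaultdict(lambda: {"name": None, "entries": []})
    for dataset in datasets:
        for row in dataset:
            profile = profiles[row["phone"]]
            profile["name"] = row["name"]
            profile["entries"].append(row["entry"])
    return dict(profiles)

# One query on one identifier pulls together traffic, utility, and tax history.
print(build_profiles(police_records, utility_records, tax_records))
```

The point of the sketch is simply that once disparate databases share identifiers such as a name or phone number, combining them into a “360-degree view” is a trivial join, which is why oversight of who may run such queries matters.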

Police in Telangana have installed the most CCTV cameras in India, according to the Bureau of Police Research and Development’s Data On Police Organizations (DOPO) 2021 report. Telangana leads the country with 282,558 CCTV cameras, followed by Tamil Nadu with 150,254. Telangana accounts for half of all CCTV cameras installed in the country and has the largest number of facial recognition technology projects in India, according to a report by the human rights organization Article 19.

Masood said that the Telangana police department was maintaining a database of suspects, criminals, and ex-criminals by photographing people with TSCOP, a mobile application installed on officers’ smartphones.

“My main concern was that taking photographs of random civilians was a violation of the privacy rights of an individual,” said Masood. “There were no safeguards for the public, no redress mechanism, and no legal support or instructions for implementing FRT.”

In India, a biometric identity system called Aadhaar assigns a 12-digit number to each resident. But while Aadhaar was mandated by an act of Parliament, there are no such regulations or safeguards for facial recognition technology in the country.

In addition to capturing faces on CCTV cameras, the police can employ facial recognition at their own discretion. Several privacy advocacy groups, including India’s Internet Freedom Foundation, believe the technology is being used for mass surveillance rather than for any specific purpose, even as the government presents it as a solution to fight crime.

In April, Indian President Ram Nath Kovind signed the Criminal Procedure (Identification) Act into law, with the goal of making the criminal justice system “more effective” by linking it to technology.

The new law states that anyone involved in a criminal investigation, including those who have not been convicted, can be compelled to provide biometric identifiers such as fingerprint and palmprint impressions, iris and retina scans, biological samples, and behavioral traits such as handwriting.

Privacy rights activists oppose the collection of this information, which the government can retain for up to 75 years, longer than the country’s average life expectancy.

“If there is no adequate oversight, it can lead to harm, especially in the case of facial recognition technology where so much personal data has been collected and is being shared between departments that it can lead to 360-degree surveillance,” Anushka Jain, Associate Counsel for Transparency & Right to Information at the Internet Freedom Foundation, told Motherboard.

Jain explains that facial recognition systems can fail in two ways, returning either a false positive or a false negative.

“In the first example, an innocent person is accused. In a false negative, the system cannot identify a person. If facial recognition technology is used to verify Aadhaar or Voter ID, the system may fail to identify the individual, barring them from accessing government services or benefits,” says Jain.
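To make those two failure modes concrete, here is a minimal, purely illustrative sketch, not drawn from any system described in this story, of how a similarity threshold over face embeddings produces false positives and false negatives; the embeddings, names, and threshold value are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings (closer to 1.0 means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.9):
    """Return the best-matching enrolled identity if it clears the threshold.

    Set the threshold too low and strangers who merely look alike clear it
    (a false positive: an innocent person is flagged). Set it too high and a
    genuinely enrolled person fails to match (a false negative: they may be
    denied a service that depends on the match).
    """
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score

# Hypothetical four-dimensional embeddings, purely for illustration.
gallery = {
    "enrolled_person_a": np.array([0.9, 0.1, 0.2, 0.1]),
    "enrolled_person_b": np.array([0.1, 0.8, 0.3, 0.2]),
}
probe = np.array([0.85, 0.2, 0.25, 0.1])  # a face captured at a camera or service counter
print(identify(probe, gallery))
```

Real systems work in much higher-dimensional embedding spaces, but the trade-off is the same: wherever the threshold is set, some mix of wrongful matches and wrongful rejections remains.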

The Criminal Justice and Police Accountability Project is one organization in India that aims to end the disproportionate targeting of marginalized populations by the criminal justice system.

Researchers from the Criminal Justice & Police Accountability Project published an essay on the website of the Transnational Institute in 2021 detailing how, in Madhya Pradesh, India’s second-largest state, law enforcement’s use of biometrics and video monitoring is reinforcing the prejudices of India’s caste system.

In her work as a lawyer with the CPA, Sanjana Meshram has dealt with a number of people from communities branded as “habitual offenders” who have been jailed for minor offenses or had additional charges tacked on because of the police’s caste bias against these groups.

Meshram recounts one incident in which a police informant reported Sanju, a 19-year-old from the marginalized Gond tribe, for allegedly planning to steal bells from Hindu temples.

“The offense hadn’t even been committed but the police picked up Sanju and filed an FIR [First Information Report] against him for theft,” Meshram told Motherboard. “Not only that, but they even tacked on several other cases of temple bell theft on him. The police recorded his biometric data, took his pictures, and entered him into the list of suspected criminals in the CCTNS.”

The CCTNS, or Crime and Criminal Tracking Network and Systems, combines numerous dossiers, such as history sheets and goonda files, with fingerprints, footprints, and information on the family members of the accused.

A goonda is a term police use to describe people who they believe are more prone to engage in acts of public violence and rioting, such as assault or disturbance of the peace. Goonda is a Hindi word that loosely translates to “rowdy” or “hooligan,” and the police have used different Goonda Acts since 1926 to keep tabs on those who have been designated as such.

“The systems like facial recognition technology and CCTNS have led to a digitized caste system. To a caste-based police regime, the CCTNS is only a technical façade,” Meshram added.

According to Raman Jit Singh Chima, the Asia Policy Director and Senior International Counsel at the digital rights group Access Now, pushback against facial recognition technology needs to come from state governments across India.

“We do believe this is something where the state assemblies can and should pass legislation putting restrictions on […] police or government usage of facial recognition technology within their state,” Chima told Motherboard. “Because ultimately, India is a federal country and on this issue, the matter is not just something left in New Delhi, this is something where the states can take leadership but so far they haven’t done enough.”

Several private companies are providing facial recognition technology to Indian government agencies, including IDEMIA, NEC India, Staqu, Vision-Box, and Innefu Labs.

NEC India has claimed that its facial recognition algorithms could identify people even when their features were hidden by masks, while Staqu allegedly supplied facial recognition technology to law enforcement agencies in at least eight states, including Uttar Pradesh, where the police used facial recognition on people protesting the controversial Citizenship Amendment Act.

On September 25, 2020, Amnesty International sent a letter to Innefu requesting information on the company’s sales of surveillance technologies to government agencies, as well as the company’s human rights due diligence. Innefu’s statement indicated that it did not have a defined human rights policy.

Namrata Maheshwari, the Asia Pacific Policy Counsel at Access Now, says that the sprawling nature of the system makes it unclear where facial recognition data will ultimately end up—or who is responsible if it becomes compromised by hackers in a breach.

“If a cybersecurity incident happens, who is in charge of it? Whose duty is it to prevent it? And who do you go to when something goes wrong?” Maheshwari told Motherboard. “That’s not clear. Whether it is the vendor itself or a criminal hacker working in another nation, your personal data may be readily accessed by bad actors or other governments because of the free-for-all system that we now have here in India.”

Anushka Jain emphasizes that facial recognition technology must be brought before the country’s constitutional courts, because right now there are still no clear rules for the storage and use of facial data.

“If somebody’s data has been misused, then what do you do, where do you go, and where to register the complaint?” said Jain. “Right now there is nothing, no information on how FRTs can be used, which is very worrying.”