Police in York and Peel regions, both near Toronto, are buying controversial facial recognition software that will let cops compare photo and video evidence against a database of mugshots in the hopes of identifying suspected criminals.
The impending purchase comes as protests against racism and police brutality continue across Canada and the U.S., with many activists calling for the defunding of police. Research has shown that facial recognition systems are racially biased.
In February, York police admitted members of the force had used a free trial of the powerful facial recognition app Clearview AI, which had come under fire in the U.S. for building a large database of photos of U.S. citizens not accused of any wrongdoing, and making that database searchable for the thousands of clients to whom it has already sold the technology.
Public documents show that York Regional Police budgeted $1.68 million in 2019 to pay for a “Facial Recognition and Automated Palm and Fingerprint Identification System” over two years.
VICE reached out to police in both regions to ask how the forces will use facial recognition. Police in York said they are in the process of buying the technology. Peel police confirmed they will have their own licence to use the system.
“Our intention is to procure software to compare images and/or videos of crime scenes against a police mugshot database,” said a York police spokesperson. “The images/videos to be compared will already be in the lawful possession of York Regional Police and connected to a specific investigation.”
York police had initially denied using Clearview AI before making that February admission. Other police forces across Canada have also admitted to using Clearview.
Cops are also increasingly using in-house facial recognition systems to try to ID suspects caught on video or in still photographs. In May 2019 it was revealed that police in Toronto had been using facial recognition for over a year.
Experts in law and digital privacy say the use of facial recognition only serves to reinforce systemic racism present in the justice system.
“We already know from research that (Black, Indigenous, people of colour) are overrepresented in the Canadian judicial system and are more likely to be targeted by law enforcement,” said Toni Morgan, managing director at the Center for Law, Innovation and Creativity (CLIC) at Northeastern University. “When you add unregulated software that makes it OK to embed racial profiling into new technology, it’s not only an issue of privacy but an issue of history repeating. The (Greater Toronto Area) goes from a modern 21st century city to the Antebellum South.”
A 2019 study by the U.S. National Institute of Standards and Technology (NIST) tested nearly 200 facial recognition algorithms from 99 developers. It found that photos of Indigenous, Black, and Asian faces produced some of the highest rates of false matches, and that mugshots of Black and Asian people were misidentified 10 to 100 times more often than those of white people.
Amid ongoing Black Lives Matter protests, IBM announced it would no longer offer facial recognition technology to law enforcement, and Amazon put a one-year moratorium on police use of its deeply flawed Rekognition software. But other companies remain willing to supply police with the technology.
In recent weeks, “several prominent makers of face surveillance technology have essentially acknowledged that (facial recognition) facilitates mass surveillance, is harmful to privacy, (and) is also racially biased and feeds systemic racism in policing,” Brenda McPhail, director of the Privacy, Surveillance and Technology Project at the Canadian Civil Liberties Association (CCLA), said.
“No police force in Canada should be investing public funds in a technology that even its creators admit is so flawed in conception and design that its use will further systemic racism,” she added.
Documents show that Motorola and Veritone, among others, have submitted bids to supply police in York and Peel regions with facial recognition tech.
“The cost is one issue, but the real problem is the practice of normalizing the use of these technologies against our communities,” said Morgan. “Whether it’s $1.6 million or $1.60, all money used to further an agenda that justifies our dehumanization needs to be reconsidered.”
Follow Nathan Munn on Twitter.