
Companies in the UK Are Mining Users’ Personal Data to Place Billboard Ads

A new investigation reveals that in the UK, the billboards watch you.

Companies in the UK are collecting data from millions of phones to decide which advertisements to show on billboards in locations all around Britain, according to a new investigation by Big Brother Watch, a London-based civil liberties group known for confronting public surveillance issues. 

The report details how personalized ads—a phenomenon that has more than once raised privacy concerns over digital spying—are no longer confined to our private feeds but have begun to spill over into our public lives.


“We’ve uncovered new ways in which millions of people’s movements and behaviors are tracked to target us with ads on the streets, resulting in some of the most intrusive advertising surveillance we’ve ever seen in the UK,” Jake Hurfurt, head of research and investigations at Big Brother Watch, said in a press release about the analysis. 

The report identifies several companies that have been among the first to bring facial-detection advertising technology to cities across the country. Unlike traditional billboards, whose advertisements are printed on paper or vinyl, digital billboards can be programmed to display more than one message. Many also carry high-definition cameras that peer down at the unsuspecting public. Algorithms then attempt to detect a person’s face, physical characteristics, and even what they might be wearing in order to tailor the advertisements shown to people walking down the street, shopping in malls, and even riding in cars with tablets mounted in the back seat.

ALFI, an American ad tech developer, already has many of these face-scanning tablets in Lyft and Uber vehicles across the US. The company claims that it uses AI and machine learning algorithms to analyze how its audience interacts with ads and to show them more relevant ones. It seems that, now more than ever, everything is a camera, and every camera is a computer.

The report also notes that two influential billboard owners in the UK, Ocean Outdoor and Clear Channel, rely on facial detection tools made by the French company Quividi. The company claims that its technology can scan up to 100 faces at once and detect how long someone dwelled near or paid attention to an ad. It also attempts to discern factors like age, gender, and mood—capabilities that have been heavily disputed and debunked by machine learning experts.


The report notes that this data, combined with crowd size and information on attentiveness, can be used to trigger changes in which ads are displayed, targeting audiences on a large scale.

While it’s one thing to recognize that predictive analytics may be controlling what we see and interact with in the comfort of our own homes, it’s another to realize that you and the people around you are being collectively influenced. Arvind Narayanan, a professor of computer science at Princeton University, says that one of the main problems with companies using data-gathering technologies to personalize billboards is that it “erodes the idea of public spaces.”

“It is hard to have spontaneous and casual social interactions with strangers when you’re staring at content targeted at you and you know you’re being surveilled,” Narayanan told Motherboard over email. “These technologies manage the feat of simultaneously harming our privacy and our sense of community.”

“The Quividi software relies on face detection, not on face recognition,” a Quividi spokesperson told Motherboard. “These are two different technologies. Face detection only looks for the presence of a face whereas facial recognition looks for and identifies a particular person.”

“This means that the Quividi software cannot recognize an individual, either in absolute terms (full identity) or in terms of repeated exposures (e.g. recognizing that someone was at a sequence of different locations, or visited the same location twice),” Quividi’s spokesperson said.


Targeted advertising is virtually unavoidable for anyone who owns a smartphone or computer, yet some experts say that the push to use our privacy against us didn’t begin with the advent of the internet or AI; in reality, the concept has been intertwined with capitalism for more than a century.  

“The whole point of surveillance advertising or digital advertising is to modify our behaviors in certain ways or modify our attitudes in certain ways,” Matthew Crain, an associate professor of media and communication at Miami University, told Motherboard. Presumptuous as the logic may be, Crain says, the more information a brand has about its potential audience, the less money it wastes sending ads to people or groups outside its target market.

The way businesses access our data is both sinister and surprisingly mundane; the report notes that companies use data-tracking apps and the vague language in their privacy policies to gain users’ “consent” to collect large amounts of data and generate advertising profiles. These profiles can include everything from how users interact with their apps to which stores they frequent most, and this secret amalgamation of our likes and dislikes is then sold off to data analytics companies to use indefinitely.

The inquiry also found that profiles of some interest groups are linked to GPS tracking data that allows brands to target people based on where and when they’re likely to be on a given day, crafting advertisements almost in real time. The report specifically calls out Adsquare, a German advertising tech company that has “pioneered” this phone-to-billboard strategy: an estimated 1 in 10 mobile devices in the UK contains trackers that send personal data back to the company. That means there are at least 8 million phones that could be sending location and behavioral data to Adsquare at any one time.


But these scarily efficient advances aren’t confined to the UK; the practice has already been documented both in the U.S. and in other places around the world. For instance, though Adsquare claims to comply with privacy laws regarding the use of these tracking tools, its data brokers include the controversial company X-Mode, now known as Outlogic, which was banned from Apple and Google’s app stores in 2020 for selling data to the US military.

Hurfurt said that the only way to force data harvesters to respect people’s privacy and give them real choices is to make radical and transparent reforms to the tech sector.  

Steven Feldstein, a senior fellow at the Carnegie Endowment for International Peace, agrees with that sentiment. “When it comes to surveillance, there's been a pretty significant regulatory lag when it comes to actually having the right rules in place and laws in place to regulate these industries,” Feldstein told Motherboard. “There's a real gap in terms of regulations catching up to practice, and making sure that the privacy needs of individuals are protected.”

Similar instances of companies abusing advertising data have since inspired public policy actors in the U.S. to speak up, fostering legislation that would make it illegal for advertising networks to target advertisements using personal data, as well as data based on protected-class information such as race, gender, and religion.

“It’s not that ad tracking doesn’t have a place in the digital ecosystem, but that right now, it’s so unbalanced in one direction,” says Feldstein. “There’s so little accountability, and there’s so little transparency of how it’s being used, and so little protection when it comes to consent, that it’s really out of whack, and I think it’s leading to troubling harms as a result.”

This article is part of State of Surveillance, made possible with the support of a grant from Columbia University’s Ira A. Lipman Center for Journalism and Civil and Human Rights in conjunction with Arnold Ventures. The series will explore the development, deployment, and effects of surveillance and its intersection with race and civil rights.