
Eye-Tracking Tech Is Another Reason the Metaverse Will Suck


With its much-hyped Meta rebrand, Facebook CEO Mark Zuckerberg made crystal clear that the company is going all-in on its vision for virtual social spaces. It’s not the first time a tech mogul has confidently heralded a virtual reality renaissance, in which people will supposedly inhabit online avatars and spend real-world money on digital furniture.

But this time around, advances in machine learning are promising to give tech companies access to entire categories of extremely intimate data—including biometrics like eye movements that can potentially reveal highly sensitive details about our preferences and mindset.


In a new paper, researchers from Duke University describe a system called EyeSyn that makes analyzing a person’s eye movements easier than ever before. Instead of collecting huge amounts of data directly from human eyes, however, the researchers trained a set of “virtual eyes” that mimic real eye movements. The system is fed templates for typical eye movement patterns—such as reading text, watching a video, or talking to another person—and then learns to match and recognize those patterns in actual humans.

In other words, the system uses example data to guess what a person is doing or looking at based entirely on their eye movements.

According to the researchers, this process removes some of the privacy concerns associated with capturing large amounts of biometric data for training algorithms. Instead of using huge, cloud-based datasets filled with human eye movements, the EyeSyn system is trained to recognize eye patterns from the template models loaded onto a local device. This also makes the system less resource-intensive, so that smaller developers can render virtual environments without huge amounts of computing power.
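To make that idea concrete, here is a minimal, hypothetical sketch of the kind of pipeline described above: a small classifier is trained entirely on synthetic “template” gaze traces for a few activities (reading, watching a video, holding a conversation) and then used to label a locally captured trace. The synthesize_trace and features helpers, the activity set, and the random-forest model are illustrative assumptions, not EyeSyn’s actual design.

```python
# Hypothetical sketch: train on synthetic gaze "templates" locally, then
# classify a recorded trace. Not EyeSyn's code; all details are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ACTIVITIES = ["reading", "video", "conversation"]

def synthesize_trace(activity, n_samples=500, rng=None):
    """Generate a toy (x, y) gaze trace that loosely mimics an activity."""
    rng = rng or np.random.default_rng()
    t = np.linspace(0, 1, n_samples)
    if activity == "reading":          # left-to-right sweeps with line returns
        x = (t * 10) % 1.0
        y = np.floor(t * 10) / 10
    elif activity == "video":          # gaze hovers near the screen center
        x = 0.5 + 0.05 * rng.standard_normal(n_samples)
        y = 0.5 + 0.05 * rng.standard_normal(n_samples)
    else:                              # conversation: jumps between face regions
        x = rng.choice([0.4, 0.6], n_samples) + 0.02 * rng.standard_normal(n_samples)
        y = rng.choice([0.3, 0.7], n_samples) + 0.02 * rng.standard_normal(n_samples)
    return np.column_stack([x, y])

def features(trace):
    """Summarize a trace with simple statistics (position spread, saccade size)."""
    velocity = np.diff(trace, axis=0)
    return np.concatenate([trace.mean(axis=0), trace.std(axis=0),
                           np.abs(velocity).mean(axis=0)])

# Train entirely on synthetic templates -- no human gaze data is collected.
rng = np.random.default_rng(0)
X = [features(synthesize_trace(a, rng=rng)) for a in ACTIVITIES for _ in range(50)]
y = [a for a in ACTIVITIES for _ in range(50)]
clf = RandomForestClassifier(random_state=0).fit(X, y)

# At runtime, classify a locally captured trace (here, another synthetic one).
print(clf.predict([features(synthesize_trace("reading", rng=rng))]))
```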

But the researchers also admit eye tracking can be used to create predictive systems that determine what catches a person’s attention—and potentially, infer deeply private details that they never intended to reveal.

“Where you’re prioritizing your vision says a lot about you as a person,” wrote Maria Gorlatova, one of the study’s authors, in a statement released by Duke University. “It can inadvertently reveal sexual and racial biases, interests that we don’t want others to know about, and information that we may not even know about ourselves.”

A previous study from 2019 goes further, concluding that tracking a person’s gaze “may implicitly contain information about a user’s biometric identity, gender, age, ethnicity, body weight, personality traits, drug consumption habits, emotional state, skills and abilities, fears, interests, and sexual preferences.”

As with other algorithmic systems that claim to infer internal states, such as emotion recognition, many machine learning experts are extremely skeptical about the accuracy of these predictions. But that’s likely not going to stop tech companies from deploying them anyway—especially platforms like Facebook, which make money by monitoring and predicting users’ behavior in order to show them ads.

“When it comes to Facebook/Meta they’ve long ago exhausted the assumption of good faith operations, particularly when it comes to privacy,” Dr. Chris Gilliard, a professor at Macomb Community College who studies algorithmic discrimination, told Motherboard. “When I think about Meta’s push to make the ‘metaverse’ a place where people live, work, and play, there are many nefarious and frankly discriminatory ways this is likely to play out.”

The researchers behind EyeSyn are not working with Facebook, and say they’re hoping to open up the technology to smaller companies entering the VR market. Speaking with Motherboard, Gorlatova noted that eye tracking is distinct from other technologies that predict emotions by observing the entire face; some of its oldest uses have been in product testing, psychological studies, and medical applications. But more recently, tech companies have taken a renewed interest in developing the technology to try to measure things like cognitive activity by tracking eye movements, blinking, and pupil dilation.

After it bought virtual reality company Oculus in 2014, Facebook said it had no plans to use biometric and motion sensor data to nudge user behavior or sell ads. But more recently, Facebook’s parent company Meta was granted several patents related to eye tracking and biometric sensors, and seems intent on using those types of metrics to bolster its ad platform in the Metaverse.

Gorlatova emphasizes that privacy needs to be built into eye tracking technologies from the very start. Specifically, she says data on eye movements should be processed locally on consumer-end devices, so that sensitive biometric information never makes it into the hands of Facebook or another third party.

“There are many promising techniques in this general space that train classifiers locally, without sending private data to the cloud … or add noise to the data before transmitting it to the cloud so that it does not reveal sensitive information about a specific user,” Gorlatova told Motherboard in an email. “I personally think that edge computing is the key to realizing many next-generation applications, including augmented reality specifically.”
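As a rough illustration of the second approach Gorlatova describes, the sketch below perturbs normalized gaze coordinates on-device before anything is transmitted, using the standard Laplace mechanism from local differential privacy. The privatize_gaze function, the epsilon value, and the data ranges are assumptions for illustration, not a technique taken from the paper.

```python
# Illustrative only: add Laplace noise to gaze data on-device so the raw
# signal never leaves the headset. Parameters are not from the EyeSyn paper.
import numpy as np

def privatize_gaze(trace, epsilon=1.0, sensitivity=1.0, rng=None):
    """Add Laplace noise to normalized (x, y) gaze coordinates on-device."""
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=trace.shape)
    return np.clip(trace + noise, 0.0, 1.0)  # keep coordinates in screen bounds

raw_trace = np.random.default_rng(0).uniform(size=(100, 2))  # stand-in gaze data
noisy_trace = privatize_gaze(raw_trace, epsilon=0.5)
# Only noisy_trace would ever be sent off-device; smaller epsilon means more noise.
```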