Tech

‘NO’: Grad Students Analyze, Hack, and Remove Under-Desk Surveillance Devices Designed to Track Them

In October, the university quietly introduced heat sensors under desks without notifying students or seeking their consent. Students removed the devices, hacked them, and forced the university to stop its surveillance.
Image via Tech Workers Coalition

Surveillance has been creeping unabated across schools, universities, and much of daily life over the past few years, accelerated by the COVID-19 pandemic. Back in October, however, graduate students at Northeastern University were able to organize and beat back an attempt at introducing invasive surveillance devices that were quietly placed under desks at their school.

Early in October, Senior Vice Provost David Luzzi installed motion sensors under all the desks at the school's Interdisciplinary Science & Engineering Complex (ISEC), a facility used by graduate students and home to the "Cybersecurity and Privacy Institute," which studies surveillance. These sensors were installed at night, without student knowledge or consent, and when pressed for an explanation, students were told this was part of a study on "desk usage," according to a blog post by Max von Hippel, a Privacy Institute PhD candidate who wrote about the situation for the Tech Workers Coalition's newsletter.

Academic institutions typically jockey for facilities, and those that are best funded or bring in the most grant money tend to win. The ISEC is a nice building, the computer science department brings in a lot of money and gets heavy use of it, so it may make sense for the university to study how its desks are used in order to expand or optimize access to the building.

Von Hippel told Motherboard, however, that desk usage can already be tracked because desks are assigned and badges are required to enter the rooms. Instead, he believes the sensors were a pretext for the administration, which owns the building, to push out computer science students who don't use it as much as others might.

"During the pandemic, a lot of computer science students stopped coming to the office so often and for good reason: it was unsafe to come for many students and, moreover, all we do is write computer code—we don't really need to be in the office. It was sort of bad optics,” von Hippel said. “If you walked around this big, beautiful glass building you'd look around and see a big empty building—but this is one of the buildings that Northeastern uses to advertise the school. You can see how this would bother the administration, so they'd want to move more students and people into the building, which is reasonable enough.”

In response, students began raising concerns about the sensors, and Luzzi sent out an email attempting to address them.

“In order to develop best practices for assigning desks and seating within ISEC, the Office of the Provost will be conducting a study aimed at quantifying the usage of currently assigned seating in the write-up areas outside of the labs and the computational research desks,” Luzzi wrote in the email. “The results will be used to develop best practices for assigning desks and seating within ISEC (and EXP in due course).”

To that end, Luzzi wrote, the university had deployed “a Spaceti occupancy monitoring system” that would use heat sensors at groin level to “aggregate data by subzones to generate when a desk is occupied or not.” Luzzi added that the data would be anonymized, aggregated to look at “themes” and not individual time at assigned desks, not be used in evaluations, and not shared with any supervisors of the students. Following that email, an impromptu listening session was held in the ISEC.

At this first listening session, Luzzi asked that grad student attendees "trust the university since you trust them to give you a degree." He also maintained that "we are not doing any science here" as another defense of the decision not to seek Institutional Review Board (IRB) approval.

"He just showed up. We're all working, we have paper deadlines and all sorts of work to do. So he didn't tell us he was coming, showed up demanding an audience, and a bunch of students spoke with him," von Hippel said. "He was pretty patronizing, ignored their concerns, and said it was really productive—that he was glad they were working together to find a solution, which was ridiculous because the only solution we'd accept was one where they got rid of the sensors."

After that, the students at the Privacy Institute, which specializes in studying surveillance and reversing its harms, started removing the sensors, hacking into them, and working on an open source guide so other students could do the same. Luzzi had claimed the devices were secure and the data encrypted, but Privacy Institute students found they were relatively insecure and unencrypted. "The students of this facility, including myself, the way that we get publications is that we take systems like this and we explore flaws in them. We explain what's bad about them, why they don't work, and so they could not have picked a group of students who were more suitable to figure out why their study was stupid," von Hippel said.

After hacking the devices, students wrote an open letter to Luzzi and university president Joseph E. Aoun asking for the sensors to be removed because they were intimidating, part of a poorly conceived study, and deployed without IRB approval even though human subjects were at the center of the so-called study.

“Resident in ISEC is the Cybersecurity and Privacy Institute, one of the world’s leading groups studying privacy and tracking, with a particular focus on IoT devices,” the letter reads. “To deploy an under-desk tracking system to the very researchers who regularly expose the perils of these technologies is, at best, an extremely poor look for a university that routinely touts these researchers’ accomplishments. At worst, it raises retention concerns and is a serious reputational issue for Northeastern.”

Another listening session followed, this time for professors only, where Luzzi claimed the devices were not subject to IRB approval because "they don't sense humans in particular - they sense any heat source." More sensors were removed afterward and assembled into a "public art piece" in the building's lobby spelling out "NO!"

Luzzi then sent an email scheduling another listening session to address students and faculty in response to the open letter, which had circulated and received hundreds of signatures, as well as to continued complaints and sensor removals. That listening session was, by all accounts, a disaster. In a transcript of the event reviewed by Motherboard, Luzzi struggles to quell concerns that the study is invasive, poorly planned, costly, and likely unethical. Luzzi claims that the university submitted a proposal to the IRB, which ensures that human research subjects' rights and welfare are protected, only to admit that this never happened when a faculty member reveals the IRB never received any submission. Luzzi also attempts to dismiss the concerns as particular to the Privacy Institute because "your lived experience is more desk-centric" than that of other graduate students.

Afterwards, von Hippel took to Twitter and shared what became a semi-viral thread documenting the entire timeline of events, from the secret installation of the sensors to the listening session that day. Hours later, the sensors were removed and Luzzi wrote one last email:

"Given the concerns voiced by a population of our graduate students around the project to gather data on desk usage in a model research building (ISEC), we are pulling all of the desk occupancy sensors from the building. For those of you who have engaged in discussion, please accept my gratitude for that engagement."

This was a particularly instructive episode because it shows that surveillance need not be permanent, and that it can be rooted out by the people affected by it, together. Von Hippel reasons that part of the students' success is owed to the fact that the computer science department is saturated with union members, even though many of the students involved were not formally unionized and the university's graduate students more generally are not represented by an officially recognized NLRB union. Still, graduate students are well positioned to extract concessions from universities whenever those universities impose onerous conditions or unethical demands.

“The most powerful tool at the disposal of graduate students is the ability to strike. Fundamentally, the university runs on graduate students. We either teach or TA a phenomenal amount of classes and you have these classes of hundreds of undergrads in them that literally cannot function without graduate students to grade the assignments,” von Hippel said. 

“The computer science department was able to organize quickly because almost everybody is a union member, has signed a card, and is networked together via the union. As soon as this happened, we communicated over union channels. We met personally and spoke in person about the problem, came up with a set of concrete actions we could take, and we took those actions: removing the sensors, hacking the sensors, having people write up meetings and share them online, and tweeting or writing about it together.”

This sort of rapid response is key, especially as more and more systems adopt sensors for increasingly spurious or concerning reasons. Sensors have been rolled out at other universities like Carnegie Mellon University, as well as public school systems. They’ve seen use in more militarized and carceral settings such as the US-Mexico border or within America’s prison system. 

These rollouts are part of what Cory Doctorow calls the "shitty technology adoption curve," whereby horrible, unethical, and immoral technologies are normalized and rationalized by being deployed on vulnerable populations for constantly shifting reasons. You start with people whose concerns can be ignored (migrants, prisoners, homeless populations), then scale upward to children in school, contractors, and non-unionized workers. By the time it reaches people whose concerns and objections would be loudest and most integral to its rejection, the technology has already been widely deployed.

Not every graduate student can strike or can afford to leave a program that refuses to halt the rollout of a surveillance program; as von Hippel told Motherboard, computer science PhDs will earn high salaries in industry regardless of whether they complete their program. But the infrastructure of collective action, such as unions, strike funds, and communication channels, makes all the difference in getting people together to figure out how best to fight back.