Researchers at Carnegie Mellon University developed a method for detecting the three-dimensional shape and movements of human bodies in a room, using only WiFi routers.
To do this, they used DensePose, a system for mapping all of the pixels on the surface of a human body in a photo. DensePose was developed by researchers in London and at Facebook AI. From there, according to their recently uploaded preprint on arXiv, they developed a deep neural network that maps the phase and amplitude of WiFi signals sent and received by routers to coordinates on human bodies.
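To give a rough sense of what that kind of mapping looks like in practice, here is a minimal sketch, not the authors' code, of a network that takes the amplitude and phase of WiFi channel readings and regresses body-surface coordinates. All layer sizes, tensor shapes, and names here are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: map WiFi amplitude/phase readings to body coordinates.
# Shapes and architecture are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

class WiFiToBodyCoords(nn.Module):
    def __init__(self, antennas=3, out_points=17):
        super().__init__()
        self.out_points = out_points
        # Amplitude and phase are stacked as two input channels
        # per transmit/receive antenna pair.
        in_channels = 2 * antennas * antennas
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Regress (u, v) surface coordinates for a fixed set of body points.
        self.head = nn.Linear(128, out_points * 2)

    def forward(self, csi):
        # csi: (batch, 2 * antennas^2, subcarriers, time_samples)
        features = self.encoder(csi).flatten(1)
        return self.head(features).view(-1, self.out_points, 2)

# Toy usage with random "WiFi" data, just to show the shapes involved.
model = WiFiToBodyCoords()
fake_csi = torch.randn(4, 18, 30, 100)   # batch of 4 fake channel recordings
coords = model(fake_csi)                  # (4, 17, 2) predicted coordinates
print(coords.shape)
```

The real system is considerably more involved, translating WiFi readings into image-like features and feeding them to a DensePose-style head, but the basic idea is the same: learn a function from radio signals to locations on the body.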
Researchers have been working on “seeing” people without using cameras or expensive LiDAR hardware for years. In 2013, a team of researchers at MIT found a way to use cell phone signals to see through walls; in 2018, another MIT team used WiFi to detect people in another room and translate their movements to walking stick-figures.
The Carnegie Mellon researchers wrote that they believe WiFi signals “can serve as a ubiquitous substitute” for normal RGB cameras when it comes to “sensing” people in a room. Using WiFi, they wrote, overcomes obstacles like poor lighting and occlusion that regular camera lenses face.
Interestingly, they position this advancement as progress in privacy rights: “In addition, they protect individuals’ privacy and the required equipment can be bought at a reasonable price,” they wrote. “In fact, most households in developed countries already have WiFi at home, and this technology may be scaled to monitor the well-being of elder people or just identify suspicious behaviors at home.”
They don’t mention what “suspicious behaviors” might include, if this technology ever hits the mainstream market. But considering companies like Amazon are trying to put Ring camera drones inside our houses, it’s easy to imagine how widespread WiFi-enabled human detection could be a force for good, or yet another exploitation of our privacy.