Who wouldn’t enjoy a little X-ray vision, really? You could cheat at cards, for one. And that game where someone puts something under one of three cups and you have to guess where it is. Easy.
Of course, X-ray vision would come with a downside, in that you’d be spraying all your surveillance targets with radiation. So researchers at the MIT Computer Science and Artificial Intelligence Laboratory, actualizers of all things science fiction, have taken a different tack to seeing through walls: radio waves. By flinging ultra-low-power radio signals, 1,000 times milder than standard Wi-Fi, they can not only detect humans behind a wall, but track their movements in fine detail.
The system works not unlike aircraft radar. But instead of bouncing off planes and returning to the ground, the signal here travels through the wall, bounces off a human (we’re full of water, which radio signals have a hard time penetrating), and comes back through the wall and into a detector.
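The core radar idea is simple enough to sketch in a few lines: a reflection's round-trip time tells you how far away the reflector is. This toy calculation illustrates only the general principle described above, not the MIT system's actual signal processing; the function name and numbers are illustrative.

```python
# Toy sketch of the radar principle: round-trip time to distance.
SPEED_OF_LIGHT = 3.0e8  # meters per second

def reflector_distance(round_trip_seconds):
    """Distance to a reflector, given the signal's round-trip time."""
    # The signal travels out and back, so halve the total path length.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A reflection arriving 40 nanoseconds after transmission
# corresponds to a reflector roughly 6 meters away.
print(reflector_distance(40e-9))
```

In practice the hard part isn't this arithmetic, but separating that one faint human echo from everything else, which is exactly the noise problem the researchers describe next.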
It’s a simple concept that was difficult to execute—because once that signal makes it back to the researchers, it’s very, very noisy. “You’re not just receiving a reflection from the human body, you’re receiving reflections from everything,” says MIT CSAIL computer scientist Dina Katabi, coauthor on a new paper describing the process. “The reflection from the wall will be much, much bigger than the reflection from the signal that traversed the wall and reflected off the human body and traversed the wall again back toward you.”
Yeah, it’s messy. But that’s what neural networks are for. Classic supervised machine learning relies on labeled examples to train an AI—“this is a cat,” for instance, to teach it to recognize objects in photos. Or, in Silicon Valley, “hotdog” or “not hotdog.”
Radio signals are rather more … mysterious: You can’t just look at one and say, “Aha, an elbow!” So the researchers devised a clever workaround. They set up a camera to simultaneously record a person they were bombarding with radio signals. “From that image you can extract the key points of the body,” says Katabi. “We use annotations in the image as the teacher for the neural network that is working just with radio signal.” The AI trained on video could then be matched to the mess of radio signals, allowing it to associate those labeled body parts with the subtle radio reflections coming back through the wall. “Imagine you teaching a kid some math problem and suddenly he becomes smarter, he can solve problems that you can’t solve,” says Katabi.
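The teacher-student scheme Katabi describes can be sketched in miniature. The real system trains a deep neural network on actual radio reflections; this toy stands in a linear least-squares model for the "student," and fakes both the radio features and the camera-derived "teacher" keypoints with synthetic data, just to show the cross-modal supervision pattern—labels come from the camera, inputs come only from radio. All names and dimensions here are hypothetical.

```python
import numpy as np

# Hypothetical shapes: 200 synchronized frames, a 64-dim radio feature
# vector per frame, and (x, y) coordinates for 14 body keypoints.
rng = np.random.default_rng(0)
n_frames, radio_dim, n_keypoints = 200, 64, 14

radio_features = rng.normal(size=(n_frames, radio_dim))

# Stand-in for the "teacher": keypoints extracted from camera frames.
# Faked here with a hidden linear map so the sketch runs end to end.
hidden_map = rng.normal(size=(radio_dim, n_keypoints * 2))
teacher_keypoints = radio_features @ hidden_map

# The "student" never sees the images -- it learns to reproduce the
# teacher's annotations from radio features alone. A deep network in
# the real system; plain least squares in this sketch.
student_map, *_ = np.linalg.lstsq(radio_features, teacher_keypoints,
                                  rcond=None)
predicted = radio_features @ student_map

error = np.abs(predicted - teacher_keypoints).mean()
print(f"mean keypoint error: {error:.6f}")
```

Once trained, the student needs only the radio signal at inference time—which is what lets the real system keep working when the wall blocks the camera entirely.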
What you end up with is a human visualized as blobs, which correspond to points on the body, like knees and shoulders. The researchers then turn this into a stick figure that shows a person moving behind a wall in great detail. Such great detail, in fact, that the system could identify individuals 83 percent of the time, by first determining their unique features and movement style.
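Going from confidence "blobs" to a stick figure is conceptually simple: take the peak of each blob as that keypoint's location, then connect keypoints along a fixed skeleton. This sketch uses made-up keypoint names, grid size, and skeleton edges—illustrative choices, not the paper's—to show the idea.

```python
import numpy as np

# A reduced, illustrative keypoint set and skeleton.
KEYPOINTS = ["head", "l_shoulder", "r_shoulder", "l_knee", "r_knee"]
SKELETON = [("head", "l_shoulder"), ("head", "r_shoulder"),
            ("l_shoulder", "l_knee"), ("r_shoulder", "r_knee")]

def blob(center, size=32, sigma=2.0):
    """A Gaussian confidence blob centered on one body keypoint."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - center[0])**2 + (y - center[1])**2)
                  / (2 * sigma**2))

def extract_keypoints(heatmaps):
    """Pick the peak of each blob as that keypoint's (x, y) location."""
    coords = {}
    for name, hm in heatmaps.items():
        iy, ix = np.unravel_index(np.argmax(hm), hm.shape)
        coords[name] = (ix, iy)
    return coords

centers = {"head": (16, 4), "l_shoulder": (12, 10),
           "r_shoulder": (20, 10), "l_knee": (13, 24), "r_knee": (19, 24)}
heatmaps = {name: blob(c) for name, c in centers.items()}

coords = extract_keypoints(heatmaps)
# Stick-figure line segments, ready to draw.
segments = [(coords[a], coords[b]) for a, b in SKELETON]
print(coords["head"])
```

Tracked over time, those per-frame skeletons are what give the system its detailed picture of how a person moves.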
“It’s not just a location,” Katabi says. “It’s exact movements. So by looking at the gait, that is actually a feature that distinguishes one person from another in the same way your fingerprint distinguishes you.”
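Gait-as-fingerprint identification can be illustrated with a toy matcher: reduce each person's walk to a feature vector, then assign an unknown walk to the nearest enrolled profile. The real system learns its features from radio reflections and trains a classifier; the feature names, values, and nearest-neighbor rule here are all hypothetical.

```python
import numpy as np

# Hypothetical gait profiles: e.g. stride length, cadence, torso sway.
enrolled = {
    "alice": np.array([1.30, 0.62, 0.95]),
    "bob":   np.array([1.10, 0.80, 0.70]),
}

def identify(gait_features):
    """Return the enrolled person whose gait profile is nearest."""
    return min(enrolled,
               key=lambda name: np.linalg.norm(enrolled[name]
                                               - gait_features))

print(identify(np.array([1.28, 0.60, 0.90])))  # nearest to alice's profile
```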
Potentially invasive in the wrong hands, sure, but also potentially good for privacy in other applications. (In fairness, all the data they have collected so far has been anonymized and encrypted.) Imagine using it to non-intrusively keep tabs on the sleeping, eating, and moving schedules of an elderly parent, as well as signs of distress. “Think about the other extreme: You can deploy cameras everywhere in someone’s home and try to get similar information,” Katabi says. This radio system, after all, would be clothing-agnostic, since it only produces stick figures.
Seeing through walls would also be handy for robots—they could peer around corners to avoid running into people coming the other way. (Alternatively, you might see around corners with lasers or even by detecting subtle changes in light.)
Superman would be so proud. Or jealous. One of the two.