Who Will Watch the Watchers?
Two UT hackers have built a safeguard against the privacy intrusions of camera-wired apps.
Ever get the feeling that your computer is watching you? As long as you’re running Skype, FaceTime, or some other application that captures video, it already is. In the near future, experts predict, the number of applications that access cameras and other perceptual hardware—from microphones to motion sensors—will explode. Just as our lives were changed in decades past by the rise of personal computing, then social media and smartphones, perceptual computing—applications and devices that see, hear, and feel their surroundings—may be the next big disruptive technology.
This technological leap forward brings great promise and new concerns. On the plus side, we’ll soon be able to control our machines with voice, gesture, or even eye motions instead of keystrokes and touch. Perceptual computing may even help long-dreamt-of advances like driverless cars finally take off. But first we need to figure out how to overcome some serious privacy risks.
“This software sees everything the camera sees,” warns UT computer science professor Vitaly Shmatikov. “That includes what it needs to function correctly, but also anything that falls in its field of vision—the contents of computer monitors, credit cards lying on the table, license plates in the street.”
So far, the leaders in perceptual computing are the biggest technology corporations in the world—Microsoft, Apple, Intel, and Google. While these Silicon Valley titans have long-standing privacy problems of their own, we can be reasonably confident that they won’t try to steal our credit card numbers from a camera feed.
“But think of a world where there’s an app store for perceptual apps,” says Suman Jana, a UT graduate student who works with Shmatikov. For instance, an office worker could pay $1.99 for an app that monitors her posture at her desk, download it to her desktop computer, and end up granting that app-maker all-day, every-day access to her computer’s camera feed. For each app, Jana says, “You’ll have to ask yourself, is it legitimate? Or is it doing something more, storing data, seeing things it’s not supposed to see? The risk is much bigger.”
Seen in these terms, perceptual computing can look like a privacy disaster waiting to happen. Shmatikov and Jana have emerged as thought leaders pointing a way out of the mess. Both are hackers by temperament, and sometimes by employment, who admit that they originally got into computer security partly for the thrill of destruction and subversion. Jana has participated in breaches of network security for Amazon and PayPal; Shmatikov’s hacks have been covered in the New York Times and the Economist.
“This is one of the very few areas of science in general, and computer science in particular, where you get to break things, attack them, take them apart,” Shmatikov says with a grin. He and Jana were, of course, invited to test the security systems they’ve broken into. They are paid hackers, employed to uncover the security flaws that more malevolent hackers would surely discover in time.
With their perceptual computing security project, nicknamed A Scanner Darkly, Shmatikov and Jana took things one step further. Their project proposes an intermediary console that lets consumers change how much perceptual information any given app can access.
Consider again the example of the office worker who buys a perceptual app that monitors her posture at her desk. Such an app would require all-day access to a computer’s camera hardware, but with A Scanner Darkly, the user would be able to limit the app’s access. Instead of seeing fine-grained details like faces and credit card numbers, the app could be limited to recognizing only coarse shapes, such as the contour of a human body.
With this proposed system, Shmatikov says, “You can look at the world through the application’s eyes. What is the application seeing? Show me. Then there is a knob that allows you to say, Oh boy, it seems like it’s seeing so much. Let me slide the knob so it sees less.”
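The knob Shmatikov describes can be thought of as an intermediary that transforms each camera frame before the application ever sees it. The sketch below is purely illustrative, not the actual A Scanner Darkly implementation: it treats a grayscale frame as a grid of pixel values and defines three hypothetical privacy levels, from the raw feed down to a shapes-only silhouette in which faces and printed text are unrecoverable.

```python
# Illustrative privacy "knob" for a camera intermediary.
# Level names, thresholds, and functions are assumptions for this sketch,
# not the real A Scanner Darkly code (which operates on live video).

def box_blur(frame, radius=1):
    """Average each pixel with its neighbors, washing out fine detail."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += frame[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def silhouette(frame, threshold=128):
    """Reduce the frame to a binary shape mask: a body's contour
    survives, but credit card digits and faces do not."""
    return [[1 if px >= threshold else 0 for px in row] for row in frame]

def apply_knob(frame, level):
    """level 0: raw feed; level 1: blurred; level 2: shapes only."""
    if level == 0:
        return frame
    if level == 1:
        return box_blur(frame)
    return silhouette(box_blur(frame))
```

A posture-monitoring app would be pinned at level 2: it still receives enough geometry to judge whether the user is slumping, but nothing legible ever crosses the boundary.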
The reception from the computer-security community has been enthusiastic. Shmatikov and Jana’s paper outlining this system won the 2014 PET Award, a prize supported by Microsoft for the best privacy-enhancing technology research paper of the year.
The coming privacy crisis in perceptual computing is exactly the sort of problem that modern research universities like UT were built to solve. “Most companies do not have any incentive to fix privacy problems,” Jana points out. “Unless you force them, they’re not going to put these fixes in their systems. It does not give any benefit to the company, except protecting them from lawsuits.”
Thanks to UT’s investment of time, money, and brainpower, the dystopian fears that Shmatikov and Jana raise may be avoided. Shmatikov now expects the approaches used in A Scanner Darkly to be implemented widely as perceptual computing continues to take hold.
“We’re academics,” Shmatikov says proudly. “Part of our job—this is something companies can’t afford to do, but we can—is to look ahead.”