If there's anything creepier than a drone flying up to your home and peering through your window, it's the thought of your own technology, from your cellphone and laptop camera to your car radio or even an implanted medical device, being turned on you for an even more intimate view of your private life. But the reaction last week to a drunken government intelligence employee borrowing his buddy's drone and crashing it onto the White House lawn is a reminder that shortsighted solutions to the first problem could exacerbate the second.

As the White House reacted to the drone crash with a call for more regulation, the manufacturer of the downed quadcopter announced it would push a firmware update to all of its units in the field, permanently preventing those drones from taking off or flying within 25 kilometers of downtown Washington, DC.
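To make the mechanics of that restriction concrete, here is a minimal sketch of how such a geofence check might work, assuming the updated firmware simply compares the drone's GPS fix against a hardcoded no-fly center using a great-circle distance. The coordinates, names, and logic below are illustrative assumptions, not DJI's actual implementation:

```python
import math

# Hypothetical no-fly zone: a hardcoded center point and radius.
# These values and all names below are illustrative, not DJI's code.
NO_FLY_CENTER = (38.8951, -77.0364)  # approx. downtown Washington, DC
NO_FLY_RADIUS_KM = 25.0
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def takeoff_permitted(gps_lat, gps_lon):
    """Refuse takeoff whenever the GPS fix falls inside the no-fly radius."""
    distance = haversine_km(gps_lat, gps_lon, *NO_FLY_CENTER)
    return distance > NO_FLY_RADIUS_KM

print(takeoff_permitted(38.8977, -77.0365))  # False: near the White House
print(takeoff_permitted(40.7128, -74.0060))  # True: New York City
```

The point is not the geometry but who runs the check: the test executes on hardware the customer paid for, against a list only the manufacturer can change.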

This announcement may have been an effort by the manufacturer, DJI, whose Phantom line is among the most popular consumer drones, to avoid bad press and more regulation. But it also reinforced the notion that people who "own" these drones don't really own anything at all. The manufacturer can add or remove features without the owners' agreement, or even their knowledge.

In this case, there are reasons to restrict the airspace above Washington, DC, so DJI's unilateral action may find support in community norms. But its action also underscores how your ownership of the stuff you buy is overridden by the manufacturer's ability to update or change it, a phenomenon spreading to anything with a networked computer. In 2015, that's a huge portion of the things in your life.

In the world of gadgets, this has become a well-known problem. Nearly five years ago, for example, Sony made headlines by pressuring PlayStation 3 owners to install an update that removed the console's ability to run other operating systems. People had been able to install GNU/Linux, and had even clustered PlayStations together to assemble powerful supercomputers. Sony removed that capability from consoles already sitting in people's homes.

A more alarming example may be your car. New cars come with numerous on-board computers that can be reprogrammed, but usually not by you, the owner. Tesla made waves last week by "texting" new code to its cars, updating an algorithm to improve acceleration. But the gee-whiz quality of that upgrade should be tempered by some uncomfortable realities.

One is a report in the New York Times last September, which documented the practice among subprime auto lenders of installing GPS trackers and "starter interrupt" devices that let them remotely locate and disable cars when, say, a borrower falls behind on payments or drives outside a certain area. The Times tells the story of one woman who couldn't bring her daughter to the hospital because she was three days late with a payment, and of another whose car was located and towed a day after she left the agreed-upon radius to flee an abusive boyfriend.

These examples involve companies changing the products they control because it's in their self-interest to do so. But of course, the threat comes not just from the manufacturer, but from anybody who can compel, coerce, or compromise its ability to issue those remote updates. These possibilities are not hypothetical. Just last week, BMW announced it would fix a vulnerability in its cars that allowed an attacker to hijack a remote unlock mechanism. And over a decade ago, the FBI attempted to take over OnStar voice-operated dashboard computers to snoop on drivers, a plan foiled only because the surveillance would have interfered with the devices' emergency operations. The government's ability to exploit official update channels for its own ends also goes back years, as examinations of the Stuxnet malware revealed.
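The underlying reason update channels concentrate so much power is simple: a device typically accepts any firmware image bearing a valid signature from the manufacturer's key, and nothing else. Here is a minimal sketch of that trust model, using the Python cryptography library's Ed25519 support; the function and the surrounding logic are illustrative assumptions, not any vendor's actual code:

```python
# A sketch of the single-key trust model behind most firmware updates.
# Illustrative only; requires the third-party 'cryptography' package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def accept_update(firmware_image: bytes, signature: bytes,
                  vendor_key: ed25519.Ed25519PublicKey) -> bool:
    """The installer trusts exactly one party: whoever holds the signing key.

    The owner's wishes never enter into it. Anyone who can compel the
    manufacturer, or steal its key, controls every device in the field.
    """
    try:
        vendor_key.verify(signature, firmware_image)
        return True   # signature checks out: flash it, no questions asked
    except InvalidSignature:
        return False  # owner-supplied or tampered firmware is rejected
```

Under this model the owner's only role is to receive whatever the keyholder signs, which is exactly why compromised or coerced update channels are so valuable to attackers and governments alike.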

Fundamentally, the problem here is a system where users don't have control over the technology they own and rely upon. That's not just about a certain technological architecture; it's about the legal system that props it up. In this case, one major problem is the anti-circumvention provision of the Digital Millennium Copyright Act (DMCA), which exists to support DRM software.

Without that anti-circumvention law, users could replace the firmware on their devices with new software that is trusted and auditable. Instead, the law casts a shadow of doubt over users who would modify that software, researchers who would examine it for security vulnerabilities, and companies that would create competitive alternatives. It's a law that has overflowed its banks, affecting technology that touches almost every aspect of our lives.

For evidence of that legal excess, look no further than the list of exemptions proposed in the DMCA's ongoing triennial rulemaking process. From security researchers worried that the DMCA keeps them from uncovering life-threatening vulnerabilities, to the Software Freedom Conservancy's request to access the operating system of so-called "smart" TVs, to many, many others, it's clear this law is no longer about "content," but about control. Control that's being denied to users.

(Yes, we're requesting exemptions for people to be able to repair and conduct security research on cars. Sign our petition to support those requests.)

The fate of small-drone flights over DC may seem like a little thing, a spat worked out among private players. But these small battles shape the notion of what it means to own something, and they illustrate the growing control manufacturers exert over user conduct.
