Last weekend’s Cambridge Analytica news (that the company was able to access tens of millions of users’ data by paying low-wage workers on Amazon’s Mechanical Turk to take a Facebook survey, which gave Cambridge Analytica access to Facebook’s dossier on each of those turkers’ Facebook friends) has hammered home two problems: first, that Facebook’s default privacy settings are woefully inadequate to the task of really protecting user privacy; and second, that ticking the right boxes to make Facebook less creepy is far too complicated. Unfortunately for Facebook, regulators in the U.S. and around the world are looking for solutions, and fast.

But there’s a third problem, one that platforms and regulators themselves helped create: the plethora of legal and technical barriers that make it hard for third parties (companies, individual programmers, free software collectives) to give users tools that would help them take control of the technologies they use.

Think of an ad-blocker: you view the web through your browser, and so you get to tell your web-browser which parts of a website you want to see and which parts you want to ignore. You can install plugins to do trivial things, like replacing the word “millennials” with “snake people,” and profound things, like making the web readable by people with visual impairments.
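That kind of plugin is only a few dozen lines of code. Here’s a minimal sketch (TypeScript, assuming a WebExtension manifest that registers it as a content script on every page) of what the “snake people” swap might look like:

```typescript
// content.ts: a sketch of a WebExtension content script that swaps
// "millennials" for "snake people" on any page it runs in. Assumes a
// manifest.json that injects this script into all URLs.
function swapWords(root: Node): void {
  // Visit only text nodes, so markup and attributes are left untouched.
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  for (let node = walker.nextNode(); node !== null; node = walker.nextNode()) {
    if (node.nodeValue !== null) {
      node.nodeValue = node.nodeValue
        .replace(/Millennials/g, "Snake People")
        .replace(/millennials/g, "snake people");
    }
  }
}

swapWords(document.body);

// Pages built by JavaScript add text after load, so re-run on changes.
new MutationObserver(() => swapWords(document.body)).observe(document.body, {
  childList: true,
  subtree: true,
});
```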

Ad-blockers are nearly as old as the web. In its early days, they broke the deadlock over pop-up ads, allowing users to directly shape their online experience and leading to the death of pop-ups as advertisers realized that serving a pop-up guaranteed that virtually no one would see their ad. We, the users, decided what our computers would show us, and businesses had to respond.

Web pioneer Doc Searls calls the current generation of ad-blockers “the largest consumer revolt in history.” The users of technology have availed themselves of tools that give them the web they want, not the web that corporations want them to have. The corporations that survive this revolt will be the ones who can deliver services that users are willing to use without add-ons that challenge their business-models.

In his 1999 classic Code and Other Laws of Cyberspace, Lawrence Lessig argued that our world is regulated by four forces:

  1. Law: what's legal
  2. Markets: what's profitable
  3. Norms: what's morally acceptable
  4. Code: what's technologically possible

Under ideal conditions, companies that do bad things with technology are shamed and embarrassed by bad press (norms); they face lawsuits and regulatory action (law); they lose customers and their share-price dips (markets); and then toolsmiths make add-ons for their products that allow us all to use them safely, without giving up our personal information, or being locked into their software store, or having to get repairs or consumables from the manufacturer at whatever price it names (code).

But an increasing slice of the web is off-limits to the “code” response to bad behavior. When a programmer at Facebook makes a tool that allows the company to harvest the personal information of everyone who visits a page with a “Like” button on it, another programmer can write a browser plugin that blocks this button on the pages you visit.
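Concretely, such a plugin can cancel the widget’s network request before the browser ever contacts Facebook. Here’s a hedged sketch of a Manifest V2 background script (TypeScript, assuming the @types/chrome typings and the webRequest/webRequestBlocking permissions); the URL pattern is an assumption about one common embed endpoint, not a complete blocklist:

```typescript
// background.ts: a sketch of a Manifest V2 extension background script
// that blocks iframe-style Facebook "Like" button embeds at the network
// layer, so the tracking request is never sent. Assumes the "webRequest"
// and "webRequestBlocking" permissions in manifest.json.
chrome.webRequest.onBeforeRequest.addListener(
  () => ({ cancel: true }), // drop the request entirely
  {
    urls: ["*://*.facebook.com/plugins/like*"], // assumed embed endpoint
    types: ["sub_frame"],                       // the widget loads as an iframe
  },
  ["blocking"]
);
```

Blocking at the network layer, rather than merely hiding the button after the page renders, is the point: the privacy harm is the request itself, which tells Facebook you visited the page whether or not you ever click Like.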

This week, we made you a tutorial explaining the tortuous process by which you can change your Facebook preferences to keep the company’s “partners” from seeing all your friends’ data. But what many folks would really like to do is give you a tool that does it for you: go through the tedious work of figuring out Facebook’s inscrutable privacy dashboard, and roll that expertise up in a self-executing recipe, a piece of computer code that autopilots your browser, logging in to Facebook on your behalf and ticking all the right boxes for you, with no need for you to do the fiddly work.
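Such a recipe could be a short browser-automation script. Here’s an illustrative sketch using Puppeteer; every selector and setting name below is a hypothetical placeholder, because Facebook’s real dashboard markup changes constantly (which is precisely why this expertise is worth bottling):

```typescript
// privacy-autopilot.ts: an illustrative sketch of a "self-executing
// recipe" that drives a real browser through Facebook's privacy settings.
// All selectors and the settings list are hypothetical placeholders, not
// Facebook's actual markup.
import puppeteer from "puppeteer";

// Hypothetical: checkboxes a privacy expert decided should be un-ticked.
const SETTINGS_TO_DISABLE = [
  'input[name="apps_others_use"]',  // placeholder selector
  'input[name="platform_sharing"]', // placeholder selector
];

async function lockDownPrivacy(email: string, password: string): Promise<void> {
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();

  // Log in on the user's behalf.
  await page.goto("https://www.facebook.com/login");
  await page.type('input[name="email"]', email);
  await page.type('input[name="pass"]', password);
  await Promise.all([page.waitForNavigation(), page.click('button[name="login"]')]);

  // Walk the privacy dashboard and un-tick every nasty default.
  await page.goto("https://www.facebook.com/settings?tab=privacy");
  for (const selector of SETTINGS_TO_DISABLE) {
    const box = await page.$(selector);
    if (box && (await box.evaluate((el) => (el as HTMLInputElement).checked))) {
      await box.click(); // one less box for the user to hunt down
    }
  }

  await browser.close();
}

lockDownPrivacy(process.env.FB_EMAIL ?? "", process.env.FB_PASS ?? "").catch(console.error);
```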

But they can’t. Not without risking serious legal consequences, at least. A series of court decisions (often stemming from the online gaming world, sometimes about Facebook itself) has made fielding code that fights for the user into a legal risk that all too few programmers are willing to take.

That's a serious problem. Programmers can swiftly make tools that let us express our moral preferences, pushing back against bad behavior long before any government official can be convinced to take an interest. And if your government never takes an interest, or if you are worried about the government's use of technology to interfere in your life, you can still push back, with the right code.

Today, we are living through a “techlash” in which the world has woken up to realize that a single programmer can make choices that affect millions, even billions, of people’s lives. America’s top computer science degree programs are making ethics an integral part of their curricula. The ethical epiphanies of geeks have profoundly shaped the way we understand our technology (if only all technologists were so concerned with the ethics of their jobs).

We need technologists to thoughtfully communicate technical nuance to lawmakers; to run businesses that help people master their technology; to passionately make the case for better technology design.

But we also need our technologists to retain the power to affect millions of lives for the better. Skilled toolsmiths can automate the process of suing Equifax, filing for housing aid after you’re evicted, fighting a parking ticket, or forcing an airline to give you a refund if your ticket’s price drops after you buy it (and that’s all just one programmer, and he hasn’t even graduated yet!).

When we talk about “walled gardens,” we focus on the obvious harms: an App Store makes one company the judge, jury and executioner of whose programs you can run on your computer; apps can’t be linked into, and they disappear from our references; platforms get to spy on you when you use them; opaque algorithms decide what you hear (and thus who gets to be heard).

But more profoundly, the past decade’s march to walled gardens has limited what we can do about all these things. We still have ad-blockers (but not for “premium video” anymore, because writing an ad-blocker that bypasses DRM is a potential felony), but we can’t avail ourselves of tools to auto-configure our privacy dashboards, or snoop on our media players to see if they’re snooping on us, or any of a thousand other useful and cunning improvements to our technologically mediated lives.

Because in the end, the real risk of a walled garden isn’t how badly it can treat us: it’s how helpless we are to fight back against it with our own, better code. If you want to rein in Big Tech, it would help immensely to have lots of little tech in use, showing how things might be if the giants behaved themselves. If you want your friends to stop selling their private information for a mess of pottage, it would help if you could show them how to have an online social life without surrendering their privacy. If you want the people who bet big on the surveillance business-model to go broke, there is no better way to punish them in the marketplace than by turning off the data-spigot with tools that undo every nasty default they set in the hopes that we'll give up and use products their way, not ours.