Shadow Regulations are voluntary agreements between companies (sometimes described as codes, principles, standards, or guidelines) to regulate your use of the Internet, often without your knowledge.
Shadow Regulation has become increasingly popular since the monumental failure of restrictive Internet laws such as ACTA, SOPA, and PIPA. This is because Shadow Regulation can impose restrictions that are as effective as any law, but without the need for approval by a court or parliament. Indeed, sometimes Shadow Regulation is even initiated by government officials, who present companies with an ultimatum: come up with a "voluntary" solution, or submit to government regulation.
Shadow Regulation is used in many different contexts, including copyright enforcement, regulation of "hate speech," and restriction of sales of lawful products, among others. You can see some examples of Shadow Regulation in the Blog section below, and in the interactive "Free Speech Weak Links" infographic linked in the sidebar.
What's wrong with these agreements? The crux of the problem is that they can quietly reshape our Internet without our knowledge or input. We weren't consulted during their development, we don't know how they are being applied, and we typically have little or no means of recourse when they are used to shut down our speech online. You can read more of our thoughts about Shadow Regulation in this Deeplinks post.
This doesn't mean that voluntary agreements with Internet companies are always a bad thing. Such agreements can be a positive way to avoid heavy-handed and inflexible regulation, and there are ways of reaching such agreements in an inclusive, balanced, and accountable way. We've developed some criteria for how these agreements can be done right, which we introduce in this Deeplinks post and summarize in the second infographic to the side.
But although voluntary agreements can be done right, more often Shadow Regulation is deliberately exclusive and opaque, resulting in private parties acting as the Internet police to enforce content removal or the restriction of online behavior, while elected governments avoid responsibility. That's both unfair and undemocratic, and EFF will be calling it out when we see it.