The blockchain ecosystem has drastically changed over the last nine years, and the realities of today don’t closely resemble how many early enthusiasts imagined Bitcoin would evolve. People are no longer mining Bitcoin on their home laptops, and most people aren’t storing private keys on their own hard drives and then sending Bitcoin directly to friends and merchants. Instead, we’ve seen the rise of companies building software that handles these and other tasks on behalf of users. At the same time, creators are developing dozens of new tools to interact with the Bitcoin blockchain and many alternative blockchains. This in turn has inspired a wide array of new companies that mine, store, and exchange these alternative coins.

The result? Many users in the cryptocurrency space have traded banks and credit card networks for cryptocurrency exchanges, wallet providers, payment processors, and other software tools and companies that are relatively young and untested. Each of these stakeholders sets policies for how and when they’ll allow cryptocurrency storage or exchange, who is allowed to have an account, how and when accounts can be frozen, and how they’ll react to government regulation and demands for user data.

While blockchain protocols may be designed to favor censorship-resistance and autonomy, the real-world experience of most cryptocurrency users is dictated by the policies of a few centralized corporate intermediaries.

This post addresses policy concerns that companies and startups in this space should be thinking about early in their development. It speaks specifically to startups within the larger blockchain ecosystem that work at the transactional layer—the multitude of tools, businesses, and services being created atop distributed ledgers, such as communications platforms, methods for tracking assets, smart contracts, and other innovative projects within the blockchain space. It isn't aimed at projects working only at the protocol layer, since projects focused on developing decentralized protocols are dedicated (if they are genuine) to solving the very issues of centralization and power that we'll be addressing here. Some of the ideas in this post may also apply to other projects within the larger decentralized web space, where ideals of autonomy and decentralization are running up against the practical realities of businesses bringing products to market in a way that requires minimal effort from users.

The problem with centralization is that it creates potential points of failure. It means a small number of entities can be more easily pressured to censor speech or spy on users. Governments are a frequent source of that pressure, pushing companies either directly through regulation or legal demands or indirectly through scrutiny, threats, or requests for assistance. That pressure can include asking companies to build backdoors into their software to facilitate surveillance, requiring companies to shut down specific accounts or types of accounts, asking companies to keep open or freeze certain accounts or types of accounts, requiring or requesting detailed data on users, and more.

Make no mistake: governments are far from the only parties that can pressure a blockchain startup. Pressure may come from investors, advertising and business partners, outside advocacy groups, external pundits, users of the service, and even people who work at the company itself.

Blockchain companies that host content, open user accounts, and hold customer funds should take great care about when and how they'll cave to pressure. The networks they are built on top of may be decentralized and censorship-resistant, but they—as exchanges, merchant processors, or hosted wallet providers—are powerful choke points, capable of betraying the trust of their users. It's vital for leaders within these companies to examine their values and philosophies early on, before there is an emergency or significant public pressure. This will allow ample opportunity to discuss how the company can stand up for users, where it will embrace transparency, and how it will fulfill legal and ethical obligations to respond to government requests, all while there is still time for a nuanced and thoughtful analysis. We encourage leaders at blockchain startups to consider the unique challenges facing their own company, and to stretch themselves to take affirmative, strong steps to defend civil liberties in writing their policies.

While there are countless ways blockchain startups could consider user rights when developing policies, we offer the following initial concepts as a good starting place:

Transparency reports

Transparency reports are public reports from a company providing an overview of how many government requests the company received in a set period of time (such as a year). A company may also include other details, such as how many requests it complied with, how many accounts were affected, and any requests to censor or take down accounts. Transparency reports have become standard practice among Internet giants like Google, Facebook, and Twitter. If blockchain startups want to merit the trust of users, they should be embracing at least that level of transparency.

Applying the practice of transparency reporting to the blockchain space takes some creativity and flexibility. For example, many blockchain companies that store or transfer cryptocurrency on behalf of users may be required to file Suspicious Activity Reports with the U.S. government. While that information isn't traditionally included in a transparency report, disclosing how many such reports are filed would be vital to the public's understanding of the company's relationship to the government. At minimum, any company that has to file such reports should clearly and prominently tell its users that it is required by law to provide user data to the government and explain the circumstances under which it does so.

Other companies may rarely get demands for user data but could face less official requests to shut down certain accounts. Finding creative ways to best reflect these types of censorship requests—especially if the company complies with them—could help draw attention to the ways in which blockchain companies are facing pressure to stifle and surveil users.
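To make this concrete, the summary behind such a report can be tracked with a very simple structure. Below is a minimal, hypothetical Python sketch; the field names and categories are our own assumptions rather than any required format, and they simply mirror the kinds of counts discussed above: government data requests, how many were complied with, how many accounts were affected, Suspicious Activity Reports filed, and informal takedown or freeze requests.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TransparencyReport:
    """Hypothetical per-period summary a startup might publish.

    The field names and categories here are illustrative assumptions, not a
    standard format; adapt them to what your service actually receives.
    """
    period: str                           # e.g. "2018-H1"
    government_data_requests: int = 0     # legal demands for user data received
    requests_complied_with: int = 0       # demands for which some data was produced
    accounts_affected: int = 0            # distinct accounts covered by those demands
    suspicious_activity_reports: int = 0  # SARs filed, if the company must file them
    informal_takedown_requests: int = 0   # non-legal pressure to close or freeze accounts
    notes: List[str] = field(default_factory=list)  # narrative context for readers

# Example usage: a small service publishing its first report (numbers are made up).
report = TransparencyReport(
    period="2018-H1",
    government_data_requests=3,
    requests_complied_with=2,
    accounts_affected=2,
    suspicious_activity_reports=5,
    informal_takedown_requests=1,
    notes=["Aggregate counts only; no individual request is identified."],
)
print(report)
```

Even a structure this small forces a company to decide, up front, which categories of pressure it will count and disclose.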

Notifying users of government requests

When the government seeks access to user data, sometimes the user herself is the last to find out. That’s why EFF has long applauded policies that commit to notifying users when the government seeks access to their data. For example, in 2011 Twitter successfully fought for the right to tell Twitter users that the government was seeking access to their data as part of its WikiLeaks investigation. From this decision, we were able to learn that Birgitta Jónsdóttir was a target of this government data demand, and EFF took her as a client and fought for her privacy in the case. Since then, we’ve been urging other companies to adopt similar policies of informing users about government data demands.

The gold standard is to tell users before a company surrenders the data, with enough time that the user can secure legal counsel and challenge the request in court. There are exceptions to this—such as emergencies where someone’s in grave physical danger, or when an account has been compromised and so notice would be useless—but even when notice can’t be provided to users before data is shared with the government, it’s still a best practice to promise to notify a user after an emergency has ended. There will also be times when a company can’t provide notice because of a gag order, or because doing so would violate another law.

Many blockchain companies may think they don’t have user data that may be of interest to the government, especially because they may think that the most important data related to their application is already shared publicly in a blockchain. But there are many types of data and metadata that could attract attention. Do you have a list of the email addresses or device IDs of everyone who has downloaded your app? Do you have IP addresses in web server logs? Do you store records about who has used your service to submit or query specific pieces of data from a blockchain? If someone accesses your service from a mobile device, will you collect data about their geolocation or what services they accessed and when? And if you host communications, don’t assume that end-to-end encryption obviates all government interest; law enforcement may still seek to know with whom your users are conversing, when, and how often. Do you have logs of any of that?
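To make those questions concrete, consider what an ordinary web server writes to disk by default. The sketch below is illustrative only; the log line, endpoint path, and field names are assumptions rather than output from any real product, but a standard combined-format access log like this already ties an IP address and a timestamp to the exact record a user looked up.

```python
import re

# A typical "combined log format" access-log entry, as many web servers write
# by default (illustrative example, not from any real service).
log_line = (
    '203.0.113.42 - - [14/Jun/2018:10:27:32 +0000] '
    '"GET /api/v1/tx/4a5e1e4baab89f3a HTTP/1.1" 200 512 '
    '"-" "ExampleWallet/2.1 (Android 8.1)"'
)

# Even this single default log line links a user's network location to the
# specific transaction they looked up, plus their device/app fingerprint.
pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d+) \S+ '
    r'"[^"]*" "(?P<user_agent>[^"]*)"'
)

m = pattern.match(log_line)
if m:
    print("IP address: ", m.group("ip"))          # who asked
    print("Timestamp:  ", m.group("time"))        # when they asked
    print("Query:      ", m.group("path"))        # which record they asked about
    print("User agent: ", m.group("user_agent"))  # what device/app they used
```

If logs like these exist on your servers, they can be subpoenaed, even if the underlying blockchain data is public.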

Even startups in the blockchain space that think of themselves as small today, and that perhaps haven't yet received their first demand to shut down an account or their first warrant for user content, should start thinking through their notification practices now. Making affirmative commitments to notify users about government demands for their data ensures you'll know how you'll react the first time you get a warrant or subpoena for identifiable user content.

Commitment to freedom of expression

One of the key characteristics of decentralized blockchain technologies is that they are built to resist censorship. Editing data out of a blockchain is extremely costly, to the point of being nearly impossible. The result is a history of records that is difficult both to erase and to falsify.

But for the multitude of corporations who are using blockchain technologies, the issue is not so simple. Could a content-addressing system used by blockchain companies, such as IPFS, have default blacklists, supplied by governments or the IP lobbies? Could a government force a mining pool to reject a transaction? Could an exchange like Kraken be pressured to freeze the accounts of controversial online writers, whether they are known for publishing erotica or polarizing political viewpoints?

Each project within the larger ecosystem will need to think through its own policies around speech individually. But it’s useful to recognize early on that some of your users will be people you disagree with and that some of them will use your tools for purposes other than your initial intentions. Deciding how you’ll handle those circumstances early on, before you’re faced with a user you find annoying or abhorrent, could help you create fair policies that promote speech over personalities.

Swift, transparent appeals process

In May, EFF and a coalition of other civil liberties groups called on technology companies to adopt transparency and accountability around account shutdowns and content censorship. Some of the ideas in the Santa Clara Principles, as they are called, could also apply to many companies in the larger blockchain ecosystem. In particular, we urged companies to “provide human review of content removal by someone not involved in the initial decision, and enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.” Though these principles are focused on content removal, we think a transparent appeals process is an important value to consider when limiting or shutting down user accounts, or when otherwise restricting the ways users can participate in an online community or service.

We recognize that there are many different reasons for account closures and freezes, including government demands, fraud, and terms of service violations. We could imagine cryptocurrency exchanges that automatically limit or freeze accounts that exhibit certain behaviors, or wallet providers that flag accounts that receive an unusually large influx of assets from different sources in a brief period of time. But an account closure for a small business or individual can have potentially disastrous effects. Creating transparent and swift methods of appealing automated decisions can stave off some of the worst scenarios.
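One way to build the appeal path in alongside the automation itself is sketched below. The thresholds, field names, and workflow are purely hypothetical assumptions for illustration; the point is that every automated freeze records a plain-language reason and opens an appeal that a human, not the original system, must review within a set window.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative threshold only; any real limit would depend on the service.
DEPOSIT_SOURCES_FLAG_THRESHOLD = 25
HUMAN_REVIEW_DEADLINE = timedelta(hours=48)  # assumed appeal-review window

@dataclass
class FreezeDecision:
    account_id: str
    reason: str                     # plain-language reason shown to the user
    frozen_at: datetime
    appeal_id: str
    review_due_by: datetime         # deadline for human review of the appeal
    reviewer: Optional[str] = None  # filled in once a person takes the appeal

def maybe_freeze(account_id: str, distinct_deposit_sources_24h: int) -> Optional[FreezeDecision]:
    """Freeze only with a recorded reason and an automatically opened appeal."""
    if distinct_deposit_sources_24h < DEPOSIT_SOURCES_FLAG_THRESHOLD:
        return None
    now = datetime.utcnow()
    decision = FreezeDecision(
        account_id=account_id,
        reason=(f"Automated flag: {distinct_deposit_sources_24h} distinct deposit "
                f"sources in 24 hours exceeded the review threshold."),
        frozen_at=now,
        appeal_id=f"appeal-{account_id}-{int(now.timestamp())}",
        review_due_by=now + HUMAN_REVIEW_DEADLINE,
    )
    # In a real system: notify the user immediately, citing `reason` and
    # `appeal_id`, and route the appeal to a reviewer who was not involved
    # in building or tuning the automated rule.
    return decision
```

The specific heuristic matters less than the guarantees attached to it: the user learns why the freeze happened, and a deadline exists for a human to undo a mistaken automated decision.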

Minimizing data collection

Finally, one of the best tools that blockchain startups have when it comes to protecting users of their service is limiting how much data is collected. Taking steps such as collecting only the minimum data necessary for the service, deleting unnecessary data, allowing users to delete accounts fully, ensuring any backups are encrypted and deleted when appropriate, allowing user-side encryption where possible and appropriate, and otherwise limiting data collection can ensure that the government doesn’t see your service as a honeypot for surveillance. In addition, your users may actively choose your service because you’ve made commitments around data protection and deletion.

There are types of data you may need to collect—such as data you are required to collect or keep by law, data that users must provide in order for the service to function, and data necessary to prevent abuse of the system and fraud. Every blockchain startup will need to analyze its own data protection practices and make individual decisions about how and when to collect and keep information. But when you set up these systems, we urge you to remember that anything you keep could well be sought for unintended purposes, by parties ranging from government snoops to civil litigants. We urge you to seek ways to minimize unnecessary data collection and retention, and to tell users what data you are keeping.
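As a small, hypothetical illustration of minimization in practice, the sketch below shows one way a service could coarsen and expire its own logs. The specific choices, such as zeroing the last octet of an IPv4 address and a 30-day retention window, are assumptions for the sake of example, not recommendations for any particular legal regime.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # assumed retention window, purely illustrative

def truncate_ip(ip: str) -> str:
    """Coarsen an IPv4 address by zeroing the last octet before storage."""
    parts = ip.split(".")
    if len(parts) == 4:
        parts[3] = "0"
        return ".".join(parts)
    return "unknown"  # e.g. IPv6 or malformed input; handle separately in practice

def minimal_log_record(ip: str, path: str, now: datetime) -> dict:
    """Keep only what operating the service requires: a coarsened origin,
    the endpoint hit (without its parameters), and a timestamp."""
    return {
        "ip_prefix": truncate_ip(ip),
        "endpoint": path.split("?")[0],   # drop query parameters
        "timestamp": now.isoformat(),
    }

def purge_expired(records: list, now: datetime) -> list:
    """Delete records older than the retention window instead of keeping them forever."""
    cutoff = now - RETENTION
    return [r for r in records if datetime.fromisoformat(r["timestamp"]) >= cutoff]

# Example usage
now = datetime.utcnow()
records = [minimal_log_record("203.0.113.42", "/api/v1/tx/lookup?txid=abc123", now)]
records = purge_expired(records, now)
print(records)
```

Data that was never recorded, or that has already been purged, cannot be demanded later.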

These five concepts are far from exhaustive, but they do represent a few basic tenets of defending user rights that too often get lost in the shuffle during the early growth spurts of new companies. As blockchain technology kickstarts a range of innovations, we urge developers to remember that different actors within the space may have wildly different incentives to stand up for users, and those incentives may change over time. Do not assume that just because a blockchain is designed to be censorship-resistant, these concerns are irrelevant to the blockchain community. Instead, think of every company in the space as another potential pressure point that could be targeted by those who would squelch the speech, autonomy, and privacy of users. Proactive, pro-user policies adopted today can create the norms for the future of the larger blockchain economy and ecosystem.

Note that we'll be hosting a workshop on these ideas at the upcoming Decentralized Web Summit. Please be sure to register today and attend the workshop on Defending Your Users, featuring Protocol Labs' Marvin Ammori, Human Rights Watch's Cynthia Wong, and EFF's Rainey Reitman.

A number of people provided helpful insights, ideas, and feedback on early versions of this blog post. Deep thanks to Seth Schoen, Sydney Li, Joe Bonneau, and Peter Van Valkenburgh.