As the details continue to emerge regarding Facebook's failure to protect its users' data from third-party misuse, a growing chorus is calling for new regulations. Mark Zuckerberg will appear in Washington to answer to Congress next week, and we expect lawmakers and others will be asking not only what happened, but what needs to be done to make sure it doesn't happen again.

As recent revelations from Grindr and Under Armour remind us, Facebook is hardly alone in its failure to protect user privacy, and we're glad to see the issue high on the national agenda. At the same time, it’s crucial that we ensure that privacy protections for social media users reinforce, rather than undermine, equally important values like free speech and innovation. We must also be careful not to unintentionally enshrine the current tech powerhouses by making it harder for others to enter those markets. Moreover, we shouldn’t lose sight of the tools we already have for protecting user privacy.

With all of this in mind, here are some guideposts for U.S. users and policymakers looking to figure out what should happen next.

Users Have Rights

Any responsible social media company must protect users’ privacy rights on its platform and make those rights enforceable. These five principles are a place to start:

Right to Informed Decision-Making

Users have the right to a clear user interface that allows them to make informed choices about who sees their data and how it is used. Any company that gathers data on a user should be prepared to disclose what it has collected and with whom that data has been shared. Users should never be surprised by a platform’s practices; the interface should make clear, before they share anything, exactly how their data will be used.

Right to Control

Social media platforms must ensure that users retain control over the use and disclosure of their own data, particularly data that can be used to target or identify them. When a service wants to make a secondary use of the data, it must obtain explicit permission from the user. Platforms should also ask their users' permission before making any change that could share new data about users, share users' data with new categories of people, or use that data in a new way.

Above all, data usage should be "opt-in" by default, not "opt-out," meaning that users' data is not collected or shared unless a user has explicitly authorized it. If a social network needs user data to offer functionality its users actually want, it should not have to resort to deception to get them to provide that data.
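To make the opt-in default concrete, here is a minimal sketch in TypeScript of what a settings model built around it might look like. The type and field names are hypothetical illustrations, not any real platform's code:

```typescript
// Hypothetical privacy settings: every sharing flag defaults to false
// ("opt-in"), so nothing is collected or shared until the user says so.
interface PrivacySettings {
  shareWithAdvertisers: boolean;
  shareWithApps: boolean;
  useForAdTargeting: boolean;
}

const OPT_IN_DEFAULTS: PrivacySettings = {
  shareWithAdvertisers: false,
  shareWithApps: false,
  useForAdTargeting: false,
};

// A secondary use is permitted only after the user explicitly enables it.
function maySharedWithApps(settings: PrivacySettings): boolean {
  return settings.shareWithApps;
}
```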

Right to Leave

One of the most basic ways that users can protect their privacy is by leaving a social media platform that fails to protect it. Therefore, a user should have the right to delete her data or her entire account. And we mean really delete: not just disabling access, but permanently eliminating the data from the service's servers.
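The difference between those two things is easy to see in code. Here is a minimal sketch, with hypothetical types and in-memory storage rather than any real service's systems, contrasting merely disabling access with genuine deletion:

```typescript
// Hypothetical account store illustrating soft vs. hard deletion.
interface Account {
  id: string;
  email: string;
  posts: string[];
  deactivated: boolean;
}

const accounts = new Map<string, Account>();

// "Soft delete": the account disappears from the UI, but every byte
// stays on the service's servers and can be re-enabled or mined later.
function deactivateAccount(id: string): void {
  const account = accounts.get(id);
  if (account) account.deactivated = true;
}

// "Hard delete": the record is actually removed. A real service would
// also have to purge backups, caches, and analytics copies.
function deleteAccount(id: string): void {
  accounts.delete(id);
}
```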

Furthermore, if users decide to leave a platform, they should be able to easily, efficiently, and freely take their uploaded information away and move it to a different one in a usable format. This concept, known as "data portability" or "data liberation," is fundamental to promoting competition and ensuring that users maintain control over their information even if they sever their relationship with a particular service. Of course, for this right to be effective, it must be coupled with informed consent and user control, so unscrupulous companies can’t exploit data portability to mislead you and then grab all of your data for unsavory purposes.
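What counts as a "usable format" is ultimately for the platforms and standards bodies to settle, but as a rough sketch, an export might be a documented, machine-readable structure along these lines. The field names here are illustrative assumptions, not an existing standard:

```typescript
// Hypothetical export: serialize a user's uploaded data into a
// machine-readable format that another service could import.
interface UserExport {
  profile: { name: string; email: string };
  posts: { createdAt: string; text: string }[];
  contacts: string[]; // the user's own data, not friends' private info
}

function exportUserData(data: UserExport): string {
  // JSON is one common choice; the point is an open, documented format.
  return JSON.stringify(data, null, 2);
}

const archive = exportUserData({
  profile: { name: "Alice", email: "alice@example.com" },
  posts: [{ createdAt: "2018-04-02T12:00:00Z", text: "Hello, world" }],
  contacts: ["bob@example.com"],
});
```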

Right to Notice

If users’ data has been mishandled or a company has suffered a data breach, users should be notified as soon as possible. While brief delays are sometimes necessary in order to help remedy the harm before it is made public, any such delay should be no longer than strictly necessary.

Right of Redress

Rights are not meaningful if there’s no way to enforce them. Avenues for meaningful legal redress start with (1) clear, public, and unambiguous commitments that, if breached, would subject social media platforms to unfair advertising, competition, or other legal claims with real remedies; and (2) elimination of tricky terms-of-service provisions that make it impossible for a user to ever hold the service accountable in court.

Many companies will say they support some version of all of these rights, but we have little reason to trust them to live up to their promises. So how do we give these rights teeth?

Start with the Tools We Have

We already have some ways to enforce these user rights, and they can point the way for the courts and for current regulators at the state and federal level to go further. False advertising laws, consumer protection regulations, and (to a lesser extent) unfair competition rules have all been deployed by private citizens, and ongoing efforts in that area may find a more welcome response from the courts now that the scope of these problems is more widely understood.

The Federal Trade Commission (FTC) has challenged companies over sloppy or fraudulent data practices. The FTC is currently investigating whether Facebook violated a 2011 consent decree governing its handling of user data. If it has, Facebook is looking at a fine that could genuinely hurt its bottom line. We should all expect the FTC to fulfill its duty to stand in for the rest of us.

But there is even more we could do.

Focus on Empowering Users and Toolmakers

First, policymakers should consider making it easier for users to have their day in court. As we explained in connection with the Equifax breach, too often courts dismiss data breach lawsuits based on a cramped view of what constitutes "harm." These courts mistakenly require actual or imminent loss of money due to the misuse of information that is directly traceable to a single security breach. If the fear caused by an assault can be actionable (which it can), so should the fear caused by the loss of enough personal data for a criminal to take out a mortgage in your name.

There are also worthwhile ideas about future and contingent harms in other areas of consumer protection law, as well as in medical malpractice and pollution cases, to name a few. If the political will is there, both federal and state legislatures can step up and create greater incentives for security and steeper downsides for companies that fail to take the necessary steps to protect our data. These incentives should include a prohibition on waivers in the fine print of terms of service, so that companies can’t trick or force users into giving up their legal rights in advance.

Second, let’s empower the toolmakers. If we want companies to reconsider their surveillance-based business model, we should put mechanisms in place to discourage that model. When a programmer at Facebook makes a tool that allows the company to harvest the personal information of everyone who visits a page with a "Like" button on it, another programmer should be able to write a browser plugin that blocks this button on the pages you visit. But too many platforms impose technical and legal barriers to writing such a program, effectively inhibiting third parties’ ability to give users more control over how they interact with those services. EFF has long raised concerns about the barriers created by overbroad readings of the Computer Fraud and Abuse Act, the Digital Millennium Copyright Act, and contractual prohibitions on interoperability. Removing those barriers would be good for both privacy and innovation.
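To make the Like-button example concrete, here is a rough sketch of what such a plugin's content script might look like. It assumes the common historical pattern of buttons embedded as iframes loaded from facebook.com's plugin endpoint; a production blocker would also need to handle SDK-rendered buttons and block requests at the network level:

```typescript
// Sketch of a browser-extension content script that removes embedded
// "Like" button iframes before they can report the visit.
function removeLikeButtons(root: ParentNode): void {
  const frames = root.querySelectorAll<HTMLIFrameElement>(
    'iframe[src*="facebook.com/plugins/like"]'
  );
  frames.forEach((frame) => frame.remove());
}

// Scrub buttons present at page load, then watch for ones added later.
removeLikeButtons(document);
new MutationObserver(() => removeLikeButtons(document)).observe(
  document.documentElement,
  { childList: true, subtree: true }
);
```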

Third, the platforms themselves should come up with clear and meaningful standards for portability—that is, the user’s ability to meaningfully leave a platform and take her data with her. This is an area where investors and the broader funding and startup community should have a voice, since so many future innovators depend on an Internet with strong interconnectivity where the right to innovate doesn’t require getting permission from the current tech giants. It’s also an area where standards may be difficult to legislate. Even well-meaning legislators are unlikely to have the technical competence or foresight to craft rules that will be flexible enough to adapt over time but tough enough to provide users with real protection. But fostering competition in this space could be one of the most powerful incentives for the current set of companies to do the right thing, spurring a race to the top for social networks.

Finally, transparency, transparency, transparency. Facebook, Google, and others should allow truly independent researchers to work with, black-box test, and audit their systems. Users should not have to take companies’ word for how their data is being collected, stored, and used.

Watch Out for Unintended Effects on Speech and Innovation

As new proposals bubble up, we all need to watch for ways they could backfire.

First, heavy-handed requirements, particularly requirements tied to specific kinds of technology (i.e., tech mandates), could stifle competition and innovation. Used without care, they could actually give even more power to today’s tech giants by ensuring that no new competitor could ever get started.

Second, we need to make sure that transparency and control provisions don’t undermine online speech. For example, any disclosure rules must take care to protect user anonymity. And the right to control your data should not turn into an unfettered right to control what others say about you, as so-called "right to be forgotten" approaches often become. If true facts, especially facts that could have public importance, have been published by a third party, requiring their removal may mean impinging on others’ rights to free speech and access to information. A free and open Internet must be built on respect for the rights of all users.

Asking the Right Questions

Above all, the guiding question should not be, "What legislation do we need to make sure there is never another Cambridge Analytica?" Rather, we should be asking, "What privacy protections are missing, and how can we fill that gap while respecting other important values?" Once we ask the right question, we can look for answers in existing laws, pressure from users and investors, and focused legislative steps where necessary. We need to be both creative and judicious, and take care that today’s solutions don’t become tomorrow’s unexpected problems.