For many years, EFF has urged technology companies and legislators to do a better job at protecting the privacy of technology users and other members of the public. We hoped the companies, particularly mature players, would realize the importance of implementing meaningful privacy protections. But this year’s Cambridge Analytica scandal, following on the heels of many others, was the last straw.  Corporations are willfully failing to respect the privacy of technology users, and we need new approaches to give them real incentives to do better—and that may include updating our privacy laws.

To be clear, any new regulations must be judicious and narrowly tailored, avoiding tech mandates and expensive burdens that would undermine competition—already a problem in some tech spaces. To accomplish that, policymakers must start by consulting with technologists as well as lawyers. After the passage of SESTA/FOSTA, we know Congress can be insensitive to the potential consequences of the rules it embraces. Looking to experts would help.

Just as importantly, new rules must also take care not to sacrifice First Amendment protections in the name of privacy protections; for example, EFF opposes the “right to be forgotten,” that is, laws that force search engines to de-list publicly available information. Finally, one size does not fit all: as we discuss in more detail below, new regulations should acknowledge and respect the wide variety of services and entities they may affect.  Rules that make sense for an ISP may not make sense for an open-source project, and vice versa.

With that in mind, policymakers should focus on the following: (1) addressing when and how online services must acquire affirmative user consent before collecting or sharing personal data, particularly where that data is not necessary for the basic operation of the service; (2) creating an affirmative “right to know,” so users can learn what data online services have collected from and about them, and what they are doing with it; (3) creating an affirmative right to “data extraction,” so users can get a complete copy of their data from a service provider; and (4) creating new mechanisms for users to hold companies accountable for data breaches and other privacy failures. 

But details matter. We offer some below, to help guide lawmakers, users, and companies alike in properly advancing user privacy without intruding on free speech and innovation.

Opt-in Consent to Online Data Gathering

Technology users interact with many online services. The operators of those services generally gather data about what the users are doing on their websites. Some operators also gather data about what the users are doing on other websites, by means of tracking tools. They may then monetize all of this personal data in various ways, including targeted advertising and selling the bundled data, largely unbeknownst to the users who provided it.

New legislation could require the operator of an online service to obtain opt-in consent to collect, use, or share personal data, particularly where that collection, use, or sharing is not necessary to provide the service. The request for opt-in consent should be easy to understand and clearly advise the user what data the operator seeks to gather, how the operator will use it, how long the operator will keep it, and with whom the operator will share it. The request should be renewed any time the operator wishes to use or share data in a new way, or gather a new kind of data. And the user should be able to withdraw consent, including for particular purposes.
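To make those requirements concrete, here is a minimal sketch, in Python, of how a service might record opt-in consent. It is an illustration under assumed names (ConsentRecord, may_collect, and all the fields are hypothetical), not a blueprint any law should mandate:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """One user's opt-in consent for one specific use of their data."""
    user_id: str
    data_categories: list[str]   # what data the operator gathers, e.g. ["email", "location"]
    purpose: str                 # how the operator will use it
    retention_days: int          # how long the operator will keep it
    shared_with: list[str]       # with whom the operator will share it
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None  # users can withdraw consent later

    def is_active(self) -> bool:
        return self.withdrawn_at is None

def may_collect(records: list[ConsentRecord], user_id: str,
                category: str, purpose: str) -> bool:
    """A new kind of data, or a new use, has no matching record,
    so the operator must renew the request before proceeding."""
    return any(
        r.user_id == user_id
        and r.is_active()
        and category in r.data_categories
        and r.purpose == purpose
        for r in records
    )
```

On this model, each stated purpose has its own record, so a user can withdraw consent for one use while keeping another.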

Some limits are in order. For example, opt-in consent might not be required for a service to take steps that the user has requested, such as collecting a user's mailing address in order to ship a package they ordered. But the service should always give the user clear notice of the data collection and use, especially when the proposed use is not part of the transaction, like renting the shipping address to junk mailers.

Finally, there is a risk that extensive and detailed opt-in requirements can lead to “consent fatigue.” Any new regulations should encourage entities seeking consent to explore ways of making consent meaningful without wearing users down. At the same time, research suggests companies are becoming skilled at manipulating consent flows to steer users into sharing personal data.

Right to Know About Data Gathering and Sharing

Users should have an affirmative “right to know” what personal data companies have gathered about them, where they got it, and with whom these companies have shared it (including the government).
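As an illustration only, a disclosure answering those three questions (what was gathered, where it came from, and with whom it was shared) might be modeled like the sketch below; DataHolding and right_to_know_report are hypothetical names we chose for the example:

```python
from dataclasses import dataclass

@dataclass
class DataHolding:
    """One piece of personal data a company holds about a user."""
    category: str          # e.g. "location history"
    source: str            # where the company got it: the user, a tracker, a broker
    recipients: list[str]  # with whom it has been shared, including governments

def right_to_know_report(holdings: list[DataHolding]) -> str:
    """Render a plain-language disclosure answering all three questions."""
    lines = []
    for h in holdings:
        shared = ", ".join(h.recipients) if h.recipients else "no one"
        lines.append(f"- {h.category}: obtained from {h.source}; shared with {shared}")
    return "\n".join(lines)
```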

Again, some limits are in order to ensure that the right to know doesn’t impinge on other important rights and privileges. For example, there needs to be an exception for news gathering, which is protected by the First Amendment whether undertaken by professional reporters or lay members of the public. Thus, if a newspaper tracked visitors to its online edition, the visitors’ right to know could cover that tracking data, but it should not extend to a reporter’s investigative file.

Data Extraction

In general, users should have a legal right to extract a copy of the data they have provided to an online service. People might use this copy in myriad ways, such as self-publishing their earlier comments on social media. It might also help them better understand their relationship with the service provider.

In some cases, it may be possible for users to take this copy of their extracted data to a rival service. For example, if a user is dissatisfied with their photo storage service, they could extract a copy of their photos (and associated data) and take it to another photo storage service. In such cases, data portability may promote competition and, over time, improve services.

However, this right to extraction may need limits for certain services, such as social media, where different users’ data is entangled. For example, suppose Alice posts a photo of herself on social media, under a privacy setting that allows only certain people to see it, and Bob (one of those people) posts a comment on the photo. If Bob seeks to extract a copy of the data he provided to that service, he should get his comment, but not necessarily Alice’s photo.
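One rough sketch of how such an entanglement rule might work in code, assuming a deliberately simplified data model (Photo, Comment, and export_for are all hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Photo:
    owner: str              # Alice, in the example above
    visibility: list[str]   # the people allowed to see the photo
    image_bytes: bytes

@dataclass
class Comment:
    author: str             # Bob, in the example above
    text: str
    photo: Photo            # the photo the comment is attached to

def export_for(user: str, photos: list[Photo], comments: list[Comment]) -> dict:
    """Return only what this user provided: their own photos and comments.
    Bob's export includes his comment on Alice's photo, not the photo itself."""
    return {
        "photos": [p for p in photos if p.owner == user],
        "comments": [c.text for c in comments if c.author == user],
    }
```

On this model, Bob gets his own words back, while Alice keeps control over who sees her photo.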

Data Breach

Many kinds of organizations gather sensitive information about large numbers of people, yet fail to store it securely. As a result, such data is often leaked, misused, or stolen. Worse, some organizations fail to notify and assist the injured parties. Victims of data breaches often suffer financial and non-financial harms for years to come.

There are many potential fixes, some easier than others. An easy one: it should be simple and fast to get a credit freeze from a credit reporting agency, which helps prevent credit fraud after a data breach.

Also, where a company fails to adopt basic security practices, it should be easier for people harmed by data breaches—including those suffering non-financial harms—to take those companies to court.

Considerations When Drafting Any Data Privacy Law

  • One Size Does Not Fit All: Policymakers must take care that the requirements above don’t unfairly burden smaller companies, nonprofits, open source projects, and the like. To avoid that, they should consider tailoring new obligations to the size and purpose of the service in question. For example, policymakers might take account of the entity’s revenue, the number of people it employs, or the number of people whose data it collects, among other factors.
  • Private Causes of Action: Policymakers should consider whether to include one of the most powerful enforcement tools: Giving ordinary people the ability to take violators to court.
  • Independent Audits: Policymakers should consider requiring periodic independent privacy audits. Audits are not a panacea, and policymakers should attend to their limits.
  • Data Collection Is Complicated: Policymakers should consult with data experts so they understand what data can be collected and used, under what circumstances.
  • Preemption Should Not Be Used To Undermine Better State Protections: There are many benefits to having a uniform standard, rather than forcing companies to comply with 50 different state laws. That said, policymakers at the federal level should take care not to allow weak national standards to thwart better state-level regulations.
  • Waivers: Too often, users gain new rights only to effectively lose them when they “agree” to terms of service and end user license agreements that they haven’t read and aren’t expected to read. Policymakers should consider whether and how the rights and obligations they create can be waived, especially where users and companies have unequal bargaining power, and the “waiver” takes the form of a unilateral form contract rather than a fully negotiated agreement. We should be especially wary of mandatory arbitration requirements given that mandatory arbitration is often less protective of users than a judicial process would be.
  • No New Criminal Bans: Data privacy laws should not expand the scope or penalties of computer crime laws. Existing computer crime laws are already far too broad.

No privacy law will solve all privacy problems. And every privacy bill must be carefully scrutinized to ensure that it plugs existing gaps without inadvertently stifling free speech and innovation.