Strong privacy legislation in the United States is possible, necessary, and long overdue. EFF emphasizes the following concrete recommendations for proposed legislation regarding consumer data privacy.
Three Top Priorities
First, we outline three of our biggest priorities: avoiding federal preemption, ensuring consumers have a private right of action, and using non-discrimination rules to avoid pay-for-privacy schemes.
No federal preemption of stronger state laws
We have long sounded the alarm against federal legislation that would wipe the slate clean of stronger state privacy laws in exchange for a single, weaker federal law. Avoiding such preemption of state laws is our top priority when reviewing federal privacy bills.
State legislatures have long been known as “laboratories of democracy,” and they are serving that role now for data privacy protections. In addition to producing strong laws, state lawmaking allows for a more dynamic dialogue as technology and social norms continue to change. Last year, Vermont enacted a law reining in data brokers, and California enacted its Consumer Privacy Act. Nearly a decade ago, Illinois enacted its Biometric Information Privacy Act. Many other states have passed data privacy laws, and many more are considering data privacy bills.
But some tech giants aren’t happy about that, and they are trying to get Congress to pass a weak federal data privacy law that would foreclose state efforts. They are right about one thing: it would be helpful to have one nationwide set of protections. However, consumers lose—and big tech companies win—if those federal protections are weaker than state protections.
Private right of action
It is not enough for government to pass laws that protect consumers from corporations that harvest and monetize their personal data. It is also necessary for these laws to have bite, to ensure companies do not ignore them. The best way to do so is to empower ordinary consumers to bring their own lawsuits against the companies that violate their privacy rights.
Often, government agencies will lack the resources necessary to enforce the laws. Other times, regulated companies will “capture” the agency, and shut down enforcement actions. For these reasons, many privacy and other laws provide for enforcement by ordinary consumers.
Non-discrimination rules
Companies must not be able to punish consumers for exercising their privacy rights. New legislation should include non-discrimination rules, which forbid companies from denying goods, charging different prices, or providing a different level of quality to users who choose more private options.
Absent non-discrimination rules, companies will adopt and enforce “pay-for-privacy” schemes. But corporations should not be allowed to require a consumer to pay a premium, or waive a discount, in order to stop the corporation from vacuuming up—and profiting from—the consumer’s personal information. Privacy is a fundamental human right. Pay-for-privacy schemes undermine this fundamental right. They discourage all people from exercising their right to privacy. They also lead to unequal classes of privacy “haves” and “have-nots,” depending upon the income of the user.
In addition to the three priorities discussed above, strong data privacy legislation must also ensure certain rights: the right to opt-in consent, the right to know, and the right to data portability. Along with those core rights, EFF would like to see data privacy legislation include information fiduciary rules, data broker registration, and data breach protection and notification.
Right to opt-in consent
New legislation should require the operators of online services to obtain opt-in consent to collect, use, or share personal data, particularly where that collection, use, or transfer is not necessary to provide the service.
Any request for opt-in consent should be easy to understand and clearly advise the user what data the operator seeks to gather, how they will use it, how long they will keep it, and with whom they will share it. This opt-in consent should also be ongoing—that is, the request should be renewed any time the operator wishes to use or share data in a new way, or gather a new kind of data. And the user should be able to withdraw consent, including for particular purposes, at any time.
Opt-in consent is better than opt-out consent. The default should be against collecting, using, and sharing personal information. Many consumers cannot or will not alter the defaults in the technologies they use, even if they prefer that companies do not collect their information.
Some limits are in order. For example, opt-in consent might not be required for a service to take steps that the user has requested, like collecting a user's phone number to turn on two-factor authentication. But the service should always give the user clear notice of the data collection and use, especially when the proposed use is not part of the transaction, like using that phone number for targeted advertising.
There is a risk that extensive and detailed opt-in consent requirements can lead to “consent fatigue.” Any new regulations should encourage entities seeking consent to explore new ways of obtaining meaningful consent and avoiding that fatigue. At the same time, research suggests companies are becoming skilled at manipulating consent flows to steer users into sharing personal data.
Finally, for consent to be real, data privacy laws must prohibit companies from discriminating against consumers who choose not to consent. As discussed above, “pay-for-privacy” systems undermine privacy rules and must be prohibited.
Right to know
Users should have an affirmative “right to know” what personal data companies have gathered about them, where they got it, and with whom these companies have shared it (including the government). This includes the specific items of personal information and the specific third parties who received it, not just categorical descriptions of the general kinds of data and recipients.
Again, some limits are in order to ensure that the right to know doesn’t impinge on other important rights and privileges. For example, there needs to be an exception for news gathering, which is protected by the First Amendment, when undertaken by professional reporters and lay members of the public alike. Thus, if a newspaper tracked visitors to its online edition, the visitors’ right-to-know could cover that information, but not extend to a reporter’s investigative file.
There also needs to be an effective verification process to ensure that an adversary cannot steal a consumer’s personal information by submitting a fraudulent right to know request to a business.
Right to data portability
Users should have a legal right to obtain a copy of the data they have provided to an online service provider. Such “data portability” lets a user take their data from a service and transfer or “port” it elsewhere.
One purpose of data portability is to empower consumers to leave a particular social media platform and take their data with them to a rival service. This may improve competition. Other equally important purposes include analyzing your data to better understand your relationship with a service, building something new out of your data, self-publishing what you learn, and generally achieving greater transparency.
Regardless of whether you are “porting” your data to a different service or to a personal spreadsheet, data that is “portable” should be easy to download, organized, tagged, and machine-parsable.
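As an illustration only, a minimal sketch of what a “portable” export might look like in practice: a structured, machine-parsable JSON bundle rather than an opaque blob. Every field name and the schema itself are hypothetical, not drawn from any real service’s export format.

```python
import json

def export_user_data(user_id, posts, contacts):
    """Return a machine-parsable export of the data a user provided.

    Hypothetical sketch: real services define their own schemas, but the
    qualities described above hold regardless -- the output is organized,
    tagged, and trivially re-parsable by a rival service or a spreadsheet.
    """
    return json.dumps(
        {
            "schema_version": "1.0",  # lets importers detect format changes
            "user_id": user_id,
            "posts": [
                {"created": p["created"], "text": p["text"], "tags": p["tags"]}
                for p in posts
            ],
            "contacts": sorted(contacts),  # deterministic ordering
        },
        indent=2,
        sort_keys=True,
    )

# A downstream tool (or the user) can round-trip the bundle with any
# off-the-shelf JSON parser -- no proprietary reader required.
export = export_user_data(
    "u123",
    posts=[{"created": "2019-01-02", "text": "hello", "tags": ["intro"]}],
    contacts=["bob", "alice"],
)
parsed = json.loads(export)
```

The design point is simply that portability lives or dies on format choices: a documented, versioned, text-based schema is what makes “porting” to a rival service or a personal spreadsheet realistic.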
Information fiduciary rules
One tool in the data privacy legislation toolbox is “information fiduciary” rules. The basic idea is this: When you give your personal information to an online company in order to get a service, that company should have a duty to exercise loyalty and care in how it uses that information.
Professions that already follow fiduciary rules—such as doctors, lawyers, and accountants—have much in common with the online businesses that collect and monetize users’ personal data. Both have a direct relationship with customers; both collect information that could be used against those customers; and both have one-sided power over their customers.
Accordingly, several law professors have proposed adapting these venerable fiduciary rules to apply to online companies that collect personal data from their customers. New laws would define such companies as “information fiduciaries.” However, such rules should not be a replacement for the other fundamental privacy protections discussed in this post.
Data broker registration
Data brokers harvest and monetize our personal information without our knowledge or consent. Worse, many data brokers fail to securely store this sensitive information, predictably leading to data breaches (like Equifax) that put millions of people at risk of identity theft, stalking, and other harms for years to come.
Legislators should take a page from Vermont’s new data privacy law, which requires data brokers to register annually with the government (among other significant reforms). When data broker registration and the right-to-know are put together, the whole is greater than the sum of the parts. Consumers might want to learn what information data brokers have collected about them, but have no idea who those data brokers are or how to contact them. Consumers can use the data broker registry to help decide where to send their right-to-know requests.
Data breach protection and notification
Given the massive amounts of personal information about millions of people collected and stored by myriad companies, the inherent risk of data theft and misuse is substantial. Data privacy legislation must address this risk. Three tools deserve emphasis.
First, data brokers and other companies that gather large amounts of sensitive information must promptly notify consumers when their data is leaked, misused, or stolen.
Second, it must be simple, fast, and free for consumers to freeze their credit. When a consumer seeks credit from a company, that company runs a credit check with one of the major credit agencies. When a consumer places a credit freeze with these credit agencies, an identity thief cannot use the consumer’s stolen personal information to borrow money in the consumer’s name.
Third, companies must have a legal duty to securely store consumers’ personal information. Also, where a company fails to meet this duty, it should be easier for people harmed by data breaches—including those suffering non-financial harms—to take those companies to court.
Some Things To Avoid
Data privacy laws should not expand the scope or penalties of computer crime laws. Existing computer crime laws are already far too broad.
Any new regulations must be judicious and narrowly tailored, avoiding mandates that lock companies into particular technologies or designs.
Policymakers must take care that any of the above requirements don’t create an unfair burden for smaller companies, nonprofits, open source projects, and the like. To avoid one-size-fits-all rules, they should tailor new obligations to the size of the service in question. For example, policymakers might take account of the entity’s revenue, or the number of people whose data the entity collects.
Too often, users gain new rights only to effectively lose them when they “agree” to terms of service and end user license agreements that they haven’t read and aren’t expected to read. Policymakers should consider the effect such waivers have on the rights and obligations they create, and be especially wary of mandatory arbitration requirements.
There is a daily drip-drip of bad news about how big tech companies are intruding on our privacy. It is long past time to enact new laws to protect consumer data privacy. We are pleased to see legislators across the country considering bills to do so, and we hope they will consider the principles above.