Clearview AI extracts faceprints from billions of people, without their consent, and uses these faceprints to offer a service to law enforcement agencies seeking to identify suspects in photos. Following an exposé by the New York Times this past January, Clearview faces more than ten lawsuits, including one brought by the ACLU, alleging the company’s faceprinting violates the Illinois Biometric Information Privacy Act (BIPA). That watershed law requires opt-in consent before a company collects a person’s biometrics. Clearview moved to dismiss, arguing that the First Amendment bars this BIPA claim.

EFF just filed an amicus brief in this case, arguing that applying BIPA to Clearview’s faceprinting does not offend the First Amendment. Following a short summary, this post walks through our arguments in detail. 

Above all, EFF agrees with the ACLU that Clearview should be held accountable for invading the biometric privacy of the millions of individuals whose faceprints it extracted without consent. EFF has a longstanding commitment to protecting both speech and privacy at the digital frontier, and the case brings these values into tension. But our brief explains that well-settled constitutional principles resolve this tension.

Faceprinting raises some First Amendment interests, because it is the collection and creation of information for the purpose of later expression. However, as practiced by Clearview, this faceprinting does not enjoy the highest level of First Amendment protection, because it does not concern speech on a public matter, and the company’s interests are solely economic. Under the correct First Amendment test, Clearview may not ignore BIPA, because there is a close fit between BIPA’s goals (protecting privacy, speech, and information security) and its means (requiring opt-in consent).

Clearview’s Faceprinting Enjoys Some Protection

The First Amendment protects not just free expression, but also the necessary predicates that enable expression, including the collection and creation of information. For example, the U.S. Supreme Court has ruled that the First Amendment applies to reading books in libraries, gathering news inside courtrooms, creating video games, and buying newspaper ink by the barrel.

Thus, courts across the country have held that the First Amendment protects our right to use our smartphones to record on-duty police officers. In the words of one federal appellate court: “The right to publish or broadcast an audio or audiovisual recording would be insecure, or largely ineffective, if the antecedent act of making the recording is wholly unprotected.” EFF has filed many amicus briefs in support of this right to record, and published suggestions about how to safely exercise this right during Black-led protests against police violence and racism.

Faceprinting is both the collection and creation of information and therefore involves First Amendment-protected interests. It collects information about the shape and measurements of a person’s face. And it creates information about that face in the form of a numerical representation. 
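
To make this concrete, here is a minimal sketch of how a faceprint can be collected, created, and compared, using the open-source face_recognition library rather than Clearview’s proprietary system; the image file names are placeholders.

    # Illustrative sketch using the open-source face_recognition library,
    # not Clearview's system. "known.jpg" and "probe.jpg" are placeholders.
    import face_recognition

    # Collection: load photos containing faces.
    known_image = face_recognition.load_image_file("known.jpg")
    probe_image = face_recognition.load_image_file("probe.jpg")

    # Creation: reduce each face to a 128-number vector -- the "faceprint."
    known_faceprint = face_recognition.face_encodings(known_image)[0]
    probe_faceprint = face_recognition.face_encodings(probe_image)[0]

    # Identification: compare faceprints by distance; smaller means more alike.
    distance = face_recognition.face_distance([known_faceprint], probe_faceprint)[0]
    print(f"Faceprint distance: {distance:.3f} (values below ~0.6 are often treated as a match)")

A service like Clearview’s performs this kind of comparison at scale, matching a probe photo’s faceprint against a database of billions of stored faceprints.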

First Amendment protection of faceprinting is not diminished by the use of computer code to collect information about faces, or of mathematics to represent faces. Courts have consistently held that “code is speech,” because, like a musical score, it “is an expressive means for the exchange of information and ideas.” EFF has advocated for this principle from its founding through the present, in support of cryptographers, independent computer security researchers, inventors, and manufacturers of privacy-protective consumer tech.

Clearview’s Faceprinting Does Not Enjoy The Strongest Protection

First Amendment analysis only begins with determining whether the government’s regulation applies to speech or its necessary predicates. If so, the next step is to select the proper test. Here, courts should not apply “strict scrutiny,” one of the most searching levels of judicial inquiry, to BIPA’s limits on Clearview’s faceprinting. Rather, courts should apply “intermediate scrutiny,” for two intertwined reasons.

First, Clearview’s faceprinting does not concern a “public issue.” The Supreme Court has repeatedly held that the First Amendment is less protective of speech on “purely private” matters, compared to speech on public matters. It has done so, for example, where speech allegedly violated a wiretapping statute or the common law torts of defamation and intentional infliction of emotional distress. The Court has explained that, consistent with the First Amendment’s core protection of robust public discourse, the universe of speech that involves matters of public concern is necessarily broad, but it is not unlimited.

Lower courts follow this distinction when speech allegedly violates the common law tort of publication of private facts, and when collection of information allegedly violates the common law tort of intrusion on seclusion. These courts have held that such privacy torts do not violate the First Amendment so long as they do not restrict discussion of matters of public concern.

Second, Clearview’s interests in faceprinting are solely economic. The Supreme Court has long held that “commercial speech,” meaning “expression related solely to the economic interests of the speaker and its audience,” receives “lesser protection” compared to “other constitutionally guaranteed expression.” Thus, when faced with First Amendment challenges to laws that protect consumer data privacy from commercial data processing, lower courts apply intermediate judicial review under the commercial speech doctrine. These decisions have frequently focused not just on the commercial motivation, but also on the absence of any matter of public concern.

To be sure, faceprinting can be the predicate to expression that is relevant to matters of public concern. For example, a journalist or police reform advocate might use faceprinting to publicly name the unidentified police officer depicted in a video using excessive force against a protester. But this is not the application of faceprinting practiced by Clearview.

Instead, Clearview extracts faceprints from billions of face photos, absent any reason to think that any particular person in those photos will engage in a matter of public concern. Indeed, the overwhelming majority of these people have not and will not. Clearview’s sole aim is to sell the service of identifying people in probe photos, devoid of journalistic, artistic, scientific, or other expressive purpose. It makes this service available to a select set of paying customers, who are contractually forbidden from redistributing the results.

In short, courts here should apply intermediate First Amendment review. To pass this test, BIPA must advance a “substantial interest,” and there must be a “close fit” between this interest and how BIPA limits speech.

Illinois Has Substantial Interests

BIPA advances three substantial government interests.

First, Illinois has a substantial interest in protecting biometric privacy. We have a fundamental human right to privacy over our personal information. But everywhere we go, we display a unique and indelible marker that can be seen from a distance: our own face. So corporations can use face surveillance technology (coupled with the ubiquity of digital cameras) to track where we go, who we are with, and what we are doing.

Second, Illinois has a substantial interest in protecting the many forms of expression that depend on privacy. These include the rights to confidentially engage in expressive activity, to speak anonymously, to converse privately, to confidentially receive unpopular ideas, and to confidentially gather newsworthy information from undisclosed sources. Police use faceprinting to identify protesters, including with Clearview’s help. Government officials can likewise use faceprinting to identify who attended a protest planning meeting, who visited an investigative reporter, who entered a theater showing a controversial movie, and who left an unsigned pamphlet on a doorstep. So Clearview is not the only party whose First Amendment interests are implicated by this case.

Third, Illinois has a substantial interest in protecting information security. Data thieves regularly steal vast troves of personal data. Criminals and foreign governments can use stolen faceprints to break into secured accounts that can be opened by the owner’s face. Indeed, a team of security researchers did this with 3D models based on Facebook photos.

There Is A Close Fit Between BIPA and Illinois’ Interests 

BIPA requires private entities like Clearview to obtain a person’s opt-in consent before collecting their faceprint. There is a close fit between this rule and Illinois’ substantial interests. Information privacy requires, in the words of the Supreme Court, “the individual’s control of information concerning [their] person.” The problem is our lost control over our faceprints. The solution is to restore our control, by means of an opt-in consent requirement.

Opt-in consent is far more effective at restoring this control than alternatives like opt-out consent. Many people won’t even know a business collected their faceprint. Of those who do, many won’t know they have the right to opt out, or how to do so. Even an informed person might be deterred because, as studies have shown, the opt-out process is often time-consuming, confusing, and frustrating. Indeed, many companies use “dark patterns,” purposefully designing the user experience to manipulate so-called “agreement” to data processing.

Thus, numerous federal appellate and trial courts have upheld consumer data privacy laws like the one at issue here because of their close fit to substantial government interests.

Next Steps

Moving forward, EFF will continue to advocate for strong biometric privacy laws, and robust judicial interpretations of those laws. We will also continue to support bans on government use of face surveillance, including (as here) acquisition of information from corporations that wield this dangerous technology. More broadly, Clearview’s faceprinting is another reminder of the need for comprehensive federal consumer data privacy legislation. Finally, EFF will continue to oppose ill-founded First Amendment challenges to such laws, as we’ve done here.

You can read our amicus brief in ACLU v. Clearview AI here.