It’s time for governments to confront the harmful consequences of using facial recognition technology as an instrument of surveillance. Yet law enforcement agencies across the country are purchasing face surveillance technology with insufficient oversight—despite the many ways it harms privacy and free speech and exacerbates racial injustice.

EFF supports legislative efforts in Washington and Massachusetts to place a moratorium on government use of face surveillance technology. These bills would also ban a particularly pernicious kind of face surveillance: applying it to footage taken from police body-worn cameras. The moratoriums would remain in place unless lawmakers determined, after hearing directly from minority communities about the unfair impact face surveillance has on vulnerable people, that these technologies do not have a racially disparate impact.

We recently sent a letter to Washington legislators in support of that state’s moratorium bill.

We also support a proposal in the City of San Francisco that would permanently ban government use and acquisition of face surveillance technology.

EFF objects to government use of face surveillance technology for several reasons. These technologies can track everyone who moves through public spaces by means of a unique identifying marker that is difficult to change or hide: our own faces.

Monitoring public spaces with this technology will chill protests, an important form of free speech. Courts have long recognized that government surveillance has a “deterrent effect” on First Amendment activity.

Many governments already employ powerful spying technologies in ways that harm minority communities. This includes spying on the social media of activists, particularly advocates for racial justice such as participants in the Black Lives Matter movement. Also, police watch lists are often over-inclusive and riddled with errors, and cameras are often over-deployed in minority areas, effectively criminalizing entire neighborhoods. If past is prologue, we expect police will engage in racial profiling with face surveillance technology, too.

Governments often deploy these tools without proper consideration of their technological limits. Several studies, including research by Joy Buolamwini of the M.I.T. Media Lab and by the ACLU, show that face surveillance technologies are less accurate when identifying the faces of young people, women, and minorities. And these spying tools are increasingly being used in conjunction with powerful mathematical algorithms, which often amplify bias.

It’s important to consider all of these problems with face surveillance now. Once government builds this spying infrastructure, and starts harvesting and stockpiling a record of where we have been and who we were with, there is an inherent risk that thieves will steal this sensitive data, employees will misuse it, and policymakers will redeploy it in unforeseen ways.

For all of these reasons, companies shouldn’t sell face surveillance technology to governments. EFF supports the effort, led by the ACLU, to persuade companies to stop doing so.

Face surveillance erodes everyone’s privacy, chills free speech, and has an outsized negative impact on minority communities. So governments should not use these tools. Rather, they must face the facts about how damaging this surveillance technology is to the people they have a duty to protect.