DEEPLINKS BLOG

EFF and Eight Other Privacy Organizations Back Out of NTIA Face Recognition Multi-Stakeholder Process

June 16, 2015

EFF, along with eight other consumer-focused privacy advocacy organizations, has backed out of the National Telecommunications and Information Administration's (NTIA) multi-stakeholder process to develop a privacy-protective code of conduct for companies using face recognition. After 16 months of active engagement in the process, we decided this week that it was no longer an effective use of our resources to continue in a process where companies wouldn't agree to even the most modest measures to protect privacy.

EFF was joined by ACLU; Center for Democracy & Technology; Center for Digital Democracy; Consumer Action; Consumer Federation of America; Consumer Watchdog; Common Sense Media; and Alvaro M. Bedoya, the executive director of the Center on Privacy & Technology at Georgetown University Law Center.

EFF has been actively engaged in encouraging companies and federal, state, and local law enforcement agencies to limit the use of face recognition. As our joint public statement notes:

We believe that people have a fundamental right to privacy. People have the right to control who gets their sensitive information, and how that information is shared. And there is no question that biometric information is extremely sensitive. You can change your password and your credit card number; you cannot change your fingerprints or the precise dimensions of your face. Through facial recognition, these immutable, physical facts can be used to identify you, remotely and in secret, without any recourse.

Despite the sensitivity of face recognition data, however, the federal government and state and local law enforcement agencies continue to build ever-larger face recognition databases. Last year the FBI rolled out its Next Generation Identification (NGI) biometric database with 14 million face images, and we learned through a Freedom of Information Act (FOIA) request that it plans to increase that number to 52 million images by this year. Communities such as San Diego, California, are using mobile biometric readers to take pictures of people on the street or in their homes, immediately identify them, and enroll them in face recognition databases. These databases are shared widely, and there are few, if any, meaningful limits on access.

EFF has been especially concerned about commercial use of face recognition because of the possibility that the data collected will be shared with law enforcement and the federal government. Several years ago, in response to a FOIA request, we learned that the FBI's standard warrant to social media companies like Facebook seeks copies of all images you upload, along with all images in which you're tagged. In the future, we may see the FBI seeking access to the underlying face recognition data instead.

Because of this concern, we hoped that, by participating in the NTIA process, we could encourage companies to place meaningful limitations on their use of face recognition. We, along with the other privacy groups, advocated for an opt-in regime, so that people could choose whether to participate in a face recognition database.

This is not an outlandish goal. Already Europe and two states—Illinois and Texas—require opt-in for commercial face recognition systems. However, the companies participating in the NTIA process would not agree that an opt-in system was appropriate even in the most extreme scenario: where companies that a consumer has never heard of use face recognition to identify and track people walking down public streets. This position is not only at odds with consumer expectations, current industry practices, and existing state law; it also calls into question whether companies, on their own, will ever agree to any limitations on their ability to use technology to track consumers.

One of the industry representative participants in the NTIA process has said he thinks that, even without EFF and other privacy and consumer advocacy groups, the company stakeholders “can reach consensus on transparency, notice, data security and giving users meaningful control over the sharing of their facial recognition information with anyone who otherwise would not have access.” However, given that there is no one left in the process to speak for consumers and the public, we question whether consumers will ever have meaningful control as a result of this process.

Here is a copy of our joint public statement.