Press Releases: February 2018
Independent Oversight, Privacy Protections Are Needed
San Francisco, California—Face recognition—fast becoming law enforcement’s surveillance tool of choice—is being implemented with little oversight or privacy protections, leading to faulty systems that will disproportionately impact people of color and may implicate innocent people for crimes they didn’t commit, says an Electronic Frontier Foundation (EFF) report released today.
Face recognition is rapidly creeping into modern life, and face recognition systems will one day be capable of capturing the faces of people—often without their knowledge—as they walk down the street, enter stores, stand in line at the airport, attend sporting events, drive their cars, and move through public spaces. Researchers at Georgetown Law estimated that one in every two American adults—117 million people—is already in a law enforcement face recognition system.
This kind of surveillance will have a chilling effect on Americans’ willingness to exercise their rights to speak out and be politically engaged, the report says. Law enforcement has already used face recognition at political protests, and may soon pair face recognition with body-worn cameras, use it to identify people in the dark, and use it to project what someone might look like from a police sketch or even a small sample of DNA.
Face recognition employs computer algorithms to pick out details about a person’s face from a photo or video to form a template. As the report explains, police use face recognition to identify unknown suspects by comparing their photos to images stored in databases and to scan public spaces to try to find specific pre-identified targets.
But no face recognition system is 100 percent accurate, and false positives—when a person’s face is incorrectly matched to a template image—are common. Research shows that face recognition misidentifies African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively. And because of well-documented racially biased police practices, all criminal databases—including mugshot databases—include a disproportionate number of African Americans, Latinos, and immigrants.
For both reasons, inaccuracies in face recognition systems will disproportionately affect people of color.
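The matching process the report describes—reducing a face to a template and comparing it against a database—can be sketched in a few lines. The code below is purely illustrative and is not any vendor's actual algorithm: the templates are tiny made-up vectors (real systems use vectors with hundreds of dimensions), and the names and threshold are hypothetical. It shows how a similarity threshold produces exactly the false positives the report warns about—a bystander whose template merely resembles the probe image clears the threshold alongside the true match.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face templates (feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(probe, database, threshold=0.95):
    """Return every enrolled identity whose template clears the threshold.

    Any threshold trades misses against false positives: set it low enough
    to catch disguised suspects and it will also flag innocent look-alikes.
    """
    return [name for name, template in database.items()
            if cosine_similarity(probe, template) >= threshold]

# Hypothetical enrolled templates (toy 3-dimensional vectors).
database = {
    "suspect":   [0.90, 0.10, 0.30],
    "bystander": [0.88, 0.12, 0.28],  # a similar face, but a different person
}
probe = [0.90, 0.10, 0.30]  # template extracted from a surveillance photo

print(find_matches(probe, database))  # the bystander matches too: a false positive
```

Because the bystander's vector points in almost the same direction as the probe, both identities clear the 0.95 threshold, so the innocent person is returned as a "match" alongside the suspect.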
“The FBI, which has access to at least 400 million images and is the central source for facial recognition identification for federal, state, and local law enforcement agencies, has failed to address the problem of false positives and inaccurate results,” said EFF Senior Staff Attorney Jennifer Lynch, author of the report. “It has conducted few tests to ensure accuracy and has done nothing to ensure its external partners—federal and state agencies—are not using face recognition in ways that allow innocent people to be identified as criminal suspects.”
Lawmakers, regulators, and policy makers should take steps now to limit face recognition collection and subject it to independent oversight, the report says. Legislation is needed to place meaningful checks on government use of face recognition, including rules limiting retention and sharing, requiring notification when face prints are collected, ensuring robust security procedures to prevent data breaches, and establishing legal processes governing when law enforcement may collect face images from the public without their knowledge, the report concludes.
“People should not have to worry that they may be falsely accused of a crime because an algorithm mistakenly matched their photo to a suspect. They shouldn’t have to worry that their data will end up in the hands of identity thieves because face recognition databases were breached. They shouldn’t have to fear that their every move will be tracked if face recognition is linked to the networks of surveillance cameras that blanket many cities,” said Lynch. “Without meaningful legal protections, this is where we may be headed.”
For the report:
Online version: https://www.eff.org/wp/law-enforcement-use-face-recognition
One pager on facial recognition: https://www.eff.org/document/facial-recognition-one-pager
Bad Copyright Law Prevents Innovators from Creating Cool New Tools
San Francisco - The Electronic Frontier Foundation (EFF) has launched its “Catalog of Missing Devices”—a project that illustrates the gadgets that could and should exist, if not for bad copyright laws that prevent innovators from creating the cool new tools that could enrich our lives.
“The law that is supposed to restrict copying has instead been misused to crack down on competition, strangling a future’s worth of gadgets in their cradles,” said EFF Special Advisor Cory Doctorow. “But it’s hard to notice what isn’t there. We’re aiming to fix that with this Catalog of Missing Devices. It’s a collection of tools, services, and products that could have been, and should have been, but never were.”
The damage comes from Section 1201 of the Digital Millennium Copyright Act (DMCA 1201), which covers digital rights management (DRM) technology. DRM was designed to block software counterfeiting and other illegal copying, and Section 1201 bans DRM circumvention. However, businesses quickly learned that by employing DRM they could prevent honest competitors from creating interoperable tools.
Right now, that means you could be breaking the law just by doing something as simple as repairing your car on your own, without the vehicle-maker’s pricey tool. Other examples include rightsholders forcing you to buy additional copies of movies you want to watch on your phone—instead of allowing you to rip the DVD you already own and are entitled to watch—or manufacturers blocking your printer from using anything but their official ink cartridges.
But that’s just the beginning of what consumers are missing. The Catalog of Missing Devices imagines things like music software that tailors a soundtrack to the audiobook you’re listening to, or a gadget that lets parents reprogram talking toys to replace canned, meaningless messaging.
“Computers aren’t just on our desktops or in our pockets—they are everywhere, and so is the software that runs them,” said EFF Legal Director Corynne McSherry. “We need to fix the laws that choke off competition and innovation with no corresponding benefit.”
The Catalog of Missing Devices is part of EFF’s Apollo 1201 project, dedicated to eradicating all DRM from the world. A key step is eliminating laws like DMCA 1201, as well as the international versions of this legislation that the U.S. has convinced its trading partners to adopt.
For the Catalog of Missing Devices: