[Image: NGI biometrics]

NextGov.com is reporting that the FBI will begin rolling out its Next Generation Identification (NGI) facial recognition service as early as this January. Once NGI is fully deployed and each of its approximately 100 million records also includes a photograph, it will become trivially easy to find and track Americans.

As we detailed in an earlier post, NGI expands the FBI’s IAFIS criminal and civil fingerprint database to include multimodal biometric identifiers such as iris scans, palm prints, photos, and voice data. The Bureau is planning to introduce each of these capabilities in phases (pdf, p.4) over the next two and a half years, starting with facial recognition in four states—Michigan, Washington, Florida, and North Carolina—this winter.

Why Should We Be Worried?

Despite the FBI's claims to the contrary, NGI will result in a massive expansion of government data collection for both criminal and noncriminal purposes. IAFIS is already the largest biometric database in the world: it includes 70 million subjects in the criminal master file and more than 31 million civil fingerprint records. Even allowing for duplicate entries and some overlap between civil and criminal records, the combined total covers close to one-third of the population of the United States. When NGI allows photographs and other biometric identifiers to be linked to each of those records, all easily searchable through sophisticated search tools, it will have an unprecedented impact on Americans' privacy interests.
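
For a rough sense of scale, the figures above can be checked with a quick back-of-the-envelope calculation (the population number below is an approximate 2011 estimate used only for illustration):

```python
# Back-of-the-envelope check of the scale described above. The figures are
# the approximate ones cited in this post, not official FBI counts.
criminal_subjects = 70_000_000   # criminal master file
civil_prints = 31_000_000        # civil fingerprint records
us_population = 311_000_000      # rough 2011 U.S. population estimate

combined = criminal_subjects + civil_prints
print(f"Combined records: {combined:,}")                            # ~101,000,000
print(f"Share of U.S. population: {combined / us_population:.0%}")  # ~32%
```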

Although IAFIS currently includes some photos, they have so far been limited to mug shots linked to individual criminal records. However, according to a 2008 Privacy Impact Assessment for NGI's Interstate Photo System, NGI will allow the submission of an unlimited number and variety of photos. Photos won't be limited to frontal mug shots; they may be taken from other angles and may include close-ups of scars, marks, and tattoos. NGI will allow law enforcement, correctional facilities, and criminal justice agencies at the local, state, federal, and even international level to submit and access photos, including in bulk. Once the photos are in the database, they can be found easily using facial recognition and text-based searches for distinguishing characteristics.

The new NGI database will also allow law enforcement to submit public and private security camera photos that may or may not be linked to a specific person's record. This means that anyone could end up in the database, even if they're not involved in a crime, simply by being in the wrong place at the wrong time or, for example, by engaging in political protest in areas like Lower Manhattan that are rife with security cameras.

The biggest change in NGI will be the addition of noncriminal photos. If you apply for any type of job that requires fingerprinting or a background check, your potential employer could require you to submit a photo to the FBI. And, as the 2008 PIA notes, “expanding the photo capability within the NGI [Interstate Photo System] will also expand the searchable photos that are currently maintained in the repository.” Although noncriminal information is ostensibly kept separate from criminal information, all of the data will reside in the NGI system, and presumably it would not be difficult to search it all at once. The FBI does not say whether there is any way to ever have your photo removed from the database.

Technological Advancements Support Even Greater Tracking Capabilities

According to an FBI presentation on facial recognition and identification initiatives (pdf, pp. 4-5) at a biometrics conference last year, one of the FBI's goals for NGI is to be able to track people as they move from one location to another. Recent advances in camera and surveillance technology will support this goal. For example, in a National Institute of Justice presentation (pdf, p.17) at the same 2010 biometrics conference, the agency discussed a new 3D binocular and camera that allows real-time facial acquisition and recognition at 1,000 meters. The tool wirelessly transmits images to a server, which searches them against a photo database and identifies the photo's subject. As of 2010, these binoculars were already in field testing with the Los Angeles Sheriff's Department. Presumably, the backend technology for these binoculars could be incorporated into other tools like body-mounted video cameras or the MORIS (Mobile Offender Recognition and Information System) iPhone add-on that some police officers are already using.
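
The workflow described above, capturing a face in the field, transmitting it to a server, and searching it against a photo database, is essentially a nearest-neighbor lookup over face templates. The sketch below shows only that matching step, assuming faces have already been reduced to fixed-length embedding vectors by some recognition model; the function, threshold, and toy data are illustrative and are not drawn from the FBI's or NIJ's actual systems.

```python
# Illustrative sketch of searching a probe face against a photo database.
# Assumes each enrolled photo has already been converted to an embedding
# vector; the names, threshold, and data here are made up for the example.
import numpy as np

def identify(probe, gallery, labels, threshold=0.6):
    """Return the label of the most similar gallery record, or None if the
    best cosine similarity falls below the decision threshold."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    similarities = gallery @ probe            # cosine similarity to each record
    best = int(np.argmax(similarities))
    return labels[best] if similarities[best] >= threshold else None

# Toy usage: three enrolled embeddings and one noisy probe of the second one.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 128))
labels = ["record_A", "record_B", "record_C"]
probe = gallery[1] + rng.normal(scale=0.05, size=128)
print(identify(probe, gallery, labels))       # expected: "record_B"
```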

[Image: NIJ facial recognition binoculars]

Private security cameras and the cameras already in use by police departments have also advanced. They are better able to capture the details and facial features needed to support facial recognition searches, and the software behind them allows photo manipulation that can improve the chances of matching a photo to a person already in the database. For example, gigapixel technology, which stitches many megapixel images (like those taken by security cameras) into a single panoramic photo, allows anyone viewing the photo to drill down to see and tag faces in even the largest crowd shots. And image enhancement software, already in use by some local law enforcement, can adjust photos "taken in the wild" (pdf, p.10) so they work better with facial recognition searches.
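
To make the stitch-and-drill-down idea concrete, here is a rough sketch of that workflow using the open-source OpenCV library; the frame filenames are placeholders, and this is not the commercial gigapixel or enhancement software the post refers to.

```python
# Illustrative stitch-then-detect sketch using OpenCV (not the commercial
# gigapixel product described above). Assumes a handful of overlapping
# frames saved as frame0.jpg, frame1.jpg, ... in the working directory.
import cv2

frames = [cv2.imread(f"frame{i}.jpg") for i in range(4)]   # placeholder files

# Stitch the overlapping frames into one large panorama.
stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(frames)
if status != cv2.Stitcher_OK:
    raise RuntimeError(f"Stitching failed with status {status}")

# "Drill down" by detecting face regions anywhere in the panorama.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
gray = cv2.cvtColor(panorama, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(panorama, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("panorama_with_faces.jpg", panorama)
print(f"Found {len(faces)} candidate faces in the stitched image")
```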

Cameras are also being incorporated into more and more devices that are capable of tracking Americans and can provide that data to law enforcement. For example, one of the largest manufacturers of highway toll collection systems recently filed a patent application to incorporate cameras into the transponder that sits on the dashboard of your car. This manufacturer's transponders are already in 22 million cars, and law enforcement already uses transponder data to track subjects. While a patent application does not mean the company is currently manufacturing or trying to sell the devices, it certainly shows the company is interested.

Data Sharing and Publicly-Available Information Will Supplement the FBI's Database

Data sharing between the FBI and other government agencies, along with the repurposing of photographs taken for noncriminal purposes, will further support the FBI's ability to track people as they move from one location to another. At least 31 states have already started using some form of facial recognition with their DMV photos, generally to stop fraud and identity theft, and the Bureau has already worked with North Carolina, one of the four states in the NGI pilot program, to track criminals using the state's DMV records. The Department of Justice came under fire earlier this year for populating the NGI database with noncriminal data from the Department of Homeland Security through the Secure Communities program, and it could be considering doing the same with facial-recognition-ready DMV photos. Even if the FBI does not incorporate DMV photos en masse directly into NGI, the fact that most states allow law enforcement access to these records, combined with the expansion of the FBI's own photo database, may make this point moot.

Commercial sites like Facebook that collect data and include facial recognition capabilities could also become a honeypot for the government. The FBI's 2008 Privacy Impact Assessment stated that the NGI/IAFIS photo database does not collect information from “commercial data aggregators”; however, the PIA acknowledges that this information could be collected and added to the database by other NGI users like state and local law enforcement agencies. Further, the FBI's 2010 facial recognition presentation (pdf, p.5) notes that another goal of NGI is to “Identify[ ] subjects in public datasets.” If Facebook falls into the FBI's category of a public dataset, it may have almost as much revealing information as a commercial data aggregator.

The Problem of False Positives in Large Data Sets

As the FBI's facial recognition database gets larger and as more agencies at every level of government rely on facial recognition to identify people, false positives—someone being misidentified as the perpetrator of a crime—will become a big problem. As this 2009 report (pdf) by Helen Nissenbaum and Lucas Introna notes, facial recognition 

performs rather poorly in more complex attempts to identify individuals who do not voluntarily self-identify . . . Specifically, the “face in the crowd” scenario, in which a face is picked out from a crowd in an uncontrolled environment, is unlikely to become an operational reality for the foreseeable future.

(p. 3). The researchers go on to note that this is not necessarily because the technology is not good enough but because "there is not enough information (or variation) in faces to discriminate over large populations." (p. 47) In layman's terms, this means that because so many people in the world look alike, the probability that any facial recognition system will regularly misidentify people becomes much higher as the data set (the population of people you are checking against) gets larger. German Federal Data Protection Commissioner Peter Schaar has noted that false positives in facial recognition systems pose a large problem for democratic societies: "[I]n the event of a genuine hunt, [they] render innocent people suspects for a time, create a need for justification on their part and make further checks by the authorities unavoidable." (p. 37)
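
The researchers' point about large populations can be illustrated with a little arithmetic: even a system with a very low false match rate per comparison will generate many false hits once every probe is compared against tens of millions of records. The false match rate below is a made-up figure, chosen only to show how the numbers scale.

```python
# How false positives scale with gallery size. The per-comparison false match
# rate (fmr) is an illustrative figure, not a measured one.
fmr = 0.001   # assume a 0.1% chance that any single comparison wrongly "matches"

for gallery_size in (1_000, 100_000, 10_000_000, 100_000_000):
    expected_false_hits = gallery_size * fmr
    p_at_least_one = 1 - (1 - fmr) ** gallery_size   # P(at least one false match)
    print(f"{gallery_size:>11,} records: ~{expected_false_hits:,.0f} expected "
          f"false hits per search, P(>=1 false match) = {p_at_least_one:.3f}")
```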

It appears it will take a few years for the FBI to bring NGI up to its full potential. In the meantime, we will continue to monitor this troubling trend.