Over the next few years, the Department of Homeland Security (DHS) plans to implement an enormous biometric collection program that will endanger the rights of citizens and foreigners alike. The agency intends to collect at least seven types of biometric identifiers, including face and voice data, DNA, scars, and tattoos, often drawn from questionable sources and gathered from innocent people.

But DHS isn’t building all of the technology itself: Northrop Grumman, a defense contractor, won the nearly $100 million, 42-month contract to “develop increments one and two” of the project, named HART (Homeland Advanced Recognition Technology). Now, a group of concerned investors is demanding that the company’s Board of Directors explain how it will protect human rights while building the technology behind this massive, privacy-invasive database.


The Tri-State Coalition for Responsible Investment (Tri-CRI) filed the shareholder resolution in November after discussions with company management failed. The resolution asks the board to explain how the company will abide by its human rights policy, which reflects the U.N.’s Universal Declaration of Human Rights, while building a database that will greatly expand DHS’s ability to surveil innocent people. The shareholders’ fears are well-founded: as EFF reported, HART will chill and deter people from exercising their First Amendment-protected rights to speak, assemble, and associate—all rights also recognized under the U.N.’s Universal Declaration of Human Rights and, presumably, by Northrop Grumman’s policy. If it is serious about its commitment to human rights, the shareholders say, the company must step up and explain how it squares that commitment with the harms inherent in this project. Voting on the resolution will take place May 15.

According to Northrop Grumman’s press release, “HART will offer a more accurate, robust way to identify adversaries” using “advanced, proven technologies.” But far from being proven or accurate, face recognition—one of the first components of HART to be built, and one DHS intends to rely on to identify people across a variety of its mission areas—has a high rate of inaccuracy, even according to DHS’s tests of its own systems, and is known to misidentify people of color and women at higher rates than others. In their shareholder resolution, Tri-CRI lays this out alongside other harms posed by the technology:

There are concerns that the algorithms used to identify facial images that may be stored in the database have inherent racial bias. The HART database will amplify the surveillance capabilities of government agencies, presenting risks to privacy and First Amendment rights and causing harm to immigrant communities. Through the provision of services through the DHS contract, Northrop Grumman may be linked or contribute to these adverse human rights impacts.

Shareholders aren’t alone in their concerns with this technology: a coalition of over 85 groups across a spectrum of civil society recently demanded that other companies, including Amazon, Microsoft, and Google, halt all government sales of face recognition technology. Amazon faces similar shareholder action later this month. And the pushback isn’t coming only from advocacy groups. Senators have asked DHS to pause at least one aspect of the HART program—the use of facial recognition in airports—until a proper rulemaking has been conducted. DHS’s Office of Inspector General (OIG) is investigating the proposal as well, to “assess whether biometric data collected at pilot locations has improved DHS's ability to verify departures.” The Government Accountability Office has criticized the reliability of DHS’s data, and Congress has even withheld funds in previous years from DHS’s Office of Biometric Identity Management. Major cities, including San Francisco and Oakland, are considering permanent bans on any city department using facial recognition technology—an indicator of how dangerous the technology is.

Facial recognition isn’t the only problem with HART. EFF filed comments criticizing DHS’s plan to collect, store, and share the biometric and biographic records it receives from external agencies, and to exempt this information from the federal Privacy Act, making it difficult, if not impossible, for individuals to review the data collected about them. In addition, the broad, vague scope of DHS’s proposal makes it exceedingly difficult for the American public to track and comment on the over-collection and overuse of biometric data.

Other data DHS is planning to collect—including information about people’s “relationship patterns” and from officer “encounters” with the public—can be used to identify political affiliations, religious activities, and familial relationships and friendships. These data points are also frequently colored by conjecture and bias. This, combined with the fact that the biometric data will be shared with federal agencies outside DHS, as well as with state and local law enforcement and foreign governments, poses a very real threat to First Amendment-protected activities. DHS must do more to minimize the threats to privacy and civil liberties posed by this vast new trove of highly sensitive personal data—and Northrop Grumman should look carefully at these and other criticisms when responding to shareholders’ concerns.

DHS’s legacy IDENT fingerprint database, which HART builds on, contains information on 220 million unique individuals—an enormous increase from 20 years ago, when IDENT contained information on only 1.8 million people. Between IDENT and other DHS-managed databases, the agency already manages over 10 billion biographic records and adds 10 to 15 million more each week. Northrop Grumman is set to help the agency—which includes the Transportation Security Administration (TSA), Customs and Border Protection (CBP), and Immigration and Customs Enforcement (ICE)—expand the scope of this unimaginably vast data trove exponentially, with new, questionable data sources gathered through ineffective and dangerous technology.

Private companies aren’t subject to the same pressure as elected officials and government agencies, so it’s unsurprising that shareholders are recognizing the serious harm to civil and human rights the company will be linked to through its work on this project. Transparency is often the first step toward accountability, and we are glad to see shareholders holding the company to account, pressing it to publicly report how it applies its human rights policy to its work building technology that endangers human rights.

Read the shareholder proposal [.pdf].