Employees at Google, Microsoft, and Amazon have raised public concerns about those companies assisting the U.S. military, law enforcement, and Immigration and Customs Enforcement (ICE) in deploying various kinds of surveillance technologies.

These public calls from employees raise important questions: what steps should a company take to ensure that government entities that purchase or license its technologies don’t misuse them? When should it refuse to sell to a governmental entity?

Tech companies must step up and ensure that they aren’t assisting governments in committing human rights abuses.

While the specific context of U.S. law enforcement using new surveillance technologies is more recent, the underlying questions aren’t. In 2011, EFF proposed a basic Know Your Customer framework for answering them. The context then was repressive foreign governments’ use of technology from U.S. and European companies to facilitate human rights abuses. EFF’s framework was cited favorably by the European Commission in its guide for technology companies implementing the United Nations' Guiding Principles on Business and Human Rights.

Now, those same basic ideas about investigation, auditing, and accountability can be, and should be, deployed domestically.

Put simply, tech companies, especially those selling surveillance equipment, must step up and ensure that they aren’t assisting governments in committing human rights, civil rights, and civil liberties abuses. This obligation applies whether those governments are foreign or domestic, federal or local.

One way tech companies can navigate this difficult issue is by adopting a robust Know Your Customer program, modeled on requirements that companies already have to follow in the export control and anti-bribery context. Below, we outline our 2011 proposal for sales to foreign governments, with a few updates to reflect the shift from an international to a domestic focus. Employees at companies that sell to government agencies, especially agencies with a record as troubling as ICE, may want to advocate for this as a process to protect against future corporate complicity.

We propose a simple framework:

  1. Companies selling surveillance technologies to governments need to affirmatively investigate and "know your customer" before and during a sale. We suggest customer investigations similar to those many of these companies are already required to conduct for their foreign customers under the Foreign Corrupt Practices Act and export regulations.
  2. Companies need to refrain from participating in transactions where their "know your customer" investigations reveal either objective evidence or credible concerns that the technologies provided by the company will be used to facilitate governmental human or civil rights or civil liberties violations.

This framework can be implemented voluntarily, and should include independent review and auditing, employee participation, and public reporting. A voluntary approach can be more flexible as technologies change and situations around the world shift. Nokia Siemens Networks has already adopted a Human Rights Policy that incorporates some of these guidelines. In a more recent example, Google's AI principles contain many of these steps, along with guidance about how they should be applied.

If companies don’t act on their own, however, and don’t act with convincing transparency and commitment, then a legal approach may be necessary. Microsoft has already indicated not only that it would be open to a legal (rather than voluntary) approach, but that such an approach is necessary. For technology companies to be truly accountable, a legal approach can and should include extending liability to companies that knowingly and actively facilitate governmental abuses, including through aiding-and-abetting liability. EFF has long advocated for corporate liability for aiding governmental surveillance, including internationally in the Doe v. Cisco case and domestically in our Hepting v. AT&T case.

Elaborating on the basic framework above, here are some guidelines:

[Note: These guidelines use key terms—Technologies, Transaction, Company, and Government—that are defined at the bottom and capitalized throughout.]

Affirmatively Investigate: The Company must have a process, led by a specifically designated person, to engage in an ongoing evaluation of whether Technologies or Transactions will be, or are being, used to aid, facilitate, or cover up human rights, civil rights, and civil liberties abuses (“governmental abuses”). 

This process needs to be more than lip service and needs to be verifiable (and verified) by independent outsiders. It should also include concerned employees, who deserve to have a voice in ensuring that the tools they develop are not misused by governments. This must be an organizational commitment, with effective enforcement mechanisms in place. It must include tools, training, and education of personnel, plus career consequences when the process is not followed. In addition, in order to build transparency and solidarity, a Company that decides to refuse (or discontinue) further service on the basis of these standards should, where possible, report that decision publicly, so that the public understands the decision and other companies can benefit from its evaluation.

The investigation process should include, at a minimum:

  1. Review what the purchasing Government and Government agents, and Company personnel and agents, are saying about the use of the Technologies, both before and during any Transaction. This includes, among other things, review of sales and marketing materials, technical discussions and questions, presentations, technical and contractual specifications, and technical support conversations or requests. For machine learning or AI applications, it must include review of training data and mechanisms to identify what questions the technology will be asked to answer or learn about. Examples include:
    1. Evidence in the Doe v. Cisco case, which arises from Cisco’s participation with the Chinese government in building surveillance tools aimed at identifying Falun Gong practitioners, includes presentations in which Cisco employees brag about how their technology can help the Chinese government combat the “Falun Gong Evil Religion.”
    2. In 2016, the ACLU of Northern California published a report outlining how Geofeedia advertised that its location-based social media surveillance system could be used by government offices and police to monitor the protest activities of activists, including specifically activists of color, raising core First Amendment concerns.
  2. Review the Technology’s capacity to facilitate human rights abuses and consider possible mitigation measures, both technical and contractual.
    1. For instance, the fact that facial recognition software misidentifies people of color at a much higher rate than white people is a clear signal that the Technology is highly vulnerable to governmental abuses (one way to measure such disparities is sketched after this list).
    2. Note that we do not believe that Companies should be held responsible merely for selling general purpose or even dual-use products to the government that are later misused, as long as the Company conducted a sufficient investigation that did not reveal governmental abuse as a serious risk.  
  3. Review the Government’s laws, regulations, and practices regarding surveillance, including approval of purchases of surveillance equipment, laws concerning interception of communications, access to stored communications, due process requirements, and other relevant legal process. For sellers of machine learning and artificial intelligence tools, a key factor should be whether the tool can be subject to true due process requirements, that is, whether a person affected by a system's decision can get sufficient access to determine how an adverse decision was made.
    1. For instance, Nokia Siemens says that it will only provide core lawful intercept (i.e. surveillance) capabilities that are legally required and are "based on clear standards and a transparent foundation in law and practice." 
    2. In some instances, as with AI, this review may include interpreting and applying legal and ethical principles, rather than simply waiting for “generally accepted” ones to emerge, since law enforcement often implements technologies before those rules are clear. EFF and a broad international coalition have already interpreted key international legal doctrines on mass surveillance in the Necessary and Proportionate Principles.
  4. For domestic uses, this review must include an evaluation of whether sufficient local control is in place. EFF and the ACLU have worked to ensure this with a set of proposals called Community Control Over Police Surveillance (CCOPS). If local control and protections are not yet in place, the Company should decline to provide the Technology until they are, especially in locations where the population is already at risk from surveillance.
  5. Review credible reports about the Government and its human rights record, including news or other reports from nongovernmental sources or local sources that indicate whether the Government engages in the use or misuse of surveillance capabilities to conduct human rights abuses. 
    1. Internationally, this can include U.S. State Department reports and other governmental and U.N. reports, as well as those by well-respected NGOs and journalists.
    2. Domestically, this can include all of the above, plus Department of Justice reports about police departments, like the ones issued about Ferguson, MO, and San Francisco, CA.
    3. For both, this review can and should include nongovernmental and journalist sources as well.
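
To make the facial recognition point above concrete, here is a minimal sketch of how a Company’s review team might measure whether a face matching system’s false match rate differs across demographic groups. Everything here (the record format, field names, and toy data) is a hypothetical illustration, not any vendor’s actual API:

```python
from collections import defaultdict

def false_match_rates(records):
    """Compute the false match rate (FMR) per demographic group.

    Each record is a dict with hypothetical keys:
      "group"          - demographic group label
      "same_person"    - ground truth: True if the pair is the same person
      "declared_match" - system output: True if the system matched the pair
    """
    impostor_pairs = defaultdict(int)  # different-person pairs seen, per group
    false_matches = defaultdict(int)   # different-person pairs wrongly matched

    for r in records:
        if not r["same_person"]:       # only impostor pairs count toward FMR
            impostor_pairs[r["group"]] += 1
            if r["declared_match"]:
                false_matches[r["group"]] += 1

    return {g: false_matches[g] / n for g, n in impostor_pairs.items()}

# Hypothetical evaluation records; in practice these would come from a
# large, demographically balanced test set with ground-truth labels.
evaluation_records = [
    {"group": "A", "same_person": False, "declared_match": True},
    {"group": "A", "same_person": False, "declared_match": False},
    {"group": "B", "same_person": False, "declared_match": False},
    {"group": "B", "same_person": False, "declared_match": False},
]

rates = false_match_rates(evaluation_records)
worst, best = max(rates.values()), min(rates.values())
print(rates)
# A large gap between groups is exactly the red flag this review step
# should surface before any Transaction proceeds.
print("disparity ratio:", worst / best if best else float("inf"))
```

If the disparity ratio is large, that finding belongs in the written investigation record and should weigh heavily in the decision whether to proceed.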

Refrain from Participation: The Company must not participate in, or continue to participate in, a Transaction or provide a Technology if it appears reasonably foreseeable that the Transaction or Technology will directly or indirectly facilitate governmental abuses. This includes cases in which:

  1. The portion of the Transaction that the Company is involved in, or the specific Technology provided, includes building, customizing, configuring, or integrating the Technology into a system that is known or reasonably foreseen to be used for governmental abuses, whether that work is done by the Company or by others.
  2. The portion of the Government that is engaging in the Transaction or overseeing the Technologies has been recognized as committing governmental abuses using or relying on similar Technologies.
  3. The Government's overall record on human rights generally raises credible concerns that the Technology or Transaction will be used to facilitate governmental abuses.
  4. The Government refuses to incorporate contractual terms confirming the intended use or uses of the Technology, confirming local control similar to the CCOPS proposals, or allowing audits of the Government purchaser’s use of surveillance Technologies.
  5. The investigation reveals that the Technology is not capable of operating in a way that protects against abuses, such as when due process cannot be guaranteed in AI/ML decision-making, or when bias in training data or in facial recognition outcomes is endemic or cannot be corrected (a minimal sketch of a due-process-ready decision record follows this list).
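
On the due process point in item 5 (and item 3 of the investigation list above), one practical test is whether the system can leave behind a reviewable record for every adverse decision. Below is a minimal sketch, with entirely hypothetical field names, of what such a record might contain; if a system cannot populate fields like these, a person affected by its decisions has no realistic way to contest them:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    """A reviewable record of one adverse automated decision.

    All field names are hypothetical; the point is what must be
    preserved for the decision to be contestable later.
    """
    subject_id: str          # whom the decision affects
    decision: str            # e.g. "flagged", "denied", "matched"
    model_version: str       # exact model/software version that ran
    inputs_reference: str    # pointer to the inputs actually evaluated
    top_factors: list[str]   # factors the system reports as decisive
    confidence: float        # system-reported confidence, if any
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def reviewable(self) -> bool:
        # A decision is contestable only if an affected person (or a
        # court) can see what was evaluated, by which system, and why.
        return bool(self.model_version
                    and self.inputs_reference
                    and self.top_factors)

record = DecisionRecord(
    subject_id="case-1234",
    decision="flagged",
    model_version="v2.3.1",
    inputs_reference="evidence-store://case-1234/frames",
    top_factors=["face similarity 0.92 to watchlist entry 77"],
    confidence=0.92,
)
assert record.reviewable()
```

The particular fields matter less than the test itself: a Technology whose decisions cannot be reconstructed after the fact cannot satisfy due process, and that should count against the Transaction.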

Key Definitions and the Scope of the Process: Who should undertake these steps? The field is actually pretty small: Companies engaging in Transactions to sell or lease surveillance Technologies to Governments, defined as follows:

  1. “Governmental Abuses” includes governmental violations of international human rights law, international humanitarian law, domestic civil rights violations, domestic civil liberties violations, and other legal violations that involve governments doing harm to people. As noted above, in some instances involving new or evolving technology or uses of technology, this may include interpreting and applying those principles and laws, rather than simply waiting for legal interpretations to emerge.
  2. “Transaction” includes all sales, leases, rentals, or other arrangements in which a Company, in exchange for any form of payment or other consideration, provides or assists in providing Technologies, personnel, or non-technological support to a Government. This also includes providing any ongoing support to Governments, such as software or hardware upgrades, consulting, or similar services.
  3. “Technologies” include all systems, technologies, consulting services, and software that, through marketing, customization, government contracting processes, or otherwise, are known to the Company to be used, or reasonably likely to be used, to surveil third parties. This includes technologies that intercept communications, packet-sniffing software, deep packet inspection technologies, facial recognition systems, artificial intelligence and machine learning systems aimed at facilitating surveillance, certain biometric devices and systems, voting systems, and smart meters.
    1. Note that EFF does not believe that general purpose technologies should be included in this, unless the Company has a clear reason to believe that they will be used for surveillance.
    2. Surveillance technologies like facial recognition systems are generally not sold to Governments off the shelf. Technology providers are almost invariably involved in training, supporting, and developing these tools for specific governmental end users, such as a particular law enforcement agency.
  4. “Company” includes subsidiaries, joint ventures (especially joint ventures directly with government entities), and other corporate structures where the Company has significant holdings or has operational control.
  5. “Government” includes all segments of government: local law enforcement, state law enforcement, and federal and even military agencies. It includes formal, recognized governments, including member states of the United Nations.
    1. It also includes governing or government-like entities, such as the Chinese Communist Party or the Taliban and other nongovernmental entities that effectively exercise governing powers over a country or a portion of a country.
    2. For these purposes “Government” includes indirect sales through a broker, reseller, systems integrator, contractor, or other intermediary or multiple intermediaries if the Company is aware or should know that the final recipient of the Technology is a Government.

If tech companies want to be part of making the world better, they must commit to making business decisions that consider potential governmental abuses. 

This framework is similar to the one in current U.S. export controls and to the steps required of Companies under the Foreign Corrupt Practices Act. It is based on the recognition that companies involved in domestic government contracting, especially for the kinds of expensive, service-heavy surveillance systems provided by technology companies, are already participating in a highly regulated process with many requirements. For larger federal contractors, these include providing complex cost or pricing data, performing immigration checks, and conducting drug testing. Asking these companies to ensure that they are not facilitating governmental abuses is not a heavy additional lift.

Regardless of how tech companies get there, if they want to be part of making the world better, not worse, they must commit to making business decisions that consider potential governmental abuses. No reasonable company wants to be known as the company that knowingly helps facilitate governmental abuses, and technology workers are making it clear that they don’t want to work for those companies either. While the blog posts and public statements from a few of the tech giants are a good start, it’s time for all tech companies to take real, enforceable steps to ensure that they aren’t serving as "abuse’s little helpers."