EXECUTIVE SUMMARY

On May 20, 2003, the Defense Advanced Research Projects Agency (DARPA) issued its "Report to Congress regarding the Terrorism Information Awareness Program" (TIA). The Report, mandated by Congress and written to "assess[] the likely impact of the implementation" of TIA on civil liberties and privacy, was an opportunity for DARPA to make a careful review of the components of TIA and require accountability for each of these components. Unfortunately, the Report did not take advantage of this opportunity.

The Report makes one thing quite clear: TIA is being tested on "real problems" using "real data" pertaining to U.S. persons, apparently from Defense Department (DoD) intelligence files.

Otherwise, the Report doesn't shed much light on the issues that concern EFF. It provides an overview of the various TIA components, including some that we hadn't heard of before. Unfortunately, several of these new programs only make us more worried about TIA: If successful, they'll make surveillance and dataveillance even more powerful.

The Report also provides a few not-very-reassuring clues to the government's thinking about privacy and civil liberties. As far as the government's concerned, existing law protects our privacy. But there's little concern for data accuracy, and there's no mention of TIA's accountability to individuals. Also conspicuously absent is any concrete discussion of privacy or civil liberties issues in the actual use of TIA.

In short, the Report is a major disappointment. The government had an opportunity to open public discourse about TIA; for the most part, it chose to hide behind broad and vague generalities.

  1. Is there anything new in the Report?

    1. A new name

      Formerly "Total Information Awareness," TIA has been renamed "Terrorism Information Awareness." The renaming is intended to correct the impression "that TIA was a system to be used for developing dossiers on U.S. citizens." TIA's intent, DARPA says, is to "protect citizens by detecting and defeating foreign terrorist threats before an attack." Report, Executive Summary p. 1 (ES-1).

      This change seems purely cosmetic, reminiscent of the FBI's renaming its Carnivore tool "DCS-1000." There is no question that TIA, if implemented, will process information about U.S. persons. For instance, TIA technologies will be tested on a "realistic world of synthetic transaction data" that simulates "the behavior of normal people, unusual-but-benign people, and terrorists." Appendix p. 11 (A-11).

      More important, EFF's concerns are not limited to the compilation of dossiers. The problems with TIA and similar programs like CAPPS II include both privacy and its close cousin, accountability. These issues exist whenever the government can query and analyze vast amounts of personal information, whether in one giant database or divided among many smaller databases in both government and private hands. Keep in mind that a major goal of at least one of the components of TIA, the Genisys program, is to "create technology that enables many physically disparate heterogeneous databases to be queried as if it were one logical 'virtually' centralized database." A-11.
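
      To make that concern concrete, here is a minimal sketch of federated querying in Python. It is not DARPA's design; the table names, schemas, and sample data are invented. The point is only that physically separate databases, queried through one interface and joined on a shared identifier, behave for the analyst exactly like a single centralized database.

        import sqlite3

        # Toy illustration of a "virtually centralized" database: two physically
        # separate databases (two in-memory SQLite connections) are queried
        # through one function and merged on a shared identifier.
        # Schemas and data are invented for illustration only.

        def make_travel_db():
            db = sqlite3.connect(":memory:")
            db.execute("CREATE TABLE flights (person_id TEXT, origin TEXT, dest TEXT)")
            db.executemany("INSERT INTO flights VALUES (?, ?, ?)",
                           [("p1", "IAD", "CAI"), ("p2", "SFO", "JFK")])
            return db

        def make_finance_db():
            db = sqlite3.connect(":memory:")
            db.execute("CREATE TABLE purchases (person_id TEXT, item TEXT, amount REAL)")
            db.executemany("INSERT INTO purchases VALUES (?, ?, ?)",
                           [("p1", "fertilizer", 950.0), ("p2", "books", 40.0)])
            return db

        def federated_lookup(person_id, sources):
            """Run the per-source query in every database and merge the results."""
            profile = {}
            for name, (db, query) in sources.items():
                profile[name] = db.execute(query, (person_id,)).fetchall()
            return profile

        sources = {
            "travel": (make_travel_db(),
                       "SELECT origin, dest FROM flights WHERE person_id = ?"),
            "finance": (make_finance_db(),
                        "SELECT item, amount FROM purchases WHERE person_id = ?"),
        }

        # One call assembles a cross-database profile of a single person,
        # even though no physically centralized database exists.
        print(federated_lookup("p1", sources))

      However the data is partitioned, the analyst's question and the answer are the same; that is why the absence of one giant database offers little comfort.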

    2. New programs

      When TIA was first announced, we knew about these programs: Genoa (which was ending); Genoa II; Genisys; Evidence Extraction and Link Discovery (EELD); Wargaming the Asymmetric Environment (WAE); Translingual Information Detection, Extraction and Summarization (TIDES); Effective, Affordable, Reusable Speech-to-Text (EARS); Human Identification at a Distance (HumanID); Bio-Surveillance; Communicator; and Babylon. Information on all of these programs is available from EFF's TIA pages.

      The Report describes TIA as encompassing five overall "threads": secure collaborative problem solving; structured discovery with sources and methods security; link and group understanding; context-aware visualization; and decision making with corporate memory. ES-2,3; A-3-5. The programs fall into three categories: advanced collaborative and decision support programs; language translation programs; and data search, pattern recognition, and privacy protection programs. Report, p. 2-3 (R-2-3).

      The new TIA programs are:

      • Rapid Analytical Wargaming (RAW), which seeks to provide decision-makers with the ability to better anticipate future political, policy, security, and military/terrorism activity;
      • Futures Markets Applied to Prediction (FutureMAP), which seeks to use "policy markets" in which experts trade "outcome futures" to answer questions like "will terrorists attack Israel with bioweapons next year?";
      • Global Autonomous Language Exploitation (GALE), which seeks to teach computers to find critical foreign intelligence information from broadcasts, conversations, newswires and the Internet and then provide it to humans without their specifically requesting it;
      • Scalable Social Network Analysis (SSNA), which aims to model networks of connections like social interactions, financial transactions, telephone calls, and organizational memberships;
      • MisInformation Detection (MInDet), which seeks to detect intentional misinformation and inconsistencies in publicly available data and to identify false or misleading statements in textual documents;
      • Activity, Recognition, and Monitoring (ARM), which seeks to automate the ability to capture, identify and classify human activities in surveillance environments (including crowds) using video, agile sensors, low power radar, infrared, and radio frequency tags; and
      • Next-Generation Facial Recognition (NGFR), which seeks to improve face-recognition technology using 3-D imagery and processing techniques, infrared and multispectral imagery, and expression analysis.

      The Report's description of these programs is provided in the appendix to this review. Clearly, it is reasonable to expect that programs will continue to be added, which again highlights the need for close oversight. If TIA is permitted to continue, EFF will not be surprised if DARPA's new "LifeLog" program, for instance, joins the TIA "surveillance product line" in the next year or two.

    3. How much are we spending on TIA?

      The Report states that TIA funding "for FY 2003 through FY 2005 as proposed in the FY 2004 President's Budget submission is $53,752,000." ES-2.

      This number is misleading, because it only counts the line item for TIA, which is separate from the line items for EELD, Genisys, and so on. According to EFF's arithmetic, the budget for all TIA programs described in the Report is about $140 million in FY 2003 and about $169 million in FY 2004.

    4. Lip service paid to privacy and civil liberties concerns even while TIA is experimenting with real data about real U.S. persons

      Unsurprisingly, the Report strongly emphasizes privacy issues. Most obviously, the Report highlights the Genisys Privacy Protection Program, a subcomponent of the Genisys database or "data repository" technology program.

      While DARPA has talked about the need for operational or technical (as opposed to legal) TIA privacy safeguards for some time, and deserves credit for having done so, EFF is disappointed by the superficiality of the Report's discussion. The remainder of this review will identify shortcomings in the Report's approach to privacy.

      The best example needs to be highlighted here: while the Report tries to reassure us that TIA is being developed with concern for privacy and civil liberties, it tells us that TIA is being tested on real data about real U.S. persons. "TIA's research and testing activities have depended entirely on (1) information legally obtainable and usable by the Federal Government under existing law, or (2) wholly synthetic, artificial data that has been generated to resemble and model real-world patterns of behavior." R-27 (emphasis in original).

      This statement is troubling. Not because TIA R&D is using synthetic data (sort of like "The Sims" gone wild), although it would be interesting to know the full set of "character attributes" used to generate these 10 million imaginary people. And the fact that this synthetic data includes imaginary people "calling other imaginary people, traveling places, and buying things" tells us that TIA really is intended to analyze the full spectrum of transactions in everyday life.

      But what about the "information legally obtainable and usable by the Federal Government under existing law"? The Report doesn't say much about how much or what kind of this information is actually being used. We know from TIA Program Directive Number 2 (Data Concerning Information About U.S. Persons) that DARPA is using DoD "intelligence entities" as "test nodes." Directives, p. 4 (D-4). These intelligence entities apparently include the Central Intelligence Agency, the National Security Agency, the Defense Intelligence Agency and DoD's Counterintelligence Field Activity. A-2.

      More to the point, the directive says: "During experiments, DARPA, contract and contract support personnel analyze real data with various tools to examine real problems. . . . As a result of these experiments, interesting results from an intelligence perspective may be generated. Judgments regarding the value of such results and any subsequent production of intelligence is the purview of the operational users and analysts, not DARPA." D-5. We see here, all too clearly, that DARPA has already washed its hands as to the potential effects of using TIA on data about real people.

  2. The Report's discussion of privacy is too limited

    1. A red herring: giant databases

      A clear message of the Report is that TIA is not intended to create a giant government database. R-27 ("the TIA Program is not attempting to create or access a centralized database that will store information gathered from various publicly or privately held databases").

      But this message is no comfort. As noted above, part of TIA aims to make physically disparate heterogeneous databases seem like a giant "virtual" database. If so, does it really matter that there is no "real" centralized database? People are already concerned about the loss of "practical obscurity" as searchable public records databases go online and as search engines make it easier to find information about them across many websites.

    2. Few TIA programs are actually evaluated

      The Report admits that "ultimate implementation of some of the component programs of TIA may raise significant and novel privacy and civil liberties policy issues." R-27. But it does little to address these issues. Instead, the Report addresses privacy issues that might arise during DARPA's development of TIA. And even here, the Report raises more questions than it answers: TIA is being tested on "real data" about real people.

      When the Report does talk about specific TIA programs, it takes shortcuts. Of the 18 TIA programs, the Report identifies only eight that raise privacy concerns: Genisys, EELD, SSNA, MInDet, Bio-ALIRT, HumanID, ARM, and NGFR. But almost in the same breath, the Report sets aside Bio-ALIRT and the three "human identification" tools because "they are not the programs that have given rise to the greatest level of concern (or that gave rise to this report)." R-31.

      This is a pretty blatant dodge. For example, ARM and NGFR weren't even funded in FY 2003; there's hardly been time (or public record information) for them to "give rise" to concerns.

      The Report does recognize that "the various tools for human identification at a distance (HumanID, ARM, and NGFR) may raise significant privacy issues if deployed in particular contexts." R-35. But it doesn't discuss how those issues would or should be resolved.

    3. Little concrete discussion of privacy

      Even for the remaining four programs (Genisys, EELD, SSNA, and MInDet), the discussion is sparse. The Report identifies the main privacy issues for these programs as: aggregation of data, unauthorized access to TIA, and unauthorized use of TIA. R-33. It doesn't seem to think that authorized use of TIA raises a major privacy issue.

      But it doesn't address those issues so much as it deflects them. First, the Report emphasizes DARPA's commitment to TIA's effectiveness and accuracy. R-33. Unfortunately, it's not clear how TIA's effectiveness will be evaluated. "We can never know for certain that there is a terrorist plan out there to be detected until after the fact; therefore, DoD is developing collateral measures of performance." R-15. DARPA admits that testing TIA's data-mining technologies is a "very difficult problem" and that it's just beginning these tests. R-17.

      Second, the Report emphasizes privacy protection technologies, like automated audit trails, selective revelation, and anonymization. R-34. But the probable effectiveness of these technologies is not discussed. No information is given about the current state of these techniques or how well they will work in a large and complex system.
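
      For readers unfamiliar with the terms, the following minimal sketch shows what "selective revelation" and an automated audit trail might look like in practice. It is purely illustrative and assumes nothing about DARPA's actual designs: the record format, the authorization step, and the log format are all invented. An unauthorized query returns only an aggregate count, identities are revealed only after explicit authorization, and every access is logged.

        import datetime

        # Invented records, authorization model, and log format; illustration only.
        RECORDS = [
            {"name": "Alice", "city": "Denver", "flagged": True},
            {"name": "Bob",   "city": "Denver", "flagged": False},
        ]

        AUDIT_LOG = []

        def query(predicate, analyst, authorized=False):
            # Automated audit trail: every access is recorded, whoever makes it.
            AUDIT_LOG.append({
                "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                "analyst": analyst,
                "authorized": authorized,
            })
            matches = [r for r in RECORDS if predicate(r)]
            if not authorized:
                # Selective revelation: aggregate result only, no identities.
                return {"count": len(matches)}
            return matches  # Identities revealed only with authorization.

        print(query(lambda r: r["flagged"], analyst="analyst-1"))                   # {'count': 1}
        print(query(lambda r: r["flagged"], analyst="analyst-1", authorized=True))  # full records
        print(AUDIT_LOG)

      Whether techniques like these can be made to work at the scale and complexity TIA contemplates is exactly the question the Report leaves unanswered.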

      Third, the Report relies heavily on the mantra that existing law protects privacy. For instance, each operational component of DoD that hosts TIA tools or technologies is supposed to "prepare a substantive legal review that . . . analyzes the legal issues raised by the underlying program to which the TIA tools will be applied." R-34. Maybe these reviews will be more enlightening than the Report itself.

  3. The report ignores problems in existing privacy law

    The Report tells us that TIA must "operate within the confines of existing law." R-32; R-28 ("This report does not recommend any changes in statutory law"). There are three problems here. First, there's no reason to think that existing law adequately protects personal privacy or civil liberties. For example, Watergate-era laws like the Privacy Act are widely regarded today as under-enforced and riddled with loopholes; the Foreign Intelligence Surveillance Act's reliance on secret courts and proceedings is of highly questionable constitutionality; and the Fourth Amendment's constitutional protections have been greatly weakened by the Supreme Court's restricted concept of "reasonable expectation of privacy."

    Second, the gaps in existing privacy law are widening because of new technologies that expose more of our lives to others and eliminate "reasonable" privacy expectations. The rise of the Internet and e-mail means that warrantless surveillance can gather information about people's reading and viewing habits, something that was unlikely to happen before the Internet.

    Third, "existing" privacy law changes. Since 9/11, the passage of the USA-PATRIOT Act, the Homeland Security Act, and the Aviation Security Act has caused tectonic shifts in the privacy landscape, and not for the better. Tellingly, the Report actually lists the USA-PATRIOT Act and the Homeland Security Act as laws that "might either constrain or (as a logistical matter) completely block deployment of TIA search tools." R-18.

    And sometimes, when new technologies might make surveillance harder for the government, new laws or regulations lighten the government's burden. The use of encryption has been held back by government regulation of encryption exports; when the FBI told Congress that digital telephony might hinder its ability to wiretap phone calls, the Communications Assistance for Law Enforcement Act (CALEA) was enacted to ensure that the FBI would always be able to intercept phone conversations.

  4. The report gives short shrift to other civil liberties issues

    By law, this Report was required to "assess[] the likely impact of the implementation" of TIA on civil liberties as well as on privacy. EFF's concerns about programs like TIA and CAPPS II always include accountability, because accountability is essential to both "fair information principles" and civil liberties other than privacy. But there's even less discussion in the Report of accountability and civil liberties issues.

    1. Public accountability in TIA's development

      The Report emphasizes administrative controls on TIA's development, such as a DoD oversight board and a Federal Advisory Committee of outside experts. R-31. Such administrative controls are no substitute for true public accountability; the lack of information in this congressionally-mandated report should make that clear.

    2. Accountability in the use of TIA

      Privacy Act concepts like the right to a copy of one's records, the right to dispute or correct information believed to be inaccurate, the right to know how one's personal information is used and who has access to it, and the right to know what institutions and record systems contain personal information all revolve around accountability. But the Report doesn't discuss these issues even though TIA is already being tested on real data about real people.

      For the ordinary person, TIA is a giant suspicion-generating machine. TIA's most obvious purpose is to identify suspected terrorists (although, given the recent allegations about the use of the Homeland Security Department to track Democratic legislators in Texas, one should be concerned that TIA will be used for other purposes). How do you clear your name if a TIA analyst, aided by an "intelligent agent," mistakenly decides that you're suspicious? Will you even know?

      Amazingly, while EFF worries about the accuracy and quality of the data that TIA would use, the Report blithely dismisses the issue: "TIA does not, in and of itself, raise any particular concerns about the accuracy of individually identifiable information." R-32. The Report's logic is that TIA is "simply a tool for more efficiently inquiring about data in the hands of others," and this concern about data quality "would exist regardless of the method employed." R-32-33. It's remarkable that the government can so easily ignore the harm that suspicion based on bad data might cause to people, given the problems we already see with "no-fly" and other watchlists.

    3. Civil liberties and TIA

      The Report defines civil liberties as "relat[ing] primarily to the protection of the individual's constitutional rights to, among others, freedom of expression, freedom of the press and assembly, freedom of religion, interstate travel, equal protection, and due process of law." R-27. But it says nothing meaningful about how implementing TIA might affect these civil liberties, even though some impacts are pretty obvious.

      We noted above, for instance, that the Report recognized that TIA's human identification tools raised privacy issues. But they raise obvious civil liberties issues as well. ARM (Activity, Recognition, and Monitoring) is intended to improve the ability to interpret crowd behavior. In conjunction with NGFR (Next-Generation Facial Recognition) and HumanID, the ability to monitor political demonstrations, religious assemblies, and gatherings of all kinds will be enhanced. We don't even have to add in the other tools; the potential for chilling effects on protected expressive activity is clear.

      Even without TIA, we've had hints of the problems. One example: an FBI database, the Violent Gang and Terrorist Organization File (VGTOF), is expanding. In 1995 VGTOF was mainly used to track violent urban street gangs; today, it includes categories like "anarchists," "militia," "white supremacist," "black extremist," "animal rights extremist," "environmental extremist," "radical Islamic extremist," and "European origin extremist." And of course, data accuracy is a problem here. The Denver police department had for years been keeping secret files on political activists such as the American Friends Service Committee, a Quaker peace-activist group, and the pro-gun lobby. Last summer, when a man listed in the Denver files as a gun-rights group member got into a fender-bender, a police officer checking VGTOF found him described as "a member of a terrorist organization" and part of a "militia." According to a Denver police memo, the officer reported the stop to the FBI as a "terrorist contact." The Denver police and the FBI declined to comment on how the man ended up in VGTOF.

      We have no good information about how many mistakes are in these databases; we should be especially concerned by their reliance on inherently fuzzy concepts like "extremist." And yet only recently the Justice Department exempted the FBI's National Crime Information Center (NCIC) database, which provides over 80,000 law enforcement agencies with access to data on wanted persons, missing persons, gang members, stolen cars, boats, and other information, from the Privacy Act requirements of accuracy, relevance, timeliness, and completeness. Why? Because "it is impossible to determine in advance what information is accurate, relevant, timely and complete."

    4. The report ignores how deploying TIA might expand surveillance

      Finally, the Report is almost silent on how the very existence and use of TIA might cause mission creep, source creep, and so on. There are hints the Report recognizes that human identification tools might be used "to justify longer retention of[] stored surveillance tapes of public places." R-35. Here the Report implicitly recognizes that technology not only can make surveillance practices more efficient, but can also expand their range or scope.

      Elsewhere, however, the Report is blind to this dynamic. For instance, the Report finds that because TIA "take[s] the data as it finds it" in private databases, TIA does not pose the privacy concern that "parties whose databases would be queried [would] begin collecting data that they do not already collect." R-32. But it is just as plausible that once TIA begins using private databases, there will be political, legal or social pressure for private parties to collect more information or to store it longer. The obvious historical precedent is the bank records retention requirements of the Bank Secrecy Act made infamous in California Bankers Association v. Shultz. Even without TIA, there has been talk of requiring ISPs to retain records of their subscribers' Internet use.

  5. Conclusion

    EFF's criticisms may seem unfairly harsh. Congress certainly did not expect DARPA to produce a rigorous dissertation on privacy and civil liberties. Nevertheless, we are disappointed by the lack of concrete discussion. In our experience, researchers usually think a great deal about how their work might be used, and often have a better idea of their work's implications than do outsiders. EFF hoped, perhaps vainly, that some of that concrete thinking about TIA's implications would be revealed in the Report. Instead, the Report is largely content to speak in broad and vague terms about what TIA may accomplish and how the privacy and civil liberties concerns might be addressed if everything works.

Section 215 of the Patriot Act Expires in June. Is Congress Ready?
(https://www.eff.org/node/84288)

You may have heard that the Patriot Act is set to expire soon. That’s not quite the case. The Patriot Act was a large bill, as were the reauthorizations that followed in 2005 and 2006. Not all of it sunsets. But three provisions do expire on June 1st: Section 215, the "Lone Wolf provision," and the "roving wiretap" provision.

All of these sections are concerning, but Section 215 takes the cake. It’s the authority that the NSA, with the FBI’s help, has interpreted to allow the U.S. government to vacuum up the call records of millions of innocent people. It’s also been the focus of most of the NSA reform efforts in Congress over the last year and a half. But if there were ever a time to reform the NSA, it’s now—because a vote for reauthorization, without comprehensive reform of NSA spying, will very clearly be a vote against the Constitution.

NSA Spying is Not a Partisan Issue, and it Must be Dealt With

There have been many legislative efforts to reform the NSA over the last year and a half. But none of them have been successful. Most recently, in December, the Senate failed to move the USA FREEDOM Act forward for a final vote.

What lawmakers said about USA FREEDOM, and what they are saying now, gives us some idea of where we’re headed over the next few months.

One of the most notable “no” votes on USA FREEDOM came from Sen. Rand Paul, who objected to the fact that the bill extended the sunset of Section 215 by two years. Sen. Paul has been a vocal critic of the Patriot Act and NSA spying, and has made it clear that he will vote against reauthorization. The question now is whether he and other Republican critics will be able to push for genuine reform—something essential in a Congress that is now majority Republican in both houses.

And that’s where things get troublesome. The cloture vote on USA FREEDOM was a preview for the kind of anti-reform rhetoric we can expect to hear over the next few months. Some civil liberties advocates thought the bill didn’t go far enough—but the lawmakers who voted against it mostly thought it went too far.

Majority Leader Mitch McConnell proclaimed on the Senate floor that “now is not the time to be considering legislation that takes away the exact tools we need to combat ISIL.” Similarly, Sen. Marco Rubio, who is now on record as stating that he thinks Section 215 should never expire, stated: “the world is as dangerous as ever, and extremists are being cultivated and recruited right here at home. This legislation would significantly weaken and, in some cases, entirely do away with some of the most important counter-terrorism capabilities at our disposal, which is why I will not support it."

The rhetoric isn't confined to the Senate floor. After a man was arrested for allegedly plotting a D.C. shooting spree, Speaker of the House John Boehner claimed that “we would have never known about this had it not been for the FISA program and our ability to collect information on people who pose an imminent threat,” even though the criminal complaint against the arrestee states that it was a government informant that supplied information about his plans. 

Why The Focus on Section 215

Section 215 is an obvious target for reform. As the pro-NSA rhetoric reaches a fever pitch, it’s important to remember that we have little to no evidence that bulk collection of telephone call records under Section 215 has ever stopped a terrorist attack.

In fact, even the administration agrees—it’s not necessary. The White House admitted that the government can accomplish its goals without bulk telephone records collection. What’s more, the President’s Review Group said “the information contributed to terrorist investigations by the use of section 215 telephony meta-data was not essential to preventing attacks.” And the Privacy and Civil Liberties Oversight Board could not identify one time when bulk collection under Section 215 of the PATRIOT Act “made a concrete difference in the outcome of a counterterrorism investigation.” Similarly, an in-depth analysis of 225 cases of people charged with terrorism found that “the contribution of NSA’s bulk surveillance programs to these cases was minimal.”

If that weren’t enough, Senators with access to classified information have also argued that the program is unnecessary. In an amicus brief in EFF’s case First Unitarian Church of Los Angeles v. NSA, Sens. Ron Wyden, Mark Udall, and Martin Heinrich stated that, while the administration has claimed that bulk collection is necessary to prevent terrorism, they “have reviewed the bulk-collection program extensively, and none of the claims appears to hold up to scrutiny.”

What’s more, claims that collection of call records is not “that big” of an invasion of privacy are simply untrue. As we point out in our amicus brief in Klayman v. Obama, “The call records collected by the government are not just metadata—they are intimate portraits of the lives of millions of Americans.”  In one short-term study of a small sample of call detail information, researchers at Stanford were able to identify one plausible inference of a subject obtaining an abortion; one subject with a heart condition; one with multiple sclerosis; and the owner of a specific brand of firearm.
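
A toy example makes the Stanford result easy to picture. The sketch below (with invented phone numbers, labels, and call records; it is not the Stanford team's methodology) shows how matching called numbers against a directory of known organizations supports inferences about a caller's life without any call content:

    # Invented directory and call records; not the Stanford study's method.
    DIRECTORY = {
        "800-555-0101": "cardiology clinic",
        "800-555-0102": "firearm dealer (specific brand)",
        "800-555-0103": "family planning clinic",
    }

    CALL_RECORDS = [  # (caller, called number, duration in seconds)
        ("subscriber-17", "800-555-0101", 540),
        ("subscriber-17", "800-555-0102", 120),
        ("subscriber-42", "800-555-0103", 600),
    ]

    def infer(records, directory):
        """Group each caller's calls by the kind of organization called."""
        inferences = {}
        for caller, number, _duration in records:
            label = directory.get(number)
            if label:
                inferences.setdefault(caller, set()).add(label)
        return inferences

    for caller, labels in sorted(infer(CALL_RECORDS, DIRECTORY).items()):
        print(caller, "->", sorted(labels))

Scale that up to every call record in the country and "just metadata" starts to look like a dossier.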

In fact, former director of the NSA and CIA Michael Hayden recently admitted: “We kill people based on metadata.” And former NSA General Counsel Stewart Baker said: “metadata absolutely tells you everything about somebody’s life. If you have enough metadata, you don’t really need content.”

Of course, voting against reauthorization isn’t the ultimate goal when it comes to reforming NSA spying. Congress needs to address myriad problems—perhaps the reason Section 215 isn’t essential is because it is such a small part of the NSA’s huge repertoire.

What Real Reform Looks Like

Congress needs to pass comprehensive reform. EFF has continued to call for legislation that would:

  • End untargeted, bulk collection;
  • End illegal, warrantless "backdoor" searches of Americans' communications collected under Section 702 of the FISA Amendments Act;
  • Provide Americans a clear path to assert legal standing to sue the government for privacy abuses;
  • Reform the FISA Court by making significant opinions public and putting a special advocate for privacy in the court;
  • Shorten the FISA Amendments Act sunset;
  • Enhance the Privacy and Civil Liberties Oversight Board powers;
  • Ban the NSA from undermining commonly used encryption standards;
  • Strengthen privacy protections for innocent people inside and outside of the United States; and
  • Fix the broken “classified information” system.

A no vote on Section 215—or even the prospect of a no vote over the coming months—would force the NSA’s defenders to take reform seriously.

What’s Next?

Even though the Section 215 vote is many months away, it’s important to let Congress know what we’re expecting—a no vote on reauthorization, or any other legislation that allows suspicionless surveillance of millions of innocent people. In the meantime, as new legislation emerges, EFF will fight to ensure that all bad bills die, and push hard for any good bills to advance.
