The number of Internet-enabled sensors in homes across the country is steadily increasing. These sensors are collecting personal information about what’s going on inside the home, and they are doing so in a volume and detail never before possible. The law, of course, has not kept up. There are no rules specifically designed for law enforcement access to data collected from in-home personal assistants or other devices that record what’s going on inside the home, even though the home is considered the heart of Fourth Amendment protection. That’s why it’s critical that companies push back, under currently existing rules, on requests for data collected by these new in-home devices. EFF applauds Amazon for doing just that—pushing back on a law enforcement request for in-home recordings from its Echo device.
The widely-publicized case involves a first-degree murder investigation out of Bentonville, Arkansas. The victim, Victor Collins, was found in November 2015 in his friend’s home. The two had been drinking and watching football with a few others at the friend’s home the night before. The friend, James Bates, was charged with first-degree murder. He pled not guilty and is currently awaiting trial.
During a search of the defendant’s home in December 2015, police found an Amazon Echo in the kitchen. The police seemed to think that the device—which is “always listening” to its surroundings for its “wake” words, Alexa, Echo, or Amazon—may have recorded what went on inside the home. They seized the device and later served Amazon with a warrant for any “audio recordings, transcribed records, or other text records related to communications and transactions” between the Echo device and Amazon’s servers for a 48-hour period surrounding the incident, along with subscriber and account information. Amazon turned over the defendant’s subscriber information and purchase history, but it refused to turn over any recordings or transcripts.
The police sought to get the data via another route. A few months later, they got a second warrant—this time to search the devices they had in their possession: the physical Echo device and the defendant’s two cell phones (which, if the defendant used the Alexa app, could have contained Alexa recordings or transcripts). They were able to “extract the data” stored on the Echo device and one of the defendant’s phones, but the second phone was encrypted.
In December 2016, the State of Arkansas informed Amazon that it intended to enforce the original warrant. Amazon filed a motion to quash the warrant on February 17, 2017. Amazon argued that the request for the Alexa recordings and transcripts implicated First Amendment protected speech and that the police therefore needed to make a heightened showing before they could compel Amazon to turn over the information. As Amazon explained, the First Amendment protects not only users’ verbal requests to Alexa, but also Alexa’s responses. Alexa’s responses are protected for two reasons. First, they contain expressive material specifically requested by the user, such as podcasts, books, or music. Second, the responses are also the speech of Amazon, and they are protected the same way that a search engine’s results are protected. (Note: Despite some early reports to the contrary, Amazon never argued that the device itself had constitutional rights.)
Amazon argued that because the police were seeking access to First Amendment protected content, they needed to show a compelling need for the information and establish a sufficient nexus between the information sought and the underlying investigation. The Bentonville police hadn’t done that, so Amazon was right to push back.
A hearing on Amazon’s motion was scheduled for March 8, but it was cancelled after the defendant agreed to release the information to the authorities. With Bates’ consent, Amazon has since turned over the requested recordings to the Bentonville police. We applaud Amazon for sticking up for its user’s rights and pushing back until it had that consent.
Depending on what data is requested, generally applicable data protection laws may apply, but they may not cover every case, particularly where the data sought is highly sensitive.
The collapse of the Trans-Pacific Partnership (TPP) was the worst defeat suffered by big content since we killed SOPA and PIPA five years ago. But our opponents are persistent, well-funded, and stealthy, and we can't expect them to give up that easily. So, just as they have continued to push for SOPA-like Internet censorship mechanisms in various other fora, so too we have been keeping a watchful eye for the recycling of TPP proposals into other trade negotiations. It hasn't taken long for that to happen.
Preliminary steps towards the renegotiation of NAFTA, the North American Free Trade Agreement, have already begun, and Alan Davidson, former director of digital economy issues at the Commerce Department, has flagged the problematic e-commerce provisions of the TPP as suitable for transplanting into the renegotiated agreement. "TPP is a terrific starting point," he is reported as saying.
On the other side of the world, TPP is also being touted as the right standard for Asia's secretive Regional Comprehensive Economic Partnership (RCEP), whose negotiators met in Tokyo last week. This week an Independent Commission on Trade Policy, comprising seven former trade negotiators and academics from the Asia-Pacific region, released a report titled Charting a Course for Trade and Economic Integration in the Asia-Pacific. The report recommends "that policy makers should, in light of the U.S. withdrawal, advance the TPP’s high standards in the Asia-Pacific region."
Tying these developments together, the trade ministers of the former TPP countries, which include most of the NAFTA and RCEP members, are convening in Chile next week [PDF], and it is expected that several countries will use that meeting to push for the resurrection of the TPP, without the participation of the United States.
But the folly of this project is that by failing to learn from the history of the TPP's demise, the participating countries are doomed to repeat it. The proximate cause of the deal's collapse was not the withdrawal of the United States, but the factors that caused that withdrawal—widespread public dissatisfaction with the secrecy of these agreements and their domination by big business, all in service of promised economic gains that have failed to materialize.
Such is the message that more than 200 civil society groups from across the world gave today, in a letter sent to their trade ministers as they head to Chile. The letter, which EFF endorsed, says in part:
[W]e believe it is not acceptable for TPP rules to be used as a model for future trade negotiations whether bilateral, regional or multilateral, including the World Trade Organisation. We urge you to accept that this model has failed, and to engage with us and others in a more open and democratic process to develop alternative approaches that genuinely serve the interests of our peoples, our nations and the planet.
Without correcting the underlying faults in the process by which the TPP was negotiated, there is no point in attempting to replicate its provisions in future trade deals. We join colleagues from around the world in calling on trade ministers to abandon the closed, captured model of trade negotiation that led to the failed TPP. As disappointing for trade ministries as the failure of the TPP was, they need to head back to the drawing board, fix this broken process, and meaningfully consult with users before attempting any future trade deals that affect the Internet.
In the wake of the European Commission’s dangerous proposal to require user-generated content platforms to filter user uploads for copyright infringement, European digital rights advocates are calling on Internet users throughout Europe to stand up for freedom of expression online by urging their MEP (Member of European Parliament) to stop the #CensorshipMachine and “save the meme.”
Last year, the European Commission released a proposed Directive on Copyright in the Digital Single Market, Article 13 of which would require all online service providers that “store and provide to the public access to large amounts of works or other subject-matter uploaded by their users” to reach agreements with rights holders to keep allegedly infringing content off their sites – including by implementing content filtering technologies.
This week, two EU-based organizations are calling on Internet users to stand up for their rights to lawfully use copyrighted works, and to call on the European Parliament to remove Article 13 from the proposed directive.
Simultaneously, the activist group Xnet, with support from EFF, EDRi, and several other digital rights groups, released this video highlighting how Article 13 would give copyright holders the ability to censor a wide swath of online expression.
Digital rights advocates aren’t the only ones seeing problems with this proposal. Article 13 has been criticized by academics and academic research centers, and members of the EU’s startup community as well. And earlier this month, an important committee charged with reviewing the proposal, the European Parliament Committee on the Internal Market and Consumer Protection, criticized Article 13 as “incompatible with the limited liability regime” currently in effect in the EU under the e-Commerce Directive, legislation the committee refers to as “enormously beneficial.” The committee’s report warns of Article 13’s “negative impacts on the digital economy [and] internet freedoms of consumers,” as well as its potential effect on market entry for online services. The Committee also criticized the proposal’s call to implement technological filtering solutions, explaining “[t]he use of filtering potentially harms the interests of users, as there are many legitimate uses of copyright content that filtering technologies are often not advanced enough to accommodate.”
There’s still time to stop Article 13 before it becomes law in the EU. The proposed directive must pass through several more rounds of review by European Parliament Committees, followed by an informal “trilogue,” where the European Parliament, the European Commission, and the Council of the European Union try to agree on the text of the directive, before it finally moves to consideration by Parliament. If you’re in Europe, you can take action to stop Article 13 by going to savethememe.net. If you’re not, you can share that link with your European friends.
But it appears some members of Congress didn’t get the message, because they’re trying to roll back the FCC’s privacy rules right now without having anything concrete ready to replace them. We’re talking here about basic requirements, like getting your explicit consent before using your private information to do anything other than provide you with Internet access (such as targeted advertising). Given how much private information your ISP has about you, strict limits on what they do with it are essential.
Luckily, we can stop this train wreck before it happens. But we need your help: please call your senators and your representative right now and tell them to oppose any use of the Congressional Review Act (“the CRA”—they’ll know what it is) to roll back the FCC’s new rules about ISP privacy practices.
Together, we can stop Congress from undermining crucial privacy protections.
What's the tl;dr?
Late last year, the FCC passed rules that would require ISPs to protect your private information. They covered the things you would usually associate with having an account with a major company (your name and address, financial information, etc.) but also things like any records they keep on your browsing history, geolocation information (think cell phones), and the content of your communications. Overall, the rules were pretty darn good.
But now, Senator Flake (R-AZ) and Representative Blackburn (R-TN) want to use a tool known as a Congressional Review Act resolution to totally repeal those protections. The CRA allows Congress to veto any regulation written by a federal agency (like the FCC). Worse yet, it forbids the agency from passing any “substantially similar” regulations in the future, so the FCC would be forbidden from ever trying to regulate ISP privacy practices. At the same time, some courts have limited the Federal Trade Commission’s ability to protect your privacy, too.
With the hands of two federal agencies tied, ISPs themselves would be largely in charge of protecting their customers’ privacy. In other words, the fox will be guarding the henhouse.
If we seem a little insistent that you take action to stop this, that’s because we sincerely believe that together, we can stop this disaster before it comes to pass. Every time someone calls their representative or senators, we’re one step closer to protecting the privacy of all U.S. Internet users. If we raise our voices the same way we did when it came to passing net neutrality, Congress won’t be able to ignore us.
On March 2, EFF Executive Director Cindy Cohn sat down with Alexander Macgillivray and Nicole Wong, both former U.S. Deputy Chief Technology Officers (CTO) under President Obama as well as former legal counsel for Google and Twitter. The panel explored the development of the Obama administration’s policies on the Internet, intellectual property, and digital privacy and speculated on the future of the White House CTO position under President Trump. On March 3, we learned that Peter Thiel's former chief of staff Michael Kratsios will be stepping into the role formerly held by Macgillivray and Wong.
Both panelists underscored the contributions that technologists can make in civil service and argued that lawmakers and policymakers need informed voices in the room and at the table. "It's incumbent upon [the tech community] to start engaging," said Wong. "I think the technical talent within government is getting much better," and part of that comes from convening the right people within government agencies—including those who are more technologically sophisticated—who understand the issues at stake.
Macgillivray also stressed the impact that the tech community at large can have on government policy, stating "the engineering community, as it becomes more powerful, will be able to exercise more moral and political muscle."
Watch below for the full discussion touching upon diverse digital civil liberties issues including net neutrality, device searches at the border, predictive policing algorithms, and more:
Imagine this: the government, for reasons you don't know, thinks you're a spy. You go on vacation and, while you're away, government agents secretly enter your home, search it, make copies of all your electronic devices, and leave. Those agents then turn those devices upside down, looking through decades worth of your files, photos, and online activity saved on your devices. They don't find any evidence that you're a spy, but they find something else—evidence of another, totally unrelated crime. You're arrested, charged, and ultimately convicted, yet you're never allowed to see what prompted the agents to think you were a spy in the first place.
Sounds like something from dystopian fiction, right? Yet it's exactly what happened to Keith Gartenlaub. In January 2014, the FBI secretly entered Gartenlaub's home while he and his wife were on vacation in China. Agents scoured the home, taking pictures, searching through boxes and books, and—critically—making wholesale copies of his hard drives.
Agents were authorized by the secret Foreign Intelligence Surveillance Court ("FISC") to search for evidence that Gartenlaub was spying for the Chinese government. There’s only one problem with that theory: the government has never publicly produced any evidence to support it. Nevertheless, Gartenlaub now sits in jail. Not for spying, but because the FBI’s forensic search of his hard drives turned up roughly 100 files containing child pornography, buried among thousands of other files, saved on an external hard drive.
Gartenlaub was tried and convicted, and he appealed his conviction to the Ninth Circuit Court of Appeals. EFF (along with our friends at the ACLU) recently filed an amicus brief in support of his appeal.
There are plenty of troubling aspects to Gartenlaub’s prosecution and conviction. For one, and unlike normal criminal prosecutions, neither Gartenlaub nor his lawyers have ever seen the affidavit and order issued by the FISC that authorized the search of his home. There are also legitimate concerns about the sufficiency of the evidence used to convict him.
But we got involved for a different reason: to weigh in on the Fourth Amendment implications of the FBI’s searches of Gartenlaub’s electronic devices. The unusual facts of this case gave us an unusually good opportunity to push for greater Fourth Amendment protections in all searches of electronic devices.
Here’s why: when agents copied and searched Gartenlaub’s devices, they were only authorized to search for national security-related information. But the prosecution that resulted from those searches and seizures had nothing to do with national security at all. So, either the FBI seized information that was outside of the warrant (which the Fourth Amendment prohibits); or it was relying on an exception to the warrant requirement, like “plain view”—an exception that allows law enforcement to seize immediately obvious contraband when the government is in a place to lawfully observe it.
Plain view makes sense in the physical world. If cops are executing a search warrant for a home to search for drugs, they shouldn’t have to ignore the dead body lying in the living room. But the way plain view works in the digital context—especially forensic computer searches—is not at all clear. How far can cops rummage around our computers for the evidence they’re authorized to look for? Does a warrant to search for evidence of drug dealing allow cops to open all the photos stored on our computer? Does an order authorizing a search for national security information let the government rifle through a digital porn collection? And where do we draw the line between a specific search, based on probable cause for specific information stored on a computer—which the Fourth Amendment allows—and a general search for evidence of criminal activity—which the Fourth Amendment prohibits?
Our electronic devices contain decades' worth of personal information about us. And, in many ways, searches of our electronic devices can be more intrusive than searches of our homes: there is information stored on our phones, computers, and hard drives, about our interests, our political thoughts, our sexual orientations, our religious beliefs, that might never have been previously stored in our homes—or, for that matter, anywhere at all. Because of the sensitivity of this data, we need clear restrictions on law enforcement searches of our electronic devices, so that every search doesn't turn into the type of general rummaging the Fourth Amendment was designed to prevent.
In our brief, we argued this case gave the Court a perfect opportunity to set a clear rule. We argued that the FBI’s search of Gartenlaub’s hard drives for evidence of regular, domestic crimes violated the Fourth Amendment, and we urged the Court to adopt a rule that would prohibit the FBI from using evidence that it obtained that was outside the scope of the initial search authorization. This would be a promising first step in limiting law enforcement’s electronic search powers and in protecting our right to privacy in the digital age.
Wikileaks today released documents that appear to describe software tools used by the CIA to break into the devices that we all use at home and work. While we are still reviewing the material, we have not seen any indications that the encryption of popular privacy apps such as Signal and WhatsApp has been broken. We believe that encryption still offers significant protection against surveillance.
The worst thing that could happen is for users to lose faith in encryption-enabled tools and stop using them. The releases do reaffirm that users should make sure they are using the most current version of the apps on their devices. And vendors should move quickly to patch these flaws to protect users from both government and criminal attackers.
The dark side of this story is that the documents confirm that the CIA holds on to security vulnerabilities in software and devices—including Android phones, iPhones, and Samsung televisions—that millions of people around the world rely on. The agency appears to have failed to accurately assess the risk of not disclosing vulnerabilities to responsible vendors and failed to follow even the limited Vulnerabilities Equities Process. As these leaks show, we're all made less safe by the CIA's decision to keep—rather than ensure the patching of—vulnerabilities. Even spy agencies like the CIA have a responsibility to protect the security and privacy of Americans.
A dangerous bill in California would make it easy for the government to search the cell phones and online accounts of students and teachers. A.B. 165 rips away crucial protections for the more than 6 million Californians who work at and attend our public schools. Under the proposed law, anyone acting “for or on the behalf of” a public school—whether that’s the police or school officials—could search through student, teacher, and possibly even parent digital data without a court issuing a warrant or any other outside oversight.
A.B. 165 runs contrary to our values. California is proud to be a leader in protecting the privacy of our citizenry. Not only is the right to privacy baked into the California Constitution, but in 2015 our lawmakers also passed CalECPA—heralded as “the nation’s best digital privacy law”—with broad support from Republicans, Democrats, civil libertarians, tech companies, and members of the law enforcement community. This law strikes the right balance when it comes to protecting privacy and empowering government officials to do their jobs. It ensures that when someone in the government wants to search our digital devices or a police officer wants to search our online accounts, they go to a judge and get a warrant based on probable cause. And it also ensures that the government can act swiftly when life and limb are on the line by providing an exemption when there is an emergency.
Some may argue that schools should have different rules. But not only do Californians of all ages and backgrounds deserve and need digital privacy, A.B. 165 is a sledgehammer, not a scalpel. It destroys all the important CalECPA safeguards that protect Californians in the school context from wide-ranging government searches.
If A.B. 165 is enacted, CalECPA protections would be stripped from students and teachers, meaning:
Anyone acting “for or on the behalf of” a public school can conduct a search—that could potentially be anyone from lunch room attendants to on-campus police officers.
School officials have no outside oversight when conducting searches and don’t have to report those searches to anyone.
School officials aren’t required to notify anyone—the individual or parents or guardians—about a search.
There are no clear limits on what digital data can be searched—photos, appointments, social media accounts, email accounts, text messages, and browser history could all be up for grabs.
There are no safeguards to protect how data is used or shared, including with federal agencies.
In effect, this means that a school official could search through the cell phones or online accounts of California students and teachers without any type of warrant or oversight and pass that data to federal agencies like U.S. Immigration and Customs Enforcement or others.
California students use cell phones to access and communicate deeply sensitive information, like learning about local political events, investigating reproductive health, discussing the immigration status of a family member, or exploring their own gender identity. We can show our students that their dignity and privacy matter by safeguarding their rights to read and communicate without the specter of unfettered government access.
Unfortunately, backers of A.B. 165 are the same legislators who fought the passage of CalECPA two years ago. This bill may be aimed at California public schools, but make no mistake: the battle won’t stop here. If these legislators are able to destroy safeguards for our schools, they’ll turn to other communities and try to strip away these legal protections for other Californians. We need to hold the line.
A.B. 165 is currently in the privacy subcommittee of the California Assembly. That means that right now is a very important time to make sure all our California legislators hear us. Please speak out now against A.B. 165.
And if you are a California student or teacher who has witnessed the search of a digital device or online account on school property, please report it using our form.
Not in California? You can still make a difference. Please reach out to your friends in California and ask them to speak out, and please share this blog post on social media.