The Firesheep Firefox extension has been scaring users across the Internet since its
introduction at the Toorcon security conference this past weekend by security researchers Eric Butler and Ian Gallagher. Firesheep demonstrates a security flaw that the computer security community has been concerned about for years — that any network eavesdropper can take over another user's session (say, a login to a webmail or social networking account) just by sniffing packets and copying the victim's cookie. In other words, if the websites you visit are not taking steps to encrypt your communications, or you're not taking advantage of the encryption they offer, anyone else on the same network can now trivially take over your accounts on Facebook, Twitter, Yelp, Flickr, and a number of other popular web sites. Since Firesheep is extensible, people will probably teach it to "support" more web sites in short order.
This has made some people anxious about using public wifi networks, where this attack could easily be carried out by strangers; but as Danny O'Brien explains, in the long run, the real issue isn't public wifi, but the need for encryption to protect users. Firesheep works because many websites fail to encrypt one of the most important pieces of information they exchange with you: the session identifier that tells them that you are the user behind your browser. When you load a web page whose URL begins with "https://...", your interaction with that page is encrypted. If the site is using HTTPS properly, your communications will be protected from eavesdroppers. When the URL begins with "http://...", there is no such guarantee.
But often, that protection is undermined immediately thereafter. The website drops a cookie in your browser with a code that allows you to say, "Hi, I am logged in to this website as <your name here>." Your browser constantly repeats that cookie value back to the site as you navigate, providing a way for the site to check that you're allowed to do the things on the site that you ask to do. But many sites fail to encrypt these interactions, running them over plain HTTP and allowing an eavesdropper to capture the cookie each time your browser retransmits it, ultimately allowing that eavesdropper to also say to the website "Hi, I am logged in as <your name here>." (Of course, that same eavesdropper can just as easily watch everything else you're doing on such sites.)
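The fix on the website side is conceptually simple: serve these interactions over HTTPS, and mark the session cookie so the browser will never send it over plain HTTP. Below is a minimal sketch of the cookie half of that, assuming a Node/Express server; the route, cookie name, and session value are purely illustrative, not any particular site's code.

```typescript
import express from "express";

const app = express();

app.get("/login", (req, res) => {
  // A real site would generate and store a random session ID;
  // this value is purely illustrative.
  const sessionId = "illustrative-session-id";

  res.cookie("session", sessionId, {
    secure: true,   // browser will only send this cookie over HTTPS,
                    // so a Firesheep-style eavesdropper never sees it
    httpOnly: true, // keep it out of reach of page scripts, too
  });
  res.send("Logged in");
});

app.listen(3000);
```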
Firesheep makes loud and clear something that EFF has said for some time: major websites need to implement HTTPS properly and completely. For the last few months, EFF has been developing HTTPS Everywhere — a Firefox plugin that makes your web browser demand an HTTPS connection if it's available. But note the phrase "if it's available." HTTPS Everywhere only works if a site implements HTTPS; many of the most popular sites still haven't deployed HTTPS properly, if at all. HTTPS Everywhere can, in fact, help protect users against Firesheep, but only for sites that are set up to offer HTTPS protections consistently.
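To give a rough idea of what "offering HTTPS protections consistently" can involve on the server side, here is a hedged sketch assuming an Express app that terminates TLS itself (a site behind a load balancer would check a forwarded-protocol header instead): every plain-HTTP request is redirected to HTTPS, and an HSTS header asks returning browsers to skip HTTP entirely.

```typescript
import express from "express";

const app = express();

// Redirect any plain-HTTP request to its HTTPS equivalent and, once on HTTPS,
// ask the browser (via HSTS) to use HTTPS for this site from now on.
app.use((req, res, next) => {
  if (!req.secure) {
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  res.setHeader(
    "Strict-Transport-Security",
    "max-age=31536000; includeSubDomains"
  );
  next();
});
```

HTTPS Everywhere and similar tools can only help on sites where something like this covers every page and every cookie, not just the login form.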
We're communicating with some of the companies whose sites the initial version of Firesheep targets to emphasize this point. We will be sending letters to more site operators soon. There's evidence that computers have gotten fast enough that routine use of encryption on web sites should be practical. Google reported that "[i]n order to [turn on HTTPS for all Gmail users] we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead." Although there's engineering effort involved in making this happen, the idea that sites usually need to buy lots of new servers in order to turn on HTTPS is partly a relic of an earlier era.
More than 50,000 new users have installed HTTPS Everywhere since the Firesheep story broke, showing that users care deeply about their online security and will take steps to protect themselves. For websites that care about protecting users' security, the fix is long overdue: now is the time for HTTPS.
All of the Web's solutions for embedded video — from user-generated-content sites like YouTube to proprietary sites like MSNBC and Comedy Central — present real privacy risks. When you visit a site with embedded video, you're not only sending your information to your destination site, but also to the website that hosts the video. In addition, you're allowing the video host to place cookies and other tracking devices on your computer. This means that loading an embedded video from within a blog could enable the video hosting site (and, in some cases, its advertising partners) to compile a history of which blog entries you were reading and when — even if you never tried to play the video.
In February 2008, as a way to free EFF.org from this risk, we developed MyTube. It's a plugin for Drupal, the open-source content management system that powers EFF.org. It makes sure that no user information is sent to YouTube.com or any other third-party video host until the user has been informed of the risks and explicitly consents to the sharing.
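The underlying pattern is easy to illustrate. The sketch below is not MyTube's actual code (MyTube is a Drupal module); it is a hypothetical browser-side illustration of the same idea: the page initially contains only a local placeholder, so the video host sees nothing until the visitor explicitly clicks.

```typescript
// Hypothetical illustration of the consent-before-embed pattern; the element ID
// and video URL are placeholders, not MyTube's real implementation.
function setupConsentEmbed(placeholder: HTMLElement, videoUrl: string): void {
  placeholder.textContent =
    "Click to load this video. Loading it will send your IP address and " +
    "cookies to the third-party video host.";

  placeholder.addEventListener("click", () => {
    const frame = document.createElement("iframe");
    frame.src = videoUrl; // the first request to the video host happens here
    frame.width = "640";
    frame.height = "360";
    placeholder.replaceWith(frame);
  });
}

const el = document.getElementById("video-placeholder");
if (el) {
  setupConsentEmbed(el, "https://www.youtube-nocookie.com/embed/VIDEO_ID");
}
```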
A year later, President Obama's web team ran into exactly the kind of privacy problems MyTube was designed to prevent, when the newly-redesigned Whitehouse.gov accidentally exposed its visitors' data to YouTube. In response, YouTube launched a new feature for "privacy-enhanced embeds", which was modeled in part after MyTube. YouTube's "privacy enhancement" remains available to all Web users — although it does have some shortcomings, and the White House has since stopped using it.
Now, after months of work, Brian Swaney and other students at the Ohio State University Open Source Club have launched a new version of MyTube. Site administrators will find the new version far easier to deploy, more versatile, and less buggy than before. We've been using it on EFF.org for a few weeks now, and it's already saved us time and made our jobs easier. You can see it in action on our "Fair Use Examples" page. New support for Vimeo embeds can be seen here.
The privacy risks posed by embedded third-party Web content are receiving fresh scrutiny this month, after it was revealed that many popular Facebook apps had been sharing private user information with advertisers. We'd like to see web developers everywhere consider solutions similar to MyTube to allow their users full control over how and with whom their data is shared.
This week, EFF is taking part in the 32nd Annual Conference of Data Protection and Privacy Commissioners, where we are urging the Privacy Authorities to call for the repeal of the European Union's 2006 Data Retention Directive, which requires Internet service providers operating in Europe to retain telecom and Internet traffic data about all of their customers' communications for a period of at least six months and up to two years, for possible use by law enforcement.
The Data Retention Directive is highly controversial, if not wildly unpopular, throughout the European Union. The directive was strongly opposed by European privacy activists, and for several years mass protests have been held in cities across Europe under the banner of "Freedom Not Fear." As each EU member state has implemented the Data Retention Directive in its national law, those laws have faced challenges in national courts. In 2007, the German Working Group on Data Retention (AK Vorrat) filed a class-action lawsuit representing 35,000 people challenging the German law. The German Federal Constitutional Court found the law unconstitutional and ordered the immediate deletion of all data stored since the law took effect in 2008, as well as the suspension of data collection until a revised national law is proposed. In 2009, the Romanian Constitutional Court ruled that the Romanian implementation of the EU directive fundamentally violated Article 8 of the European Convention on Human Rights, which guarantees the right to respect for private life and correspondence. The Swedish government has so far refused to implement the Data Retention Directive at all, leading to a lawsuit from the European Commission.
As if the retention obligations in the Data Retention Directive were not bad enough, European Privacy Authorities have found that telecom companies' and ISPs' compliance with national data retention laws has itself frequently been unlawful. Retention periods were found to be as long as ten years, well in excess of the 24-month maximum set by the directive. And while the directive is limited to the storage of traffic data, Privacy Authorities found that data relating to the contents of communications is also being stored: several service providers were found to retain the URLs of websites visited, the headers of e-mail messages, and the recipients of e-mail messages in "CC" mode at the destination mail server. When monitoring phone traffic data, phone companies also continuously track the location of the caller.
The experience in Europe makes clear that mandatory data retention regimes are disproportionate and unnecessary. We continue to believe that the legitimate needs of law enforcement can be met by a more targeted data preservation regime, without the collateral damage inflicted by the 2006 directive. Rather than fighting the privacy battle across Europe one state at a time, EFF urges European Privacy Authorities to call upon the European Commission to stand up for Internet users' fundamental rights, and repeal the 2006 Data Retention Directive outright.
Back in 2007, Stephanie Lenz posted a video to YouTube of her children dancing and running around in her kitchen. Stephanie wanted to share the moment with her family and friends. But they weren't the only ones watching: a few months later, Universal Music Corp. had the video removed from YouTube, claiming that the video infringed its copyright.
With help from EFF and Keker & Van Nest, Lenz fought back. She filed a lawsuit asking a federal court to hold Universal accountable for misrepresenting that her fair use violated copyright law. Since then, Lenz has won a number of key decisions from the judge in the case, including a ruling that content owners must consider fair use before sending copyright takedown notices. On Monday, Lenz filed a motion for summary judgment asking the judge to rule that Universal violated the law.
Over the years, the case has garnered a great deal of media coverage. One reason for the interest is that Ms. Lenz was accused of infringement for doing something parents do all the time: documenting and sharing precious moments in the lives of their children. As explained in Monday's brief:
Every day, thousands of parents take pictures and make videos of their kids doing all sorts of things. Many of those pictures and videos incorporate copyrighted works in myriad ways -- a child may be wearing a t-shirt with a copyrighted character on it, or she may be standing in front of a copyrighted sculpture, or there may be copyrighted music playing in the background. This activity doesn't make parents of America copyright scofflaws. And everyone versed in copyright law (such as a major music publisher) knows why: because these examples are fair uses.
Universal filed a motion for summary judgment of its own Monday, arguing that the video was not an obvious case of fair use. (You'll notice that both of the motions have many redactions, as a protective order in the case requires some information to be treated as confidential.) A hearing on the matter is set for December 10, and hopefully we'll have a resolution to this long-running case soon.
California and other states are moving towards a "Smart Grid" -- touted by the federal government as a way to boost reliability, security, and conservation in America's electrical systems. But while a Smart Grid has great potential for consumers, there are still critical questions unanswered about the privacy and security of customers' information.
How much energy you use -- and when you use it -- can reveal surprisingly detailed information about your daily life. This wasn't true when energy usage was only measured once a month. But with shorter intervals and more frequent metering, the picture of your home life is remarkably clear. An executive with Siemens Energy recently told the Smart Grids and Cleanpower conference in Britain, "We, Siemens, have the technology to record it (energy consumption) every minute, second, microsecond, more or less live...From that we can infer how many people are in the house, what they do, whether they're upstairs, downstairs, do you have a dog, when do you habitually get up, when did you get up this morning, when do you have a shower: masses of private data." It's a virtual window into the home. The Smart Grid Interoperability Panel–Cyber Security Working Group recently issued a report on "Privacy and the Smart Grid" that covered these issues in depth.
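To make the difference in granularity concrete, here is a toy sketch with entirely invented numbers: a monthly reading collapses a household's behavior into a single total, while minute-level readings expose individual events and when they happened.

```typescript
// Toy example with invented readings; real meters and analytics are far more
// sophisticated, but the privacy point is the same.
type Reading = { minute: number; kilowatts: number };

const readings: Reading[] = [
  { minute: 0, kilowatts: 0.3 }, // baseline: fridge and standby devices
  { minute: 1, kilowatts: 0.3 },
  { minute: 2, kilowatts: 2.4 }, // sudden jump: kettle? shower? someone is home
  { minute: 3, kilowatts: 2.4 },
  { minute: 4, kilowatts: 0.4 },
];

// Minute-level data: each spike above baseline is a clue about activity in the home.
const baseline = Math.min(...readings.map((r) => r.kilowatts));
const events = readings.filter((r) => r.kilowatts > baseline + 1.0);
console.log(events.map((r) => `minute ${r.minute}: ${r.kilowatts} kW`));

// Monthly-style data: all of the above collapses into one number.
const totalKwh = readings.reduce((sum, r) => sum + r.kilowatts / 60, 0);
console.log(`total: ${totalKwh.toFixed(2)} kWh`);
```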
Under EFF and CDT's proposed policies, California customers would be able to better understand what energy usage information is collected, how the information may be used and shared, and how to exercise meaningful control over the use of that data. Law enforcement would also be required to get a warrant before accessing energy usage information, and utilities would regularly report to the PUC on how often they received these requests. Our proposed rules aren't, of course, a complete answer; it would be better if energy management systems were designed to keep information about our household activities within the home in the first place.
Without strong protections, energy data can and will be used in ways that will hurt consumers. Marketing companies will desperately want to access this data to get intimate new insights into your family's day-to-day routine, and it's not hard to imagine an insurance company interpreting the data in a way that allows it to penalize you. Our privacy rights should be strongest in our home. The states -- and the federal government -- should ensure that energy customers get the protection they deserve.
UPDATE: EFF and CDT filed similar comments with the Department of Energy on November 1.
They can promise strong encryption. They just need to figure out how they can provide us plain text. - FBI General Counsel Valerie Caproni, September 27, 2010
[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge's authority where we can get there if somebody is planning a crime. - FBI Director Louis Freeh, May 11, 1995
As noted in late September, the FBI is on a charm offensive, seeking to expand its ability to spy on Americans by broadening the reach of the Communications Assistance for Law Enforcement Act (CALEA). Among other things, the government appears to be seriously discussing a new requirement that all communications systems be easily wiretappable by mandating "back doors" into any encryption systems.
If this sounds familiar, it's because regulating encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans' privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it's now rising from the grave, bringing the same disastrous flaws with it.
For those who weren't following digital civil liberties issues in 1995, or for those who have forgotten, here's a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago. We'll be posting more analysis when more details on the "new" proposal emerge, but this list is a start:
It will create security risks. Don't take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it's hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: "Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access." It doesn't end there. Bellovin notes:
Complexity in the protocols isn't the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called 'lawful intercept' mechanisms in the switch — that is, the features designed to permit the police to wiretap calls easily — were abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister's. This attack would not have been possible if the vendor hadn't written the lawful intercept code.
More recently, as security researcher Susan Landau explains, "an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements — a system already in use by major carriers — had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications."
This isn't just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products — the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether?
It won't stop the bad guys. Users who want strong encryption will be able to get it — from Germany, Finland, Israel, and many other places in the world where it's offered for sale and for free. In 1996, the National Research Council did a study called "Cryptography's Role in Securing the Information Society," nicknamed CRISIS. Here's what they said:
Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. — CRISIS Report at 303
None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996.
It will harm innovation. In order to ensure that no "untappable" technology exists, we'll likely see a technology mandate and a draconian regulatory framework. The implications of this for America's leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he'd had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.
It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we're just handing business over to foreign companies who don't have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it's not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They'd have to be tappable, too.
It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there's no real question about who will foot the bill: the providers will pass those costs onto their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
It will be unconstitutional. Of course, we wouldn't be EFF if we didn't point out the myriad constitutional problems. The details of how a cryptography regulation or mandate would be unconstitutional may vary, but there are serious problems with nearly every iteration of a "no encryption allowed" proposal that we've seen so far. Some likely problems:
The First Amendment would likely be violated by a ban on all fully encrypted speech.
The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
The Fourth Amendment would not allow the government to demand a key to a back door into our houses so that it could read our "papers" without first showing probable cause, and our digital communications shouldn't be treated any differently.
The Fifth Amendment would be implicated by the required disclosure of private papers and the forced utterance of incriminating testimony.
Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government. Yet the tax dollars needed to create a huge regulatory infrastructure staffed with government bureaucrats who can enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act "in the clear" by not using encryption readily available from a German or Israeli company or for free online.
The government hasn't shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn't prevent investigators from obtaining the communications they were after. The New York Times reports that the government officials pushing for this have only come up with a few examples (and it's not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI's PR campaign, but we'll be watching closely to see if, underneath all the scary hype, there's actually a real problem demanding this expensive, intrusive solution.
The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don't. Indeed, Bellovin argues: "Time has also shown that the government has almost always managed to go around encryption." (One circumvention that's worked before: keyloggers.) But if the FBI's burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:
It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.
The Wall Street Journal is reporting today on yet another major Facebook privacy blunder. Despite Facebook's various policies and promises about users' privacy when using apps, apps have been feeding Facebook users' information to advertisers and Internet tracking companies regardless of the individual user's Facebook privacy settings.
Internet advertising networks claim to track users "anonymously," but the Facebook leak allows these web marketing snoops to associate Facebook users with the supposedly-anonymous browsing-history cookies that trackers use to see a user's movements across the web. Based on the WSJ's reporting, the leak has the potential to affect tens of millions of Facebook users, as all of the top ten Facebook apps — like Farmville and Mafia Wars — were found to be violating the Facebook app developer agreement and users' privacy by handing their personal data over to advertising and data aggregation companies.
If this outrageous episode sounds familiar, it's because earlier this year, Facebook was caught leaking the exact same data to advertisers. At the time, Facebook promised to fix the problem, but it's clear that their so-called fixes failed to apply to the more than half a million apps available on the site. EFF and other privacy advocates have long warned Facebook that apps are the weakest link in the Facebook privacy ecosystem, and this report from the Wall Street Journal overwhelmingly validates that concern.
Facebook reassures privacy-conscious users by pointing to the developer agreement that requires app providers to take strong steps to protect privacy. But given that Facebook apps have been found to be leaking data that Facebook promised to protect five months ago, it's obvious that Facebook has no way of effectively enforcing those rules for the countless apps on the Facebook Platform.
Facebook simply can't claim that apps are safe to use when serious privacy issues around apps — like this referrer security breach — are abundant and endemic.
If you're a Facebook user concerned about apps leaking your data, the most straightforward fix at the moment is to turn off apps completely. To do this, log in to Facebook, open up the "Account" menu in the upper-right corner, choose "Privacy Settings," choose to edit your settings for "Applications and Websites" in the lower-left corner, and click on the option to "Turn off all platform applications." Also, check out EFF's earlier blog post on How to Get More Privacy From Facebook's New Privacy Controls to minimize the information on your Facebook account that's accessible to others.
While the overall picture of what is happening is clear, the as-yet murky details will have a serious impact on understanding the breadth and depth of the breach, the roster of companies involved, and the list of the best solutions. EFF is looking into these factors and will follow up with our findings here on Deeplinks. In the meantime, Harlan Yu has posted a clear explanation of the suspected technical details of the leak on the Freedom to Tinker blog. Facebook application developers should consider catching up on a best-practices paper by Justine Osborne describing Secure Application Development on Facebook.
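For developers curious about the mechanics: the core problem is that a Facebook app page's URL can carry the user's identity, and browsers send the page URL in the Referer header with every request for embedded third-party content such as ads and tracking pixels. Here is a small hypothetical sketch (the domain and parameter names are invented, not Facebook's actual ones) of scrubbing identifiers from any URL a third party might end up seeing.

```typescript
// The kind of page URL an embedded ad server could receive in a Referer header
// (domain and parameter names are invented for illustration):
const pageUrl = "https://apps.example.com/game/?app_user_id=12345&session=abc";

// Strip identifying query parameters before exposing a URL anywhere a third
// party might see it (redirect targets, canonical links, outbound requests).
function scrubIdentifiers(rawUrl: string, params: string[]): string {
  const url = new URL(rawUrl);
  for (const param of params) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

console.log(scrubIdentifiers(pageUrl, ["app_user_id", "session"]));
// -> https://apps.example.com/game/
```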
Last Friday, in a brief filed with the Ninth Circuit Court of Appeals, the Obama Administration continued the government's half-decade-long battle to ensure that no judge ever rules on the legality of the National Security Agency's warrantless dragnet surveillance program, a program first revealed in 2005 by the New York Times and detailed by technical documents provided by former AT&T technician Mark Klein.
The brief filed Friday is the government's response to EFF's appeal of the Jewel v. NSA case, a lawsuit brought against the government and government officials on behalf of AT&T phone and internet customers whose communications have been swept up in the mass surveillance program along with those of millions of other Americans.
In January, the district court issued an order dismissing the case based on the incorrect argument that, because so many Americans have had their communications and communications records illegally obtained by the government, no single person has legal "standing" to challenge the ongoing program of government surveillance. In other words: if everyone is being spied on, no one can sue. As EFF argued to the Ninth Circuit in its opening appellate brief, that ruling "risks creating a perverse incentive for the government to violate the privacy rights of as many citizens as possible in order to avoid judicial review of its actions."
The government in Friday's brief pays some lip service to the district court's conclusion that the plaintiffs' claims were merely a "generalized grievance" that cannot be litigated. However, emboldened by the Ninth Circuit's recent and dangerously misguided decision in Mohamed v. Jeppesen Dataplan, the government dedicates most of its brief to arguing the same thing it has been arguing for the past five years in every other warrantless wiretapping case: that any attempt by the courts to judge the legality of the alleged surveillance would violate the state secrets privilege and harm national security.
We've heard the government sing this tune before, most notably in our other warrantless wiretapping case, Hepting v. AT&T, where we withstood a motion to dismiss based on the state secrets privilege before Congress stepped in to try to shut down the case with a new immunity law for the telecoms that helped the NSA break the law. Just as we're vigorously appealing the dismissal of Hepting based on that unconstitutional law, we also look forward to responding to the government's latest attempt to sweep the NSA program under the rug, both in our reply brief due to be filed with the Ninth Circuit next month and eventually at oral argument before the court.