Yesterday, EFF asked the U.S. Copyright Office to grant an exemption to the Digital Millennium Copyright Act for “jailbreaking” smartphones, tablets, and video game consoles. The exemptions are designed to dispel any legal clouds that might prevent users from running applications and operating systems that aren’t approved by the device manufacturer. The exemptions stem from section 1201 of the DMCA, which prohibits circumvention of “a technological measure that effectively controls access to a work protected under this title.”
In 2009, over strident opposition from Apple, EFF won an exemption from the Copyright Office for users who wish to jailbreak iPhones and other smartphones. Due in part to this ruling, a vibrant jailbreaking community has developed online that has immeasurably improved innovation, security, and privacy in these devices.
So why might Apple and other manufacturers still oppose the process? That’s a great question. When Apple first fought users’ legal right to jailbreak, it claimed jailbreaking would cut into its business model and ruin its ability to make money. But Apple’s profits are at an all-time high by every relevant metric.
In fact, rather than hurting companies like Apple, the jailbreaking community often ends up helping them, as Apple and other manufacturers later adopt many features they rejected at first. Let’s take a look back at all the benefits jailbreaking has brought both manufacturers and users of smartphones, and why they should be expanded to tablets and video game consoles like the PlayStation 3, Nintendo Wii, and Xbox 360.
By all accounts, the jailbreaking community has greatly improved smartphone usability. For example, the community developed applications—first rejected by Apple—that allowed older versions of the iPhone to record video. Jailbreakers were also the first to successfully configure keyboards to wirelessly connect with the smartphone. Apple later adopted both of these features.
Security fixes developed by the jailbreaking community protect smartphone users when the manufacturer is slow to fix vulnerabilities or doesn’t fix them at all.
When a security flaw was discovered in the way the iPhone’s web browser opened PDF files, Apple was slow to patch it. Users who didn’t want to wait for the manufacturer to fix the problem had a better way to protect themselves: jailbreak their phones and install an “unauthorized” patch created by an independent developer.
But the 2011 DigiNotar debacle is the clearest example of why jailbreaking is so vital. Until recently, DigiNotar was a certificate authority—an organization that issues digital certificates used to authenticate and secure communications between various services online, such as credit card transactions. But in September, it was hacked and used to issue fraudulent certificates, allowing attackers to compromise devices and services. Early versions of Android didn’t update automatically, leaving users with older operating systems with no recourse except to jailbreak their phones so they could protect themselves.
In an era of increased worries about privacy on mobile devices, the jailbreaking community has also been vital in securing users’ privacy when manufacturers won’t.
Jailbreakers were the first to introduce an unauthorized app on the iPhone that hid text messages from automatically appearing on the front screen for anyone nearby to see. Jailbreakers were also responsible for introducing a patch that prevented Apple's unauthorized logging of detailed location data on iPhones. Similarly, on Android, an unauthorized application called LBE Privacy Guard lets users monitor and control the sensitive data that third-party applications may try to access. But these privacy-protective applications are only available to users who jailbreak their devices.
The popularity of tablets has exploded over the past few years, and EFF wants users of devices such as the iPad and NOOK to have the same benefits as smartphone users have enjoyed for the past three years.
But that’s not all. We are also applying for an exemption for video game consoles.
Video Game Consoles
Manufacturers of video game consoles like the PlayStation 3, Xbox, and Nintendo Wii also limit users’ operating system and software options, even when there is no evidence that other programs will infringe copyright. Our exemption would allow users to run the operating system of their choice on their consoles, as well as “homebrew” applications.
Video game consoles have powerful computer processors that can allow a user to run them as an inexpensive alternative to a desktop. Researchers, and even the U.S. military, turned clusters of PS3s into powerful supercomputers back when Sony supported the installation of alternative operating systems. But Sony axed that option with a 2010 firmware update, and PS3s can no longer run Linux without being jailbroken. Indeed, earlier this year Sony went so far as to sue several researchers for publishing information about security holes that would let people install and run Linux on their own PS3s. We hope the exemption we’re seeking will clarify that people can run the operating system and applications of their choice on their own boxes.
EFF implores Apple, Sony, and others to support these exemptions to the DMCA to improve the user experience and keep their users’ information private and secure.
Two weeks ago, the New York Times published a letter to the editor from Christopher Wolf, who leads the Internet Task Force of the Anti-Defamation League, in which he suggested:
It is time to consider Facebook’s real-name policy as an Internet norm because online identification demonstrably leads to accountability and promotes civility.
People who are able to post anonymously (or pseudonymously) are far more likely to say awful things, sometimes with awful consequences, such as the suicides of cyberbullied young people. The abuse extends to hate-filled and inflammatory comments appended to the online versions of newspaper articles — comments that hijack legitimate discussions of current events and discourage people from participating.
The New York Times invited readers to pen replies to Wolf's letter. The paper published several excellent, on-point replies, but did not publish EFF's. So, we decided to publish it here instead:
Opponents of online anonymity often repeat the platitude that “real name” identification promotes civility. While that may be true, it is often at the expense of free expression. Not only does anonymity enable dissidents in oppressive regimes, but it also helps the small-town kid experimenting with his sexuality or the abuse survivor starting a new life.
Internet intermediaries offer tools that allow users to maintain civility without sacrificing anonymity. On social networks, users can moderate offensive comments or block users who are harassing them. Newspapers can institute systems for flagging inappropriate comments.
Concerns about cyber-bullying and other online crimes shouldn’t be dismissed, but law enforcement already has tools to identify anonymous criminals.
Christopher Wolf makes many claims about the negative effects of anonymous speech, but the truth is that not one of them is backed up by research. We should not be willing to sacrifice free expression for the possibility of civility, especially not when there are more effective alternatives.
We are happy to see dialogue on this topic in the New York Times. Newspapers have been engaged in an ongoing struggle to manage commentary on their websites. This week, USA Today announced that it would require a Facebook login in order to comment on its stories, while the New York Times announced changes to its comment system that would allow "trusted commenters" with a track record of good behavior to post immediately, without having their comments reviewed by a moderator. We look forward to seeing how these experiments play out.
Proposal Would Gut Privacy Laws, Allow Unprecedented Data-Grab by Government
We’re for better network, computer, and device security. Unfortunately, "cybersecurity" bills often go off track—case in point: the "Internet kill switch." The latest example comes courtesy of the leaders of the House Intelligence Committee. Committee Chairman Mike Rogers (R-Mich.) and ranking member Dutch Ruppersberger (D-Md.) are introducing "The Cyber Intelligence Sharing and Protection Act of 2011" (PDF).
The bill would allow a broad swath of ISPs and other private entities to "use cybersecurity systems" to collect and share masses of user data with the government, other businesses, or "any other entity" so long as it’s for a vaguely-defined "cybersecurity purpose." It would trump existing privacy statutes that strictly limit the interception and disclosure of your private communications data, as well as any other state or federal law that might get in the way. Indeed, the language may be broad enough to bless the covert use of spyware if done in "good faith" for a "cybersecurity purpose."
This broad data-sharing between companies wouldn’t be subject to any oversight or transparency measures (users can’t restrict companies’ sharing), while the only oversight for sharing with the federal government, ironically, would be through the Privacy and Civil Liberties Oversight Board—which hasn’t existed since January 2008.
Worse yet, the bill doesn’t limit what the federal government can do with the data or private communications that ISPs and others hand over, except to say that it can’t be used for "regulatory" purposes—apparently it can be used for law enforcement and intelligence targeting purposes.
Based on how this proposal diverges from the White House’s own cybersecurity proposal from May 12, we hope and expect that the Administration isn’t happy with this House Intelligence bill for several reasons—insufficient privacy protections, lack of oversight, skepticism about efficacy. Perhaps at the top of the list is concern over the fact that the bill allows information sharing with any federal agency—including the National Security Agency (NSA)—thereby threatening civilian control of domestic cybersecurity efforts. As Rod Beckstrom, former Director of DHS’s National Cybersecurity Center, said when he resigned in March 2009:
"NSA currently dominates most national cyber efforts…. I believe this is a bad strategy…. The intelligence culture is very different from a network operations or a security culture [and] the threats to our democratic processes are significant if all top level government network security and monitoring are handled by any one organization (either directly or indirectly).
Lawmakers should not rush to approve such a broad expansion of government power to obtain private information about its citizens without so much as a hearing on the bill. EFF flatly opposes this bill, and urges House Intelligence Committee members to oppose the bill and support any amendments to make it more privacy-protective if and when the Committee considers the proposal tomorrow. Eviscerating our online privacy protections won’t strengthen our cybersecurity; it will only undermine it.
EFF is proud to announce the newest member of our growing staff, Ellie Young. Ellie may be familiar to many of you from her longtime work overseeing the operational and administrative functions of several San Francisco Bay Area not-for-profit organizations. For the past 22 years, Ellie was the Executive Director of the USENIX Association, which puts on conferences that are essential to the community of computing engineers, sysadmins, academics and researchers. Prior to that, Ellie worked at the University of California Press and at the Boalt Hall School of Law at UC Berkeley. Ellie comes to EFF in the role of Special Assistant to Executive Director Shari Steele, and she will work on financial planning as well as on development and operational activities at EFF.
Ellie joins EFF at a critical time. We’re working to transform our newly-purchased building at 2567 Mission Street in San Francisco into a permanent home for digital rights protection. Renovations will include much-needed conference and collaborative areas, workspace for the EFF team, and improvements necessary to bring the aging building up to modern code. We’re thrilled to have Ellie on our team as we move forward on this important project for EFF’s future.
This week, Google activated a web privacy feature called “forward secrecy”, becoming one of the web’s first major players to put this important component in place. It’s an important step, and other sites should follow suit. In order to understand why enabling forward secrecy is so important, it’s helpful to know how HTTPS works in the first place.
HTTPS encrypts requests that your browser makes to web servers, and then encrypts the resulting pages. That makes the exchanged messages incomprehensible to anybody in between, such as your ISP or an eavesdropper. Each web server has a secret key, and only somebody with that secret key can decrypt the messages.1 That arrangement provides a basic layer of security from many online threats to your privacy.
(It’s worth noting that some websites that allow HTTPS connections don’t use them by default. To tell your browser to default to encrypted connections with over 1,000 sites, you can use our Firefox extension HTTPS Everywhere.)
Without forward secrecy enabled, the encrypted messages can be stored and decrypted with the private key at any time. That can lead to major issues: if your traffic has been intercepted, and the web server’s key is ever compromised, there’s no way to stop the attacker from decrypting and reading the old messages — even years later.
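A toy sketch of that threat may make it concrete. This is purely illustrative, not real cryptography: XOR with random 16-byte keys stands in for the actual RSA key transport and bulk cipher, just to keep the example self-contained. The point it shows is structural — an eavesdropper who records the wrapped session key and the ciphertext can decrypt everything the moment the server's long-term key leaks.

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; a stand-in for real encryption in this toy.
    return bytes(d ^ k for d, k in zip(data, key))

server_longterm_key = os.urandom(16)   # the server's long-lived secret key
session_key = os.urandom(16)           # fresh random key for one connection

# What a passive eavesdropper can record off the wire:
wrapped_key = xor(session_key, server_longterm_key)  # session key locked to the server key
message = b"hello, HTTPS ..."                        # exactly 16 bytes of "traffic"
ciphertext = xor(message, session_key)

# Years later, the server's long-term key is compromised.
# The stored recording can now be fully decrypted:
recovered_session_key = xor(wrapped_key, server_longterm_key)
assert xor(ciphertext, recovered_session_key) == message
```

Nothing ephemeral is involved: every secret needed to undo the recording still exists as long as the server's key does, which is exactly the weakness forward secrecy removes.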
Forward secrecy is the way to address that threat. With forward secrecy enabled, some of the information that’s needed to decrypt those messages is ephemeral and never stored. That means that even if the secret key is compromised, only new encrypted traffic is at risk — and if the web server operator detects the attack, they can revoke the old secret key and create a new one.
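A minimal sketch of such an ephemeral exchange, assuming a toy Diffie-Hellman setup: the prime here is the Curve25519 field prime, borrowed only as a convenient large prime for `pow()` arithmetic, and the whole thing is an illustration of the idea rather than the actual ECDHE cipher suites a production server would enable.

```python
import secrets

# Toy parameters -- NOT a vetted DH group; for illustration only.
P = 2**255 - 19
G = 2

def ephemeral_keypair():
    """Generate a fresh, one-connection-only (ephemeral) DH key pair."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

# Each side makes a throwaway key pair for this session alone.
server_priv, server_pub = ephemeral_keypair()
client_priv, client_pub = ephemeral_keypair()

# Only the public halves cross the wire; both sides derive the same
# shared secret, because (g^a)^b == (g^b)^a mod P.
server_secret = pow(client_pub, server_priv, P)
client_secret = pow(server_pub, client_priv, P)
assert server_secret == client_secret

# Both sides now discard their private values. An attacker who recorded
# server_pub and client_pub -- and even later steals the server's
# long-term key -- has nothing left from which to derive the secret.
del server_priv, client_priv
```

Because the private values are generated per connection and then destroyed, a recorded transcript contains only public numbers; compromising the server's long-term key later yields nothing that decrypts old sessions.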
This technique is already in use in other cryptographic technologies. One popular example is the Off-The-Record (OTR) messaging protocol, co-developed by 2011 EFF Pioneer Award winner Ian Goldberg. Because it uses forward secrecy, instant messages exchanged using OTR can only be decrypted with a private key at the time they are received, and encrypted messages that are intercepted and stored can never again be unscrambled and read.
Other web sites have implemented HTTPS with forward secrecy before — we have it enabled by default on https://www.eff.org/ — but it hasn’t yet been rolled out on a site of Google’s scale. Some sites have publicly resisted implementing forward secrecy because it is more CPU intensive than standard HTTP or HTTPS. In order to address that problem, Google made improvements to the open source OpenSSL library, and has incorporated those changes into the library for anybody to use.
Forward secrecy is an important step forward for web privacy, and we encourage sites, big and small, to follow Google’s lead in enabling it!
1. Technically, the web server’s key pair is used to establish a new, random session key shared between the two parties, and that session key encrypts the actual data. But because the session key is protected with the server key, a compromised server key can recover the session key, which can then decrypt the data.
Earlier today, the Federal Trade Commission announced a settlement with Facebook over allegations that the social network operator deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly expanding that which is shared and made public. We are heartened to see that many of the provisions of the settlement are in alignment with the Bill of Privacy Rights for Social Network Users that EFF proposed in May 2010. Under the settlement, Facebook is:
barred from making misrepresentations about the privacy or security of consumers' personal information;
required to obtain consumers' affirmative express consent before enacting changes that override their privacy preferences;
required to prevent anyone from accessing a user's material more than 30 days after the user has deleted his or her account;
required to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services, and to protect the privacy and confidentiality of consumers' information; and
required, within 180 days, and every two years after that for the next 20 years, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the FTC order, and to ensure that the privacy of consumers' information is protected.
Many of these provisions are similar to EFF’s proposed Bill of Rights, in which we outlined three basic principles that all users should expect from a social networking service:
The right to informed decision making;
The right to control over the use and disclosure of their data; and
The right to leave a service.
It’s good to see the FTC and Facebook reach an agreement that will help uphold the rights of users in ways we’ve long recommended. As we explained in the Bill of Rights, “When the service wants to make a secondary use of the data, it must obtain explicit opt-in permission from the user.” The requirement for “affirmative express consent” adopts this opt-in procedure. Likewise, the settlement’s requirement to prevent access to deleted data helps implement a more effective right to leave. While there is more in our Bill of Privacy Rights, these are positive steps.
It remains to be seen whether the 20-year privacy audit provision will be useful. An audit alone does not ensure privacy, and the auditors will be looking at a very high-level policy view of Facebook’s practices. The real test of whether the audits are successful will be whether Facebook is able to keep out of privacy hot water.
Other social sites should use this settlement as an opportunity to closely examine their own practices when it comes to safeguarding the private data of users. And this doesn’t necessarily mean paying a third-party auditor to review one’s privacy practices. It means making sound decisions about collecting and using personal information of users to ensure their privacy expectations aren’t violated, and then holding true to those commitments when it comes to respecting their privacy choices.
By taking active steps to honor the privacy choices of users from the beginning, companies seeking to implement social features can avoid the privacy pitfalls that can lead to public relations disasters and lengthy proceedings with regulatory agencies. The EFF Bill of Privacy Rights for Social Network Users can serve as a roadmap to developing privacy practices that a responsible social network service should provide to its users.
Under Secure Communities, local law enforcement agencies have lost control over the data they collect for purely local purposes. They are required to submit fingerprints and detailed information on all individuals they arrest to the Federal Bureau of Investigation (FBI), which then sends a copy of the data to U.S. Immigration and Customs Enforcement (ICE). ICE then checks the immigration status of the individuals, and moves to deport those who do not have appropriate residency standing. Notably, individuals can be arrested, fingerprinted, and deported even if they are never convicted of a crime. For example, individuals engaged in civil disobedience at a protest rally whose charges are later dismissed, or individuals who are wrongfully arrested due to racial discrimination or false evidence, could find their fingerprint data collected and face potential deportation. In fact, ICE reports that 21% of the program’s deportees were never convicted of a crime, contrary to the due process principles that are fundamental to the American legal system.
The Fair Information Practice Principles (FIPPs) define eight principles, including:
Purpose Specification: DHS should specifically articulate the authority that permits the collection of PII and specifically articulate the purpose or purposes for which the PII is intended to be used.
Data Minimization: DHS should only collect PII that is directly relevant and necessary to accomplish the specified purpose(s) and only retain PII for as long as is necessary to fulfill the specified purpose(s).
Use Limitation: DHS should use PII solely for the purpose(s) specified in the notice. Sharing PII outside the Department should be for a purpose compatible with the purpose for which the PII was collected.
The Secure Communities Program runs counter to these principles by transferring data between agencies in ways that exceed the purpose for which the data was originally collected. In particular, fingerprint data of individuals booked into jails is obtained for the purpose of identification and checking preexisting criminal history; it is not collected to review an individual’s immigration status for possible deportation. Being booked into a jail – especially when one is not convicted of a crime – should not give the government carte blanche to share one’s personal information between government agencies. This secondary usage of the data is incompatible with the purpose for which the data was originally collected, and the transfer of data from detention facilities such as local jails to a central database within ICE violates the principles of use limitation and data minimization.
The expediency of the Secure Communities process comes at the cost of dearly held American rights to privacy and due process, and sacrificing civil liberties for such expediency in immigration enforcement creates a dangerous precedent. The Secure Communities of today may be only the first step in DHS’s efforts to expand its dragnet data collection program. While Secure Communities is currently operating with data collected from arrestees, if left unchecked this program has the potential to expand to personally identifiable information from a range of other sources.
The Secure Communities Program sets a dangerous precedent for overcollection and misuse of sensitive personally identifiable information, with ramifications for the privacy and due process rights of all Americans.
Check out EFF's letter here. And anyone local to San Francisco can learn more about the privacy and security implications of Secure Communities at the Securing Our Rights event this Thursday and Friday.
In one way, though, PIPA is much worse: while SOPA is still in the House committee stage and has been the target of extraordinary public opposition, PIPA is already out of committee and poised for consideration by the full Senate. That means PIPA is a few dangerous steps further along in the process of becoming law. And with only a few weeks to go in this legislative session, the Senate may try to rush the bill through before the public has a chance to respond.
We're not going to let that happen. Despite their efforts to push this through under the radar, folks who care about the Internet and innovation are tracking this bill and getting the word out. You can help, in an old-school and very effective way: Pick up the phone.
Right now, the best response to this threat is to let your Senator hear your voice, explaining why you as a constituent think PIPA is such a bad idea. That’s why we’ve joined with many other public interest groups, including Public Knowledge, Fight for the Future, Demand Progress and others, in asking the public to call in to the Senate.
Even if you’ve already used our action alert (and thank you), please take a few minutes now and get on the phone with your Senator’s office. Let them know that Internet censorship is unacceptable.
Here are some talking points for you to mention during the phone call:
Hello, my name is [YOUR NAME] and I am a constituent of the Senator.
I think S. 968, the PROTECT IP Act, is a bad idea, and I hope the Senator will stand against it.
PROTECT IP is overbroad, and could be used as a tool for online censorship. Further, it creates a bad precedent internationally for fragmenting the Internet.
Thank you for your consideration, and for acting against this dangerous bill.
Big content is not going to give up on the idea that the best way to protect its slow-moving business model is to ensure that it gets to dictate the pace of innovation. Let’s send a signal that the next generation of creators and innovators will not let big content decide the future of the Internet.