In March, we wrote about PlayStation 3 owners who were up in arms after Sony announced that a new firmware "upgrade" would actually disable a feature that enables users to run GNU/Linux and other operating systems on their PS3 consoles. In response, a class action lawsuit has now been filed against Sony on behalf of PS3 owners who purchased their consoles after November 16, 2006 and before March 27. The complaint alleges breach of contract, breach of the covenant of good faith and fair dealing, and unfair and deceptive business practices.
Consumers should not have to sit idly by when the devices they have purchased are retroactively downgraded without their consent. We look forward to seeing how this lawsuit turns out.
Wolfire Games is running an innovative pay-what-you-want promotion for five great indie video games with some proceeds benefiting EFF! Normally the five games would be valued at $80, but from now until Tuesday, 5/11, you can pay what you want for the entire game bundle including:
World of Goo
The games are DRM-free and work with Mac, Windows, and Linux. The coolest part is that you can choose how to divvy up your payment between the game developers, Child's Play charity for kids, and the Electronic Frontier Foundation! Have fun, feel good, and don't forget to drop some change in the EFF bucket. But wait! There's more! EFF will offer a complimentary Pioneer Membership with our top-shelf swag to the first 30 people to donate $100 or more (divided in any manner you choose) for the bundle! Check out the Humble Indie Bundle site for all the details (in both print and convenient video rap form).
We at EFF would like to offer our heartfelt thanks to Wolfire for including us, and cheers to all of the developers for their generosity and creativity. Now go get your bundle!
"Connections." It's an innocent-sounding word. But it's at the heart of some of the worst of Facebook's recent changes.
Facebook first announced Connections a few weeks ago, and EFF quickly wrote at length about the problems they created. Basically, Facebook has transformed substantial personal information — including your hometown, education, work history, interests, and activities — into "Connections." This allows far more people than ever before to see this information, regardless of whether you want them to.
Since then, our email inbox has been flooded with confused questions and reports about these changes. We've learned lots more about everyone's concerns and experiences. Drawing from this, here are six things you need to know about Connections:
1. Facebook will not let you share any of this information without using Connections. You cannot opt out of Connections. If you refuse to play ball, Facebook will remove all unlinked information from your profile.
2. Facebook will not respect your old privacy settings in this transition. For example, if you had previously sought to share your Interests with "Only Friends," Facebook will now ignore this and share your Connections with "Everyone."
3. Facebook has removed your ability to restrict its use of this information. The new privacy controls only affect your information's "Visibility," not whether it is "publicly available." Explaining what "publicly available" means, Facebook writes: "Such information may, for example, be accessed by everyone on the Internet (including people not logged into Facebook), be indexed by third party search engines, and be imported, exported, distributed, and redistributed by us and others without privacy limitations."
4. Facebook will continue to store and use your Connections even after you delete them. Just because you can't see them doesn't mean they're not there. Even after you "delete" profile information, Facebook will remember it. We've also received reports that Facebook continues to use deleted profile information to help people find you through Facebook's search engine.
5. Facebook sometimes creates a Connection when you "Like" something. That "Like" button you see all over Facebook, and now all over the web? It too can sometimes add a Connection to your profile, without you even knowing it.
6. Your posts may show up on a Connection page even if you do not opt in to the Connection. If you use the name of a Connection in a post on your wall (for example, the word "FBI"), the post may show up on that Connection's page, without you even knowing it.
You can send Facebook your comments on the new Connections here.
Updated, May 5: We changed Item #6 to clarify how Facebook uses your post.
The Washington Post reports that FCC Chairman Genachowski is considering basing the FCC's proposed net neutrality rules on precisely the legal foundation discredited in a recent court ruling:
Specifically, he is exploring a legal push under the current legal framework for broadband, which is under Title I, that would make possible the FCC’s push for a new net neutrality rule and reforms under a national broadband plan, the sources said. That could include a legal push in courts where it would assert that the FCC has the mandate from Congress to deploy broadband to all Americans in a timely manner.
This is a bad idea, no matter what your views on the wisdom of the FCC's proposed net neutrality regulations.
While we're big fans of net neutrality, we worry that the FCC may want to build its net neutrality regulations on a rotten legal foundation—"Title I ancillary authority"—which is both discredited and unbounded. As we've said before, if "ancillary jurisdiction" is enough for net neutrality regulations today (something we might like), the FCC could just as easily invoke it tomorrow for any other Internet regulation the Commission dreams up (including things we won't like, such as decency rules and copyright filtering). That's why we cheered the D.C. Circuit Court of Appeals ruling in early April 2010 that reined in the FCC's authority to punish Comcast for interfering with its subscribers' use of BitTorrent. While we were at the forefront of uncovering and condemning Comcast's behavior, we don't think that the FCC has—or should have—broad powers to regulate the Internet for any reason that strikes the Commissioners' fancy.
If you oppose the proposed FCC net neutrality regulations because you are worried about expansive federal regulation of the Internet, then you should also oppose an expansive reading of "Title I ancillary authority," because that reading would be an invitation for even more federal regulations down the road.
And if you support net neutrality regulations from the FCC, it's hard to imagine a less stable legal footing than the theory that the D.C. Court of Appeals just rejected in the Comcast ruling. There, the court found that the Commission had overstepped the limits of its "Title I ancillary authority" when it disciplined Comcast for doing exactly the sort of thing that the proposed net neutrality rules would prohibit. There is little chance future network neutrality rules could withstand a court challenge if the FCC rests on the same discredited argument that the court just rejected. In fact, following the Comcast ruling, many net neutrality advocates asked the FCC to rely on a different source of authority, Title II of the Communications Act, which applies to telecommunications "common carriers." Title II would certainly provide a more stable, and narrower, basis of authority to impose open network rules, as well as other regulations familiar to telecommunications providers.
EFF will be attending PrivacyCamp SF on Friday May 7th after the end of the Web 2.0 Expo, and we hope you will join us. The topic of the day will be Privacy and Social Networks.
This first annual PrivacyCamp in San Francisco will be a day-long user-generated "unconference" of engineers, privacy advocates, professors, lawyers, entrepreneurs and social network users that will focus on the privacy implications of social networks like Facebook, Twitter, and Google Buzz. If you will be in the Bay Area and want to engage in smart conversation with experts in tech and policy about what social networks mean for privacy and to brainstorm about how social networks can be designed to better protect privacy, register now.
What is an unconference? Well, there's no pre-planned agenda, no keynotes, no panels, and no “Q&As,” just a space to meet, discuss, debate, and share knowledge with others who are interested in a particular topic--in this case, Privacy and Social Networks. We at EFF certainly have a lot to say on that topic, and we hope you'll join us to help define the dialogue. Helping us with that will be Craig Newmark of Craigslist, who's planning to speak with conference participants before lunch, and we hope to see representatives from other Web 2.0 companies participating in the discussion as well.
Since there's no pre-planned agenda, the topics of discussion will be collaboratively defined the morning of the conference (for example, here's the agenda that was settled on for the DC PrivacyCamp, which focused on privacy and government policy). But to give you an idea of possible topics for discussion, here are a few initial ideas and questions suggested by the PrivacyCamp blog:
Privacy by Design: Where in the design process should privacy be addressed? How far have we come and in what direction are we heading? What are the biggest obstacles to designing a private network, and what are some ways to overcome them?
All Out in the Open: How can privacy exist on a public network? In an age that seemingly embraces oversharing, are privacy controls a futile exercise? What are users’ expectations and how can they be addressed?
The Money Question: Does privacy work against the very tenets of social networking monetization? Can networks emphasize privacy and still be profitable? Is it possible to compete on privacy?
Too Much Control: Are granular controls the answer to privacy? How detailed can controls get before they become too complicated? How sophisticated is the “average user” and how can sites encourage users to educate themselves about the full functionality of privacy controls?
Update Headaches: What works when you change your site’s privacy controls? What doesn’t?
What would you like to see discussed at PrivacyCamp SF? Register now to get in on the conversation. You can also participate in or follow the discussion on Twitter via @privacycampdc and hashtags #privacycamp and #privacy2010, on the PrivacyCamp Facebook page, and on the PrivacyCamp blog.
Registration is still open, and the contest is still very much up for grabs! Current first-place team Holy Handgrenades is sitting pretty at $575, with individual contestants Evan Keiser in second place with $65 and Robert Rowley in third place with $25. The pool of fabulous prizes is still within your reach!
Big thanks to Ninja Networks and Friends for sponsoring the contest and raising over $1200! Please note, however, that the team has declared itself ineligible for the prize package, leaving the contest wide open. Form a team; put a badge up on your blog; ask your friends and family -- there are lots of ways to help EFF and compete for the prizes. (See the Official Rules for full details.)
EFF is also thrilled to announce that security firms iSEC Partners and IOActive have joined us to sponsor the Defcon Getaway Contest! We're grateful for their support of the contest and EFF's Coders' Rights Project.
iSEC Partners is a proven full-service security consulting firm that provides penetration testing, secure systems development, security education, and software design verification. Its security assessments leverage extensive knowledge of current security vulnerabilities, penetration techniques, and software development best practices to help customers secure their applications against ever-present threats on the Internet.
Established in 1998, IOActive is an industry leader that offers comprehensive computer security services with specializations in smart grid technologies, software assurance, and compliance. Boasting a well-rounded and diverse clientele, IOActive works with a majority of Global 500 companies including power and utility, hardware, retail, financial, media, router, aerospace, high-tech, and software development organizations.
Stay tuned for more developments and updates regarding EFF's Defcon Getaway Contest. If you haven't already registered, what are you waiting for? Click here, and see you in Vegas!
Social networking companies don't have it easy. Advertisers covet their users' data, and in a niche that often seems to lack a clear business model, selling (or otherwise leveraging) that data is a tremendously tempting opportunity. But most users simply don't want to share as much information with marketers or other "partners" as corporations would like them to. So it's no surprise that some companies try to have it both ways.
Monday evening, after an exasperating few days trying to make sense of Facebook's bizarre new "opt-out" procedures, we asked folks on Twitter and Facebook a question:
The world needs a simple word or term that means "the act of creating deliberately confusing jargon and user-interfaces which trick your users into sharing more info about themselves than they really want to." Suggestions?
And the suggestions rolled in! Our favorites include "bait-and-click", "bait-and-phish", "dot-comfidence games", and "confuser-interface-design".
Although we didn't specifically mention Facebook in our question, by far the most popular suggestions were variations on this one from @heisenthought on Twitter:
How about "zuck"? As in: "That user-interface totally zuckered me into sharing 50 wedding photos. That kinda zucks"
Other suggestions included "Zuckermining", "Infozuckering", "Zuckerpunch" and plenty of other variations on the name of Facebook's Founder and CEO, Mark Zuckerberg. Others suggested words like "Facebooking", "Facebaiting", and "Facebunk".
It's clear why folks would associate this kind of deceptive practice with Zuckerberg. Although Zuckerberg told users back in 2007 that privacy controls are "the vector around which Facebook operates," by January 2010 he had changed his tune, saying that he wouldn't include privacy controls if he were to restart Facebook from scratch. And just a few days ago, a New York Times reporter quoted a Facebook employee as saying Zuckerberg "doesn't believe in privacy".
Despite this, we'd rather not use Zuckerberg's name as a synonym for deceptive practices. Although the popularity of the suggestion shows how personal the need for privacy has become for many Facebook users, we'd prefer to find a term that's less personal and more self-explanatory.
Security researcher Gregory Conti has studied the difference between good and "evil" interface design. As Conti describes it, a good interface is meant to help users achieve their goals as easily as possible. But an "evil" interface is meant to trick users into doing things they don't want to do. Conti's examples include aggressive pop-up ads, malware that masquerades as anti-virus software, and pre-checked checkboxes for unwanted "special offers".
The new Facebook is full of similarly deceptive interfaces. A classic is the "Show Friend List to everyone" checkbox. You may remember that when Facebook announced it would begin treating friend-lists as "publicly available information" last December, the change was met with user protests and government investigation. The objections were so strong that Facebook felt the need to take action in response. Just one problem: Facebook didn't actually want to give up any of the rights it had granted itself. The result was the obscure and impotent checkbox pictured here. It's designed to be hard to find — it's located in an unlikely area of the User Profile page, instead of in the Privacy Settings page. And it's worded to be as weak as possible — notice that the language lets a user set their friend-list's "visibility", but not whether Facebook has the right to use that information elsewhere.
A more recent example is the process introduced last week for opting out of Instant Personalization. This new feature allows select Facebook partner websites to collect and log all of your "publicly available" Facebook information any time you visit their websites. We've already documented the labyrinthine process Facebook requires users to follow to protect their data, so I won't repeat it here. Suffice it to say that sharing your data requires radically less work than protecting it.
Of course, Facebook is far from the only social networking company to use this kind of trick. Memorably, users of Gmail were surprised last February by the introduction of Google Buzz, which threatened to move private Gmail recipients into a public "frequent contacts" list. As we noted at the time, Buzz's needlessly complex "opt-out" user-interface was a big part of the problem.
OK, perhaps the word "evil" is a little strong. There's no doubt that bad user-interfaces can come from good intentions. Design is difficult, and accidents do happen. But when an accident coincidentally bolsters a company's business model at the expense of its users' rights, it begins to look suspicious. And when similar accidents happen over and over again in the same company, around the same issues, it's more than just coincidence. It's a sign something's seriously wrong.
In yesterday's post, we asserted that the REACT high-tech task force's search of Gizmodo editor Jason Chen's home and seizure of his computers and other property, as part of its investigation of that blog's reporting on the iPhone 4G prototype, was almost certainly illegal. That claim caused some to question whether the California shield law and the federal Privacy Protection Act (PPA) apply if the reporter himself is suspected of criminal activity.
Both statutory provisions likely apply here, and for good reason. The First Amendment does not excuse illegal activities, but it certainly provides safeguards to ensure that free speech interests are not trampled along the way.
Regarding the PPA, as we said in our original post, "[t]he PPA includes an exception for searches targeting criminal suspects (which Chen may or may not be), but that exception does not apply 'if the offense to which the materials relate consists of the receipt, possession, communication, or withholding of such materials or the information contained therein.'" If Chen’s property was seized under the theory that he or Gizmodo might be guilty of, say, receiving stolen property for taking possession of the iPhone about which the blog reported, even if he had reason to believe that it was stolen, then the seizure likely violated Chen’s PPA rights because the alleged crime would be one covered by the federal statute.
The California law is more stark. Penal Code section 1524(g) sets forth that "no warrants shall issue" for unpublished "notes, outtakes, photographs, tapes or other data of whatever sort" if that information was "obtained or prepared in gathering, receiving or processing of information for communication to the public." There is no statutory exception for cases in which the journalist is the one under investigation. If the California legislature had intended such an exemption, it could easily have included one, as it did in another part of the same Penal Code section 1524, subdivision (c), which prohibits search warrants targeting physicians, psychotherapists, and members of the clergy, with an explicit exception if they are "reasonably suspected of engaging or having engaged in criminal activity related to the documentary evidence for which a warrant is requested." (For a review of the respective histories of Penal Code subsections 1524(c) and (g), see PSC Geothermal Services Co. v. Superior Court, 25 Cal. App. 4th 1697, 1705 (Cal. Ct. App. 1994).)
Notwithstanding the clear language of the statute, some observers have pointed to the case of Rosato v. Superior Court, 51 Cal. App. 3d 190 (1975), arguing that it stands for the proposition that California's state shield law "wouldn't apply to subpoenas or searches for evidence of such criminal activity." The Rosato decision, however, addresses whether a constitutional right (in that case the right to receive a fair trial) could trump the Evidence Code under certain circumstances. One problem with relying on Rosato is that the reporter's privilege is now a constitutional and not merely a statutory right, having been overwhelmingly approved by voters in 1980 (after the Rosato decision). See, e.g., Liggett v. Superior Court (Gregerson), 260 Cal. Rptr. 161 (Cal. Ct. App. 1989) ("The purpose of adding the shield law to the Constitution was ostensibly to trump the reasoning of Rosato and Farr and to further insulate the shield law from judicial tampering.") (vacated on other grounds). If the reporter's privilege is to give way to a competing right, that right must be constitutional in nature, as the California Supreme Court noted in Miller v. Superior Court, 21 Cal. 4th 883, 898 (Cal. 1999):
[T]here is nothing illogical in interpreting “the people['s] ... right to due process” not to include the right to compel the press through the sanctions of contempt-incarceration and substantial fines-to supply unpublished information obtained in the newsgathering process. The fact that the assertion of this immunity might lead to the inability of the prosecution to gain access to all the evidence it desires does not mean that a prosecutor's right to due process is violated, any more than the assertion of established evidentiary privileges against the prosecution would be a violation.
A bigger problem is that Rosato had nothing to say about the warrant restrictions Penal Code section 1524(g) sets forth to ensure that police investigations involving reporters do not disturb the confidentiality of sources or other unpublished information.
Protections for journalists implicate not only the journalist's right to speak but also the public's interest in obtaining information. That is why the First Amendment protects reporters who publish truthful information, even when it was illegally gathered. See, e.g., Bartnicki v. Vopper, 532 U.S. 514, 527-28, 533-35 (2001) (First Amendment barred imposition of civil damages under wiretapping law for publishing contents of conversation relevant to matter of public concern); Smith v. Daily Mail Pub. Co., 443 U.S. 97 (1979) (First Amendment barred prosecution under state statute for publishing name of a juvenile defendant). These protections apply even when the reporter has arguably stolen commercial trade secrets or otherwise violated the law. See, e.g., Procter & Gamble Co. v. Bankers Trust Co., 78 F.3d 219 (6th Cir. 1996) (overturning an injunction preventing Business Week from publishing information about a court case even though the District Court had found that the magazine had "knowingly violated the protective order" by obtaining the documents that necessarily reflected "trade secrets or other confidential research, development or commercial information...."); CBS Inc. v. Davis, 510 U.S. 1315 (1994) (permitting broadcast of footage of a meat-packing operation obtained through "calculated misdeeds").
To be sure, if Gizmodo or Chen did break the law, the First Amendment will likely not affect their potential civil or criminal liability. (The police have not yet identified what crime was allegedly committed, who allegedly committed that crime, or what evidence supports such an allegation.) But even in instances in which a reporter may have violated the law, and could be subject to criminal or civil liability for that violation, the First Amendment still applies, as do the procedural safeguards in California law and the federal PPA. Simply put, while a court may conclude that, under particular facts and circumstances, a reporter must divulge sources or unpublished materials, or that he is liable for his misdeeds, police may not decide on their own to ignore free speech protections for journalists merely by claiming that the reporter may have committed a crime.