Facebook's "Evil Interfaces"
Social networking companies don't have it easy. Advertisers covet their users' data, and in a niche that often seems to lack a clear business model, selling (or otherwise leveraging) that data is a tremendously tempting opportunity. But most users simply don't want to share as much information with marketers or other "partners" as corporations would like them to. So it's no surprise that some companies try to have it both ways.
Monday evening, after an exasperating few days trying to make sense of Facebook's bizarre new "opt-out" procedures, we asked folks on Twitter and Facebook a question:
The world needs a simple word or term that means "the act of creating deliberately confusing jargon and user-interfaces which trick your users into sharing more info about themselves than they really want to." Suggestions?
And the suggestions rolled in! Our favorites include "bait-and-click", "bait-and-phish", "dot-comfidence games", and "confuser-interface-design".
How about "zuck"? As in: "That user-interface totally zuckered me into sharing 50 wedding photos. That kinda zucks"
Other suggestions included "Zuckermining", "Infozuckering", "Zuckerpunch", and plenty of other variations on the name of Facebook's founder and CEO, Mark Zuckerberg. Others suggested words like "Facebooking", "Facebaiting", and "Facebunk".
It's clear why folks would associate this kind of deceptive practice with Zuckerberg. Although Zuckerberg told users back in 2007 that privacy controls are "the vector around which Facebook operates," by January 2010 he had changed his tune, saying that he wouldn't include privacy controls if he were to restart Facebook from scratch. And just a few days ago, a New York Times reporter quoted a Facebook employee as saying Zuckerberg "doesn't believe in privacy".
Despite this, we'd rather not use Zuckerberg's name as a synonym for deceptive practices. Although the popularity of the suggestion shows how personal the need for privacy has become for many Facebook users, we'd prefer to find a term that's less personal and more self-explanatory.
Instead, our favorite idea came from Twitter user @volt4ire, who suggested we use the phrase "Evil Interfaces". The name refers to a talk by West Point Professor Greg Conti at the 2008 Hackers On Planet Earth conference.
Here's Conti explaining Evil Interfaces to a puppet named Weena:
As Conti describes it, a good interface is meant to help users achieve their goals as easily as possible. But an "evil" interface is meant to trick users into doing things they don't want to. Conti's examples include aggressive pop-up ads, malware that masquerades as anti-virus software, and pre-checked checkboxes for unwanted "special offers".
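Of the techniques Conti lists, the pre-checked checkbox is the easiest to demonstrate. Here's a minimal sketch (hypothetical form logic, not Facebook's actual code) of why defaults matter: a user who never touches the box "agrees" to whatever the designer checked in advance.

```javascript
// Hypothetical sketch: what a form submits when the user ignores a checkbox.
// If the user never interacts with it, the designer's default wins.
function submittedPreferences(userUncheckedBox, defaults) {
  return userUncheckedBox ? { shareWithPartners: false } : defaults;
}

const evilDefaults = { shareWithPartners: true };   // pre-checked "special offer"
const honestDefaults = { shareWithPartners: false }; // genuine opt-in

// Most users never touch the box, so the default becomes their "choice":
console.log(submittedPreferences(false, evilDefaults).shareWithPartners);   // true
console.log(submittedPreferences(false, honestDefaults).shareWithPartners); // false
```

The point of the sketch is that nothing about the user changed between the two cases; only the default did, and the default quietly decides the outcome for everyone who doesn't notice it.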
The new Facebook is full of similarly deceptive interfaces. A classic is the "Show Friend List to everyone" checkbox. You may remember that when Facebook announced it would begin treating friend-lists as "publicly available information" last December, the change was met with user protests and government investigation. The objections were so strong that Facebook felt the need to take action in response. Just one problem: Facebook didn't actually want to give up any of the rights it had granted itself. The result was the obscure and impotent checkbox pictured here. It's designed to be hard to find — it's located in an unlikely area of the User Profile page, instead of in the Privacy Settings page. And it's worded to be as weak as possible — notice that the language lets a user set their friend-list's "visibility", but not whether Facebook has the right to use that information elsewhere.
A more recent example is the process introduced last week for opting out of Instant Personalization. This new feature allows select Facebook partner websites to collect and log all of your "publicly available" Facebook information any time you visit their websites. We've already documented the labyrinthine process Facebook requires users to take to protect their data, so we won't repeat it here. Suffice it to say that sharing your data requires radically less work than protecting it.
Of course, Facebook is far from the only social networking company to use this kind of trick. Memorably, users of Gmail were surprised last February by the introduction of Google Buzz, which threatened to move private Gmail recipients into a public "frequent contacts" list. As we noted at the time, Buzz's needlessly complex "opt-out" user-interface was a big part of the problem.
OK, perhaps the word "evil" is a little strong. There's no doubt that bad user-interfaces can come from good intentions. Design is difficult, and accidents do happen. But when an accident coincidentally bolsters a company's business model at the expense of its users' rights, it begins to look suspicious. And when similar accidents happen over and over again in the same company, around the same issues, it's more than just coincidence. It's a sign something's seriously wrong.