The Federal Trade Commission is likely to announce that Facebook’s many violations of users’ privacy in recent years also violated its consent decree with the commission. In its financial filings, Facebook has indicated that it expects to be fined between $3 billion and $5 billion by the FTC. But punitive fines alone, no matter their size, are unlikely to address the overlapping privacy and competition harms at the center of Facebook’s business model. Whether or not it levies fines, the FTC should use its power to make Facebook better in meaningful ways. A new settlement with the company could compel it to change its behavior. We have some suggestions.

A $3 billion fine would be, by far, the largest privacy-related fine in the FTC’s history. The biggest to date was $22.5 million, levied against Google in 2012. But even after setting aside $3 billion to cover a potential fine, Facebook still managed to rake in $3.3 billion in profit during the first quarter of 2019. It’s rumored that Facebook will agree to create a “privacy committee” as part of this settlement. But the company needs to change its actions, not just its org chart. That’s why the settlement the FTC is negotiating now also needs to include limits on Facebook’s behavior.

Stop Third-Party Tracking

Facebook uses “Like” buttons, invisible Pixel conversion trackers, and ad code in mobile apps to track its users nearly any time they use the Internet—even when they’re off Facebook products. This program allows Facebook to build nauseatingly detailed profiles of users’—and non-users’—personal activity. Facebook’s unique ability to match third-party website activity to real-world identities also gives it a competitive advantage in both the social media and third-party ad markets. The FTC should order Facebook to stop linking data it collects outside of Facebook with user profiles inside the social network.
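
As a rough illustration of how this works (not Facebook’s actual pixel code), the sketch below shows the kind of request a third-party page fires on every visit. The pixel ID is a made-up placeholder and the endpoint and parameter names are approximations; the important part is that the request goes to facebook.com, so the browser attaches any Facebook cookies and reports which page the visitor is reading.

```typescript
// Illustrative sketch only -- not Facebook's actual tracking code.
// A publisher embeds a snippet like this; on every page view it requests a
// 1x1 image from Facebook's servers. Because the request is made to
// facebook.com, the visitor's Facebook cookies ride along, tying the visit
// on this unrelated site to a real identity.
const pixelId = "1234567890";                      // hypothetical advertiser pixel ID
const pageUrl = encodeURIComponent(window.location.href);

const tracker = new Image();                       // the classic "tracking pixel"
tracker.src = `https://www.facebook.com/tr?id=${pixelId}&ev=PageView&dl=${pageUrl}`;
```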

Don’t Merge WhatsApp, Instagram, and Facebook Data

Facebook has announced plans to build a unified chat platform so that users can send messages between WhatsApp, Messenger, and Instagram accounts seamlessly. Letting users of different services talk to each other is reasonable, and Facebook’s commitment to end-to-end encryption for the unified service is great (if it’s for real). But in order to link the services together, Facebook will likely need to merge account data from its disparate properties. This may help Facebook enrich its user profiles for ad targeting and make it harder for users to fully extricate their data from the Facebook empire should they decide to leave. Furthermore, there’s a risk that people with one set of expectations for a service like Instagram, which allows pseudonyms and does not require a phone number, will be blindsided when Facebook links their accounts to real identities. This could expose sensitive information about vulnerable people to friends, family, ex-partners, or law enforcement. In short, there are dozens of ways the great messenger union could go wrong.

Facebook promises that messaging “interoperability” will be opt-in. But corporations are fickle, and Facebook and other tech giants have repeatedly walked back past privacy commitments. The FTC should make sure Facebook stays true to its word by ordering it not to merge user data from its different properties without express opt-in consent. Furthermore, if users do decide to opt in to merging their Instagram or WhatsApp accounts with Facebook data, the FTC should make sure they retain the right to opt back out.

Stop Data Broker-Powered Ad Targeting

Last March, Facebook shut down its “Partner Categories” program, in which it purchased data from data brokers like Acxiom and Oracle in order to boost its own ad-targeting system. But over a year later, advertisers are still using data broker-provided information to target users on Facebook, and both Facebook and the data brokers are still raking in profit. That’s because Facebook allows data brokers to upload “custom audience data files”—lists of contact information, drawn from the brokers’ vast troves of personal data—which they can then charge advertisers to target. As a result, though the interface has changed, data broker-powered targeting on Facebook is alive and well.
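
To see why the change is mostly cosmetic, here is a minimal sketch of what assembling such a file might look like on the broker’s side. The normalize-then-SHA-256 step reflects how custom audience matching generally works; the variable names and sample records are invented for illustration.

```typescript
import { createHash } from "crypto";

// Illustrative sketch: a data broker converts its store of contact details
// into an audience list by normalizing and hashing each identifier. Facebook
// can then match the hashes against its own users' hashed emails and phone
// numbers -- the broker never needs direct access to Facebook's data.
function hashIdentifier(raw: string): string {
  const normalized = raw.trim().toLowerCase();     // "  Jane@Example.com" -> "jane@example.com"
  return createHash("sha256").update(normalized).digest("hex");
}

// Hypothetical broker records; real files run to millions of rows.
const brokerContacts = ["jane@example.com", "john.doe@example.net", "15555550123"];
const customAudience = brokerContacts.map(hashIdentifier);

console.log(`${customAudience.length} hashed identifiers ready to share with advertisers`);
```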

Data brokers are some of the shadiest actors in the digital marketplace. They make money by buying and selling detailed information about billions of people. And most of the people they profile don’t even know these brokers exist. The FTC should order Facebook to stop allowing data brokers to upload and share custom audiences with advertisers, and to explicitly prohibit advertisers from using data broker-provided information on Facebook. This would make Facebook a safer, less creepy place for users, and it would put a serious dent in the dirty business of buying and selling private information.

A Good Start, But Not the End

We can’t fix all of the problems with Facebook in one fell swoop. Facebook’s content moderation policies need serious work. The platform should be more interoperable and more open. We need to remove barriers to competition so that more privacy-respecting social networks can emerge. And users around the world deserve to have baseline privacy protections enshrined in law. But the FTC has a rare opportunity to tell one of the most powerful companies in the world how to make its business more privacy-protective and less exploitative for everyone. These changes would be a serious step in the right direction.