Facebook’s first reactions to the Cambridge Analytica headlines looked very different from the now-contrite promises Mark Zuckerberg made to the U.S. Congress this week. Look closer, though, and you'll see a theme running through it all. The message coming from Facebook’s leadership is not about how it has failed its users. Instead, it's about how users, and especially developers, have failed Facebook, and how Facebook needs to step in and take exclusive control over your data.

You may remember Facebook's initial response, which was to say that whatever Cambridge Analytica had gotten away with, the problem was already solved. As Paul Grewal, Facebook's deputy general counsel, wrote in that first statement, "In the past five years, we have made significant improvements in our ability to detect and prevent violations by app developers." Most significantly, he added, in 2014 Facebook "made an update to ensure that each person decides what information they want to share about themselves, including their friend list. This is just one of the many ways we give people the tools to control their experience."

By the time Zuckerberg had reached Washington, D.C., however, the emphasis was less about users controlling their experience, and more about Facebook’s responsibility to make sure those outside Facebook—namely developers and users—were doing the right thing.

A week after Grewal's statement, Zuckerberg made his first public comments about Cambridge Analytica. "I was maybe too idealistic on the side of data portability," he told Recode's Kara Swisher and Kurt Wagner. Going forward, he said, this kind of problem "[is] going to be solved by restricting the amount of data that developers can have access to."

In both versions, the fault supposedly lay with third parties, not with Facebook. But Mark Zuckerberg has made it clear that he now takes a “broader view” of Facebook's responsibility. As he said in his Tuesday testimony to Congress:

We didn't take a broad enough view of our responsibility... It's not enough to just connect people, we have to make sure that those connections are positive. It's not enough to just give people a voice, we need to make sure that people aren't using it to harm other people or to spread misinformation. And it's not enough to just give people control over their information, we need to make sure that the developers they share it with protect their information, too. Across the board, we have a responsibility to not just build tools, but to make sure that they're used for good.

By far the most substantial step Facebook has taken in this direction has been to further limit how third parties can access the data it holds for its users.

Facebook began removing access to its APIs across its platforms earlier this month, from Instagram to Facebook search.

But that move just creates a new form of collateral damage. Shutting down third-party access and APIs doesn't just mean restricting creepy data snatchers like Cambridge Analytica. APIs are how machines deliver what we read in our web browsers and in Facebook's official apps. Even there, we're only operating one hop from a bot: your web browser is an automated client, tapping into Facebook's data feeds using Facebook's own web code. Facebook's own apps are just automated programs that talk to Facebook through a larger set of interfaces than the ones it permits others to use. Even if you're just hitting "refresh" on the feed over lunch, it's all automated in the end.
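To see how thin the line between "browser" and "bot" really is, here is a minimal sketch of one of those automated conversations, written in Python. The endpoint and field syntax follow the Graph API's published conventions; the access token is a placeholder you would obtain through Facebook's login flow, and the requested fields are illustrative.

```python
import requests

ACCESS_TOKEN = "EAAB..."  # placeholder; granted via Facebook's OAuth login flow

# The same kind of request Facebook's own web client makes on your
# behalf: fetch your profile and your recent posts as structured JSON.
resp = requests.get(
    "https://graph.facebook.com/v2.12/me",
    params={
        "fields": "id,name,posts{message,created_time}",
        "access_token": ACCESS_TOKEN,
    },
)
resp.raise_for_status()
profile = resp.json()

print(profile["name"])
for post in profile.get("posts", {}).get("data", []):
    print(post.get("created_time"), post.get("message", ""))
```

Whether that JSON is rendered by Facebook's news feed code or printed by a ten-line script, the mechanics are the same.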

By locking down its APIs, Facebook is giving people less power over their information and reserving that power for itself. In effect, Facebook is narrowing and eliminating the ways that legitimate users can engage with their own data. And that means that we're set to become even more dependent on Facebook's decisions about what to do with that information.

What is worse, we've seen a wave of similar rhetoric elsewhere. Twitter announced (then backed down from) a rapid shutdown of an API that creators outside the company depended upon for implementing alternative Twitter clients.

In response to a court case brought by academics to establish their right to examine services like Facebook for discrimination, Bloomberg View columnist Noah Feldman claimed last week that permitting such automated scanning would undermine Facebook's ability to protect the privacy of its users. We'd emphatically disagree. But you can expect more columns like that if Facebook decides to defend your data by doubling down on laws like the CFAA.

Here’s a concrete example. Suppose you'd like to #deletefacebook and move to another service, and you'd like to take your friends with you. That may mean persuading them to move to a new, better service with you. Or you might just want to keep the memories of your friends and your interactions with them intact as you move on from Facebook. You wouldn't want to leave Facebook without your conversations and favorite threads. You wouldn't want to find that you could take your own photographs, but not pictures of you with your family and colleagues. You'd like all that data as takeout. That's what data portability means.

But is that personal data really yours? Not according to the new, "responsible" Facebook. Facebook's old model implied that it was: that's why the Graph API allowed you to deputize apps or third parties to examine all the data you could see yourself. Facebook's new model is that what you know about your friends is not yours to examine or extract as you'd like. Facebook's APIs now prevent you from accessing that data in an automated way. Facebook is also locking down other ways you might extract the information, like searching for email addresses or looking through old messages.
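The shift is easiest to see at the API level. As a hedged sketch, using the documented /me/friends endpoint: under Graph API v1.0, a call like the one below, authorized by you, returned your entire friend list; from v2.0 onward, the same call returns only those friends who have also installed and authorized the same app. The token is again a placeholder.

```python
import requests

ACCESS_TOKEN = "EAAB..."  # placeholder token granted by the user

# Graph API v1.0 answered this with the user's whole friend list.
# From v2.0 on, it lists only friends who also authorized this app.
resp = requests.get(
    "https://graph.facebook.com/v2.12/me/friends",
    params={"access_token": ACCESS_TOKEN},
)
resp.raise_for_status()
friends = resp.json().get("data", [])
print(f"Friends visible to this app: {len(friends)}")
```

The request you're permitted to make hasn't changed shape; what has changed is how much of your own social graph Facebook will let it see.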

As the APIs shut down, the only way to access much of this data becomes the "archive download," a huge, sometimes incomplete slab of all your data that is unwieldy for users or their tools to parse or do anything with.
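To get a feel for why that slab is so hard to work with, consider the first thing any tool has to do with it: figure out what's inside. A minimal, format-agnostic sketch in Python, assuming only that the download arrives as a zip file (the filename here is hypothetical):

```python
import zipfile
from collections import Counter
from pathlib import PurePosixPath

# Inventory the contents of the downloaded archive by file extension.
with zipfile.ZipFile("facebook-yourname.zip") as archive:  # hypothetical name
    kinds = Counter(
        PurePosixPath(name).suffix or "(none)" for name in archive.namelist()
    )

# Typically a grab-bag of HTML or JSON pages, photos, and other media,
# with no machine-readable index comparable to what the APIs offered.
for suffix, count in kinds.most_common():
    print(suffix, count)
```

Everything beyond that inventory, like reassembling a conversation thread or matching photos to friends, is left as an exercise for the departing user.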

This isn't necessarily a purely self-serving step by Facebook (though anyone who has tried to delete their account before, and had their friends' pictures deployed by Facebook in poignant pleas to remain, might well be skeptical of the company's motives). It merely represents the emerging model of the modern social network's new responsibilities as the biggest hoarder of private information on the planet.

The thinking goes: There are bad people out there, and our customers may not understand privacy options well enough to prevent this huge morass of information being misused. So we need to be the sole keepers of it. We're too big to let others fail.

But that's entirely the wrong way of seeing how we got here. The problem cannot be that people are sucking Facebook's data out of Facebook’s systems. The problem is that Facebook sucked that data out of us, subject to vague assurances that a single Silicon Valley company could defend billions of people's private information en masse. Now, to better protect it from bad people, it wants to lock us out of that data and prevent us from taking it with us.

This new model of "responsibility" fits neatly with some lawmakers' vision of how such problems might get fixed. Fake news on Facebook? Then Facebook should take more responsibility for identifying and removing false information. Privacy violations on the social media giants? Then the social media giants should police their networks to better conform with what the government thinks should be said on them. In America, that "fix thyself" rule may well look like self-regulation: you drag Mr. Zuckerberg in front of a committee and make him fix the mess he's caused. In Europe, it looks like shadow regulation such as the EU's Code of Conduct on Hate Speech, or even the GDPR. Data is to be held in giant silos by big companies, and the government will punish them if they handle it incorrectly.

In both cases, we are unwittingly hard-wiring the existence of our present-day social media giants into the infrastructure of society. Zuckerberg is the sheriff of what happens between friends and family, because he's best placed to manage that. Twitter's Jack Dorsey decides what civil discourse is, because that happens on his platform, and he is now the responsible adult. And you can't take your data out of these businesses, because smaller, newer companies can't prove their trustworthiness to Zuckerberg or Dorsey or Congress or the European Commission. And you personally certainly can't be trusted by any of them to use your own data the way you see fit, especially if you use third-party tools to do so.

The next step in this hard-wiring is to crack down on all data access "unauthorized" by the companies holding it. That means sharpening the teeth of the notorious Computer Fraud and Abuse Act, and passing more misconceived laws like Georgia's new computer crimes act. An alliance between politicians' concerns and the social media giants' interests could well make that possible. As we saw with SESTA/FOSTA, established tech companies can be persuaded to support such laws, even when they unwind the openness and freedom that let those companies prosper in the first place. That is the shift Zuckerberg is signaling with his offers to take responsibility in Washington.

But there is an alternative: we could empower Internet users, not big Internet companies, to decide what they want to do with their private data. It would mean answering some tough questions about how we get consent for data that is personal and private to more than one person (conversations, photographs, my knowledge of you and yours of me), and about how individuals can seek redress when those they trust, whether it's Facebook or a small developer, break that trust.

But beginning that long-overdue conversation now is surely going to end with a better result than making the existing business leaders of Silicon Valley the newly deputized sheriffs of our data, with shiny badges from Washington, and an impregnable new jail in which to hold it.