Do Not Track: Are Weak Protections Worse Than None At All?
Debate over nailing down the long-overdue Do Not Track (DNT) standard continued at the W3C Tracking Protection Working Group face-to-face meeting in Sunnyvale last week. Despite a less hostile tone in the room, there was no clear path forward on the core issue: ensuring that the standard provides users with enough privacy protection to justify its existence. With the group set to begin winding down in July, it is uncertain whether a consensus standard can be reached in such a short time frame, and what will happen next if one does not emerge.
What is "Do Not Track"?
Do Not Track is an optional feature that users can enable in their browsers. Once you turn it on, your browser sends a signal to websites that you do not want to be tracked. But what websites must do in response is an open question, and the subject of much debate within the W3C's standard-setting Tracking Protection Working Group, of which I am a member.
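Mechanically, the signal is a simple HTTP request header. The sketch below, a simplification for illustration, shows how a site might read a user's preference from that header; the header name "DNT" and the values "1" (do not track) and "0" (tracking permitted) come from the W3C's draft Tracking Preference Expression spec, while the plain-dict request shape and the function name are assumptions of this example:

```python
# Hypothetical sketch of a server reading the Do Not Track preference
# from incoming HTTP request headers. Per the W3C draft, "DNT: 1" means
# the user asks not to be tracked, "DNT: 0" means tracking is permitted,
# and an absent header means no preference was expressed.

def dnt_preference(headers):
    """Return 'opt-out', 'opt-in', or 'unset' based on the DNT header."""
    value = headers.get("DNT")
    if value == "1":
        return "opt-out"   # user asks not to be tracked
    if value == "0":
        return "opt-in"    # user explicitly permits tracking
    return "unset"         # no expressed preference

print(dnt_preference({"DNT": "1"}))   # a browser with DNT enabled
print(dnt_preference({}))             # a browser that never sent the header
```

The open question the working group is wrestling with is not how to send or parse this header, which is trivial, but what a site is obligated to do once it sees "opt-out".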
How Privacy Protective Is the Proposed Standard Going To Be?
We think a DNT standard must provide enough privacy protection to users to justify its existence. If the standard is too weak, users who enable DNT in their browsers will get only the illusion of privacy, which we think is worse than having no standard at all and no expectation of increased privacy. For those who haven't been following the details, the chart below provides a barometer of where we think the debate sits right now:
The users we've talked to intuitively want Do Not Track to provide meaningful limits on the collection and retention of data. From the user's perspective, sending the DNT browser signal to websites should mean: don't keep any records of my information, and collect only the bare minimum of information required to provide the service you are offering. The real world is more complicated, and this ideal is very hard to achieve once the details of how web technology functions are taken into account. EFF and other user advocates participating in the W3C have gone to great lengths to ensure that our DNT proposals are technically realistic, and have offered many compromises in building proposals that are privacy protective while remaining considerate of the engineering realities of the current web.
Although many different perspectives are represented in the working group and some participants have been friendly to privacy advocates' concerns, the predominant representation from industry -- especially the advertising industry -- has offered very little in response to these concessions, and continues to insist that "Do Not Track" actually ought to mean "Pretend Not To Track", with only superficial changes to how data is handled. While we encourage all efforts to improve the way that user data is minimized and kept secure, these measures cannot supplant meaningful limits on the collection and retention of user data, which are central to any DNT standard.
The Danger of a Standard That Is Too Weak
In order to build a solid foundation of trust between users and advertisers, we must encourage advertisers to explore robust technological solutions where advertising and user privacy can co-exist peacefully. With a weak DNT standard in hand, advertising industry players will be able to signal to legislators and regulators that they have solved the privacy problems of online tracking, while having little incentive to depart significantly from the status quo. At best, this will be a shaky and temporary compromise, unlikely to be accepted by users in the long run. The advertising industry will likely move as a bloc to implement largely superficial changes that adhere to the standard, without feeling enough pressure to do anything more. A strong but fair standard, by contrast, would force companies to compete on privacy and create incentives to explore the many viable privacy-protective advertising technologies that exist today, charting a path toward a world in which user privacy and industry profit are not at odds.