Google and the Age of Privacy Theater

Google got some good press a few weeks ago when it announced in a blog post that it would be moving forward with its plans to remove third-party cookies from the Chrome browser. The move had been announced early last year as part of the company’s Privacy Sandbox initiative, but now Google has clarified that it doesn’t intend to replace those cookies with any equivalent substitute technology. Other browsers, including Safari and Firefox, already block third-party trackers, but given that Chrome is by far the most popular browser in the world, with a market share in the 60-something percent range, the news was widely billed as a big step toward the end of letting companies target ads by tracking people around the internet. “Google plans to stop selling ads based on individuals’ browsing across multiple websites” is how The Wall Street Journal put it.

This news, however, met with a fair bit of skepticism—and not only because Google, like other tech giants, has not always honored similar commitments in the past. Even on its face, Google’s plan is hardly a sea change for privacy. It isn’t even true, when you dig into it, that Chrome will no longer allow ads based on people’s browsing habits. Google’s announcement is a classic example of what you might call privacy theater: While marketed as a step forward for consumer privacy, it does very little to change the underlying dynamics of an industry built on surveillance-based behavioral advertising.

To understand why, you have to look at what the company is actually planning. This is difficult, because there are many proposals in Google’s Privacy Sandbox, and it hasn’t confirmed which ones will be implemented, or precisely how. They are also all highly technical and leave many questions open. I spoke with several online privacy experts, people who do this for a living, and their interpretations varied. Still, the basic outlines are clear enough.

The most prominent proposal is something called Federated Learning of Cohorts, or FLoC. (It’s pronounced “flock.” All the Google proposals, somewhat charmingly, have bird-themed names.) Under this proposal, instead of letting anyone track you from site to site, Chrome will do the tracking itself. Then it will sort you into a small group, or cohort, of similar users based on common interests. When you visit a new website, in theory, advertisers won’t see you, Jane C. Doe; they’ll just see whatever cohort you belong to, say, thirtysomething unmarried white women who have an interest in Bluetooth headphones. As the blog post’s author, David Temkin, Google’s director of product management for ads privacy and trust, puts it, FLoC will allow Chrome to “hide individuals within large crowds of people with common interests.” He touts the technology as a step toward “a future where there is no need to sacrifice relevant advertising and monetization in order to deliver a private and secure experience.”
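To make the idea concrete, here is a minimal sketch, in Python, of how a browser might bucket users by browsing history. It is illustrative only: the function names, the SimHash-style voting, and the tiny eight-bit cohort space are assumptions made for the example, not Google’s actual FLoC algorithm.

```python
import hashlib

COHORT_BITS = 8  # tiny cohort space, chosen only to keep the example readable

def domain_hash(domain: str) -> int:
    """Map a domain name to a stable integer."""
    return int(hashlib.sha256(domain.encode()).hexdigest(), 16)

def toy_cohort(visited_domains) -> int:
    """SimHash-style bucketing (an assumption, not Google's real method):
    each visited domain 'votes' on each bit of the cohort number, so
    overlapping histories tend to produce cohort numbers that share bits."""
    votes = [0] * COHORT_BITS
    for domain in set(visited_domains):
        h = domain_hash(domain)
        for bit in range(COHORT_BITS):
            votes[bit] += 1 if (h >> bit) & 1 else -1
    cohort = 0
    for bit in range(COHORT_BITS):
        if votes[bit] > 0:
            cohort |= 1 << bit
    return cohort

# The browser computes the cohort locally; a website sees only the number.
alice = ["headphones.example", "running.example", "news.example"]
bob = ["headphones.example", "running.example", "cooking.example"]
print(toy_cohort(alice), toy_cohort(bob))  # overlapping histories, overlapping bits
```

The pitch, in other words, is that the browsing history never leaves the browser; only the coarse cohort number does, which is what lets Temkin describe the system as hiding individuals in a crowd.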

Privacy experts outside Google have raised questions about precisely how secure the experience will be. Writing for the Electronic Frontier Foundation, Bennett Cyphers notes that splitting users into small cohorts could actually make it easier to “fingerprint” them—using information about someone’s browser or device to create a stable identifier for that person. As Cyphers points out, fingerprinting requires pulling together enough information to distinguish one user from everyone else. If websites already know someone is a member of a small cohort, they only need to distinguish them from the rest of that cohort. Google says it will develop ways to prevent fingerprinting but has not detailed its plans.
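Cyphers’ point can be put in rough numbers. The back-of-the-envelope sketch below uses hypothetical figures, not Google’s (neither the user count nor the cohort size comes from the company), to show how much identifying information a cohort ID could hand to a fingerprinter:

```python
import math

# Hypothetical figures for illustration only; not Google's numbers.
total_users = 3_000_000_000  # browsers a fingerprinter must tell apart
cohort_size = 5_000          # users assumed to share one cohort ID

bits_to_single_out_anyone = math.log2(total_users)  # ~31.5 bits of fingerprint
bits_within_one_cohort = math.log2(cohort_size)     # ~12.3 bits of fingerprint
bits_revealed_by_cohort_id = bits_to_single_out_anyone - bits_within_one_cohort

print(f"Fingerprint bits needed without a cohort ID: {bits_to_single_out_anyone:.1f}")
print(f"Fingerprint bits needed inside one cohort:   {bits_within_one_cohort:.1f}")
print(f"Bits the cohort ID gives a fingerprinter:    {bits_revealed_by_cohort_id:.1f}")
```

Under those assumptions, the cohort number alone does most of a fingerprinter’s work, and ordinary browser details could plausibly supply the rest; that gap is the crux of Cyphers’ concern.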