What If Regulating Facebook Fails?
What about us? We are the 3 billion, after all. What if every Facebook user decided to be a better person, to think harder, to know more, to be kinder, more patient, and more tolerant? We’ve been working on improving humanity for at least 2,000 years, and it’s not going that well. There is no reason to believe, even with “media education” or “media literacy” efforts aimed at young people in a few wealthy countries, that we can count on human improvement—especially when Facebook is designed to exploit our tendency to favor the shallow, emotional, and extreme expressions that our better angels eschew.
Facebook was designed for better animals than humans. It was designed for beings that don’t hate, exploit, harass, or terrorize each other—like golden retrievers. But we humans are nasty beasts. So we have to regulate and design our technologies to correct for our weaknesses. The challenge is figuring out how.
First, we must recognize that the threat Facebook poses lies not in some marginal aspect of its products, nor even in the nature of the content it distributes. It lies in the core values Zuckerberg has embedded in every aspect of his company: a commitment to unrelenting growth and engagement. And it is enabled by the pervasive surveillance Facebook exploits to target advertisements and content.
Mostly, it’s in the overall, deleterious effect of Facebook on our ability to think collectively.
That means we can’t organize a political movement around the mere fact that Donald Trump exploited Facebook to his benefit in 2016, or that he got tossed off Facebook in 2021, or even that Facebook contributed directly to the mass expulsion and murder of the Rohingya people in Myanmar. We can’t rally people around the idea that Facebook is dominant and coercive in the online advertising market around the world. We can’t explain the nuances of Section 230 and expect any sort of consensus on what to do about it (or even on whether reforming the law would make a difference to Facebook). None of that is sufficient.
Facebook is dangerous because of the collective impact of 3 billion people being surveilled constantly, then having their social connections, cultural stimuli, and political awareness managed by predictive algorithms that are biased toward constant, increasing, immersive engagement. The problem is not that some crank or president is popular on Facebook in one corner of the world. The problem with Facebook is Facebook.
Facebook is likely to be this powerful, perhaps even more powerful, for many decades. So while we strive to live better with it (and with each other), we must all spend the next few years imagining a more radical reform program. We must strike at the root of Facebook—and, while we are at it, Google. More specifically, there is one recent regulatory intervention, modest though it is, that could serve as a good first step.
In 2018 the European Union began insisting that all companies that collect data respect certain basic rights of citizens. The resulting General Data Protection Regulation grants users some autonomy over the data that we generate, and it insists on minimal transparency when that data is used. While enforcement has been spotty, and the most visible sign of the GDPR has been extra warnings that demand we click through to accept terms, the law offers some potential to limit the power of big data vacuums like Facebook and Google. It should be studied closely, strengthened, and spread around the world. If the US Congress—and the parliaments of Canada, Australia, and India—would take citizens’ data rights more seriously than they do content regulation, there might be some hope.
Beyond the GDPR, an even more radical and useful approach would be to throttle Facebook’s (or any company’s) ability to track everything we do and say, and limit the ways it can use our data to influence our social connections and political activities. We could limit the reach and power of Facebook without infringing speech rights. We could make Facebook matter less.
Imagine if we kept our focus on how Facebook actually works and why it’s as rich and powerful as it is. If we did that, instead of flitting from one example of bad content flowing across the platform and reaching some small fraction of users to the next, we might have a chance. As Marshall McLuhan taught us 56 years ago, it’s the medium, not the message, that ultimately matters.