We’re all in the final stretch of the big contest. But I wanted to flag a column in the Post about some new peer-reviewed research on Facebook and its effect on political polarization. Unsurprisingly, the more time someone spends on Facebook, the more polarized their beliefs become. But it’s five times more polarizing for conservatives than for liberals. And that’s not even the most telling data.
The study compared the effects of Facebook usage and Reddit usage on conservatives. The gist is that when conservatives binge on Facebook, the concentration of opinion-affirming content goes up (more consistently conservative content), but on Reddit it goes down significantly. This is basically a measure of an echo chamber. And remember, these are both algorithmic, automated sites. Reddit isn’t curated by editors. It’s another social network in which user actions, both collective and individual, determine what you see. If you’ve never visited Reddit, suffice it to say it’s not for the faint of heart. There’s stuff there every bit as crazy and offensive as anything you’ll find on Facebook.
The difference is in the algorithms and what the two sites privilege in content. Read the article for the details, but the gist is that Reddit focuses more on interest areas and viewers’ subjective evaluations of quality and interestingness, whereas Facebook focuses on intensity of response. On the merits we can have different opinions about these choices. But the relevant point is that Facebook is an advertising platform whose engine is built to maximize engagement for advertising. The profit motive is at the root of all commerce. But focus on the consequences: Facebook is a polarization machine, which in most respects means it’s an extremism machine. And it’s an extremism machine because being one maximizes advertising revenue.
Facebook’s response is that we live in polarized times and that Facebook merely reflects that polarization. From a regulatory point of view, they offer the false analogy that Facebook is just a modern equivalent of the phone company. No one thinks phones are bad because people call each other to spread extremist views; the phone is an opinion-less technology that simply facilitates communication. But the analogy doesn’t hold. Facebook’s engine is designed to encourage and facilitate certain kinds of emotions and responses. Phones don’t do that. And this study joins other studies showing that Facebook is a polarization- and extremism-generating machine.
It’s not a neutral technology. This is obvious if you know the technology: its patterns are tweaked to maximize intensity of response. And this brings us to the question of externalities.
Properly pricing and assigning the costs of externalities is the critical task in all commercial regulation. Producing nuclear energy is insanely profitable if you sell the energy, take no safety precautions, and dump the radioactive waste into the local river. In other words, it’s profitable if the profits remain private and the costs are socialized. What makes nuclear energy an iffy financial proposition is the massive cost of doing otherwise. Facebook is like a scofflaw nuclear power company that makes insane profits because it runs its reactor in the open and dumps the waste in the bog behind the local high school.
Whether extremism machines like Facebook are a proper focus of regulation is a complicated question. But the facts at hand are not complicated. Facebook’s gargantuan profitability is based on algorithms that generate polarization and extremism and facilitate the spread of misinformation. The only real question is whether society at large can or should do anything about it.