These are rough days for Facebook. You don’t need me to tell you that. Here’s another article about how the Facebook algorithm was optimized to drive more provocative and emotion-laden content. Basically, it was refined to put stuff in front of you that makes you angry. When I read these articles I am reminded that most people have not really internalized how the social networks work. Even when people understand in some sense – and often in detail – how the algorithms work, they still tend to see these platforms as modern, digital versions of the town square. There have always been people saying nonsensical things, lying, unknowingly peddling inaccurate information. And our whole civic order is based on a deep skepticism about any authority’s ability to determine what’s true or accurate and what’s not. So really there’s nothing new under the sun, many people say.
But all of these points become moot when the networks – the virtual public square – are actually run by a series of computer programs designed to maximize ‘engagement’ and strong emotion for the purposes of selling advertising. I’ve been mostly off Facebook for a number of years – not for political reasons or ones tied to the critiques I’ve discussed here over the years. I just decided it wasn’t good for my mental health or feeling of well-being. I note this because there are also lots of people I’d lost track of in my life whom I got back in touch with because of Facebook. That’s a non-trivial plus to my life. So I get the plus sides. But really all these networks are running experiments that put us collectively into the role of Pavlov’s dogs.
The algorithms show you things to see what you react to, then show you more of whatever prompts an emotional response – whatever makes it harder to leave Facebook or Instagram or any of the other social networks. The article I referenced above notes that an ‘anger’ reaction got a score of 5 while a ‘like’ got 1. That sounds bad, and it kind of is bad. But if your goal is to maximize engagement, that is of course what you’d do, since anger is a far more compelling and powerful emotion than appreciation. Facebook didn’t make that so; it’s coded into our neurology. Facebook really is an extremism generating machine. It’s an inevitable part of the core engine.
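To make the mechanic concrete, here is a minimal sketch of reaction-weighted feed ranking. The only weights taken from the article are anger = 5 and like = 1; everything else – the function names, the data shape, the sample posts – is my own illustrative stand-in, not Facebook’s actual code.

```python
# Illustrative only: a toy version of engagement-weighted ranking.
# Weights for 'anger' (5) and 'like' (1) come from the article cited
# above; all other details are made up for demonstration.
REACTION_WEIGHTS = {"anger": 5, "like": 1}

def engagement_score(reactions):
    """Sum the weighted reaction counts for a single post."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in reactions.items())

def rank_feed(posts):
    """Order posts by weighted engagement, highest first."""
    return sorted(posts,
                  key=lambda post: engagement_score(post["reactions"]),
                  reverse=True)

feed = [
    {"id": "calm", "reactions": {"like": 100}},
    {"id": "enraging", "reactions": {"anger": 30, "like": 10}},
]

# The enraging post scores 30*5 + 10*1 = 160, beating the calm
# post's 100 likes despite having far fewer total reactions.
print([post["id"] for post in rank_feed(feed)])  # → ['enraging', 'calm']
```

The point of the toy example is just that once anger is weighted five times more heavily than appreciation, a post that angers a few people will reliably outrank a post that pleases many more.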
I had an exchange recently with an acquaintance who has connections with Facebook. And this person was less than pleased with my writing on the subject. I told them I totally get it. I’d be mad at me too if our roles were reversed. That reminded me to make clear in my writing on the topic that it’s not just Facebook. Or perhaps you could say it’s not even Facebook at all. It’s the mix of machine learning and the business models of all the social networks. They have real upsides. They connect us with people. They show us fun videos. But they are also inherently destructive. And somehow we have to take cognizance of that – and not just as a matter of the business decisions of one company.
Cheap movable type print also terrified people. Radio and television even more. Critics said they sowed discontent and weakened social trust. We still have endless critiques of the negative impact of modern advertising – which is of course tied to the evolution of the social networks. So we need to be cognizant of the fact that new technologies have a way of creating moral panics. As a society we tend to find ways to tame new technologies or use them in a way that removes or limits their destructive capacities. But the social networks – meaning the mix of machine learning and advertising/engagement-based business models – are really something new under the sun. They’re addiction- and extremism-generating systems. It’s what they’re designed to do.