Don’t Be Surprised About Facebook and Teen Girls. That’s What Facebook Is.

POLAND - 2020/07/15: In this photo illustration a Facebook logo is seen displayed on a smartphone with a European Union flag background. (Photo Illustration by Omar Marques/SOPA Images/LightRocket via Getty Images)

You’ve probably seen the latest controversy about Facebook/Instagram leading vulnerable teenagers toward anorexia with fat-shaming content that seems almost designed to send teenage girls, and some boys, into spirals of self-loathing and unsafe behaviors. What jumps out to me about this latest controversy is that most people still don’t grasp that things like this are close to inevitable because of what Facebook is. It’s foundational to the product. It is not surprising.

Let me explain. First, set aside all morality. Let’s say we have a 16-year-old girl who’s been doing searches about average weights, whether boys care if a girl is overweight, and maybe some diets. She’s also spent some time on a site called AmIFat.com. Now I set you this task. You’re on the other side of the Facebook screen and I want you to get her to click on as many things as possible and spend as much time clicking or reading as possible. Are you going to show her movie reviews? Funny cat videos? Homework tips? Of course not. If you’re really trying to grab her attention you’re going to show her content about really thin girls, how their thinness has gotten them the attention of boys who turn out to really love them, and more diets. If you’re clever you probably wouldn’t start with content that’s going to make this 16-year-old feel super bad about herself, because that might just get her to log off. You’ll inspire or provoke enough negative feelings to get clicks and engagement without going too far.

Now you may be saying: But Josh, I would never do that. I’m not a sociopath. But that’s irrelevant. You’re just saying you’ll refuse to participate in the experiment. We both know what you’d do if you were operating within the goals and structure of the experiment.

This is what artificial intelligence and machine learning are. Facebook is a series of algorithms and goals aimed at maximizing engagement with Facebook. That’s why it’s worth hundreds of billions of dollars. It has a vast army of computer scientists and programmers whose job it is to make that machine more efficient. The truth is we’re all teen girls and boys about some topic. Maybe the subject isn’t tied as much to depression or self-destructive behavior. Maybe you don’t have the same amount of social anxiety or depressive thoughts in the mix. But the Facebook engine is designed to scope you out, build a psychographic profile of who you are and then use data compiled from literally billions of humans to serve you content designed to maximize your engagement with Facebook.

Put in those terms, you barely have a chance.
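
To make that concrete, here is a toy sketch of what an engagement-maximizing ranker looks like in the abstract. Nothing below is Facebook’s actual code; the function names, the profile format and the scores are all made up for illustration. The point is the objective: the only question the code asks is what will keep her clicking.

```python
# Toy sketch of an engagement-maximizing ranker -- not Facebook's actual code.
# All names (predict_engagement, rank_feed, the profile format) are hypothetical.

def predict_engagement(profile: dict, post: dict) -> float:
    """Stand-in for a learned model: higher = more likely to click, linger, share."""
    # profile maps topics to affinity scores learned from past clicks,
    # searches and dwell time; unknown topics get a small default.
    return profile.get(post["topic"], 0.01)

def rank_feed(profile: dict, candidates: list[dict], k: int = 10) -> list[dict]:
    # Nothing in this objective asks whether the content is good for the user.
    return sorted(candidates,
                  key=lambda post: predict_engagement(profile, post),
                  reverse=True)[:k]

# Our hypothetical 16-year-old: the system has already learned what holds her attention.
profile = {"extreme_dieting": 0.9, "thinness_ideal": 0.8, "cat_videos": 0.1}
candidates = [
    {"topic": "cat_videos", "text": "funny cats"},
    {"topic": "extreme_dieting", "text": "how she lost 30 pounds in a month"},
    {"topic": "thinness_ideal", "text": "boys only notice thin girls"},
]
print(rank_feed(profile, candidates, k=2))  # the dieting and thinness posts win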

Of course, Facebook can come in and say, this is damaging, so we’re going to add some code that says don’t show this dieting/fat-shaming content to girls 18 and under. But the algorithms will find other vulnerabilities. Not long ago I read an article about researchers at Instagram experimenting with prompts that looked for people like our hypothetical 16-year-old above who is spiraling on damaging content and asked, ‘Is Instagram making you feel bad about yourself? Would you like to take a break from Instagram for a while?’
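
In the same toy terms, that kind of fix looks like the sketch below. Again, this is purely illustrative, with assumed topic labels and thresholds, not anything Instagram actually runs: a hand-coded exception and a prompt bolted onto a machine whose objective hasn’t changed.

```python
# Toy illustration, not real Instagram/Facebook code: a bolted-on safety rule
# sitting next to an engagement ranker. One hand-written exception suppresses
# one known-bad category for one known-vulnerable group, and a crude check
# offers an off-ramp. Everything else still optimizes engagement.

SENSITIVE_TOPICS = {"extreme_dieting", "fat_shaming"}  # assumed content labels

def apply_safety_rules(user_age: int, ranked_feed: list[dict]) -> list[dict]:
    # The special case: don't show this content to users under 18.
    if user_age < 18:
        return [post for post in ranked_feed
                if post["topic"] not in SENSITIVE_TOPICS]
    return ranked_feed

def maybe_prompt_break(minutes_on_sensitive_topics: float,
                       threshold: float = 30.0) -> str | None:
    # Rough stand-in for the researchers' experiment: if someone has been
    # spiraling on damaging content for a while, suggest logging off.
    if minutes_on_sensitive_topics > threshold:
        return ("Is Instagram making you feel bad about yourself? "
                "Would you like to take a break from Instagram for a while?")
    return None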

That’s a great thing. The algorithms can be taught to find and address an infinite number of behaviors. But really you’re asking the researchers and programmers to create an alternative set of instructions where Instagram (or Facebook, same difference) jumps in and does exactly the opposite of its core mission, which is to drive engagement. And where does that line get drawn? What if it’s just a 35-year-old man who needs to lose ten pounds and Weight Watchers wants to show him an ad? Is that okay? Maybe you’re helping him. What if I’m just spending a bit too much money on miscellaneous items I see advertised on Facebook? Do I also get a pause prompt?

Slippery slope arguments are almost all bad arguments. You make reasonable decisions on their merits. Life is about weighing factors and making decisions. Reductios ad absurdum make everything stupid. But why exactly are you creating a separate group of subroutines that yanks Facebook back when it does what it’s supposed to do particularly well? This, indeed, was how the internal dialogue at Facebook developed, as described in the article I read. Basically, other executives said: Our business is engagement, so why are we suggesting people log off for a while when they get particularly engaged?

That’s an interesting debate within Facebook. But what it makes me think about more is the conversations at tobacco companies 40 or 50 years ago. At a certain point you realize: our product is bad. If used as intended it causes lung cancer, heart disease and various other ailments in a high proportion of the people who use the product. And our business model is based on the fact that the product is chemically addictive. Our product is getting people addicted to tobacco so that they no longer really have a choice over whether to buy it. And then a high proportion of them will die because we’ve succeeded.

So what to do? The decision of all the companies, if not all individuals, was just to lie. What else are you going to do? Say we’re closing down our multi-billion dollar company because our product shouldn’t exist?

You can add filters and claim you’re not marketing to kids. But really you’re only ramping back the vast social harm marginally at best. That’s the product. It is what it is.

It can be hard to take seriously the comparison between tobacco and Facebook. We know the horrible illness and social toll of tobacco. I go on Facebook and I just see what my old high school classmates are doing, living lives of quiet desperation. Surely this can’t be equivalent. But whether or not it’s ‘as bad’ there is definitely an analogy, inasmuch as what we’re talking about here aren’t some glitches in the Facebook system. These aren’t some weird unintended consequences that can be ironed out of the product. It’s also in most cases not bad actors within Facebook. It’s what the product is. The product is getting attention and engagement against which advertising is sold. Facebook uses algorithms and machine learning to figure out how to keep you engaged with Facebook as much as possible. How good is the machine learning? Well, trial and error with between 3 and 4 billion humans makes you pretty damn good. That’s the product. It is inherently destructive, though of course the bad outcomes aren’t distributed evenly throughout the human population.

The business model is to refine this engagement engine, getting more attention and engagement and selling ads against the engagement. Facebook gets that revenue and the digital roadkill created by the product gets absorbed by the society at large. Facebook is like a spectacularly profitable nuclear energy company which is so profitable because it doesn’t build any of the big safety domes and dumps all the radioactive waste into the local river.

What’s so damaging about Facebook is that there’s no willfully bad person doing this. That’s just what the product is. And in the various articles describing internal conversations at Facebook, the shrewder executives and researchers seem to get this. For the company, if not every individual, they seem to be following the tobacco companies’ lead.

Ed. Note: TPM Reader AS wrote in to say I was conflating Facebook and Instagram and sometimes referring to one or the other in a confusing way. This is a fair criticism. And the particular controversy I am talking about here with teens is tied to Instagram. But I spoke of them as the same intentionally. In part I’m talking about Facebook’s corporate ownership. Both sites are owned and run by the same parent corporation and, as we saw during yesterday’s outage, they are deeply hardwired into each other. But the main reason I spoke of them in one breath is that they are fundamentally the same. AS points out that the issues with Instagram are distinct because Facebook has a much older demographic and Instagram is a predominantly visual medium. (Indeed, that’s why Facebook corporate is under such pressure to use Instagram to drive teen and young adult engagement.) But they are fundamentally the same: AI and machine learning to drive engagement. Same same. Just different permutations of the same dynamic.
