Nasim Aghdam’s Massacre Is Part of the Crisis of Big Tech

Yesterday afternoon Nasim Aghdam, 39, walked onto the YouTube campus in San Bruno, California, fired dozens of shots, injured four people and then killed herself. Initial reports suggested the shooter might be a disgruntled former employee or friend. Aghdam’s name has already led some to jump to the conclusion that the attack is tied to Islamic fundamentalism. But that seems pretty clearly not to be the case. Aghdam’s activism was tied to animal rights and veganism. Her extensive online trail shows that she was intensely angry at YouTube itself for “demonetizing” her YouTube channels and in other ways purportedly discriminating against her. This seems clearly to have been the motive behind her rampage. In other words, she was a disgruntled YouTube user.

All of Aghdam’s social media platform accounts have already been suspended. They were down shortly after her name became public last night. But her site remains online. Here are a couple of screen grabs of the site, both to give you some flavor of her world and to let you read some of her grudge.

Fundamentally, of course, this is the story of a disturbed individual who terrorized hundreds of people and seriously wounded four of them. She was the only fatality, though at least a couple of those four appear to have been injured severely. But it is no disrespect or trivialization to note that this tragedy is part of the broader arc of the political and tech moment, as platforms both exercise monopoly power and try to tidy up — in part due to public demands — the often raucous and ugly material that makes up a significant amount of what’s on their platforms and a non-trivial part of their revenue. This is a tragedy caused by a deranged individual. But it connects up with numerous trends and controversies in the world of digital media and social network platforms.

First, what is “demonetization” — which seems to have been the root of Aghdam’s rage at YouTube? If you haven’t heard of it, it’s a big, big deal in the various self-made video subcultures that exist on YouTube. It’s a combination of two things. One is that YouTube is so ubiquitous — so much the obvious and almost only place to self-publish video — that it no longer wants to share the advertising revenue it gets from these channels. But it goes beyond that. It’s also heavily driven by YouTube’s desire to separate itself from the more offensive and ugly content on its platform. There’s a thriving community of YouTube channels which range from hard right to white supremacist. YouTube doesn’t want to be in business with those people, helping them make money with their hate or making money from them. You probably don’t want them doing that either. So the platform has been “demonetizing” these channels for some time. But even beyond channels that maybe should be kicked off the platform, there are countless others that are offensive to one group or another, silly, prone to feuds or allegedly defaming people. For YouTube, the revenue from hosting these channels — or specifically from monetizing them — is just dwarfed by the hassle. The money is in things that are more like TV shows and music videos by big performing acts.

Advertisers always want safety. If YouTube can’t assure advertisers that their ads will never show up on a white supremacist video — or just on the weird and bizarre account of Nasim Aghdam (before her attack) — they’re not going to feel safe putting their ads there. And this doesn’t just apply to racist channels. Advertisers don’t like controversy. So it’s going to end up hitting things like Black Lives Matter or forms of radicalism on the left.

For YouTube, it is a complex combination of trying to clean up its act, the company’s changing economic strategy and reacting to its own much milder version of what Facebook is now grappling with: the negative repercussions of a generally hands-off approach to being a platform. But this has created a massive backlash among users (people who post videos and have made money from them) — especially people on the right who see what’s happening as an attack on their free speech. A lot of that is just right-wing special pleading of course. But there’s a lot of it out there.

My point is that while this is an individual tragedy by someone who was angry, unhinged and had access to a weapon — like all the other mass shootings — it is also at almost every turn connected to all the trends and collisions and turbulence roiling America’s (and the world’s) love affair with social media platforms that suddenly look all-powerful and are struggling to find a balance between being private companies and having power (and one imagines responsibility) that is more like a government’s. When you have large groups of embittered people — even more so when they are creative people or crazies who spend countless hours producing videos for YouTube — one of them is going to connect that bitterness to their own unhingedness and do something terrible. That’s what seems to have happened here.

One other point stood out to me as I was reading up on the latest news yesterday evening. As I noted above, all of Aghdam’s social media accounts — YouTube, Facebook, Instagram et al. — had been taken offline not long after her identity was published in news reports. That’s hardly surprising and it probably makes sense — certainly for YouTube. But her own website is still online, even now going on 24 hours later. Why is that? The platforms are private networks. The people who run the networks, once they decide an account should be taken offline, can do so immediately. It’s all centralized. Technically and administratively it can be dealt with very quickly. In many cases, we expect this of them.

The open web is different. It’s a highly decentralized space. Who knows where her site is hosted or who owns those servers? Her site will likely come down eventually. But there’s a decent chance the server owner doesn’t even know about the situation yet. They may not care. In many ways, that’s a good thing. It’s worrisome that a couple of executives in Silicon Valley — top people at Google (which owns YouTube) and Facebook (which also owns Instagram) — can make a whole person’s online existence disappear like that. In this case it’s a good or at least understandable thing. But the same power applies across the board.

Her disappearing from the platform world while remaining on the open web tells a story. It’s part of the larger story of the overweening power of the platforms and their struggle, as people become more skeptical, to use that power in ways that seem equitable and acceptable to the broader public. It is also part of the backlash from people who became accustomed to thinking of these platforms as something of an entitlement — platforms on which they were entitled to post their videos and collect their usually small but not necessarily trivial ad payments. Aghdam’s story is one of an angry and deranged person. But it’s very much a story of the moment, one that connects up with almost every facet of what’s happening in the tech world today.