Can Social Media Giants Win Their Fight Against The QAnon Hydra?


Twitter announced this week that it had shut down 7,000 QAnon-related accounts and that with time, it would reduce the visibility of a total 150,000 in an attempt to quash the adherents’ cyber attacks and promulgation of misinformation.

Other social media platforms are reportedly following Twitter's lead. Facebook employees told the New York Times that the platform plans to roll out a similar policy next month. YouTube told TPM that it has already been suspending or removing abusive or misleading QAnon accounts, and is getting better at keeping borderline content, like certain QAnon conspiracy theory videos, from being recommended to others. TikTok is banning prominent hashtags associated with the conspiracy theory. Reddit has blocked some QAnon subreddits in the past, but has announced no future moves.

QAnon is a patchwork conspiracy theory that was born and grew on the fringes of the internet, on message board websites like 4chan. QAnon acolytes believe that President Donald Trump, a near-Messianic figure, is being undermined by a shadowy “deep state,” a cabal of the rich and powerful bent on taking down his presidency from the inside. And Trump’s enemies, whether politicians like Hillary Clinton or wealthy individuals like George Soros, are not just bad — they’re Satanic, pedophilic, cannibalistic villains. A person who goes by “Q” online supposedly leaves breadcrumbs for followers — clues about the quiet struggle and a final judgment coming where Trump’s enemies will be tried and executed.

The social media crackdown will put the resilience of that conspiracy theory to the test. 

‘Removing The Oxygen’

Experts who study the spread of conspiracy theories told TPM that despite QAnon’s fringe origins, booting related accounts and content off Twitter is likely to go some way toward the intended effect of reducing the theory’s reach. One compared it to the fate of InfoWars’ Alex Jones, a conspiracy theorist who, among many other bizarre assertions over the years, vociferously pushed the lie that the Sandy Hook massacre was a hoax.

“One analogy is the deplatforming of Alex Jones a couple years ago by first some podcast sites deplatforming InfoWars, then Apple, then the major platforms — that did a lot to diminish his influence and reach and to delegitimize him,” Russell Muirhead, chair of Dartmouth College’s government department and co-author of “A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy,” told TPM.

Marc Tuters, professor of new media and digital culture at the University of Amsterdam, pointed to another successful deplatforming: Reddit’s shutdown of the racist subreddit r/CoonTown.

“It did lead to an overall reduction in the racist speech on the platform,” he said, adding that “the users didn’t disappear, they went elsewhere, but didn’t use as much racist language.”

It will require vigilance on Twitter’s part to enforce the ban — already, some accounts are playing with the spelling of their names or making new ones to slip through the social media giant’s net. Some seem to be relishing the fight, depicting themselves as digital martyrs.

“Nothing that says these folks can’t open new accounts and engage in new targeted harassment,” said Joanne Miller, professor of political science at the University of Delaware. 

But in terms of relegating QAnon back to the margins and stunting its growth, removing it from a platform with 330 million users is a first step, experts said. With other social media companies doing the same, it simply reduces the size of QAnon’s megaphone. 

“People who sort of affiliate with it and find it cool and fun will have less ability to access it, so won’t participate as much,” said Mark Fenster, professor at the University of Florida Law School and author of “Conspiracy Theories: Secrecy and Power in American Culture.” “It’ll be harder to find new people to get interested in it and harder to engage people who just see it and click on it and like it.”

Back To The Fringes … Or To Instagram? 

The removal of more casual QAnon fans would leave a core group of devout believers to carry on their decoding and predicting and accusing on smaller platforms in a more intense and charged atmosphere. 

Tuters said that in some cases, groups like this are “more aggrieved” when they re-form on other websites.

“It comes from an extremely toxic environment,” he said. “These image boards like 4chan and 8chan are replete with extremely hateful bigotry.” 

The conspiracy theory is also dribbling into less predictable places, spearheaded by less predictable messengers. Some lifestyle influencers on Instagram have added Q to their brand, and the conspiracy theory has gained a foothold on TikTok. “It’s started to become a trendy buzzword,” said Tuters, comparing the move to search engine optimization.

Part of the conspiracy theory’s resilience, no matter how social media platforms police it, lies in its malleability as an idea. It’s a constellation of vague but sinister accusations, encompassing so many ideas that it is as Teflon as the President supposedly at its core.

“It’s not a theory about how the world really is, it’s a mode of participation that channels people’s enthusiasm and a narrative that their side is virtuous and angelic and the other side evil,” Muirhead said. 

The problem, and the reason the social media companies are likely wary of the conspiracy theory’s spread, is that some people are certain to take it seriously. In December 2016, Edgar Maddison Welch charged into Comet Ping Pong and fired off a gun, enraged by the false rumor that Hillary Clinton and her campaign manager were running a child sex ring in the store’s nonexistent basement. The ideas of Pizzagate were neatly subsumed into the patchwork of QAnon, which came along a bit later.

“When Twitter says they’re deplatforming these accounts because the information is dangerous, they have a point,” Muirhead said. “It can be dangerous since some will believe it and act on it. That’s one of the features of this fabulous concoction — it makes the other side seem so evil as to make violence justifiable for people who do believe it.” 

Or, as Miller put it succinctly: “The Oklahoma City bombing only took one man.”

Missing The Gatekeepers

QAnon, to some degree, is a product of our technological landscape. Similar conspiracy theories may have been kept entirely out of the spotlight before, given the difficulty of amassing a large audience. That’s what makes believers’ presence on Twitter or Facebook or YouTube so powerful — they suddenly have an audience of many millions.

Moves by social media giants to crack down on the theory are “a meek and belated attempt to recreate the gatekeeping function that editors and producers used to perform,” Muirhead said. “There’s only so much space, so many column inches, so many minutes of airtime. Editors and producers decided what makes it on and what doesn’t, and their primary standard is truth.”

These social media companies have been unable or unwilling to replicate that same standard, giving fringe ideas a much louder voice. 

Twitter is specifically going after QAnon accounts that violate its standards against harassment and multiple-account usage, avoiding the squishier problem of what to do about ideas that mainstream people — “normies,” in internet parlance — simply find distasteful. The cyber abuse is also easier to spot: model and cookbook author Chrissy Teigen claimed to have blocked a million people last week after becoming the target of a QAnon swarm.

“Banning accounts that are inciting violence, being abusive and so forth — that’s a behavior that’s in everybody’s best interest to mitigate, whether it’s QAnon or someone else,” Miller told TPM. “I do think it’s trickier when we’re talking about the belief itself.”

“As a storytelling device it’s never gonna go away,” Fenster added. “Conspiracy theorizing is a practice that’s been going on for centuries — you can’t just ban it.”

This post has been updated.
