5 Types Of People Who Spread Conspiracy Theories They Know Are Wrong

Some online conspiracy-spreaders don’t even believe the lies they’re spewing.

This article is part of TPM Cafe, TPM’s home for opinion and news analysis. It was originally published at The Conversation.

There has been a lot of research on the types of people who believe conspiracy theories, and their reasons for doing so. But there’s a wrinkle: My colleagues and I have found that there are a number of people sharing conspiracies online who don’t believe their own content.

They are opportunists. These people share conspiracy theories to promote conflict, cause chaos, recruit and radicalize potential followers, make money, harass, or even just to get attention.

There are several types of this sort of conspiracy-spreader trying to influence you.

Coaxing conspiracists: the extremists

In our chapter of a new book on extremism and conspiracies, my colleagues and I discuss evidence that certain extremist groups intentionally use conspiracy theories to entice adherents. They are looking for a so-called “gateway conspiracy” that will lure someone into talking to them, leaving that person vulnerable to radicalization. They try out multiple conspiracies to see what sticks.

Research shows that people with positive feelings toward extremist groups are significantly more likely to knowingly share false content online. For instance, the disinformation-monitoring company Blackbird.AI tracked over 119 million COVID-19 conspiracy posts from May 2020, when activists were protesting pandemic restrictions and lockdowns in the United States. Of these, over 32 million tweets were rated high on the company’s manipulation index. Those posted by various extremist groups were particularly likely to carry markers of insincerity. One group, the Boogaloo Bois, generated over 610,000 tweets, of which 58% were aimed at incitement and radicalization.

You can also just take the word of the extremists themselves. When the Boogaloo Bois militia group showed up at the Jan. 6, 2021, insurrection, for example, members stated they didn’t actually endorse the stolen election conspiracy, but were there to “mess with the federal government.” Aron McKillips, a Boogaloo member arrested in 2022 as part of an FBI sting, is another example of an opportunistic conspiracist. In his own words: “I don’t believe in anything. I’m only here for the violence.”

Combative conspiracists: the disinformants

Governments love conspiracy theories. The classic example is the 1903 document known as the “Protocols of the Elders of Zion,” a forgery with which Russia constructed an enduring myth about Jewish plans for world domination. More recently, China used artificial intelligence to construct a fake conspiracy theory about the August 2023 Maui wildfire.

Often the behavior of the conspiracists gives them away. Russia eventually confessed to lying about AIDS in the 1980s, but even before admitting to the campaign, its agents had forged documents to support the conspiracy theory. Forgeries aren’t created by accident; the forgers knew they were lying.

As for other conspiracies it hawks, Russia is famous for taking both sides in any contentious issue, spreading lies online to foment conflict and polarization. People who actually believe in a conspiracy tend to stick to a side. Meanwhile, Russians knowingly deploy what one analyst has called a “fire hose of falsehoods.”

Likewise, while Chinese officials were spreading conspiracy theories in 2020 claiming the coronavirus originated in the United States, China’s National Health Commission was circulating internal reports tracing the source to a pangolin.

Chaos conspiracists: the trolls

In general, research has found that individuals with what scholars call a high “need for chaos” are more likely to indiscriminately share conspiracies, regardless of belief. These are the everyday trolls who share false content for a variety of reasons, none of which are benevolent. Dark personalities and dark motives are prevalent.

For instance, in the wake of the first assassination attempt on Donald Trump, a false accusation arose online about the identity of the shooter and his motivations. The person who first posted this claim knew he was making up a name and stealing a photo. The intent was apparently to harass the Italian sports blogger whose photo was stolen. This fake conspiracy was seen over 300,000 times on the social platform X and picked up by multiple other conspiracists eager to fill the information gap about the assassination attempt.

Commercial conspiracists: the profiteers

Often when I encounter a conspiracy theory I ask: “What does the sharer have to gain? Are they telling me this because they have an evidence-backed concern, or are they trying to sell me something?”

When researchers tracked down the 12 people primarily responsible for the vast majority of anti-vaccine conspiracies online, most of them had a financial investment in perpetuating these misleading narratives.

Some people who fall into this category might truly believe their conspiracy, but their first priority is finding a way to make money from it. For instance, conspiracist Alex Jones bragged that his fans would “buy anything.” Fox News and its on-air personality Tucker Carlson publicized lies about voter fraud in the 2020 election to keep viewers engaged, while behind-the-scenes communications revealed they did not endorse what they espoused.

Profit doesn’t just mean money. People can also profit from spreading conspiracies if doing so garners them influence or followers, or protects their reputation. Even social media companies have been reluctant to combat conspiracies because conspiracy content attracts clicks.

Common conspiracists: the attention-getters

You don’t have to be a profiteer to like some attention. Plenty of regular people share content whose veracity they doubt, or that they know to be false.

These posts are common: Friends, family and acquaintances share the latest conspiracy theory with “could this be true?” queries or “seems close enough to the truth” taglines. Their accompanying comments show that sharers are, at minimum, unsure about the truthfulness of the content, but they share nonetheless. Many share without even reading past a headline. Still others, approximately 7% to 20% of social media users, share despite knowing the content is false. Why?

Some claim to be sharing to inform people “just in case” it is true. But this sort of “sound the alarm” reason actually isn’t that common.

Often, folks are just looking for attention or other personal benefit. They don’t want to miss out on a hot-topic conversation. They want the likes and shares. They want to “stir the pot.” Or they just like the message and want to signal to others that they share a common belief system.

For frequent sharers, it just becomes a habit.

The dangers of spreading lies

Over time, the opportunists may end up convincing themselves. After all, they will eventually have to come to terms with why they are engaging in unethical and deceptive, if not destructive, behavior. They may have a rationale for why lying is good. Or they may convince themselves that they aren’t lying by claiming they thought the conspiracy was true all along.

It’s important to be cautious and not believe everything you read. These opportunists don’t even believe everything they write — and share. But they want you to. So be aware that the next time you share an unfounded conspiracy theory, online or offline, you could be helping an opportunist. They don’t buy it, so neither should you. Be aware before you share. Don’t be what these opportunists derogatorily refer to as “a useful idiot.”

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Notable Replies

  1. When it comes right down to it, aren’t they all one and the same? I mean, lying liars that lie have overtaken normal discussion for the last ten years. There’s no difference between or among them. Why give any of them an out or justification for what they do?

  2. Timely.

  3. This all makes sense to me, a relatively rational and evidence-based observer. I wonder if it would wake up a relative who is stuck on some of those theories.

    Years ago he sent out an email repeating a falsehood that Congressional pensions were a million dollars a year, or a similarly vast sum. I was not an original recipient – he knew me too well to try this on me. But a relative of the previous generation was taken in, and further transmitted it, including to me. I checked it out: completely false and unfounded.

    I asked the original sender to include me in his distribution list, so that I could help assure accuracy and combat inaccuracy. He never replied. Would having this article’s profiles help him avoid embodying one of them? I doubt it, and I appreciate his other virtues: as a man in service to his large extended family, with a loving heart and extraordinarily good cheer, too much to try him.

  4. JD Vance tried to justify “they’re eating the pets” by claiming that it was the only way to get media attention.

  5. zandru says:

    I don’t see any of these as “justifications.” In fact, this breakdown is a useful way of discrediting the source. If some relative breathlessly reports some nonsense, accompanied by a clickbox to “GIVE NOW!”, you can clearly point out that the guy was just trying to scam you out of your money.

    Or the various other varieties. They want you to lose faith in your local elections, so you won’t vote and will lose your voice. They just want you to make trouble, and maybe get arrested, while they watch and snicker. It made you click on their page, right? So they get advertiser money and more followers.

    Much more effective than just “Well, they’re lying.”
