What Will 2022 Bring In The Way Of Misinformation On Social Media? 3 Experts Weigh In

UNITED STATES - March 25: A cardboard cutout of Mark Zuckerberg, CEO of Facebook, dressed up as the QAnon Shaman, along with other cutouts of people involved in the Capitol insurrection, stands on the National Mall ahead of the House Energy and Commerce Subcommittee on Communications and Technology and the Subcommittee on Consumer Protection and Commerce joint hearing on Disinformation Nation: Social Media's Role in Promoting Extremism and Misinformation in Washington on Thursday, March 25, 2021. The cutouts were placed by the group SumOfUs in an attempt to highlight the role Facebook and other social media organizations played in the Capitol insurrection. (Photo by Caroline Brehman/CQ-Roll Call, Inc via Getty Images)

This article is part of TPM Cafe, TPM’s home for opinion and news analysis. It first appeared at The Conversation.

At the end of 2020, it seemed hard to imagine a worse year for misinformation on social media, given the intensity of the presidential election and the trauma of the COVID-19 pandemic. But 2021 proved up to the task, starting with the Jan. 6 insurrection and continuing with copious amounts of falsehoods and distortions about COVID-19 vaccines.

To get a sense of what 2022 could hold, we asked three researchers about the evolution of misinformation on social media.

Absent regulation, misinformation will get worse

Anjana Susarla, Professor of Information Systems, Michigan State University

While misinformation has always existed in media – think of the Great Moon Hoax of 1835 that claimed life had been discovered on the moon – the advent of social media has significantly increased its scope, spread and reach. Social media platforms have morphed into public information utilities that control how most people view the world, which makes the misinformation they facilitate a fundamental problem for society.

There are two primary challenges in addressing misinformation. The first is the dearth of regulatory mechanisms that address it. Mandating transparency and giving users greater access to and control over their data might go a long way in addressing the challenges of misinformation. But there’s also a need for independent audits, including tools that assess social media algorithms. These can establish how the social media platforms’ choices in curating news feeds and presenting content affect how people see information.

The second challenge is that racial and gender biases in the algorithms used by social media platforms exacerbate the misinformation problem. While social media companies have introduced mechanisms to highlight authoritative sources of information, solutions such as labeling posts as misinformation don’t solve racial and gender biases in accessing information. Highlighting relevant sources of health information, for example, may help only users with greater health literacy, not people with low health literacy, who tend to be disproportionately members of minority communities.

[Photo: Carnegie Mellon University’s Justine Cassell discusses algorithmic bias at the World Economic Forum in 2019. World Economic Forum, CC BY-NC-SA]

Another problem is the need to look systematically at where users are finding misinformation. TikTok, for example, has largely escaped government scrutiny. What’s more, misinformation targeting minorities, particularly Spanish-language content, may be far worse than misinformation targeting majority communities.

I believe the lack of independent audits, lack of transparency in fact checking and the racial and gender biases underlying algorithms used by social media platforms suggest that the need for regulatory action in 2022 is urgent and immediate.

Growing divisions and cynicism

Dam Hee Kim, Assistant Professor of Communication, University of Arizona

“Fake news” is hardly a new phenomenon, yet its costs have reached another level in recent years. Misinformation concerning COVID-19 has cost countless lives all over the world. False and misleading information about elections can shake the foundation of democracy, for instance, by making citizens lose confidence in the political system. Research I conducted with S Mo Jones-Jang and Kate Kenski on misinformation during elections, some published and some in progress, has turned up three key findings.

The first is that the use of social media, originally designed to connect people, can facilitate social disconnection. Social media has become rife with misinformation. This leads citizens who consume news on social media to become cynical not only toward established institutions such as politicians and the media, but also toward fellow voters.

Second, politicians, the media and voters have become scapegoats for the harms of “fake news.” Few of them actually produce misinformation. Most misinformation is produced by foreign entities and political fringe groups who create “fake news” for financial or ideological purposes. Yet citizens who consume misinformation on social media tend to blame politicians, the media and other voters.

The third finding is that people who care about being properly informed are not immune to misinformation. People who prefer to process, structure and understand information in a coherent and meaningful way become more politically cynical after being exposed to perceived “fake news” than people who are less politically sophisticated. These critical thinkers become frustrated by having to process so much false and misleading information. This is troubling because democracy depends on the participation of engaged and thoughtful citizens.

Looking ahead to 2022, it’s important to address this cynicism. There has been much talk about media literacy interventions, primarily to help the less politically sophisticated. In addition, it’s important to find ways to explain the status of “fake news” on social media, specifically who produces “fake news,” why some entities and groups produce it, and which Americans fall for it. This could help keep people from growing more politically cynical.

Rather than blaming each other for the harms of “fake news” produced by foreign entities and fringe groups, people need to find a way to restore confidence in each other. Blunting the effects of misinformation will help with the larger goal of overcoming societal divisions.

Propaganda by another name

Ethan Zuckerman, Associate Professor of Public Policy, Communication, and Information, UMass Amherst

I expect the idea of misinformation will shift into an idea of propaganda in 2022, as suggested by sociologist and media scholar Francesca Tripodi in her forthcoming book, “The Propagandist’s Playbook.” Most misinformation is not the result of innocent misunderstanding. It’s the product of specific campaigns to advance a political or ideological agenda.

Once you understand that Facebook and other platforms are the battlegrounds on which contemporary political campaigns are fought, you can let go of the idea that all you need are facts to correct people’s misapprehensions. What’s going on is a more complex mix of persuasion, tribal affiliation and signaling, which plays out in venues from social media to search results.

As the 2022 elections heat up, I expect platforms like Facebook will reach a breaking point on misinformation because certain lies have become political speech central to party affiliation. How do social media platforms manage when false speech is also political speech?

Anjana Susarla is a professor of information systems at Michigan State University.

Dam Hee Kim is an assistant professor of communication at the University of Arizona.

Ethan Zuckerman is an associate professor of public policy, communication, and information at UMass Amherst.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Notable Replies

  1. grack says:

    Between this topic and the “moderate” one it looks like TPM is just tossing a couple turds in the punch bowl and saying “Happy New Year!” to the commentariat.

    Please proceed, everyone! :cocktail: :champagne:

  2. “Resistance is futile”

    Locutus of Zuckerborg

  3. Metaphysically speaking, is it possible for fake news to become fake fake news?

    TFG: Bigly.

  4. I don’t FacePalm, so, as of yet, I have not yet been assimilated.

    Yet…

  5. Once you understand that Facebook and other platforms are the battlegrounds on which contemporary political campaigns are fought, you can let go of the idea that all you need are facts to correct people’s misapprehensions.

    Fux News and RW talk radio have been fighting to misinform their “base” for decades. Try using facts on someone who has the Fux logo burned into their TV screen.
