Newswise — Washington, DC—Conspiracy theories didn’t begin in the era of Twitter, Facebook, or YouTube, but they have now found a strong foothold on various social platforms. In the last few years, Americans have seen profound, real-world consequences of some of these theories—from the violent insurrection at the Capitol on January 6, 2021, to the widespread rejection of public health norms during the COVID-19 pandemic.

As conspiracy theories continue to circulate on social media, they also shape American politics and life in consequential and troubling ways. A deeper understanding of what drives users to adopt and propagate these theories is essential for efforts to curb and counter their spread.

Research on online conspiracy groups and online conspiracy talk has primarily taken a psychological approach, focusing on the dispositions of individuals. Sociologists, on the other hand, are uniquely positioned to examine the social context in which conspiratorial talk flourishes. Until now, the rapidly growing body of studies of social environments that promote conspiracy talk has emphasized the role of algorithms while largely ignoring how individuals navigate and shape their discursive and social surroundings.

In their new study, “Online Conspiracy Groups: Micro-Bloggers, Bots, and Coronavirus Conspiracy Talk on Twitter,” appearing in the December 2022 issue of the American Sociological Review, authors Henrich R. Greve of INSEAD and Hayagreeva Rao, Paul Vicinanza, and Echo Yan Zhou, all of Stanford University, seek to illuminate how social interactions shape the trajectory of online conspiracy talk.

The authors focused on COVID-19 conspiracy theories on Twitter, which allowed them to track conspiracy theories over time from very early in the pandemic. Using Twitter’s public application programming interface (API), they collected approximately 700,000 COVID-19 tweets from 8,000 users posted between January and July 2020.
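
The study’s collection pipeline is not published with this release, but a minimal sketch of keyword-based collection through Twitter’s v2 full-archive search, using the tweepy library, might look like the following. The bearer token, query terms, and field choices here are illustrative assumptions, not the authors’ actual settings.

```python
# A minimal, illustrative collection script using tweepy and Twitter's
# v2 full-archive search (requires academic-access credentials). The
# bearer token and query are placeholders, not the authors' settings.
import tweepy

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN", wait_on_rate_limit=True)

query = "(covid OR covid19 OR coronavirus) lang:en"  # assumed keywords

tweets = []
for page in tweepy.Paginator(
    client.search_all_tweets,
    query=query,
    start_time="2020-01-01T00:00:00Z",
    end_time="2020-07-31T23:59:59Z",
    tweet_fields=["author_id", "created_at", "public_metrics"],
    max_results=500,
):
    tweets.extend(page.data or [])

print(f"collected {len(tweets)} tweets")
```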

The authors then identified 13 distinct COVID-19 conspiracy theory topics in users’ tweets by applying a natural language processing method called the biterm topic model (BTM). These 13 topics fell into two logics of action: one claiming the virus was a hoax or an exaggerated threat (e.g., testing gives false positives, or hospitals are secretly empty), and another claiming it was a bioweapon spread on purpose (e.g., by Bill Gates, the Chinese, or a world-controlling cabal).
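
BTM is designed for short, sparse texts such as tweets, where conventional topic models struggle. As a rough illustration, the open-source bitermplus package (one BTM implementation, not necessarily the authors’ tooling) can fit such a model; T=13 follows the article, while the toy documents and remaining hyperparameters are illustrative.

```python
# Illustrative BTM fit with the bitermplus package; the two toy tweets
# stand in for the real corpus, and alpha/beta/iterations are examples.
import bitermplus as btm

texts = [
    "covid tests give false positives",           # toy stand-ins
    "the virus is a bioweapon made on purpose",
]

X, vocabulary, vocab_dict = btm.get_words_freqs(texts)  # term frequencies
docs_vec = btm.get_vectorized_docs(texts, vocabulary)
biterms = btm.get_biterms(docs_vec)                     # word co-occurrence pairs

model = btm.BTM(X, vocabulary, T=13, M=20, alpha=50 / 13, beta=0.01, seed=42)
model.fit(biterms, iterations=600)

p_zd = model.transform(docs_vec)      # per-tweet topic probabilities
topic_of_tweet = p_zd.argmax(axis=1)  # hard topic assignment per tweet
```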

The authors found that users first retweet gateway conspiracy theories, the less extreme and more plausible conspiracy theories, before progressing to more extreme ones. “In our data, a user’s first conspiracy theory is a gateway 40 percent of the time, as opposed to 16 percent if they had picked conspiracy theories randomly.” To distinguish human accounts from automated ones, the authors used Botometer, a Twitter bot classification algorithm, and found that bots “did not require a gateway conspiracy, and they promoted more extreme versions of the COVID-19 conspiracy.”
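
Botometer is served through a public API with an official Python client, so a hedged sketch of scoring one account is possible; the credentials and handle below are placeholders, and the 0.5 cutoff on the “complete automation probability” (CAP) score is an illustrative choice, not the paper’s classification rule.

```python
# Illustrative use of the official botometer client (served via
# RapidAPI). All credentials and the handle are placeholders; the 0.5
# CAP cutoff is an example threshold, not the paper's rule.
import botometer

twitter_app_auth = {
    "consumer_key": "KEY",
    "consumer_secret": "SECRET",
    "access_token": "TOKEN",
    "access_token_secret": "TOKEN_SECRET",
}
bom = botometer.Botometer(
    wait_on_ratelimit=True,
    rapidapi_key="RAPIDAPI_KEY",
    **twitter_app_auth,
)

result = bom.check_account("@example_user")  # hypothetical handle
# CAP = Botometer's estimated probability the account is fully automated
is_bot = result["cap"]["english"] >= 0.5
```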

In addition, people adopted more diverse conspiracy theories than bots, embracing inconsistent messages simultaneously. For example, a person might “embrace COVID-19 talk that implies the virus is an exaggerated hoax, as well as talk that characterizes COVID-19 as spread on purpose, a bioweapon unleashed by malignant actors.” Using event-history analyses, the authors show that individuals tweet new and mutually inconsistent conspiracy theories when they face the threat of rising COVID-19 case rates and when they receive attention from others via retweets.
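
The paper’s exact event-history specification is not reproduced in this release; the sketch below shows the general shape of such an analysis using the lifelines library’s Cox model for time-varying covariates, with simulated data in which a weekly case rate and retweets received raise the hazard of adopting a new conspiracy topic.

```python
# Generic event-history sketch with lifelines' time-varying Cox model.
# The panel is simulated and the column names are hypothetical; this
# does not reproduce the paper's actual specification.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for user in range(200):
    for week in range(10):
        case_rate = rng.uniform(0, 1)  # hypothetical local case rate
        retweets = rng.poisson(2)      # attention received this week
        # Toy hazard: adoption more likely under threat and attention.
        p = 0.02 + 0.10 * case_rate + 0.02 * retweets
        event = int(rng.random() < p)
        rows.append((user, week, week + 1, case_rate, retweets, event))
        if event:                      # user exits the risk set on adoption
            break

panel = pd.DataFrame(rows, columns=["user_id", "start", "stop",
                                    "case_rate", "retweets_recvd", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(panel, id_col="user_id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratios for case_rate and retweets_recvd
```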

By contrast, bots are less responsive to rising case rates, but they are more consistent—they mainly tweet about how COVID-19 was deliberately created by sinister agents. The authors observed that human beings use conspiracy theories to deny the dangerous reality of COVID-19, whereas bots are designed to create moral panic.

The authors concluded that a key implication of their research is that “traditional means of persuasion, marketing, and public relations will be ineffective against conspiracy… Rejecting the content of conspiracy theories treats the symptom, not the illness, and is unlikely to be effective.”

One potential area to target for intervention and future research is gateway conspiracies: “The descent into reality denial begins with a first step; on Twitter, users began with plausible theories of inaccurate testing and death statistics. The apparent innocuousness of gateway conspiracy theories, particularly in relation to their extreme counterparts, belies their importance. Corrective messaging attacking gateway conspiracies offers a promising pathway to combat misinformation and conspiracy theories, because individuals have yet to fully detach from reality and may be more receptive to these interventions.”

For more information and for a copy of the study, contact [email protected].

About the American Sociological Association and the American Sociological Review

The American Sociological Association, founded in 1905, is a nonprofit membership association dedicated to serving sociologists in their work, advancing sociology as a science and profession, and promoting the contributions to and use of sociology by society. The American Sociological Review is ASA's flagship journal.

Journal Link: The American Sociological Review