In the information age, when all of human knowledge rests quite literally at our fingertips, a sudden explosion in popular conspiracy theories comes as a surprise. More people are highly educated than ever before, fact-checking dubious information is easier than it ever was, and access to that information is near-universal, at least in the developed world. In such an environment, facts should prevail over entertaining fiction. But not only are conspiracy theories rising in prominence, they are also evolving with the times.
Conspiracy theories have existed for centuries. Many of them grew out of the sentiment that something about a certain event seemed too improbable, or that some unbelievably awful event was too terrible to just happen by chance. The former can be seen in conspiracy theories surrounding the moon landing, Area 51, or historical events supposedly being guided behind the scenes by some shadowy cabal. The latter can be seen in theories about the death of Princess Diana, the assassination of John F. Kennedy, or 9/11.
Political theorists Nancy Rosenblum and Russell Muirhead argue in “A Lot of People Are Saying” that a new kind of conspiracy theory has emerged, a conspiracism that no longer aims to explain big events, but relies purely on an assertion that something is afoot. This can be seen in the Pizzagate theory, which claimed that Hillary Clinton led a child trafficking ring out of a Washington D.C. pizzeria, or “birtherism,” the allegation that Obama was born in Kenya. These are mostly theories centred around the United States, but non-American examples exist, like the global theories about COVID-19 being a hoax. Traditionally, conspiracy theories have involved an elaborate game of connect-the-dots and core pieces of admittedly dubious evidence, such as the idea that “jet fuel can’t melt steel beams,” or the concept of the “magic bullet” in the JFK assassination. However, these new conspiracy theories rest on nothing except for a vague claim and people repeating it.
This is evident in a particularly relevant conspiracy theory: the one stating that the 2020 United States presidential election was rigged against Donald Trump. Several high-profile Republican politicians have called for formal investigations or even voted against the certification of the results purely on the grounds that “many Americans have questions about this election.” No direct evidence is provided, and no theory is laid out. They raise concerns about the legitimacy of the results, and when someone questions the basis of those concerns, they state with great seriousness that they are concerned because concerns have been raised. No specifics are needed. “Millions of Americans have questions about this election” has become their rallying cry, with nothing to indicate what the questions are or where they originate.
That explains how these new conspiracies justify themselves, but how do they come about? How do they spread? The answer is complicated. First, conspiracy theories tend to spring up in times of social and political uncertainty. In such times of chaos, the idea of order, even a malevolent order, brings the sense of understanding humans so desperately seek. A community of like-minded believers can also be comforting, but more importantly, the existence of such a community lends itself to confirmation bias. The self-reinforcing nature of these theories within online communities lets them flourish among niche in-groups, and allows those who lurk in these spaces to saturate themselves in information that constantly reaffirms their beliefs.
This is where the internet comes in. The way that many large websites make money has led to unique situations in which the platforms are designed to draw viewers in and send them down the rabbit hole of more and more specific content, tailored to their developing tastes. YouTube plays an especially important role in indoctrinating the young into the alt-right and related far-right movements. An analysis by the New York Times in 2019 showed that for youths, YouTube is the platform most often cited as a cause of right-wing radicalization. This can be explained by the way that content on the platform is presented and consumed. The algorithms that drive recommendations are designed, in the words of a former design ethicist, “to steer you toward Crazytown.” If a viewer likes conservative content, they will be recommended more provocative conservative content and more right-wing content in general. Up until a few months ago, these recommendations could also lead them to incredibly fringe conspiracy theorists like Alex Jones, who posits that the Sandy Hook shooting was a false flag.
Since then, YouTube has made efforts to reduce misinformation and has cut the amount of harmful content it recommends by half. Social media also amplifies other traits of the new conspiracy theories, such as their easily adaptable, mutating form. Again, these new theories are seldom about a concrete argument and more about the underlying worldview. (For example, a study reported by Psychology Today found that people who believed that Princess Diana faked her own death were more likely to also believe that she was murdered.)
This leads us to possibly the most nefarious and virulent example of the modern conspiracy theory: QAnon. This theory posits that members of the U.S. Democratic Party are secretly Satan-worshipping child eaters who drink blood to extend their lifespan (an idea not entirely dissimilar from the anti-Semitic ‘blood libel’ myth that circulated in medieval Christendom). The theory goes on to state that Donald Trump is a military-appointed saviour, chosen as the 45th President of the United States for the purpose of rooting out this supposed cannibal cabal. In an event known as “The Storm,” Trump would have all the higher-ups in the Democratic Party (as well as some Republicans, and an assortment of various political personalities) rounded up and arrested.
All the factors described above collided to help QAnon spread like wildfire over the course of four years. Facebook groups, YouTube videos, and forums dedicated to “decoding” and explaining the posts of the anonymous 8chan poster known only as “Q” allowed the curious to become believers, and the believers to become dedicated adherents through a process of continuous saturation and radicalization. Despite revolving around American politics, the influence of the theory is transatlantic. Those of you who live in The Hague-Rotterdam metropolitan area will likely find QAnon slogans spray-painted on the occasional surface or stickered onto a lamppost or an electrical box. Within the European Union, Germany has become rather infamous for its visible strand of the movement, particularly noticeable during the current lockdown protests, and before Twitter removed accounts related to the conspiracy theory, German-language translations of “Q’s” posts abounded in certain online circles.
How such a theory survives in the Biden era is uncertain. The arrests (incorrectly) predicted by the person behind the Q persona so many times never came, and Ron Watkins, the owner of the forum on which the messages are broadcast (and widely suspected to be Q himself, posting to boost his own website), has signalled that the project is over. In the past, QAnon supporters have committed violence and threatened terror in pursuit of their aims: it is unknown whether the shattering of their hopes will fuel an angry period of retaliation or a miserable disillusionment. We should hope for the latter, and we should see this as a warning sign of the capacity of these new conspiracies to mislead millions.
So, what can be done to stop these theories from reaching such a fever pitch in the future? It would seem that de-platforming works to some extent. Regardless of your beliefs regarding what constitutes free speech and interference therewith, Donald Trump’s removal from several high-profile social media platforms led to a 73% reduction in posts alleging election fraud within a week. This shows just how much misinformation spread by key figures is mirrored by other users. However, removing the worst offenders isn’t a permanent solution, and some platforms may choose to look within to solve the problem of misinformation. Algorithms like those which drive YouTube recommendations do play a part in spreading misinformation and conspiracy theories, but this can be difficult to stop when so much of that misinformation is desired and perpetuated by its audience, and when there is money to be made from it. Ultimately, how best to combat the new conspiracism is a debate that will have to be had many times over, in spheres both public and private.