
QAnon Methodology

After years of stoking tensions over the danger of the Covid pandemic and undermining support for Ukraine’s defence against the Russian invasion, QAnon followers have turned to climate change as a new focal point of attention. Protests against environmental protection, urbanism, and more sustainable farming practices have reverberated throughout Europe, culminating in the dismal showing of Green parties at the European elections in June 2024.

INTRODUCTION

In this investigation, we have delved into the anatomy of a conspiracy, following the path of seemingly local political grievances to Europe-wide virality. By analysing thousands of Telegram posts on the 15 minute cities conspiracy theory and the climate lockdown narrative, we have uncovered the following.

Key findings:

– Many radical theories spread due to the deft deployment of conventional marketing techniques: exploiting lulls in social media conversation to introduce new topics, boosting traffic by tapping into influencer networks, and diversifying revenue streams.
– Other tricks are unique to the world of QAnon: sprinkling in keywords that reference traditional QAnon lore to improve the chances that the new theory is ingested into the QAnon conspiracy theory meta-narrative.
– Alternative media provide raw materials to fuel conversation on Telegram channels.
– Despite QAnon being regarded as a fringe movement, traditional media provides the oxygen and attention needed to keep climate change conspiracy theories alive online.
– The long-term success of a local conspiracy theory hinges on breaking into the international QAnon conversation; the international attention that follows in turn fuels local advancement.

SAMPLE CONSTRUCTION

Our analysis is based on a database of QAnon content that Lighthouse Reports built together with Bellingcat. In a first step, area experts identified relevant QAnon channels in countries across Europe, including Germany, Austria, Switzerland, the Netherlands, Italy, France, and the UK (the seed channels). In a second step, an additional 1,200 channels were identified using a snowball sampling procedure, based on how frequently they were shared by seed channels, how many distinct channels reshared their content, and their relevance in different geographic contexts. In total, the database contains well in excess of 100 million posts. Here you can find a more extensive writeup of the snowball methodology.
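To make the ranking step of the snowball procedure concrete, below is a minimal sketch in Python; the file name and column names are illustrative assumptions, not the actual Bellingcat pipeline.

```python
# Minimal sketch of the snowball ranking step. Assumes scraped seed-channel
# posts are available as a CSV with "channel" and "forwarded_from" columns
# (file and column names are illustrative, not the actual pipeline).
import pandas as pd

posts = pd.read_csv("seed_posts.csv")

# Keep only posts that were forwarded from another channel
forwards = posts.dropna(subset=["forwarded_from"])

candidates = (
    forwards.groupby("forwarded_from")
    .agg(
        n_forwards=("channel", "size"),           # how often the channel was reshared
        n_distinct_seeds=("channel", "nunique"),  # by how many distinct seed channels
    )
    .sort_values(["n_distinct_seeds", "n_forwards"], ascending=False)
)

# The third criterion, geographic relevance, is not captured in this sketch.
print(candidates.head(20))
```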

All channels were scraped and preprocessed through a pipeline built by Bellingcat. The database can be queried based on unique post and channel id, date, raw content, urls, hashtags, media, links to other channels, mentions, likes, and detected language.
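As a simple illustration of what querying the database looks like in practice, the sketch below filters posts by language, date, and mentions with pandas; the storage format and column names are assumptions based on the field list above, not the exact schema.

```python
# Illustrative query: German-language posts from 2023 that mention another
# channel. File name and column names are assumptions, not the exact schema.
import pandas as pd

posts = pd.read_parquet("qanon_posts.parquet")

subset = posts[
    (posts["detected_language"] == "de")
    & (posts["date"] >= "2023-01-01")
    & (posts["mentions"].notna())
]
print(subset[["channel_id", "post_id", "date", "raw_content"]].head())
```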

Our analysis is based on two samples of Telegram posts drawn from our database. The first “climate” sample contains more than 23,000 posts across four climate denialist conspiracy theories from 2020 until June 2023: 15 minute cities, climate lockdowns, insect food, and ultra low emission zones. The latter two narratives were used to validate the trends observed when analyzing the former two. The second “monetization” sample consists of more than 140,000 posts that contain some attempt at monetization (IBANs, PayPal links, crypto addresses, crowdfunding, or Patreon links). For both samples, we followed an iterative process to build up relevant keywords by which to query the database:

1. Start with a list of keywords we gathered from media reports and other research output.
2. Translate the keywords into our five target languages: German, English, French, Italian, and Dutch.
3. Run an initial query of our database using the identified keywords. Closely analyze random subsets of the initial sample to identify terms indicative of both false positives (terms that were plausibly NOT associated with the target topic) and false negatives (terms not contained in our keyword list that were contained in posts discussing our target topic).
4. Identify alternative spellings and synonyms of our keywords and construct Regex search terms to query the full database.
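A minimal sketch of what step 4 looks like in code; the two patterns shown are simplified, illustrative examples rather than our full keyword lists.

```python
# Simplified example of the Regex queries from step 4; the real keyword lists
# were built iteratively and cover more spellings and all five languages.
import re

import pandas as pd

PATTERNS = {
    "15_minute_cities": re.compile(
        r"15[\s-]?min(ute[ns]?)?[\s-]?(city|cities|stadt|st(ä|ae)dte|stad|citt[aà]|ville)",
        re.IGNORECASE,
    ),
    "climate_lockdown": re.compile(
        r"(klima|climate|climat|clima)[\s-]?lock[\s-]?down",
        re.IGNORECASE,
    ),
}

posts = pd.read_parquet("qanon_posts.parquet")  # illustrative file name

for topic, pattern in PATTERNS.items():
    # Flag posts whose raw text matches the topic pattern
    posts[topic] = posts["raw_content"].fillna("").str.contains(pattern)

print(posts[list(PATTERNS)].sum())
```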

You can find the full sample of climate related posts here and monetization posts here. In addition, all code used for our analysis is contained in this Github repository.

SPREAD ANALYSIS

Our spread analysis is based on relatively straightforward pivot tables along a number of dimensions: we usually grouped the climate sample by individual topic area (15 minute cities, climate lockdowns, etc.) and by language. On top of that, we analyzed the number and share of posts across time, channel, and reshared channel.
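In pandas terms, these pivots amount to something like the sketch below; the file and column names are assumptions about the sample layout.

```python
# Illustrative pivot: posts per month by topic and language, plus shares.
# File and column names are assumptions about the sample layout.
import pandas as pd

climate = pd.read_csv("climate_sample.csv", parse_dates=["date"])

monthly = pd.pivot_table(
    climate,
    index=climate["date"].dt.to_period("M"),
    columns=["topic", "detected_language"],
    values="post_id",
    aggfunc="count",
    fill_value=0,
)

# Share of each topic/language combination within a given month
shares = monthly.div(monthly.sum(axis=1), axis=0)

print(monthly.tail(12))
print(shares.tail(12).round(3))
```

Grouping by the channel that originally posted versus the channel that reshared a post follows the same pattern, with the pivot dimensions swapped accordingly.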

We found that:

– Climate related conspiracy theories saw sporadic bouts of attention in 2021 and 2022, but the interest became much more sustained in 2023, driven by posts about the 15 minute cities theory.
– English and German posts far outnumber those in the other languages and dominate the results.
– Virality requires large channels and good timing: throughout 2021 and 2022, there were multiple posts referencing 15 minute cities, some based on stories by prominent alternative news sites such as Tichyseinblick. But the theory did not take off until November 2022, when various channels picked up a decision by the Oxford City Council to implement the scheme. A post by the prominent conspiracy theorist Davidavocadowolfe appears to have kickstarted the narrative’s “delocalization”, moving it from a local political development into the global spotlight.
– More generally it is clear that larger channels reach higher levels of engagement and are crucial to popularizing formerly niche theories.

CONTENT ANALYSIS

A core focus of our research was to understand the link between the alternative and legacy media ecosystems in amplifying fringe beliefs and making them go viral. To do so, we extracted all URLs from the posts in the climate sample, along with their domains. We then took the ~100 most frequently shared domains of media organizations and categorized them as either legacy or alternative media. The graph below shows the number of posts containing links to either legacy or alternative media sources by topic and time.
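Before the findings, the extraction and classification step itself can be sketched as follows; the file name, column names, and the two hand labels shown are illustrative.

```python
# Illustrative domain extraction and legacy/alternative labelling.
# File name, column names, and the tiny label excerpt are assumptions.
from urllib.parse import urlparse

import pandas as pd

climate = pd.read_csv("climate_sample.csv")

def extract_domain(url: str) -> str:
    netloc = urlparse(url).netloc.lower()
    return netloc.removeprefix("www.")

# Assumes one URL per row; posts with several links would be exploded first
urls = climate.dropna(subset=["url"]).copy()
urls["domain"] = urls["url"].apply(extract_domain)

# Top domains, to be categorized as legacy or alternative media
top_domains = urls["domain"].value_counts().head(100)
print(top_domains.head(10))

MEDIA_TYPE = {  # illustrative excerpt of the categorization
    "bbc.co.uk": "legacy",
    "tichyseinblick.de": "alternative",
}
urls["media_type"] = urls["domain"].map(MEDIA_TYPE)
print(urls.groupby(["topic", "media_type"]).size())
```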

Key findings:

– Both legacy and alternative media play an important role in spreading climate denialist narratives.
– Mainstream media played a more important role in spreading the climate lockdown narrative than the 15 minute cities narrative.
– There is no strong evidence that theories only go viral if they get picked up by legacy media outlets. Instead, it appears that legacy and alternative media symbiotically promote fringe beliefs.

[Figure: number of posts containing links to legacy vs. alternative media sources, by topic over time]

A common feature of narratives that go viral on QAnon Telegram channels is “delocalization”, the process by which local disputes are connected to long-held grievances in the QAnon community. We had already observed this phenomenon in an investigation into nitrogen emissions and the Dutch farmers’ protests in 2022, and we observed it again with the climate related conspiracy theories investigated for this project.

We used a number of natural language processing techniques to uncover latent topics connected to our narratives of interest. We applied part-of-speech tagging to identify the most common nouns in each topic and language, and did the same for named entities. On Github, you can find the lists of the most commonly found nouns and named entities. In addition, we used LDA and STM topic modelling to group posts into four distinct categories. Key terms indicative of the categories in both models can also be found on Github.
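As a rough illustration, the noun and entity extraction plus a basic LDA run could look like the sketch below, shown for German only; model, file, and column names are illustrative, and the STM model is not shown.

```python
# Rough sketch of the part-of-speech, named-entity, and LDA steps for German
# posts only. Model, file, and column names are illustrative; STM not shown.
from collections import Counter

import pandas as pd
import spacy
from gensim import corpora
from gensim.models import LdaModel

nlp = spacy.load("de_core_news_sm")

climate = pd.read_csv("climate_sample.csv")
texts = climate.loc[climate["detected_language"] == "de", "raw_content"].fillna("").tolist()

noun_counts: Counter = Counter()
entity_counts: Counter = Counter()
tokenised = []

for doc in nlp.pipe(texts):
    nouns = [t.lemma_.lower() for t in doc if t.pos_ == "NOUN" and not t.is_stop]
    noun_counts.update(nouns)
    entity_counts.update(ent.text for ent in doc.ents)
    tokenised.append(nouns)

print(noun_counts.most_common(25))
print(entity_counts.most_common(25))

# LDA with four topics on the noun tokens
dictionary = corpora.Dictionary(tokenised)
corpus = [dictionary.doc2bow(tokens) for tokens in tokenised]
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=4, passes=10, random_state=0)
for topic_id, terms in lda.print_topics(num_words=10):
    print(topic_id, terms)
```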

Based on the noun frequencies, named entity recognition, and topic modelling, we identified salient characters who drove our ground reporting, e.g., AfD MEP Christine Anderson. In addition, we qualitatively analyzed the lists of key terms to understand which pre-existing narratives the climate denialist narratives were linked to. We found that elements of the New World Order conspiracy theory, which roughly holds that the world is controlled by a shadowy cabal of self-interested elites such as the WEF and its Executive Chairman Klaus Schwab, Bill Gates, or George Soros, were particularly common. In addition, frequent links were made to the Covid-19 pandemic, even where this was wildly out of context, and references to a freedom/imprisonment dichotomy were a commonly employed rhetorical strategy.

Key findings:

– The freedom/prison framing was more common in posts on the 15 minute cities narrative, even though its connection there is far more tenuous than for the climate lockdown narrative.
– In German, Covid terms were frequently mentioned in posts on both the 15 minute cities and climate lockdown narratives throughout their lifecycles. Again, this is particularly surprising for 15 minute cities, since the narrative became prominent long after the pandemic had receded from public attention.
– The virality of a conspiracy theory is correlated with connections to the New World Order and other staples of QAnon thought. However, we do not see strong evidence that such a connection precedes virality or that it is made only once a theory has gone viral.

We also ran the posts through a toxicity classifier to understand whether certain types of narratives correlate with more toxic content, but we were unable to detect clear trends.
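As a minimal sketch of such a scoring step, the off-the-shelf multilingual Detoxify model is used below purely as an illustrative stand-in; file and column names are again assumptions about the sample layout.

```python
# Illustrative toxicity scoring with the multilingual Detoxify model (an
# illustrative stand-in); file and column names are assumptions.
import pandas as pd
from detoxify import Detoxify

climate = pd.read_csv("climate_sample.csv")
model = Detoxify("multilingual")

# Score a small subset to keep memory use manageable
subset = climate.head(200).copy()
scores = model.predict(subset["raw_content"].fillna("").tolist())
subset["toxicity"] = scores["toxicity"]

# Compare average toxicity across narratives
print(subset.groupby("topic")["toxicity"].mean())
```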

MONETIZATION

Inspired by a 2023 report by the research institute CeMAS, we looked at our database to understand how QAnon channels try to monetize their content. Even with relatively simple search terms, we found more than 140,000 posts with content indicative of monetization: PayPal links, crypto wallet addresses, IBANs, etc.
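The search terms involved are simple patterns along the lines of the sketch below; these are simplified, illustrative versions rather than the actual queries.

```python
# Simplified, illustrative versions of monetization search patterns; the
# actual queries were broader and also covered e.g. Amazon and other platforms.
import re

MONETIZATION_PATTERNS = {
    "paypal": re.compile(r"paypal\.(me|com)/[\w./-]+", re.IGNORECASE),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
    "bitcoin": re.compile(r"\b(bc1[a-z0-9]{20,60}|[13][a-km-zA-HJ-NP-Z1-9]{25,34})\b"),
    "patreon": re.compile(r"patreon\.com/[\w./-]+", re.IGNORECASE),
    "crowdfunding": re.compile(r"(gofundme|kickstarter)\.com", re.IGNORECASE),
}

def monetization_types(text: str) -> list[str]:
    """Return which monetization patterns (if any) a post matches."""
    return [name for name, pattern in MONETIZATION_PATTERNS.items() if pattern.search(text)]

# Dummy post text; the IBAN is a made-up example, not one taken from the data
print(monetization_types(
    "Support our work: https://paypal.me/examplechannel or IBAN DE44500105175407324931"
))
```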

In addition to analyzing the temporal and linguistic spread of monetization, we also tried to estimate how much money the channels may be making by systematically analyzing revenue through cryptocurrencies, crowdfunding campaigns, and PayPal Moneypools. Unfortunately, for posts that simply link to a PayPal account or share an IBAN, it is not possible to find out how much money is being made through either of those means.

Key findings:

– The sheer scale of monetization is vast: in our languages of interest (German, Dutch, English, French, and Italian), we found more than 64,000 posts with PayPal links, 29,000 posts with crypto wallet addresses, 27,000 posts containing IBANs, 11,000 posts with links to Amazon products, and 4,000 links to crowdfunding pages. This is almost certainly an undercount of the true extent of monetization, given our relatively basic search query.
– IBANs are most popular in German-speaking posts. The banks used are primarily located in Germany, Switzerland, Austria and, interestingly, Lithuania (see the sketch after this list). In total, we have more than 600 unique IBANs in the database. When we look at the types of banks used, beyond their geographic location, we can see that fintech companies (N26, Revolut) and semi-public banks (Sparkassen, Postbank) are by far the most popular, findings that correspond with the CeMAS report.
– PayPal is by far the most common monetization strategy in our dataset (more than 64,000 posts and 1,455 unique URLs). PayPal is popular across languages, but German posts once again dominate. It is not possible to see transaction values for regular PayPal accounts, but until late 2021 there was a feature called Moneypools whose transactions were public. Using the Internet Archive, we found several Moneypools in our database with very large transaction totals, in the tens or even hundreds of thousands of euros, clearly showing the economic utility of PayPal for Q influencers.
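The country breakdown in the IBAN finding above falls straight out of the IBAN format, whose first two characters are an ISO country code. A minimal sketch, using dummy IBANs rather than ones from our data:

```python
# Grouping IBANs by issuing country via the two-letter ISO prefix.
# The IBANs below are dummy examples, not ones taken from the data.
from collections import Counter

ibans = [
    "DE44500105175407324931",  # Germany
    "AT611904300234573201",    # Austria
    "CH9300762011623852957",   # Switzerland
    "LT121000011101001000",    # Lithuania
]

country_counts = Counter(iban[:2] for iban in ibans)
print(country_counts.most_common())

# Identifying the individual bank additionally requires looking up the
# national bank code that follows the two check digits.
```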

LIMITATIONS

The main limitation of our analysis is the construction of the database itself. While we are relatively confident that most important channels are contained in it, it is not a truly representative subsample of QAnon activity on Telegram or other alternative social media platforms. This leads to three important limitations. First, comparisons between countries should be interpreted cautiously, because there is a high likelihood that differences in the number of channels between countries are at least partially explained by our database construction rather than by actual differences in activity across countries. Second, the interpretation of changes over time needs to be similarly caveated: we constructed the list of channels in 2022, which means that channels that were active earlier but have since been deleted would not be contained in the data. Similarly, we would not have captured channels created after 2022. Finally, it is possible, but somewhat unlikely, that there are ‘isolated’ networks of channels not connected to the channels we investigated and thus not captured through our snowballing approach. So we cannot exclude the possibility that our analysis has blind spots.

The other important limitation of our analysis is that we did not remove reshared posts, except in the content analysis. This weights our results towards content that has gone viral, which is intentional, but it should be kept in mind when interpreting the findings.