
Addressing Attribution: Theorizing a Model to Identify Russian Disinformation Campaigns Online


Image credit: Twitter/CGAI

POLICY PERSPECTIVE

by Tom Robertson and Teah Pelechaty
November 2022





Introduction

The weaponization of disinformation as practised by Russia has gained substantial recognition in recent years. A historical constant for over a century, these activities returned to prominence after Russia’s 2014 annexation of Crimea, and more so still in light of the 2016 U.S. election. Today, they are among the most important components of Russia’s hybrid warfare toolbox in relation to the West and Russia’s near-abroad.1 Their success rests on the difficulty of proving a falsehood, a difficulty due largely to Russia’s ability to hide its role as originator, particularly in cyberspace.2 This paper confronts the problem by theorizing a framework through which attribution may be achieved.

We begin by mapping key waypoints in Russian disinformation, from its origins in Soviet-era counterintelligence doctrine to early post-Soviet usage in the First and Second Chechen Wars, its retooling in Georgia and its at-scale deployment in Ukraine. We frame this review with three useful first principles, or characteristics, of disinformation:

  • Truth seeding, which we define as splicing, or combining, facts with fiction to legitimize disinformation and confuse target audiences;
  • Network effects, the deliberate repetition of chosen narratives through multiple, seemingly distinct conduits to increase disinformation’s traction in a given community;
  • Chaotic ambiguity, or the spreading of multiple, often contradictory narratives to obfuscate reality and draw focus away from the underlying truth.

We also call attention to an inherent trait of Russian disinformation: anonymity. While anonymity contributes to the problem of attribution, it is also paradoxically a limitation because it substantially mitigates the power of a given narrative.

Next, we discuss the challenges of uncovering present-day disinformation campaigns, given that they reside on, and exploit, social media platforms that help obfuscate truth. To meet these challenges, we propose a framework that draws on big-data analytics. It is anchored in the need to gather and store large amounts of data (the sheer volume of social media data makes this first task critical), followed by a refining or filtering step that structures data uniformly so that the identified first principles can be uncovered. Finally, we emphasize the importance of a mechanism to query data frequently, given the exponential rate at which new data appear on social media.

We conclude by suggesting next steps to validate the proposed model; namely, using historical datasets of one of the world’s largest social media platforms, Twitter, to back-test a known disinformation campaign. TweetsKB, a resource description framework (RDF) corpus containing over two billion tweets spanning February 2013 to December 2020, is an open-source intelligence (OSINT) tool ideally suited to the task, though there are others.3



Historical Overview

Any discussion of Russian disinformation campaigns must begin with a nod to their genesis in the socio-political institutions that arose after the Russian Revolution. As communist philosophies were translated from ideologies to day-to-day behaviours, the security services almost immediately began to make use of disinformation as a critical component of their offensive tactics and even their over-arching strategies.4

We mark the establishment of the Soviet secret police, the Cheka, in 1917 as the beginning of Russia’s near-fetishization of disinformation. The Cheka’s earliest mandate was to disseminate rumours denouncing the remnants of the Russian aristocracy. By the time its successor, the State Political Directorate (GPU), was formed in the 1920s, its many roles included activities dedicated to weaving intrigue and subterfuge into the fabric of Russian life.5 Its earliest field reports show evidence of a trial-and-error approach to codifying such activity in state organs, including the Orthodox Church.6

Anecdotally, folklore suggests that Joseph Stalin coined the term “disinformation” (dezinformatsiya) in 1923 while establishing the propaganda arm of the GPU, imbuing it with what was thought to be a French-sounding name in an attempt to reframe it as a “French capitalist tool” used to target the Soviet Union.7

Truth Seeding

Among the Soviet disinformation campaigns in those early years was Operation Trust, a counterintelligence gambit that for nearly a decade lured Russian capitalists, monarchists and other “counter-revolutionaries” into government hands via fraudulent resistance movements.8 The creation of the first “dedicated [Soviet] disinformation unit” has been attributed to the Trust.9 The genius of the operation was the GPU’s ability to knit enough genuine anti-communist individuals and activity with their own agents and narratives to ensnare unsuspecting individuals and deceive opponents.10 Operation Trust embodies truth seeding, a first principle of disinformation tradecraft that combines elements of truth with those of deceit to legitimize a narrative and confuse target audiences.

The use of a kernel of truth to propel false narratives has been a mainstay of Russian disinformation ever since. A recent example is the 2016 Pizzagate conspiracy, wherein Russian actors were found to have propagated the false claim that eminent U.S. Democratic Party officials were running a human trafficking and child sex ring out of a pizza restaurant.11 Real, albeit falsely presented, emails between one of the allegedly involved politicians and the owners of a pizzeria were put forward as “evidence” to bolster these claims.12 This particular theory lingers; in 2020, musician Justin Bieber was briefly the subject of an online maelstrom after conspiracy theorists interpreted a gesture he made on social media as a signal that he had been one of Pizzagate’s adolescent victims.13

Chaotic Ambiguity

Chaotic ambiguity, a second defining characteristic of Russian disinformation, is the spreading of multiple, often contradictory narratives to obfuscate and draw focus away from the underlying truth. The Soviet Union’s response to the 1983 downing of the civilian airliner, Korean Airlines Flight 007, is perhaps the best example of this tactic. The airliner, which had unintentionally strayed into Soviet airspace, was shot down near Sakhalin Island by a Sukhoi Su-15 fighter, resulting in the deaths of all 269 people aboard and the worsening of U.S.-Soviet relations.14

The initial Soviet response was to deny responsibility.15 Under the weight of evidence, it eventually admitted culpability, albeit with the assertion that the airliner had been intentionally targeted because it had allegedly been spying on behalf of the United States.16 This admission was followed by the suppression of evidence from the crash, as well as by conflicting narratives and conspiracy theories surrounding the event. As late as 1996, the Soviet pilot responsible for the attack maintained that the aircraft had been “disguised as a reconnaissance plane.”17

The downing of Korean Airlines Flight 007 also embodied the related idea of weaponized relativism, or the notion that there are “multiple interpretations of the truth” – a technique that, with denial and diversion, is a hallmark trait of Russian disinformation.18 Recent examples include the multiple, often contradictory narratives surrounding the 2014 downing of Flight MH17,19 the 2015 assassination of Boris Nemtsov,20 the 2018 Salisbury poisonings21 and the 2020 poisoning of Alexei Navalny,22 to name a few.

Network Effects

The third disinformation first principle is the deliberate repetition of chosen narratives through multiple, seemingly distinct conduits to increase traction in a given target audience. This is best encapsulated in the 1980s strategic disinformation campaign, Operation Infektion.

Infamous for its sheer magnitude and ultimate success, Infektion sought to blame the burgeoning HIV/AIDS epidemic on the United States, claiming the virus had been fabricated through a U.S. biological weapons program.23 The campaign’s genesis was a 1983 editorial titled “AIDS May Invade India: Mystery Disease Caused by U.S. Lab Experiments” in the Indian newspaper Patriot (established two decades earlier by the Soviet Union for the purpose of disseminating pro-Soviet narratives).24 Of note was the article’s composition of roughly “20 percent forgery and 80 percent fact,”25 a technique highly emblematic of Soviet active measures at the time and a clear instance of truth seeding.

Over the years, the KGB instructed subordinate intelligence agencies across the Soviet Union and its allies to perpetuate the claims made in the original editorial, and it added seemingly disparate narratives to ensure the story’s continued growth. For example, the Patriot article claimed that the epidemic had been engineered to eradicate African-Americans and homosexuals.26 Indeed, a key characteristic of network effects in general and Infektion in particular was the reliance on seemingly unrelated media outlets to disseminate a story, which would then be relayed through Soviet channels.27 This was especially effective when attempting to access non-Russian-speaking audiences.28 By the early 1990s, the AIDS story had been covered in 80 countries and 25 languages, and an estimated 15 per cent of the American population believed that the U.S. government had indeed created and disseminated HIV/AIDS.29



Tradecraft Evolution

A Decline in Intelligence Operations

While the above three principles are a constant presence in Russian disinformation activities from the 1920s onwards, their effectiveness and prevalence vary over time. With the Soviet Union’s collapse, for example, the Russian intelligence apparatus fell into disarray for a decade.30 The whirlwind of nascent frontier capitalism, infused with large doses of gangsterism, was an obvious driver of that disarray, but more fundamentally the security infrastructure, including the eastern intelligence outposts that had once been tools in the Soviet Union’s arsenal, simply no longer existed.31 Even core capabilities such as active measures were lost after the Soviet Union’s dissolution.32

Concrete examples of this decline appear at the close of the last century and in the early aughts.33 In the late-1990s wars in the former Yugoslavia, Russia was largely absent, despite those conflicts being the largest Western intervention in Russia’s traditional sphere of influence since the Berlin airlift.34 The absence of meaningful disinformation that characterized the Balkan wars carried over into the First and, to a lesser extent, the Second Chechen Wars, during which Russia had a limited ability to define the narratives.35

The 2000 sinking of the Kursk, one of the largest and most sophisticated nuclear-powered cruise-missile submarines in the Russian navy, with the loss of all 118 sailors aboard, was perhaps the nadir of Russian disinformation tradecraft.36 For months, the story lingered in both the domestic and Western press as an example of unmitigated resource mismanagement, bureaucratic ineptitude and Russian leadership’s insensitivity to the plight of everyday citizens.37 The intelligence apparatus produced remarkably few counter-narratives that gained even the slightest foothold.

Retooling in Georgia

The 2008 Russo-Georgian war served as a metaphorical retooling of Russian information operations, building heavily upon past engagements and expanding in particular the use of disinformation.38 Russia’s actions in the war constituted the “first use of cyberwarfare and information operations in conjunction with a conventional military operation.”39 As in Chechnya, this disinformation sought to portray Russia, domestically and abroad, as the victim rather than the instigator of the conflict – an approach later replicated in the annexation of Crimea and the invasion of Eastern Ukraine.40

Russia’s efforts in 2008 met with a higher degree of success than previous disinformation campaigns surrounding its military engagements, largely due to greater co-operation with Russian media.41 During the initial stages of the Georgian conflict, for instance, Russian media accounts of alleged civilian casualties at the hands of Georgian forces in South Ossetia were promulgated by Western media, although they were often retracted once their deceptive nature was discovered.42 Still, Russian disinformation in Georgia was heavily exaggerated, rather easily discredited and failed to account for the counterforces of Georgian and Western media; it clearly left room for improvement.43

Craft Deployment in Ukraine: Bringing the Three First Principles Together

The ubiquity of the term “disinformation” over the past decade is largely a result of the Russian annexation of Crimea in 2014, which instigated a wave of subsequent information operations.44 Disinformation in Ukraine post-2014 has centred on recurrent themes similar to those perpetuated in other countries in Russia’s near-abroad, including narratives that tout the West’s moral ineptitude compared to Russia’s, the illegitimacy of the Ukrainian nation and the concept of pan-Slavism.45

Russia’s disinformation campaigns targeting Ukraine are the culmination of its hybrid operations across a series of prior military engagements in the region, including the Chechen wars and the Russo-Georgian war.46 Russia has honed the first principles of disinformation: we see truth seeding in the 2021 story reporting the death of a child in Donbas from a Ukrainian drone strike,47 later determined to have been a purposeful distortion of the child’s death.48 We see network effects in the aptly named Operation Secondary Infektion, a campaign begun in 2014 to discredit Ukrainian national government activities through online fabrications later picked up and promulgated by Russia-friendly media sources.49 50 We also see chaotic ambiguity in the 2014 downing of Malaysia Airlines Flight 17 over Eastern Ukraine, where Russian disinformation has sought to deflect culpability for the crash.51 In this latter instance, multiple contradictory narratives emerged from pro-Kremlin media sources: that the aircraft was shot down by a Ukrainian Su-25 fighter jet, or later by a Ukrainian Buk missile.52

Further cementing the maturation of its disinformation capabilities, over the past five years Russia has begun rewriting its role, or lack thereof, in earlier conflicts via a concerted effort (which President Vladimir Putin labels a “soft power push”) to improve the perception of the Russian military at home and abroad.53 Russian films such as “Balkan Line” reframe Russia as a leader for peace against an antagonistic West during the civil wars in the former Yugoslavia; others, such as “Abyss” and “Tourist,” recast Russian activity during the Chornobyl disaster and in the Central African Republic, respectively.54 The Kursk tragedy has received similar treatment: a 2021 article by Russian state-owned news outlet RIA Novosti claimed the sinking was caused by a collision with a NATO vessel.55



Defining an Analytical Framework

To tackle a Russian disinformation machine operating at scale, Western security analysts56 require a framework that allows rapid, consistent identification of disinformation activity when and where it occurs.

Today’s disinformation campaigns are generated, supported and advanced online, in particular with and by social media platforms. These platforms are almost always assets of for-profit corporations and exist to generate revenue, which they accomplish largely by continuously increasing user traffic and content.57 Increasing traffic and content involves providing evolving functionality and maintaining a low barrier for content creation and dissemination.58 Any framework or model must then co-exist, operate and succeed in an environment by nature hostile to its mission: online arenas with ever-increasing reams of data in ever-morphing forms.

An early hurdle that a model targeting these platforms must overcome is the creation of repositories able to handle the large amounts of data the platforms generate. The repositories must be able to ingest unstructured data: characters, text, images, video, sound and even layered combinations of these data types. Herein lies a critical challenge: while several storage technologies are up to the task, sophisticated toolsets and human talent are required to manipulate and tease value out of unstructured data. Point-and-click, layperson user interfaces for sifting and sorting large amounts of unstructured data do not readily exist. Moreover, the human talent required to build bespoke solutions generally takes the form of data scientists with advanced degrees and years of practical experience, neither of which aligns with the defence community’s analyst recruitment and retention paradigm.
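
To illustrate the idea of a uniform ingestion envelope, a minimal Python sketch follows. The field names and payload keys are assumptions for illustration only; they do not correspond to any real platform’s API schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class SocialMediaRecord:
    """Uniform envelope for heterogeneous platform content. Whatever an
    item's native type, it is wrapped with the same minimal fields so that
    downstream filtering and querying can treat the corpus uniformly."""
    platform: str                    # e.g., "twitter"
    item_id: str                     # platform-native identifier
    created_at: datetime             # routine quantitative attribute
    content_type: str                # "text" | "image" | "video" | "audio"
    text: Optional[str] = None       # extracted or transcribed text, if any
    media_uri: Optional[str] = None  # pointer to raw binary in object storage
    raw: dict = field(default_factory=dict)  # original payload, untouched

def ingest(payload: dict, platform: str) -> SocialMediaRecord:
    """Normalize one raw payload into the uniform envelope. The payload
    keys used here are assumptions; a real ingester would map each
    platform's actual API schema."""
    return SocialMediaRecord(
        platform=platform,
        item_id=str(payload.get("id", "")),
        created_at=datetime.fromisoformat(payload["created_at"]),
        content_type=payload.get("type", "text"),
        text=payload.get("text"),
        media_uri=payload.get("media_url"),
        raw=payload,
    )
```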

A more practical approach is to identify domain-specific repositories that have already bounded social media platforms’ data and have made those data accessible using standard programming languages that technicians can operate. Here we should look to the private sector, namely the marketing industry, for data warehouses and toolsets tailor-made to exploit data from the world’s largest social media platforms. (Indeed, sending security analysts into eight-week job-shadowing regimes at Liberty Village boutique advertising agencies to learn social media analytics would provide tremendously more value than the standard IT security bootcamps). 

The repositories must also assign qualitative attributes to data as they ingest them from social media platforms. This is crucial: it would be hard to overstate how much the defence community relies on primitive quantitative attributes to identify and evaluate the success of disinformation campaigns. The default approach for nearly two decades has been to trace suspected disinformation through user connections on social media (Jim is connected to Bob; Bob is connected to Frank, etc.) and to count the number of likes a given social media post receives as evidence of its effectiveness. These methodologies have been outdated and effectively countered for nearly as long as they have been in use; they are the intelligence equivalent of evaluating a professional baseball team’s on-field performance by reading its members’ social media posts or counting the likes its star pitcher receives when he posts. Such measures are tangential at best and generally irrelevant to the underlying question. Indeed, in keeping with the principle of chaotic ambiguity, the promulgators of disinformation undoubtedly use connections and likes to obfuscate and confuse.

Here again, the private sector has made strides over the past decade. Using complex algorithms built from a variety of factors (velocity, or frequency of posts; mood, or the weighting of descriptive text; and lifestyle, or metadata consistency, to name a few), marketing data warehouses now analyze and assign sentimentality scores to data pulled from social media platforms. Developed at substantial cost and refined through years of trial and error, these qualitative attributes have been moulded into something approximating industry standards that are now used to evaluate customer behaviour. The use of such qualitative attributes would be an order-of-magnitude improvement over how security analysts presently understand social media activity, and would moreover serve as the linchpin for a proposed disinformation identification framework.
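
As an illustration of the idea (not of any vendor’s actual algorithm), a toy Python sketch blending the three factors named above into a single score follows; the lexicon, weights and window are invented for demonstration.

```python
from datetime import datetime, timedelta
from typing import Dict, List, Set

# Toy lexicon for the "mood" factor; a real system would use a trained
# sentiment model rather than a hand-built word list.
MOOD_LEXICON = {"outrage": -0.8, "hoax": -0.6, "truth": 0.2, "proof": 0.3}

def velocity(post_times: List[datetime], window_hours: int = 24) -> float:
    """'Velocity' factor: posting frequency over the most recent window,
    capped at 1.0 so the blended score stays bounded."""
    if not post_times:
        return 0.0
    cutoff = max(post_times) - timedelta(hours=window_hours)
    recent = [t for t in post_times if t >= cutoff]
    return min(1.0, len(recent) / window_hours)

def mood(text: str) -> float:
    """'Mood' factor: average lexicon weight of the words present."""
    hits = [MOOD_LEXICON[w] for w in text.lower().split() if w in MOOD_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def lifestyle(profile: Dict[str, str], expected: Set[str]) -> float:
    """'Lifestyle' factor: share of expected profile metadata actually
    present, as a crude proxy for metadata consistency."""
    if not expected:
        return 0.0
    return sum(1 for f in expected if profile.get(f)) / len(expected)

def sentimentality_score(post_times: List[datetime], text: str,
                         profile: Dict[str, str], expected: Set[str]) -> float:
    """Blend the three factors into one score. The weights are invented
    for illustration, not derived from any industry standard."""
    return (0.4 * mood(text)
            + 0.3 * velocity(post_times)
            + 0.3 * lifestyle(profile, expected))
```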

Summarizing the above, security analysts will achieve success in identifying the presence of Russian disinformation online and tracing its origins by:

  • Using repositories (warehouses, data lakes, etc.) to pull all of a social media platform’s data, regardless of type (text, image, video, etc.), into one infrastructure. Rather than create these repositories, which is cost- and talent-prohibitive, and noting that they are largely not available commercially, the defence community should look to the private sector for existing bespoke infrastructure that can be re-purposed;
  • Formatting those repositories so that data, as they are ingested, are standardized and tagged with markers for both routine quantitative characteristics (e.g., date/time stamps) and human-driven qualitative characteristics (e.g., sentimentality scores). Separately, neither provides the complete picture necessary to identify disinformation; together, fewer orchestrated campaigns would go undetected;
  • Ensuring that tools to query the data are available to security analysts through point-and-click user interfaces, and that they allow for analysis across three views (sketched in code after this list):
    1. Truth seeding: queries anchor on co-occurrence formulas (if A then B, where A is the kernel of truth and B the suspected disinformation);
    2. Chaotic ambiguity: queries anchor on time-series formulas (A’s prevalence over X time demonstrates consistency of disinformation); and
    3. Network effects: queries anchor on temporal formulas (A’s sentimentality scores leading to a surge in popularity, e.g., retweets or forwards).
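
A minimal Python sketch of the three views follows, assuming a repository whose contents have been flattened into tabular form with (at least) the columns “text,” “created_at” (datetime), “sentimentality” (assigned at ingestion) and “retweets.” The column names and literal keyword matching are simplifying assumptions, not a production design.

```python
import pandas as pd

def truth_seeding_view(df: pd.DataFrame, kernel: str, claim: str) -> pd.DataFrame:
    """Co-occurrence view: posts where the verified kernel of truth (A)
    appears alongside the suspected fabrication (B)."""
    has_a = df["text"].str.contains(kernel, case=False, regex=False, na=False)
    has_b = df["text"].str.contains(claim, case=False, regex=False, na=False)
    return df[has_a & has_b]

def chaotic_ambiguity_view(df: pd.DataFrame, narratives: list,
                           freq: str = "W") -> pd.DataFrame:
    """Time-series view: prevalence of each competing narrative per period;
    several contradictory narratives persisting in parallel is the signature."""
    counts = {}
    for n in narratives:
        hits = df[df["text"].str.contains(n, case=False, regex=False, na=False)]
        counts[n] = hits.resample(freq, on="created_at").size()
    return pd.DataFrame(counts).fillna(0)

def network_effects_view(df: pd.DataFrame, narrative: str,
                         freq: str = "D") -> pd.DataFrame:
    """Temporal view: per-period sentimentality and amplification for one
    narrative; a sentiment push that precedes a surge in retweets is the
    pattern of interest."""
    hits = df[df["text"].str.contains(narrative, case=False, regex=False, na=False)]
    grouped = hits.resample(freq, on="created_at")
    return pd.DataFrame({"sentimentality": grouped["sentimentality"].mean(),
                         "retweets": grouped["retweets"].sum()})
```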



Next Steps: Testing and Refining the Model

Application of this model requires access to data warehouses that, by design, are not available in the defence community ecosystem; they reside in the private sector and are often highly proprietary. A few open-source toolsets exist, mostly within academia. TweetsKB, a corpus of over two billion tweets, is perhaps the largest and most capable.59 It includes metadata for tweets dating back to February 2013, each tagged with both quantitative and qualitative characteristics, although accessing and querying it requires specific programming expertise (RDF/S and SPARQL).
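
As an illustration, a hedged sketch of what such a query might look like using the SPARQLWrapper Python library. The endpoint is a placeholder (TweetsKB is distributed as RDF dumps that would be loaded into a local triple store), and the predicate names are assumptions based on the vocabularies the TweetsKB authors describe (SIOC for posts, Dublin Core for dates); both should be verified against the published schema before use.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint for a locally hosted copy of the corpus.
ENDPOINT = "http://localhost:8890/sparql"

# Pull tweets from the six months preceding the August 9, 2020 election.
QUERY = """
PREFIX sioc: <http://rdfs.org/sioc/ns#>
PREFIX dc:   <http://purl.org/dc/terms/>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>

SELECT ?tweet ?date
WHERE {
  ?tweet a sioc:Post ;
         dc:created ?date .
  FILTER (?date >= "2020-02-09T00:00:00Z"^^xsd:dateTime &&
          ?date <  "2020-08-09T00:00:00Z"^^xsd:dateTime)
}
LIMIT 100
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

for row in results["results"]["bindings"]:
    print(row["tweet"]["value"], row["date"]["value"])
```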

Using TweetsKB or a related repository, the model’s effectiveness could be tested and refined against any number of known historical Russian disinformation campaigns. One particularly good case study, given its relatively short timeline and the high number of disinformation events it included, is the 2020 Belarusian presidential election. Accompanying the election was a wave of protests and police brutality as President Alexander Lukashenko maintained his grip on power, supported by vote-rigging and disinformation.60 As part of the pre-election Russian disinformation operations, opposition leader Sviatlana Tsikhanouskaya was rumoured online to have been planted by European feminists who “aim at destroying traditional Belarusian values.”61 SPARQL queries seeking instances of truth seeding, chaotic ambiguity and network effects in tweets from the six months preceding the election would be expected to paint a clear picture of Russian disinformation at work, in that instance or in any of the other specific disinformation plays executed during the campaign.
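
To make the back-test concrete, a minimal usage sketch follows, reusing the hypothetical view functions from the framework section. The file name and narrative strings are illustrative assumptions, not a validated keyword set, and the tweets are assumed to have already been extracted and flattened into the tabular layout those functions expect.

```python
import pandas as pd

# Hypothetical extract: tweets for the period already pulled (e.g., via the
# SPARQL query above) and flattened into columnar form.
tweets = pd.read_parquet("belarus_2020_tweets.parquet")

# Six months preceding the August 9, 2020 election.
window = tweets[(tweets["created_at"] >= "2020-02-09") &
                (tweets["created_at"] < "2020-08-09")]

# Truth seeding: the real candidate (kernel) tied to the fabricated claim.
seeded = truth_seeding_view(window, kernel="Tsikhanouskaya",
                            claim="European feminists")

# Chaotic ambiguity: competing narrative strings tracked week by week.
trend = chaotic_ambiguity_view(window, ["planted by feminists",
                                        "destroying traditional values"])

# Network effects: sentiment leading amplification around the candidate.
surge = network_effects_view(window, "Tsikhanouskaya")
```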



Conclusion

Russian online disinformation operations have been operating at scale for nearly a decade, yet Western security analysts lack a cohesive framework for identifying and deterring them in a timely fashion. This paper has proposed such a framework, built around three dominant themes teased from over a century of Russian disinformation activity (truth seeding, network effects and chaotic ambiguity), and structured to target large social media platforms using advanced technologies.



End Notes

1 A. Polyakova, M. Boulègue, K. Zarembo, S. Solodkyy, K. Stoicescu, P. Chatterje-Doody and O. Jonsson, “The Evolution of Russian Hybrid Warfare,” CEPA, March 28, 2021, https://cepa.org/the-evolution-of-russian-hybrid-warfare-introduction/. Accessed January 1, 2022.

2 J. S. Nye, “Deterrence and Dissuasion in Cyberspace,” International Security, vol. 41, no. 3, 2017: 44–71.

3 P. Fafalios, V. Iosifidis, E. Ntoutsi, and S. Dietze, “TweetsKB: A Public and Large-scale RDF Corpus of Annotated Tweets,” The Semantic Web, 2018: 177–190.

4 T. Rid, Active Measures: The Secret History of Disinformation and Political Warfare, (New York: Picador, 2021).

5 J. Ryan, Lenin’s Terror: The Ideological Origins of Early Soviet State Violence, (London: Routledge, Taylor & Francis Group, 2014).

6 Ibid.

7 I. M. Pacepa and R. Rychlak, Disinformation: Former Spy Chief Reveals Secret Strategy for Undermining Freedom, Attacking Religion and Promoting Terrorism, (Chicago: WND Books, 2013).

8 C. M. Andrew and V. Mitrokhin, The Mitrokhin Archive: The KGB in Europe and the West, (New York: Penguin Books, 2018).

9 Rid.

10 Ibid.

11 M. Fisher, J. Woodrow Cox and P. Hermann, “Pizzagate: From Rumor, to Hashtag, to Gunfire in DC,” Washington Post, December 6, 2016.

12 Ibid.

13 C. Kang and S. Frenkel, “‘PizzaGate’ Conspiracy Theory Thrives Anew in TikTok Era,” New York Times, June 27, 2020.

14 F. Splidsboel Hansen, “Russian Hybrid Warfare: A Study of Disinformation,” DIIS Report, no. 06, 2017.

15 Ibid.

16 C. Bohlen, “Moscow Uniformly Blames U.S. for Downing of Korean Airliner,” Washington Post, September 1, 1984, https://www.washingtonpost.com/archive/politics/1984/09/01/moscow-uniformly-blames-us-for-downing-of-korean-airliner/7783793c-aa6e-48f6-b29c-c68854f686bd/. Accessed January 15, 2022.

17 M. Gordon, “Ex-Soviet Pilot Still Insists KAL 007 Was Spying,” New York Times, December 6, 1996.

18 Editorial, “The Guardian View on Russian Propaganda: The Truth is Out There,” The Guardian, March 2, 2015, https://www.theguardian.com/commentisfree/2015/mar/02/guardian-view-russian-propaganda-truth-out-there. Accessed January 10, 2022.

19 EUvsDisinfo, “Tracing Five Years of Pro-Kremlin Disinformation about MH17,” July 18, 2019, https://euvsdisinfo.eu/tracing-five-years-of-pro-kremlin-disinformation-about-mh17/. Accessed January 5, 2022.

20 L. Harding, A Very Expensive Poison: The Definitive Story of the Murder of Litvinenko and Russia’s War with the West, (London, UK: Guardian Faber Publishing, 2016).

21 EUvsDisinfo, “Behind the Smokescreen: Who Are the Actors Spreading ...,” March 22, 2018, https://euvsdisinfo.eu/behind-the-smokescreen-who-are-the-actors-spreading-disinformation-on-ex-spy-poisoning/. Accessed January 1, 2022.

22 Ukraine Crisis Media Center, “Deny and Distort: Disinformation in Navalny Poisoning Case,” September 24, 2020, https://uacrisis.org/en/deny-and-distort-disinformation-in-navalny-poisoning-case. Accessed January 11, 2022.

23 T. Boghardt, “Operation Infektion: Soviet Bloc Intelligence and Its AIDS Disinformation Campaign,” Studies in Intelligence, vol. 53, no. 4, 2009: 1–24.

24 United States Department of State, “Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986-87,” vol. 9627, 1987.

25 Rid.

26 Ibid.

27 N. Aleksejeva, L. Andriukaitis, L. Bandeira, D. Barojan, G. Brookie, E. Buziashvili, A. Carvin, K. Karan, B. Nimmo, I. Robertson, M. Sheldon, “Operation Secondary Infektion,” Atlantic Council, September 9, 2019, https://www.atlanticcouncil.org/in-depth-research-reports/report/operation-secondary-infektion/. Accessed January 2, 2022.

28 Ibid.

29 Boghardt.

30 Rid.

31 E. Lange-Ionatamisvili, “Analysis of Russia’s Information Campaign against Ukraine,” Riga: NATO Strategic Communications Center of Excellence, 2015.

32 Rid.

33 Lange-Ionatamisvili.

34 Ibid.

35 M. van Herpen, Putin’s Wars: The Rise of Russia’s New Imperialism, (Lanham, MD: Rowman & Littlefield, 2015).

36 Lange-Ionatamisvili.

37 Z. Barany, “The Tragedy of the Kursk: Crisis Management in Putin’s Russia,” Government and Opposition, vol. 39, no. 3, 2004: 476–503, http://www.jstor.org/stable/44483081.

38 P. B. Rich, Crisis in the Caucasus: Russia, Georgia and the West, (London, UK: Routledge, 2012).

39 A. Cohen and R. E. Hamilton, “The Russian Military and the Georgia War: Lessons and Implications,” Monographs, Books, and Publications, no. 576, 2011.

40 Lange-Ionatamisvili.

41 Rich.

42 Cohen and Hamilton.

43 Ibid.

44 European Parliament, “At a Glance: Understanding Propaganda and Disinformation,” November 2015. Retrieved January 3, 2022, from https://www.europarl.europa.eu/RegData/etudes/ATAG/2015/571332/EPRS_ATA(2015)571332_EN.pdf.

45 H. Conley, J. Mina, R. Stefanov and M. Vladimirov, The Kremlin Playbook: Understanding Russian Influence in Central and Eastern Europe, (Lanham, MD: Rowman & Littlefield, 2016).

46 Lange-Ionatamisvili.

47 O. Churanova, “Fake: A Child Died in Donbas as a Result of a Ukrainian Drone Attack (update),” StopFake, April 8, 2021, https://www.stopfake.org/en/fake-a-child-died-in-donbas-as-a-result-of-a-ukrainian-drone-attack/. Accessed January 20, 2022.

48 Ibid.

49 B. Nimmo, C. François, C. S. Eib, L. Ronzaud, R. Ferreira, C. Hernon and T. Kostelancik, “Secondary Infektion,” Graphika, 2020.

50 N. Aleksejeva, L. Andriukaitis et al.

51 K. Giles, “The Next Phase of Russian Information Warfare,” Riga: NATO Strategic Communications Center for Excellence, vol. 20, 2016.

52 EUvsDisinfo, “Tracing Five Years …”

53 A. Crosby and Y. Petrovskaya, “Russian Film Draws ‘Balkan Line’ in Kremlin’s Effort to Shore Up Support,” Radio Free Europe/Radio Liberty, August 3, 2018, https://www.rferl.org/a/russian-film-balkan-line-pristina-airport-serbia-support/29410441.html. Accessed January 18, 2022.

54 P. Sauer, “New Movie Depicting Heroic Russian Instructors in Central African Republic Linked to ‘Putin’s Chef’,” Moscow Times, May 21, 2021, https://www.themoscowtimes.com/2021/05/21/new-movie-depicting-heroic-russian-instructors-in-central-african-republic-linked-to-putins-chef-a73973. Accessed January 4, 2022.

55 EUvsDisinfo, “Kursk Submarine Disaster Was Caused by Collision with NATO ...,” 2021, https://euvsdisinfo.eu/report/kursk-submarine-disaster-was-caused-by-collision-with-nato-vessel. Accessed January 1, 2022.

56 We use “security analysts” loosely, referring to the gamut of military intelligence operators, civilian defence analysts and, to a lesser extent, private-sector security practitioners.

57 S. Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, (New York: PublicAffairs, 2018).

58 Ibid.

59 Fafalios, Iosifidis et al.

60 Atlantic Council, “The Infowar Behind the Belarus Revolution,” October 29, 2021, https://www.atlanticcouncil.org/blogs/belarusalert/the-infowar-behind-the-belarus-revolution/. Accessed January 20, 2022.

61 EUvsDisinfo, “European Feminists Aim at Destroying Traditional Belarusian Values through Sviatlana Tsikhanouskaya,” July 21, 2020, https://euvsdisinfo.eu/report/european-feminists-aim-at-destroying-traditional-belarusian-values-through-sviatlana-tsikhanouskaya/. Accessed January 6, 2022.



About the Authors

Tom Robertson is the CEO of Continental Currency Exchange, Canada’s largest FX retailer. His views on the intersection of technology and great power competition have appeared in The National Interest, CBC News, EuroNews, First Monday, and elsewhere.

 

Teah Pelechaty is the Opinion Editor at the Kyiv Independent. She is completing a master’s degree in Global Affairs with a specialization in Global Security and Digital Governance at the University of Toronto and Sciences Po. She was previously a Junior Policy Analyst in national security with the Government of Canada and a Research Associate with the European Values Center for Security Policy. 



