Canadian Army Podcast

Bombs and bullets are not the only threats to our soldiers. Online disinformation is also being used as a weapon, and it’s one they face both on deployment and at home. Dr. Meghan Fitzpatrick and Dr. Dominique Laferrière are social scientists at Defence Research and Development Canada and have done extensive research on the topic. They explain the disinformation ecosystem that affects the military community and give us advice on how to separate fact from fiction.

Here are some online tools that can help you counter disinformation:
https://hapgood.us/2019/06/19/sift-the-four-moves/
https://www.snopes.com
https://euvsdisinfo.eu
https://info-radical.org/en/
https://www.getbadnews.com/books/english/
https://harmonysquare.game/en

Feel free to contact Captain Adam Orton with any comments or questions:
armyconnect-connectionarmee@forces.gc.ca

Meet our host Captain Adam Orton: Bio | Video

Connect with the Canadian Army on social media:
Facebook | Twitter | Instagram | YouTube

Visit Forces.ca if you are considering a career in the Army.

Copyright Information

© His Majesty the King in Right of Canada, as represented by the Minister of National Defence, 2023

What is the Canadian Army Podcast?

This podcast is for and about soldiers of the Canadian Army.

Its primary goal is to provide them with useful information through thoughtful and open discussions that reflect their mutual interests and concerns.

Though soldiers are our primary audience, the topics covered on this podcast should be relevant to anyone who supports our soldiers or who has an interest in Canadian military matters.

[Fast paced music plays]

Dr. Meghan Fitzpatrick: You wouldn’t send a soldier out without basic information about their weapons and how to use them. And we should be doing the same for the information space.

Captain Adam Orton: Hi! I’m Captain Adam Orton with the Canadian Army Podcast. For thousands of years, armies have fought toe-to-toe on the battlefield. But it’s easy to forget that information warfare is a part of that fight, and it’s been happening since the beginning of time. Now, with the advent of the Internet, geography plays less of a role than ever in separating soldiers from their enemies in conflict. Here with us to talk about disinformation is Dr. Meghan Fitzpatrick.

Dr. Meghan Fitzpatrick: Hi, Adam.

Capt Orton: And Dr. Dominique Laferrière.

Dr. Dominique Laferrière: Hi, Adam.

Capt Orton: Who are social scientists who work with the Canadian Armed Forces and have done quite a bit of research on the topic. All right, let’s get into it. So we weren’t talking about disinformation as much in the public sphere, like, ten to fifteen years ago. But we’re definitely talking about it now. Dr. Fitzpatrick, why is that?

Dr. Fitzpatrick: I think that’s because the way that we live our lives now is almost fully online—or at least a great portion of our lives is online. And that’s a space that is particularly prone to disinformation.

Capt Orton: Why? Why is it prone to disinformation?

Dr. Fitzpatrick: It’s a space that doesn’t have very many gatekeepers. In the past, our news was really filtered in a way, whether that was by a newspaper or a formal newscast. Whereas online, everyone can be both a consumer and a producer of information. So it’s a much busier information environment than what we’re used to. And we can get information immediately. In the past, you’d have to wait for information to travel to you, whether that was by radio, newspaper, whatever it might be. And because of that, there was a delay. Now, there’s no delay. Dominique, do you have anything to add to that?

Dr. Laferrière: The one thing I would add, and I’m sure we’ll get back to it later, is that people are not necessarily well equipped to handle that much information, which ultimately allows the information to spread more rapidly and to be shared, you know, within networks of trusted individuals, within networks of friends and family. So that facilitates the spread of disinformation even more.

Capt Orton: By people being not well equipped, do you mean like humans just constantly being bombarded with unlimited amounts of information on a daily basis?

Dr. Laferrière: Yeah, that’s pretty much what I mean. And people being bombarded with information and not necessarily being able to disentangle highly credible and reliable information from information that’s not credible or reliable.

Capt Orton: So, most people—probably in their minds, at least—have a general idea of what they think disinformation is. But when you think of disinformation, how is that different from what the average person thinks it is?

Dr. Fitzpatrick: I think the way that we’ve defined it is very simple. It’s a deception technique that uses false information to deceive and to mislead. So it’s spreading false information on purpose, for a purpose. And I think the main thing people usually get mixed up is the difference between mis- and disinformation. Misinformation is not spread on purpose. People are spreading false information without actually knowing that it’s false, whereas those who are creating and then spreading disinformation are doing it for a purpose.

Capt Orton: Looking at this space, who do you see as the common offenders in terms of spreading intentional disinformation?

Dr. Fitzpatrick: Well, this is a pretty busy space. I think some of the bigger offenders are state actors. And Russia is a well-known example of a country that spreads disinformation. One example that listeners may be familiar with is the 2016 U.S. presidential election and the Russian disinformation campaign that was run during it. And they spread disinformation really to polarize both sides of the political electorate. It wasn’t so much about supporting a particular candidate or achieving a particular aim. It was more about dividing the entire electorate by spreading divisive and false content online.

Capt Orton: Dr. Laferrière, how is this a threat to Canadian soldiers or military members right now?

Dr. Laferrière: I think the best way to answer your question is to start with some examples of disinformation campaigns that have either portrayed the CAF in a false way or targeted the CAF as an audience for that disinformation. And I know, Meghan, you’ve worked on this quite a bit, because we’ve worked on it together.

Dr. Fitzpatrick: Yeah, absolutely. As many people know, Canada is a member of NATO’s Enhanced Forward Presence. And starting in 2017, when the task force arrived in Latvia, a major Russian news program falsely claimed that the commander of the task force was a convicted murderer. And they aired photos of an individual in women’s underwear, which they used to characterize the task force as, quote, the “gay battle group,” to play on anti-LGBTQ sentiments in the region. And since then, there have been a lot of stories that play up false narratives about the CAF soldiers who are there. For instance, that they are somehow living in luxury accommodation paid for by Latvian taxpayers. I’m sure they are not; in fact, I would imagine their accommodations occasionally leave something to be desired.

Capt Orton: My military experience suggests that it was not luxurious.

Dr. Fitzpatrick: Right, exactly. And there are comparable examples. British troops in neighbouring areas have been accused of desecrating national memorials and openly defying police. And all of the battle groups have been accused of spreading COVID-19 during military exercises. So they have certainly been a subject for disinformation. And that’s often to undermine the reputation of the targeted force, as well as to play up any possible tensions between them and the local population. But of course, disinformation is not just something that happens when you’re on deployment. It also happens at home. And Dominique can certainly speak to a study that looked at exactly this.

Dr. Laferrière: Yeah. So in 2017, there was a study conducted by Vietnam Veterans of America. And what they discovered is that a lot of fake Facebook pages had been created to entice the veteran community and bring everyone together online. They were basically taking existing images and information shared by real profiles and repurposing them. At first, they would lure people into following their pages with very positive messaging. But then, ultimately, they started sharing divisive information. And the goal of this was to create tension between the veteran and military communities and the civilian population. So Meghan’s example shows how military members can be the target of a disinformation campaign in the sense that they’re being falsely portrayed, whereas in this case, military members and veterans are the target audience of a disinformation campaign meant to sow discord. And I guess these two examples suggest the kinds of threats we face: threats to operational security and operational effectiveness, like the example that Meghan gave, and, in the example that I gave, potentially decreasing trust among military and veteran populations and sowing discord within the population, and between military and civilian populations as well.

Dr. Fitzpatrick: Yeah, and I would just add that we have to remember that the military is going to be an attractive target for adversaries. Western armed forces are increasingly diverse, and that is a strength. But it does mean that adversaries are going to look for weaknesses, vulnerabilities and points of contention between different groups. And they will try to exploit them, whether you’re on deployment or at home.

Dr. Laferrière: Going back to the issue of threats, I think one that we have to consider is the threat to national security and public safety. One of the things we have to remember is that CAF members have access to weapons, they are trained, and they have access to classified information. So, if people domestically fall prey to disinformation, and even worse, you start getting into conspiracy theory territory, you can get into some more troublesome or nefarious acts. You can think about the January 6th storming of the U.S. Capitol, which is an example where military members who adhered to disinformation and conspiracy theories were involved, with quite grave consequences.

There are also Canadian examples. So for example, in 2020, there was a Canadian Ranger who drove his truck through the gates of Rideau Hall, and he was found in possession of firearms, and the court later determined that his actions were politically motivated and that he intended to intimidate the Government of Canada. And what was also found later is that this person was consuming disinformation and a certain number of conspiracy theories as well. So these are some of the more local threats that could result from disinformation and conspiracy theories.

Capt Orton: So you talk about things like decreasing trust. If I think of, you know, kinetic operations in warfare, bullets, bombs, stuff like that, when you do battle damage assessment it’s very easy to see the results. It’s very clear—a building has been destroyed, or the effect is immediately obvious. But in the information plane, maybe not so much. What has been the impact? And how do you assess that impact?

Dr. Fitzpatrick: Yeah, I would say that this is a very challenging area, and militaries around the world are trying to develop collateral effects measures or collateral effects estimation, on par with something like battle damage assessment or, I should say, collateral damage estimation. But this is really difficult to do. And I think everyone is struggling to get their arms around what is an immense problem.

Capt Orton: So if we’re struggling to measure those effects in, you know, what I would call a defensive context, why are they doing it? And how do they—by ‘they’ I guess I mean the adversaries—how would the adversaries measure it in an offensive context?

Dr. Fitzpatrick: I don’t think they necessarily have to measure it. Adversaries have a variety of reasons why they might be using disinformation. But going back to that 2016 U.S. presidential election: they didn’t need to achieve a particular effect. It was simply to sow division. So if they can increase division, success has been achieved. They don’t really need to measure it so much as achieve a generalized effect.

Capt Orton: You know, this reminds me—I’m going to throw back to the sniper podcast, where the sniper said: “I sow chaos on the battlefield.” Right?

Dr. Fitzpatrick: Exactly.

Capt Orton: And so chaos in and of itself can be an effect that, while not necessarily harmful in the moment, definitely reduces coordination and just makes things more complicated in general.

Dr. Fitzpatrick: Absolutely. And I suppose another example would be 2014, when Russia annexed Crimea. For a significant period of time, people didn’t know whether Russian forces were actually deployed in Crimea; there were rumours of “little green men.” They didn’t really need to maintain that effect for very long. It just created enough confusion that it was hard for Western countries, and for Ukraine itself, to get organized and respond effectively.

Capt Orton: And that effect could actually be measured, not in terms of the information’s impact on people, but by the delayed response to an activity?

Dr. Fitzpatrick: Right, exactly.

Capt Orton: That’s actually really interesting. I never even thought of that.

Dr. Laferrière: And maybe one element that I might add to what Meghan just said: you asked why they are doing it if we can’t measure the effects. I think one of the reasons is that the costs are extremely low. And there are techniques that can be used to generate a lot of information and disseminate it very rapidly. Think about bots and troll farms. So it’s quite easy to disseminate that information, and the costs can be quite low.

Capt Orton: Well, speaking of bots and things like that, you know, astroturfing: for those who don’t know, astroturfing is having a large number of accounts that spam material to fill up a social media page or something like that, while creating the appearance that it’s many individuals. What kinds of tactics or procedures are used to actually spread disinformation in a willful manner?

Dr. Fitzpatrick: I think there are a variety of tactics that we’ve already touched upon. You can use social media platforms to spread disinformation, and you can use bots and trolls on those platforms as well. You could also use a tactic like deep fake videos. I think many people will have encountered deep fakes: videos and audio that have been manipulated to give the appearance of something the original video or audio did not show. There are many examples out there, and it’s relatively easy and low cost, as Dominique alluded to. All you really need is a computer and some time.

Dr. Laferrière: If I can add to what Meghan just said, and going back to the examples we gave a little earlier: some of the other techniques that increase the likelihood that an online campaign will go viral include things like impersonation. We talked about how fake Facebook accounts can be created using real information and real pictures from other users, so impersonation is one of the tactics. Playing on strong emotions, especially negative emotions like anger and fear, is normally a good way to enhance the virality. Not sure virality is a word here, but let’s just try. It is now. So, basically, the likelihood that a campaign will go viral. Using negative emotions is one tactic. And also polarization, but Meghan already touched upon that.

Dr. Fitzpatrick: Yeah, playing on those hot button issues that our society has. It could be anything from race to religion to immigration. Whatever the hot button issue of the day is, they know that it is going to get a response.

Capt Orton: How do you tell the difference between a troll who’s just on the Internet for kicks, maybe just trying to get people riled up, and what we’ve mainly been talking about—coordinated state actors running an active disinformation campaign?

Dr. Fitzpatrick: I think this is exactly what makes it so confusing. For the average person, I don’t think it will be very easy to tell the difference, because the information ecosystem online is heavily populated by both. And an individual with malign intent can possibly cause as much damage as a coordinated campaign. So it can be very difficult to tell the difference between the two unless you’re actively looking for it.

Capt Orton: So, given that this is a thing that’s pretty prevalent in social media, and that we’re obviously dealing with it, what are some things you can do to protect yourselves? Or even in some cases, if you see people spreading that disinformation, how can you do something about it?

Dr. Laferrière: I think that’s a good question. And it ties back to the work that Meghan and I have been doing. If you look at the research and what practitioners are recommending, I think one of the best ways to prevent the spread of disinformation is to increase media literacy skills within populations. Media literacy, essentially, is the ability of an individual to locate credible information and to use that information for their specific purpose, whether that’s being informed about a certain news story, and so on. So, basically, being able to locate credible information. And there are specific skills or tips that can be taught to people to increase that media literacy. Meghan, I don’t know if you want to share a few of those tips.

Dr. Fitzpatrick: I think a good place to start is with a little model developed by the researcher Mike Caulfield that’s called SIFT. It stands for: Stop; Investigate the source; Find better coverage; and Trace claims back to their original context. It’s essentially outlining the steps that somebody would need to take to be able to find credible information online. It’s an incredibly useful way to think of this process, because sometimes the process of finding credible information can be just as overwhelming as the information itself.

Capt Orton: That’s right.

Dr. Fitzpatrick: There are also a lot of resources out there that have been developed by academics, civil agencies and civic groups. Something like the Stanford History Education Group doesn’t sound very flashy, but it has some great information about understanding media literacy and equipping yourself with the necessary skills, and it’s freely available to anyone online.

Capt Orton: I would say if it doesn’t sound flashy, it’s probably because it’s serious.

Dr. Fitzpatrick: Yes, it is very serious. Another thing that researchers do is lateral reading: not staying on a website to figure out if it’s legitimate. I think we’ve all seen a lot of flashy websites that look pretty professional and might give you the impression that they belong to a professional institution. But when you do a little digging, that is not in fact the case. So don’t just stay on the web page to figure out if it’s legitimate.

Capt Orton: Yeah, diversify your sources.

Dr. Fitzpatrick: Exactly.

Dr. Laferrière: And there are some other tools that are quite popular these days: inoculation-based games. From a psychological perspective, inoculation is basically the idea that you can inoculate people against harmful narratives by preemptively telling them that they will be receiving some form of harmful narrative or information, and giving them the tools to protect themselves against it. Two of the games I have in mind right now are The Bad News Game and Harmony Square, and both are available online. Essentially, you are playing the bad guy. In Harmony Square, you are the chief disinformation officer of a public square, and the goal is to get false information, disinformation campaigns, to spread as much as possible on social media. So, as a player, you are on a social media platform, and your goal is to spread disinformation as much as possible. And in doing so, you are learning the tips and tactics of disinformation, some of which we’ve discussed: impersonation, playing on strong emotions, polarization and trolling. You’re basically implementing and using those strategies, and you’re getting points when your campaign is going further and attracting more people. And you’re essentially being, quote unquote, punished when you’re following journalistic standards. So it’s a little bit of a reverse mechanism. But as you’re playing the game, you’re really learning how disinformation creators and disseminators craft their campaigns and think about making them go viral. So basically, you’re learning the tactics. And the idea is that by learning them, when you encounter them online, you will be better able to determine that a piece of information might be part of a bigger disinformation campaign.

Capt Orton: Other than games, what other resources are there for fact checking things?

Dr. Laferrière: Well, there are fact checking organizations that have websites you can visit to find more information. What they do, basically, is take news pieces and assess their credibility and the extent to which the information they disseminate is true or not. And what’s really neat about a lot of these organizations is that they take you through the steps they took to determine whether the information was true. So you don’t just get a green check saying yes, this information is true, or a red X saying no, this is false; you actually see the logical steps they took to determine whether that information was true or not. Some of the best known fact checking resources are Snopes, and there’s also EU vs Disinfo, which specifically looks at Russian disinformation.

Dr. Fitzpatrick: Yeah. And just to add to what Dominique was saying, EU vs Disinfo is particularly good if you’re looking to understand what’s going on in Ukraine. It’s a great place to look for those disinformation narratives. And they break down exactly how they found the sources to get at the truth behind them.

Capt Orton: What are some of the things that we’re doing to prevent disinformation from getting into, basically, the Army, into the people who are working and trying to get things done?

Dr. Fitzpatrick: Yeah, I would say that you can’t really stop disinformation from entering the information ecosystem—it’s just going to be there. That’s how it’s going to work. But you can work on prevention. And I think about it a bit like basic training. So you wouldn’t send a soldier out without basic information about their weapons and how to use them. And we should be doing the same for the information space.

Capt Orton: So let’s say, just for simplicity’s sake, that over the past ten years you’ve definitely seen disinformation kick into overdrive as a result of the role social media plays in our lives. What do you see coming over the horizon in that information space, five to ten years down the road?

Dr. Fitzpatrick: Well, this is hard to predict. But given the increasing sophistication of artificial intelligence and machine learning, we are likely to see increasingly sophisticated disinformation techniques, perhaps disinformation that is more effectively targeted or tailored to whatever audience they’re looking to reach. And of course videos: things like the deep fake videos are probably going to become more and more convincing.

Capt Orton: You know, what you’re saying about videos and deep fakes: we’ve pretty much achieved consumer-level deep fakes already. There are image generators where you can go online, upload your picture, and boom, you’re in the middle of a movie scene or something like that. So I can’t imagine it’d be that far off that, you know, it becomes very believable. And I can see why that would be concerning.

Dr. Fitzpatrick: No, absolutely. And especially when those videos are using well known figures. Because again, we’re talking about trust and credibility. And it can be very difficult to maintain that if it's easy to manipulate imagery as well as audio.

Dr. Laferrière: That’s the thing with these technologies: they are not created for nefarious purposes, right? There are a lot of positive outcomes, and that’s their main intent. But they can definitely be used with nefarious intent.

Capt Orton: Yeah, like maybe I just want to put my face on Arnold Schwarzenegger in Commando, you know.

Dr. Laferrière: Maybe you just want to look like a cat on TikTok.

[Music starts]

Capt Orton: All right, well, if you learned anything here: don’t trust everything you read, check your primary sources, and stay on top of it. Thanks so much for coming on the podcast. Really appreciate everything.

Dr. Laferrière: You’re welcome. Thanks for having us.

Dr. Fitzpatrick: Thank you for the invitation.

Capt Orton: If you want to learn more about disinformation, take a look at some of the games and some of the resources that you can find in our show notes.

I’m Captain Adam Orton for the Canadian Army Podcast. Orton out.

[Music ends]