Technology and Security (TS)

In this episode of Technology and Security, Dr Miah Hammond-Errey speaks with Nina Jankowicz, world-leading disinformation expert and author. They discuss the necessity of an alliance among democratic nations to address technological challenges and shape policies that have real impact. They discuss the security threats of disinformation campaigns, highlighting their real-world impact on human behaviour and the role of data in targeted manipulation. They examine Russian disinformation tactics in the context of events in Ukraine, the treatment of Navalny, and broader global developments. Nina also shares examples of successful strategies used by countries like Estonia and Ukraine in countering disinformation through social policies and media literacy initiatives.
 
They explore the equivalence increasingly drawn between content moderation and censorship, and what it means for social media platforms and safeguarding democracy. The discussion touches on the impact of attention capture on our information environment. Looking ahead, they predict increased interference campaigns in 2024 and discuss the role of AI and watermarking, the necessity of improving legal frameworks for online harms, and the importance of transparency and oversight in social media to counter data and information warfare effectively. The conversation includes personal reflections from Nina and insights into the normalisation of misogyny and the alarming rise of deepfake pornography, with strategies to combat its proliferation and a potential role for public figures like Taylor Swift in mobilising survivor communities.
 
Resources mentioned in the recording:
 
·               Nina Jankowicz, How to Lose the Information War: Russia, Fake News, and the Future of Conflict (Bloomsbury 2020) https://www.ninajankowicz.com/books
·               Nina Jankowicz, How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back (Bloomsbury 2022) https://www.ninajankowicz.com/books
·               Vaclav Havel, The Power of the Powerless (October 1978)
·               Miah Hammond-Errey, Big Data, Emerging Technologies and Intelligence: National Security Disrupted (Routledge 2024) (20% discount code for book AFL04)
·               Miah Hammond-Errey, 18 December 2023, Did you Tech 2023? A wrap of the year’s tech news, with an Australian flavour, The Mandarin  
 
This podcast was recorded on the lands of the Dharawal people, and we pay our respects to their Elders past, present and emerging. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
 
Thanks to the talents of those involved. Music by Dr Paul Mac and production by Elliott Brennan. 
 
Transcript: please check against delivery 

Dr Miah Hammond-Errey: My guest today is Nina Jankowicz. Nina is a disinformation expert who advocates for those harassed online. She's the Vice President of the Centre for Information Resilience. In 2022 she served as the Executive Director of the Disinformation Governance Board in the US Department of Homeland Security. She has authored two fabulous books, How to Lose the Information War and How to Be a Woman Online. Thanks so much for joining me, Nina.
 
Nina Jankowicz: I'm really excited to be here.
 
Dr Miah Hammond-Errey: We're coming to you today from the lands of the Dharawal people. We pay our respects to their Elders past, present and emerging. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
 
Dr Miah Hammond-Errey: Nina, you have written extensively about elections and the information environment, and we've both written about the significance of 2024 for elections. 76 countries are holding elections this year, which will see more than half the world's population casting a ballot. What are you most concerned about?
 
Nina Jankowicz: I think the thing that makes me most worried is actually the sea change in how we think about content moderation and trust and safety as we head into this year. Back in 2016, 2017, this was a field that didn't really exist yet. Since then, we've seen the emergence of the field, and the emergence of a lot of cooperation between the public and private sectors to identify and mitigate disinformation, which has resulted in some pretty big gains. Right? We've seen campaigns taken down by the French government, when the Russians were trying to interfere in the 2017 French presidential election. We've seen the Iranian interference in 2020, where Iranians were pretending to be Proud Boys in the US election and targeting Democratic voters in swing states in order to try to suppress their voter turnout. All of that was a result of coordination between the public and private sectors, and it had less of an effect than it might have otherwise if we didn't have that coordination. And now what we see, especially here in the US, is this equivalence between any content moderation and censorship. As we head into this critical election period, I'm just worried that we're not going to see as much attention leading to actual change and mitigation of threats as we saw in previous election cycles. I'm also worried that we're not going to see as much robust research into this sphere, because in addition to all of that, we're also seeing harassment campaigns against individual researchers and their research institutions. And we have a large section of the population, at least here in the United States, that is equating not only keeping our democracy safe, but keeping people safe online, with censorship.
 
Dr Miah Hammond-Errey: In your recent Foreign Affairs article, you said that nearly eight years after Russian operatives attempted to interfere in the 2016 US presidential election, US democracy has become even less safe, the country's information environment more polluted, and the freedom of speech of US citizens more at risk. What do you think needs to be done about it?
 
Nina Jankowicz: I think we're in such an intractable place here in the United States right now, where we've seen these threats laid out over and over, and yet we've done so little about it. So I think the first order of business is to have these hard conversations: what is the proper relationship between government and the social media platforms? What oversight should the social media platforms have? What I would like to see is not even a regulatory regime, but a transparency and oversight regime over the social media platforms, so we understand the decisions that they're making, what they're moderating, why they're moderating it, and how much they're responding to user reports of harassment and threats and things like this.
 
Nina Jankowicz: Are they upholding their duty of care to their users? Are they allowing users a right to appeal? From there, with that information in hand, we can actually start to have the regulatory conversation that has been happening in fits and starts over the past eight years but really hasn't gotten anywhere. And frankly, it's going to continue to affect our national security, and affect the freedom of speech of millions of people, not only Americans but around the world, while we, as the home of many of the social media companies, continue to drag our feet and let them continue to self-regulate.
 
Dr Miah Hammond-Errey: Can you describe what you see as the security threats of disinformation?
 
Nina Jankowicz: Yeah. In my research, the things that have been most shocking to me are the times when our adversaries are able to actually affect real-world behaviour. So to me, it's not about the ads that the Russians bought in 2016, and it's not about bots and trolls. It is about real-world behaviour and the way that Russia — well, not just Russia, but that's my area of expertise — is able to change that.
 
Nina Jankowicz: What people seem to misunderstand, particularly about foreign disinformation, is that it's not just about silly words on the internet, or memes being targeted from the Internet Research Agency. It's about changing behaviour. It's about participation in our democracy and everybody maintaining their own free will. And so when I hear people say, oh, I don't really care if Russia's interfering in our democracy — that's chilling to me. I care if any foreign nation is trying to unduly influence what American or Australian or EU voters are thinking and doing.
 
Nina Jankowicz: There are greater national security threats as well, to the decision-making that is happening in our capitals. One clear way that's playing out right now, of course, is Ukraine aid. We have this enormous contingent of Republicans, fed in part by Russian disinformation about Ukraine, that is sitting on this aid while our Ukrainian allies are literally running out of bullets in the trenches that they're in. And if Ukraine is to fall, that opens the door not only for Russia to further pursue what it sees as its near abroad in the former communist space; it opens the door for China to do the same in Taiwan, and for many actors in the Middle East to start to instigate unilateral changes of internationally recognised borders. And that does not lead to a secure or prosperous or peaceful world.
 
Dr Miah Hammond-Errey: Do you see any technology solutions that might help us to be cognitively resilient, or improve our infrastructure, or nudge us towards making better content and dissemination choices?
 
Nina Jankowicz: Yeah. So there have been a lot of attempts at this with the disinformation of yesteryear, and I think it's worked a little less well than some people had hoped. Where I am a little more optimistic is watermarking and the types of technologies that will allow us to distinguish deepfakes and manipulated video and images from the real stuff. Over the past seven years or so, we have seen a lot of effort, actually led by industry, including companies like Adobe and its Content Authenticity Initiative, pioneering a digital watermark that would allow people to trust the images they're seeing, or understand the ways in which they've been edited, if they have been edited. I think it's probably the best solution that we have, and one of the reasons I'm less worried about hyper-photorealistic deepfakes making such an impact on our discourse. When it comes to text-based deepfakes — large language models and generative AI — we're in a little more hot water. But with images, I'm actually pretty optimistic, and I think that's a digital tool that's going to be pretty widespread in the future as deepfakes become more accessible and democratised.
 
Dr Miah Hammond-Errey: Absolutely. How can we improve what I think is a necessary investment in trust and safety functions?
 
Nina Jankowicz: Yeah. So the trust and safety folks are a bit of a unicorn, especially now, because many of them have been let go from various tech companies. But they cropped up in the post-2016 era, looking not only at things like foreign interference but also at hate speech and harassment online, child sexual abuse material, and violent content, and making sure that the things that were prohibited by community standards or terms of service were not making their way into people's feeds. At Twitter, before Elon Musk's acquisition, there were a couple of things the trust and safety team did, policy-wise, that were interesting. They were, of course, behind the release of all of the transparency reports related to foreign interference.
 
Nina Jankowicz: So Twitter, unlike any other platform, would put out a dataset — just a massive CSV file of all the foreign takedowns that they did — for researchers to play around in, like a giant sandbox. And that was great, right? We gained so much understanding of the types of activity that they were seeing online. With that, they also introduced the infamous "have you read the article?" pop-up before you retweet. They did some research on it, and it showed that the interstitial popping up before people retweeted things reduced retweets by about 20%. And that's a great way to put the brakes on disinformation. Similarly, with hate speech, they introduced a pop-up that again slowed down the uncivil behaviour on Twitter by a bit. So they were at the head of things.
 
Nina Jankowicz: At Facebook and Instagram currently, AI is making a lot of the content moderation decisions for them. So I would say trust and safety folks are the people who craft the policy; they're the ones who research and implement it, and then you've got your content moderators below that, if that helps people visualise how they exist within a company. I think they're a really critical function, because they're thinking about the user experience on the platform and actually thinking about protecting people.
 
Dr Miah Hammond-Errey: The US has a really specific challenge in the information environment, and I'm talking here about the First Amendment, of course. I'm really interested to hear your thoughts on how to manage polarisation and preserve the notion of free speech, while at the same time protecting vulnerable people, especially minorities, and reducing hate crimes.
 
Nina Jankowicz: This is something I've thought about a lot. I've dealt with a pretty significant amount of harassment myself, and what I've found is that the legal infrastructure in the United States for protection against some of the nastiest stuff you'll find on the internet, including cyberstalking and direct violent threats, does not exist. I have a cyberstalker myself, and I had to use a provision under Virginia law that basically exists for victims of domestic abuse and violence. That's how I had to bring a protective order, so that this man could not show up outside my home, or at my place of work, or at events that I do, and harass me. These laws are just not fit for purpose.
 
Nina Jankowicz: Words online do have an effect offline, particularly for women and minority communities. Often our families are brought into it; often these are direct and credible threats. In a country that unfortunately has the level of gun ownership and the level of extremism that we do right now, I think we have to rightfully take all of those threats pretty seriously. So I'm hoping that we see an expansion of those codes, particularly for things like deepfake pornography. That's an easy win, right? But there are thornier issues. If the people who said these things to me online were saying them to my face over and over, it would be a no-brainer; I would be able to get a protective order against them very easily. But just because they're behind a keyboard several states away, I'm meant to feel less safe, even though I've been doxed, even though I've been directly threatened, even though I've been depicted in deepfake pornography. I'm just supposed to be okay with that, because they're behind a keyboard.
 
Nina Jankowicz: And yet we've seen this direct pathway of radicalisation to violence from people who have started on the internet and then ended up shooting up pizza parlours, or worse. I think we need to reckon with that, and I don't think it's going to be an easy battle. When I started promoting my second book, How to Be a Woman Online, this free-speech-absolutist ideology about how we spoke on the internet just didn't square with actual free speech protections for women and minorities. If the folks who are using violent rhetoric, or misogynist, racist, homophobic rhetoric online — you name it — are allowed to shout loudly and incessantly at us, it means that these minority voices are going to silence themselves, and we've seen that happening. So I think we need to recognise that it's a balance, and that free speech is not absolute, even in the United States. One of the best ways to do that, in addition to adding some specific stipulations to our state and federal legal codes about things like cyberstalking and harassment, is to insist on more oversight over the platforms.
 
Nina Jankowicz: The UK has just passed its Online Safety Act, and it's come into action. I know there are a lot of very legitimate concerns about freedom of expression and encryption related to that legislation, but I do think that when it comes to racism, the protection of women and girls, and other hate speech, it is hitting the mark. What it does is make sure that the platforms are upholding their duty of care toward users, and it can fine individual executives for not upholding that duty of care. I also, by the way, quite like the eSafety Commissioner system in Australia.
 
Dr Miah Hammond-Errey: Thank you for writing the book; we'll put a link in the show notes. It was How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back. How do you see that affecting women in public roles in 2024? I mean, even in Australia we have these protections, and yet we're still seeing a much greater rate of online harassment and abuse.
 
Nina Jankowicz: I think, unfortunately, this misogyny has been normalised by politicians, by people in power, by influencers. And so when there's no consequence for the people at the top who are doing it, we see an open door for anybody else to engage in it as well, and I think that's why we're seeing this increase. And I do think you're right that with that increase comes increased threat, not only for women in public life but for ordinary women as well. One of the things I try to raise the alarm about a lot is deepfake pornography. Most of the deepfakes that exist today are non-consensual deepfake porn of women — I think over 96%, although that statistic is old, from 2019.
 
Dr Miah Hammond-Errey: You've written a lot about being a woman online. And so what was your response to the deepfakes of Taylor Swift?
 
Nina Jankowicz: I'm worried about deepfakes because they're so democratised, so accessible; almost anybody can create them. And when we think of those tools in the hands of our adversaries: is Russia more likely to create a deepfake faking nuclear war, or are they more likely to create deepfake pornography of Kamala Harris or Alexandria Ocasio-Cortez or — name your powerful woman official — to undermine her, and undermine the robustness and equity of our democracy? I think that's much more likely and, frankly, technologically more possible, because these models are trained on women's bodies, and a lot of that modelling already exists. So I'm worried about that, and I'm worried about the constant violence that women in our public life have to face. I think there's such a reticence to call it what it is: violence. But having experienced it myself, and knowing what you go through as a person — your trauma response, your physical response when this stuff is happening to you — there's no other way to describe it. And it does have a lasting effect on you. But getting back to Taylor Swift.
 
Nina Jankowicz: My hope with Taylor is that she will use the position of power that she's in not just to call attention to this with Congress or the tech companies, but to really bring together a community of deepfake survivors and use her power and influence to potentially bring a class action suit against the creators or distributors of these deepfakes. In large part, deepfake pornography on the internet exists on a couple of key websites that Google indexes. So, again, potentially pressuring Google to de-index and demonetise these sites, or potentially bringing a class action suit against the sites that store and amplify these deepfakes, together with the women, both notable and ordinary, who have been depicted in deepfakes and had their lives upturned by them. I think that would be a really powerful message: yes, we do need federal-level legislation, which doesn't exist yet, to create either civil or criminal penalties for deepfake porn; but also, when you bring women together and when you piss them off, they happen to be very powerful. So look out. I think that is a perfect thing for Taylor Swift to do.
 
Dr Miah Hammond-Errey: It is definitely the first episode where we've talked about Tay-Tay.
 
Dr Miah Hammond-Errey: I want to go to a segment here. What are some of the interdependencies and vulnerabilities of the information environment that you wish were better understood?
 
Nina Jankowicz: Oh, I was just talking to a group of college students today about cybersecurity and information security. Often we see the study of information environments, or disinformation, lumped in under cyber. They are obviously two distinct fields that often connect, and I'm shocked how little we speak each other's languages. It's pretty rare that we see people talking about cyber resilience and information resilience in a way that is intertwined, and I'd like to see more of that. I'd also like to see both of these things demystified, rather than treated as highly technical subjects.
 
Dr Miah Hammond-Errey: How and why are the intersections between data and the information environment important?
 
Nina Jankowicz: Oh, man. The way these platforms stay free is by trafficking in our data. I think we need to understand the different data points that are being collected about us and the ways that they're used to target and manipulate us. Understanding the way that targeting works, plus all the other data points that aren't based on your location — your browsing history, your purchase history, all the data points that unfortunately in the United States are really just out of control and sold to so many companies — they're used to create a very detailed picture of you online. So data is absolutely part of the influence game, and the more control we have over our personal data, the more in control of our information environment we will be.
 
Dr Miah Hammond-Errey: Absolutely, I couldn't agree more. This information environment looks worse than the one ahead of the 2016 election. What would a second Trump presidency mean for the United States' information environment, and what would it mean for US democracy?
 
Nina Jankowicz: I don't think I can overstate the existential threat that Trump poses to democracy and the information environment. He's called the press the enemy of the people. He, of course, has absolutely no regard for facts. He has openly said he is going to dismantle the civil service and attempt to prosecute political enemies. He has mocked and questioned the utility of our most important alliances, not to mention the fact that he has endorsed violence in many cases, through January 6th and other violent events. We will be seeing the propagandising of our information environment even more than it already is: Fox, Newsmax, OANN, a bunch of conservative influencers and online media outlets that work and live and breathe for the Trump presidency, because it profits them. It's just a really scary state of affairs. And I personally know very well what it means to be the villain of the week, or the month, or the year — frankly, it was over a year for me. I still am sometimes mentioned on Fox News, and every time I am, it drives a huge wave of harassment toward me and my family, which is one of the reasons I'm suing Fox News for defamation. That power can't be overstated. And it would be coming not only from these media networks but from Trump himself. When you have the president doing that, it magnifies it so many times over.

Dr Miah Hammond-Errey: I think the thing you've highlighted there, which I've also seen in my own research, is that it is so easy to attack individuals and just remove them from the conversation, particularly women. It has a very chilling effect of removing those voices from the conversation, and I think that is a place in a democracy we don't want to be.
 
Dr Miah Hammond-Errey: I'm going to pivot here to ask you some questions about Russia, because you are, of course, an expert in disinformation and Russia and Ukraine. Can you talk us through the intersections of Russian disinformation and some of the world events we're seeing?
 
Nina Jankowicz: Woo! Yeah. Anywhere Russia goes, there's a huge disinformation campaign following. As we're recording this, we're just about two years from when the full-scale invasion of Ukraine began. I was leading some research that the Centre for Information Resilience is continuing today, looking at open-source verification and geolocation of incidents related to the war in Ukraine. At the time, we were tracking the Russian troop build-up on the borders of Ukraine, and I remember having all these conversations with journalists and various government officials from around the world — not in the States, but especially some in Europe — who were doubting that this was going to happen. And I said, I don't know how you can argue with hundreds of thousands of troops and all of this equipment being piled up at the border.
 
Nina Jankowicz: We — a global community — were trusting the word of Vladimir Putin, who had lied so many, many times before, trusting his assurances that these were just military exercises, to the point where people weren't believing what was in front of their eyes: the intelligence that the US and the UK had declassified, the open-source evidence that was there. They wouldn't believe it until the last minute. And I think that shows the strength of Russian disinformation. However, since then — for the first year or so of the full-scale invasion — we saw a pretty clear-eyed view from the international community about what Russia was up to, and a lack of tolerance for Russia's BS. That was so refreshing, and I think it was in part because of the effort led by the US and UK to declassify evidence and prebunk some of the Russian narratives that were coming out. It was also because of Ukrainian resilience and bravery, and the communication strategy of the Zelensky administration, which was fully supported by a grassroots civil society effort that he embraced as well. As I mentioned before, aid is sitting in Congress while congressmen twiddle their thumbs and Ukrainians are dying trying to keep their territory from Russia. Russia has absolutely been pouring salt on that wound for the last year. They have absolutely exploited this pre-existing vulnerability in American society, and they're doing similar things in Europe. And they've been very successful in global-majority countries, especially the Middle East, North Africa and Southeast Asia, where Ukraine is far away and what they see is US or Western hypocrisy in supporting Ukraine versus supporting other democratic movements that have happened in other places around the world.
 
Nina Jankowicz: Regarding Navalny's death, it's been interesting to see how Russian media have reacted to it. There's actually been a surprising amount of information released related to Navalny's death, and I think the idea there is to make it seem like there was no foul play — to be as transparent as possible and say, oh no, this guy just died; he was on a walk; it's okay. But obviously, when you put the full story together — how Navalny was treated when he was still organising in Russia; the fact that he was poisoned, which we know from open-source investigations that Bellingcat conducted; as well as the various trials and tribulations he had gone through since he was imprisoned again in 2021 — it just doesn't add up. So unless you are predisposed to believing that life in a Russian work camp is great and that Russia is great — perhaps if you're Tucker Carlson — I think a lot of people are seeing through that as well. And we have just seen a raft of sanctions announced by the US, with a lot of European allies following suit. As one of my dear friends, a journalist who covers Ukraine, Sabra Ayers, said: everything that grows in Russia is either killed or dies. I think that's a really clear-eyed view of Putin and his leadership.
 
Dr Miah Hammond-Errey: You've written a fair bit, obviously, about Russian disinformation in Ukraine across an extended period of time. I was wondering if you can talk us through the shifts between the 2014 invasion of Crimea and the ongoing invasion of Ukraine, and the way the information operations have changed tactically and become more integrated with national strategy, or grand strategy.
 
Nina Jankowicz: Absolutely. Yeah. So in 2014, I think we were seeing the auditioning of the tactics that we all became familiar with in the 2016 influence campaigns in the US, and in Russian disinformation since then: a lot of false amplifiers, trolls and bots commenting on news sites and message boards and on Russian social media like Odnoklassniki and VKontakte, which are Russian versions of Facebook. We've seen a lot of shift, both tactically and in terms of narratives, since 2014. I think the sort of rote manipulation that used to exist is no longer going to fly on most social media platforms, even if it's in Russian or Ukrainian.
 
Nina Jankowicz: Which is a difference, right? Where that stuff still exists, platforms are able to detect that kind of bot or troll activity. The other difference, I think, is that it's much more top-down and less bottom-up than it used to be. We used to have the IRA posting away all their different little screeds, throwing spaghetti at the wall and seeing what sticks. The stakes are higher now. It was a full-scale invasion with the intention of decapitating the Ukrainian government and taking over the entire country, not to mention the war crimes and human rights abuses committed by Russian soldiers. So I think there is a lot more message discipline, and a lot less of the little hijinks that occurred back then.
 
Nina Jankowicz: We're seeing a lot coming from Russian diplomatic accounts, through Russian media, and through information laundering: identifying those useful idiots around the world who are willing to carry the flag for Putin and the Kremlin. And that's really hard to push back against, right? It's not a fake account; it's somebody who's expressing their real opinion and moving that across the world's information environment. And ultimately, that plays into Putin's strategy of reassembling the post-Soviet world under one roof and pushing back against the expansion of NATO. That's what this is all about.
 
Dr Miah Hammond-Errey: I've heard you talk elsewhere about countries that have succeeded in countering disinformation among their citizens. Can you talk us through, briefly, some of those examples?
 
Nina Jankowicz: Yeah. So one of the ones that I was incredibly bullish on in earlier stages of my research was Estonia. Obviously Estonia had quite a large ethnic Russian population when it became independent in 1991. And as a result of various government policies, they essentially de facto segregated that population. That led to a lot of misgivings among them, which Russia itself was able to wheedle and make a lot worse. Estonia got the memo that they needed to work on integrating the Russian population, and so they opened up a lot of opportunities for them. They started investing more in Russian neighbourhoods and Russian cities in the country. They really tried to boost Estonian language learning, not in the forceful way they used to, but by opening classes that were free for Russians to attend. They invested in Russian language media, and they invested in education and media literacy as well. And what they've seen is that younger generations of ethnic Russians in Estonia have more of an affinity toward Europe and toward Estonia than they do toward Russia and the Russian Federation, even though they live in a kind of Russian-speaking enclave in Estonia. So I think that's one really great example.
 
Nina Jankowicz: I'm not saying it's fully solved, but they've really mitigated a problem that was a sore point for the Estonian government for a long time. I go into much more detail in my book. I look at a lot of the media literacy efforts across Central and Eastern Europe and the Nordics and Baltics as well, and I see a lot of promise. Ukraine has been really successful in its information and media literacy. I think this is always part of the solution. It is so important it really can't be overstated, and we've talked about some of the algorithmic literacy that we need already today. Teaching people to navigate their information environment helps with so many of the other questions that we've discussed today and, frankly, keeps people safer, too. One of the things I've talked about in testimonies before Congress is the need to train troops and families of deployed troops, who are, frankly, quite big targets of disinformation campaigns. And this is one of the ways that we can test out information and media literacy curricula in the future. So a lot of those countries have done really well with that, and I think it needs to be part of any solution to disinformation that we have. It's not about telling people what's true or false. It's just giving them the tools to navigate an increasingly complex information environment.

Dr Miah Hammond-Errey: When you look at the average usage across devices per day, the West is spending a lot of time online. Can you talk me through what you see as the relationship between attention capture and a changing information environment?
 
Nina Jankowicz: Oh, man. It's so crazy the way that our devices have kept us from long-term engagement with paper books. I'm rereading Vaclav Havel's The Power of the Powerless. It's a tiny book, right? And it's amazing how I struggle to keep my attention on it. And I think that also means that we are just drawn to the most salacious headlines; we're drawn to the most clickbaity material, the stuff that autoplays, the stuff that captures our attention, instead of engaging with nuance. And when we lose that nuance and have that context collapse, that is so much of the reason that disinformation finds fodder among vulnerable people these days.
 
Dr Miah Hammond-Errey: You briefly served as the executive director of the Disinformation Governance Board in the US Department of Homeland Security. The board was unfortunately disbanded after itself becoming the subject of a disinformation campaign. Do you have any reflections on that time, from a systemic change perspective but also personally?
 
Nina Jankowicz: Oh, I have lots of reflections, but we don't have a ton of time, so let me keep it top level. The biggest thing that I think governments need to be aware of when they are engaging in any counter-disinformation work is that they need to be super transparent and proactive about communicating about those efforts, and that people are going to have legitimate questions and they need to be ready to answer them. They also need to be ready to very unequivocally and forcefully dispel any allegations that are untrue. And that's what the Biden administration didn't do in my case, nor did they support me on a personal, physical, emotional security level. These sorts of attacks, which are going to become increasingly personalised, are not uncommon right now, and they're only going to become more common. And so what do governments do to support their people, particularly women and people of colour, when it comes to these personal attacks? That was the reason I left: the Biden administration was absolutely a deer in the headlights, with no idea what to do, not only about the false descriptions of the board, but about the personal attacks on me. They couldn't stick their neck out and say, actually, we hired Nina because she's one of the pre-eminent experts in this stuff, and here's her nonpartisan body of work that you are absolutely blowing out of proportion. They just couldn't find it in themselves to do that. Not to mention the physical and IT security bits that they were absolutely scrambling with. So I think we need to think proactively, both on the comms front and the security front, when engaging in any policy activity like this.
 
Dr Miah Hammond-Errey: Coming up is a segment called Emerging Tech for Emerging Leaders. Can you share any new and emerging technologies you think up-and-coming leaders of today and tomorrow need to know about?
 
Nina Jankowicz: So I am happy to recommend a plugin that I think anybody who is worried about their online security footprint should be using. It's called Privacy Party, and it's developed by Tracy Chou. It does a check-up of all of your social profiles, highlights the things that you might want to change, and does them for you, which makes it really easy. You don't have to screw around with the settings on your account and look for things that the platforms don't want you to see. It brings it all right up to the front. So that is something I would look at if you're worried about your digital security footprint.
 
Dr Miah Hammond-Errey: So what is the role of alliance building in technology policy?
 
Nina Jankowicz: Oh, gosh, I wish we did better with alliances and technology policy. I often feel like we are either working directly at odds with or racing against the EU in particular. Especially given the US relationship with freedom of expression, it's often hard to find overlap with our allies on what we think tech policy should look like. But I am cognisant that the more we come together in that space, the greater our effect will be. If we all did it together, it could be such a powerful message, and we just have not been able to do that yet. So I would love to see more of that. An alliance of democracies that is dealing with tech issues, I think, could be very effective indeed.
 
Dr Miah Hammond-Errey: Where do you see technology, power and informational contest going in 2024?
Nina Jankowicz: I think with all of the elections that are on the docket this year, we are going to continue to see interference and influence campaigns not only from the big three, but certainly from smaller nations as well, because the playbook has been opened for anybody who wants to do it. And I think we're going to see, especially with the introduction of democratised large language models, the ability to influence in a local language without detection, because it was usually those idiosyncratic linguistic slip-ups that allowed us to identify Chinese or Russian or Iranian disinformation. That's not going to happen anymore, right? It's so much easier now. And it's not just about elections; it goes beyond elections, like we were talking about before with Ukraine, to policy outcomes. So I think we're going to see a lot more of that in 2024 and beyond.
 
Dr Miah Hammond-Errey: Coming up is Eyes and Ears. What have you been reading, listening to, or watching lately?
 
Nina Jankowicz: I've got a big stack of books on civil resistance that I'm working my way through, and that's why I'm rereading The Power of the Powerless by Vaclav Havel, which I read in graduate school and haven't revisited in a while. I am thinking about my third book and ways that the concept of civil resistance can be applied to the digital, polarised, vitriolic age. So that's my work reading right now. In terms of fun reading, well, watching, my guilty pleasure, and I can't believe I'm admitting this, is All Creatures Great and Small, a British series. It was a book before it was a TV series; this is the revival. It's about a vet in the Yorkshire Dales in the 30s and 40s. I just love it.
 
Dr Miah Hammond-Errey: I was gonna ask what you do in your downtime to keep sane. And now I know.
 
Nina Jankowicz: God, I don't have a lot of downtime yet. My baby is in the throes of entering the terrible twos.
 
Nina Jankowicz: They're just constantly trying to kill themselves, but at least they're cute while they do it.
 
Dr Miah Hammond-Errey: Yeah. This is where we are a great advertisement for kids.
 
Dr Miah Hammond-Errey: My final segment is need to know. Is there anything I didn't ask that would have been great to cover?
 
Nina Jankowicz: Oh, gosh. No, Miah, it was a delight to talk to you.

What is Technology and Security (TS)?

Technology and Security (TS) explores the intersections of emerging technologies and security. It is hosted by Dr Miah Hammond-Errey. Each month, experts in technology and security join Miah to discuss pressing issues, policy debates, international developments, and share leadership and career advice. https://miahhe.com/about-ts | https://stratfutures.com

Transcript
Dr Miah Hammond-Errey: My guest today is Nina Jankowicz. Nina is a disinformation expert who advocates for those harassed online. She's the vice president of the Centre for Information Resilience. In 2022 she served as the executive director of the Disinformation Governance Board in the US Department of Homeland Security. She has authored two fabulous books, How to Lose the Information War and How to Be a Woman Online. Thanks so much for joining me, Nina.

Nina Jankowicz: I'm really excited to be here.

Dr Miah Hammond-Errey: We're coming to you today from the lands of the Dharawal people. We pay our respects to their elders, past, present and emerging. We acknowledge their continuing connection to land, sea, and community and extend that respect to all Aboriginal and Torres Strait Islander people.

Dr Miah Hammond-Errey: Nina, you have written extensively about elections and the information environment, and we've both written about the significance of 2024 for elections. 76 countries are holding elections this year, which will see more than half the world's population casting a ballot. What are you most concerned about?

Nina Jankowicz: I think the thing that makes me most worried is actually the sea change in how we think about content moderation and trust and safety as we head into this year. Back in 2016, 2017, this was a field that didn't really exist yet. Since then, we've seen the emergence of the field and the emergence of a lot of cooperation between the public and private sectors to identify and mitigate disinformation, which has resulted in some pretty big gains. Right? We've seen the French government take down campaigns where the Russians were trying to interfere in the 2017 French presidential election. We've seen the Iranian interference in 2020, where Iranians were pretending to be Proud Boys and targeting Democratic voters in swing states in the US election in order to try to suppress their voter turnout. All of that was a result of coordination between the public and private sectors, and it had less of an effect than it might have otherwise if we didn't have that coordination. And now what we see, especially here in the US, is this equivalence between any content moderation and censorship. As we head into this critical election period, I'm just worried that we're not going to see as much of the attention that led to actual change and mitigation of threats in previous election cycles. I'm also worried that we're not going to see as much robust research into this sphere, because in addition to all of that, we're also seeing harassment campaigns against individual researchers and their research institutions. And we have a large section of the population, at least here in the United States, that is equating not only keeping our democracy safe, but keeping people safe online, with censorship.

Dr Miah Hammond-Errey: In your recent Foreign Affairs article, you said that nearly eight years after Russian operatives attempted to interfere in the 2016 US presidential election, US democracy has become even less safe, the country's information environment more polluted, and the freedom of speech of US citizens more at risk. What do you think needs to be done about it?

Nina Jankowicz: I think we're in such an intractable place here in the United States right now, where again, we've seen these threats laid out over and over, and yet we've done so little about it. And so I think the first order of business is: let's have these hard conversations about what the proper relationship is between government and the social media platforms, and what oversight the social media platforms should have. What I would like to see is not even a regulatory regime, but a transparency and oversight regime over the social media platforms, so we understand the decisions that they're making, what they're moderating, why they're moderating it, and how much they're responding to user reports of harassment and threats and things like this.

Nina Jankowicz: Are they upholding their duty of care to their users? And are they allowing users a right to appeal and things like that? From there, with that information in hand, we can actually start to have the regulatory conversation that has been happening in fits and starts over the past eight years but really hasn't gotten anywhere. And frankly, while we continue to drag our feet, as the home of many of the social media companies, in letting them continue to self-regulate, it's going to keep affecting our national security and the freedom of speech of millions of people, not only Americans, but around the world.

Dr Miah Hammond-Errey: Can you describe what you see as the security threats of disinformation?

Nina Jankowicz: Yeah. In my research, the things that have been the most shocking to me are the times when our adversaries are able to actually affect real-world behaviour. So to me, it's not about the ads that the Russians bought in 2016, and it's not about bots and trolls. It is about real-world behaviour and the way that Russia, well, not just Russia, but that's my area of expertise, is able to change it.

Nina Jankowicz: What people seem to misunderstand, particularly about foreign disinformation, is that it's not just about silly words on the internet, or memes being targeted from the Internet Research Agency. It's about changing behaviour. It's about participation in our democracy and everybody maintaining their own free will. And so when I hear people say, oh, I don't really care if Russia's interfering in our democracy, that's kind of chilling to me. I care if any foreign nation is trying to unduly influence what American or Australian or EU voters are thinking and doing.

Nina Jankowicz: There are greater national security threats as well, to the decision making that is happening in our capitals. One clear way that's playing out right now, of course, is Ukraine aid. We have this enormous contingent of Republicans who, fed in part by Russian disinformation about Ukraine, are sitting on this aid while our Ukrainian allies are literally running out of bullets in the trenches. And if Ukraine is to fall, that opens the door not only for Russia to further explore what it sees as its near abroad in the former communist space, but for China to do the same in Taiwan. It opens the door for many actors in the Middle East to start to instigate these unilateral changes of internationally recognised borders. And that does not lead to a secure or prosperous or peaceful world.

Dr Miah Hammond-Errey: Do you see any technology solutions that might help us to be cognitively resilient, improve our infrastructure, or nudge us towards making better content and dissemination choices?

Nina Jankowicz: Yeah. So there have been a lot of attempts at this with the disinformation of yesteryear, and I think it's worked a little bit less well than some people had hoped. Where I am a little more optimistic is watermarking and the types of technologies that will allow us to distinguish deepfakes and manipulated video and images from the real stuff. Over the past seven years or so, we have seen a lot of efforts, actually led by industry, including companies like Adobe and its Content Authenticity Initiative, pioneering a digital watermark that would allow people to trust the images that they're seeing, or understand the ways in which they've been edited, if they have been edited. It's, I think, probably the best solution that we have. So I think that's a really good technological solution, and one of the reasons that I'm less worried about hyper-photorealistic deepfakes making such an impact on our discourse. When it comes to text-based deepfakes, so large language models and generative AI, we're in a little bit more hot water. But with images, I'm actually pretty optimistic, and I think that's a digital tool that's going to be pretty widespread in the future as deepfakes become more accessible and democratised.

Dr Miah Hammond-Errey: Absolutely. How can we improve what I think is necessary investment in the trust and safety functions?

Nina Jankowicz: Yeah. So the trust and safety folks are a bit of a unicorn, especially now, because many of them have been let go from various tech companies. They cropped up in the post-2016 era, not only looking at things like foreign interference, but also looking at things like hate speech and harassment online, child sexual abuse material, and violent content, and making sure that the things that were prohibited by community standards or terms of service were not making their way into people's feeds. At Twitter, before Elon Musk's acquisition, there were a couple of things that the trust and safety team did policy-wise that were interesting. They, of course, were behind the release of many, really all, of the transparency reports related to foreign interference.

Nina Jankowicz: So Twitter, unlike any other platform, would put out a data set, just a massive CSV file of all the foreign takedowns that they did, for researchers to play around in like a giant sandbox. And that was great, right? We gained so much understanding of the types of activity that they were seeing online. Alongside that, they also introduced the infamous "have you read the article?" pop-up before you retweet. They did some research on it, and it showed that that interstitial popping up before people retweeted things would actually reduce retweets by 20%. And that's a great way to put the brakes on disinformation. Similarly, with hate speech, they introduced a pop-up that, again, slowed down the uncivil behaviour on Twitter by a bit. So they were at the head of things.

Nina Jankowicz: At Facebook and Instagram currently, AI is making a lot of the content moderation decisions for them. So I would say trust and safety folks are the people who craft the policy; they're the ones that research and implement it. And then you've got your content moderators below that, if that helps people visualise how they exist within a company. I think they're a really critical function, because they're thinking about the user experience on the platform and actually thinking about protecting people.

Dr Miah Hammond-Errey: The US has a really specific challenge in the information environment, and I'm talking here about the First Amendment, of course. I'm really interested to hear your thoughts on how to manage polarisation and preserve the notion of free speech, while at the same time protecting vulnerable people, especially minorities, and reducing hate crimes.

Nina Jankowicz: This is something I've thought about a lot. I've dealt with a pretty significant amount of harassment myself. And what I've found is that the legal infrastructure in the United States for protection against some of the nastiest stuff you'll find on the internet, including cyberstalking and direct violent threats, does not exist. I have a cyberstalker myself, and I had to use a provision under Virginia law that basically exists for victims of domestic abuse and violence. That's how I had to get a protective order so that this man could not show up outside my home, or at my place of work, or at events that I do, and harass me. Again, these laws are not fit for purpose.

Nina Jankowicz: Words online do have an effect offline, particularly for women and minority communities. Often our families are brought into it. Often these are, I would say, direct and credible threats. In a country that unfortunately has the level of gun ownership and the level of extremism that we do right now, I think we have to rightfully take all of those threats pretty seriously. So I'm hoping that we see an expansion of those codes, particularly for things like deepfake pornography. That's an easy win, right? But there are thornier issues. If the people who said these things to me online were saying them to my face over and over, it would be a no-brainer; I would be able to get a protective order against them very easily. But just because they're behind a keyboard several states away, I'm meant to feel less safe, even though I've been doxed, even though I've been directly threatened, even though I've been depicted in deepfake pornography. I'm just supposed to be okay with that because they're behind a keyboard.

Nina Jankowicz: And yet we've seen this direct pathway of radicalisation to violence from people who have started on the internet and then ended up shooting up pizza parlours, or worse. I think we need to reckon with that. And I don't think it's going to be an easy battle, because when I started promoting my second book, How to Be a Woman Online, this free-speech-absolutist ideology about how we spoke on the internet just didn't jibe with actual free speech protections for women and minorities. If these folks who are using violent rhetoric, or misogynist, racist, homophobic rhetoric online, you name it, are allowed to shout loudly and incessantly at us, it means that these minority voices are going to silence themselves, and we've seen that happening. So I think we need to recognise that it's a balance, and that free speech is not absolute, even in the United States. One of the best ways to do that, in addition to adding some specific stipulations to our state and federal legal codes about things like cyberstalking and harassment, is to insist on more oversight over the platforms.

Nina Jankowicz: The UK has just passed their Online Safety Act, and it's come into action. I know there are a lot of very legitimate concerns about freedom of expression and encryption related to that legislation, but I do think that when it comes to racism, the protection of women and girls, and other hate speech, it is hitting the mark. What it does is make sure that the platforms are upholding their duty of care toward users, and it can fine individual executives for not upholding that duty of care. I also, by the way, quite like the eSafety Commissioner system in Australia.

Dr Miah Hammond-Errey: Thank you for writing the book. We'll put a link in the show notes, but it was How to Be a Woman Online: Surviving Abuse and Harassment and How to Fight Back. How do you see that affecting women in public roles in 2024? I mean, even in Australia, where we have these protections, we're still seeing a much greater rate of online harassment and abuse.

Nina Jankowicz: I think, unfortunately, this misogyny has been normalised by politicians, by people in power, by influencers. And so when there's no consequence for the people at the top who are doing it, we see an open door for anybody else to engage in it as well. And I think that's why we're seeing this increase. And I do think you're right that with that increase comes increased threat, not only for women in public life, but I would say for ordinary women as well. One of the things that I try to raise the alarm about a lot is deepfake pornography. Most of the deepfakes that exist today are non-consensual deepfake porn of women, probably over 96%, though that statistic is old, from 2019.

Dr Miah Hammond-Errey: You've written a lot about being a woman online. And so what was your response to the deepfakes of Taylor Swift?

Nina Jankowicz: I'm worried about deepfakes, right? Because they're becoming so democratised, so accessible; almost anybody can create them. And when we think of those tools in the hands of our adversaries: is Russia more likely to create a deepfake faking nuclear war, or are they more likely to create deepfake pornography of Kamala Harris or Alexandria Ocasio-Cortez or, you know, name your powerful woman official, to undermine her and undermine the robustness and equity of our democracy? I think that's much, much more likely, and frankly, technologically more possible, because these models are trained on women's bodies, and a lot of that modelling already exists. So I'm worried about that, and I'm worried about the constant violence that women in our public life have to face. I think there's such a reticence to call it what it is: violence. But having experienced it myself, and knowing what you go through as a person, your trauma response, your physical response when this stuff is happening to you, there's no other way to describe it. And it does have a lasting effect on you. Getting back to Taylor Swift.

Nina Jankowicz: My hope with Taylor is that she will use the position of power that she's in not just to call attention to this with Congress or the tech companies, but to really bring together a community of deepfake survivors and use her power, her influence, to potentially bring a class action suit against the creators or distributors of these deepfakes. In large part, deepfake pornography on the internet exists on a couple of key websites that Google indexes. So, again, potentially pressuring Google to de-index and demonetise these sites, or potentially bringing a class action suit against the sites that store and amplify these deepfakes, together with a bunch of the women, both notable and ordinary, who have been depicted in deepfakes and had their lives upturned by them. I think that would be a really powerful message: yes, we do need this federal-level legislation, which doesn't exist yet, for either civil or criminal penalties for deepfake porn, but also, when you bring women together and when you piss them off, they happen to be very powerful. So look out. I think that is a perfect thing for Taylor Swift to do.

Dr Miah Hammond-Errey: It is definitely the first episode where we've talked about Tay-Tay.

Dr Miah Hammond-Errey: I want to go to a segment here. What are some of the interdependencies and vulnerabilities of the information environment that you wish were better understood?

Nina Jankowicz: Oh, I was just talking to a group of college students today about cybersecurity and information security. Often we see the study of information environments, or disinformation, lumped in under cyber. I think they are obviously two distinct fields that often connect, and I'm kind of shocked how little we speak each other's languages. It's pretty rare that we see people talking about cyber resilience and information resilience in a way that is intertwined, and I'd like to see more of that. I'd also like to see the demystification of both of these things as highly technical subjects.

Dr Miah Hammond-Errey: How and why are the intersections between data and the information environment important?

Nina Jankowicz: Oh, man. Well, the way that these platforms stay free is by trafficking in our data. And I think we need to understand the different data points that are being collected about us and the ways that they're used to target and manipulate us. Understanding how that targeting works, plus all the other data points that aren't based on your location, your browsing history, your purchase history, all of the data points that unfortunately, in the United States, are really just out of control and sold to so many companies, and used to create a very detailed picture of you online. So data is absolutely part of the influence game, and the more control we have over our personal data, the more in control of our information environment we'll be.

Dr Miah Hammond-Errey: Absolutely. I couldn't agree more. This information environment looks worse than the one ahead of the 2016 election. What would a second Trump presidency mean for the United States information environment, and what would it mean for US democracy?

Nina Jankowicz: I don't think I can overstate the existential threat that Trump poses to democracy and the information environment. He's called the press the enemy of the people. He, of course, has absolutely no regard for facts. He has openly said he is going to dismantle the civil service and attempt to prosecute political enemies. He has mocked and questioned the utility of our most important alliances. Not to mention the fact that he has endorsed violence in many cases, through January 6th and other violent events. We will be seeing the propagandising of our information environment even more than it already is: Fox, Newsmax, OANN, a bunch of conservative influencers and online media outlets that work and live and breathe for the Trump presidency because it profits them. It's just a really scary state of affairs. And I personally know very well what it means to be the villain of the week, or the month, or the year. Frankly, it was over a year for me. I still am sometimes mentioned on Fox News, and every time I am, it drives a huge wave of harassment toward me and my family, which is one of the reasons I'm suing Fox News for defamation. That power can't be overstated. And it would be coming not only from these media networks, but from Trump himself. When you have the president doing that, it just magnifies it so many times over.
Dr Miah Hammond-Errey: I think the thing that you've highlighted there, which I've also seen in my own research, is that it is so easy to attack individuals, particularly women, and just remove them from the conversation. It has a very chilling effect of removing those voices from the conversation, and I think that is a place in a democracy we don't want to be.

Dr Miah Hammond-Errey: I'm going to pivot here to ask you some questions about Russia, because you are, of course, an expert in disinformation and Russia and Ukraine. Can you talk us through the intersections of Russian disinformation and some of the world events we're seeing?

Nina Jankowicz: Woo! Yeah. Anywhere Russia goes, there's a huge disinformation campaign following. So, with the full scale invasion of Ukraine: as we're recording this, we're just about two years from when it began. I was leading some research, which the Centre for Information Resilience is continuing today, looking at open source verification and geolocation of incidents related to the war in Ukraine. At the time, we were tracking the Russian troop build-up on the borders of Ukraine, and I remember having all these conversations with journalists and various government officials from around the world, not in the States, but especially some in Europe, who were just doubting that this was going to happen. And I said, I don't know how you can argue with hundreds of thousands of troops and all of this equipment being piled up at the border.

Nina Jankowicz: We, the global community, were trusting the word of Vladimir Putin, who had lied so many times before, trusting his assurances that these were just military exercises, to the point where people weren't believing what was in front of their eyes: the intel that the US and the UK had declassified, the open source evidence that was there. They wouldn't believe it until the last minute. I think that shows the strength of Russian disinformation. However, since then, I would say for the first year or so of the full scale invasion, we saw a pretty clear-eyed view coming from the international community about what Russia was up to, and a lack of tolerance for Russia's BS. That was so refreshing. I think that was in part because of the effort led by the US and UK to declassify evidence and prebunk some of the Russian narratives that were coming out. It was also because of Ukrainian resilience and bravery, and the communication strategy of the Zelensky administration, which was fully supported by a grassroots civil society effort that he embraced as well. As I mentioned before, aid is sitting in Congress while Congressmen twiddle their thumbs and Ukrainians are dying trying to keep their territory from Russia. Russia has absolutely been pouring salt on that wound for the last year. They have absolutely exploited this pre-existing vulnerability in American society, and they're doing similar things in Europe. And they've been very successful in global majority countries, especially the Middle East, North Africa and Southeast Asia, where Ukraine is far away, and what they see is US or Western hypocrisy in support of Ukraine versus support of other democratic movements that have happened in other places around the world.

Nina Jankowicz: Regarding Navalny's death, it's been interesting to see how Russian media have reacted to it. There's actually been a surprising amount of information released related to Navalny's death, and I think the idea there is to make it seem like there is no foul play, to be as transparent as possible and say, oh no, this guy just died, he was on a walk, it's okay. But obviously, when you put the full story together, of how Navalny was treated when he was still organising in Russia, the fact that he was poisoned, which we know from open source investigations that Bellingcat conducted, as well as the various trials and tribulations he had gone through since he was imprisoned again in 2021, it just doesn't add up. So unless you are predisposed to believing that life in a Russian work camp is great and that Russia is great, perhaps if you're Tucker Carlson, I think a lot of people are seeing through that as well. And we have just seen a raft of sanctions announced by the US, with a lot of European allies following suit. As one of my dear friends, the journalist Sabra Ayers, who covers Ukraine, said, everything that grows in Russia is either killed or dies. I think that's a really clear-eyed view of Putin and his leadership.

Dr Miah Hammond-Errey: You've written a fair bit, obviously, about Russian disinformation in Ukraine across an extended period of time. I was wondering if you can talk us through the shifts between the 2014 invasion of Crimea and the ongoing invasion of Ukraine, and some of the ways the information operations have changed tactically and become more integrated with national strategy or grand strategy.

Nina Jankowicz: Absolutely. So in 2014, I think we were seeing the auditioning of the tactics that we all became familiar with in the 2016 influence campaigns in the US, and in Russian disinformation since then: a lot of false amplifiers, trolls and bots commenting on news sites, message boards and Russian social media like Odnoklassniki and VKontakte, which are Russian versions of Facebook. We've seen a lot of a shift, both tactically and in terms of narratives, since 2014. I think the sort of rote manipulation that used to exist is no longer going to fly on most social media platforms, even if it's in Russian or Ukrainian.

Nina Jankowicz: Um, which is a difference, right? At least that stuff still exists. They're able to detect that kind of bot, uh, bot activity or troll activity. the other difference, I think, is that it's much more top down and less bottom up than it used to be. So we used to have the IRA kind of posting away all their different little screeds and kind of throwing spaghetti at the wall and seeing what sticks. the stakes are higher now. It was a full scale invasion with the intention of kind of decapitating the Ukrainian regime and taking over the entire country. Um, not to mention the kind of war crimes and human rights abuses that have taken place by, by Russian soldiers. And so I think there is a lot more, more message discipline and a lot less of kind of the, the little hijinks, um, that have occurred since then.

Nina Jankowicz: we're seeing, uh, a lot coming from Russian diplomatic accounts through Russian media and through information laundering, identifying those useful idiots, uh, around the world who are willing to carry the flag for Putin and the Kremlin. And it's, that's really hard to push back against. Right? It's not a fake account. it is uh, it's somebody who's expressing their real opinion and, um, and moving that kind of, uh, moving that across the, the world's information environment. And ultimately, you know, that plays into, uh, Putin's strategy of kind of reassembling the post-Soviet world together under one roof, uh, and pushing back against the expansion of NATO. That's all what this is about.

Dr Miah Hammond-Errey: I've heard you talk elsewhere about countries that have succeeded in countering disinformation among their citizens. Can you briefly talk us through some of those examples?

Nina Jankowicz: Yeah. One of the ones that I was incredibly bullish on in earlier stages of my research was Estonia. Estonia had quite a large ethnic Russian population when it became independent in 1991, and as a result of various government policies, it essentially de facto segregated that population. That led to a lot of misgivings among them, which Russia itself was able to wheedle at and make a lot worse. Estonia got the memo that it needed to work on integrating the Russian population, and so it opened up a lot of opportunities for them. They started investing more in Russian neighbourhoods and Russian cities in the country. They really tried to boost Estonian language learning, not in the forceful way they used to, but by opening classes that were free for Russians to attend. They invested in Russian language media, and they invested in education and media literacy as well. And what they've seen is that younger generations of ethnic Russians in Estonia have more of an affinity toward Europe and toward Estonia than toward Russia and the Russian Federation, even though they live in a Russian speaking enclave in Estonia. So I think that's one really great example.

Nina Jankowicz: I'm not saying it's fully solved, but they have really mitigated a problem that was a sore point for the Estonian government for a long time. I go into much more detail in my book. I look at a lot of the media literacy efforts across Central and Eastern Europe and the Nordics and Baltics as well, and I see a lot of promise. Ukraine has been really successful in its information and media literacy. I think this is always part of the solution; it is so important, it really can't be overstated. We've talked already today about some of the algorithmic literacy that we need. Teaching people to navigate their information environment helps with so many of the other questions that we've discussed today and, frankly, keeps people safer too. One of the things I've talked about in testimonies before Congress is the need to train troops and the families of deployed troops, who are, frankly, quite big targets of disinformation campaigns. That is one of the ways we can test out information and media literacy curricula in the future. So a lot of those countries have done really well with that, and I think it needs to be part of any solution to disinformation that we have. It's not about telling people what's true or false; it's about giving them the tools to navigate an increasingly complex information environment.
Dr Miah Hammond-Errey: When you look at average usage across devices per day, the West is spending a lot of time online. Can you talk me through what you see as the relationship between attention capture and a changing information environment?

Nina Jankowicz: Oh, man. It's so crazy the way our devices have kept us from long-term engagement with paper books. I'm rereading Vaclav Havel's The Power of the Powerless. It's a tiny book, and it's amazing how I struggle to keep my attention on it. I think that also means we are drawn to the most salacious headlines, the most clickbaity material, the stuff that autoplays, the stuff that captures our attention instead of engaging with nuance. And when we lose that nuance and we have that context collapse, that is so much of the reason that disinformation finds fodder among vulnerable people these days.

Dr Miah Hammond-Errey: You briefly served as the executive director of the Disinformation Governance Board in the US Department of Homeland Security. The board was unfortunately disbanded after itself becoming the subject of a disinformation campaign. Do you have any reflections on that time, from a systemic change perspective but also personally?

Nina Jankowicz: Oh, I have lots of reflections, but we don't have a ton of time, so let me keep it top level. The biggest thing that governments need to be aware of when they are engaging in any counter-disinformation work is that they need to be super transparent and proactive about communicating those efforts, and that people are going to have legitimate questions they need to be ready to answer. They also need to be ready to very unequivocally and forcefully dispel any allegations that are untrue. That's what the Biden administration didn't do in my case, nor did they support me on a personal, physical and emotional security level. These sorts of attacks, which are going to become increasingly personalised, are not uncommon right now, and they're only going to become more common. So what do governments do to support their people, particularly women and people of colour, when it comes to these personal attacks? That was the reason I left: the Biden administration was absolutely a deer in the headlights, with no idea what to do, not only about the false descriptions of the board, but about the personal attacks on me. They couldn't stick their neck out and say, actually, we hired Nina because she's one of the pre-eminent experts in this stuff, and here's her nonpartisan body of work that you are absolutely blowing out of proportion. They just couldn't find it in themselves to do that. Not to mention the physical and IT security bits that they were absolutely scrambling with. So I think we need to think proactively, both on the comms front and the security front, when engaging in any policy activity like this.

Dr Miah Hammond-Errey: Next is a segment called Emerging Tech for Emerging Leaders. Can you share any new and emerging technologies you think up and coming leaders of today and tomorrow need to know about?

Nina Jankowicz: I am happy to recommend a plugin that I think anybody worried about their online security footprint should be using. It's called Privacy Party, developed by Tracy Chou. It does a check-up of all of your social profiles, highlights the things you might want to change, and then makes the changes for you, which makes it really easy. You don't have to screw around with the settings on your account and look for things the platforms don't want you to see; it brings it all right up to the front. So that is something I would look at if you're worried about your digital security footprint.

Dr Miah Hammond-Errey: So what is the role of alliance building in technology policy?

Nina Jankowicz: Oh, gosh, I wish we did better with alliances and technology policy. I often feel like we are either working directly at odds with or racing against the EU in particular. Especially given the US relationship with freedom of expression, it's often hard to find overlap with our allies on what we think tech policy should look like. But I am cognisant that the more we come together in that space, the greater our effect will be. If we all did it together, it could be such a powerful message, and we just have not been able to do that yet. So I would love to see more of that. An alliance of democracies dealing with tech issues could be very effective indeed.

Dr Miah Hammond-Errey: Where do you see technology, power and informational contest going in 2024?
Nina Jankowicz: I think with all of the elections on the docket this year, we are going to continue to see interference and influence campaigns, not only from the big three but certainly from smaller nations as well, because the playbook has been opened for anybody who wants to use it. And especially with the introduction of democratised large language models, we're going to see the ability to influence in a local language without detection, because it was usually those idiosyncratic linguistic slip-ups that allowed us to identify Chinese or Russian or Iranian disinformation. That's not going to happen anymore, so it's so much easier. And it's not just about elections; it goes beyond elections to, as we were talking about before with Ukraine, policy outcomes. So I think we're going to see a lot more of that in 2024 and beyond.

Dr Miah Hammond-Errey: Coming up is Eyes and Ears. What have you been reading, listening to, or watching lately?

Nina Jankowicz: I've got a big stack of books on civil resistance that I'm working my way through, and that's why I'm rereading The Power of the Powerless by Vaclav Havel, which I read in graduate school and haven't revisited in a while. I am thinking about my third book and the ways that the concept of civil resistance can be applied to the digital, polarised, vitriolic age. So that's my work reading right now. In terms of fun reading, or rather watching, my guilty pleasure, and I can't believe I'm admitting this, is All Creatures Great and Small, a British series. It used to be a book before it was a TV series; this is the revival. It's about a vet in the Yorkshire Dales in the 1930s and 40s. I just love it.

Dr Miah Hammond-Errey: I was gonna ask what you do in your downtime to keep sane. And now I know.

Nina Jankowicz: God, I don't have a lot of downtime yet. My baby is in the throes of entering the terrible twos.

Nina Jankowicz: They're just constantly trying to kill themselves, but they're at least cute while they do it.

Dr Miah Hammond-Errey: Yeah. This is where we are: a great advertisement for kids.

Dr Miah Hammond-Errey: My final segment is need to know. Is there anything I didn't ask that would have been great to cover?

Nina Jankowicz: Oh, gosh. No, Miah, it was a delight to talk to you.