Technology and Security

BONUS: Tailwinds and Tensions of Technology for OSINT (presented at Australian OSINT Symposium)
 
In this bonus episode of the Technology & Security podcast, host Dr Miah Hammond-Errey presents to the Australian OSINT Symposium. Titled Tailwinds and Tensions of Technology for OSINT, the presentation identifies three major shifts and explores their impacts. The first is how the digital landscape and emerging tech are transforming intelligence. The second is a decline in the information environment. The third is the increased environment of competition and conflict, not just in our region but globally. It unpacks each of these and offers insights into some of the tailwinds and tensions for open-source intelligence.

Addressing questions from the audience, Dr Hammond-Errey speaks candidly on the roles of universities, the power dynamics between governments and big tech, potential social media bans, and the desire to ‘analogue’ life. This episode is a thought-provoking look at the tailwinds and tensions of new technologies for OSINT, and includes consideration of how we can foster a tech ecosystem that aligns with democratic values, placing human needs and safety at the forefront.
 
Resources mentioned in the recording:
 
·               Column: Lowy Interpreter, 18 Jun 2024, Should Australia ban TikTok?
·               Book: 29 Jan 2024, Big Data, Emerging Technology & Intelligence: National Security Disrupted, Routledge (30% off discount code: ADC24) 
·               Research Report: 9 Feb 2023, Secrecy, sovereignty and sharing: How data and emerging technologies are transforming intelligence. United States Studies Centre
·               OSINT Combine https://www.osintcombine.com/
·               To find out more about Australian OSINT Symposium (held annually): https://www.osintsymposium.com/ 
 
This podcast was recorded on the lands of the Gadigal people, and we pay our respects to their Elders past, present and emerging. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
 
Thanks to the talents of those involved. Music by Dr Paul Mac and production by Elliott Brennan. 
 

What is Technology and Security?

Technology and Security (TS) explores the intersections of emerging technologies and security. It is hosted by Dr Miah Hammond-Errey. Each month, experts in technology and security join Miah to discuss pressing issues, policy debates, international developments, and share leadership and career advice. https://miahhe.com/about-ts | https://stratfutures.com

06. Tailwinds and Tensions of Technology and OSINT.mp4
[00:00:01] Jane van Tienen: Welcome to Technology and Security, a podcast exploring the intersections of emerging technologies and national security. If you haven’t already picked it up, I’m not your regular host. My name is Jane van Tienen, and I’m the Chief Intelligence Officer at OSINT Combine, an Australian-founded global company that helps build enduring open-source intelligence capability in strategically aligned organisations. The reason I’m in your ears today is because your regular host, Dr Miah Hammond-Errey, recently spoke at the Australian OSINT Symposium, which is hosted annually by OSINT Combine. The Australian OSINT Symposium gives voice to diverse viewpoints from the Australian and international OSINT and technology communities, with a simple objective: to foster learning. It’s designed to energise an engaging conversation, showcasing different levels of experience and perspective.

Symposium is a true meeting of the minds, and it’s unique in the Southern Hemisphere to have a conference of this kind focused on open-source intelligence.
With that in mind, your regular listeners won’t be surprised that I reached out to you to seek your willingness and availability to contribute to Symposium. And lucky for us, you were up for it! So, for this special episode of the Technology & Security podcast, it’s my pleasure to introduce you, Miah, or more rightly, a recording of you speaking at the Australian OSINT Symposium, on the topic “Tailwinds and Tensions of Technology, OSINT and National Security”. Miah, thanks again for joining us at Symposium this year. It was terrific to have you contribute your energy, passion and perspectives on all things big data, emerging tech and intelligence. Over to you.

[00:01:36] Dr Miah Hammond-Errey: Jane, thank you for such a kind introduction. And, Chris, wherever you are, thank you so much for inviting me. I'm really stoked to be here today. When I looked across who was here, there's just an incredible wealth of knowledge in the room itself, so I'll definitely leave some time for questions. I do firstly want to acknowledge, of course, that we're on the lands of the Gadigal people, and I want to acknowledge their continued connection to land, sea and community and pay my respects to their Elders, past, present and emerging. I'm coming to you today with a big brief. I was asked to take you outside the everyday and focus on three really big shifts that are happening in technology and security, and then explore what that might mean for open-source intelligence. I know you've all been sitting down for a little while, so can we stand up, and stay standing if you think that we should ban TikTok? I can't tell if people are just dancing. What's happening? Can you stay standing if you think we should ban X, Twitter, whatever you want to call it?

[00:03:00] Dr Miah Hammond-Errey: Cool. Everyone can sit down. Obviously India banned TikTok in 2020, and the US is likely to implement its impending ban at some point. Brazil banned X earlier this year. And it wasn't on my 2024 bingo card, but banning socials for young people might now be a thing in Australia. I guess I wanted to use the start of these technology conversations, yes, all about socials, to talk about the longer and larger-reaching implications for open-source intelligence, but also for technology and society much more broadly. The discussion about banning applications, commercial user applications, in society offers some real insights into how I think we're going to start to see technology discussed more in public spaces. The way we view technology is rapidly changing. We've had the digital transformation, and then we all had this shared isolation experience of Covid, which was really fuelled by data, digital technologies and applications. And we're now entering a more contested tech space. This discussion highlights, I think, some of the challenges we're going to see: things like the impact on foreign investment in consumer applications made in certain countries, the capacity of the tech ecosystem more broadly to change OSINT access really rapidly, where we see the role and boundaries of government in our lives and in society, what we expect governments to do when it comes to platform regulation, and how we identify and manage individual, organisational and national technology and security risks.

[00:04:52] Dr Miah Hammond-Errey: So I'm going to talk to you now about what I see as the three big shifts in technology and security. The first one there is the digital landscape and emerging tech, which I argue in my book and PhD are transforming intelligence really broadly; I'm going to go through these in a bit more detail. The second is a decline in the information environment, and I will outline what I mean by information environment. I'm talking about the content and information; the infrastructure and the digital platforms we receive it in; and our brains, the cognitive resilience piece: how do we actually understand information, and what's the context it's delivered in? And finally, I won't spend too much time on this because I know Mark will have talked geopolitics, but the increased environment of competition and conflict, not just in our region but globally. So I'm going to unpack each of those a little and then talk about some of the tailwinds and tensions for open-source intelligence, and hopefully leave lots of time for questions. Just going to grab some stuff; turns out I can't use the tech. So basically, the premise of my book was that the big data landscape, which is data abundance, digital connectivity and ubiquitous technology, is transforming the activities, the ways of intelligence; the organisations that actually do intelligence, both in government and out of government; and also the communities of intelligence, the way that we actually organise and coalesce around intelligence.

[00:06:50] Dr Miah Hammond-Errey: I argued that they are creating new social harms and exacerbating existing national security threats. And I feel like, to this audience, the discussion about data abundance, digital connectivity and ubiquitous technology is going to be pretty basic, but I did just want to cover one point. For everyone in this room, I presume the understanding of the level of data coverage over our human lives, the quantified nature of that, is a kind of base point. But from a national security perspective, this is something that's really shifted. We did not used to have the omnipresence of digital quantification of our lives. And whilst that might seem really obvious, it does create other national security threats. It means that things like surveillance, targeting and tracking capabilities that were once government focused and government initiated can now, to a greater or lesser degree, also be undertaken by a really broad range of actors. For governments, that is a complete paradigm shift. So for open-source intelligence, that might be an incremental progression, but for the nation state, that is actually quite a big change. One of the things that is happening, and I think you'll really start to see an increase in discussion on this, is the concentration of power that has happened as a result of that shift.

[00:08:21] Dr Miah Hammond-Errey: So if we break those down a little bit more: when we talk about the volume of personal and personally identifying data, the digital connectivity that allows that transmission, and the ubiquity of technologies, whether they're simple web applications all the way through to emerged technologies like AI and emerging ones like neurotech, we can see that the companies that have managed that the best have concentrated power: informational power, technological power, geopolitical power, and they have concentrated that power jurisdictionally. So there are obviously some major hubs where that activity occurs. We're also seeing future tech disruption, and I just wanted to highlight the pace of consumer adoption. For the 100 million user mark, which is kind of the unicorn number of technology adoption, it took the telephone 75 years to reach 100 million users. It took Instagram a couple of years. It took ChatGPT two months. Does anyone know how long it took Threads? Does anyone want to ask what Threads is? Because I get that question all the time. It took Threads, Instagram's version of Twitter, one week to get to 100 million users. And we can see, at the pace we're adopting new technologies, that that rapid adoption will actually drive different kinds of disruption from what we've seen in the past. And I don't want to focus on it too much.

[00:09:54] Dr Miah Hammond-Errey: But I do just want to highlight that AI is changing our relationship to knowledge. It changes how we access knowledge. If we start to move towards voice assistants, for example, it will shape and influence our understanding of information. And again, it concentrates the power of who has the ability to shape that information. So the digital disruption discussion is not just about technology. It's also obviously revolutionising our social experience, and that has pretty profound implications for intelligence, for the kind of focus of intelligence itself. Most of my research has been inside the national intelligence community, or national security adjacent, and so many of these will perhaps not be quite as pertinent from an OSINT perspective, but I want to highlight the ones that are. Obviously, secrecy and the declassification of intelligence place a premium on what you choose to classify and the capabilities you choose to look after, and what that also does is create opportunity in the open-source space. It does the same thing for digital infrastructure. I've argued in multiple places that the contemporary digital infrastructure for national security intelligence is not where it needs to be.

[00:11:17] Dr Miah Hammond-Errey: And I think that's a place we can definitely look to our OSINT capability providers, as something to lean into, because I think that's where the future of real intelligence infrastructure is going to go. When we look at public trust in intelligence, we start to see a significant shift in the way that people trust organisations, government and intelligence agencies, and in the way that the volume of information in our environment is affecting how we think about trust. Trust is a pretty nebulous concept, and I won't get into the academics of it, but there's a lot to it. I also just want to highlight a paper I wrote, which looks at how the technology environment is changing fundamental principles and practices as set down by the Richardson Review, or as well articulated by Dennis: it is changing secrecy, sovereignty (or nationality) and sharing. I think those are three key areas where open-source intelligence has the possibility to contribute, but also areas that will have wide-ranging national security implications if we can't fix them. So I'm waiting, obviously, like everyone else, for the final copy of the Independent Intelligence Review to see what it says about technology and where we might be going.

[00:12:40] Dr Miah Hammond-Errey: I'm just going to throw this one in; I won't spend too long on it. It's a piece I wrote that highlights what I think are the characteristics of good intelligence, and what the leaders and practitioners in Australia's national intelligence community suggested to me were important. They came up with six features which, if you're a Sherman Kent fan, or a Lowenthal fan, or follow any intelligence scholar that you like, should be absolutely no surprise. They haven't been written down in one paper before, and we also haven't seen any exploration of what they mean in a technology-enabled environment. It's really easy to say these are the most important characteristics; what came out of my research was just how difficult it is for practitioners to achieve this in the current environment, which is why the next big shift is super important. So they are: intelligence needs to be timely, purposeful, actionable, accurate (a huge challenge), value-added and unbiased. And I'm super happy to take questions on that. But I wanted to quickly cover this off before we go to the next big shift. This first shift of digital transformation obviously brings with it some opportunities and challenges for OSINT. I'll start with the positives, because I'm a national security threat analyst at heart, right? So I always end with the threat, because that's the fun part. I think the tailwinds are: OSINT is obviously going to continue to grow, irrespective of what data access we have.

[00:14:11] Dr Miah Hammond-Errey: This environment is not slowing down. And when we start to look at the third big shift, which is regional competition and conflict, we'll see just how important it can be. But who does it, and how it might change, I think will continue to evolve. There's a huge discussion about whether more OSINT should be taken into government, or whether it should be done in partnership with others, and that conversation will continue on, I think, until there's either some government policy or your private providers become so good that it's impossible to avoid them. It's happening to a certain extent already. But one of the things that is significant in the technology transformation is the way that harms assessment, the ability to identify actual harms, is more diffused. Obviously we see that in the cybersecurity environment, but if you look across any major threat landscape in the national security space, identification of harms is much more difficult. They are more diffuse; you rely on many more actors, and many actors overseas. And so I think this is an area where OSINT can increase its options and opportunities for government, but also for the private sector: in intelligence, which we already see, but also in cyber and critical infrastructure, which I think is building. And I think there are going to be more areas of harms assessment, in the social-media-for-kids space as an example. The potential opportunities in language and audiovisual-to-text translation, as well as AI and information verification, are clearly profound here.

[00:15:54] Dr Miah Hammond-Errey: I caught the end of Mark's speech and, as he was saying, verification for senior decision makers is absolutely critical. It's one of the areas where non-state intelligence functions absolutely have the edge, and so I'd love to see that being built up more. There's lots in the content authenticity space, but looking at how we actually verify information across the board, I think, is going to be an ongoing and huge discussion. Some of the tensions, right. I mean, will AI replace human analysts? I get this question all the time, sometimes from fearful analysts, and often with hope from business leaders saying, now we're going to bring in AI and get rid of our analysts. And you're like, that's not going to work, at least not now. It may, obviously, at some point, when some of those capabilities improve enough. But it is a significant tension: where will the analytical capability of the human lie, and where will the capability of AI lie? As AI improves from that clunky Llama or ChatGPT that we all tried, we will start to see huge, huge gains in that space, particularly in OSINT. Then there's how to maximise the use of OSINT for national intelligence, which I sort of touched on. And one that is really live for me at the moment, particularly because of the impending privacy legislation which I'm following, is the way that the privacy landscape and our broader data landscape impact on both national security and OSINT capabilities.

[00:17:27] Dr Miah Hammond-Errey: And I've highlighted here tracking, targeting and surveillance. Significant policy shifts will have quite speedy implications here. Or, the opposite: policy silence will have really big negative impacts for individuals and for communities. Balancing those two will remain an ongoing tension, I think.

[00:17:27] Dr Miah Hammond-Errey: The decline in the information environment, in its resilience: this is a really big topic to try and hit as one of my three shifts, but I don't think we can talk about OSINT and not talk about it. So, as I mentioned to begin with, I think the information environment is three things. It is content or information; infrastructure, which I've extrapolated out to be our digital platforms and systems of creation, distribution and use; and cognitive resilience, how we engage with the information around us, how we interact with it, and what that really looks like. I've put here that I'm going to quickly cover some trends, and I just wanted to throw up some things that I've seen that other researchers are looking at.

[00:18:47] Dr Miah Hammond-Errey: So in the content and information space, what are we seeing? We're seeing a shift towards social: a significant increase in the number of individuals in Australia getting the predominant part of their news from social media. We're seeing a shift to digital: this is pretty obvious, but we are seeing a significant decline in the use of traditional print-based media, for the most part. What does that mean? Well, it means increased power for the platforms that are offering those services. From infrastructure, we're seeing a concentration of power in the companies that provide those services, and a capacity for companies to significantly shape what it is that we see, whether that's through automation or AI of news feeds, all the way through to actual critical infrastructure and the threats there. And the cognitive resilience piece is super interesting. I can see, on one hand, the reason the federal government is interested in banning social media for young people. The research clearly says there is a decline in mental health among young people that use their phones. Among all people that use their phones, actually. The research suggests that our attention is dropping as we increase our screen time. There are so many indicators about using social media particularly.

[00:20:14] Dr Miah Hammond-Errey: But the digital landscape, the digital environment, is decreasing our capacity to cognitively receive information effectively, and many of us are in overwhelm and overload all of the time. That's a fairly unsustainable position. I'm certainly not advocating for a social media ban, but I do think it's something to consider that this is a trend we're seeing in this information environment. So whether or not you like how they're trying to manage it, it is something we will need to manage, and that obviously has pretty significant implications for open-source intelligence. I've talked a little bit about the information environment and what it is, but it's clear, you don't need to read the research, to see how affected we are as individuals, as decision makers, by the information environment. Take mis- and disinformation: our information environment is susceptible to it across the stack, at every part of that stack, from ISP all the way up to the individual user. We're seeing an overwhelming use of non-military measures. This is the 'active measures' idea, the Russian term. I did my master's in Russian disinformation, so it's a topic I refer back to often. I remember looking at that in 2017 and saying, you know, Gerasimov said he thought it was 4 to 1, and in 2017 I was like, do you reckon this could be 10 to 1? And now I can't even imagine; it feels like 1 million to 1, right? The use of non-military measures to achieve state outcomes is increasing significantly.

[00:21:49] Dr Miah Hammond-Errey: And that is crossing over with our ability to interpret, understand and receive information and make good decisions. It's been spoken about heaps this year, but there's the impact of deepfakes on democracy and elections. Just this week we've seen another release of a video, claimed to be a deepfake, purporting to show an alleged offender snorting a white substance. And we've previously seen a number of high-quality deepfakes of politicians, whether that's Joe Biden in robocalls, all the way through to Senator Pocock's deepfake of Albanese announcing gambling legislation. So that is going to continue, and obviously that's going to be a strong legislative focus. But I think there is a huge role here for open-source intelligence to play in understanding the sentiment, the landscape, what is happening as a result of that, and then helping to actually ameliorate those problems. I think there has already been a significant increase in the use of information to influence people, decisions and outcomes, and that includes you. When I started talking about this, we'd seen a few examples here and there; now it's not just a few examples.

[00:23:09] Dr Miah Hammond-Errey: We saw a NATO commander subjected to an information campaign in Eastern Europe, which saw him pulled out. But since then we've seen a significant increase in attempts to silence voices that oppositions don't like. These voices are, unfortunately, often women; they're often people of colour; but they can be anyone. Anyone with a voice that is in opposition can easily become the subject of an information campaign, and not necessarily by a state actor; it could be any actor. But the problem is, as organisations, as leaders, how do we protect our people? How do we think about it if someone in my organisation has been subjected to this? What can we do? What are some of the ways we can help them? Unfortunately, at the moment there's not a lot. So, for the big shift of information resilience decline, I think there's one really high area of opportunity here, and that is that OSINT, as a capability, as a community, is an incredible place from which to promote platform changes that support democratic principles. If we aren't already at the point where we're seeing out-and-out conflict between tech companies and government, we will be. So get your popcorn out. But this is a place where we all, as individuals, need to stand up and say: this is what is important.

[00:24:42] Dr Miah Hammond-Errey: And then, as organisations and capabilities, to stand up and offer something. In terms of tensions, there's reducing privacy and security vulnerabilities. Here I mean that increasing personal privacy protections can, and probably will, decrease OSINT data access. That's a tension, because on the one hand we're excited that it's improving our personal privacy, and the community's personal privacy, but on the other hand it will make work more difficult. And there's the significant tension of emerging technologies: AI is already bringing a lot to this space, from deepfakes to auto-generated disinformation, all the way through to neurotechnology, which has the capacity to influence our thinking and actions as well as provide information. So I think there are some really big shifts there over time, and probably not much time; I'm talking another three months, another twelve months. We're expecting the introduction of consumer wearable neurotechnology that isn't like ginormous glasses. The Metawatch is planned for 2025, and that's a neurotechnology device. Once we start to see widespread consumer adoption of neurotechnology applications, I think we're going to see that rapid adoption pace, and that's where this will really speed up the decline in the information environment. The final piece I wanted to bring a bit of focus to, and this is deeply connected for me with the information environment, is that global and regional competition and conflict are headed in one direction.

[00:26:27] Dr Miah Hammond-Errey: It's almost impossible to imagine a reverse to cooperation right now; it just doesn't feel like we're headed in that direction, and there are a number of reasons for that. As Jane introduced, I ran the ASPI info ops team for a while, and one of the real pleasures of that job was that I got to work with some really incredible people. One of the reports we put together was about CCP influence in the Pacific, in the Solomon Islands particularly. I remember looking over the data: basically the team had scraped all of the publicly available data in the Solomon Islands, and we had analysed it according to a method, which I won't bore you with, that I created for my master's thesis. It looks at all the different ways that you can enact state-sponsored information influence or interference. I'd actually left ASPI by the time the report was published, and I remember seeing it published with the headline 'Suppressing the Lies and Telling the Truth', or 'Suppressing the Truth and Telling Lies'; I always get it the wrong way round. But it's true, unfortunately. It's one of those things. Unfortunately, what we're seeing in our region is such extreme information influence shaping the environment that it is becoming incredibly hostile.

[00:27:55] Dr Miah Hammond-Errey: And it's not just because of information; information is just an easy tool of first resort, if you like. I'm actually working on a paper at the moment with some colleagues looking at how the information environment is actually an incredible predictor of the military, economic and diplomatic space. In fact, I would argue that the way our technology environment has set up our information structures, it's now like a band around those three things: you can't push one up more than the others. Maybe it's always been that way, but now we can see it. So I've put these others up here to say I think DIME is super important, obviously: diplomatic, information, military and economic. But I think technology is now up there too, and it is shaping all of these factors. I recently interviewed Carly Kind, the Australian Privacy Commissioner, in a podcast that was released this week, and she said to me that these big tech companies are not intermediaries anymore; they are shaping our technology and privacy landscape. The extent to which we fight back, or the extent to which we engage with that, will have profound intelligence collection and analysis implications. I didn't mention it earlier, but in the cognitive resilience piece I did just want to highlight the role of diversity and cognitive diversity.

[00:29:23] Dr Miah Hammond-Errey: I had the absolute pleasure of interviewing Sir David Omand for the podcast, and we spoke about this extensively. Not all of it made the pod, but he's really passionate about cognitive diversity and how we as leaders create it, and I think that's an absolutely ripe and important area for more focus in intelligence work. And I put climate up there because so much of our security environment in the coming ten, 20, 50 years will be shaped by climate. We've seen historically how many large-scale migration issues have bled into security issues, and climate will add that plus so, so many more. Although I love the warm weather in spring, I'm really deeply concerned about what it means for the climate. And I hope that if there is scope for us to do anything, in joining together in open-source intelligence, I really hope we can do something on climate. So, to bring the final big shift to a conclusion: we are obviously at a point of increased competition and conflict, and I just find it so hard to see how we could ever get back to cooperation without a major and significant event, whether that's actual conflict or war taking place. It's significantly exacerbated by the decline in our information environment.

[00:30:50] Dr Miah Hammond-Errey: And the two are playing off each other. It does really require an immediate response, not necessarily by us in this room, but it's something to consider as we're going forward, using tools and building capabilities. In terms of tailwinds, I think there's an opportunity to improve understanding of foreign investment vulnerabilities, and I think here of things like the next TikTok, or any tech application that we're going to start to use. There's also the use of open-source intelligence partnerships. This is something I thought about a lot more after chatting with Byron Tau, who wrote the book Means of Control. Basically, the concept is that open-source intelligence can create new partnerships with countries that we can't share classified sources and methods with. And absolutely, our region is a perfect example of where open source could really drive those intelligence partnerships. And some tensions: supply chain vulnerabilities and foreign investment in tech. That foreign investment piece in tech might seem a little bit odd and nebulous, but investment not being able to flow into countries, any country, or the possibility of applications being suddenly shut down, is going to really jolt the market and keep people quite concerned. So I think that's going to change how we invest in technology, and I think it's something OSINT can really help with.

[00:32:15] Dr Miah Hammond-Errey: We can understand more about the risks of foreign investment concerns, and we can try to identify them early. So the three big shifts are that the digital landscape and new technologies are transforming intelligence and, of course, society; we're seeing a significant decline in the information environment across the board, but also on socials; and we're seeing an increase in regional competition slash conflict. I would say all three of these shifts make it harder to govern. They make it harder to unite populations and harder to solve real security threats. But considering these challenges, I think Australia is actually doing pretty well. It raises many questions about how we as individuals can navigate the complexities, but also our organizations, company boards and governments. And of course, wherever there are challenges, there are always going to be business opportunities. We'll absolutely need more OSINT capabilities in society and in government to make us safer and more secure. But I want to conclude on this with hope. We have an ability to shape our technology ecosystem. We need to provide policymakers and leaders with the best information we can to improve our business and national security decision making, as well as contribute to global efforts. We also need to imagine the kind of technology future we'd like to leave, and use OSINT to help chart the pathway forward. Thanks for having me, and I'm really happy to take any questions.

[00:34:01] Speaker1: All right. We've got a couple of roving mics. There's a question over here.

[00:34:10] Audience question: Thank you for that presentation, very thought provoking. What I'm wondering is, you talked about this rapid adoption of new tech. As someone who's a bit older than, say, the younger generation, I relate to people who are slow to adopt new technology, but I get that it's absolutely being adopted faster. And I know that Australia is one of the quicker adopters of things like smartphones. Do you think it's a reflection of generational change, or is there something else that is impacting how quickly we're adopting new technology? And is it region specific, or are there areas where that is happening faster? And I guess, how does Australia fit into that?

[00:35:10] Dr Miah Hammond-Errey: Good question. I just want to firstly say I have started a personal movement to try to analogue my life, so I'm trying to reduce my reliance on technology. Turns out I'm obviously not alone; there are millions of people around the world doing this for the exact same reason. We don't all want to adopt everything. Technology is being adopted across the board much more quickly, globally, driven by a couple of things. Economics, to start with: business leaders are being sold the idea that AI will make their business more effective, cheaper to operate and more efficient. The research you highlight that says Australia is an early adopter is a little bit old. It is true that when it comes to AI, we are actually one of the most concerned nations, with around 80% concerned about AI. But the trend really is that while we're a fast adopter of some things, there is actually quite a lot of community concern and caution about safety and security, which I love.

[00:36:22] Dr Miah Hammond-Errey: I think it's really important, and I think what we're seeing globally is people are happy to try an application, they're happy to try a consumer application. And I'm using that term really specifically, because when we think about things like neurotechnologies, we have medical applications and consumer applications, and they have really different legislative regimes around them. I think we will continue to see consumer adoption at a very rapid pace while there's some benefit to be gained. So if people feel like a service is something they really want to use, it will get adopted really quickly. There is a pushback, of course. Globally we are seeing significant cases against Google, against Apple, against Meta, against pretty much every big tech company, and many of them are being won. So we will see real pushback. And I think we'll see, or at least I hope we see, in Australia a much more diverse, locally or sovereign-built ecosystem, which is more competitive.

[00:37:31] Audience question: Audience question here. With increased tensions between government and tech companies over the information environment. What role can universities play or do they play?

[00:37:42] Dr Miah Hammond-Errey: Yeah, I didn't get the chance to go into the nitty gritty of what we can do about that environment, but there is a significant role here for universities, both in education and, obviously, in providing research evidence of specific harms and the way mitigation can work in a policy setting. So, you know, evidence-based policy. On the education side of the house, I think there is an absolute opportunity in Australia to rethink the way that formal education works. I mean, everyone is talking about this. The universities themselves are fully conscious we need to move towards different kinds of education. We need to prepare Australians for the jobs that we once thought were decades in the future and now look like they're right on top of us. And we also need to prepare Australians with the skills, whether they're formal qualifications or social and political skills, to navigate the new environment. And I think universities are an awesome place to start that process, or at least to collaborate with. I'd love to see more government, private sector and university collaborations, working as well with civil society, who are often on the front line identifying the actual social harms from things like social media or spyware and so on. I think there's so much more space for that. I think it's about investment. We need more investment.

[00:39:20] Audience question: Cheers. Thanks, Miah. Probably a two-part question. Maybe the first part, which you might want to answer last.

[00:39:33] Audience question: would you ban X or TikTok? Um, and what do you think about, you know, how information we consume in on these digital platforms? Are we going to see sort of a greater concentration of power by the big tech companies? And what does that mean for information and how we perceive knowledge in society?

[00:39:51] Dr Miah Hammond-Errey: Yeah. So I have published on whether I would ban TikTok. The answer is no. I don't believe in banning consumer applications. I don't think it's good for democracy. I think you should regulate the harms and enforce your regulation. And on the kids thing, if we're going to ban social media, my argument would be: don't ban social media, make social media safe for kids. Make those spaces safe for the people who use them, and it won't be a problem. And that's about smart regulation rather than banning. And the second question was about information resilience, as in what we can do about it? Yeah. This is a really exciting area. I think we can actually do a lot. I think we're always going to need things like debunking and prebunking, and trying to reduce awareness of, or access to, mis-, dis- and malinformation. But I really think here we need to focus on building up not just critical thinking skills, but actually regulating the aspects across the tech stack, the whole stack, not looking at just consumers or just critical infrastructure providers or ISPs. We actually need to look at the companies involved all the way along and say, how can you incrementally make this environment better for humans, rather than how can we stop misinformation from circulating? Because otherwise I think what we're going to see is just another form of that. I really think it needs to be a lot more comprehensive and across the whole technology stack, rather than looking at one application or one company, because I think that's where we get stuck. Yeah, that tech power is also a huge element of that. But that's another conversation.

[00:41:40] Audience question: I've just got a question, and that's a great segue for it. It seems that this social media ban that's been proposed by Labor is being led by commercial interests rather than any altruistic kind of cause, and I'm calling out News Corp here for championing themselves as the kind of leaders of that. Do you think there's a danger there, with those kinds of proclamations colouring the narratives around those proposals, leading to a backlash by the population saying they don't want bans? Because obviously they take that as an affront and censorship, but that obviously plays into the interests of big tech. So do you think there's a threat there, in that what should be fairly reasonable conversations around the role of big tech in our lives are being coloured by certain other vested interests steering the conversation in a way that's not really balanced or fair, derailing those kinds of efforts?

[00:42:40] Dr Miah Hammond-Errey: Yeah, I mean, I am definitely concerned about big tech power. I am also concerned about domestic policy interests being shaped by certain individuals. That's always happened, so I don't think this is anything too new. I do think we need to treat technology companies and social media companies differently to other, traditional targets of regulation, in that they have a significant influence power that I don't really think we have actually researched and understood yet. I guess when I step back and think about it as a possible decision maker, it must be incredibly difficult to know what to do here. I think they've made an incredible strategic error in this, and I would be astonished if we end up with a social media ban in any kind of short term, just given the realities of making that work and, as you say, a population backlash. Having said that, there's always this argument with big tech: it's a populist approach, right? Of course it's effective, because everyone is annoyed at the power of big tech, and everyone feels unable to pull themselves out of that. And I would argue governments are in exactly the same boat, right? The very infrastructure that they use relies on big tech companies. I make the case in a forthcoming article that they cannot go to war without the endorsement, essentially, of big tech at this point. And so they are also bound by the same realities that we are as individuals. And the only way to change and shape that is international collaboration and diplomatic efforts. It's why we have to do things in multilateral forums. We have to work in partnerships with others because we can't do it alone. So the social media ban, I think, is a real misstep. I'd leave that to the side and hope it kind of gets lost somewhere.
I think what would be more effective is effective regulation of the privacy landscape, looking across influence and interference in our own political system, and creating methods and ways to communicate authentic, verifiable information from government and trusted institutions. There are lots of other ways that problem could be approached. And, yeah, education.

[00:43:03] Dr Miah Hammond-Errey: It was a real honour and pleasure to join Jane, Chris and the OSINT Combine team at this year's symposium. Thank you very much for having me. Thank you also to an excellent audience for your engagement, great questions and many follow-ups after the event. Given the intense interest in the role of technology in intelligence production and security decision making, from time to time I'll be adding special editions of Technology and Security, with a purple logo highlighting these intelligence-specific episodes. Reach out and let me know how you find them.

[00:43:36] Dr Miah Hammond-Errey: Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_he or send an email to drmiah@stratfutures.com. You can find out more about the work we do on our website, www.stratfutures.com, also linked in the show notes. If you liked this episode, please rate, review and share it with your friends.