Cloud Realities

As more people are displaced by war, economic instability, and a warming planet, more countries are turning to AI-driven technology to “manage” the influx.
 
This week, Dave, Esmee and Rob talk to Petra Molnar, Harvard Faculty Associate and Author of 'The Walls Have Eyes', about how tech is being deployed at borders, how it is impacting communities and how innovation could be used for good.

TLDR
02:01 Confused about whether we learn from disasters
06:00 Cloud conversation with Petra Molnar
36:05 Research about the state of AI narratives and the perceived "story crisis"
41:35 Walking with a real dog in the park!

Resources
The Walls Have Eyes: https://www.amazon.co.uk/Walls-Have-Eyes-Artificial-Intelligence/dp/1620978369
 
Guest
Petra Molnar: https://www.linkedin.com/in/petra-molnar/

Hosts
Dave Chapman: https://www.linkedin.com/in/chapmandr/
Esmee van de Giessen: https://www.linkedin.com/in/esmeevandegiessen/
Rob Kernahan: https://www.linkedin.com/in/rob-kernahan/

Production
Marcel van der Burg: https://www.linkedin.com/in/marcel-vd-burg/
Dave Chapman: https://www.linkedin.com/in/chapmandr/

Sound
Ben Corbett: https://www.linkedin.com/in/ben-corbett-3b6a11135/
Louis Corbett:  https://www.linkedin.com/in/louis-corbett-087250264/

'Cloud Realities' is an original podcast from Capgemini

Creators and Guests

Host
Dave Chapman
Chief Cloud Evangelist with nearly 30 years of global experience in strategic development, transformation, program delivery, and operations, I bring a wealth of expertise to the world of cloud innovation. In addition to my professional expertise, I’m the creator and main host of the Cloud Realities podcast, where we explore the transformative power of cloud technology.
Host
Esmee van de Giessen
Principal Consultant Enterprise Transformation and Cloud Realities podcast host, bridges gaps to drive impactful change. With expertise in agile, value delivery, culture, and user adoption, she empowers teams and leaders to ensure technology enhances agility, resilience, and sustainable growth across ecosystems.
Host
Rob Kernahan
VP Chief Architect for Cloud and Cloud Realities podcast host, drives digital transformation by combining deep technical expertise with exceptional client engagement. Passionate about high-performance cultures, he leverages cloud and modern operating models to create low-friction, high-velocity environments that fuel business growth and empower people to thrive.
Producer
Marcel van der Burg
VP Global Marketing and producer of the Cloud Realities podcast, is a strategic marketing leader with 33+ years of experience. He drives global cloud marketing strategies, leveraging creativity, multi-channel expertise, and problem-solving to deliver impactful business growth in complex environments.

What is Cloud Realities?

Exploring the practical and exciting alternate realities that can be unleashed through cloud-driven transformation and cloud-native living and working.

Each episode, our hosts Dave, Esmee & Rob talk to Cloud leaders and practitioners to understand how previously untapped business value can be released, how to deal with the challenges and risks that come with bold ventures, and how human experience factors into all of this.

They cover Intelligent Industry, Customer Experience, Sustainability, AI, Data and Insight, Cyber, Cost, Leadership, Talent and, of course, Tech.

Together, Dave, Esmee & Rob have over 80 years of cloud and transformation experience and act as our guides through a new reality each week.

Web - https://www.capgemini.com/insights/research-library/cloud-realities-podcast/
Email - Podcasts.cor@capgemini.com

CR082: Surviving Migration in the Age of AI with Petra Molnar, Harvard Faculty Associate
[00:00:00] He's re-releasing his debut album, but if it gets to number one in the UK, he's going to legally change his name. And people can vote on what that name can be. Oh no! Oh no! Now, the UK public, when asked a question like that, cannot be trusted.
Welcome to Cloud Realities, an original podcast from Capgemini. And this week, it's a conversation show exploring the border industrial complex, how technology is being used at borders, and the challenges that are arising in communities around that. I'm Dave Chapman. I'm Esmee van de Giessen and I'm Rob Kernahan.
And I'm delighted to say that joining us for that conversation is Petra Molnar, Harvard Faculty Associate and the author of The Walls Have Eyes. Petra's been working in this space her whole [00:01:00] career and is going to share with us some insights from it. So Petra, it's wonderful to see you. I think you're in Brooklyn this morning, is that right?
Yes, that's right. Thanks so much for having me. Oh, it's our pleasure. And, uh, what's the weather like where you are? Really sunny and a beautiful, crisp, autumnal day. Oh, I see. That's nice. That's marvelous, isn't it? That paints a nice picture. Now, where do you stand on the, uh, pumpkin spice latte?
Too sweet for me. Thank you. I'm a matcha latte girl. Matcha. Hear, hear. Also, have you seen how many calories are in that drink? I mean, literally, it's half your daily intake in one drink. You look at calories, Rob. I do, when it's that high. Why do you say it like that, Es? Oh no, no, no. So as you can tell, Robert and Esmee are also here.
How are you doing, guys? You good? Alright, yeah, not too bad. It's all good. Yes. It's Friday. Got that Friday feeling, haven't we? It is. Last thing on a Friday in the UK and I guess pretty much just about lunchtime on a Friday in Brooklyn. Yeah, that's right. [00:02:00] Yeah, very nice. All right, let's move on. Robert, what are you being confused about?
Well, Dave, you remember the CrowdStrike fiasco that occurred, which has now been described as the largest global IT outage ever. I think it deserves that title. It does. I think it does deserve that moniker. It earned it. What do you wonder, though? Right. I always said that to be a good technology professional, you have to have been in a live room during a production incident to understand how you should better build systems and use technology to be, you know, more appropriate.
And I do wonder how lots of people will have learned lots of things about how they've set their organizations up off the back of that disaster. I know lots of companies lost lots of money, but actually was there a greater good that came out of CrowdStrike teaching everybody that when you have something so significant, you need to be better prepared in how you set yourselves up?
Because some companies dealt extremely well with it, so we'd say they've got lots of maturity in how they handle IT operations, and others fell foul of the issue. And [00:03:00] so that's the confusion in my mind. I know it was a bad impact for a day or two, but have they now learned a better way to use technology, to be more appropriate and to be better managed and all that stuff?
Because I'm sure there are some very harsh reviews going on about those who suffered badly from it, but actually, will they learn and become better organizations off the back of it? So this is the bit that's confusing me: is it better to have an outage like this every now and again, one that, you know, isn't massively impactful,
you know, where we got back to normal quite quickly, but actually there's probably some decent learnings in there somewhere? Well, of course, what should happen in any decent operational process is you should do dry runs of this stuff, right? Yeah. So on a relatively regular basis, and that used to be about annually, I think, last time I was running ops.
If ever. It's the classic, isn't it? You should do it twice a year or once a year, but everybody knows, ah, it's alright, it probably won't work. Or it becomes such a lightweight exercise when you do it that you don't really learn anything. [00:04:00] And then the worst thing that happens is you might do it quite seriously and somebody writes a report. And then nobody does anything about the report, or when it came to some really hard decisions about things, like maybe things needed to be re-contracted, or maybe actually we needed a big investment here to deal with that situation.
All of a sudden the can gets kicked down the road and that sort of stuff. So I think what something like you described does do is break processes that were prone to break anyway. And it would certainly shine a light on that. I think COVID as well, like, especially that early, early period of immediately having to work from home, and all of a sudden all of those projects that were running globally where, and this is hard to believe now, I think, but there were telephones on desks in offices pre-COVID.
Yeah. Like real, like proper, you picked up a handset and it had a cable, you know, like a spiral cable [00:05:00] attached to a box on your desk with buttons on it that you could ring somebody up with. And I remember, I mean, it wasn't a year before, it was maybe five years before, we were trying to take that out of the organization I was working in, and the amount of resistance that there was.
And that's my point. To take phones off desks. What an incident like the CrowdStrike one does is remove the resistance instantly, because there was such a dramatic impact that any of the arguments that were in play to not do it suddenly get removed. Yeah, yeah. There's nothing like a sense of urgency.
Yeah. So in terms of timing, this is brilliant, because it's budgeting season. It's September. We're all going for budgets. So you can either showcase, look how well we've done, and if you want to do even better next year, give us more money for X, Y, Z. And if it went completely wrong, you have a really good sense of urgency to get more budget.
So I think a lot of people in planning and budgeting are, well, they have a good case now. Well, there's nothing like an existential threat to focus the mind, is there? [00:06:00] Indeed. Indeed. Okay. Let's move on to our main subject of the day. Why don't we start Petra just by understanding a little bit of your background that will help us, I think, navigate a difficult subject.
So tell us a little bit about your background as a refugee lawyer and on from there, and just give us some context. Sure. So my official training is as a refugee and immigration lawyer and also a social anthropologist. And I'm also somebody who's crossed borders myself, I'm sure, like many listeners, and I've been working on migration issues since about 2008, but not in tech.
You know, I was doing a lot of frontline litigation in immigration detention cases, and then looking at other issues around gender and migration, but I kind of found myself at the intersection of tech and migration very much in 2018. Right, right. Before we go on to the very serious subject, it might be worth just pointing out that Rob has crossed a load of borders.
Now, Rob, [00:07:00] when you, when you go for... Where's this going? I don't know where this is going, Dave. When you go for a border crossing, you like to be prepared, right? Oh, now... Yeah. Yeah, no, there is a preparedness I do appreciate when traveling, especially internationally. When I'm asking to receive entry to someone else's country, I make sure that paperwork is in good order.
Unlike you, who just turns up and goes, Rob, where do we go next? Yeah, that is true. But like, it's not really an intersection of technology and borders for you, Rob. It's more of an intersection of printed-out stuff in a plastic wallet. I have this fear that when I turn up, my phone's run out of power. I've got everything printed and ready and, you know, all that good stuff.
Anyway, back to the, back to the main story. I couldn't resist, because I've crossed a number of borders with Rob and it's always a fraught experience. Prepared experience, David, prepared, not fraught. And how then did it start to intersect with technology and what digital [00:08:00] has to offer?
Yeah, so very much. You know, it's one of those stories that happened by accident for me back in 2018. We held an event at work in Toronto, Canada, kind of broadly looking at power and technology and how new projects are impacting vulnerable communities. And I started asking some questions about, well, what's happening on the migration side.
Right. And my colleague Lex Gill and I came across some really concerning information about the Canadian government utilizing automated decision-making tools without public knowledge or, you know, any kind of criticism or open conversations about what they were actually doing. And so we wrote a report about it called Bots at the Gate, which came out six years ago, and I wasn't really expecting much, you know, I thought it would go out into the ether, like these reports do, right?
And I'd kind of move on with my life. Like 12 downloads and be like, what? Exactly. Very, very typical for, you know, an academic and a lawyer to get that kind of stuff. But yeah, it got a lot of attention and made its way all the way to the UN. [00:09:00] And for me personally, it kind of opened up my eyes to this new intersection of trying to understand how technology and new tools are impacting people on the move and people's experiences as they cross borders.
And then it's been a wild journey since. Now, what in that paper were the things that were particularly catching people's attention? You know, what was the surprise? Yeah, you know, I think the surprising part was that it was one of the first papers, if not the first, to use a human rights analysis on this technology, to say automated decision making and algorithms in immigration infringe people's right to privacy, right to equality and freedom from discrimination, even right to life.
You know, for me back then, to be totally honest, again, I'm not a tech person, like I did all the immigration and human rights analysis. And my colleague did the tech analysis. I didn't really know what an algorithm was. We're talking like Wikipedia level knowledge, you know, um, but I learned a lot as well.
And [00:10:00] since then, you know, I think it provided a lens and an entry point into this topic that I've been trying to understand. But that's it. That's a great way to learn it, I think, from the lens that you're looking at, because if you're born in technology and you grow up through it, so it was your core discipline, then you're sometimes enveloped by the fact that that's what you do and that's what you're used to.
So you're almost blind to the consequences that you might be creating. A classic one is people who created the algorithm didn't always foresee the damage it may do to society. And there's lots of that. So coming in from the other direction gives you a completely different perspective. And I think that probably adds a lot of value to the conversation that we're having, because so many in tech don't appreciate the positive and negative consequences of what they're about to undertake, or rather they understand the positive and don't want to think about the negative, I would say.
Yeah. And thanks for saying it that way, because, you know, I remember in the early years of trying to do this kind of tech and human rights work, I felt almost shy, like I shouldn't be in this space, or that, you know, [00:11:00] I'm not an expert on algorithms or machine learning. But, you know, maybe I am an expert on the human impact, right? And then ultimately that's what it's about: to try and illustrate what this technology is doing to real people on the ground.
Very well put, Rob. The consequences and impact, even in basic interface design and user experience design and customer experience design. Sometimes they're thought about very specifically and great experiences are created. There's this whole industry around creating excellent customer experience, and then sometimes
they're not thought about at all, and it just comes off as clunky. There are certain bits of software, which I will not name, that still can't get an interface design working in the year 2024. And we've got drones and almost flying cars, and yet some of these interfaces look like the worst thing in the world.
And then you've got the unintended consequences, I think, that you're [00:12:00] describing here. And I think, as you developed on from the paper, you moved on to a book, The Walls Have Eyes. Do you want to just tell us a little bit about that, and then we'll wade into some of the observations?
Sure. So I've been working on this book for six years as a result of this paper that I mentioned back in 2018. And this is kind of where my lawyer hat is replaced by my anthropologist hat. I've always been trying to work from an interdisciplinary and a global perspective, to have a comparison kind of as the starting point for analysis.
And so I thought, okay, well, if this is happening in Canada, what's happening in other places? And in 2020, I ended up going to Greece and, you know, as a good little ethnographer, I thought I would stay for two months, and I stayed for two and a half years, because it was one of the epicenters of a lot of this kind of technological development and also a jumping-off point for some of the other case studies and stories that are in the book.
For example, from the Kenya-Somalia border, also the US-Mexico corridor, occupied Palestinian territories and other contexts as well. [00:13:00] And yeah, it culminated in The Walls Have Eyes, which was published earlier this year. And yeah, I mean, it's been an amazing journey, a hard one, but ultimately I think the biggest
privilege of my life to try and understand this complicated topic and also hopefully to give space to people's stories as they are finding themselves on the sharpest edges of this technology. Yes, you are shining a light on an area that I think you don't necessarily normally consider, in the way that you and Rob were just taking it there.
It is fascinating. So what are the big conclusions that you are coming to in the book? And you used the phrase, when we were talking about this conversation, of the border industrial complex. I wonder if you could just unpick that a little bit for us and explain what that actually means. Sure. You know, so essentially at every point of a person's migration journey, they're now impacted by tech.
And a lot of this tech is high risk. It's unregulated. You know, it's things like drones, thermal cameras, uh, the kind of quote-unquote traditional [00:14:00] surveillance equipment. But it's also more draconian and experimental projects like AI lie detectors, robodogs, other types of really scary tools. But all of this is happening because of two kind of major trends.
One being that states oftentimes think of migration as a problem, and of people on the move, refugees, people who are seeking asylum, as a threat, either to national security or identity. And therefore, if people who are crossing borders are seen as a problem, the private sector has very cleverly decided to offer a solution.
And the solution oftentimes comes in these kind of technological ways. The other element to it, as you mentioned, is that it's very lucrative. Private sector actors make a lot of money in this. And it's what people like Todd Miller, who's an amazing journalist in Arizona, and I would urge you to check out his work,
has been calling the border industrial complex. And this is, like, we're talking a multi-billion-dollar industry that has sprung up, now offering these kinds of quote-unquote [00:15:00] solutions to governments that are, frankly, draconian and also infringe on people's human rights. And yet it makes sense when you start thinking about it from this kind of bottom-line perspective, that there is a quick buck to be made at the expense of people's lives.
And I thought the robot dog example was particularly resonant to me because, you know, we've all seen pictures of those robot dogs doing backflips and stuff like that, right? And you're like, God, look, it's a robot dog, you know, and you're coming at it from that angle. But facing down a robot dog
must feel somewhat different to that, I would imagine. Have you heard firsthand stories of interacting with such things? Yeah, I mean, the robodogs, I think, are probably one of the most visceral examples that I've come across and people have come across in this space. And it's also a very surreal piece of technology when you start thinking about it and contextualizing it at the border.
And in one of my most surreal experiences, and yeah, if you choose to read the book, you'll see I've had many over the years. [00:16:00] I've been really lucky because at the U.S.-Mexico border, I work quite closely with search and rescue groups or Samaritan groups that go into the desert to drop water, to assist people in distress, and sometimes also deal with human remains.
And interestingly, a lot of them are retirees, you know, they're 70- and 80-year-olds who, instead of being on a sun lounger, are going into the Sonoran Desert. I find that really fascinating. But I was with one such group in the Sonoran Desert in Arizona at the very moment, in February of 2022, when the Department of Homeland Security announced in a public press release, which is still available and you can Google it, that robodogs are going to be joining this kind of global arsenal of border tech.
And they even said something along the lines of the robodogs lending a helping hand, or a paw, to border enforcement. There's something so disturbing about that, because when you're in the desert you're witnessing and seeing places where people have passed away, right? Like, I've been to some of these spaces, and then to imagine that people like that will be chased [00:17:00] by robodogs.
I mean, there's just something really, really disturbing and visceral about that. It's that technology shows no empathy, isn't it? It's that point about, it's a very complex situation, and just throwing technology at it has no human factor at all. And you think, you know, it's the brutality of the situation.
I've said this before, but it's capitalism, you know, romping home with a problem, removing the human touch and just saying, right, we're going to deploy technology and we're going to forget what happens on the other side of that border, type thing, isn't it?
It's the lack of humanity that's quite alarming in many respects. Yeah, absolutely. And that's a trend that I've noticed, you know, in the hundreds of conversations that I've had over the course of, you know, the last six years, this kind of feeling that people share with me of being dehumanized, and even further dehumanized than they already are, right?
Because so many immigration systems are historically [00:18:00] discriminatory, racist, sexist. I mean, there are so many issues that happen in spaces at the border, and to think that more and more technology is kind of creating this barrier or veneer between our common humanity is what I really struggle with. Not to mention from a legal perspective too, right?
There's this kind of diffusion of responsibility that can happen when we start introducing any kind of automation. Because then the person who's making the decision can say, well, you know, I'm just following orders from this algorithm or the system that I'm using. Right. And once again, creating a barrier between you and the person that you're dealing with
as a human being, and instead seeing them as a data point that you have to analyze and deal with. That's a very good point. There's plenty of examples where people hide behind the algorithm and say, well, the computer told me to do this, but applied here, that has some very serious consequences for real people.
Yeah. Yeah. And it's where things like, you know, we've talked about things like data bias on the show before, where you might have [00:19:00] AI, and we'll come on to AI of course in a second, but you might have AI making decisions about certain things with a robust logic that is biased by a data set, and without having a human in that process somewhere.
I think, Rob, probably to your point about the lack of humanity in it, the risk of that must be very high. And, again, until we discussed this conversation before we had it, I hadn't quite thought about, I hadn't put myself in the shoes of the person who might be facing down a decision-making system like that.
I wonder if you've spoken to people that have been in that sort of situation. Yeah, absolutely. And that's really what the book tries to do. You know, it's not academic. It's more of a story-driven kind of narrative, because I think that's precisely the thing that's missing in the conversation.
Like, how are real people feeling when they're interacting with these systems, these projects, and what's left out of the conversation? You know, I mentioned one kind of project, or class of projects, [00:20:00] which broadly could be called AI lie detectors, if you will. I mean, we're simplifying it a bit, but essentially these were pilot projects, developed in the European Union under the Horizon 2020 research scheme, that used facial recognition and micro-expression analysis
to make determinations about whether or not somebody is telling the truth at the border. I mean, we can have a whole conversation about whether or not this even works, right? Like, it's been largely debunked as snake oil tech. But from a refugee law perspective, maybe just even a human perspective:
like, how can an AI lie detector deal with differences in cross-cultural communication, for example? I've represented people in court who didn't make eye contact with a judge of the opposite gender, maybe because of their religion, maybe because of their experiences, or maybe because of nerves. Or, you know, what about the impact of trauma on memory, and the fact that we don't tell stories in a linear way anyway? Like, human decision makers struggle with this already; how can a human-created piece of technology be any better?[00:21:00]
If anything, it's going to amplify the kind of biases around human behavior that already exist in our society and create new ones, right? Now, what do you see as the ripple effects into communities that are impacted by this? Like, we're probably all lucky enough, at least thus far, to have not been impacted in one of these quite horrible-sounding situations, but of course there are communities that are riven with people that are impacted by this. So what do you see?
Yeah, you know, I mean, in my work, I think keeping a close eye on the border and on migration spaces is important because that contextual specificity, I think, shows us the vast human rights impacts there. But it doesn't just stay there. Absolutely. This kind of technology bleeds over
into other facets of public life already. And even one of the sharpest examples: the robodog that we were talking about. A year after the Department of Homeland Security announced that it was going to be at the border, the New York City [00:22:00] Police Department announced that they were going to be using robodogs on the streets of New York City.
One was even painted white with black spots on it, like a Dalmatian, right? That's just one example. I mean, we've been seeing, you know, facial recognition used in public spaces and sports stadiums. There are welfare algorithms being used, sentencing algorithms, even predictive policing. So it is making incursions into facets of public life, while still disproportionately impacting communities that have been historically marginalized.
If you think, I mean, I was about to bring up the point that it's not long before the technology that's on the border gets turned inwards, and it's already happened. It's like, I mean, you sort of go, what does that actually mean? Because again, it's the same thing. I mean, maybe that brings it into sharp focus for the citizen in the country, who may not be totally aware of what's happening on the border, that suddenly they see them on the streets. And it won't be long before there's a story that has a problem associated with it, where it didn't quite go to plan.
And again, the dogs were probably [00:23:00] introduced with the best of intentions, from a perspective of we're trying to police and protect, but actually you can feel something bad is about to happen. And do we want that? Again, because it won't show humanity to the... Well, I thought you pointed out the fact that they painted it like a Dalmatian.
It's interesting. Trying to make it friendly. Trying to make it cute-looking. And maybe this is an obvious, uh, I don't know, sci-fi movie parallel to draw, but it reminds me of the scene in RoboCop with ED-209. And he comes in and it's like, you know, you have 20 seconds to comply. Oh yeah. And I'm not trying to trivialize the situation.
It's just that that's the immediate imagery that occurs in my head when you start thinking about it through this lens. And it's the, I mean, people have predicted this for ages: that once you create the technology, with the best of intentions, it often gets turned to other use cases. And it's been created, right?
So the genie's out of the bottle. And this is an example of robotics. Everybody thought, oh, you know, we'll create it. And then people find the dastardly, um, deployment of it. Yeah. This makes me think of that other point, [00:24:00] Dave, that you made about the priorities, right? Like whose priorities underpin what we innovate and why?
And to me, you know, as someone who's been trying to understand this from a global perspective, I just can't help but ask, like, why are we developing robodogs and AI lie detectors and not using AI to root out racist border guards or audit immigration decision makers, right? Like there's a clear normative choice there that is positioning this technology against a particular community, right?
And we then see this play out in other facets of life as well. What's the fundamental if you took all the money that's being spent on this tech and turned it into something far more progressive? I mean, you mentioned it's billions. You can do a lot with that amount of money, can't you? It's the who's deciding that route and who's overseeing them and how are they influenced.
It's that classic who's lobbying whom and where, so the budgets go to the wrong place. I think it is following the money. I was wondering, Petra, how open was it in your [00:25:00] research? Because it, you know, involves a lot of money, so a lot of people would really, you know, rather not have negative connotations around their technology.
How was that for you, coming in and researching that? Yeah, you know, a lot of actors don't want pesky human rights lawyers sticking their noses where we're not welcome. So there were definitely elements of the work that were difficult, both from a safety perspective and from an access perspective.
But I will say, it's changed as well, you know, over the last few years. Because when I was doing a lot of this work, trying to understand the private sector elements of it back in, let's say, 2021, 2022, it was honestly a little bit more open than it is now. You know, I could go to the World Border Security Congress and DEFEA and these other big, you know, multinational conferences that are out there, where private sector actors are selling their wares to the government.
And you walk into these giant halls, right, and you see, like, tanks and robodogs and drones, and all sorts of companies kind of selling their stuff. [00:26:00] It's not like they would be super friendly once they saw 'researcher', you know, on my badge and not 'private sector' or whatever, but at least I could be there, you know, and press was also welcome.
Although these days it's become very, very difficult. I have press colleagues who are not able to get credentials, for example, to go to these spaces at all. And to me, that shows again that, with the increasing critique around this, and legitimate critique that's based on established legal norms, right,
it's not like we're just pulling this critique out of thin air; we're trying to ground it in human rights jurisprudence and international law. Like, once you start critiquing it in this way, powerful actors are clamping down on even just the transparency element of what is being developed. You raise a point about press access, which is quite important, which is that transparency is a key part of, you know, an open and well-functioning society.
The director of the film Civil War was interviewed and asked where the idea for that film came from. It was playing out: if you take the base situation that's occurring and you keep amplifying it, you get to this dystopian future where, you know, you [00:27:00] get this authoritarian type of approach to the world.
And that creates the issue that the film plays off. But it is that you can see these things that we held so dear, like free access for the press, being denied. And you're like, well, is this the first couple of steps towards something that's much darker? Or, you know, how do we defend against it? I suppose your book is a good example, where you're at least raising awareness that this sort of stuff is going on.
And it's quite important that people understand it. And so much of it is ultimately about the politics of fear, isn't it? And pitting ourselves against one another, and utilizing communities on the margins as the kind of scapegoat for this. I mean, if you look at the UK government, the US government and conversations at the EU level, right?
Migration has become this specter that's been so politicized, in very particular ways, by everyone along the political spectrum, right? Instead of actually seeing each other as human beings, it animates these kinds of logics of, well, again, if migration is a threat and these people are frauds, [00:28:00] they're the ultimate other that have to be pushed away and made intelligible, trackable, and knowable.
And all of a sudden you have this lucrative technology, right, that will help you do that. It starts kind of making sense why we are seeing this kind of incursion of tech into this space, again, without conversations about people's human rights, and the kind of normalization of it almost as inevitable, right?
Without public transparency and accountability, or even a conversation like, are we okay with robodogs at the border or on our streets? We've not had that conversation. Well, it's very difficult to have any nuanced conversation these days. And technology is, I mean, that's an impact of technology as well, where we just like to shout at each other on social media instead of actually maybe debating what we should actually be thinking about.
But I wonder also whether, much like a lot of commercially driven tech innovation right now, it's just quite ungoverned. Yeah. And it's almost unregulated, and that's sort of fine when you're talking about, like, productivity apps or, you know, [00:29:00] ERP apps or something like that. But when you start to talk about existential technologies and dangerous technologies, that very quickly crosses into being quite unacceptable.
And we've had the conversation on the show quite a bit about AI sort of romping forward in a pretty unregulated way, uh, in a commercial arms race, and of course there's tons of upside, and we talk about the upside of it a lot. But actually there is also an ethical issue. There's a deeply moral issue around the impact it's going to have.
And I think what you're poking at here, quite rightly, is, you know, not only are these impacts terrifying in a lot of ways, but they're happening now. Like, this isn't a when, you know, we're not talking about RoboCop; we're actually talking about something that's happening right now. And it does feel like it's crept up on us somewhat, no?
Yeah, definitely. But I also don't think that's an accident, because I think spaces like the border [00:30:00] and spaces of humanitarian emergency, like refugee camps, have been these kind of laboratories of technological experimentation, where things are tested out largely without public scrutiny or sometimes even care, because when it's happening so far away from us, like, how can we identify with the people who are there? And then also, you know, the kind of political fearmongering of:
you know, all migrants and refugees are frauds, therefore why would we even engage with that population? But again, that allows for the border to become this laboratory of experimentation, normalize the tech, and then use it in other spaces. And that's precisely the kind of trend that we've been noticing over the last number of years.
In the work that you've done, and maybe to bring our conversation to a bit of a conclusion and ideally, maybe just bring some hope into the conversation, what are you seeing in terms of what good could look like? And what does the conversation around that, where is it up to? And you know, what can people do to help with that?
Yeah, and thanks for that, Dave, because I know the book and my work generally can skew a bit dystopic, [00:31:00] because it's trying to lay out, yeah, an evidentiary record of what's happening. But that's also only part of the story. Yeah, I've been to so many difficult, bleak places while writing this book and seen some really difficult things, but I have to tell you, at every single border or in every single emergency situation, there are always people who choose to help and show up,
and kind of try and resist what's happening, whether it's the 78-year-old Samaritans who are in Arizona or Polish farmers at the Polish-Belarusian border, you know, or lawyers supporting people on the move. I mean, there are so many amazing communities and individuals who are saying, no, this is wrong, and we want to either shed light on it or fight against what's happening at this intersection of tech and migration and human rights.
There have been some interesting, you know, legal challenges that are coming up, kind of utilizing the legal frameworks that exist, or trying to push new laws like the European Union's AI Act to be more cognizant of how AI can be hurting people. But there's also, [00:32:00] I think, another side to it, where
technology in the migration space can actually be used to empower communities and give people access to more information or psychosocial support. I've been seeing some of these projects grow in one of the initiatives that we incubate at my main job, which is called the Migration Technology Monitor. And there, people on the move themselves, refugees, for example, from Venezuela or Sudan, are developing tech that actually is for their community, by their community. Whether it's chatbots or psychosocial archives online, or just even information campaigns for migrant workers. Is that because the democratization of technology has helped them get their hands on tech that they can then leverage to help their situation?
Yeah, that's definitely part of it. And I think it's also partly about recognizing the expertise that's already there in mobile communities too. Of course. I mean, for example, one of my colleagues, his name is Simon Drotty, he's a data analyst and a refugee who lives in a Ugandan refugee camp. And he coded an entire app by himself.
I mean, [00:33:00] there's so much knowledge and amazing learning that has to come from the ground in order, I think, for the conversation to shift, and for us to actually look at who's around the table when we innovate and what kind of solutions are even proposed. Because if that happens so far away from places like refugee camps or borders, it doesn't actually capture the kind of complex reality that people are living through.
What would you do, if you had a mobilizing hand around something? What are the couple of steps that you would take to try and move the needle on something like this? There's a few things. One would be actually drawing up some legislation and some governance guardrails around tech, because, you know, I know there's this fear around stifling innovation, but I actually have to say some innovation needs to be stifled at this moment, because it's hurting real people.
And so whether we need a red line under robodogs or AI lie detectors or autonomous weapons, right, used in conflict, I actually do think we need to have conversations about [00:34:00] a ban, or at the very least a moratorium. No, actually a ban. I mean, it's a point well made, and it reminded me of the, um, the letter that I know a hundred
top AI thinkers wrote, was it about a year or two ago, where they kind of wrote an open letter that said we should actually pause development and innovation around this until we can understand what's happening. Now, that was unfortunately undermined by Elon Musk opening Grok about three weeks later, and therefore it actually looked like more of a... Disingenuous, didn't it?
Disingenuous position. I welcomed that letter at the time, because I did think somebody needed to make a stand on something like that. Yeah, and it must be coming from the private sector as well, right? Because if it's just a bunch of human rights lawyers, you know, or affected communities... I mean, this is obviously where the expertise and then the power really needs to be coming from.
But we also don't have access to the kind of brokerages of power, right, that happen [00:35:00] behind closed doors, when you have private sector actors meeting with industry, meeting with the public sector. A lot of this critique has to actually come from within the industry as well. The other thing I would say is, yeah, paying attention to where conversations around innovation are happening, and who's around the table and why. Again, affected communities are an afterthought at best in these conversations.
I think we actually need to be building brand new tables altogether, ones that are led by people on the move in this case, but by people from affected communities in other cases, as the starting point for expertise and the kind of information that needs to be there at the outset. Not, again, you know, down the line in the lifecycle of a project, but from the very beginning. I think that's how we kind of shift the needle on how technology is developed and what it's being developed for. Again, kind of centering the human experience in all of this.
[00:36:00] Yes, so I think Petra is actually showcasing the exact point that I've been wondering about these days. I love storytelling, I love narratives, especially when we're talking about technology, right? I said that before, like media archaeology: all the stories that come with technology, they're dystopian or utopian, but there's not a lot in the middle.
And I think that the nuances that we also discussed, I think that's so key. But now, with the speed of AI, you see so many of the huge tech companies, we now see Zuckerberg, but we also saw, like, Marc Benioff, all the big leaders are now talking about all the possibilities of AI and how it's going to bring us so much joy and a better world out there.
And I'm really wondering, with that speed, and you're an author, you're a researcher, it takes months, it takes years to do dedicated research. And now, with the blink of an eye or [00:37:00] a tweet, a new narrative is out there. So how do you look at that, with the speed, and, you know, trying to get to that nuanced situation?
That's such a good point. And it's one that keeps me up at night a lot, because, yeah, when you work on issues that are so fast-developing, and then even just from, yeah, trying to understand the kind of breadth of the different tech that's used at the border. I mean, oh my god, there's something new coming out nearly every day, right?
And you can't help but, again, you're kind of catching up to this extremely quick pace of development when it comes to tech, right? And I'm worried about that. Yeah. Because books take a really long time to put together. Even from the final draft, you know, it takes about a year and a half for the actual book to come out, and it immediately is outdated, right?
But I guess for me, you know, the technology is a lens through which to understand power and power differentials. And that remains the same, no matter what projects have kind of come out, you know, since putting pen to paper, so to speak. But it is, yeah, there is that kind of tension there [00:38:00] when you're trying to put together an evidentiary record, or tell a story in a way that captures a particular moment in time or an experience that a person is having,
while the kind of churn of innovation happens so, so quickly. But I think sitting in that tension is also okay, because, you know, it is complex and it is complicated, and no story is ever complete or perfect anyway, right? And particularly in what you were looking at, but, like, the root cause issue doesn't change, does it?
So the technical spiral that's perhaps exacerbating the situation doesn't really change the core situation; the core situation is still valid. It's just being amplified now by certain uses of brute force technology. Yeah, no, absolutely. And paying attention to the root causes, in my case, you know, root causes of migration, is really at the heart of it all.
Like, why are people forced to leave their homes? You know, this is this kind of foundational idea that I think we forget about, but most people don't want to be refugees, right? People don't want to leave their homes because [00:39:00] of war or conflict or environmental degradation. They oftentimes have to. And they might even have economic reasons to do so.
Like, who cares? Who doesn't, right? I mean, I know this might be a bit controversial for a lawyer to say, because we have these kind of rigid categories that you're supposed to fit into, but that's not how human life works, right? We all contain multitudes, so to speak. And I think it's a good thing to kind of sit in that complexity.
It's always the point about the judge, isn't it? The human who can preside and make a decision based on the nuance in the situation, whereas an algorithm won't do it. It's a good point. I think, going back to the storytelling point, when you have a very complex topic, sometimes bringing it to life through a story and storytelling brings realization to people, and it's a function you can use, or a mechanism, I should say, to widen people's understanding
of what is a very complicated thing. I think back in the tech industry, when The Phoenix Project was written by Gene Kim, he described a story about a world which was bad and how it [00:40:00] could be good. And that book has fundamentally changed the way the tech industry thinks. And I think you do need the deeper stories.
I know they take a lot longer to bring out, but I think in the end they have so much more power when they resonate through a story. Yeah, definitely. And that's precisely why The Walls Have Eyes is full of stories, so that it's not just another kind of dry academic text or policy analysis, but it centers the human impacts of all of this, and maybe in a more subtle way, although I do talk about it a little bit.
There's a little author's note at the beginning. I think storytelling is also an act of resistance, and slow storytelling too: this kind of deep, you know, concentrated methodology of showing up to places again and again and again, and building relationships with people. This is where, you know, the ethnographic way of working, of spending a lot of time in a place with people who are impacted, I think can be quite powerful, because it's almost like an antidote to this kind of move-fast-and-break-things ethos, which I do think is maybe changing a little bit [00:41:00] in the tech space, but it still is this kind of runaway train idea.
So I think slow storytelling is actually a good method of resistance to that as well, slowing down and spending time with one another. Well, I mean, what a subject. And I think, of all of the pods we've done, it's hard to pick one that's had such a meaningful emotional resonance in terms of the impact of technology, albeit in an extremely dark and serious way.
So Petra, thank you so much for spending time and sharing your research and insight with us today. Thank you so much for having me. It was an honor. Now we end every episode of this podcast by asking our guests what they're excited about doing next, and that could be you've got a great thing planned with some friends at the weekend, or it could be something in your professional life.
So Petra, what are you excited about doing next? Well, I'm excited about starting my second book, which is actually going to be on the joys and mechanisms of resistance when it comes to tech. And I'm also excited to walk my dog in the park later. Not a robot dog, a real dog. [00:42:00] You've actually got one. It comes on.
Yeah. Yeah. Yeah. That would be, that would be the deepest ironic ending. They are really friendly. Well, have a lovely weekend and thanks again. Thanks so much. Thank you so much, Petra, for joining us this week. And also thanks to our sound and editing wizards, Ben and Louis, you're doing an amazing job and Marcel, God, what would we do without you?
And thank you, all our listeners, very much. We're also on LinkedIn, obviously, so send us a text or a DM to let us know what you think of the show. If there's anything you'd like to add or, you know, whatever comes up in your mind, emailing cloudrealities@capgemini.com is also an option. And of course, if you haven't already done it, subscribe, please.
See you in another reality next [00:43:00] week.