Technology Now

Alongside tackling global warming, conservation and protecting the diversity of our natural world are global priorities. AI plays a huge role, from analysing satellite imagery of reforestation efforts to identifying wildlife from acoustic sensors or camera traps.

However, there’s an issue in the biomes where many people live: these efforts are often taking place without their permission. Privacy, data protection and individual rights can be sidelined in the quest for a tech-driven solution to a global problem.

Joining us today is Joycelyn Longdon. She’s a PhD student in the Department of Computer Science at Cambridge University. Her research looks at how technical, AI-based solutions to environmental protection can respect local populations whilst still protecting our natural habitats.

She also runs Climate in Colour, an organisation dedicated to making conversations around climate more diverse and accessible.

This is Technology Now, a weekly show from Hewlett Packard Enterprise. Every week we look at a story that's been making headlines, take a look at the technology behind it, and explain why it matters to organisations and what we can learn from it.

We'd love to hear your one-minute review of books which have changed your year! Simply record them on your smart device or computer and upload them using this Google form: https://forms.gle/pqsWwFwQtdGCKqED6

Do you have a question for the expert? Ask it here using this Google form: https://forms.gle/8vzFNnPa94awARHMA

About this week's Guest, Joycelyn Longdon: https://www.cst.cam.ac.uk/people/jl2182
Climate in Colour: https://climateincolour.com/
Longdon, J. 2020. “Environmental Data Justice.” The Lancet Planetary Health 4 (November). DOI:10.1016/S2542-5196(20)30254-0.
Technology Untangled Season 4 Episode 1 - Unconscious Bias: Is AI dividing us? https://link.chtbl.com/TechnologyUntangled_401
Global competition for a limited pool of technology workers is heating up: https://www.imf.org/en/Publications/fandd/issues/2019/03/global-competition-for-technology-workers-costa




Creators & Guests

Host
Aubrey Lovell
Host
Michael Bird

What is Technology Now?

HPE News. Tech Insights. World-Class Innovations. We take you straight to the source — interviewing tech's foremost thought leaders and change-makers that are propelling businesses and industries forward.

Michael Bird (00:00):
Hello, hello, and welcome back to Technology Now, a weekly show from Hewlett Packard Enterprise, where we take what's happening in the world and explore how it's changing the way organizations are using technology. We are your hosts, Michael Bird...

Aubrey Lovell (00:23):
And Aubrey Lovell, and in this episode, we're taking a look at Bias in AI.

Michael Bird (00:28):
It's an area we've covered in this podcast before, as well as on our sister podcast Technology Untangled. But this time, we're taking a different approach, because we are looking at how AI can be biased when it comes to climate change and conservation.

Aubrey Lovell (00:47):
It's obviously an area that doesn't get a huge amount of attention, certainly not as much as it deserves. But we've previously learned on this podcast that bias in equals bias out, and that might mean that when we're using AI modeling to prepare solutions for local climate and environmental issues, we're at risk of not respecting factors and, frankly, communities who aren't represented. In this episode, we'll be looking at why and where current AI models are potentially going wrong, what can be done to fix it, and the ways in which AI can be used to help preserve our planet. We'll also be turning to you, our audience, for your questions to the expert and your recommendations on the books which have changed your year.

Michael Bird (01:26):
Sounds like this episode is going to be a must-listen, as, incidentally, is episode one of Technology Untangled Season Four on bias in AI. That's where we got the line "bias in equals bias out", and it's well worth a listen. So, if you are the kind of person who needs to know why what's going on in the world matters to your organization, then this podcast is for you. Oh, and if you haven't yet, do make sure you subscribe on your podcast app of choice so you don't miss out. Right, Aubrey, let's get on with it.

Aubrey Lovell (01:56):
Let's do it.

(02:00):
Okay. So, AI is being used in every part of our lives, from chatbots to self-driving cars, and we've talked before on a previous episode about how it's being used to inform weather predictions and long-term climate models. It's also being used in surprising ways on a much more grassroots level, quite literally. Take countering deforestation, for example. Increasingly, drones and predictive analysis are being used to track damage and provide reforestation efforts with simulations to plan out planting, and even suggest the best tree and plant types to suit the soil. But often, this isn't being done at a large governmental level. It's being done by smaller organizations on the ground, making use of increasingly accessible AI tools and compute.

Michael Bird (02:45):
But AI tools are mathematical solutions. They don't necessarily take into account non-mathematical factors, such as the impact of their solutions on local communities and farmers. So, how serious is the problem, and what could be done to bridge the gap? Joining us today is Joycelyn Longdon. She's a PhD student in the Department of Computer Science at Cambridge University. Her research looks at how technical, AI-based solutions to environmental protection can respect local populations whilst still protecting our natural habitats. She also runs Climate in Colour, an organization dedicated to making conversations around climate more diverse and accessible. Thank you so much for joining us, Joycelyn. First off, can you just tell us a little bit about your research? What got you on the track of looking at AI bias in conservation?

Joycelyn Longdon (03:38):
So, my research is part of a PhD program here at Cambridge called AI for Environmental Risk, and it's a program that trains doctoral researchers in machine learning techniques and in the application of those techniques to environmental risks. So, I am working on justice-led bioacoustics, and essentially, what those three words mean is working with forest communities and using acoustic technologies. Those are recording devices that we embed in the forest to listen to it, and usually you would analyze those recordings with machine learning algorithms in order to understand the populations of species in the forest: how those populations change at different times of the year, with different environmental factors, and with anthropogenic activity. But I'm coming at it, as I said, from a justice-led perspective. So, I'm really building up the whole research design, and the design of the technologies that I'm building, with community members, understanding what the opportunities are with this technology, but also what the possible harms and concerns are for communities on the ground, in order to create an environment where we can use technology for biodiversity conservation, but not at the expense of local forest communities.
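The bioacoustics pipeline Joycelyn describes (sensors record the forest, and algorithms flag species activity in the recordings) can be illustrated with a toy sketch. To be clear, this is not her method: real systems use trained machine learning classifiers, whereas this stand-in simply computes a spectrogram with NumPy and flags frames where energy in a hypothetical species' call band crosses a threshold. All names and parameters here are invented for the example.

```python
import numpy as np

def spectrogram_sketch(signal, frame=512, hop=256):
    """Toy magnitude spectrogram via a windowed short-time FFT."""
    window = np.hanning(frame)
    n_frames = 1 + (len(signal) - frame) // hop
    spec = np.empty((frame // 2 + 1, n_frames))
    for i in range(n_frames):
        chunk = signal[i * hop : i * hop + frame] * window
        spec[:, i] = np.abs(np.fft.rfft(chunk))
    return spec

def count_call_frames(spec, sr=16000, frame=512, lo_hz=2000, hi_hz=4000, thresh=5.0):
    """Count frames whose peak energy in the call band exceeds a threshold."""
    freqs = np.fft.rfftfreq(frame, d=1 / sr)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return int((spec[band].max(axis=0) > thresh).sum())

# Synthetic one-second "recording": silence plus a 3 kHz tone burst
# standing in for an animal call.
sr = 16000
t = np.arange(sr) / sr
audio = np.zeros(sr)
audio[4000:8000] = np.sin(2 * np.pi * 3000 * t[4000:8000])

spec = spectrogram_sketch(audio)
print(count_call_frames(spec, sr=sr) > 0)  # prints True: the burst is detected
```

A real deployment would replace the fixed threshold with a classifier trained on labelled calls, and, as Joycelyn argues, would be designed together with the communities whose forests are being recorded.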

Aubrey Lovell (05:00):
What does AI bias look like when it comes to conservation?

Joycelyn Longdon (05:04):
AI more and more is having a larger role in the decisions that not just conservationists and research institutions, but policymakers are making on the ground for decision-making. So, we can take an example of my master's thesis, which was looking at deforestation rates using satellite imagery and machine learning to understand deforestation within a conserved area. Now, these sites are motivated by creating carbon offsets for European companies and are largely informed by these techniques of using satellite imagery and machine learning to understand changes in forest cover. So, a lot of the time we're seeing actually the proof of concept or the continuation of programs being guided only by the outcomes of a machine learning algorithm and not the outcomes on the ground.

(05:57):
So, a research project found that that same site that had been marked as successful in forest conservation had actually not transferred any of the necessary monies to the community who were safeguarding that forest; that the ability for women to provide sustenance for the community was taken away, because they would usually farm in between the trees in an agroecological sense, but this was banned as part of that program; and that this led to tensions between villagers and communities around the region. And so, whilst in other spaces of AI, say with hiring or the justice system, there is something to be said about the direct output, here it's more a multifaceted question about what data goes in and what data doesn't. Many decisions, financial decisions, environmental decisions, are being made without consulting the people whose lives they're going to affect.
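The satellite-imagery approach Joycelyn mentions can also be sketched in miniature. Note this is a hedged stand-in, not her method: her master's work used machine learning over satellite imagery, while this example uses the standard NDVI vegetation index with a fixed threshold to flag a drop in forest cover between two dates. The function names and the threshold value are assumptions made purely for illustration.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index per pixel (a standard remote-sensing index)."""
    return (nir - red) / (nir + red + 1e-9)

def forest_loss_mask(red_t0, nir_t0, red_t1, nir_t1, drop=0.2):
    """Flag pixels whose NDVI fell by more than `drop` between two acquisition dates."""
    return (ndvi(red_t0, nir_t0) - ndvi(red_t1, nir_t1)) > drop

# Toy 4x4 scene: healthy vegetation everywhere, then one pixel cleared.
red_t0 = np.full((4, 4), 0.1)
nir_t0 = np.full((4, 4), 0.5)
red_t1, nir_t1 = red_t0.copy(), nir_t0.copy()
red_t1[0, 0], nir_t1[0, 0] = 0.4, 0.45  # bare soil reflects more red, less near-infrared

mask = forest_loss_mask(red_t0, nir_t0, red_t1, nir_t1)
print(int(mask.sum()))  # prints 1: only the cleared pixel is flagged
```

A mask like this can end up driving funding and policy decisions, which is exactly why, as Joycelyn notes, its outputs need checking against outcomes on the ground.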

Aubrey Lovell (07:04):
One of the examples you've given in the past is around acoustic sensors placed in the forest and the way their findings are interpreted. Can you tell us a little more about that?

Joycelyn Longdon (07:13):
Yeah. I think this really has to do with the very Westernized skew of conservation. Now, if we were to put these sensors in a national park, say in the UK, no one lives there. We go there for leisure, and usually it is not encroaching on anyone's home privacy. Now, forests around the world, especially tropical forests, and those that are the most biodiverse, are people's homes. And so, when we embed sensors and technologies that surveil and record, we are putting those in people's homes. It's akin to someone coming into your house or your garden and putting up a sensor for research. That would change the way that you engage with your house and with your garden. It would change the way you would converse with your friends and family. It would change how safe you felt in that environment.

(08:03):
So across the board, not just in bioacoustics but with camera traps, drones, all of these technologies are being embedded in people's homes and lands. The rights they can violate, rights to privacy, rights to consent, and agency over how that data is used, are often overlooked, and they are important in contexts that are not Western.

Michael Bird (08:27):
Is this an aspect of the field that's particularly well understood, do you think?

Joycelyn Longdon (08:31):
I think there are more and more conversations happening, and they have been happening for the last decade or two, about the colonial history of conservation and the ways in which conservation has violated the rights of many communities around the world, especially in the tropics. I think the difficulty comes in implementation, and I don't think there are enough examples of this work being done well, this work at the intersection of technology, data, and technology justice. There is a small group of conservationists that I interact with and work with who are real experts in this space, probably about 40 to 50 people. And so, the expertise is there. I just think, on a wider scale, we need a lot of capacity building. We need practices and methods to be implemented on the ground and to be taken to the government level, because I think that's where the real knowledge is missing.

(09:28):
I think that in the policymaking spheres in the government level, there's a huge amount of excitement for technology to just solve an issue and for it to be a bandaid to a very complex problem. And whilst some people see that complexity as a hindrance because of the urgency of the biodiversity crisis, we won't really get to the bottom of these issues, and we won't have equitable solutions that have efficacy if we don't tackle these justice issues, as well. So, I think we're right in the middle of all of it.

Michael Bird (09:58):
What are you hoping that will come out of your research?

Joycelyn Longdon (10:01):
I'm going to be really annoying with my answer and say that there is no perfect solution, only because there are so many different communities around the world and they all need different things. On a base level, what I hope is that my work provides methods and capacity building for researchers and policymakers to go about this work. And so, I guess the most simple solution would be that people have the capacity and the knowledge to get to those answers in whatever context they're in, that justice is at the center of all conservation technology development and deployment, and that we have a really rigorous approach to doing that. But of course, in each of these forest ecosystems or different landscapes, the exact solution is going to be a little bit different, based on so many different factors. But yeah, I think that getting to a point where it is second nature for justice to be built into technology development and deployment would be a really great place to be.

(11:00):
For anyone who's interested in learning more about sort of conservation, justice, and conservation data justice, I've sent some resources to the producers, so they will be in the show notes.

Aubrey Lovell (11:11):
That is just so incredibly interesting. Thank you so much, Joycelyn. It sounds like amazing research that you're doing and we're so grateful you could join us to talk about it. We'll be back with audience questions for you in a bit, so don't go anywhere.

Michael Bird (11:26):
So, next up, it's down to you, our audience, as we open the floor for you to give your recommendations on books which have changed the way that you look at the world, life, and business in the last 12 months. Now, they could be technology-based, they could have changed the way that you work, or they could have just made you look at the world in a slightly different way.

Aubrey Lovell (11:45):
And as always, if you want to share your recommendations, there is a link in the podcast description. Just record a voice note on your phone and send it over.

Ryan Sutton (11:54):
Hello, my name is Ryan Sutton, and a book that's changed my perspective is Alan Weisman's The World Without Us. It follows his years of research into trying to answer the question of what the world would look like if humans just suddenly disappeared. It's got lots of research and experts in there giving their perspective on what would happen, but it's not just a big fantasy about having a perfect world rid of humans; it also talks about the different ways that we can potentially reverse some of the impacts we've had. So, it's quite hopeful, where you'd think the title is maybe quite pessimistic. It's a very hopeful outlook, talking about the resiliency of the earth, and that's quite nice, especially in such troubled, troubled times.

Aubrey Lovell (12:42):
All right, thanks for that. Joycelyn, have you read anything in the last year, which has changed the way you look at the world?

Joycelyn Longdon (12:48):
There's a book called A River Called Time by an author called Courttia Newland. It's about a future London, but it's also an alternate history, where colonialism never happened. And in this city, there is a central, elite hub that has become the epitome of technological sustainability. The book is a warning against techno-utopias, and shows how technological futures that are devoid of spiritual, cultural, and social understanding will just continue to perpetuate the same systems of extraction and oppression as we have today. And yeah, it's a really great read.

Michael Bird (13:32):
Great, great recommendation. Thanks, Joycelyn.

(13:37):
It is time for questions from the audience. You've been sending in your questions to Joycelyn on grassroots conservation and AI, and we've pulled out a couple. So, the first question is from Laura in Aberdeen. She would like to know your opinions on organizations from the Global North parachuting in solutions for conservation in other parts of the world, and what can be done about it.

Joycelyn Longdon (14:00):
There's definitely a draw, when a solution is created in the Global North, to scale that up. I think that in the tech space, scaling up is a really key piece of jargon and a huge motivator for organizations, institutions, and companies. The issues with that are, one, that not all solutions are going to fit all contexts, and we end up with a mismatch between the technologies that have been built and exported and the cultural and societal needs of the places that they're exported to. But larger than that, I think we really suppress homegrown talent, because all of that talent gets poached by Western companies, large tech companies. And what we see happening is that fewer homegrown solutions are built by local people, and less research is led by local people, because the Global North has, of course, a huge amount of resource and capacity to drive these projects and solutions.

(15:07):
Instead, I think what the Global North can do is to use that resource to build capacity to fund and create incubators and create spaces for technologists on the continent to learn and train, but also to push forward their own ideas that are more relevant for their communities.

Michael Bird (15:27):
Excellent. Thanks, Joycelyn. And incidentally, if you want to know more about poaching of tech talent, as Joycelyn mentioned, there's an article on this from the International Monetary Fund, which we've linked in the show notes. It's a couple of years old, but it outlines the issue very well.

Aubrey Lovell (15:43):
Maria from Paris would like to know how much interest there is from traditional research bodies like universities to finding localized solutions to big global problems.

Joycelyn Longdon (15:54):
That's a good question. I don't know how to answer that, because I'm really lucky in that I have a great team around me, in terms of my supervisors and my research group, who believe in this kind of work. But I'm less sure that I could speak on behalf of the larger institution. And having conversations with other people doing this work, it does still feel like a very marginalized approach to this research. Institutions are, of course, motivated by impact, though we might question how genuine that impact is. But I think that institutions prioritize, and see as more fulfilling, projects that are on the largest scale, and that doesn't always allow space for this amount of nuance. And of course, we will have to scale up these ideas in some way or form, but it still feels like an approach that's coming in from the margins. I'm sure one day, just as with other spaces that used to be on the margins and are now worldwide, that will change. But currently, it still feels like a marginal approach to this kind of research.

Michael Bird (17:11):
Thanks, Joycelyn. And as promised, we'll drop a couple of links in the podcast description for more on these topics.

Aubrey Lovell (17:17):
All right, I'm catching my breath here. So, we're getting towards the end of the show, which means it's time for...

(17:22):
This Week in History.

Michael Bird (17:22):
This Week in History.

Aubrey Lovell (17:30):
A look at monumental events in the world of business and technology which have changed our lives, and our vocal cords.

Michael Bird (17:36):
Now, the clue last week was: the first computer to have six legs. And would you believe, it's the advent of debugging as a computer term, this week in 1947. In this context, the term is attributed to US Navy Rear Admiral Grace Hopper. She was leading the building and programming of the Harvard Mark II when a literal moth was found lodged in one of the relay banks. Now, for the nerdy among you, it was relay number 70 in Panel F. The term bug to describe electrical gremlins was already in common use, but this was the first literal case in computing. The moth was taped to the log sheet along with a comment about it being the first actual case of a bug being found, and the rest, as they say, is history. Incidentally, you can still see the logbook and moth in the Smithsonian.

Aubrey Lovell (18:31):
Wow, you really do learn something every day.

Michael Bird (18:33):
Yeah, every day's a school day.

Aubrey Lovell (18:36):
It is. And next week, the clue is: it's 1959 and the Moon meets the Moon. Huh, know what it is? Well, producer Sam is telling us to whisper it faintly in case the rest of us hear.

Michael Bird (18:52):
Well, that brings us to the end of Technology Now for this week. Do keep those suggestions coming in for life-changing books, using the link in the podcast description.

Aubrey Lovell (19:01):
Until then, thank you so much to our guest, Joycelyn Longdon, and to our listeners, as always, thank you all so much for joining us. Technology Now is hosted by Michael Bird and myself, Aubrey Lovell. This episode was produced by Sam Datta-Pollen and Zoe Anderson, with production support from Harry Morton, Alicia Kempson, Alison Paisley, Alyssa Mitri, Camilla Patel, and Alex Podmore. Technology Now is a Lower Street production for Hewlett Packard Enterprise. We'll see you next week. Cheers.