Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.
Alix: [00:00:00] Hey there. Welcome to Computer Says Maybe. This is your host, Alix Dunn. And as we've mentioned on the past few episodes, we have been preparing something special for you on the heels of MozFest, which happened a little over a month ago, which is kind of wild. Fast four weeks in the run-up to the holidays.
This episode stitches together lots of different conversations we had at MozFest, both from the kind of formal programming, uh, you'll hear some panel conversations in here, but also from interviews we were able to have on the side with really interesting people who were headlining the event. So you'll hear a couple of those in here.
We also hear a little bit from Nabiha Syed, the executive director of Mozilla Foundation, and Helen King Turvey, who is a board member at Mozilla and has been to every MozFest but the first one. And we had them in a, um, cool conversation about past MozFests and what we might see in the [00:01:00] future, and kind of what they think the organization should be prioritizing when it convenes the broader Mozilla community. I'll stop there. But thank you to the Mozilla team who helped us put all this together and created lots of easy ways for us to record stuff while we were in Barcelona. And thank you to Georgia Iacovou and Sarah Myles from our team who put this episode together.
So it's day one of MozFest. Yeah, we can walk. Jake and I are walking around the venue and it is a really adorable, kind of like enclosed village inside of what feels like a castle. But I also [00:02:00] think the cute little kitsch coffee shops, um, might say it's a little bit more of a commercial enterprise than the castle might, uh, at first show.
Hi! Uh, already running into old friends, seeing people that I'm presuming, based on their outfits, are gonna be new friends. Um, and you can't see what I'm looking at, but there's basically a really adorable city walkway covered with an umbrella ceiling. So like the blue, the orange, everything's orange at MozFest.
And we're really excited to have our first session in just about a half hour focused on how data center expansion is happening globally. It's really nice to have this as kind of a follow up from our Climate Week live show where we talked to local organizers who are resisting data center expansion in American cities.
But this is obviously a global phenomenon. And so our first session is gonna be in conversation with journalists who have been covering data center expansion in other countries. So we'll hear from [00:03:00] them about what their process is like of doing investigations and helping local communities understand the implications, and also trying to get some transparency into an industry that has been, as Karen Hao puts it, um, positioning itself so that you need investigative journalism to have any idea what's going on. And they're gonna help us basically, like, get a bigger picture of what's going on around the world. So really looking forward to that session.
Okay. Hi everyone. Welcome to the morning of day one at MozFest. It's so nice to see these, I think, early risers. Um, I'll also say this is not even the first session on data centers; I think we've already had one. Um, so this is one of many discussions about the really important implications of all of the physical infrastructure being built everywhere around the world, whether we want it or not, to help this AI boom, whether we want that or not, either.
So my name's Alix Dunn and I'm the founder and CEO of The Maybe, and I'm here [00:04:00] co-hosting with,
Justin: I'm Justin Hendrix. I'm, uh, CEO and editor of Tech Policy Press.
Alix: And we are surrounded by three wonderful people who have been working on data centers, uh, to better understand how they are being developed, um, in different contexts around the world, investigating the implications of some of these projects. And also, when they're being built without any transparency, um, attempting to try and better tell stories, um, and help us understand how these companies are making choices that have huge implications for energy, water, air quality, and even community safety, I would say. So to, to kick us off, I wanted each of you to introduce yourselves, and when you do that, if you could share a little bit about the work you do and also a story about data center development that you think would help us sort of kick off a conversation about the issues that we are dealing with when we talk about the physical infrastructure that we're seeing pop up all around the world.
So, Pablo, do you wanna go first?
Pablo: My name is, uh, Pablo Jiménez Arandia. I'm an investigative reporter and freelance [00:05:00] journalist from here, based in Barcelona. For the last couple of years now, I've been doing quite some work on data center expansion and the data center boom in, in several countries. I started working on this topic here in Spain when, yeah, I started to hear news of huge data centers from US tech giants coming to Spain, and like one year ago I, I expanded the project to two countries in Latin America, Chile and, and Mexico, thanks to a fellowship program with the Pulitzer Center. I've been digging into what are the implications for local communities, but also what is the role of local authorities and companies, and I've been looking quite in detail into more than a dozen projects, like, in different stages of development.
But yeah, thinking about, like, one specific study, the work that I've done in Mexico: I've been focused on the region of Querétaro, which is becoming, like, one of the huge hubs for data centers in [00:06:00] Latin America. Microsoft is one of the companies that is building there, uh, one of its new hyperscale hubs. For those that don't know, I mean, hyperscale is the way the industry refers to those data centers which are the biggest ones. And yeah, and so, I mean, we've been looking in real detail into this hub of data centers from Microsoft. We found how, for instance, like, authorities there don't ask the company to disclose any information about how much water they need to cool the servers that are running inside the warehouses that they have. And at the same time, I mean, we were doing quite some reporting on the field, and we were visiting the, the rural communities that actually have, like, the data centers in their backyard. Literally, people there knew nothing about what was going on in, in these data centers. They didn't even know that they need water to operate. And these same people, they are facing, like, water outages almost on a daily [00:07:00] basis. So it's a huge area. So yeah, I think that this specific case exemplifies quite, quite well what, what we are dealing with, and yeah.
And there are, like, other issues with this specific project. They're using gas generators. I mean, we found out, we uncovered how, uh, they're using fossil fuel energy on site. I mean, they couldn't connect to the national electricity grid, so basically, instead of waiting to have all this infrastructure that they need, they decided just to install, like, a bunch of, uh, gas generators to kick off operations from these data centers. So I think that this really exemplifies the way big tech companies work, right? I mean, it's: build fast, start operating as soon as possible, without looking into the implications and the footprint that they leave, like, in the, in the communities.
Paz: So, hello, my name is Paz Peña. I'm the Mozilla Senior Fellow studying data centers.
I'm focused on understanding the social and environmental [00:08:00] impacts of data centers in Latin America. So one of the things that I believe is important, you know, a very interesting story, is that in Latin America, activism against data centers started back in 2020. Santiago, in Chile, is a big case, right? And I think it's very interesting because there's a very interesting history around data centers, of communities taking data centers to court, right? So this is really interesting because it's not only about holding companies accountable, but also governments, right? Because governments have a lot to do with data center investments in Latin America.
So, uh, one case is, for example, the case of Chile: in Santiago, in a little neighborhood called Cerrillos, Google started a data center back in 2020. The community took the case to court, and actually last year, I think, no, uh, the court said that [00:09:00] the government should not only inform the public about the use of water and about the different social and environmental impacts, but also take into consideration the climate change impacts in the data center's environmental impact assessment, right? So this is really interesting, right? Because what we are discussing, actually, is that data centers are actually affecting the way we are dealing with climate change. So you can see an example from Chile where this is something that is taken into account, right? That's one case which I think is super important for the world, actually.
The second one happened in, in Uruguay. So again Google, I dunno why, uh, is under controversy, right? Because they're building a data center near Montevideo, in Canelones. Montevideo is the capital, right? Uruguay is going through a really tough drought [00:10:00] period. People in Montevideo are actually taking showers with, you know, water from a bucket, right? It's a terrible situation for common citizens, and they announce that they're gonna build data centers there, in that situation. So that is incredible, right? So people decided to take this case to court, because when they asked the government to know what would be the amount of water that these data centers will consume, they said, we cannot say it because it's a corporate secret. Like, the government is saying it's a corporate secret. So the court finally says, you cannot take the, um, corporate secret as an excuse to not give information to people, especially on water. Water is a resource that everyone needs to know what's happening with, right? So again, no, the court is talking to the government, trying to hold the government accountable, to give information [00:11:00] about these investments and the social and environmental impacts.
Again, I think this is really important too, right? Because, again, uh, one of the problems that we are having in Latin America with data centers is the use of water, but also because we are living the consequences of climate change. And one of the consequences of climate change in Latin America is drought, right? Chile, Mexico, um, Brazil, even. We are all facing drought. So I think it's very important to hear these kinds of stories, that courts are actually functioning, trying to hold governments accountable around data centers.
Tessa: Thanks. Hi, I am Tessa. I'm the Impact Editor at Lighthouse Reports. Lighthouse is an investigative nonprofit, and we are behind investigative stories that you might see in newspapers and in media around the world.
We're not a publishing platform ourselves, but we work in collaboration with other journalists and other newsrooms in order to publish [00:12:00] investigations and to go where audiences are already consuming their news. For myself, as the impact editor, often what that means is a shift in thinking about the people who experience, for example, the firsthand harms of data centers, the communities themselves: a shift away from thinking of people just as sources or, or as points of information, and toward thinking of them as the reason that you do the story, but also the main beneficiaries of that story. And I think that fundamentally changes the way in which we report, and definitely the media partners with which we report. I think a big problem is that a big focus of AI coverage has been on, you know, the benefits of AI and, and delivering that story to, to consumers who are really, really excited about it.
But it kind of ignores the way in which communities are experiencing the harms or the repercussions of that needless expansion. I think what that means is making sure that those stories are delivered to communities in the ways in which they engage with the news, but also in the languages they engage with as well. And I think that's something that has been surprising about our work at Lighthouse [00:13:00], maybe not so surprising, but how much time and space it takes to build trust with communities who have been systematically screwed over by governments, by tech companies, even the media themselves, in order for those findings to, to resonate with those communities.
And I think an example of a story that I think does this well, and something that comes to mind, is xAI's data center in Memphis, which was a data center that cropped up pretty much overnight. And the first the communities were hearing about it was from the smell of the gas that was coming into their houses, and the confusion, you know, of, of that community, wondering if they had collectively all left their gas stoves on. But it was actually the new reality of that town. And I think this story just goes to show the complete disregard, first of all, of companies like xAI, who received no, you know, permits to, to open this data center, did not consult the communities at all, and yet, you know, are doing so to generate a huge amount of excitement and a huge amount of revenue for their [00:14:00] own personal AI pursuits, which for xAI is pretty much just Grok, which I don't think anybody could say really tangibly benefits their lives in, in any way. And I think this story just speaks to the, maybe, values or ideas behind many of these
AI pursuits, which is, you know, the sole pursuit of profit for companies at the expense of the communities who experience water shortages, whose lives are, are changed. And for those communities, you know, it's not just an annoyance of a noise or a smell; people in those towns are, are getting sick, are getting respiratory illnesses, are suffering from asthma or bronchitis from the, you know, high-risk gases. And I think this story shows the power imbalance, but also speaks to the need for, for journalists to focus on the way in which communities experience the expansion of AI and the data centers that are involved.
Alix: So for folks that are trying to follow this, it can be difficult. So you guys are, obviously this is your bread and butter, you're focused on, um, data centers globally, but what should people be looking out for over the next year if they're trying to [00:15:00] understand, is it getting better, is it getting worse?
Um, what are the kind of key stories that you're gonna be following and that maybe folks in the room, folks in the live stream should also be paying attention to? Just for them to be tracking what's happening and the kind of overall trajectory of things.
Paz: I will say that the big story in the next two years will be the energy bills that we are paying, and how we are subsidizing big tech companies.
This is something that you are already seeing in the US and in Ireland. It's not so clear in other parts of the world, so we need to do a lot of research on that. But I will say that if we can do that story, so that people can understand that they are paying for electricity for big tech companies, this is gonna be huge, and I think that is the story that I will follow in the next two years.
Tessa: Yeah, I mean, something I think that continually comes up as we're speaking to communities who are facing the harms of, [00:16:00] of data centers being built is that, you know, whilst they're advocating against a data center being built in their community, I think a lot of people have guilt about the fact that, you know, well, if we resist this data center here, it's just gonna go down the road and it's just gonna impact another community. And they're just gonna have the same repercussions that we're facing: the same water shortages, the same electricity problems, the same smell, the same noise pollution. And I think something that, that I'm following, and what comes to mind, is the work of Global Action Plan in the UK and Foxglove, which are trying to coordinate different communities to fight and advocate against data centers and, and AI expansion. And I think that's something that will come up and be more, more relevant: the way in which these communities can coordinate so that, you know, data centers can't just move to the next town and the next town; so that they are being told that the way they need to fundamentally do things, uh, has to be different.
Pablo: Well, just trying to add, like, something a bit different. Like, I think the Chilean case, like, all the resistance against these huge [00:17:00] projects, um, is probably a unique case in the world. Chile, I mean, the case of Cerrillos that Paz mentioned, which is the craziest story, I always say, is that, that you could really make a, a great movie from it, or a, a really great, uh, series or a show, um, because, yeah, it was basically, like, a group of normal people, neighbors, that stood up, and like, in a matter of a few weeks, they became, like, experts in a, a really, uh, technical and difficult-to-understand industry, and they were able to basically challenge, uh, a giant like Google, right? Also connecting with this idea of how do we put regulators into this industry. I think that, yeah, there is this idea of, I mean, this is not the only AI possible. I mean, this is the AI that a group of elites in Silicon Valley are, are deciding it to be. So there are, like, alternatives to build, like, more sustainable, more fair systems towards the [00:18:00] planet and the people that live here. So yeah, I think that work from the regulatory side probably has to come, like, regaining the power that, like, democratic institutions used to have sometime towards these, these huge, uh, private actors, and, and also thinking of alternatives, like being creative, right? This is, yeah, this is not inevitable. I mean, there are, like, alternatives to do things in a different way.
Alix: Wow. If you can hear that, it's the sounds of MozFest, uh, a mic test before we're actually... we're the early birds. Oh yeah. What's our, [00:19:00] what's our first session? Yeah. Okay. I don't know. Um, so I'm here with Hana Me... am I... Memon? Yeah. Hana Memon. Hana Memon. Yeah. Okay. From Generation... Z? Gen Z for Change. Gen Z for Change.
I feel like, I mean, there's like a couple of different things I wanted to talk to you about. Let's set the scene. Yeah. You're a senior at Columbia University. October 7th happens. Mm-hmm. In the days following, there's a wave of Islamophobia. Mm-hmm. That wasn't necessarily not present before, before October 7th, but yeah.
Yeah.
Hana: Um, really interesting space, because the Muslim Students Association at Columbia, about 10 years prior, was being surveilled by the NYPD in a post-9/11, very post-9/11 world, and that surveillance meant a lot of distrust within the Muslim spaces at Columbia. And so being in those sorts of spaces and in a leadership position was really interesting. And then you have October 7th happening, really uncovering the Islamophobia that we had already witnessed before. Um, that was the [00:20:00] rest of the iceberg. Exactly. Yeah. And then, I think, getting into this work when, you know, I was graduating school after witnessing all my friends being arrested, and trying to figure out what my next step in life would look like, really came down to recentering, like, the values that I understood as a student organizer.
And so finding an organization like Gen Z for Change was really special, 'cause I was able to combine both the computer science degree and the technical skill that I had for the causes, the civic and social issues, I was most passionate about. It was really important. Gen Z for Change had really been on my radar, particularly because they had created a ceasefire email tool, and as described by Politico, it was like the most moving, almost, like, email tool. It was essentially a shortcut on your iPhone that people would set up to send automatic, like, emails to your congressional officials, [00:21:00] so you could schedule it. So every time you opened TikTok, it would send that sort of email. And I think we sent, like,
Alix: Almost like 5 million emails. Like really, uh, leveraging doomscrolling. Yeah. Yeah.
Hana: Like, you know, why not combine it with some level of, you know, your civic duty and engagement.
Alix: Okay. So what's the organization doing now? Like, what's the overall mission?
Hana: So Gen Z for Change works very closely with on-the-ground organizers and advocates, lawmakers, policymakers, and technologists like me and others on our team, as well as our creator coalition of over 500 members, to bring about narrative change, you know, around protecting our democracy. I know that sounds very vague, but we are trying to address the issues facing our generation the most. But we also recognize our generation is almost, like, split into two groups. We have what we consider, like, those who graduated high school before COVID and after COVID. And seeing that [00:22:00] sort of, like, political shift and divide among our generation has been really interesting.
So trying to identify and, you know, advocate for those particular issues. Of course, you know, we already talked about Palestine being a really big cause and movement that we're very excited to support, and seeing the Black Lives Matter movement, uh, particularly online. You know, given 2020's nature of being constantly online, really engaged young people, that's how Gen Z for Change actually really started, in a way, around TikTok for Biden. That's how it really started. It was a group of content creators and online personalities coming together to advocate against another Trump era in 2020. And then, obviously, our organization has grown since then. But I think it was really clear we had so much connection and commitment to each other. And in, in the older generation, I think you also saw a win. We saw a win, and I think we really ran with it and really loved that momentum. [00:23:00] And then we also noticed, uh, you know, in that era, of course, we were holding our representatives and our elected officials accountable, but we also saw, like, the far-right media, and we have, like, one-fortieth of the budget maybe, really taking over TikTok and all other media spaces. And so that sort of shift is really relevant, or that shift is very visible, in this Trump era. Yeah. Super interesting.
Alix: How does it feel, watching... and I actually, I'm not, like, I'm sorry, I'm processing this: an older generation of Gen Z and a younger generation of Gen Z not having experienced a win.
Hana: And especially, like, considering Gen Z's grappling with the climate crisis.
Yeah. With housing, with housing issues. Of course everyone is, but, you know, our generation is deeply, already, facing the impacts of this. And, you know, as you know, we've [00:24:00] barely graduated college, you know, and are experiencing, like, the highest student loan rates and amounts ever, right? So our reality has never really been multiple wins. It's always been a barely-win, or a win we didn't even really want. Yeah. Um, and so that's been really interesting to look at. But once that momentum was built, like, online during COVID, it was really exciting to keep that momentum going for, you know, other elected officials we were passionate about and other policies we wanted to see come into effect. And then very quickly after, when other sorts of media spaces came about, and, and right-wing media, there's a, a little bit of a shift going on there as well.
Alix: How are you feeling about corporate consolidation around technical infrastructure and this like new private sector, explicit commitment to surveillance and exploitation?
Hana: I think, yeah, surveillance has been really interesting from my personal perspective, not sort of leaning on the greater generational perspective just yet. Growing up around tech always, it's very clear that, like, a lot of data privacy and collection practices that are not very good in the US [00:25:00] were just such, like, norms and standards for us that right now it's really clear we're trying to undo a lot of that normalization. Um, and especially when you bring big AI into the conversation, it's very clear, like, there needs to be so much unlearning and so much reframing and, mm-hmm, narrative building.
Alix: Um, so it's a surveillance state now that feels much more driven by industry rather than the nation state. And I feel like, for me at least, like, in my younger years of working in advocacy around these technologies, it was always focused on government. Mm-hmm. And then there was this moment where I was like, oh, wait a second. Actually, I think with, like, the Snowden revelations, there was this moment where it was like, oh, it's, it's, it's government and industry collaborating hand in hand.
Yeah. You know, the...
Hana: The government paying industry, yeah. And all these contracts that you're seeing, especially now with, like, Palantir and ICE.
Alix: But now it feels much more private sector driven, almost private sector led.
Hana: Yeah, of course. I, I think that's very true. I think one of the sessions I was at earlier today, they were like, would you rather [00:26:00] trust big tech or federal agencies?
And about 50% of respondents, or a little bit over 50%, said they would trust big tech over government. And that sort of shift is so scary. Is it an American audience? Yeah, exclusively an American audience.
Alix: Because I think there's this very American view of, or a skepticism of, government that is growing in other places.
Yeah. But I feel like we've oftentimes been indoctrinated to think industry is better at their thing, innovating faster, whereas government is,
Hana: You know, so bureaucratic, it's slow. They're reputationally responsive, mm-hmm, also in a way that government isn't. Yeah, exactly. And I think a lot of people forget that, like, the lack of, you know, regulations and the right protocols to actually, you know, preserve humanity are not embedded into these tech systems and companies. And so I think Gen Z is, like, unfortunately so desensitized to a lot of these poor data practices, just 'cause Instagram has had my data since I was maybe 10 years old, and I'm 23 now. Oh. So like that's a [00:27:00] lot of data and tracking and history. Like, I've seen the app through so many iterations. Oh my God. And they've seen me through so many iterations, and so it's a lot of unlearning for me as well.
Alix: How do you feel about, um, I don't know... We've been doing some work on EdTech, and it feels like at the same time there's, you know, this push for a smartphone-free childhood, alongside an incredible push to further digitize and sort of algorithmically intensify education environments from the school side, like the administration side. How do you feel about that whole chestnut?
Hana: I'm very skeptical of, like, AI deeply, deeply, deeply embedded into classrooms. I'm not fully sure about this report, but someone I talked to this past weekend was, like, hearing reports of a public classroom in Buffalo, New York potentially being, like, taken over, like, not having a single teacher and being taught exclusively by AI. Oh, I hate it. That, I hate it. Exactly. Things like that make me really skeptical, [00:28:00] especially given, like, OpenAI's control over, like, so many aspects of people's lives.
I'm just thinking about 10-year-old you on Instagram. Yeah. And data being collected. And I don't need to be talking to the Meta AI bots that have, like, random personalities based off of other people's data and personalities that they think they would like. Like, and the session I just came from was, like, yeah, why we don't need an LLM boyfriend. And very true. Like, I think it's really shifting. AI in education is, yeah, something to be very skeptical about, how it's embedded on campus. I got to slowly teach educators about, like, how ChatGPT is being used by students, and then, you know, as they were budding educators themselves, teach them, like, how they should be using AI. Like, not, you know, exclusively to grade. You know, everything needs to kind of have a human touch to it if you are going to do it, 'cause I think there are smart cases to be made about, like, there's a lot of teacher burnout. But when it comes to students and learning, I'm gonna take a [00:29:00] skeptical stand back and say, like, it definitely should be teacher led and driven and taught. Whereas you just don't want, like, AI slop in your brain at all times.
Alix: I know. It's so, I, yeah. Yeah. Um, I'm with you. Okay. So what's next for you?
Hana: Yeah, well, Gen Z for Change and I are working on some cool campaigns and projects. I think by the time this voice recording is out, we will be working with AECJ, which is, like, the Amazon employees who are advocating for climate justice, on getting creators to sign on to our pledge around better, more sustainable AI practices, if Amazon is to continuously invest in AI and not in other areas of their organization. I'm excited to keep learning. Um, that's a big part of what brings me to MozFest, and I'm excited to dive deeper into the ideas I picked up this, you know, this weekend. You know, a lot of learning [00:30:00] about public AI, community-driven AI benchmarks, um, and frameworks. Yeah, I'm, I'm excited to keep building disruptive tools and technologies.
Alix: And I was gonna say, I hear a technologist in you.
Hana: Yeah. And, um, my favorite phrase that we have at Gen Z for Change is, like, the work we do is good troublemaking, and we love coding and causing a ruckus.
And so I think that's what I'm gonna keep doing.
Alix: Awesome. Awesome. Okay. Well, thank you. This was fantastic. So much fun. Um, when is your session, when is your big talk? Have you already done it?
Hana: Well, we are not doing a big talk this time, but we definitely want to next year.
Alix: Okay. Okay. Cool. Alright. Well, thank you so much.
Hana: Thank you so much for having me.
Alix: We're in this, like, wild space that is, best I can describe it, I would say an 18th-century gazebo, but it's, like, pretend 18th [00:31:00] century, 'cause I've only recently learned that this is all new, from like the 1940s or something. Um, which makes it all the weirder. Um, but we're kind of also elevated above the rest of the conference. So it was also really funny to be interviewing Audrey Tang while, uh, Ruha Benjamin was on the stage, um, getting standing ovations, um, which is really cool to see, and I can only imagine she was absolutely brilliant. So we're halfway through day one and have already done a panel and two interviews, um, and are getting prepped for lunch. And then two more panels this afternoon: one on scale and the kind of economics of scale, with Catherine Bracy and Jess Remington, and one on AI's integration into military technologies. Light subjects to end our day.
I'm here with Malik Afegbua. I missed your session yesterday. Um, but do you wanna say a little bit, sort of set the scene for us? When you organize these, um, shows, what is it, what is it like? What are we, [00:32:00] what are we seeing if we're in the audience?
Malik: Well, I'm sad you missed it, yeah, first of all. And we are trying to create, like, a spectacle, you know, show you something that should exist, you know, something that we're not used to, something that should be living, that would make you ask questions on why, you know, why do we have this perception on aging, on fashion, on confidence and inclusion and all of that, and the, the perception of beauty as well. You know, it'll make you rethink what beauty really means, you know, on a fashion runway, in life, and all of that.
So that's kind of introducing you to that world of altered perceptions. Yes.
Alix: So I'm, I'm seeing, um, people that are older than I might expect. Yes. Um, coming out on stage in sort of expressions of confidence, demonstrating clothing. Is it mostly about the clothes, or sort of about who's wearing them? Yeah.
Malik: It is, it is more of a celebration of who they are. Not even the, not even the fashion. The fashion is just an addition to it. And the fashion is an [00:33:00] expression of their confidence, how they carry that fashion, what they're supposed to look like, as opposed to just gloomy and what the world thinks age, ageism, or ageing means.
Alix: Being old is sad, you know?
Malik: You know, and we're trying to change that, because that's not true. Like, and that, that affects the mindset of even the people that are getting older, thinking, yeah, I'm getting older now, this is what I'm gonna start doing, or this is where I'll be. But no, it's not true. You can live your life however you feel like.
Alix: I also feel like most people, as they age, stay very young inside, and it's the process of how society reacts to your physical changes that probably changes how you think about your own conception of age.
Malik: Exactly. And you will think about that, because if you, on, on a normal day, if you find an older person doing something that you normally don't see them doing, you'll be looking at them like, what is he doing here?
What's going on? Is he okay? Yeah. You know, do you need help? Do you
Alix: need help? Yeah,
Malik: yeah. You get me. And that is just the mindset. But I think like if we train our minds, [00:34:00] especially that generation as well, train our minds to say, I could do anything regardless of what age I am. That would make you even wanna exercise more.
You know, eat more healthy, get depressed less, join community even more, because now you know that you can, rather than just staying indoors or just having a routine, you know? So we're trying to change all of that mindset and those perceptions. You're not dying out, you're evolving. It's more like a wisdom journey that we are trying to tap into as well.
Alix: So it's an accumulation of experiences, not a, a decay.
Malik: No, not a decay. Exactly. Yeah. Yes.
Alix: So how did you get into this?
Malik: I always experiment with different mediums from arts, to film, to fashion, everything. So I always find either a story I'm trying to tell or a problem I'm trying to solve. Then I think about what's the best tool to solve that problem or tell that story.
And at this point I was working professionally in my commercial, um, spaces, creating adverts, TV commercials, very professional stuff. But I was also dealing with grief, and grief in the sense that, uh, I was very close to my mom. We are six, [00:35:00] actually, and we are all close, like, the same way, with my mother. I used to talk to my mom, like, every other day, tell her anything and everything. Like, literally, she was like my bestie, and it was just very sudden that I just couldn't speak to her anymore. She had, like, a double, um, heart attack from complications of the COVID vaccine. So that was very... like, I couldn't deal with it, because she, she passed away for about five minutes. She got revived, but she had lost oxygen to her brain, so that was already the problem. And she was on life support for about eight months. So I couldn't communicate, I couldn't deal with her. I was just trying my best to carry on with life. As you would say, I was depressed as well, but I felt like she would always advise me to keep being stronger, to keep doing what I'm doing regardless of what I'm feeling, 'cause I'm here for a reason. And I felt like I needed to honor her memory, or honor her life. So I wanted to just create things that would honor her, her demographic, not just her, but that entire society. That's why I started to create older people in spaces you normally don't [00:36:00] see them in: happy spaces, euphoria, you know, looking good, elegant, like, stuff like that. And I woke up one morning and I was thinking about a fashion show for them, and I felt like, oh yeah, there'll be something. I went online, did some, did my research. There was nothing. I was looking for references and what to create and how to create it, but there was literally nothing, and I started to do that. I was faced with some bias and misrepresentation, but I trained my datasets. Then, yeah, the rest was history.
Alix: And when you first started it was a digital project?
Malik: It was a digital project, yes.
Alix: Yeah. So how did you... so you trained some models and then started generating, yes, video? Yeah. Yes.
Malik: Images came first. It was, it was pictures on runway stages in different parts of the world, on the beach, you know, in, in, in Magoo, just different spaces, different places, looking fly and elegant. And that evolved into many other things.
That was that. Now we have real fashion shows we've done in Amsterdam, we had one in Milan, and we've done two in Lagos, Nigeria. We had one last night here in Barcelona as well, so, yes.
Alix: And [00:37:00] what, like, what, what made you decide that it was worth trying to bring into physical reality?
Malik: It was the impact, the, the impact that the digital one had. I had tons and tons, I mean, hundreds of thousands of messages from a lot of people all over the world. Uh, I had people in Brazil drawing graffiti on, on buildings, on the floor, of my work. That was an AI project, you know, and this was a time when people were saying no to AI, but when it came to my project, it was accepted, like, way more than every other AI project back then, because of the story, the message, the narrative. And it was viral, like, literally in every country. And I'm like, that is impact. That is, you know, that's powerful. So I'm like...
Alix: You struck a nerve.
Malik: A nerve, you know? Yeah. And I felt like, why don't we make this real? I mean, it doesn't have to stay in a digital space. It could be digital or physical, fully physical.
Let's create this clothing. Let's produce everything I could co-create with AI in a digital space, to make that a reality as well. So it goes beyond the fashion show. I even trained fashion designers on how to [00:38:00] use technology to scale or enhance their skills, based on this project as well.
Created a line, created a process. It's been crazy ever since.
Alix: So what's next?
Malik: More shows, a hundred percent. Nice. Um, I'm gonna be stocking the, the, the fashion that we showcased last night. That was the first time we were showing it. Oh, cool. You know, the next, uh, show we have is in Casablanca, but beyond that I'm launching a toy collection, a new toy collection, um, about the elders as well, and a bunch of other things, uh, next year.
'cause I'm a film director. I'm a filmmaker. I've got a few things lined up for the second quarter of next year as well in that space.
Alix: Cool. Awesome. Okay. Well, I think that's good. This is great. Um, thank you for everything you're doing. I'm so sorry I missed the show in person. I will not be in Casablanca, unfortunately.
Oh. But I will try and catch you on tour. Um, and we'll be on the lookout for this toy collection. Amazing. Yeah. Awesome. Okay, well thank you so much.
Malik: Thank you for having me. Cheers.
Helen: Hi, I am Helen. I'm a board member of Mozilla, and, to my shame, I missed the first [00:39:00] MozFest, in Barcelona. But the reason was really good: I was very, very, very pregnant, and so I couldn't fly. But other than that, I've been to every single MozFest that has ever happened.
Nabiha: I know. I wanna hear the highlights, lowlights, drama. What is it?
Alix: It was really nice to sit down with Helen Turvey and Nabiha Syed today to hear their perspectives, since Helen has been to every single MozFest that there has been, except the very first one, which she seems really upset about. Uh, and Nabiha, this is her first one. So even though she's running the show now, she'd actually never been to one.
Helen: Alix said earlier that somebody had said to her that this was the most inclusive MozFest they've ever been to, which is amazing, and they've felt included. And the drama that I would point to was a consistent, low level of various people coming and feeling like they didn't belong. So, um, I have, yeah, [00:40:00] various friends who've come and, and tried to speak... no, no, no: in previous, in previous years, friends who have come and tried to express their truth or their ideas or their projects and their perspectives. And it was definitely, um, not welcome, sometimes because it was a bit too political, sometimes because they were a different color or a different gender from the norm that was, um, pervasive in those areas. And it was definitely previously more technology driven. So that has been... and it's not that it's night and day from, you know, from previous ones to today, but certainly from the early ones: it was, it was about the technology versus how the technology impacts the human beings. Yeah. So that is beautiful.
Nabiha: I think bringing back in, or closely integrating, like, the maker-hacker DNA is something I would love to do for next year. The other dream I have for next year is, I know in the past there were youth tracks, and I really... like, a dream would be to talk to, like, Lego, [00:41:00] Roblox, the magnet tile people, who invented something that's both wonderful and also something I face-plant on daily, to have a youth zone where people can be building and making things together.
Helen: There were a lot of physical areas where you could play. There would be cardboard boxes, there would be Lego, there would be, I love that, people making, people making airplanes that would fly down the steps, whatever it was. It was very, very physical, and it felt like, intergenerationally, you could get involved. My daughter is here right now, and one of the things that she said, which was super interesting, was the opening keynote was so accessible and fantastic. And then when everybody discussed it afterwards, it became inaccessible to her, because people were trying to be as smart as the keynote itself and somehow removed the power of the simplicity of it. Ooh. And so returning to the simple and the playful and the physically tangible, like the haptics...
Nabiha: The tangibility, it's so important. And so we had a little [00:42:00] taste of it with, like, the single Lego table, but having it interspersed, interspersed, like, not just in one tent. I, I like that.
Helen: I hear that. And having, you know, a table with a puzzle half done on it, and, you know, those types of elements of play. I think bringing those into it would be great.
Nabiha: I would never make it to any of my talks. If I saw a half-done puzzle, I'd be like, puzzle must need completion. Like, I would be...
Helen: This is, this is why we have you in your role.
Nabiha: I studied, forensically, the book on how to MozFest that came out at the ten-year anniversary. Like, my version of it, I can send you a picture, is actually embarrassing, because it is tabbed. It is highlighted. It is, like, all the... My husband thought it was one of my law school textbooks, like, lying around, 'cause he was with me at that part of my journey, and he's like, why are your textbooks around? And I was like, it's for work. Like, don't worry. But the thing that it kept saying in the how-to-MozFest guide was that there is a magic you [00:43:00] can't describe. And I was like, hmm. Cool. Correct. Not described. And everyone on the team was like, there's just a magic you can't describe. And getting here and seeing the serendipity of, like, the corridor, mm-hmm, conversations... The metric I always use is, like, I'll do a quick scan. If I see two people with their phones angled towards each other, clearly exchanging information, if I see, like, three to five of them in my line of sight in the five minutes after a talk, I'm like, check, good. Yeah. Like, something is happening. And so I'm constantly creeping around just being like...
Helen: It does not surprise me that you highlighted and tabbed the manual. Um, A, because it's you, and B, because it's evident. It's evident in... it is wild that you've never been to another MozFest, and you and your team created this. How?
Nabiha: No, the team is extraordinary. Like, they live the spirit of it, and I think that is part of the magic. Like, they believe in what it is, and then you see that in the care that's around.
Helen: What would you change about this MozFest?
Nabiha: Well, the youth space. The other thing I'd [00:44:00] really like to do is, we've thought of Wranglers in the past as curators. Yep. But what if Wranglers just really wrangled in real time? Like, you train them as facilitators, and there's other spaces to be like, hi, that conversation happened, that conversation happened.
We're gonna keep it going and, like, hack that problem over here, and not just facilitate in a let's-have-a-talk way, but like, hey, get your laptop out, let's fucking build this, right? That kind of magic is really special, and we had a taste of it in the funder pre-day. A conversation happened on stage where people were talking about different levels of risk that different funders are willing to take, and how it could be really useful to say, oh, you're an early-stage, seed-equivalent funder: you are comfortable with execution risk, or, or, like, playing with it, you're not, but you're willing to be follow-on funding. Political risk is a conversation that is topical for funders in the United States, but we don't know who's willing to take which [00:45:00] fights in what way. And so the conversation happened on stage, and there was a, like, yeah, it'd be great if we knew. And I was like, okay, yeah, it would be great if we knew. There is a whiteboard; we should erase it afterwards, if it says political risk on it, like, light it on fire, but have the conversation. And it was a real-time breakout session. And that used to happen at the old MozFests all the time, all the time. I would love for next year and the years to come to have Wranglers as trained facilitators to do that kind of magic.
Helen: And so, what would be amazing, 'cause that is a lot of work, right? That is, that is not just somebody being a good facilitator, but also thinking about the outcomes we want and the assets that we want to create within that. And so the idea of what would be those questions, like political funding, and how would they, how could we manifest them physically, artistically, creatively, and layer them with community in an open way, would be incredible. It'd be amazing. Imagine an art installation that was built over the three days where you could actually see people's [00:46:00] energy and heat maps and understanding. I love, I love. And I'm also wondering about... when I was in, um, a pa-, a panel that Alix was hosting yesterday, there were, um, various different projects, um, that were on stage. There was the methane mapper. I instantly had: oh goodness, they should really speak to X, and this should happen. Yeah. And so, where are the places... and that's not just me having to remember to go and introduce these people, but are there ways in which we could facilitate people being able to do that really easily, or even have shout-outs and ahas from that, that we share more broadly with the people who are here?
Nabiha: Oh, I love that. 'Cause one of the community health metrics I would love to look at over time is how many people reached out to someone for advice in the community? Like, nothing to do with Mozilla at all. Like, just who reached out to who, who co-invested with who, who, who co-authored something with who, who was like, I'm in trouble, I'm gonna call you... if you have [00:47:00] that...
Helen: Who cared? Yeah. Where's the care in the community? Right. That's, yeah.
Nabiha: And I, I'm thinking both about how do you architect a space to do that. Mm-hmm. And that's like the first thing that we need to do with care. But then also how do you tend to that and watch it and nurture it over time?
And so I'm trying to think about the metrics we would keep an eye on to do it.
Helen: One of the things that I've loved about this MozFest is the investment in today and tomorrow, and the next two MozFests. That's something that Ravensbourne had, and it really became something that was owned, yeah, by that local community. Yeah, and I think it is exceptional that you are doing this here. I'm loving that piece.
Nabiha: Thanks. Thanks. We're having a good time.
Helen: A good time.
Alix: All right, so we are leaving the MozFest venue in Barcelona for the very last time, maybe. They've just said that they're signing up for three more years, I think, of MozFest in Barcelona, although I don't know if that means it's gonna be in this bizarre Spanish attempt at a village that they've set up here, um, which several people have referred, referred to as Disneyland, the Empire Edition. And we're headed back to the hotel after an absolutely bananas three days. Um, I don't think we caught any sessions other than the ones we were involved in, which is always really frustrating. But we'll, we'll definitely follow up on the livestream, 'cause there were several that I really wish we'd been able to make.
'cause there were several that I really wish that we'd been able to make. Um, and I think overall past Moz fests have felt a lot, um, drier, more technical, more driven by the people actively participating in the technical infrastructure that Mozilla is building. And this felt much more. Global, much more, um, [00:49:00] political, much more creative.
Um, also kind of more choose your own adventure because the space was structured in a way that was partly outside. There were all these pockets and places where you could kind of form the group of people that you wanted to form. I really appreciated Aku H's question at the a MA. Asking how Mozilla avoids letting industry set the agenda for MozFest and for Mozilla.
Um, and we actually talked to her after, and I think the reason that she asked that question is because it's kind of happening everywhere. So just because the AI industry is saying, hey, everyone should be talking about AI, it's like, well, of course the AI industry wants that. And if Mozilla sort of falls into that trap of feeling like if they don't engage in the future of AI systems, they're somehow missing the boat on the future of technology... just making sure that the organization continues to question that assumption and that inevitability narrative around AI systems feels really important. Um, and I think MozFest can find a way to balance between [00:50:00] enabling people to explore and experiment while also engaging in the, uh, broader societal-level questions about technology. So you can't ignore AI, I don't think. Um, but at the same time, uh, it needs to be thoughtfully managed: how Mozilla creates spaces of exploration that include experimentation with AI without enabling and kind of being complicit in some of the worst of AI politics. Those are my reflections on the last few days. It's been wonderful, and also just, like, so nice to see friends and see Barcelona. Um, had a really wonderful time with the ladies from Foxglove, talking about the future of strategic litigation challenging big tech. As they're always doing, they're always up to so many different lawsuits that hopefully you'll hear about in the next few months.
Um, and I always love these kinds of events, when I get to see so many people I've worked with over the last, like, 15 years, and see how their careers and focus and strategies have evolved. It's just so cool to see how many different [00:51:00] versions of the same kinds of political questions you can see someone asking, and how, over time, the work that people do over years and years on these issues really compounds into something meaningful. So thank you to Mozilla for hosting us, and hopefully we'll be back next time. Thanks for listening. And also, for your listening pleasure, over the next few days we are going to share some interviews we did on site that I think are really different and interesting, and I hope that you enjoy getting a little bit more insight from some of the amazing people that were convened at MozFest. You'll get to hear from Audrey Tang, who's been on the show before, but who is just so fun to talk to, largely because I think she looks at things a lot more optimistically than I do, and sometimes we need a little dose of that. And then Luisa Machado, who is an organizer looking at how we can improve the intergenerational conversations we're having about AI, and then our old pal Abeba Birhane, in a conversation about what happened at the [00:52:00] AI for Good conference that was hosted a few months ago.
You might have followed this, but basically, Abeba gave a barnstormer of a talk, and there was an attempt, um, to censor it. You can learn more about that in our interview, and about why she gave the uncensored talk at MozFest. We will also be hearing from Ben Collins, who's the CEO of The Onion, one of my favorite publications, which is coming out swinging, um, with a print edition, and he's got some interesting things to say about the future of media and also how he sees humor playing an important role in our current political moment. So next week we will be releasing these each day. So there'll be a daily drop of one of these pretty short, but I think really good, interviews with people that were at MozFest. And to get you in the mood for these conversations, um, here's a little preview.
Ben: Most of my job is hearing, in the background, people just, like, cackling all the time while I, like, file expense reports.
Luisa: The tax systems that we're dealing with, they are the way they are because we are living a crisis of imagination.
Audrey: You spend lots of money and lots of [00:53:00] effort and your success is literally nothing happens.
Abeba: So yesterday, uh, I gave that talk, the uncensored version, in a much more relaxed, uh, friendly, supportive environment, where I don't have to apologize, or where, where I don't have to tiptoe around Israel's contracts with companies like Google and Microsoft.
Alix: Alright, thanks to Sarah Myles and Georgia Iacovou for producing this episode and the interviews to come, and to Mozilla Foundation for being such great hosts. We had a really great time at MozFest and appreciate how their team were just great partners in helping us make the thing you just heard and will hear next week.
So with that, see you soon.