Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.
Alix: [00:00:00] Hello. Welcome to Computer Says Maybe. This is your host, Alix Dunn, and I'm joined
Georgia: by Georgia Iacovou. I am a producer on Computer Says Maybe. Hello
Prathm: and Prathm Juneja. I'm the research strategist at The Maybe.
Alix: And we are gonna do what we did last year, which was so fun that we decided to do it again. We make all these shows every week.
I think all of us have like different reasons we like them, so we find it like a fun retrospective, both to celebrate the fact that we made a weekly podcast for an entire year this year, which is fantastic and not easy. Uh, so it's kind of like a victory lap, but also reminding ourselves of all the stuff we learned and all the people that we think are cool.
Georgia: Each of us picked two of our favorite episodes, and we're gonna play a [00:01:00] clip from each of those, and whoever picked it is gonna introduce it. Prathm, this was you. So Prathm, why this one?
Prathm: Yeah, I chose this clip, or Georgia actually chose this clip 'cause I requested to take five clips and Georgia was like, I'm choosing one clip please.
Um, but I chose this clip from the interview with Marietje Schaake because I think the entire episode is kind of about the biggest story of the year, at least in our circles (there were some really big stories outside of the technology policy circles), which is the relationship between governments and technology companies.
Let me do some SEO real quick. This clip and the entire episode are about AI, government digitalization, AI regulation, China, geopolitics, abundance, Ezra Klein, procedure, DOGE, the AI Act, Big Tech, OpenAI, Europe, innovation, Elon Musk. This episode is 37 minutes talking about all of the things that everyone is talking about, because we've spent the entire year talking about the narrative of AI and national security, using technology to improve government, privatization, [00:02:00] democracy, and how we should regulate. And I think the clip emphasizes one of those points, but the interview is about all of them, which is: we keep saying we wanna modernize government, but what we actually mean in practice, almost all the time, is that we're going to hand over the reins of our most sacred processes to private capital.
Let's play the clip.
Marietje: What I wanted to do with The Tech Coup is to show how the power grab, the coup by tech companies, is systemic, and it's often not very visible. It's not necessarily the technology that people use every day, but it's everywhere. Meaning that there has been a trend of decades of outsourcing of critical functions by governments.
This was obviously cheered on by tech companies and the US government itself. The idea being: if you don't digitize, you're stuck in the Middle Ages. You are not really willing to provide the best services to your citizens. You don't wanna be efficient. You don't want to be cost efficient. Basically, you cannot afford not to digitize.
And I think [00:03:00] governments throughout the democratic world, including in Europe, have proceeded to do this, to just keep embracing more and more digitization. But digitization essentially meant privatization, because it's not investing in your own government capabilities, hiring talent, advancing the public interest with regard to tech.
With outsourcing, there's a loss of expertise. With outsourcing, there's more proprietary data and information, and so it's harder to come by expertise on the kinds of services that are now operated by companies, but in the name of the government, right? And then with the next wave of a new technology, or with the need to change something about the system, the knowledge is no longer in-house.
And so guess what? It becomes easier to outsource, and then that just perpetuates and perpetuates. And so you basically end up transferring enormous amounts of critical functions, critical insights, and agency to private actors that have [00:04:00] fundamentally different incentives (to make profits, to please shareholders, to grow market size, to be faster than the competitor), fundamentally different objectives from what a government's responsibility really is.
Prathm: So my day job, the work that I mostly do, is advising governments about AI, digitalization, and democracy. Like, this is the subject that I'm thinking about all the time, because I'll walk into one of those rooms and everyone's like, how can we use AI to fix our elections? And what I actually want us to be talking about is two-factor authentication and basic digitalization and policy, and running a fair and functioning democracy.
Elections are fundamentally about trust. Most of government is fundamentally about trust, and you lose that trust as soon as you start outsourcing critical paths to technology companies, especially in AI, when we want AI to make decisions about benefits services or about taxes or about elections when we can't even explain how the models work.
All of this is driven by these vendors, technology [00:05:00] companies, and this like overarching DPI-esque narrative of: we need to just innovate, innovate, innovate, run government like a tech company, as people have been saying since 2008. And by vendors who just break shit, because they make more money breaking shit.
And I also think this is the story of all of the abundance stuff and the procedure stuff and the capacity stuff, because it's like, yes, digitalization is important and carries many opportunities, but if we want to do that, we have to do it well and actually build capacity within government that is rights-respecting and focused on actual policy outcomes, rather than just making the problem worse by letting a bunch of private actors take control of our government services.
Alix: Yeah. I also like that it really lays out the incentive structures and how they're all broken. 'Cause I think sometimes we talk about symptoms within this space, like, oh, Palantir and the NHS just signed a contract. And it's like, oh, but how did that come to pass? And I think there's something in her analysis that's just so deeply helpful, because it explains in really plain language what's happening. Like, why are policy makers [00:06:00] incentivized to do this when in fact they shouldn't be? Or why are companies increasingly good at getting these contracts? Like, it feels like it's getting easier over time, and I think she describes that really effectively.
There are a lot of people that have tried not to engage in the political analysis that underpins why you would want a government to be staffed with people that know what's going on. So you see a lot of like civic tech people being like, oh, we should hire 10,000 people into government that know and understand technology, and like understand it from a more intersectional or inclusive or human-centered perspective. There's a whole field of people that I've worked a little bit with, and what's really frustrating is that most of the people that work in that space do not engage in the underlying politics of why that proposition is a political vision. Like, they just don't. They're like, well, obviously you wanna have more people that understand technology in government. It's like, but that's not obvious. 'Cause actually the current trajectory is the opposite, for very good political reasons that have to do with money, they have to do with power building, they have to do with this fusion on [00:07:00] the right that's basically like: we don't actually want a functioning state, so why would we invest in it? And in fact we can just sell it, or give it to someone that runs a company, because we think companies should own the world and we don't really like democracy very much.
And I just feel like her analysis both underpins why that work is so important, but also, I think, lays bare that there are not that many people doing that work who actually engage with the underlying politics of how we got to where we are.

Prathm: The other link here that I think gets missed so often is: we spend so much time talking about how we are going to regulate the technology companies, the AI Act and GDPR and all of this.
But what happens when you are trying to regulate the company that owns your government infrastructure? Like you don't do it because they own you. And so we're always like, oh, these, these companies have major lobbying power, but it's not just their money spent in politics, it's the fact that they run the systems that are running government too.
And that's a completely different power dynamic from just them being political actors. They're also government actors at this point. [00:08:00]
Georgia: Yeah, I think it's really odd that like the people in power are being led by the other people in power, like the private actors and both sides equally have just no vision for a good future.
Their visions are kind of around just everyone staying indoors, getting like drone-delivered coffee and being really uncreative all the time, because AI is gonna do everything for you. And it's just like, that's not a real vision. Give me something I can work with here. I like how she paints it as like a vicious cycle.
Like, if you keep outsourcing, you're going to have to keep outsourcing, because you're gonna remove the capacity, the confidence, the internal capacity within government to be able to move things forward when things need to change, when you need to upgrade your systems. All the boring stuff just gets outsourced to other people who make it look like the boring stuff is sexy, when it is just supposed to be boring.
Tawana Petty. This was the very first episode of the year, so it's all the way back from January. The fact that I've been thinking about it ever since then, I think, speaks volumes. She framed safety and surveillance as things that get conflated all the time. We have lots of big, high-level [00:09:00] conversations on this podcast, which I think there's lots of space for, and those are really good.
But it was also really excellent to hear someone who's talking about this kind of stuff on a somewhat smaller scale, because I think most of her work is based in Detroit.

Alix: Also, before we hear the clip, um, I saw Tawana a lot at MozFest. We were staying in the same hotel, and she and a friend of hers, every night at like midnight, I'd get back to the hotel and be like, I'm going right to bed.
And I'd walk into the hotel, and then Tawana and her friend would be eating carrot cake and having cocktails, like literally every night of MozFest. That's awesome. And so I would stay up way too late, um, talking to Tawana. When she first introduced me to this friend of hers, she was like, oh yeah, this is Alix, she does this podcast.
And, 'cause you always feel like a douche when someone introduces you as someone who has a podcast. That's fine. I will never get over that. It's a really horrible feeling. Yeah. Then she paused for a second and she was like, but it's more than a podcast. It's like a whole thing. It's a whole thing, and she was very sweet about it. Anyway, shout out to Tawana for saying it that way, which was just great. Yeah.
Tawana: When I look at the way [00:10:00] that surveillance has been conflated with safety: most times the surveilled communities are not connected, and the safe communities are the ones where, hey, I notice that Mr. Watson didn't pull his garbage can back for seven days, something must be wrong. I'm not out here telling people, don't get an alarm system on your house. But what I am saying is that alarm system on your house doesn't mean much if you are not in relationship with the people who live on your street. So if something does happen to you, you're relying on an outside force to come in and potentially respond to a situation that has already happened, and it's probably too late, right? It's a reactionary response to a potential harm, whereas your neighbor, who's looking out for you, is probably a greater prevention than living in Fort Knox.
It's a very [00:11:00] individualized, capitalistic reality that says that if I just put myself in a big enough bubble, then all the harms that could potentially come to me will bounce up against it. It's just not factual. There's so much research out there that says that a street is much safer if it has streetlights than if it has surveillance cameras.
And so there are a lot of communities, even within Detroit, where the streets are just not well lit. In addition to that, those same areas are in extreme poverty. So we know that if, as an example, American Rescue Plan funds went into resourcing those neighborhoods, ensuring that there were streetlights and that people weren't starving, that would probably be a much safer neighborhood. It's just this reactionary way of making one community [00:12:00] less humane than another community. When the obvious things are radical, that's when we know we're in trouble.

Georgia: Something I also wanted to say, while I was listening to this clip after Marietje's clip, was that the story seems so similar to me. As she says, you're putting yourself in this bubble, you've outsourced your safety to these surveillance companies. And I think it's the same way that government departments have made themselves impotent by outsourcing their functions to, again, surveillance companies, just a different shape, right? They kind of fit together in that way.
Again, surveillance company is just a different shape, right? They kind of like fit together in that way. I think personally we do the same thing where we're just like, oh well yeah, this company or this like system knows better than I how to like keep me safe or how to like keep my family safe, my home safe, keep my life ticking over whatever smart homes and all of that.
So I think, yeah, it's just not right. Communities that are surveilled in this very top-down way are just not connected together. And I thought that was a really good and nice thing to continuously reiterate in our brains.

Alix: I really like the outside force comment. I feel like that's also something that happens with technology or engineering logic, 'cause you're always pinging back to the [00:13:00] central server instead of going over to the node next to you. That's a very nerdy way to talk about community relationships breaking down, but you see it all the time with things like Nextdoor, where basically it's like, I'm gonna know what's happening in my community by looking at the global internet. That makes no sense. And I just think this outside/inside construct is really helpful, because it makes it clear that trying to lean on an outside force to create some feeling is actually probably a fool's errand. I think it's just really powerful. Also, I just love the phrase that she uses a lot: that surveillance ain't safety.
It's also a really interesting example of where a really powerful narrative frame can be an incredibly effective type of advocacy. 'Cause when you hear surveillance ain't safety, your brain immediately goes to, oh, okay. A lot of people think being surveilled is a path to safety, but it's not. It really does break down a kind of mental [00:14:00] construct that I think is very common in the public consciousness.
Georgia: Absolutely. And something else she said, which is what we called the episode, actually: these communities feel like they're being watched rather than seen. So it's like this continuous, kind of invisible outside force watching you, ready to penalize you whenever you slip up or make a mistake, rather than being seen and having your needs understood and then met, which I think is an important dichotomy.
I think this overlaps with so many other conversations we've had on this podcast as well. Something that Cory Doctorow talks about a lot: essentially, when you have objects in your home, or just in your life, that are supplemented with software, so like smart objects, they then become more flexible, and they're connected to this central server, these continuous tendrils that go from your private spaces into this unknown world.
That is controlled by a company, and then they can change and adapt the objects that live in your house to meet their needs. Which I think is, yeah, a problem that we are constantly coming up against in multiple different ways.
Prathm: Samsung is putting ads on all their fridges, [00:15:00] and a bunch of grills broke down on Thanksgiving 'cause they needed to do software updates. Like, this is the world we live in right now.

Georgia: So stupid.
Prathm: I think that techno-solutionism is like a pervasive disease. It is an epidemic. It is contagious. It is spreading, in our communities, in our surveillance, in our governments, and we need to put a stop to it.
Georgia: This is from Carly Kind's episode. Alix, you picked this one. Do you wanna tell us more?

Alix: My mind was really drawn to a lot of our January, February episodes. I feel like we started hitting our stride at the beginning of the year. There were also these moments where I felt like I could have these kind of mind-bending, interesting conversations with people that I've known for a long time and have huge respect for, but didn't necessarily ever take the time to talk to them about their fields, because when you work across so many fields, you use words in this flat, shorthand kind of way.
And privacy is one of those words that I use and think about, but not [00:16:00] very deeply anymore, 'cause I've had to shortcut it in my mind, 'cause it's not the field that I'm most interested in. But Carly has worked on privacy for 20 years now. She's a human rights lawyer. She's a close friend.
We like talk about her specialism on policy stuff. She used to run the Ada Lovelace Institute, then she went on to be the Australian Privacy Commissioner. We got a chance to catch up at RightsCon, which was really nice, to just hang out with a friend in Taiwan, but also to sit down and go deep, um, on how it's going as a privacy commissioner.
It's her first political post, which is like a whole other wild career trajectory. I'm excited to talk about Carly's episode, not least because like she talks about her political analysis of privacy overall, but also when we start getting really frustrated about the role of states, it's nice to have examples where actually people like us are now in government and trying to do things that maybe changes the trajectory of some of this stuff.
So we're gonna hear from Carly about how she conceptualizes her role as a regulator and tries to change some of the behavior of these [00:17:00] companies.
Carly: One of my many roles is to give them tools that they can use, and sometimes those tools are like a threat of a big enforcement action, so that they can then go to their boss and say, the privacy regulator is looking into this practice, we really need to make sure that we're upping our game, et cetera.
We also operate a lot in the cybersecurity space, because in Australia at least (I'm sure this is true elsewhere) we're the de facto cyber regulator as well. And so a clear link is: loads of tech teams don't have the money they need to do all of the infrastructure upgrades and the cyber stuff that they wanna do.
Boards hear about it. We see board minutes all the time when we're doing investigations, where the head of IT or the CISO or whoever will come to the board meeting and be like, I need more money for this upgrade. And the board will be like, uh, that's low on our risk management framework, so we're not gonna invest in that right now.

Alix: We can't make money by being more secure.
Carly: Exactly. So there's a clear link. When we go to court, and we did this last year, we filed a case against a health insurer that was subject to a massive data breach: 9 million Australians' health insurance information was disclosed. We're currently seeking civil [00:18:00] penalties that, if the court gave them the maximum penalty, would be in the trillions against this company. And so the very fact of filing the case against Medibank was something that the CISOs can take back to their boards to say, this is why we need you to invest in cyber infrastructure at this company. So it's quite an interesting kind of allyship between those who are out there trying to promote privacy on the one hand and the regulator on the other.
Alix: I'm curious what you all thought when hearing that. Was it surprising to you that regulators think in this way, that they're this deeply strategic in terms of using this kind of enforcement action to build leverage for overall systems change?
Georgia: One thing I was gonna say was it was like mainly quite refreshing to hear.
It's just like, we're not here to catch people out. We actually are thinking about the threat of enforcement as one way in which we incentivize compliance, right? And I guess it's kind of obvious now that we've thought about this and heard her frame it in this way, but it's good to hear it laid out like that.
I think it's just important to understand and remember the point of a regulator is not to [00:19:00] like catch you out, catch you doing the wrong thing. It's to help you figure out how to do the right thing with the resources you have. It's fines as leverage, not punishment. Exactly. Yeah.
Prathm: I think it's fun when the carrot-and-stick approach is actually used and actually works, versus every other way we try and use it, which is just to basically criminalize people and their lives. I think it was really fun to see the level of strategy that's being put into something like this, and also the importance of the issue. Like, so many companies, government institutions, et cetera, are simply not secure, and it is never a thing that is incentivized in action. And so for Carly to be thinking so thoughtfully and carefully about how to actually make that happen, in the service of the public good, is amazing.
Alix: I feel like, too, getting into the mechanisms of a regulator is where Carly also gets the nerdiest. Um, which is really interesting, and it's where the detail starts to really matter. I found in this conversation that whenever we got into detail, I was like, oh, that's so cool that you're actually thinking strategically about this stuff and have a whole-system [00:20:00] view.

Georgia: I also really enjoyed when she was bringing in the point about reasonable expectations, thinking about that and generative AI. Like, when 20 years ago you were tagging all your friends in photos on Facebook, you didn't have a reasonable expectation that they would use that to then potentially train, like, computer vision, because a normal everyday person wouldn't be thinking about that. They would just be like, isn't this great? I can tag my best friend looking like shit in the club. Whatever. I dunno.
Alix: Yeah. And like how that expectation of privacy is changing over time and how that like affects, you know.
Prathm: I think it's fun that so much of data protection is super intuitive. Like the EU principles of like minimization, necessity and proportionality. You can explain that to someone in 30 seconds and they're like, yeah, shit. Like you shouldn't use my data unless you absolutely have to. And the thing that you're using it for is proportional to the amount of data that you're taking.
Like, all of this is so simple and intuitive, and it's awesome to see regulators using the power of simplicity in that way.
Georgia: Yeah. Alix, you picked this one. Do you wanna tell us [00:21:00] more?
Alix: Yeah, I mean, um, I picked this one for a couple reasons. One, 'cause of the conversation that I had with Rudy Fraser from Blacksky Algorithms. Most people that I know who know how to do things with technology, oftentimes that knowledge comes at a cost of their ability to understand politics and human systems. Um, that's mean, there are definitely people that can do both. But Rudy's ability to interweave an understanding of engineering and technical systems with an extremely clear political vision was both refreshingly radical and also totally reasonable in terms of how he's building an organization that can potentially financially sustain itself.
Also, there was just this beautiful co-mingling of skills and intelligence and vision. I was just really inspired at the end of the conversation. A lot of the episodes I leave with a sense of knowledge or anger or clarity, but I don't oftentimes leave thinking, if this person had more support, [00:22:00] what could they accomplish? And I left that conversation being like, how do we get Rudy more support doing what he's doing?
There were also some really interesting connections that he drew in the conversation that I still keep thinking about. Like, for example, in mutual aid situations: he tried to set up a bank account for a community, and he was like, but financial institutions only allow you to hold assets as an individual, and isn't that stupid? Technology allows you to pool resources in a way that financial systems don't. And it was just this very, like, how do you align the underlying bureaucratic systems that undergird our lives in a way that reflects the actual problems we have? He kind of had that engineering mindset of seeing the problem, being able to articulate it really clearly, and being like, and I created a structure that would allow us to subvert that and re-engineer something in a better way.
So I don't know. I just felt really excited that people like Rudy are building stuff, basically.

Georgia: Yeah, I agree. Um, I also felt super hopeful rather than angry after his interview. Yeah. So let's hear it now, let's hear this clip.
Rudy: We're on a decentralized [00:23:00] protocol with the explicit intentions that there will be billions of users on it eventually.
That's kind of, that's the goal, right? That's Blue Sky's goal. They want to build a decentralized version of, like, a Facebook-scale network. And so what I find really interesting is that although the broader network can be billions of users, you're able to make this dotted-line community for your couple million folks.

I think there's like an upper bound of what I would expect to be a part of Black Sky, 'cause it's not meant to be every Black person, right? It's like, whoever just opts into this community. And there's other communities that have followed along in our path, so we're not the only one. There's people who've seen our model, and their devs from other communities are contributing to our code base and giving us credit for forging this path forward. North Sky's an example of this. I think the goal with Black Sky, because we've built all this infrastructure for ourselves, is to sell this infrastructure to other communities as a way for us [00:24:00] to have our business model.

And so each component of our stack, that's the other cool thing with the AT Protocol, all this stuff is composable. As a user, you can use our custom feed and not use the mod service. You can use the mod service and be looking at other feeds, and the mod service still works. A barrier to community builders is the technical stuff. So can we make a one-click solution? The sell here is not run your own server. The sell here is create your own rules. To create your own rules, you need to run your own server. So how can we make it easy for you to just get started doing that?
Alix: I love it. And in the whole Node Star series, one question I asked Mike Masnick, and I think maybe I asked Andrew, and I definitely asked Rudy, in different ways, was: I've always felt like these types of approaches, the sort of decentralized, roll-your-own-infrastructure kind of situation, hinge on agency, so like being able to control your own destiny in a technical system. And we lost access because fewer people could do the [00:25:00] things that were required to get that agency. So not everybody wants to do this, not everybody can do this, not everybody should have to do this. And all three of them had really good responses to that, in terms of the overall path being that this is going to get easier and easier, and they do think that that's important. And hearing the way he's thinking through the strategy of making the things that he's built more accessible over time, so that more people can access this infrastructure, is just really exciting, because I think historically decentralized communities haven't really spent that much time figuring out how to increase adoption and make it pleasant to do this kind of stuff. And it's just really nice to see that maturity in these spaces over the last, like, I don't know, 20 years or something.
Georgia: Yeah, I think it's really interesting, because it is also a cultural thing. It's like a cultural shift. Mastodon is on the protocol ActivityPub, and I've talked to lots of technical people around this, and I'm just kind of like, why does it feel like there's a sudden explosion of excitement around decentralized social networks now that we have the AT Protocol? And, you know, the fact of the matter is it is just much more flexible, more durable. It's easier to build stuff on top of it. And it also [00:26:00] partially is a timing thing, because people were jumping ship from Twitter and going to Blue Sky. Yeah. And I think, culturally, and this is something that Mike did touch on a bit in his episode in the Node Star series, there is a kind of old-internet feel to Mastodon servers. And we've sort of moved forward from that now, where it was like, this is a nerdy, somewhat gatekept community. And there's lots of people who run Mastodon servers who aren't trying to take over the world. They just wanna chat with their friends in a safe space, right? Which is fair enough. Which is why I really like how Rudy refers to it as a dotted-line community. You can phase in and out: at the same time, you're within the Black Sky community, a safe space where you can talk and exchange ideas with other Black people, but you are not cut off from the wider Blue Sky network.
Prathm: I'm just gonna stake my, my bet now publicly.
I think the shit's gonna win. I think this is going to win. I think we are going to do decentralized protocols. I do not think it's gonna happen in the next few months. As like everyone was hoping that we would all switch over, but it's just so obviously the case that this is the better system [00:27:00] and people like it more, that people will get closer towards this slowly, and I think some of the tech companies see the writing on the wall and are doing some versions of, oh, maybe we'll participate, but maybe we'll do it in a little gatekeeper way.
But you know, this is going to win. I don't think this is as easy as just crossing a threshold. There is a ton of this fight that hasn't happened yet because the big players aren't scared enough yet. When it gets closer to that tipping point being hit, expect the complete private capital playbook of Oh, but decentralized means less safe.
Oh, what about child safety? They're gonna try and get governments to regulate this out of existence.
Alix: They've already tried to, yeah. Like, so, age verification. Yeah, there you go. Basically Bluesky has said we can't functionally do it. But they do. They do do it. They do though, in certain jurisdictions, and it's not, yeah, that makes sense.
That makes sense. Don't get it. Like, I don't get, like, but they can't guarantee, someone at AT Proto can't guarantee, that an app built on top of it is [00:28:00] gonna do it. Yeah, no, no, I think you're right. And I also think they're gonna try and co-opt the shit out of it, which I don't really know how you necessarily do that, but I imagine it has something to do with buying Bluesky, or like some, well, they also, they
Prathm: want to be the apps that you access the protocol via.
Like, that's what Threads is. Yeah. It's like theoretically Mastodon-esque, but you are on the app that Meta owns and they're still selling all your data.
Georgia: Is Threads on ActivityPub? No, I think they tried to federate and then ActivityPub was like, yeah, it didn't work. No, get out, get out. Like,
Prathm: I mean, I don't know if it's still the case, but they were doing some federation stuff for sure.
I don't know if they've gotten rid of it yet.
Georgia: It's interoperable with those platforms.
Prathm: Yeah, but it's not built on them. There's some cross-posting and stuff and yeah.
Georgia: There's an interesting question here about, if we shift away from everybody's attention being concentrated in a small handful of centralized platforms, like, what does that do to their business models?
Like how do they then manage that? Like what are they gonna do when your attention and the amount of time that you spend on one app doesn't matter anymore? Because like there is no one app, you're just hanging out in a space and you're using whatever, like, skin you prefer on that [00:29:00] content delivery system or whatever.
Alix: There's also this like human centipede element of this, of like, um, yeah, just wait for it. Oh God. Um, where you have like decentralized apps where, like, you share a link, you can access the thing. Like there's more of a pool of content where it's probably harder to control who has access to it.
So terms of service will be harder to enforce, which means there'll be more scraping, probably, within decentralized things, 'cause it's easy to just fork and basically copy like all the content on a particular protocol. So that is happening at the same time that LLMs are trying to create many universes where basically you live within a chatbot.
So the reason I say human centipede is because basically, like, over time these LLMs are gonna like lock into some type of content ecosystem, and I imagine that will be more likely to be decentralized platforms. There's more freedom in some sense on the internet, but then there's these people who are kind of like trapped downstream, because they've become, like, addicted, because they're obsessed with their sycophantic friend who talks to them [00:30:00] and recycles weird shit on the internet.
Not literal shit, um, just to be clear. Uh, but I just feel like there's this really interesting emerging dramatic openness within decentralized spaces and then dramatic closedness within LLMs. And I kind of wonder what's gonna happen with all that.
Georgia: Yeah, that's really interesting. Yeah. I mean, actually we should probably move on.
I think... stay tuned. Stay tuned. Stay tuned. We'll find out.
Prathm: We'll find out what happens to the internet.
Georgia: Prathm, this is your pick. Would you like to tell us why you picked this?
Prathm: Okay, so this is Brian Chen from Data & Society, talking about chips, and the amount of time I have lost in 2025 hearing someone who thinks they know about chips and the importance of chips talk to me about chips.
I'm like, I'm losing my, I'm losing my mind. I think this interview should be mandatory listening before anyone with any amount of power uses the word chip in a sentence. He makes a couple of points in this clip and in the interview at large: making chips is really hard and really expensive, and we're not gonna magically create a mini Taipei inside of Arizona if we just like buy [00:31:00] 10% of Intel and yell about national security. Taiwan's leadership,
at least in, in the past, uh, viewed the emphasis on chips as a way to strengthen national security. And now it's sitting in this kind of pseudo cold war between the West, the US, and China and the rest of the world. And the question he asks here in this clip is: does this actually make Taiwan safer? And I think that is just
An amazing question to be asking. So let's listen to the clip.
Brian: I'll back up and say it's incredibly, incredibly difficult to make chips. Chips power modern life. Anything that processes digital information, ones and zeros, is powered by chips. So that includes everything from like your basic
consumer electronics to the most advanced AI, advanced computing. Nvidia is a, a really good example of this. Their H100 GPU is pretty much the kind of industry standard for deep learning and LLM training. Nvidia does not make those chips. They design them in-house, and they're very, very good at designing those chips.
But [00:32:00] when it comes to actually making them, they just outsource it to a different company. And that's the same thing with Apple or any other kind of major tech company. The reason they do that is it is so costly and difficult to make chips, in addition to how incredibly complicated the supply chain is.
You have like rare minerals being taken from this part of the world. You have special equipment from this part of the world. It all tends to get bottlenecked in Taiwan at these fabrication facilities, or fabs. This is where they physically actually take silicon wafers and turn them into the chips that can be used in computing.
There are some estimates that it's maybe $20 billion just to have one of those facilities, and companies that fabricate these chips have like a lot of these. So the cost to enter this market is extremely high. Taiwan has long been caught in this regional conflict. It has long been an object of imperial desires, of Chinese imperialism.
In this kind of context, Taiwan said, [00:33:00] the way that we're gonna secure ourselves, promote our national security, is we are going to secure our position in global supply chains. We are going to dominate chip manufacturing. Uh, a lot of foreign policy experts will describe its chip industry as the country's silicon shield.
So a lot of people are, are very kind of triumphalist about the fact that Taiwan has pursued this type of economic development in the mold of national security. And I think one question that still kind of remains, for me anyway, is: how much has that silicon shield really made Taiwan safer? I think you can argue that
it has, as much as anything else, made the country an object of desire for larger nations, right? The chip industry there has encouraged imperial desires, not negated them.
Alix: Like a couple months after this interview, I learned that Taiwan is on a major fault line, which I didn't know. So like an earthquake
tomorrow could like bring down the global economy in a way that I hadn't quite appreciated. It's on the little, don't worry, we got Arizona
Prathm: situation. We got TSMC in Arizona.
Alix: Yeah. [00:34:00] We've got $20 billion in 10 years to build a, yeah. Anyway, so why, why did you like, I don't know, the Silicon Silicon Shield thing?
Yeah, go for it.
Prathm: The, there is, there is a narrative that I'm gonna try and summarize that I think is the predominant narrative about chips right now. And it goes something like: we need weapons, we need AI to make weapons, we need chips to make AI, Taiwan makes all of the chips, China's going to invade Taiwan.
We need to copy Taiwan in America. We should invest money in chip making. That, that's, that's the narrative. Trillions of dollars, right? There's, yeah.
Alix: Yeah. Welcome to our economy.
Prathm: I, I think we should make chips in America. Like I think that's a great idea and I think we should make a lot of things in America and invest in a lot of infrastructure in America.
But prioritizing chips above all else, which, you know, I play video games, I would like cheaper GPU prices, it's insane to me that this is the only priority. Like we should build a better healthcare system, and technologies, and batteries, and a power grid, and energy, and all of these things. So I'm just not convinced that this goal of building more chips in America,
coming [00:35:00] from this pathway of weapons and war, is the pathway that leads to the best results, really. And I think Brian does a great job of saying why.
Georgia: How can we orient a new, better healthcare system around making more chips somehow?
Prathm: There you go. Or just, you know, every American gets a PS5. Put chips
Alix: in.
Exactly. Put chips in the body. So Prathm, if you were to engineer an economy that included chip manufacturing that was less batshit crazy, how would you, what, what would it look like?
Prathm: I think it would look like the CHIPS Act, and you're welcome to sell it with a bunch of national security language if you think that that's what you need.
But in actuality, it's like: what does it look like to take all of the things that run our country and make sure we have supplies for that and are able to do that internally, and then also prioritize a fuck ton more spending on everything else that also matters? My argument here is that we should spend more money on America.
Chips are a crucial part of that. We only spend money on things if we want to create war. [00:36:00] Chips are the best example of this right now. It is all national security language, and it means what's gonna happen is a handful of companies are gonna be able to produce these chips. We're gonna argue this makes us safer, but it's not actually gonna do anything for domestic life, and it's instead just gonna feed into the military industrial complex. Like, it's,
it's just the wrong way to get to this thing that we need.
Alix: It was ever thus, yeah. Yeah. I think it also connects to AI bubble conversations, because I think it's become a bottleneck in a way that threatens also the global economy. Like I still, I still struggle to conceptualize how the market can freak out about Mamdani
suggesting free bus fares for New York, but then be like, yeah, $7 trillion for Altman to like build some, like, whatever, sounds reasonable. Um, and there's just such an inability to invest in people. It's almost always investing in an architecture where there's no people, and I don't understand how. Both: there's no people because we put them out of work by building AGI or some shit, whatever they, they're trying to do,
or [00:37:00] there's no people because we built a military industrial complex that like wipes people out.
Georgia: Absolutely, and like never-ending, going round and round conversations about scale, scale, scale. How do we achieve more scale? Well, we need more scale. This was like so perfectly conceptualized in Brian's interview; he almost does it here.
I didn't wanna put it all in, because then the whole clip would just literally be about silicon wafers. But he does talk about how, from beginning to end, they're manufactured, and actually indeed how much waste there is. Because there's like, there's like a yield, right? Because they're so hard to make and it's so complex.
And I remember you, Alix, pressing him for kind of like, what's the percentage rate, like how many do they lose? Are we talking 1%, 2%, or like 20? And he's like, yeah, like 20. And it's just like, that's crazy. And because of that error rate, they're just like, therefore we need more scale, so that we can get more out of these strange diminishing returns on these silicon wafers, which frankly sound delicious.
But yeah, I don't know. This was part of our series called Gotcha, which is about scams and how they work, from Bridget [00:38:00] Read, who wrote the book Little Bosses Everywhere, and it's about the history of multilevel marketing. I picked this because I think, not to be like grand-unifying-theory about this, but like it really does feel like a really good history of the beginning of how we got here. Because it was kind of like, in the mid 20th century in the US, thinking about how the economy should work and how individuals should
think about themselves in that economy. A lot of it was a response to like the New Deal and trying to kind of claw it back, essentially. So let's hear Bridget, and then let's discuss.
Bridget: From the very beginning, they're having to invent, like reinvent, sales as something that is really gonna help us all. And so they create this vocabulary that is a lie.
You know, they call consumption production. When you buy something in MLM, they immediately flip it and call it sales. It's a very literal kind of transformation. They're trying to tell you that you are doing something fundamentally extractive. You're paying for a product and they're trying to tell you that you're [00:39:00] contributing something.
You're producing something valuable. It is totally gaslighting, in the service of like convincing us all that buying more and more shit is somehow producing something in a marketplace. And again, right now we are being sold on something that, truly, the benefits to us right now, and I'm talking about AI, just to be clear, are not clear to us, right, to regular people. But we are being told that regulation is gonna hold any of those benefits back, and that if we try to get ahead of any of this and put our arms around it in any way, we're gonna keep ourselves from this like incredible imagined future of riches.
MLM has been doing that for so much longer. When you're brought in, if I bring you into my downline, you're at the bottom of someone's pyramid. But I do it by telling you, you can build your own pyramid. I mean, it is literally a story about exploitation, [00:40:00] and selling that exploitation. MLMs literally call recruiting people into the business "sharing."
It is a spin on selling being something kind of icky. In the olden days, salesmen were icky. You know, it was gross. It was a transaction that was not supposed to be with your family member, your friend. It was not supposed to exist in your sort of intimate life. And now everybody is selling something to everybody, and we're taught that it's good because we're all doing it together.
But again, all of it serves, in what I feel, to obfuscate the fundamental relationship of labor and capital.
Alix: I also think we should use the term "future of riches" somewhere. That's so good. Oh my god. Yeah. I'm always thinking about my future of riches. I don't know, there's just so much in here. There's so much, like, "selling exploitation" is also, yeah.
Georgia: I think also, like, she talks about how they successfully switched the narrative around the salesperson, from a kind of like [00:41:00] sweaty door-to-door salesman, please buy this vacuum cleaner, reframing it to be less icky, as she said, into kind of like: no, you're just an individual entrepreneur who's making it work for themselves, supporting their family, going out and being really entrepreneurial in this new free radical market of enterprise.
That was coupled with, like, the people spinning this rhetoric back then, post-Second World War, being seen as quite radical and out there. And I think they've successfully shifted that to seeming like the norm now. If you are failing, the only person you have to blame is yourself, type of thing.
There's no need for welfare if you can just make it out there on your own and just hustle, basically. Grindset, et cetera.
Prathm: I think it is so amazing how the story of MLMs feels like the story of the capitalism that is discussed on social media platforms. Like, this is tied so deeply into the manosphere stuff too.
It's like all of these guys who are selling ways of... they're like, oh, you wanna be a content creator? Buy my [00:42:00] pack that teaches you how to be a content creator. Exactly. Exactly. Everyone, yeah, everyone's obsession with, you gotta run your own thing, like you need financial independence, and it's all scams.
Like these are all scams and people selling scams.
Georgia: Pay me to show you how to become rich. It's like, the way that you are becoming rich is because I'm paying you to tell me how to become rich. It's like a circular... like, couldn't be more transparent, humans. Yeah, that's right. Here we go again. Everything's a human centipede.
Alix: What's really cool is that at the end of the year, when we look back at shows, I think we end up seeing patterns in the themes of what we talk about, and how, like, no one person can work on all of the stuff we're interested in. And so it becomes this like interesting network of people that are thinking about these issues from different perspectives.
And it's one of my favorite things about doing the show, that we get to go deep on particular perspectives with people, but then when you kind of zoom out, you start seeing this fabric of who's doing what, in a way that I think becomes really meaningful. Different conversations throughout the year have kind of joined [00:43:00] up into this overall,
probably, view that we have about what the problems are and what the possible solutions are. But I don't know, like when you see all these things together as a representation of our year, recognizing that there's some stuff that we didn't mention at all, I think it does kind of capture an overall set of
themes. So, any final words on, I don't know, what we focused on this year? And maybe we could also end on what we're thinking about trying to dig into next year.
Georgia: Yeah, I think, I think it definitely was super comprehensive. So, you know, well done us. Um, but I, I think, um, I totally hear you when you're kind of like, you're looking at all of this stuff,
there's also lots of people working on lots of different things, and it feels like actually they're kind of just working towards the same fixed point, but they're just coming at it from many directions and many perspectives. And I think, when I think about Brian Chen talking about the horrendous scale that is required to create the infrastructure that is then required to create the AI that we think we need to pursue, like,
warfare, and, you know, it's gratuitous and it's [00:44:00] massive and it feels like it's never-ending. And it feels like, as Sarah West said when we spoke to her about this, it doesn't even feel like it's the same as the 2008 crisis bubble, like we're all converging on this huge punctuating event that's going to explode and destroy everything.
It just feels like we're in this mode of, everyone is kidding themselves. They're all high on their own supply and they're just really excited to continuously pump money into this thing, but there's no dump. There's just pump, pump, pump, pump, no dump. And I just feel like there's a really strong connection there with Bridget Read's thesis around kind of like,
sell nothing to everyone, and everyone's selling nothing, nothing, nothing, nothing, even though we're selling and there is no product, and there's this collapse of production and consumption. And I feel like the underlying theme from this, just as Marietje talks about, is this inherent power-struggle tug of war.
Well, it's not the tug of war anymore. It's more kind of like the government handing all the resources over to tech companies, just creating this inherently untrustworthy, adversarial ecosystem that is not democratic, and where, [00:45:00] you know, on a micro and macro level, sitting in your government department or in your home, you're just hoping that all of the technology companies that you've given money to will sort out all of the problems that are bouncing up against your windshield.
Yeah. So it all feels super connected. But also, like, I spend every week thinking about this stuff, so maybe, I dunno.
Alix: What I heard: democracy is a big one. Like the decoupling of production that's actually worth something, like a decoupling of the traditional way we think of value creation in an economy, from whatever this is. It's like a fantasy. I'm also hearing all of the systemic risks that that introduces. So economic, but also kind of social fabric, and also geopolitical. Basically the consolidation of energy and vision on this one particular project, and then how that creates political risk essentially everywhere else.
Prathm, thoughts, themes?
Prathm: Yeah, I fully agree with both of that. I think it's the same story that it has always been. There are private [00:46:00] capital actors who want power. They are assuming a lot of that power right now by taking over core government functions and by having government no longer participate in reasonable decision-making and regulation that protects people's lives.
And one of the major consequences of this is a breakdown of the community and public ownership that is what we actually crave as human beings. We want to feel safe because we know our neighbors. We want to be connected in decentralized ways that don't have some tech company arbitrating our conversation.
We want our government to actually build things for us that make our lives better. And we're, we're just seeing this all play out in extreme ways, because the promises being made for this vision are so extreme and so fictional right now. We are losing so much of the narrative ground in this exact moment, and this year showed us
The consequences of that really happening at scale.
Georgia: Yeah, agree. You're just really making me think about how, like, [00:47:00] meanwhile, while we're being sold this amazing, beautiful vision of future riches, simultaneously our reality is becoming whittled down and down and down to just one where we continuously transact and don't interact, if you know what I mean. Like,
we're just condensed down to a series of transactions throughout the day, rather than meaningful connection with people. And it's just like, I don't wanna spend all my life fucking transacting.
Alix: Yeah, I think, I think that all sounds right. And I feel like, should we, should we... I feel like we should preview a little bit of what's coming next year,
'cause we know some of the stuff we're gonna be doing, and then maybe we could talk a little bit about things that we wanna cover as individuals. So we've got some stuff in the works in early January. We'll be dropping a trailer for a new miniseries, so be on the lookout for that. And we're also mindful that the India AI summit is coming up in February, and that there is gonna be a lot of techno solutionism and geopolitical maneuvering that we're gonna keep our eye on.
And we'll have some content related to that in January and February. Also, and maybe this is just like what we each want to do more on: I really wanna do more work on, [00:48:00] um, the kind of post-democracy political project of the tech bros. 'Cause I think a lot of people, when they try and critique corporate action in the space, think about it
in very closed-system terms. They're like, oh, in a capitalistic economy, oh, they're just profit-motivated, they're just trying to make money on profit. Or like, oh, they're just trying to do X or Y. When in fact, I think a lot of them are now both high on their own supply, probably like high on ketamine, and also have been dipped in a set of ideologies, this like dark enlightenment set of ideologies, where they like literally don't want democracy anymore.
They like literally don't want normal economic systems anymore. And I think I would love to spend more time engaging with some of those ideologies. Not necessarily to like go off the deep end in those spaces, but just to be a little bit more clear-eyed about what is influencing some of the decision-making and the strategies of these guys. And it feels like it's of increasing importance.
Um, yeah. What do you guys wanna cover next year? [00:49:00]
Georgia: I actually think a lot of the stuff that I'm thinking about is really overlapping with what you just said. And I think I wanna think about that sort of anti-democracy project and take it further into kind of like understanding
the psyche of these, like, eight dudes who wanna rule the world and turn our species multi-planetary. There's this disgusting, gratuitous obsession with their own mortality, and they hate it and they reject it. They don't like their fragile human bodies and they want to kind of like live forever.
I'd be interested to think more about transhumanism and male wellbeing, basically the wellness industry but through a male lens. Those two things I think are really, really interconnected. There's a lot of overlap there with the EA movement and kind of like AI safety and this strange self-hatred of humanity, and trying to cure all disease and scrub the world of autism, and things like this.
I think there's... it's nebulous and there's loads here, but I would wanna do more work that makes sense of it. And then I think alongside that, looking [00:50:00] more into how AI slop is transforming internet culture. Not in a kind of, oh, what's this doing to... because these are really important questions as well, but like, what's it doing to journalism,
what's it doing to our information ecosystem, how are businesses gonna make money if the entire economy of the web is no longer based around "click on this website" and people looking at it, because AI summaries are everywhere. More about: how are people actually using AI slop, for fun, for their benefit, for political agendas?
And I think, like, is AI slop necessarily bad? Is there a difference between slop and brain rot? And just questions around those kinds of things. And should we be looking at AI slop, or just AI-generated media, as a new type of media that we maybe have to accept into our ecosystem rather than fully reject?
That's just the beginnings of my ideas around what I think we should be thinking about next year.
Prathm: I kind of wanna do a little bit of the opposite of what the two of you have suggested, which is, I think, I think we need to articulate the political ideology of the right. The right has a tech-informed [00:51:00] political ideology.
I also want us to focus on building, starting the process of building, a narrative around what a tech-informed left political ideology looks like. Like I, I want a positive vision. Yeah. Just, just, desperately, a stronger brand. Yeah, and I wanna talk about pure politics more on the show, and have people who are informed about the actual political systems. Not just a technologist who is politically informed, but a politically informed person, or politics person, who is maybe technology-informed.
I wanna spend more time in that space, and just work towards having more of those talks where Alix leaves feeling, oh shit, progress, rather than, I have another way of describing how the world is fucked.
Alix: I'm down for that. Yeah, that sounds great. Yes, that's pretty good. I also think there's more, there's more potential in that.
I mean, and not potential as in... like, I think there are more candidates for guests now than there were even a year ago on that front, and I think we should lean into it. Cool. Also, we would love to hear from you, dear listener. If there's topics you want us to cover, or guests you want us to think about, or if you wanna share your end-of-[00:52:00]year reflections, feel free to drop us a line.
'Cause we're of a certain age where I think validation is really important to us. Alix, do you like, do you like us? Yeah. Am I pretty? Uh, yeah. So we'd love to hear, if you listen, what you think, and if you've got ideas, uh, or any feedback on this year. We are here and ready and listening.
But thank you, Georgia, for doing the work to comb through all these episodes and find these clips, very appreciated. And thank you, Prathm, for, you know, this. It's great.
Georgia: Thank you everyone, and obviously
Alix: thank you to Sarah Myles, who isn't here in the Zoom box, but is, she's listening in spirit.
Georgia: She's here in spirit. Thanks for listening. We'll see you next year guys. Bye. Bye. Love you.