Computer Says Maybe

A replay of our conversation with Kate Sim, on the state of child safety online.

More like this: Dogwhistles: Networked Transphobia Online

We’re replaying five deep conversations over the Christmas period for you to listen to on your travels and downtime — please enjoy!

Child safety is a fuzzy catch-all concept for our broader social anxieties that seems to be everywhere in our conversations about the internet. But child safety isn’t a new concept, and the way our politics focuses on the spectacle isn’t new either.

To help us unpack this is Kate Sim, who has over a decade of experience in sexual violence prevention and response and is currently the Director of the Children’s Online Safety and Privacy Research (COSPR) program at the University of Western Australia’s Tech & Policy Lab. We discuss the growth of ‘child safety’ regulation around the world, and how it often conflates multiple topics: age-gating adult content, explicit attempts to harm children, national security, and even ‘family values’.

Further reading & resources:
**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Post Production by Sarah Myles | Pre Production by Georgia Iacovou

What is Computer Says Maybe?

Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.

Alix: [00:00:00] It's Boxing Day. I hope you had a great time yesterday, if you celebrate. Over this winter break, we are bringing you five deep conversations from the podcast that we think pair well with long walks, hopefully with nice weather and on the heels of pretty big food comas. And today's is from Dr. Kate Sim, and we cover a topic that has obviously been of huge interest over the last few years, and that's kids' online safety.

I am in Australia for the winter break, or summer break, or whatever it is we call it here, and the government here has just rolled out a social media ban for under-16-year-olds, so it feels like a good time to have Dr. Kate Sim coming in and talking about her perspective, from years of expertise in this area, on what it would look like to actually keep kids safe online. Just FYI, this is a conversation from April 2025, so we don't have the benefit of knowing how some things would have rolled out for the rest of this year, but I think there's still loads of interesting insights here as we think about possible legislation to protect kids from [00:01:00] the maelstrom of issues that online platforms present for them now.

Welcome to Computer Says Maybe. This is your host, Alix Dunn, and in a first, but probably not a last, we have a guest who has been on the show before and is back with a completely different, uh, area of conversation. So for those of you who didn't hear Dr. Kate Sim in an early episode of the show, talking about her time at Google: you should listen to that. It is very interesting. She is also an expert in child safety, particularly thinking about online environments and the way that kids experience the internet, and the implications that has for their joy and experience in life, but also the potential downsides of digitizing childhood.

Um, we get into all kinds of things. Some of the things that we talk about might be triggering if you have kids and are worried about them, or have had bad experiences trying to help them stay safe [00:02:00] online, or if you yourself have been a victim. So just as a warning: if child sexual abuse material and conversation about it is something you wanna skip, then I would not listen to this episode, 'cause it comes up throughout, which makes it hard to timestamp for the purposes of a trigger warning.

But why I wanted to have this conversation is because kids online is such an intense conversation. It is also discussed in really vague ways, so that you get the sense of urgency, the sense of importance, but oftentimes it doesn't come with a specificity and a contextualized understanding of what the actual problem is. What solutions are people trying? Are those solutions working? How should we think about how these solutions might affect other rights of adults, or even rights of kids, who do deserve online spaces that they actually wanna engage in? You know, the purpose here is not to cut kids off from all access to the internet, although some people are gonna try that and have tried that. But I just [00:03:00] appreciate Kate's sort of systematic approach to this problem from a position of wanting to protect kids: both from controlling adults who wanna limit their access to information that they just don't want them to have access to, and also from potential abusers, um, and from really scammy companies who haven't taken any responsibility in creating an online space that's safe for kids or adults. You know, I mean, there's a lot of abdication of responsibility from platforms that we get into in the conversation.

Essentially, I just wanted to have a systematic exploration of a topic with an expert I trust, and I hope you find some good information and some good reframing, which I think Kate does a lot of on this issue, helping you understand what's at stake for kids and what maybe should be done about it.

So with that, my conversation with Dr. Kate Sim.

Kate: Hi, my name is Kate Sim and I'm the director of the COSPR program, which stands [00:04:00] for Children's Online Safety and Privacy Research, and it's a research program housed at the University of Western Australia's Tech and Policy Lab.

Alix: I kind of wanted to start with, like, broad strokes: what is happening to kids online, and how do you even approach that question?

Technology has taken, you know, the world by storm over the last 15 years. Lots of stuff's happened. Kids have been thrown into the mix. We have different duties of care to children in terms of how they engage with technology. It's a huge conversation. So could we start with just your research on what is actually happening? Like, what is the size of the problem? What are the types of problems for kids online?

Kate: So much of it actually has to do with the fuzziness of the concept of child safety, which is just really vague. Like, is that referring to school safety, where desks have to be a certain height so that kids can participate in the classroom environment? Or are we talking about the stranger danger of the eighties and nineties, telling kids not to talk to [00:05:00] strangers? It's a very, I think, purposely fuzzy concept that encompasses a whole range of things. So the metaphor that I often hear from regulators, and particularly parent groups, is: think about going to a pool. There are all these safeguards present, but we don't have those things in online spaces, so we're trying to bring them in. And this metaphor sounds super intuitive; of course we would want to have some rails and little floaties in a pool so that kids are not drowning. But I think that metaphor in itself is quite reflective of how we're thinking about child safety, and I think that's where this fixation on a particular set of extreme, severe harms comes in to be a stand-in for all of our anxieties about young people and all of our anxieties about technology. So one particular area in which that manifests is around child sexual abuse online.

I've been studying child sexual abuse and sexual violence for a very long time, and this is very consistent with the way that [00:06:00] child sexual abuse, or CSA, has been thought of since the seventies. You know, it became politicized as something that we as a society should reckon with in the seventies and eighties, much thanks to feminist organizing around politicizing gender-based violence and violence within the family system. But what's kind of interesting is what happens in the eighties and nineties: gender-based violence, sexual violence against particularly adult women in marital or workplace settings, gets politicized as something that happens within our community. But child sexual abuse conjures up an additional set of anxieties people have about the sanctity of the family, so it doesn't take a similar kind of politicizing route, and it maintains its scary image as this thing that happens elsewhere. It's simultaneously prevalent everywhere, but also something just really pernicious, and it conjures up immediate discomfort, so it doesn't get politicized in the same way that [00:07:00] sexual violence against adults does.

Alix: Can you unpack that? 'Cause when you say sexual violence against adult women in the eighties and nineties, and kind of gender-based violence conversations, got politicized, can you say more about what you mean by that?

Kate: I mean, it was normalized up until that point. So there's a concept called hermeneutic injustice, from the feminist philosopher Miranda Fricker, talking about things that we know to be injustices but that socially we have not acknowledged and termed as such. So things like sexual harassment, which was just considered locker room talk, or just what you have to deal with if you're a woman in the office. Because of organizing by feminist workers in particular, that gets politicized as a societal harm that we now see in legal documents, and now we have institutional policies and norms around it. But when we politicize things like that, you know, obviously it can take multiple routes. I often like to think about sexual violence as kind of an umbrella category, and within that there are all of these different ways in which gender and [00:08:00] sexuality and power intersect in interpersonal relationships.

Alix: So there's a prior normalization of a thing, and then a change in society that basically says that thing that used to be normalized is not normal, and actually it's gonna require organizing around making it not normal and actually legislating against it.

So you actually have to convince men and people with power that changing this thing is actually of importance. You also get backlash, and actually the book Backlash is in my backlog and I'm excited to read it, 'cause the history of how, when feminism has made progress, the backlash has happened and kind of manifested seems like a useful way to understand things. Also, anytime we make social progress, there's, like, a group of people that will do anything they can to claw it all back. But I hadn't quite thought about, in the child safety context, one, how this connects to your earlier point that it's fuzzily defined, and two, how basically it's universally accepted, maybe with the exception of certain forms of corporal punishment, that hurting kids is [00:09:00] universally seen as bad and wrong, in a way that hurting women is, like, meh. You know, it depends on what year we're talking about, who we're talking to. I hadn't thought about that affecting how we then talk about safety.

Kate: Yeah. Yeah. So then there's this really interesting tension around normalizing it, because again, with sexual harassment in the workplace, for example, it's something that was normalized that we're trying to de-normalize. And in doing so, the organizing strategy is to narratively show that, like, this is messed up and it shouldn't be considered normal, and actually what is abnormal is the fact that it's so prevalent. So there's this weird paradox in trying to normalize it, and a similar kind of dynamic plays out with child sexual abuse as well, which is: you look at the statistics and it is extremely prevalent. It's prevalent in the sense that it happens within the family system, and that is what's so troubling about [00:10:00] it, that this institution of the family, that's our first contact with being a person in the world, the people that we depend on most closely, are often the people who are most unsafe to children. And these are really jarring and difficult concepts to grasp.

So in an effort to try to raise awareness and de-normalize the violence that we enable against children, what ended up happening is it ends up becoming kind of a spectacle. So even when you say child sexual abuse, even though there are multiple ways in which that can play out, I think people immediately jump to sexual spectacle, things like Dateline and all of these, you know, crime shows that really center on sexual violence against children. That is the image we immediately come to, because that's the imaginary that's really popular in culture and that's ingrained in us. And unfortunately that has the effect of actually making real instances of child sexual abuse, real instances of child maltreatment, become less visible, because it [00:11:00] doesn't rise to the standard of what we imagine to be the worst form of violence against children.

And I think it's a really, really scary thing for many parents and guardians to grapple with, because when you take it at face value and you see that 90% of child sexual abuse happens within the family system, from somebody known to the child, you have to recognize the fact that the family is dangerous for many children. And that's a really, really uncomfortable and troubling thought. And I think that also invokes really intense feelings of guilt and shame from parents and guardians.

Alix: I think a lot of times there's an ahistorical and also purely digital way of having this conversation, and I think that is that spectacle, where it feels moral-panicky, in terms of: this is coming from nothing, rather than coming from a long history. And also this is coming in a way that's completely disconnected from the physical [00:12:00] world, which I think really makes it so much harder to understand this as a set of systemic problems.

Kate: And I think, to bring it back to the question that we started with, which is what is it about the fuzziness of the concept of child safety: because it's so fuzzy, it allows it to sort of tap in and out of this spectacle of the worst form of violence against children. And then add to that the tech-facilitated dynamics, where technology also conjures up a set of anxieties, you know, particularly with parents. Both of these fuzzy concepts play against each other, so that at times we're talking about something very specific, but also we're talking about something very ubiquitous. So it ends up basically being this kind of stand-in for all of our anxieties about young people and technology, and it lacks the specificity that's needed for the kind of big-scale interventions and regulations [00:13:00] that we're seeing playing out in discussions about client-side scanning, or discussions about grooming detection, or online harms, and so on.

Alix: Yeah. I also feel like, I mean, I wanna pick up on that word anxiety, 'cause the anxieties about children feel worth unpacking a little bit. It's kind of hiding several layers of different things. So, like, anxiety can be: I would like to be able to control what my child does, which is, like, one of the core driving anxieties, I think, of every parent. Like, hey, can you do that thing? Oh, you're not gonna do it. I want to be able to make you do that thing. Um, so there's this desire to control. There's also: I'm a conservative parent who really doesn't want my kid to be trans, so I'm gonna completely rubbish the idea that they could possibly be trans by getting radicalized online. So there's that parent, who doesn't want trans kids to exist and wants to control their kids.

The second anxiety is, I think, a legit anxiety about what the internet is for kids: [00:14:00] the uncertainty, the lack of ability to, like, know, which maybe isn't control as well. Like wanting to know what they're seeing, wanting to know how it's making them feel. Is it a net positive in their life? Is it a net negative? Do I need to do something to help them navigate the digital world, even when it is a world that I don't really fully understand, 'cause they're engaging with it in a different way than adults do?

And then it feels like there's this anxiety about all of the legitimate harms that are happening. You know, like, is my kid feeling down and gonna get advertised a suicide kit on Amazon and end up in forums that, you know, are encouraging them to commit suicide? Is my kid, who's feeling kind of insecure about their body, gonna all of a sudden be pushed pro-anorexia content? Is my kid gonna be groomed in chat by someone on a video game forum and I'm not gonna know? There's that whole bucket of what I think are very legitimate anxieties. 'Cause I know you think about this a lot, but how do you think about parental anxiety as it relates to the kinds of problems we're talking about, [00:15:00] and as it relates to the kinds of solutions that are appropriate, given how we think about dealing with parental anxiety?

Kate: So I think it's really helpful if we actually look at some of the examples of recent regulations and proposals that have come up across the world. So, for example, in Europe we have the Digital Services Act, which has strong components around child safety. The EU also proposed the CSA Regulation, the child sexual abuse regulation, which was proposing client-side scanning. You know, in Asia, we're seeing similar verbiage to Canada's online safety bill or the UK's Online Safety Act being reproduced in Malaysia, in the Philippines. South Korea has had longstanding digital sex crimes legislation. In Australia, we have the eSafety Commissioner, which I believe is the first regulatory position dedicated to addressing online harms, and there's the recent social media ban. New Zealand also has an online safety act. On the African continent, Uganda has the Anti-Homosexuality [00:16:00] Bill, which also includes online harms and online content components, even though those have received less attention. Similarly, Ghana has the Human Sexual Rights and Family Values Bill.

Alix: You know, again, you don't really have to know what's in that to get a sense.

Kate: Yeah, yeah, exactly. You can just listen to the words; 'family values' is doing a lot of work here. And then of course, in North and South America, uh, you have the failed Kids Online Safety Act, KOSA, in the US, which always wants to pop back up every few years. Again, Canada's Online Harms Bill, and Brazil actually has been doing some interesting work around children's data protection and child online safety measures.

So, you know, that was a lot of regulation dumping, but it's just to show that there's global regulatory attention around online harms and around children, and that they are taking what I understand to be a quite privatized and technical and symptomatic approach to these issues. So what do all of these laws have in common? I [00:17:00] think finding the common thread can help us better understand how we're understanding the issue of child safety and online harms. One is that a lot of it is really focused on illegal content, so child sexual abuse in particular. You know, there's a lot of harmful content out there, but a lot of the fixation tends to be around sexual content, which is also kind of a blanket term, referring to everything from sex education content to pornography. And child safety ends up being a stand-in for tech-facilitated child sexual abuse, which is a very specific issue that's gonna warrant a different set of interventions compared to, again, workplace safety or, like, general child safety.

Alix: So essentially these bills lump together technology-assisted child abuse with content that is legal, but where there's an attempt to sort of ringfence kids from adult content.

Kate: Yeah, yeah. So these end up being kind of collapsed. [00:18:00] I mean, my argument is that we should be thinking about them in very different terms, because they're very different.

Alix: They're so different. And the implications, I don't know, it just makes me think of, like, Tipper Gore in the nineties being like, we really need to have labels on CDs that tell parents what they shouldn't buy. You know, combining that rhetoric and discourse with the very serious questions of abuse. Has it, are there conversations where that's been disaggregated in a way that's useful, or is it always put together?

Kate: Yeah, it's often been put together. So a lot of it has to do with this very particular US Protestant definition of child sexual abuse material, which was formerly known as child pornography. We don't use that term anymore, 'cause pornography suggests sexual gratification, and the idea is that child sexual abuse material, or CSAM, is always abusive. And that sounds intuitive, but actually, when you look at the historical and the legal context, we see that CSAM is a [00:19:00] very, very, very specific legal object that is imbued with a lot of projections about children and sexuality. So when we say CSAM, the statistic that's often floating around comes from NCMEC, which stands for the National Center for Missing and Exploited Children, the US clearinghouse for receiving reports of illegal material involving children, so CSAM. They often say that just last year alone they received 33 to 36 million tips about CSAM from internet service providers, and I just wanna pause there, because that number raises a lot of questions.

Alix: Oh my God, I have so many questions. Like, on what timeline? In a year? In a year, okay. And ISPs, so internet service providers, the people that run the router in your home, they're scanning the material that goes through routers for CSAM?

Kate: Yeah, so this is including, you know, Verizon, so the telcos, to your social media platforms, [00:20:00] to payment transactions. So yeah, that's a really, really crazy number, 36 million. And it's really daunting, and it does a lot of work to, again, conjure this visceral fear that this awful thing is happening on the internet. And I don't wanna dismiss that, because it is also true.

Alix: And they're legally required to report it, is that the case?

Kate: Exactly. So again, because a lot of these ISPs are hosted in the US, they have to follow US law, and the US has a law that if you become aware of CSAM being in your products, then you have to report it. So they have a mandatory reporting obligation when they become aware of CSAM being on their products.

Alix: I mean, that incentivizes non-awareness, or whatever we call not being aware.

Kate: Exactly, exactly. So, you know, again, because the mandatory aspect kicks in when you become aware, platforms have an incentive to over-report, and there's a great report from the Stanford [00:21:00] Internet Observatory, by, um, Shelby Grossman and Riana Pfefferkorn and others, that actually looks into the bottleneck issue that ends up happening because of this incentive. Without going too much into the technical details, part of the reason why there's such over-reporting, in addition to platform incentives, is that the category of CSAM is already really, really fuzzy. It includes everything from, on one hand, explicit sexual acts, which the law defines as things involving genitalia and visible penetration, to things like lascivious exhibition, which is a very vague expression. Plenty of legal scholars have written about how this is not a legally useful concept at all, and that it very much reflects the mores of the time when the law was developed. But lascivious exhibition basically has something called the Dost test, which lays out six visual criteria for assessing whether [00:22:00] a picture of a child is a lascivious exhibition or not. And it's, like, you know, focus on the genitals, whether the child's gaze is looking at the viewer a certain way, and it was developed by a bunch of judges looking at images of kids.

But how do you apply a standard like that at scale? Exactly. So that's when, again, this almost binary logic of the legal definition, like, if there's a gaze, is it like this, ends up creating this technological infrastructure through the use of computer vision technology. So ESPs, again, because of this incentive to report and their mandatory obligations, have created a technological infrastructure that uses computer vision technology to basically apply this definition to detecting CSAM, in collaboration with law enforcement and clearinghouses.

Alix: So this is how that dad had his whole Google account deactivated, yeah, because he had a picture of his kid in, like, the bathtub or something.

Kate: [00:23:00] Yeah, exactly. Basically, the machine learning system that's used is trained on hashes, which are like digital fingerprints of confirmed or vetted CSAM. So this means that someone in the investigation pipeline, whether that's Google and Meta and the ESPs, or law enforcement investigators, or these clearinghouses, has at some point in time looked at this image and confirmed it to be a quote-unquote true positive CSAM. That database is used to train this computer vision model to look for identical matches, so that's called hash matching technology, and that's used across the board as the first point of detection for finding known CSAM that's on ESP services.

Alix: Interesting. I wanna get to technology-enabled abuse. In your head, do you distinguish between these wars or policy battles about how [00:24:00] to reasonably detect and accurately report on the kind of dark part of the internet where a lot of material of kids getting abused is shared, and the use of technology by one person, probably an adult, to pursue abuse of a child online?

Kate: Well, I think they exist on a continuum. So again, with these hashes, there are a lot of assumptions embedded into these hash databases. They tend to involve young children, and the abuse is visibly apparent. But again, children or minors, under the legal standard, refers to anyone from the age of zero to 18, and sexual development is a very natural part of childhood throughout that range.

Alix: This is how we get sexting between two 16-year-olds.

Kate: Exactly. Yeah, exactly. So as the detection system expands, not just to known material but to potentially novel material, it's [00:25:00] applying this very narrow understanding of child abuse, one that's visually obvious, to a bunch of images that I think can very understandably arise. So that includes everything from, like, a baby's bum on the beach, or parents and grandparents with pictures of their kids taking a bath, where, again, the genitalia might be in focus in that picture because you're taking a picture of a naked child in the bath, but it doesn't mean that it was taken for sexually malicious purposes or is abusive.

And then there's, you know, older children who are sexting. And again, you know, wanting to take an intimate picture of yourself to share with trusted people, whether that's friends or partners, is a very normative expression of sexuality when we look at studies about why kids do it. Which is a question that I often get from parents: like, why are they doing it? What if they just didn't do it? And this gets really relegated as, like, a frivolous response, but I think it's so important to actually acknowledge it and, like, hold it, which is: kids are like, it makes me feel good, it's fun, I feel connected to my friends and my [00:26:00] partners, and I like looking at my body, and it's an expression. I'm a person. I'm a person who likes to have fun.

Alix: Yeah. Yeah.

Kate: Um, and I think that gets kind of relegated as, oh, you're endangering yourself online, and that's where a lot of the stranger danger narrative seeps in as well. But to go back to the technological aspect of it: all of these images could potentially be considered CSAM, and ESPs are incentivized to over-report, even though some of these things, like consensual sexting between older teens, are completely normative. But again, in the eyes of the law, if your image gets detected, then suddenly you're implicated for sharing and possessing CSAM, even though it's your own imagery as a young person. So this law, enacted at scale through technology, ends up creating these situations where young people are expressing their sexuality in completely understandable ways, but the detection system [00:27:00] doesn't allow for all of that nuance to be captured. That's the over-inclusive nature of detection.

On the other hand is the under-inclusive nature of detection, which is that CSAM, or images of the naked bodies of children, isn't always the most useful indicator of actual maltreatment of children taking place. Things that we see online are often symptomatic of offline and other dimensions. There's a very strong relationship between poverty and neglect and why certain kids face greater online threats than others. There are children who have non-consensual images of themselves shared that don't involve nudity at all, so, you know, a young girl without her head covering, or boys holding hands. These are things that don't feature nudity, but in the eyes of the law they don't register as sexual abuse because they don't include nudity. In real life, though, these images are used to humiliate and shame and threaten young [00:28:00] people's lives all the time. So these are examples of where the law and the technology are actually under-inclusive in thinking about child sexual abuse.

Alix: So this feels like a classic problem where there's a growing intersection between technology and a thing that is deeply complex and sociological and political and contextual, and there's an expectation that technology, if sufficiently designed, can essentially take us in a binary way from lots and lots of problems to no problems. But actually it creates a lot of different problems, and then we become fixated on what is the technical system that can solve this stuff, and that then becomes the primary focus of a much more complex set of issues. I mean, it feels so classic. Like, basically every technological social system is like this: people really [00:29:00] want a technical fix, and those often don't exist.

Kate: Yeah, and also, the things that are technologically convenient for us to focus on: all it means is that that object is technologically convenient. It's actually not the best indicator of actual child sexual abuse. Some of the work I've been doing the past six months has been looking into this growing legal and technological ecosystem of detecting child sexual abuse material, again under the guise that CSAM is a good indicator of real child sexual abuse happening, which, again, it can be, but not always, and not always in really important ways. So we're chasing, counting, documenting, and archiving CSAM by rolling out these technologies, which more and more ESPs are adopting, and to my knowledge there is no independent, uh, audit from an algorithmic bias perspective of the performance of these tools. ESPs are just rolling them out [00:30:00] and they're not vetted regularly in terms of their accuracy across skin types or across differently gendered bodies and so on. We're basically expanding this ecosystem that's really good at detecting a certain object, which doesn't necessarily mean it's a good indicator of real child sexual abuse.

And research shows that a better indicator is thinking about maltreatment, things that are connected to offline dynamics, which child sexual abuse prevention experts have been talking about for a long time. And that involves training and capacity-building for youth-serving professionals and youth-serving organizations to identify harmful and normative sexual behavior early on; comprehensive and robust sex education that's tied to, like, an understanding of digital intimacy. And these are things that kids actually want. Kids want to talk about this with their parents and their friends, [00:31:00] but their friends and their parents and their guardians are not always ready or willing to, because it's

Alix: uncomfortable for the adults.

Kate: Exactly.

Alix: It's so interesting, the YOLO attitude to rolling this stuff out when both false positives and false negatives are so consequential. The fact that we fixated on scale before we fixated on the actual problem-solution match just feels really kind of dark.

Kate: Yeah, yeah. This is such a consistent story of technological solutionism, kind of rolled out to its most logical extreme, and all being done in the name of a good thing. You know, it's not like there's this evil round table of technological leaders. It happened because people were trying to do right by child safety. But what ends up happening is that we are dedicating more resources and giving more power to these, again, intrusive and untested technologies, and to people whose responsibility actually comes after an [00:32:00] abuse has taken place. So we're really looking at detection and investigation getting bolstered through this legal and technological ecosystem, rather than the people who are actually on the front lines doing prevention, actually talking to young people and interfacing with them on a regular basis, with the actual expertise to provide them with prevention. That ecosystem is shrinking globally, particularly under austerity.

Alix: It also feels like we're in this phase of consolidating how the internet and content are gonna work. Obviously, if Section 230 goes away, this is a very different conversation, and there are some structural things that might change dramatically and make a huge difference. But it feels like we're in this era where we had this kind of wild growth that was very chaotic, very VC-backed, so it happened really fast with very little care or concern for the impacts that these technologies were gonna have. And now there's this period of, like, society [00:33:00] going, ah. Um, and it feels like, you know, there are so many different panics around things, and so policies are being made, technologies are being made, policies that require technologies are being rolled out, and it feels like it's all gonna kind of congeal structurally, which makes it a really important time to get a lot of this stuff right. And it feels like we're not on the path to getting these things right. Like, I don't know, how do you feel about the next five, ten years, in terms of the infrastructure layer of how we as a society manage some of these risks and harms to kids?

Kate: Yeah, I mean, I am really concerned, um, as I'm sure many, many people are.

There is good faith in trying to roll out policies and do something. I think it's uncontested that we have rolled out these technologies to have such a grip on our personal and public and intimate lives, in ways that really disadvantage young people, and that something ought to be done. I don't think anyone is in disagreement. But in that rush for something to be done, there's a tendency to kind of [00:34:00] just dump everything and see what sticks, almost, or, um, just a rush to get something done without really reflecting. And this is, again, going back to the global trend towards all of these online safety acts: you look at the language in the bills, and it's not just about child sexual abuse, it's also about national security. They define illegal content as CSAM and terrorist content. Why these two things are grouped together in this one bill that's being described as a child safety bill is unclear to me, but that's doing a lot of work there, right? And to the point that you were just making, what I'm particularly worried about is that it's all taking a very policing-heavy approach.

If we look at the fuzziness of child sexual abuse material, particularly looking at context where images might be generated in a very normative context of young people exploring their sexuality, why is our first point of response to detect it and to [00:35:00] report it? Instead of our response being, what can we provide young people so that they can express their bodies and express desires in a safe and trusted way?

And we see that at scale in governments, again, allocating more money and resources to developing this technological infrastructure, while prevention, sex education, and all of these things are getting reduced. To me, the clearest indicator of that is Crisis Text Line, which is a text helpline for young people, um, who are contemplating suicide. It's a US mental health helpline that does a report every year. You know, they talk to young people who've accessed their services and ask, what do you wish that you had? And the thing that is really devastating is that in the 2024 report, the six things that adolescents said they wanted are: one, opportunities for social connection; two, engagement in music, writing, visual and performing arts; three, mental health services; [00:36:00] four, exercise and sports programming; five, books and audiobooks; and six, outdoor spaces in nature. So while we are funding more and more of the technological infrastructure, all six of these things are the ones that government is not supporting and that tech companies are constantly extracting from us. And these are the things that young people are saying they want and need so that they can live their lives happily and freely, and be able to take risks and be taken at face value. And that's the tragic part of this: we get so caught up in the law and the technology that we're not actually listening to what young people are literally telling us that they want, directly. Yeah, directly.

Yeah, directly. That's so,

Alix: I mean, so I hear you on, like, misallocation of resources, presuming resources are zero-sum and you have to choose between detection and prevention and, like, actual meaningful offline programming. I'm wondering, if we were to focus on the technology part of that question, setting aside that maybe it shouldn't be so law enforcement [00:37:00] oriented: what would you wanna see, in terms of either changing the technology currently in place for kids to engage with, or additional technology that could support parents, teachers, law enforcement, kids, et cetera? What would you wanna see?

Kate: Yeah, I mean, one is to actually go after the predatory business model of technology companies. Things like sextortion scams or other extortion scams: they target young people not because they have a sexual interest in young people, but because young people are gullible. One of the ways in which these scam schemes are able to take such advantage of young people, and also the elderly, is that they're able to generate really convincing profiles of themselves and of their friends, and that's all possible because of ad tech and data brokers that make young people's information so readily accessible, because we have an entire social media ecosystem that's based on monetizing personal data.

And [00:38:00] actually, there's a really clear complicity there for edtech. A lot of these technology tools that are used in schools are monetizing young people's information and selling it to the highest bidder, and that allows data brokers to make it available so that scammers can easily extract that information and make a very convincing scheme that 12- and 13-year-olds are easily going to fall for. And again, that's not because of their gullibility; it's designed to trick you, and it's designed to trick you effectively, because the information they're sharing is so convincing, because it's from real people and real lives. There's been a lot of interest in financial extortion schemes. To me, it's very strange to frame that as a youth sexual abuse issue when it's an extension of a scam issue, and that affects everybody because of the data-monetizing model that we have. So strong data protection for young people is strong data protection for everybody, and that is going to keep all of our [00:39:00] information safe. And it's been interesting that when we think about safety, we don't think about strong data protection.

Alix: I sent two packages to the US within, like, six weeks of each other with the same shipping service. It was with DHL. And within 24 hours I got a call from someone trying to socially engineer me into sending them money, using the contact information that I had put in the system with DHL. That happened to me twice with DHL. And I'm like, oh, so someone at DHL is selling up-to-the-minute information. And then I'm like, okay, DHL, there should be a place where I can go and say this is happening, and it's probably happening to thousands of other people, but there are no real consequences for any of this stuff. And I feel like we've just sort of accepted that it happens and it's on you to detect it. And the same is true for kids. I just find it wild that we haven't addressed the sort of structural, like, even, sorry, now I'm rambling, but, like, Tanya O'Carroll's recent case: [00:40:00] she won against Meta, and it basically said that, according to GDPR, she should have the option to turn off targeted advertising, because essentially it's a form of surveillance and a form of consent that she does not give. She should be able to use Facebook as a service and be able to turn that off. She won that case, and it got me thinking: it's kind of fucked up that targeted advertising as a model is acceptable for kids. How is it okay that if you have a kid account, they're surveilling you and advertising towards you? How did we get there?

Kate: Yeah, and the research is pretty, pretty strong and consistent that advertising is actively harmful to children under the age of six. Like, the fact that we have normalized that, that we can do targeted advertising to young people to that extent, is a reflection of how we've let these industry actors just roll things out without responsibility. There is an opportunity here to actually hold big tech accountable. I think where we get it wrong is that things that happen [00:41:00] to children are symptomatic of larger issues, and we don't ask ourselves: is this a tech-first issue where we can enact systemic change, or is this an issue that's surfacing through technology? That question about when we foreground tech is a really important one that we need our regulators and our civil society representatives to really think about. But when it comes to child safety, because it's so visceral, I don't see that additional step of asking, is this the appropriate technological intervention to the problem? Because the issue of sexual abuse has unfortunately existed before technology. We're seeing it surface in ways that are really disturbing, but, you know, overall crime is on the decline based on these big longitudinal studies, and also, what we're seeing is just the tip of the iceberg of a larger structural issue.

So technology is really good at showing us the symptomatic, but we have to do the extra work to [00:42:00] understand what the systemic harms are and which interventions are appropriate for them. You know, I get it. The symptomatic is so easy; it's the low-hanging fruit, it's right there, and it does get us to react in such a strong way, especially when children are involved. But as adults, particularly as adults in positions of power, we have to do the extra work of holding ourselves to account. How do we actually take responsibility for this fucked-up world that we're handing off to our kids because we let big tech get in the way? How do we offset that? And that means we have to ask ourselves the hard question of: when is it actually tech-first, and what's the actual systemic issue here?

Alix: Okay, I feel like that's a good place to end the kind of deeper conversation. I have a couple of rapid-fire questions that I feel like you've probably thought about. One: what do you think of social media bans for under a certain age? I think it's a, it's a bad idea. Do you wanna say a little bit more?

Kate: Uh, yes. Um, I grew up in Korea, where we have a real-name policy and there are bans on certain platforms, [00:43:00] and what ends up happening is that it just drives young people to less moderated online spaces. A ban, to me, is a digital version of abstinence: telling kids not to do something is gonna make them wanna do it more. It's your God-given right as a teenager to wanna do something that an adult tells you not to do. So there's the principle of it, and also the actual enforcement, like, how are we gonna roll that out? And I think that's where things like age verification come in, which is, you know, another, longer conversation. But what if you are a child and you don't have documentation, or you don't have trusted adults who can verify you? And why are we allowing big tech companies to gather more information? This is a problem that exists because tech companies have too much information about us. Why are we solving it by giving them even more of our most sensitive government information?

Alix: This takes me to my second question, which is: what do you think about age verification? 'Cause there's obviously a big push; there are companies that say, you know, with almost perfect accuracy, they can estimate, based on someone's [00:44:00] face, whether they're underage. What do you think about that? I presume you think it doesn't work and it's dumb and it's more surveillance architecture, but

Kate: Yeah, yeah, more or less. With any kind of facial recognition, we know that the accuracy is drastically different across skin types, um, especially when it's looking at younger people. And actually giving more data to facial recognition firms, so they can refine their models based on training data of our children, to me, is not the solution here; we're giving them more information. And if they are able to, you know, quote-unquote accurately guess, it's also because they're pulling in other information about us based on targeted advertising. So again, we're in the same situation where the problem was created because they're extracting too much information about our most intimate desires and interests and curiosities, and we're giving them even more information to do that, and doing it through these third-party firms that are unvetted and unregulated.

Alix: Okay, two more. One, I think I know the answer to this one too, but what did you think of The Anxious [00:45:00] Generation?

Kate: Okay. Hated it.

Alix: Um,

Kate: like yeah,

Alix: people were

Kate: listening to it. I mean, he got one thing right, sure, which is, you know, "the anxious generation." It does get to the core of it; I think he unintentionally got that one right. But, uh, please listen to the If Books Could Kill episode. Oh my god, I hate The Anxious Generation.

Alix: That's, it's so good. Okay, we'll link that in. It's so good. Yeah, they eviscerate it, but,

Kate: And actually there's this amazing lecture-off between, um, Candice Odgers and Jonathan Haidt. Um, I will also link that in the documents so you can see it yourself. Amazing. Very satisfying.

Alix: But I mean, I guess, so, the book itself, which, from other people who I trust who have eviscerated it, is shit. But in terms of it as a foil for galvanizing a conversation, do you feel like the conversation that it has sparked has been net positive?

Kate: I think it's been net negative, um, and particularly, it's made my life difficult. Um,

Alix: how dare they,

Kate: How dare, how dare they? Um, I mean, I think one thing that is worth saying, and this is probably kind of a [00:46:00] takeaway for the digital rights community: I do think parents' very legitimate concerns about young people, and about how to manage devices in a world that increasingly feels just so ungraspable to them, are something that the digital rights community particularly, but technologists too, have been really dismissive about, um, along with issues around safety. And these are very real issues that really impact young people's lives. There's a tendency to sort of hand-wave it away as moral panic or just nervous moms, and that's super gendered and super racialized. But these are real issues that are impacting families, and there is an opportunity here for civil society and technologists to be really clear about offering interventions that actually address the problem. Um, in the absence of that, we end up defaulting to, again, these spectacles and symptomatic solutions.

Alix: Okay. So my last question is for parents. What do you say in terms of, like, how to approach it, or are there any resources that you point people [00:47:00] to, when thinking about what to do or how to have different conversations with your kids?

Kate: I mean, when I talk to parents, what I find is that I end up just assuring them that they're trying their best, and that's all you can do.

I'll kind of end on this quote that I often end my lectures with. It's from, I think, a zine or a book, I'm forgetting which, called Trust Kids, about youth liberation. And the quote goes: kids are fully formed human beings the moment they come into this world. They know who they are and they know what they love and they know what to do. It's our job to be stewards of their humanness and provide the opportunities that they need to thrive. Not what you think they need to thrive, but what they tell you they need to thrive. I think that quote does a really great job of capturing that, you know, your kids actually do wanna talk to you about things that might make you uncomfortable, but they want you in their lives and they're talking to you. And it's our job as adults, as trusted adults in their lives, to actually listen, to grow the fuck up. Yeah, grow the fuck up, and to listen and be available when they do [00:48:00] reach out.

Alix: All right. Well, I think that's a good place to end it. Thank you. This was super kind of grim, but also, I think, kind of helpful, and I just appreciate the depth of knowledge you bring to this conversation. 'Cause I feel like, I don't know, I get frustrated by the way this conversation often happens; it oftentimes devalues what's actually happening, and there's just this huge mismatch between how these conversations are happening and the type of problem and the size of the problem that we're dealing with. And I just really appreciate how you look at this so holistically. So thank you.

Kate: No, I, I feel like we're chasing this like imagined child victim when real children are telling us what they want and what they need.

Alix: Yeah, totally.

I hope that was informative. If you're a parent, I hope it made you feel at least a little bit clearer about the kinds of conversations that you should be having with your kids. If you're a policymaker, hopefully it gives you some inspiration for how to approach this issue without [00:49:00] treading on other rights, and without getting too technocratic with the solutions you're trying to apply. If you're a tech politics person, hopefully this has given you some food for thought on how to navigate addressing reasonable concerns about kids online without succumbing to the moral panic around it. We'll leave it there. Thank you to Georgia Iacovou and Sarah Myles, producers of the show, and Prathm Juneja, who's doing a bit of pinch hitting this month while Georgia's out on vacation.

And thanks to you for listening, and we'll see you next week.