Baroness Beeban Kidron shows that tech isn’t exempt from society’s rules, that children aren’t de facto adults, and that data protection, child-centered design and digital rights enable the digital world children deserve.
How is the use of artificial intelligence (AI) shaping our human experience?
Kimberly Nevala ponders the reality of AI with a diverse group of innovators, advocates and data scientists. Ethics and uncertainty. Automation and art. Work, politics and culture. In real life and online. Contemplate AI’s impact, for better and worse.
All presentations represent the opinions of the presenter and do not represent the position or the opinion of SAS.
KIMBERLY NEVALA: Welcome to Pondering AI. My name is Kimberly Nevala, and I'm a strategic advisor at SAS. I'm so pleased to be hosting our second season as we continue to talk to a diverse group of thinkers, advocates, researchers, and doers, all working to ensure our AI enabled future puts people and our environment first.
Today it is an immense honor to be joined by Baroness Beeban Kidron. Beeban is an award-winning filmmaker. She's a Crossbench Peer in the UK House of Lords and the founder and chair of the 5Rights Foundation. She's leading the fight to protect children's rights and well-being in the digital realm. Welcome, Beeban.
BARONESS BEEBAN KIDRON: Thank you for having me.
KIMBERLY NEVALA: Now will you share the story of your journey from film to advocacy?
BARONESS BEEBAN KIDRON: Yeah. It was a bit of an accident; I have to admit. I'm someone who, for more than 30 years, was a movie director, I have to say, in a time when there weren't many women doing that role. But in between making narrative films and Hollywood movies and all of the expected stuff, I always used to make films that had a social interest, social purpose.
And so in 2012, around the time that smartphones hit the price point at which a child might be given one, I noticed a shift in the young people around me. And I became interested in that. And I made a documentary film called In Real Life.
And in the course of making the film, I spent hundreds and hundreds of hours with teenagers in their bedrooms, doing whatever it was that they were doing online. So I have to say it involved watching pornography, playing games, going to meet ups, falling in love, you name it, I watched it.
But in the course of that experience, I also interviewed a lot of experts. It was on that journey, when people kept on saying that the utopian vision of the internet was that all users would be equal, that I realized the problem. And the problem, as I saw it, was if you treat all users equally, then de facto you treat the child as if they're an adult. And that seemed to me regressive.
It also seemed to me to explain a lot of the problems that the young people I had spent all this time with were having. It seemed to be the reason; it seemed to underpin them all. And at that moment I had my light bulb moment, which said: actually, we're not all equal. Some of us are children, and as it turns out, one in three of us online are children.
KIMBERLY NEVALA: That's an amazing statistic. Actually, I don't know that I've ever thought about what percentage of the population online would be children. So you've - as you've mentioned - been speaking for at least a decade now about the cultural implications of our increasingly digital lives. Now, in the broadest sense, how have you observed digital changing everyone's sense of self, as individuals and as part of a bigger human collective?
BARONESS BEEBAN KIDRON: So it's a really great question.
Preemptively, I'd just like to put out there that it's important we recognize that technology does not sit outside of broader changes to society. Technology is a tool for, and an expression of, globalization. It's a tool for, and an expression of, 21st-century capitalism, et cetera. So in that regard, many of the issues associated with it are not technological but to do with the business model that has grown out of those two things in particular.
And I think that if you then move to the tech sector and ask what's unusual about them, it's not that they're innovative and creative and bring these things. What's unusual and particular about them is that they have successfully argued for an exception to the way companies are treated. If you think about pharma or oil and gas or defense and food and so on, all of those sectors trade within society's rules, and tech argues it should not.
So the reason I say all of that, in order to answer what was a slightly different question, is to provide context for my real feeling, which is that rather than predominantly connecting us, which is the promise of this technology, it has in practice divided us. Not because it must - I mean, look at us: we had some trouble signing on, but here we are talking across space and time - but because those with the most reach and depth are driven by commercial concerns.
And that means that, for them, any price we pay as a society or as an individual is worth it for their bottom line. And at an extreme, that can mean public disorder like Capitol Hill or murder like Christchurch or death as in the suicide of young people or the spread of fake news et cetera.
And so I guess what I'm really adding up to saying is that it's a pretty cynical business, which first atomized us and then linked us according to the needs of what are essentially advertising platforms, rather than linking us according to our human needs. So that's the shift. And I really think it's important to put it in the broad societal context, because by doing that you also start to speak to what the solutions are.
KIMBERLY NEVALA: You mentioned you were sitting with kids and with teenagers. That had to be eye opening on so many levels. It also shows that you have the ability to really engage and gain trust with folks. You mentioned you started to see some of these problems. So what were or are some of the discrete problems and lessons you've observed that we're teaching children due to how they engage with and how we engage them through technology?
BARONESS BEEBAN KIDRON: So I think the first thing, and it's very unsurprising, is the culture that you have to be in touch at any cost. And that culture is not accidental. It is designed into the system. Designers call it engagement, networking, extension of time, et cetera. There are KPIs on all of these things.
KIMBERLY NEVALA: Lots of nice sounding things we can measure.
BARONESS BEEBAN KIDRON: Yeah, exactly. Nice things you can measure. But for a child - and I do remember a particular young child who just popped into my mind, actually, who said, “Miss, can you just tell me how long I can leave it before I respond to a text?” - for kids it is this imperative to be in touch: I am going to die if someone takes away my phone. Parents who take them into the woods to unplug find the gaming console has come along. That's a really big piece.
And then you get into other things, which I'm sure we can return to: the need to be thin; learning to hurt themselves; being insecure and self-involved; that sex sells, spreads, shares, and pays; competition, and so on.
I could give you a very long list, but I also want to do the other thing, because - and I'm going to be like a broken record here - this is not about technology. It's about uses and abuses. This is the result of companies that have data extraction on their mind. Because it's great that you can be a budding musician or filmmaker and get your stuff out. It's great that you can speak to your granny across the world. It's great that you can find out health information that perhaps the parents or carers (caregivers) in your life don't want you to know, or facts about climate change, or indeed anything else.
So the problem is about the dominant pressures and we at 5Rights do an awful lot of research, and we read a lot of other people's research. But frankly, you don't have to be an expert or an academic to know how kids are actually spending their time.
So all these beautiful things that they could be doing are actually not what they are doing. And kids tell us, parents tell us, teachers tell us. And so, I'm afraid, does the share price of Facebook and Google tell us what they're actually doing.
So I think the pressures are very extreme. We've seen quite a record of it this week in the press with the Wall Street Journal revelations. We've done a lot of work here at 5Rights that shows these kinds of pressures. And we engage very, very closely with kids, and they tell us.
KIMBERLY NEVALA: It'd be interesting to hear it through their voice as well. What do you hear from them directly? What do they say?
BARONESS BEEBAN KIDRON: They feel that they are under immense pressure to look good, to be on, and to be popular. So unsurprisingly, the three mantras of this particular business model - network, interact, and spend time - are the very three things at the top of the kids' list. So unfortunately, the designers are doing a fabulous job, and it is playing out in the real lives of kids.
I think the other thing is actually around sex, sexuality, and some of the pressures around that. And that manifests in some very, very ugly ways: behaviors and assumptions shaped particularly, I would say, by what industrial-level porn tells boys about how they're supposed to behave and what they're supposed to like, and then the experience of girls under those pressures. But not exclusively that, just to be clear.
And we've seen here in the UK a phenomenal rise of children both self-creating child sexual abuse material of their own and also children being abused. In a recent report that hit the headlines here, some girls were being asked as often as 11 times a night to post naked pictures by people they knew of their own age.
And the sexual economy has got younger and younger and younger. And the thing I would say to people who are listening is, rather than seeing this as an anomaly, something that is just part of life, just imagine if young boys came to the front door of your apartment and knocked on the door 11 times in one night saying, hey, is so-and-so in? Will she just come to the door naked? Someone would get their nose punched. It's created an alternative universe which kids necessarily find difficult to navigate.
KIMBERLY NEVALA: I can only compare to back in the day - we're not so young as we used to be - when you could disconnect from it. You walked away. Peer pressure was there; it's always been there. But I think there was still a differentiation between my sense of self and my sense of self-worth, which wasn't related only to how I was being judged and seen and connected within that broad community. That was always a piece of it, but this is almost shocking.
The other thing you said there that was important is that we tend to think of certain behaviors and interactions in the digital ether as somehow being different than what we would agree to in the real world. Is that right?
BARONESS BEEBAN KIDRON: Yeah. And I want to pick up on your point.
There was another thing about the division of self. Even if you were under pressure socially, you went home. If you were under pressure at home, you went to school. If you went to school, maybe you had some friends. In the connected life of a child now, all of those things happen through the device itself. They're all in the same place. So you can't even do your homework without being bombarded by, perhaps, the question of your sexuality. It's a very muddled space. Part of your sense of self was having a little bit of recovery time to actually go, no, I'm not that person those people are telling me I am. I am this self.
Whereas a lot of children talk to us about an online self and an offline self. But what they actually end up describing is their perfect self and their depressed self, rather than a public self and a private self. And it's really quite disturbing at a very individual level.
KIMBERLY NEVALA: Yeah, it is fundamentally disturbing. And it causes me to think about why we as an adult collective have been so laissez-faire about these issues. You've obviously talked about the commercialization and the business model of the internet, but I'm sitting here right now going, wow, I'm not helping. I was not necessarily unaware, so why am I not active? What's been keeping us from--
BARONESS BEEBAN KIDRON: Well, first of all, let me say join the fight Kimberly. You are welcome in our team.
KIMBERLY NEVALA: I'm coming.
BARONESS BEEBAN KIDRON: Come on down. And I really appreciate this question because to be honest, I've spent quite a lot of time thinking about it.
I have to say, in my political life, my professional life, and my personal life, it drove me crazy that people couldn't see what I could see. And I have come to the conclusion that it is about a few very key things.
I think the first thing is that it says something about adults' own entrapment. They're involved with their own false god of convenience. I will point out it is the convenience of the tech company rather than yours: they've outsourced lots of things for us to do rather than doing them themselves. But that's for another day.
Cheapness, which is another thing, because actually they've outsourced the cost as well onto either individuals or society more broadly. And we can come back to that. And a human love of the new. So I think there's one piece, which is around people wanting and loving and feeling that they must love the tech.
I think the second thing is that they have literally armies of behavioral scientists. I am an addict to the news apps. I go from one news app to the other news app. And Wall Street Journal may say something different from New York Times and then I must see the Telegraph and the Times and the Guardian. And I mean it's insane. Nobody needs that much news and it's all the same news.
So we are addicts to the system, and therefore, it's very difficult to separate ourselves and our needs and our agency from that of people who are growing up into it and don't have the offline experience and don't have other interruptions in their lives that might take them away from it. And then the last thing I would say is that we are really very bad in the modern world about numbers and about science and about facts.
And it's a horrible white noise that we're a little bit frightened of.
And as we said in the beginning, I was a movie director for 30 years. I can tell you that people would sit for hours and listen about the most inane details of movie making, whether it was how we do the makeup or what the call times are or who's first on the call sheet or all the nonsense they will listen to. I'm not talking about the interesting bits that we'll all listen to. But they'll listen to the egregious behavior of movie stars as if it were a new chapter in the Bible.
But when it comes to this, they go, oh, that's someone else's business. And as someone who's brought in data protection law, all I can say is: meet someone at a party, sit at a table, and they say, what do you do? And I go, data protection law. And they literally run for the exit.
So I think there's something more profound about our cultural move away from understanding how things work. Because actually, when people do take the time - and some of the stories we've learned to tell here at 5Rights, whether it's Twisted Toys or creating images; we made a newspaper about data and a kids' book about the code - when people begin to understand it, they are outraged. Outraged, angry, and fired up. And they think it is quite wrong.
Just a really tiny example, when I explain that an algorithm could determine that an adult who's a stranger should be introduced to a child, as a random thing, they go shock horror. Is that not against the law? And I go, no, it's a feature on 75% of social media sites that your children are using. And they go something must be done.
So I think it's a lot about storytelling, and a lot about these other underlying cultural things. And, astonishingly, I can happily report that sometime here in 2021, people have got a grip and begun to understand. And I am very, very happy about that.
KIMBERLY NEVALA: That's excellent and I want to talk about some of the things that you're really doing and moving forward with 5Rights that I think are progressing that narrative. Because to some extent, it strikes me a little bit like a traffic accident where you don't want to look, but you can't help it. And then it's so horrifying. You still want to close your eyes.
And then you've got the abstraction into the digital. And you think, well how bad could that be? But as you said, I'm actually just not telling myself the story of "would that be OK in real life?" There are so many pieces and parts there. So you want to shift that digital paradigm for children full stop. Tell us about 5Rights and the vision and mission of the organization. And then we're going to talk about some of the legislation and that Twisted Toys campaign that you talked about.
BARONESS BEEBAN KIDRON: OK. So 5Rights was my response to no one being interested in this agenda. It was like, OK, I'm going to do it. And I always like to tell the story of my poor husband: he sat me down and said, you're one of the most successful directors in the world - are you sure? One middle-aged woman against Silicon Valley? And I always have that in my head, because I think we all have a duty to act when we see. And it just sometimes happens that you see. And that was my duty to act. It wasn't a choice.
So 5Rights was born out of that moment, born out of that experience, and born out of that particular time where I saw it as being a systems problem. Not a problem of bad actors and users. Those problems do exist, but it's a systems problem.
And so our mission at 5Rights is to build the digital world children deserve. And I think that's really important, because we do think they should be digitally involved. And we do think this is a new world order in which children have the right to participate. We are also very interested, and do some work, in the global south to make sure that children have affordable access. So it's not naysaying here; it's "what quality is that experience?" And we have three gazes.
The first is data protection and privacy, and the reason for that is: follow the money. If you make your money through data, you can protect people through data, because the extraction of data is the thing that all the design features actually hang on.
The second is child-centered design, which is around assessing and mitigating, answering one question and one question only: if the person at the end of this service or product or feature was a child, how would it impact them, and how would you design it differently for their benefit? And if you just keep on asking that question - whether you are an e-commerce company that recommends a knife with a school bag, because kids have been buying those together (true fact), or a social media company with a map feature that shows the real-time location of a young child to the public - and start thinking, is that a good idea? You start coming up with different ideas.
And then thirdly is children's rights. And our argument - and this is a bit difficult for America, which, along with North Korea, is a non-signatory to the Convention on the Rights of the Child - but for the rest of the world, this is the foundational document for how you behave in regard to children, and its fundamental overarching provision is that you must act in the best interests of the child.
And I had a wonderful conversation with someone at one of the big platforms who said, we thought that was so abstract until we started working through some use cases. And actually, every single time, we really knew what was in the best interest of the child.
KIMBERLY NEVALA: Right.
BARONESS BEEBAN KIDRON: Actually, a rights framework has been very, very useful in legislation here in Europe. And 5Rights - we're very proud of this - drafted, on behalf of the Committee on the Rights of the Child, in huge consultation, with workshops with children in 28 countries, et cetera, what is called a general comment. And General Comment 25 sets out how children's existing rights apply to the digital world. That made it official, for the first time, that children's rights exist online and offline, virtual and real, et cetera. And it was adopted formally by the UN family earlier this year.
So those are the three gazes, but they're all about ending the era of exceptionality and putting it in the societal, treaty-based, rules-based world. And it's all about looking at it at a systems level, not going, oh, that poor child, that happened to that poor child, and that was that particular platform. Because actually almost all of these things are systemic.
KIMBERLY NEVALA: Yeah, that's a really fundamental and important shift in the lens and in the point of view that you take. And it changes the way we approach that conversation. This is not folks with pitchforks storming the gates, but a very productive conversation, even in the face of what seem like insurmountable odds.
Now, I want to come back to this idea of youth as stakeholders as opposed to subjects. But before that, can we talk a little bit about some of the really important legislation? You mentioned the code, and I think one of the most important pieces of policy and legislation out today has been the Children's Code. Can you walk us through its evolution and impact?
BARONESS BEEBAN KIDRON: Yeah. So the code came about in the UK because, in anticipating Brexit, we brought the GDPR into domestic law. And in doing so, it had to pass through Parliament. And I had to look at it, and what I saw was shocking to me. We talked about this right at the head of the show: one in three users - one in five in the UK - are under 18. So that's one shocking fact that didn't get much attention.
I think the other shocking fact is that because of a piece of US legislation brought in around 2000, called COPPA, children online have traditionally been considered children only at 13 and under. And you don't have to go very far: ask any parent whether they think their 13-year-old is ready for the big, wide world. If you're really going to insist on it, I'd love to see them send them out into the world and have them come back in 10 years to see how that went.
So there seemed to me to be these very crucial gaps, an inattention to children. But there was this one thing, recital 38, and it said children require special attention. It was phrased slightly differently, but that's what it meant: a special consideration. And I thought, OK, what would it look like if we had a data code that was that special consideration for children?
And I sat down, and I tried to imagine what I thought it would look like. And first of all, I thought, oh I would define a child as someone under 18 because that's the definition under the UNCRC. And we will deliver on the best interest because the convention also says that.
And then actually, instead of having legislation that is about sites directed at children, we apply it where children are going to be, because children under 18, largely speaking, spend most of their time on things not designed for children. We're not talking about improving the Disney app. We're talking about the entire spectrum of where children are.
And on that basis, I put forward an amendment to the Data Protection Act that was called the Age-Appropriate Design Code: we often call it the Children's Code. And it really set out a rights-based approach to data protection. And in negotiation with the government, and finally with the extraordinary commitment of the regulator once it was through all its hurdles and all the battles and all the transitions and so on, that translated into 15 standards. And I have to say 14 of those 15 standards were the standards that the minister agreed to in parliament when we passed the amendment and the regulator added one.
But I think what is interesting is that on the journey there were a lot of naysayers, and a lot of pushback: everything from "this will be the end of the internet" to "we are leaving the UK" - grandiose, ridiculous claims - alongside people with very, very legitimate concerns that children's experience might be dumbed down, or that they might be locked out, and so on.
But actually, as you've seen in the course of this summer as people started to actually comply with the code, what people began to understand is that it is in fact a safety-by-design system. What it says is, hey, look at your service. Is it likely to be accessed by children? And it lays out what that might look like. It's very simple.
If your market research says so, if your knowledge of your customers says so, if academics say so, if children say so, then it is. And if it is, do this risk assessment and look at these four areas of risk. Then look at the provisions and see whether your risks and your provisions are bumping into each other. And they're very bold, but very straightforward. They say: you may not share a child's data with a third party unless it's in their best interest. You must not profile them to deliver detrimental material. Do not show their real-time location; have GPS off unless they turn it on, and put up a warning. Don't let their parents surveil them unless they know they are being surveilled. I mean, they're very straightforward, basic things.
And what we've seen as a result, right up at the other end, is that Google now turns SafeSearch on automatically for kids under 18. You would ask yourself why it wasn't on before, but it is now. They've turned off autoplay; that's been a long beef of mine. Direct messaging is off for under-16s on TikTok and Instagram. And they've stopped behavioral advertising towards kids on Instagram and Google, et cetera. I can't remember them all.
But these huge, huge platform changes, each of which had been deemed impossible in and of itself, have been coming like tumbleweed through the summer - plus a few more private ones that I happen to know about, which they didn't want to draw attention to, and some still coming, I can assure you.
And also a whole load of well-being things: time out, time off, and a lot of work on terms and conditions. There's a long way to go; it's one step. But from my personal point of view, what is the biggest change of the code? It's that you can see it can be done. Because this is an entirely engineered system. It is human-made. It's optimized for advertising. Well, let's optimize it for well-being. You could optimize it for zebras if you wanted to.
The point is we have to decide as a society what kind of digital world we want. And the point is, right back to your first question, that actually they are determining a different kind of human interaction and we really must ask ourselves whether it's worth paying the price. And I say, in relation to children, no.
KIMBERLY NEVALA: It's almost breathtakingly simple in some ways, and so incredibly complicated. And I love that re-emphasis as well: we tend to assume that the systems we have built have become systems we cannot unbuild or rebuild, even though they didn't exist not that long ago. So something else can also exist not that far in the future. I don't know if that's because we see them as so overwhelming and pervasive at this point, but I'm sure that's been true of every technology along the way.
Now so many things we could talk about there. I have to tell you the Twisted Toys campaign is genius and terrifying. And clearly your skills as a storyteller and understanding how to do that come into play. Tell us about the intent of that campaign. And if folks haven't seen this, you've got to look at it because it's evil genius in all the right ways.
BARONESS BEEBAN KIDRON: So twisted-toys.com. Well thank you for saying it's genius. I have to say we had a lot of help. During COVID, there were a few really creative people who worked for absolutely nothing to help us with that. So I want to do a shout out and just say everyone who worked on Twisted Toys is a genius.
But basically, this was the essay question. I said: if I spend 20 minutes with anybody in the world, I can explain what 5Rights does and why it's so important, but it is really hard to do it in writing. And it is really hard to get people to understand it our way, partly because of an enormous lobbying effort, and perhaps a little, as you just said, because people always imagine the world is how it is until it is something different. And perhaps it's the role of art and artists and creators to show us the future.
And so I said, I want 60 seconds. I'm going to tell you the 20-minute version; you give it back to me in 60 seconds. It took 10 months, but what came out of the process was a catalog of toys, each toy representing a different problem of the digital world and, in doing so, showing how preposterous it all was. We started off with the catalog, and once we had it, we thought, OK, we're going to make adverts for the catalog.
And we made videos advertising four of the toys. I won't do the whole catalog, but we have Stalkie Talkie, the toy that helps adult strangers find children. I mean, that's dark, but it's both funny and to the point. We have Share Bear, the teddy bear who monitors absolutely everything, and when you feel really, really sad, he gives you an advert for something that might make you feel better. And if you feel fat, he offers you food, et cetera.
And then we have the Wakey Wakey Light, which wakes you up to tell you irrelevant things you don't need to know. And, in a way, the simplest and best of all is "My First Terms and Conditions." It's a book this big that speaks to the terms and conditions. And if you hang on one moment--
KIMBERLY NEVALA: And for those of you, while she runs off, when she did, "this big," it was a book that was at least six inches thick.
SHARE BEAR: You look so sad today. Here's an advert about losing weight.
BARONESS BEEBAN KIDRON: And that is Share Bear.
SHARE BEAR: Smile! You're on camera.
BARONESS BEEBAN KIDRON: Anyway, it's a ludicrous thing you can see on the website. And I think that what was so great, and we were so happy with what we did there, was that people from all over the world shared the adverts and it brought us a lot of new friends. And a lot of people who said, it was so wonderful to see something so dark treated with humor.
And obviously, as a children's charity, we're always worried about taking on these really, really serious issues - we deal with families, sometimes with really very troubled children, and so on - but actually we got such a lot of support from that community, saying: you've explained our pain in a way we want to see it. So we were very proud of that, and thank you for saying it was genius.
KIMBERLY NEVALA: Well done again. And I also think such a testimony to the power of storytelling and giving voice to things in ways that make them palatable or at least make it possible to sidle up to them and take a look straight on. And humor and storytelling all play a part in doing that. It doesn't diminish the criticality of the issue, I think, but it welcomes people into the conversation.
We talk a lot, when we're talking about AI in general and making it equitable or fair or sustainable, about the need for diverse ecosystems and stakeholder groups and bringing people into that conversation. And it strikes me that with the population you work with, youth and children, it would be very easy for us, and for you as policymakers, to view and engage with them as if they were subjects, as if you're representing them, versus engaging them as stakeholders and giving them their voice.
Is that an important distinction: engaging youth versus representing youth, or treating youth as a subject versus a stakeholder? And how do you work with that population to make sure that you are giving them their voice in an appropriate way?
BARONESS BEEBAN KIDRON: Yeah. Well, I think it is a huge distinction. And I think, bizarrely, it's a particularly huge distinction when you're talking about a population that does not traditionally have access to the same tools of expression as other organized groups. I'm not going to say it's only kids, because that's not true, but it certainly is true of children. They don't get to vote, for a start.
KIMBERLY NEVALA: Right.
BARONESS BEEBAN KIDRON: At every level, they don't have the same representation as adults. I think maybe I could just tell two very quick stories. Number one: if you look at the 15 standards of the code, I can give you the name of a child against a lot of them. The names are not in the code. They're not in the legislation. They're not in the publicity. But in my mind, there is Molly. There is Peter. There's Benita. I can tell you.
And that is partly because we do very different workshops in very different places, in different parts of the world. And as children give us their experience, we begin to see these problems. And then as you solve a problem, a name attaches to it. In particular, I believe that we mustn't nudge a child to lower their privacy settings. It's the first time that negative nudges, or dark patterns if you want to call them that, are in law.
But that's actually a young boy called Peter, who was playing a game with no save button and got into a really terrible, traumatic family situation because he wouldn't go down to dinner, again and again and again, because he was always an hour in, or two hours in, or two and a half hours in, and he couldn't save the game. And so it actually says in law that you must be able to have a save button. This is what I'm talking about when I say child-centered design. That child blamed himself for the divorce of his parents because of no save button. It's a long story; I won't tell it here. But my point is, these are real things. So first of all, that's one thing.
I think the other thing is-- actually, only last week, I saw a young man, I'm not going to tell his name here, but he was one of the kids in my original film. He was 15 when I met him, and when he was 16, we went to meet his online lover, whom he'd never met in real life. And he came back last week to tell me that they are engaged. And he still does some of the IT work at 5Rights. He's now a young man of 24. That's how long I've been doing this.
KIMBERLY NEVALA: Wow.
BARONESS BEEBAN KIDRON: And he said to me, I am so proud of my involvement with 5Rights, and of being part of the reason that the world has changed. And I see it all because I work in IT, and I'm just so proud to have played a part.
And all I can say is the UK is littered with people like that and actually, some other countries are littered with people like that. And it is the thing that we are most proud of because you started this conversation talking about groups and about this idea of the collective. And I do this day in day out. All the team here at 5Rights are committed in that way. And we all see ourselves as conduits for a better digital world for young people. We are not individuals. We are conduits. And that is humbling and it's really quite good fun.
KIMBERLY NEVALA: Yeah. I'm staggered, and rarely speechless, but I've come close a few times today. We could go on for a long time, and I may have to entice you back.
Both personally and on behalf of the human collective that we're all a part of, just thank you for leading this charge and speaking up and engaging the rest of us, so that we are speaking up and showing up for those who are just too easily overlooked as we craft, what I was going to say is our digital future, but really it's just their future, full stop. So thank you again.
BARONESS BEEBAN KIDRON: Absolute pleasure, thank you.
KIMBERLY NEVALA: Well next up, it's going to be difficult to top that but we'll try to match it as our discussion continues with Giselle Mota. Giselle is currently a principal consultant on the future of work at ADP. We're going to be discussing automation, agency, and learning in the age of AI. This will be another dynamic and forward-thinking conversation, so subscribe now to Pondering AI, so you don't miss it.