Disruptive Voices from UCL Grand Challenges

In the first episode of our Inequalities mini-series, host Ethne James Souch is joined by Prof Jessica Ringrose, Dr Marc Tibber, and Sophie Wilson to delve into the complex intersection of social media, mental health, and regulation. The conversation spans topics from online safety and digital literacy to phone bans in schools and the lived experiences of neurodivergent communities. It explores how we can better support young people in navigating digital spaces without reinforcing inequalities. Drawing from research, policy, and lived experience, this episode offers a rich and nuanced discussion on what it means to build an inclusive and just digital future. With insights from two UCL Grand Challenges-funded projects, Ethne and guests consider the risks and possibilities of online life, the evolving landscape of regulation, and the power of co-production in developing ethical and effective interventions.

 Date of episode recording: 2025-07-31
 Duration: 00:33:15
 Language of episode: English
 Presenter: Ethne James Souch
 Guests: Prof Jessica Ringrose; Dr Marc Tibber; Sophie Wilson
 Producer: Annabelle Buckland
 

What is Disruptive Voices from UCL Grand Challenges?

In this series, guests from across UCL and beyond share their innovative solutions and ideas for addressing societal challenges, discussing topics from a cross-disciplinary perspective and inspiring and encouraging us to think differently about issues of local, national and global concern.

Ethne James Souch: Welcome to this episode of Disruptive Voices from UCL Grand Challenges. I'm Ethne James Souch, your host and coordinator for UCL's Grand Challenge of Inequalities.

Through this mini-series, Challenging Inequalities: Conversations on Inclusive and Just Futures, we invite you into conversations with UCL researchers and external individuals who are working to build a more inclusive and equitable future. In today's episode, I'll be speaking to Professor Jessica Ringrose, Dr. Marc Tibber and Sophie Wilson to explore online safety, social media's impact on mental health and how it connects to broader issues of inequality.

Jessica is a Professor of Sociology of Gender and Education. She has expertise in youth digital cultures, digital literacy and educational interventions prioritising children's rights. She was recently awarded Grand Challenges funding for the project Smartphone Free Childhoods? Supporting Young People, Teachers and Parents to Navigate Smartphone Bans in Schools.

Marc is a senior clinical psychologist and lecturer in clinical psychology with a special interest in adolescent and emerging adult mental health. Recently his work has explored the role of interpersonal and social process in mental health and how issues of connection and disconnection affect individuals and communities, with a particular focus on the role of social media.

Sophie is the Head of Government and Society at Ipsos. She is a qualitative researcher and evaluator with expertise in public service delivery and regulation, including extensive research exploring public sector collaboration, place-based policy making, online safety and media regulation. Her published research 'Online communications' explored young people's experiences of receiving sexualised messages.

So thank you so much for joining me today.

Sophie, before we dive in, can you help set the scene for us? What do we know about how people in the UK are feeling when it comes to social media and online safety? And what are the polls telling us about public attitudes?

Sophie Wilson: Thanks, Ethne. And yes, so we know quite a lot about online safety and how this is a really significant concern for a large majority of the UK public at the moment. And actually, over the last few years we've seen that really come to the fore as an issue, and it's not niche; it's a mainstream worry. So previous polling from a few years ago found that over four in five adults are concerned about encountering harmful content online, including things like racism, misogyny and content encouraging self-harm. And in fact, two in five reported seeing such content in the last month. So a real cause for concern there, and this translates into a strong desire for more robust action from social media companies. So there is appetite for the regulation that we've seen come through in things like the Online Safety Act.

And Ipsos research has revealed that nearly 7 in 10 adults, so 68%, believe that social media companies in particular should do more to protect their users. And this is pretty consistent across different groups and across the political spectrum. And when we talk to people about what they want to see, it's really clear there's a desire for better alignment between the online and offline worlds. Actually, this isn't really a divide anymore: we live our lives both online and offline, and that's how we experience reality in our day to day. And this is especially true given how much time we now spend online, including on social media.

So earlier this year we looked at the Ipsos IRIS platform, which is a passive measurement tool gathering insights into what audiences are doing online: in the background of people's devices, it sees what sites they're going to and how long they're spending there. And that highlighted that 15-to-24-year-old Britons spend an average of 122 hours and 57 minutes on social media a month. So that's about five days in total, or an entire working week, that that younger age group is spending on social media every month. In terms of priorities, keeping children safe online, unsurprisingly, is really top of the list, both for parents and non-parents. So 83% of adults think social media platforms should have a legal duty to protect children on their sites, and that has now come to the fore in terms of regulations around the Online Safety Act.

But at the moment, three in four parents are still concerned about what children are seeing, hearing or doing online. And this is compounded by the fact that one in seven are not confident they know what their children are encountering online. And further qualitative research that we've done with parents has shown that while parents can identify a range of risks, they don't always feel they know how to support their child in being safe, and might feel a bit underconfident in identifying potential solutions. And in an educational setting, this has meant that a majority of parents have told, or considered telling, their child not to take a smartphone to school with them, with one in five going as far as confiscating a device before they go to school. Similarly, three quarters would support schools requiring children to put phones into deposit boxes during lesson time. So quite a firm line is being taken on smartphone devices, drawing that distinction between lesson time and socialising.

Ethne James Souch: Thank you, Sophie. There are loads of interesting statistics that you've drawn upon, so in the show notes we'll link to any relevant material that our listeners can read further. So, moving on to you, Jessica. You've spent quite a lot of time researching, as I mentioned in your bio, youth digital cultures, digital literacy and tech-related violence. So perhaps you've seen the good, the bad and the ugly when it comes to all things technology. But when we're looking particularly at social media, how worried should we be about its impact on young people?

Jessica Ringrose: I mean, social media is here to stay. It's reshaped how we relate to one another and how we form our identity. I call this the post-digital, because we're always now in the digital; there's no real outside. And this is especially true for the so-called digital natives, you know, young people in society. So the media scholar danah boyd wrote back in 2010 that young people heart, or love, their social media. It's really a part of their daily lives. It's entertaining, it's informative, it's hugely enjoyable, and it offers them connectivity and friendship in ways that might not otherwise be possible.

And some social media platforms really do present very particular risks to everyone, because they're using recommender algorithms which push content. And if you talk to young people themselves and you prioritise their voice and their experience, they do themselves describe it as feeling addicted, and, you know, say that there's some content that's brain rot and they prefer not to consume it, or that they're being fed content they maybe don't want to see. But they generally say that they can block such content and that the algorithm improves.

Another thing about the algorithm, though, is that sensationalist content is boosted by it. And young people really talk about the concerns they have about hateful comments online, lack of moderation and, particularly on platforms like Instagram and TikTok, misinformation.

But basically my position is they really just need more support on this issue. Snapchat purposefully uses game-like design features to lull young people into expanding their networks beyond what they can handle, or what is safe, or even what they feel comfortable with, and it can open them up to adult predators.

And so I guess you can tell that I feel it's very important to get to a granular level with these issues, to show what young people's actual experiences are. They are critical consumers; they're agentic users of social media. So how can we actually support them, rather than having these risk-and-harm narratives that might obfuscate the concrete realities?

Ethne James Souch: Thank you. Yeah, so it's about creating an environment to support the children. As you say, social media is not going anywhere, and perhaps we can expand on that a little bit later on. So, Sophie, as online safety is a key policy area that Ipsos has conducted research on, can you tell us a bit more about this? Is it saying the same thing that Jessica just mentioned? Could you expand a little bit more?

Sophie Wilson: I think it's very similar to Jessica's points there. So we have been conducting research on online safety for a range of different clients for many years now, both before the introduction of the Online Safety Act and following it: general public attitudes and expectations, what people want to see from future regulation, as well as deep dives into the experiences of particular groups, whether that be children and young people, parents or those who might have been affected, as well as stakeholders and their perceptions of what regulation should look like. I think where perhaps there's a broader conversation is actually what that looks like in practice, and how we can support people, both adults and children and young people, to have the tools that enable them to navigate online spaces effectively. So I think that's the bit where I'd be interested to hear Marc's and Jessica's insights, because I think there's a lot of appetite from the public around that.

Ethne James Souch: I just want to jump back slightly and go into, as I mentioned, the Grand Challenges-funded projects. So I'll come to you first, Jessica, linking to your Grand Challenges-funded project, Smartphone Free Childhoods. Could you tell me a little bit about this project and why you think this topic is important to research today?

Jessica Ringrose: Yeah, sure. So this project really arose in relation to the global rise of smartphone banning in schools, and also parent groups who do not want their children to have or use mobile phones, or who restrict their use up until a certain age. So in England you might have heard about a group called Smartphone Free Childhood; they use a pledge and a movement to promote delaying smartphone ownership until age 14, and it grew really rapidly in 2024. There's a similar pledge in the USA called Wait Until 8th (that's the 8th grade), and there are many other examples as well. So as an educational researcher I really wanted to look at how schools were navigating the current regulatory policy push for education to have smartphone-free environments: what kind of policies were put in place, how they were put in place, how they worked, and how they affected the way young people use their phones both in school and away from school. And I had some hypotheses about what would happen, or what the relationship would be between the restrictions and the young people.

So basically, we know from loads of health research that abstinence approaches don't work for mitigating other health issues such as underage sex or drug or alcohol use. So denying young people access to social media will not ultimately stop them, nor will it support them in better and safer uses of technology. And I was concerned, and others have been concerned, that this attitude of regulation, restriction and punishment underpinning the way a banning policy might be implemented in schools could erode trust between young people and the adults in their lives.

I was also looking at whether those attitudes and this reduction of trust meant that young people might not feel as comfortable approaching trusted adults, e.g. teachers or parents, when they experience something harmful online, for fear of getting into trouble. And I was then interested in how vulnerable and marginalised young people, who are already more vulnerable to online harm, may be the most negatively affected by a very restrictive environment. So, yeah, I've got some interesting findings that I have quite literally just got, because I was still conducting this research last week, but I have tried to summarise them into some bitesize findings that we could maybe return to later.

Ethne James Souch: Brilliant. I'll just go on to Marc, and then we can come back to those findings, because I'm really interested to hear them. So, Marc, your Grand Challenges-funded project, Social Media and Mental Health, perhaps (but do say if I'm wrong) paints a more positive image of the use of social media among autistic individuals. So could you tell me a bit about this project?

Marc Tibber: In terms of this particular study, I got interested in the role of social media in the mental health and wellbeing of autistic people, and neurodivergent people more generally, because of some of the work I'd been doing. But when I looked at the research that was already out there, a lot of it was very deficit-focused and kind of catastrophising, and in no way do I want to minimise the harms and risks that we've talked about, but it seemed very one-sided. And across all the research there was this strong sense of it being done to the community and to the people, rather than with them.

So the idea was basically just to ask: what do autistic people (in this case it's actually adults I'm working with) think is important? What do they think should be researched? What's of importance to them? And so what we did is we ran a Delphi group, which is a way of getting a consensus position from a group of experts on a topic, the experts in this case being autistic adults themselves, and getting a sense of what they prioritise in terms of research.

I won't go into the methods, but basically we met with them multiple times in a series of interviews where they generated a bunch of research ideas and then rated how important they were, and discussed them, until we condensed it down to a group of research priorities that were highly endorsed and reached good levels of consensus. So we had 21 autistic adult participants in the panel, and ultimately we boiled it down to 29 research priorities. We also had a qualitative component, to explore how they thought we could make this research of maximum reach and impact.

I could give you a very brief summary of the core. As I said, there's a lot in there. I mean, I guess what really stood out for me, going back to this issue of harms and benefits, was that there was interest in harms and potential risks, but there was a real drive for understanding how social media might be used to harness connection, to build community, and to support identity generally, and in terms of autistic identity, both shared identity and individual identity. And one particularly interesting one, if I might just pick out the top priority, which was in some ways surprising to me: they wanted to understand how and if social media could be used to self-regulate and self-soothe when they felt emotionally or sensorially dysregulated, which I just thought was very, very interesting given the narratives around the potential for social media to dysregulate, and which, I should say, they also wanted to look into. And one person expressed a concern, which again I thought was really, really interesting, around people with high support needs: the thought that a carer, or someone offering support, might take away their social media, thinking it was good, but not realising the actual potential implications of taking away that source of connection and self-soothing. Anyway, I could go on, but that's the core of what we found.

Ethne James Souch: That's really interesting, and thank you for sharing those research findings. Did you draw on any recommendations for how we as individuals, or the tech organisations, or others can help support autistic people, or people in general, to use social media?

Marc Tibber: I think for me, and it sounds like it's maybe echoed in the work Sophie and Jessica are doing, what really stood out was this emphasis on autonomy and agency and empowerment. A lot of previous work, I think, has been around restricting use and safeguarding. Again, not to minimise that, but among the people I worked with there was a real sense that they wanted to be empowered to make the changes that met their needs as individuals. And, you know, it's obvious, but everyone is an individual, and among people who are autistic there are often high rates of ADHD, high rates of OCD, higher rates of mental health difficulties, which all bring different challenges. And so, in that and the broader work I've been doing, we've been developing a kind of transdiagnostic model to try and make sense of how to harness the benefits and mediate the harms. We've been looking at the general population but also at autistic people, and it seems to fit quite well. And at the core of our model is the idea that people tend to accrue the benefits more when they engage intentionally, purposefully, in line with their values and their needs. So off the back of that, we've been developing some interventions just to try and help people increase the intentionality of their use, the mindfulness with which they engage, and just their awareness of what they want to get out of the interventions. And that's shown promising results within the general population, and I think it would be interesting to explore moving forward with autistic people. And I guess the last thing I would say is, I don't think that removes the responsibility from corporations. But, to be brutally honest, I'm worried about the likelihood of them taking that responsibility, given the incentives that are built into these technologies to maximise engagement at all costs.

Ethne James Souch: Yeah, I think that echoes through what we've been discussing today. And I wonder if I could come to you, Jessica, just to draw on those emerging findings that you mentioned. Are there any similarities with what Marc found, or any differences that we can draw upon?

Jessica Ringrose: The participatory type of model, the methodologies that Marc was talking about, really fit with the Grand Challenges push towards social justice, or mitigating inequalities. I think taking a co-productive and participatory approach is really key, so that really resonates with the approach that we've taken to working with schools and young people. So, my findings are around how the policies are implemented in schools, because children know that phone bans vary across school settings and are inconsistent, and they can find this really unfair. In the most restrictive school that we were in, young people were basically saying, you know, this is disrespectful, they don't trust us, because in some situations schools are not even allowing children to have their phones on the way to school or home. And this presents a lot of different issues. In that case, for example, the school wanted parents to buy new brick phones or to get AirTags for their children. So it just kind of shows you the lack of trust, and the regulatory imposition upon the person and then the family. And this was in a disadvantaged area, so there were issues around buying new packages and products. And also, the younger children were like, I can't even use the brick phone. So it's quite interesting to have this conversation with the children themselves about what they think about these policies, which had recently come into effect. And even in schools where children were allowed to have their phones with them, they would like access to their phones at break times, and particularly where there are mental health and isolation issues, the phone is like a lifeline.

And lots of interesting things happened. We did qualitative, participatory workshops and interviews with the young people, but we also worked with a charity called Life Lessons, who conducted a survey with almost 650 young people. And lots of interesting things were coming up through that anonymous methodology of them telling us what they thought. You know, girls with their period in the bathroom: they want to be able to deal with a crisis. So lots of things that maybe weren't considered.

So very interesting conversations. And like I said, I'm just at the beginning of my analysis of this, but in terms of educational needs, there are definitely further educational strategies needed. These definitely echo what Marc's been saying around self-care and support around phone use, but also better digital literacy. So the educational needs really go hand in hand with these support needs. Young people do themselves worry about spending too much time on social media and about toxic content, but they actually are not so sure whether the phone ban in school is necessary. You know, does that really do anything about it? Because, quite frankly, the majority of the time that young people spend scrolling and so on is at home. So there's a need for a lot better support in and around the home environment, because that's where, you know, children are like, well, I wish I had something else to be doing, an activity or whatever, but I'm on my phone, or I'm lonely. So those types of issues are really key. And I don't necessarily think that the phone ban actually does much. Of course it stops young people being distracted in lesson time; that's obvious. But does it really do much else to help around, quote unquote, addiction issues? I'm not sure. These are things that I want to keep pursuing as I delve into my analysis more.

Ethne James Souch: Great. Yeah, so it's kind of rethinking phone use and how we can support people at home, because, as you say, young people aren't on their phones all the time at school; most of that time is at home. So I really look forward to seeing further analysis from you and further impacts from that project. So we've touched on some of the practical interventions and recommendations, particularly looking at the individual. And I know, Sophie, you brought up earlier the point about tech companies needing to take responsibility. And I wonder, from any of the research that Ipsos has produced, are there any practical interventions or recommendations you can draw on, for the tech companies but also for the individual?

Sophie Wilson: Thanks, Ethne. I think there are some interesting reflections there that resonate with some of our research as well, notably in the conversations around disinformation, misinformation and fake news, and the importance of media literacy. I think that's really, really key, and we have both qualitative and quantitative data around this. So from earlier this year, we found in March that 77% of 16-to-34-year-olds say that fake news is prevalent in online news from influencers. But at the same time, almost half of that age group trust influencers a great deal or a fair amount as a source of news.

So again, there's a bit of a disconnect there: not necessarily recognising the potential for the information and news you're getting from certain influencers to be false, and yet still trusting them and wanting to engage with them. And that's something that we've seen a lot in some of the conversations that we have with the public around news and information in online spaces in particular. And actually, where we get our information from now, often we are taking it with a bit of a pinch of salt. We're interested in more opinion-driven information sources, and we'll then go and verify that news and information from other places. So it's not necessarily a one-directional way of getting information anymore. We're actually going through a process ourselves, verifying information from multiple sources, in which the Internet is incredibly helpful, and we have different layers of trust, using different brands and different ways to engage with what's going on in our world. And this is much more of a conversation, perhaps, rather than just: I'm going to consume this one bit of information and that's it, I'm done, I know what's going on in the world.

So I think there's something interesting there which perhaps points to that changing relationship that we have in terms of how we trust the information we consume, and a recognition of some of the nuances around that, perhaps more than there necessarily was in the past. And that points, for me, in terms of interventions specifically in that space, to the importance of transparency: being clear on where information is coming from, fact-checking, having those opportunities to trust verified brands that enable you to do that effectively, and the ways in which the architecture of online platforms can support and enable that, whether that's community-driven interventions to verify or uptick different posts, or verified branding on social media platforms. And I think the point that Jessica was making about this being about the home is quite important as well. This isn't just about educational settings, but also about our relationships with our children in a home environment. And we also have data around parental views towards this: parents are most likely to have had conversations about the risks of online activity with their children as a protection measure, with about 60% having those active conversations to discuss the risks and dangers of online activity with their children. But I think what we've also seen in some of the conversations that we've had with parents themselves is that they don't always feel equipped or confident in those conversations. So again, I think there's work that could be done in terms of interventions: how can we support parents to enable them to feel able to effectively navigate those conversations with their children, particularly in family environments where, as a parent, you perhaps feel like you know less about the online world than your children do, and actually they're much more digitally capable than you are? How can we enable those constructive conversations, and resources that are actually helpful, that parents are aware of and can find and apply to their own environment?

Marc Tibber: Could I add something onto the back of that? I just wanted to say something that maybe links the response from the companies and this idea of intentional use. One of the things we find in a lot of our research, and something that's very challenging, is that the same features and affordances of technology that can be really problematic can also be really helpful. So for instance, delays in communication can be really useful for some autistic people, to rehearse things before they respond. But that also means that you get the positivity bias, and people can curate their online identity really carefully, which can lead to social comparison.
And I think that that's one of the challenges, and where that kind of psychoeducation point, which I didn't mention but is built into our way of working in our model, and which Sophie and Jessica talked about, is really important. Because understanding how these features and affordances can be unhelpful, or, if used intentionally, can actually serve people's needs and values, I think is a really crucial component: that interaction between the technology and its affordances and the individual.

Marc Tibber: So in this particular work we've got neurodivergent and neurotypical researchers across different areas bringing different lenses, and actually at all stages. And I think that's another important thing. I think often co-production can be very tokenistic, but actually, if it's going to be done well, it has to really be at all stages, and we've really done that here, facilitated by the grant: from the beginning, designing it, feeding back on the process, identifying the research priorities, and even now, we've sort of finished the manuscript, but we got it reviewed by some of the panel members so they could feed back on the process and we can learn from it. So I think it's hugely fruitful for everyone.

Ethne James Souch: Absolutely. And Jessica, can I come to you?

Jessica Ringrose: Yeah, I really couldn't agree more. The grant is incredible for helping facilitate a sustained relationship, in this case with an educational provider. So Life Lessons is a PSHE, that is, personal, social, health and economic education, company.

And we're hopefully going to inform some behavioural policy guidance and online safety guidance for schools, to augment some of what they've put together around the smartphone restriction policies.

So it just gives the time and the space for, like Marc was saying, a sustained engagement that's genuine. I think there's been a real pushback around sort of extractive, exploitative relationships between academics and the third sector, and Grand Challenges is really brilliant for genuinely enabling us to facilitate this long-term strategy.

Ethne James Souch: Brilliant. That just describes the programme in a nutshell. And it just goes to show that a small bit of seed grant can create such meaningful and important change that you might not necessarily get from those bigger grants.

Because this podcast is part of the Challenging Inequalities series, and we've touched on this throughout the conversation: from your perspective, and just to bring all your thoughts together, do you think social media and online spaces can impact or exacerbate inequalities? I'll go to Sophie first and then come to Jessica and Marc for any final thoughts.

Sophie Wilson: I think, unsurprisingly, it can do both. As we've discussed today, there are real opportunities for social media and online spaces to enable the public and individuals to access information, come together and gain huge benefits, and Marc has talked about some of the potential benefits, particularly in the wellbeing space. But at the moment there are challenges, and I think it would be remiss not to recognise that. Social media in particular can often reflect, amplify and create new forms of inequality. So I think there is still work to be done to try and reap the full benefits of what it can provide. But there are those two sides of the coin, and to some extent that's life: we'll never be able to solve all the problems. But perhaps if we can equip individuals and embed some of those skills into our ways of being, that can help us feel the agency to navigate some of the challenges and harms that are inevitably online.

Jessica Ringrose: Yeah, I really agree with Sophie on holding that tension between the positive and the negative, and on asking what the responsibility is of different actors in society. And, you know, my project is looking at how the implementation of policies around youth and technology has to be done really carefully in order not to exacerbate inequalities that are already shaping young people's experiences. So these kinds of blanket responses aren't helpful. What young people need are tools and supports to manage ethical tech use, including social media and AI. And if phone bans are the new norm, how are schools upping their support of young people through comprehensive digital literacy that covers social media and AI, and that really looks at the ethical dimensions of supporting both active engagement and democratic digital citizenship? These types of digital literacies have to actually engage with the realities of young people's lives and treat them as agentic, intelligent users of social media. And I think that's what young people in my research are sort of crying out for, saying: hey, we are responsible, and we do know what we're doing, and yes, we might need help, but, you know, let's do this in a respectful and, yeah, positive way.

Marc Tibber: I think every technology has the potential to help or worsen people's situations, and I think we see that in social media. I see it in my research; I think you see it in the broader research. There's evidence that people who experience discrimination, minoritisation, stigma and prejudice offline are often more likely to experience these problematic patterns of use, and to have those experiences reflected and potentially amplified online. I think what I also see, and the way I kind of see it, is that it has the potential to amplify both the costs and the benefits. So, particularly for minoritised groups, like I said, people of colour or autistic people, we often see that they experience a repetition of the stigma, prejudice and discrimination they experience offline, but also benefit the most from the sense of community, shared identity and connection that they can access online. And I think that's where, again, just as we've come back to, encouraging intentionality, to try and get the good bits where you can and avoid the bad bits where you can, is really important, as well as thoughtfulness around the incentives that we're building into these technologies.

Ethne James Souch: Thank you all for such an interesting discussion, and for joining me today, Jessica, Marc and Sophie. You've been listening to Disruptive Voices. This episode was presented by me, Ethne James Souch, and produced and edited by Annabelle Buckland at Decibelle Creative. If you'd like to hear more of these discussions from Disruptive Voices, make sure you're subscribed to this podcast so you don't miss future episodes. Come and discover more online and keep up with the latest Grand Challenges news, events and research: just Google UCL Grand Challenges. Thank you.