Purpose 360 with Carol Cone

Episode 194, Season 1

Thorn’s Mission to End Online Child Exploitation

Child sexual abuse online is a devastating global health crisis that has grown exponentially over the last decade. Thorn, a nonprofit focused on combating this crisis, reveals that the number of reported child sexual abuse files in the U.S. alone surged from 450,000 in 2004 to more than 90 million by 2022. This alarming increase underscores the urgency of addressing the pervasive harm of online predators, who will affect nearly 70% of youth by the time they finish high school. The need for robust technological interventions and awareness is more critical than ever to safeguard children in the digital age.
We invited Julie Cordua, CEO of Thorn, to discuss the nonprofit’s innovative approach, which combines technology and social research to create solutions that help detect and mitigate online child abuse. Julie emphasizes the importance of collaboration with tech platforms and the critical role of parental guidance in fostering safe online environments for children. Her insights shed light on the multifaceted efforts required by companies and parents to tackle this issue.
Listen for insights on:
  • How technological solutions are empowering tech companies to detect and combat online child sexual abuse
  • Practical advice on how to have open, non-judgmental conversations with children about online safety
  • Importance of collaboration between nonprofits, tech companies, policymakers, and communities to create a safer online environment for children
Resources + Links:
  • (00:00) - Welcome to Purpose 360
  • (00:13) - Thorn
  • (01:35) - Julie Cordua’s Background
  • (03:35) - What Thorn Does
  • (06:05) - The Numbers
  • (07:39) - Example Story
  • (11:32) - Thorn’s Tools
  • (12:55) - Response
  • (14:49) - Advice for Parents
  • (18:28) - Practice Conversation
  • (21:39) - Regulatory Actions
  • (22:58) - Youth Innovation Council
  • (24:31) - NoFiltr
  • (25:44) - AI’s Impact
  • (28:48) - What’s Next
  • (30:58) - Staying Motivated
  • (33:26) - Last Thoughts
  • (35:20) - Wrap Up

What is Purpose 360 with Carol Cone?

Business is an unlikely hero: a force for good working to solve society's most pressing challenges while boosting the bottom line. This is social purpose at work. And it's a dynamic journey. Purpose 360 is a masterclass in unlocking the power of social purpose to ignite business and social impact. Host Carol Cone brings decades of social impact expertise and a 360-degree view of integrating social purpose into an organization into unfiltered conversations that illuminate today's big challenges and bigger ideas.

Carol Cone:
I'm Carol Cone and welcome to Purpose 360, the podcast that unlocks the power of purpose to ignite business and social impact. In today's Purpose 360 conversation, we're going to address a huge public health crisis that I believe very few of us are aware of, and that is child sexual abuse online. We're going to be talking with an amazing not-for-profit, Thorn. They're an innovative technology not-for-profit creating products and programs that combat child sexual abuse at scale.

Let me give you a sense of the scale. In 2012, about 450,000 files of child sexual abuse videos, pictures, conversations were online in the US alone. Fast-forward 10 years or so, there's almost 90 million, 90 million files online, and that's impacting our children at all ages. And unfortunately, almost 70% of our youth by the time they're graduating high school, have been contacted, have had their trust broken by a predator. This is an extraordinary issue. We must all respond to it. I have one of the foremost leaders in the not-for-profit sector, Julie Cordua. And Julie, welcome to the show.

So, Julie, tell us about your role in doing amazing not-for-profit work and what inspired you to do that work, at RED and now at Thorn. And then we're going to get really deeply into what Thorn does.

Julie Cordua:
Oh, great. Well, yeah, so good to see you again. When I saw you at the summit, it's like, "Carol, it's been years."

Carol Cone:
I know.

Julie Cordua:
So it's really nice to reconnect and thanks for taking the time to cover this issue. So I did not set out in my career to work in the nonprofit space. I actually started my career in wireless technology at Motorola and then a startup company called Helio. And I loved technology. I loved how fast it was changing and I really thought it could do good in the world, connect people all around the world. And then out of the blue one day I got this phone call and the person said, "Hey Julie, this is Bobby Shriver. I'm starting something and I want you to come be a part of it." And at the time he was concepting with Bono this idea of RED, which was how do you take the marketing prowess of the private sector, put it to work for a social issue?

And I thought, "Ooh, if we could use marketing skills to change the way people in the world have access to antiretroviral treatment for HIV, that's incredible. That's incredible things I could do with my talent." And so I joined RED and I learned a ton. And what applied to Thorn, my move to Thorn, was learning about how if we looked at social issues or problems less as, "Is this a nonprofit issue or is this a private sector issue?" And more at like, "Let's take all of those skills, all the best skills from all types of society and put them towards an issue. What could be done?"

Carol Cone:
Thank you. And I love how you describe Thorn on the website, as an innovative technology nonprofit creating products and programs that combat child sexual abuse at scale. So why don't you unpack that a bit and explain to our listeners what is Thorn and then we're going to get into all of the details why it is critically important for every single parent, teacher, and appropriate regulators to address this issue. This issue cannot be unknown. It has to be absolutely prevalent.

Julie Cordua:
Yeah. So child sexual abuse in our society globally has dramatically changed over the last decade. Most child sexual abuse today has some form of a technology component. And so that can mean that the documentation of the abuse of a child is spread online. That has been going on for much longer than a decade. But over the last decade, we've seen the rise of grooming, of sextortion, now of generative AI child sexual abuse material, of perpetrators enticing children online with money or gifts in exchange for content. As my head of research says, geography used to be a protective barrier. If you did not live near or with an abuser, you would not be abused. That has been destroyed. Now, every single child with an internet connection is a potential victim of abuse.

And many of the things that we have done in the past still hold true. We need to talk to our children about these issues, we need to talk to parents, we need to talk to caregivers, but we have a new dimension that we must do, which is create a safer online environment and create solutions at scale from a technology perspective to create safer environments. And that's what we do. So we merge social research with technical research with, this is where the concept of private sector thinking comes in, with software solutions at scale. And our whole goal is that we can help reduce harm and find children faster, but also create safer environments so kids can thrive with technology in their lives.

Carol Cone:
Thank you. And I want just for our listeners, let's talk some numbers here. On your website, which is a fabulous website, it is preeminent, it is beautifully done, very informative, not overwhelming. It helps parents, it helps children, it helps your partners. So you talk about 10 years ago there were like 450,000 files online that might be related to child sexual abuse, and now you say it's up to something like 87 million around the globe?

Julie Cordua:
That's actually just in the United States.

Carol Cone:
Oh, just in the United States. Wow, I didn't even know that.

Julie Cordua:
And the tricky thing with this crime is that we can only count what gets reported. So in the last year, there were over 90 million files of child sexual abuse material, images and videos, that were reported by tech companies to the National Center for Missing and Exploited Children, which is where these companies are required to report that content. So if you've got over 90 million files reported in one year, that's just what's found. There are a lot of platforms where this content circulates where no one looks, and so it's not found. So you can imagine that the actual number of files in circulation is much, much higher. And also, that was just the United States. So if we go to every other country in the world, there are tens of millions, hundreds of millions of files of abuse material circulating.

Carol Cone:
You know what I'd love you to do? The story you told at SIS was so powerful. People, you could hear a pin drop in the room. Could you just give that short story? Because I think it talks about the trajectory of how a child might get pulled in to something that was seemingly just simple and about a child.

Julie Cordua:
Yeah. This story, obviously I'm not using a real child's name, and I would say the facts are pulled from multiple victim stories just to preserve confidentiality. But what we're seeing with sextortion and grooming, how this is presenting, I mean it presents a lot of different ways, but one way that we're seeing grow pretty exponentially right now is a child is on, let's say, Instagram and they have a public profile. And actually, the targets of this specific crime are really young boys right now. So let's say you're a fourteen-year-old boy on Instagram, it's a public profile. So a girl gets into your messages and says, "Oh, you're cute. I like your football photo." And then moves the messaging to direct messaging, so now it's private. And they might stay on Instagram or they might move to a different messaging platform like a WhatsApp. And this "girl," and I'm doing air quotes here, starts kind of flirting with the young boy, and then at some point maybe shares a topless image and says, "Do you like this? Share something of your own."
And this person that the child has friended on social media, they think is their friend. And so they've actually kind of friended them on Instagram, maybe friended them on any other platforms they're a part of. And because they're flirting, they may send a naked image. And then what we're seeing is immediately this girl is not a girl, this girl is a perpetrator. And that conversation changes from flirtation to predatory conversations and usually turns into something like, "Send me $100 on Venmo or Cash App right now, or I will send that naked image you sent me to all your family, all of your friends, every administrator at your school, and your coaches, because I'm now friends with you on Instagram and I have all of their contacts." And that child, if you could imagine, and I tell this story a lot, and I have children, feels trapped, feels humiliated.

And we see that these kids often do have access to Cash App or another thing we're seeing is that they use gift cards actually to do these payments at times, and they'll send $100 and they think it's over, but the person keeps going, "Send me $10 more, send me $50 more." And they're trapped. And the child feels like their life is over. Imagine a 13 to 14-year-old kid sitting there going, "Oh my God, what have I done? My life is over." And unfortunately, we have seen too many cases where this does end in the child taking their life.

Carol Cone:
Oh my God.

Julie Cordua:
Or self-harm or isolation, depression. And we're seeing now, I think it's up to about 800 cases a week of sextortion are being reported to the National Center for Missing and Exploited Children right now. And this is a crime type, when I talk about how child sexual abuse has evolved, this is very different than where we were 15, 10 years ago. And it's going to require different types of interventions, different types of conversations with our children, but also different types of technology interventions that these companies need to deploy to make sure that their platforms are not harboring this type of abuse.

Carol Cone:
So let's take a deep breath, because that is astounding, that story, and that a child would take their life or just be so depressed and just don't know how to get out of this. So first, let's talk about the technological solution. You're working with a lot of technology companies and you're providing them with tools. So what do those tools look like?

Julie Cordua:
Yeah. So we have a product called Safer, which is designed for tech platforms, essentially trust and safety teams, to use. It is a specialized content moderation system that detects image, video, and text-based child sexual abuse. And so companies that want to make sure their platforms are not being used to abuse children, which I think is most companies, can deploy this, and it will flag images and videos of child sexual abuse that the world has seen. It will also flag new images, and it can also detect text-based harms, so some of this grooming and sextortion, so that trust and safety teams can get a notification and say, "Hey, kind of red alert. Over here is something that you may want to look at. There might be abuse happening." And they can intervene and take it down or report it to law enforcement as needed.

Carol Cone:
And how have technology platform companies responded to Thorn?

Julie Cordua:
Great. I mean, we have about 50 platforms using it now, and that's obviously a drop in the bucket to what needs to happen. But I mean, our whole position is that every platform with an upload button needs to be detecting child sexual abuse material. And unfortunately in the media, sometimes we see companies kind of get hammered for having child sexual abuse material on their platform or reporting it. That's the wrong approach. The fact is that every single company with an upload button that we have seen who tries to detect, finds child abuse, and that means that perpetrators are using their platforms for abuse. So it's not bad on the company that they have it, it becomes bad when they don't look for it. So if they put their head in the sand and act like, "Oh, we don't have a problem," that's where I'm like, "Oh wait, but you do. So you actually need to take the steps to detect it." And I think as a society, we should be praising those companies that take a step to actually implement systems to detect this abuse and make their platform safer.

Carol Cone:
Would you want to give any shout-outs to some exemplary platform companies that have truly partnered with you?

Julie Cordua:
Yeah. I mean we have, and this is where I'm going to have to look at our list so I make sure I know who I can talk about. We've worked with a variety of companies. You have a company like Flickr, who hosts a lot of images, who's deployed it; Slack; Disco; Vimeo from a video perspective; Quora; Ancestry. Funny, people are like, "Ancestry?" But this goes back to the point I make: if you have an upload button, I can almost guarantee you that someone has tried to use your platform for abuse.

Carol Cone:
Okay, so let's talk about the people part of this, the parent, and you have so many wonderful products, programs online, maybe it's programs, for parents. Well, just talk about what you offer because you've got online, you've got offline, you've got messages to phones. Parents, as you say, need to have that trusting relationship with the child. And I love that you talk about that, once a child, if it's pre-puberty or such, when they get a phone in their hand, they also have a camera and it's very, very different from when we grew up. So what's the best advice you're giving to parents, and then how are parents responding?

Julie Cordua:
Yeah. I mean, so we have a resource called Thorn for Parents on our website, and it's designed to just give parents some kind of conversation starters and tips, because I think in our experience in working with parents, parents are overwhelmed by technology and overwhelmed in talking about anything related to sex or abuse, and now we're working at the intersection of all of those things. So it just makes it really hard for parents to figure out, "What are the right words, when do I talk about something?" And our position is talk early, talk often, reduce shame, and reduce fear as much as possible. Kind of just take a deep breath and realize that these are the circumstances around us. How do I equip my child and equip our relationship, parent-child relationship, with the trust and the openness to have conversations?

What you're aiming for, obviously, at the very basic level, is no harm. You don't want your kid to encounter this. But think about that out in the real world. If we were to insist on a no-harm state, you'd be protecting your kid from everything, like, "Don't go on a jungle gym or something." The reality is that kids will be online whether you give them a phone or not; they might be online at their friend's house or somewhere else. So if you can't guarantee no harm, what you want is that if a child finds themselves in a difficult situation, they realize they can ask for help. Because go back to that story I told about the child who was being groomed. The reason they didn't ask for help was because they were scared. They were scared of disappointing their parents, they were scared of punishment. We hear from kids that they're scared their devices are going to get taken away, and their devices are what connect them to their friends.

And so how do we create a relationship with our child where we've talked openly about what they could anticipate so they know the red flags? And we say to them, "Hey, if this happens, know that you can reach out. I'm going to help you no matter what. And you're not going to be in trouble. We're going to talk about this. I'm here for you." I can't guarantee that that's always going to work, kids are kids, but you've created an opening, so if something happens, the child may feel more comfortable talking to their parent. And so a lot of our resources are around, "How do we help parents start that conversation and approach their children with curiosity in this space, with safety, really removing the shame from the conversation?"

Carol Cone:
Can you just practice with me a little bit of the type of conversation? I've heard about this child sexual abuse that's happening all over the internet. What do I do with my child? How do I have a conversation with them?

Julie Cordua:
That is a great question to ask, and I'm glad you're asking it because something I would say is don't get your kid a phone until you're ready to talk about difficult subjects. So ask yourself that question. And if you're ready to talk about nudity, nude pics, pornography, abuse, then maybe you're ready to provide a phone. And I would say before you give the phone, talk about expectations for how you use it, and also talk about some of the things that they may see on the phone. The phone opens up a whole new world. There may be information on there that doesn't make you feel good, that isn't comfortable. And if that ever happens, know that you could turn that off and you could talk to me about it.

I also think it's really important to talk to kids when they have a phone about how they define a friend, and who someone online really is. Is that person a real person? Do you know who they are? Have you met them in person? Also talk about what kind of information you share online. And then there's a whole list, and we have some of this on our website, but I'd say these are conversations to have before you give the phone, when you give the phone, and every week after you give the phone. But I would also pair it with being curious about your kids' online life. So if all of our conversations with our kids are about the fear side, we don't foster the concept that technology can be good. And so also include, "What do you enjoy doing online? Show me how you build your Minecraft world. What games are you playing?"

And help them understand, because that safety, that comfort, that joy that they enjoy talking to you will create a safer space for them to open up when something does go wrong, as opposed to every conversation being scary and threat based. If we have conversations like, "I'm curious about your online life, what do you want to learn online? What are you exploring? Who are your friends?" Talk about that for 10 minutes and have one minute be about, "Have you encountered anything that made you uncomfortable today?" So have the right balance in those conversations.

Carol Cone:
Oh, that's great advice. That is really, really great advice. And again, where can parents go online at Thorn? What's the web address?

Julie Cordua:
Thorn.org and we have a variety of resources on there from our Thorn for Parents work as well as our research.

Carol Cone:
So can you talk about what you're doing in regulatory actions? Because it's really important.

Julie Cordua:
Yeah, it's really interesting to see what regulators around the world are doing. And different countries are taking different approaches. Some countries are starting to require companies to detect child sexual abuse, and others, and I think this is the way the US may go, but it's going to take a while, are requiring transparency. And that to us is the true baseline: companies should be transparent about the steps that they are taking to keep children safe online. And then that gives parents and policymakers and all of us the ability to make informed decisions about what apps our kids use, what is happening.

Carol Cone:
What's your hope for the US in regulation considering the US is still struggling with regulating technology companies overall?

Julie Cordua:
I think it'll be a while in the US. They've got a lot going on, but I think that first step of transparency will be key. If we can get all companies to be transparent about the child safety measures that they are putting in place, that would be a giant step forward.

Carol Cone:
Great, thank you. You're so smart in terms of really building a listening ecosystem, and I noticed that you have a Youth Innovation Council. Why did you create this and how do you use them?

Julie Cordua:
I love our Youth Innovation Council. You listen to them talk and you kind of want to just say, "Okay, I'm going to retire. You take over." I mean, we're talking about creating a safer world online for our kids. And this is a world that we didn't grow up with. For these kids, it's just ingrained in their lives. And this is why I want to always be really careful. I work on harms to children, but I really truly believe that technology can be beneficial. It is beneficial to our society, it can be beneficial to kids, it can help them learn new things, connect with new people. And that's why I want to make it safer, so they can benefit from all that.

And when you talk to these kids, they believe that too and they want to have a voice in creating an internet that works for them that doesn't abuse them. And so I think we would be remiss to not have their voices at the table when we are crafting our strategies, when we are talking to tech companies and policy makers, it is amazing to see the world through their eyes, and I really truly believe they're the ones living on the internet, living with technology as a core part of their lives, that they should be a part of crafting how it works for them.

Carol Cone:
So share with our listeners, what is NoFiltr? Because that's one of your products that's helping the younger generation online to truly combat the misuse of sexual images.

Julie Cordua:
Right. So NoFiltr is our brand that speaks directly to youth. And so we work with a variety of platforms that run prevention campaigns on their platforms, and the resources often direct to our NoFiltr website. We have social media on TikTok and other places speaking directly to youth, and our Youth Council helps curate a lot of that content. But it is really about, instead of Thorn speaking to youth, because we are not youth voices, it's kids speaking to youth about how to be safe online, how to build good communities online, how to be respectful, and how to take care of each other if something happens that isn't what they want. That's our goal.

Carol Cone:
So it's not that you're wagging a finger or you're scaring anyone to death. You're empowering each one of those audiences. And that's a brilliant, brilliant part of how Thorn has been put together. I want to ask about the next really scary challenge to all of us, and that's AI, generative AI and how that is impacting more imagery and more child sexual abuse around the globe, and how you are prepared to begin to address it.

Julie Cordua:
Yeah. So we saw that one of the first applications of generative AI was to create child sexual abuse material. To be fair, generated abuse material had been created for many years in the past. But with the introduction about two years ago of these more democratized models, we just saw more abuse material being created. And actually, we have an incredible research team that does original research with youth, and we just released our youth monitoring survey and found that one in 10 kids knows someone, has a friend, or has themselves used generative AI to create nudes of their peers. So we are seeing these models be used both by peers, as in youth who think it's a prank, and we know obviously it has broader consequences, all the way to perpetrators using it to create abuse material of children that they see in public.

And so actually, one of the first things we did a year ago was convene about a dozen of the top gen AI companies to actively design principles by which their companies and models will be created to reduce the likelihood that their gen AI models will be used for the development of child sexual abuse material. Those were released this past spring, and we've had many of those companies start to report on how they're doing against the principles that they agreed to. Things like, "Clean your training set, make sure there's no child sexual abuse material before you train your models on data. If you have a generative image or video model, use detection tools at upload and output to make sure that people can't upload abuse material and that you're not producing abuse material."

Things get more difficult with open-source models, not OpenAI, but open-source models because you can't always control those factors. But there are other things you can do in open-source models like cleaning your training dataset. Hosting platforms can ensure they're not hosting models that are known to produce abuse material. So every part of the gen AI ecosystem has a role to play. The principles are outlined, they were co-developed by these companies, so we know they're feasible, and they can be implemented right now out of the gate.

Carol Cone:
Brilliant work, and it's terrific that you've gotten those principles done. Truly, you are in service to empower the public and all users, whether it's adults or children or such. I mean, you've been doing this work brilliantly for over a decade. What is next for you to tackle?

Julie Cordua:
I mean, this issue will require perseverance and persistence. So I feel like we have created solutions that we know work. We now need broader adoption. We need broader awareness that this is an issue. I mean, the fact is that globally, we know that the majority of children, by the time they reach 18, will have a harmful online sexual interaction. In the US, that number's over I think 70%. And yet, we as a society are not really talking about it at the level that we need to talk about it. And so we need to incorporate this as a topic. You opened the segment with this, this is a public health crisis. We need to start treating it like that. We need to be having policy conversations, we need to be thinking about it at the pediatric doctor level. If you go in for a checkup, we've done a really good job incorporating mental health checks at the pediatric checkup level.

I've been thinking a lot about how do you incorporate this type of intervention? I don't know exactly what that looks like. I'm sure there's someone smarter out there than me who can think about that. But we have to be integrating this into all aspects of a child's life because as I said, technology is integrated into all aspects of a child's life. So thinking about this not just as something a parent has to think about or a tech company, but doctors, policymakers, educators. So it's a fairly new issue. I mean, I would say when I was working in global health, we've been trying to tackle global health issues for decades. This issue is kind of a decade old, if you will. I would say we're still a baby, but it's growing fast and we have no time to wait. We have to act with urgency to raise awareness and integrate solutions across the ecosystem.

Carol Cone:
Brilliant. I'm just curious, this is a tough issue, this is a dark issue. You've got kids right in the zone. How do you stay motivated and positive to keep doing brilliant work?

Julie Cordua:
Oh, thank you. You see the results every day. So if I'm ever getting discouraged, I try to make sure I go talk to an investigator who uses our software to find a child. I talk to a parent who has used our solutions to create a safer environment for their kids, or a parent who might even be struggling, and they give me the inspiration to keep working because I don't want them to struggle. Or I talk to the tech platforms. When we started working 13 years ago, as I said, there were no trust and safety teams. So we were working with these engineers who were being asked to find thousands of pieces of abuse, and they had no technology to do it. One thing that gives me hope is we're sitting here at the advent of a new technical revolution with AI and gen AI, and we actually have engaged companies. We have trust and safety teams, we have technologists who can help create a safer environment.

So I've been in it long enough that I get to see the progress. I get to meet the humans who are doing the even harder work of recovering these children and reviewing these images. And if I can make their lives better and give them tools to protect their mental health so that they can do this hard work, I feel like I'm of service. And so that progress helps. And then I will say for our team, one thing that's really inspiring, and we say this a lot: we have a team of almost 90 at Thorn that work on this. And you don't go to college and say, "I want to work on one of the darkest crimes in the world." And all of these people have given their time and talent to this mission, and that's incredibly inspiring. We offer a lot of wellness and mental health services for everyone on our team. But I would say it's the progress that keeps me going.

Carol Cone:
That's so smart. So this has been an amazing conversation. I am now much better versed in this issue, and I thought I knew all the social issues, since I've been doing this work for decades. So thank you for all the great work you're doing. I always love to give the last comment to my guest, so what haven't we discussed that our listeners need to know about?

Julie Cordua:
There's a few things. Sometimes this issue can feel overwhelming and people are kind of like, "Ah, what do I do?" If you are at a company that has an upload button, reach out because we can help you figure out how to detect child sexual abuse. Or if you're a gen AI company, we can help you red team your models to make sure that they are not creating child sexual abuse material. If you are a parent, take a deep breath, look at some resources, and start to think about how to have a curious, calm, engaging conversation with your child with the purpose initially to just open up a line of dialogue so that there's a safety net there and you start to do that on a regular basis.

And if you're a funder and think, "This work is interesting," know that our work is philanthropically funded. It's a hard issue to talk about, we talked about that, and I really do think those who join in this fight are brave to take it on, because it's a hard issue and we've got an uphill battle, but it is our donors and our partners who make it possible.

Carol Cone:
You are truly building a community, a very, very powerful community with agency and products and tools, and you are to be commended. And I'm so glad we ran into each other at Social Innovation Summit. The new playground is technology and screens, and that's where children are hanging out and we need to protect our children. I know there's a lot more work to do, but I feel a little bit more calm that you're at the helm of building this amazing ecosystem to address child sexual abuse online. So thank you, Julie. It's been a great conversation.

Julie Cordua:
Thank you so much. Thanks for having this conversation and being willing to shine a light on this. It was wonderful to reconnect.

Carol Cone:
This podcast was brought to you by some amazing people, and I'd love to thank them. Anne Hundertmark and Kristin Kenney at Carol Cone ON PURPOSE. Pete Wright and Andy Nelson, our crack production team at TruStory FM. And you, our listener, please rate and rank us because we really want to be as high as possible as one of the top business podcasts available so that we can continue exploring together the importance and the activation of authentic purpose. Thanks so much for listening.

This transcript was exported on Sep 05, 2024.
Transcript by Rev.com