And We Feel Fine with Beth Rudden and Katie Smith

In our first guest episode, Beth and Katie are joined by Dr. Desmond Patton—renowned social worker, AI ethics leader, and founder of SAFELab—for a soul-stirring conversation on what’s ending, what’s beginning, and why joy is more than a feeling—it’s an intentional system.
From pioneering research on youth expression and gun violence to building JoyNet, a machine learning platform designed to surface joy in digital spaces, Desmond shares how community, nuance, and vulnerability can change the future of tech. Together, the trio explores why context matters in AI, how social media misreads grief as aggression, and what it means to decolonize data through trust.
This is a masterclass in human-centered design—one rooted in lived experience, radical listening, and the belief that joy and justice are not opposites.

🔑 Topics Covered:
  • What’s ending: the era of joyless performance
  • What’s beginning: joy as an intentional operating system
  • The origin and mission of JoyNet
  • Why traditional NLP tools misinterpret Black and Brown grief
  • CASM: a 7-step contextual analysis system for social media
  • Building tech with, not for, marginalized communities
  • How AI systems get culture—and people—so wrong
  • Scaling empathy without erasing depth
  • Social media as a space of both trauma and healing
  • Reimagining metrics, value, and thick data
  • Storytelling, digital connection, and the slow power of joy
📌 Key Takeaways:
  • Joy is not frivolous; it’s resilient, rooted, and revolutionary.
  • AI systems must be designed with contextual nuance and cultural fluency—or they cause harm.
  • Grief doesn’t look the same across cultures, and we need tech that understands that.
  • Participatory research and lived experience are non-negotiable in building responsible AI.
  • The movement toward healing, justice, and connection is growing—even if it’s quiet.
⏱️ Chapters (Timestamps):
00:00 What's Ending and Beginning: Joy as Operating System 
03:00 JoyNet and the Science of Digital Uplift 
06:00 When NLP Fails: Misreading Black Grief as Aggression 
12:00 Introducing CASM: Contextual Analysis of Social Media 
16:00 InterpretMe: A Tool for Training Ethical Annotation 
20:00 Why Youth Voice and Lived Experience Must Lead 
26:00 Collaborating with Tech Platforms for Change 
30:00 The Case for Thick Data Over Scale 
35:00 Polarization, Algorithms, and the Cost of Misunderstanding 
40:00 The Quiet Power of Joy Posts and the Future We Can Choose

Creators and Guests

BR
Host
Beth Rudden
Pronouns: she/her Beth Rudden is the CEO and Founder of Bast AI, where she’s designing explainable, personalized AI that puts human dignity at the center. A former Distinguished Engineer and global executive at IBM, Beth brings 20+ years at the intersection of anthropology, data science, and AI governance. Her mission: make the next generation of intelligence understandable, accountable, and profoundly human. She’s helped reshape tech in healthcare, education, and workforce systems by applying ontological natural language understanding—yes, it’s a mouthful—to build AI that reflects cultural nuance and ethical intent. Beth is the author of AI for the Rest of Us and a global speaker on AI literacy and the future of power. On And We Feel Fine, she brings curiosity, clarity, and contagious optimism to every episode. With Katie, she explores what it means to end well, begin again, and build something truer than what came before.
KS
Host
Katie Smith
Pronouns: they/them Katie Smith is the Co-Founder and CEO of Humma.AI, a privacy-first, empathy-driven platform training culturally competent AI through community-powered data. Their unconventional journey began in the online adult space, where they held executive roles at Playboy and leading video chat platforms—gaining rare insight into how digital systems shape desire, identity, and power. Later, Katie turned those skills toward public good—leading digital at the ACLU National and crafting award-winning campaigns for marriage equality and racial justice. Now, they’re building tech that respects consent, honors community, and shifts power back to the people. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a genre-defying exploration of trauma, identity, and transformation. A queer, nonbinary, neurodivergent thinker and builder, they bring systems-level thinking, futurism and humor to And We Feel Fine. Expect honest conversations about what’s ending, what could begin, and how we co-create tech—and futures—worth believing in.
AL
Producer
Alexia Lewis

What is And We Feel Fine with Beth Rudden and Katie Smith?

At the edge of collapse—and creation—two unlikely co-conspirators invite you into a radically honest conversation about the future. This isn’t just another tech or self-help podcast. It’s a story-driven exploration of who we are, what we value, and how we might reimagine the world when the systems around us stop serving us. We blend personal storytelling, cultural critique, and deep inquiry into what it means to be human in an age of AI, uncertainty, and transformation. We’re asking better questions—together.

Because the world is changing fast, but maybe that’s precisely what we need.

Hosted by Beth Rudden and Katie Smith, two builders of systems and challengers of the status quo. Beth is CEO of Bast.AI and a globally recognized expert in trustworthy AI, with decades of experience leading data and ethics at IBM. Katie is the founder of Humma.AI, a strategist who drove innovation and revenue growth at major global brands before turning to human rights and technology for social good. Together, they make complex issues, such as AI and its impacts on everyday people, clear, personal, and impossible to ignore.

Beth Rudden is the CEO and Founder of Bast AI, a pioneering company building explainable, personalized AI for good. With over two decades of experience as a global executive and Distinguished Engineer at IBM, Beth blends anthropology, data science, and AI governance to create tools that amplify human dignity and intelligence—not replace it.
Her work spans healthcare, education, and workforce transformation, using ontological natural language understanding (NLU) to make AI transparent, accountable, and accessible. Through Bast AI, Beth is reimagining how organizations deploy AI that’s not only accurate but aligned with ethical values, cultural context, and cognitive well-being.
Beth is also the author of AI for the Rest of Us and a passionate advocate for AI literacy, epistemic diversity, and the right to understand the systems shaping our lives. She speaks globally on the future of AI, power, and social contracts—and believes we’re all stewards of the next intelligence.

Katie Smith is the CEO and Founder of Humma.AI, a privacy-first platform building community-powered, culturally competent AI. With over two decades of experience leading digital strategy and social innovation, Katie blends systems thinking, Responsible AI, and storytelling to create tools that serve dignity, not domination. Their work spans mental health, civic tech, and digital rights, using participatory AI to make systems safer, fairer, and more accountable. Through Humma.AI, Katie is reimagining how people and businesses engage AI that’s accurate, inclusive, and governed by consent and care. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a provocative exploration of identity, trauma, and transformation. They speak globally on the future of technology, power, and justice—and believe human empathy is the intelligence that will define our time.

Subscribe to our Substack for bonus content: https://substack.com/@andwefeelfine

Katie Smith (00:13)
Desmond, thank you for joining us for And We Feel Fine, this podcast that Beth and I are doing. So as you know, we have these big themes of what's ending and beginning. So I just thought maybe we'd start there, and then we have some great questions for you. But what do you think is ending and beginning, or what comes up for you when that question is posed?

Desmond Patton (00:25)
Okay.

Are you thinking about this question as a personal question or just in general?

Katie Smith (00:36)
You know, this podcast is really focused on personal stories, but really at the intersection of like culture and technology. But Beth, how would you describe it?

Beth Rudden (00:46)
I mean, I think you've already met the beginnings: like, you're our first guest. However you want to receive that question and however you want to answer it, it's dealer's choice.

Desmond Patton (00:57)
Got it. It's such a cool question. You know, I think what is personally and professionally ending for me is a need to do things, events, and experiences that don't map onto my joy plan for my life, both professionally and personally.

And what's beginning, I think, is this intentional living that I think has roots in a desire to be purposeful, a desire to be useful, a desire to be a connective tissue for others. And I think that requires intentional and purposeful living.

And what's beginning in that space for me is trying to understand how joy can be an enduring operating system for me: how I make decisions, how I connect, how I grow. And I'm not talking about the kind of joy that you post on Instagram. I'm talking about soulful joy, joy that comes from pain, joy that comes from trauma, joy that persists in the midst of chaos and challenges. Yeah, I've been on this journey for a couple of years now and it's been really, really cool.

Beth Rudden (02:23)
Have you,

I have to ask, have you read The Book of Joy? Oh my goodness, it's one of my favorites, yes.

Desmond Patton (02:26)
Yeah, yeah.

Well, the thing that I didn't tell you is that I teach a course on joy called The Journey to Joy. And that book, it's not a required text, but it's one of our supplemental texts for the class. So yeah, I'm a big fan.

Beth Rudden (02:45)
Yeah, I mean, Archbishop Desmond Tutu and His Holiness the Dalai Lama, and I love the relationship that they have together. And I do meditation, and they're in my meditation videos sometimes, where they actually, you know, talk. I'm a huge fan, apparently. When you said that, I had to ask if you'd read it.

Desmond Patton (02:51)
Yeah

Mmm.

Yeah.

I love that.

Beth Rudden (03:12)
Great, great, great book. Beautiful book. Yeah.

Desmond Patton (03:13)
Yeah.

Katie Smith (03:14)
Desmond, you've been working on JoyNet for a while now. Do you want to share a little bit about that? Because I feel like you've had some breakthroughs.

Beth Rudden (03:19)
Yeah.

Desmond Patton (03:19)
Yeah.

So JoyNet. Sorry, my dog is very interested in the conversation. He's interested in joy as well.

Katie Smith (03:26)
Dogs are welcome on this.

Beth Rudden (03:27)
Dogs are welcome. We have dogs all the time. They're family members.

Katie Smith (03:31)
later on possibly.

Desmond Patton (03:32)
So JoyNet is a machine learning-powered platform that came out of my work, actually, on gun violence. So I have been trained as a gun violence researcher who spends a lot of time in the tech space, because one of the things that I realized pretty early on in my career is that

A lot of young people are living their lives online, and it's not this virtual thing. It is a serious additional piece of their life. And for the work that I was doing around firearm violence, oftentimes the articulation of aggression, the need to cope and process, would happen on social media platforms. And so one of the things that would occur

that I missed in my research was the ways in which young people would use joy, happiness, and humor as a way to de-escalate violence online. And so they would see a friend engaging in an aggressive conversation, and they would just put in a funny meme or a joke or a song or something that they knew would turn the temperature down on a heated conversation happening online.

Beth Rudden (04:29)
Mm-hmm.

Desmond Patton (04:46)
And at the same time, my colleague Andre Brock at Georgia Tech was writing about how Black folks were using joy as a way of connecting on Twitter. And so those things converged, and I was like, we should do something around this instead of just kind of observing it. And so I created some youth advisory councils with my colleagues in SAFELab. And so my team and I

found young people, and we just started to think about what joy is. We asked them to define joy based on their own lived experiences and how they make meaning of it culturally. We asked them to put words and pictures together about what joy meant, and we used that as a starting place to think about: how can we help young people who are on social media platforms, who are doomscrolling, kind of struggling with life,

find joy more rapidly? And so JoyNet hopefully is a tool for that work. I partner with computer scientists at Columbia, and we've been training algorithmic systems to find concepts of joy on TikTok and then, hopefully, Instagram. And the goal is for young people to be able to search for joy, find joy on these platforms, but to also connect

on JoyNet itself around joy, and to also think about how they can perhaps build relationships offline with people who are looking for joy as well. So we have completed the platform, and we'll be doing some beta testing of it with youth in Philadelphia this summer to see if anybody will actually use this platform. And then if that goes well, we have a plan to do more clinical testing with folks who

may be managing various mental health challenges, anxiety or depression, in three cities in the US. You know, that's a ways away, but we've already engaged in an application process with colleagues at Columbia to do that work. Yeah.

Katie Smith (06:44)
Incredible.

Beth Rudden (06:45)
Congratulations, that is amazing. It leads me to ask: when did you realize that traditional data science approaches were missing something essential in analyzing all of this youth language online?

Desmond Patton (06:58)
Immediately.

Beth Rudden (06:59)
Was there a pivotal point? I mean, how did you know, or was it just that these tools were not understanding the language that people were actually using?

Desmond Patton (07:00)
No, the pivotal point was when we tried to see if natural language processing tools could automatically understand African-American English. That was in some of our first studies, probably around 2015, 2016. And every single time, it would fail. The interpretation, its ability to explain, would just be completely false.

One of the issues was that it would oftentimes confuse aggression and grief. And so a young person could make a post about grief, and they may use various types of language, they may use curse words, whatever. And it would almost always identify that expression of grief as aggression. And so that was happening immediately when we would use these tools to process language on Twitter at the time.
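As an editor's aside: the failure mode Desmond describes, a context-free model collapsing grief and aggression because they share surface vocabulary, can be sketched in a few lines. Everything here is illustrative: the posts are synthetic, and the keyword "classifier" is a crude stand-in for the off-the-shelf NLP tools of that era, not any real system SAFELab tested.

```python
# Illustrative only: a naive lexicon-based "classifier", standing in for the
# off-the-shelf NLP tools of the mid-2010s. Real systems used richer features,
# but failed the same basic way: scoring surface vocabulary without context.
TRIGGER_WORDS = {"gone", "ride", "smoke"}  # hypothetical "aggression" lexicon

def naive_label(post: str) -> str:
    """Return 'aggression' if the post contains any trigger word, else 'neutral'."""
    tokens = set(post.lower().split())
    return "aggression" if tokens & TRIGGER_WORDS else "neutral"

# Two synthetic posts that share vocabulary but not meaning:
grief_post = "can't believe you gone bro ride with me forever"  # mourning a friend
threat_post = "we gone ride on them tonight"                    # escalation

print(naive_label(grief_post))   # -> aggression (a human with context reads grief)
print(naive_label(threat_post))  # -> aggression
```

The point is not this particular lexicon but that word-level features alone cannot separate the two posts; only context (who posted, what happened offline, the replies) can.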

Beth Rudden (07:35)
Interesting.

Desmond Patton (08:01)
So it was abundantly clear that these tools could not stand alone, and that we would have to, instead of using big data, use thick data. And so: going smaller, providing really robust explanations around what's actually happening in these posts, and hiring young people as research assistants and annotators, so that we have

Beth Rudden (08:23)
Mm-hmm.

Desmond Patton (08:24)
a nuanced and deeper understanding of what's being communicated on the platforms. That was really important. It really took us pausing and slowing down. This couldn't be your fast big data thing. This had to be slow and methodical and careful. And one of the things that was always really complicated for me is that the N-word would consistently get flagged by

Beth Rudden (08:48)
Mm-hmm. Mm-hmm.

Desmond Patton (08:49)
NLP tools. And so I had to tell my white colleagues that the N-word is not necessarily an aggressive term in the Black community. And that's an awkward conversation, and yet one of the most important conversations. And I think it really changed the trajectory of our research and also our relationship, because what I learned is that this work requires trustworthiness. It requires community and relationship. It requires vulnerability. And so I had to feel comfortable with my colleagues

Beth Rudden (08:55)
Mm-hmm.

Mm-hmm.

Mm-hmm.

Desmond Patton (09:18)
to have these conversations, and believe that they would be heard and trusted and that action would happen so that we don't make these mistakes. And it absolutely worked out that way.

Beth Rudden (09:29)
My 21-year-old is back living in my house, and he is a War Thunder player. And he just totally lights up when he talks about all of the different tanks and the planes and all of these things. And he sent me this thing, and I can totally understand how any NLP would not understand it. It's a list of classified document leaks:

Desmond Patton (09:34)
Mm.

Mm-hmm.

Beth Rudden (09:53)
different people in military situations telling War Thunder that their game is not correct, that their turret actually goes left, not right, or something like that. And it's a great example of that. They use joy to diffuse some of the, you know, this is about understanding history and making sure that they have the best of the best understanding, even though, you know,

Desmond Patton (10:02)
wow.

Beth Rudden (10:18)
their World War I and World War II kind of things are declassified, but they have a lot of modern kind of things that my son is very, very interested in. But how do you not only discern grief from aggression, but also, like, threats from humor? That's got to be very, very tricky. How did you... Are you using any ontologies or any work to ground, you know, what is known as good? Like, how did you get through this?

Desmond Patton (10:46)
No, you know, I'm a social worker, and my groundedness comes from lived experience and active listening. Not the kind of listening that just happens with your ears; the kind of listening that is full-body, the kind of listening that listens for understanding and for deeper context. And so part of this meant that we had to take off our academic hats, leave them at the door, and just listen to people

Beth Rudden (10:49)
Mm-hmm.

Mm-hmm.

Desmond Patton (11:14)
share their stories, and dig deeper, and ask more robust questions, really try to understand cultural nuance. But I think the critical shift in the work was bringing in young people. Because I think the reality is, and I think we could all probably learn from this, that we need to approach this with humility and understand where our expertise ends and where the expertise of others begins.

And that was a really big shift, because when you're trying to get a PhD, you're trying to be an expert, right? And everyone thinks you're an expert. And that is counterintuitive to the kind of process that I think we need in AI, to actually be true listeners in this space. And so for me, the key component was building trustworthy partnerships and relationships with young people who would share their lives and share their stories and help us to understand when a post was just a lyric

Beth Rudden (11:58)
Mm-hmm.

Desmond Patton (12:04)
versus a post that we actually should lean into. And we wouldn't know that.

Beth Rudden (12:05)
Mm-hmm.

Yeah, that's.

Katie Smith (12:08)
Yeah, that nuance. It's tricky. And so you've developed this system, CASM: Contextual Analysis of Social Media. Can you share more about that, and how is it related to what you're discussing now?

Desmond Patton (12:26)
Yeah, so CASM came out of this understanding that we didn't know what young people were saying online, and that we brought our own biases to this work. And so we needed a process to face and confront our own biases, to hopefully get us to a place where we're annotating data sets, data sets comprised of social media text and images, in a way that brought forward full context and nuance.

And so it is really kind of a merger of a theoretical understanding of the importance of context as an apparatus, and also a methodological practice for how we annotate data. And so CASM is a seven-step procedure that we apply to annotating social media data that is all about the annotator. And it takes a kind of qualitative process. You are asked to go back to the social media post and to just extract context. Learn more about the poster. Learn more about what's surrounding their posting. Look at the engagement. Look for details that are not necessarily readily apparent but are hidden in the text, if you will. And so the annotator, before they can label a post, say as aggression or grief, before they can apply that

Beth Rudden (13:24)
Mm-hmm. Mm-hmm.

Desmond Patton (13:44)
label, they have to go through the seven-step procedure to learn more context. So here's how this works in the annotation process. We built an annotation system that we call Betas. You are given a social media post from a data set; say we got a youth-based data set from Chicago. And so the annotator is given a social media post, and they're asked to first define what's happening in that post with no context, so that we have a baseline understanding.

Beth Rudden (13:58)
Mm-hmm.

Desmond Patton (14:11)
Then we ask them to go through the CASM process, which says: go find context, context, context, context. So now they've spent hours looking for context. And then they're asked to go back to the same post and reinterpret that post with the context. And so that allows us to compare and contrast how their opinions and interpretations have shifted from having more context. And now, when they go off to label a social media post

Beth Rudden (14:24)
Hmm

Desmond Patton (14:37)
that will then be used to train algorithmic systems, that label has more weight. It's not just, I'm looking at text and I'm labeling. Now you have more of an educated guess on how you should label a post. Now, it's not 100% accurate by any means, but it certainly gets us closer to deeper meaning and, hopefully, labels that you can trust and that are informed by context. So that's CASM.
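As an editor's aside: the baseline-label, context-gathering, relabel loop described above could be sketched roughly as follows. The prompts and field names are hypothetical; the conversation does not enumerate the actual seven CASM steps.

```python
from dataclasses import dataclass, field

# A rough sketch of the annotation flow: a context-free baseline label,
# guided context-gathering, a relabel, and a comparison of the two.
# Prompts are illustrative stand-ins, not the real CASM steps.
CONTEXT_PROMPTS = [
    "Who is the poster, and what else do they share?",
    "What offline or neighborhood events surround this post?",
    "What do the replies and engagement suggest?",
    "Is the text quoting a lyric, meme, or inside reference?",
]

@dataclass
class Annotation:
    post: str
    baseline_label: str = ""                       # first impression, no context
    context_notes: list = field(default_factory=list)
    final_label: str = ""                          # reinterpretation after context work

    def shifted(self) -> bool:
        """Did gathering context change the annotator's reading?"""
        return self.baseline_label != self.final_label

# Hypothetical walk-through of one post:
a = Annotation(post="miss you bro, ride forever")
a.baseline_label = "aggression"                    # context-free first pass
for prompt in CONTEXT_PROMPTS:                     # hours of research per prompt
    a.context_notes.append((prompt, "notes..."))
a.final_label = "grief"                            # relabel with context in hand
assert a.shifted()                                 # the shift itself is data
```

Recording both labels, rather than only the final one, is what lets the team measure how interpretations move once context is added.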

Beth Rudden (14:44)
Mm-hmm.

Katie Smith (15:04)
That's incredible. Sorry, quick follow-up: how do you scale something like that?

Desmond Patton (15:10)
I don't know. That's a great question, and it's not something that we've done. I'm truly a qualitative person. And so it's interesting, because I do a lot of work with tech companies, and the question is always about scale. And I'm like, I'm not your person, because I go deep. But I do think what I have learned is that this particular process can be applied in multiple contexts, in multiple spaces. And we've had lots of people from different

Beth Rudden (15:37)
Mm-hmm.

Desmond Patton (15:38)
backgrounds. We've had librarians, educators, journalists, who are all interested in thinking about deeper meaning. And I think we've done two things. One is that this process can be adapted for you. You may not need all seven steps; maybe you just need one or two reflexive questions that you ask of yourself and of your team, that can be applied to a setting. So that's one way in which we've done this. And then we also created

Beth Rudden (15:41)
Mm-hmm.

Desmond Patton (16:05)
another platform. It's called InterpretMe, and we did this with the Teaching Systems Lab at MIT. We built this platform a while ago to help people go through a reflexive training experience where you can practice CASM in real time and get automated feedback through the systems that we've built in that space. And so you are presented with a social media post, and you interact with actors, paid actors, that

Beth Rudden (16:11)
Mm-hmm.

Mm-hmm.

Desmond Patton (16:32)
can walk you through different scenarios so that you get contextual information, and then you're asked to think about how you would respond to the post based on those interactions that you go through. And so I think InterpretMe, which you can Google and find online, can be one way in which this can be applied to different contexts, different companies, and so forth.

Katie Smith (16:51)
Love it. I can see ideas bubbling in your head, Beth.

Beth Rudden (16:54)
Well, I know, I think it's wonderful, and I completely appreciate that. And one of the things that I know is that we have to pay more attention to these theoretical edge cases, because everybody's getting it wrong: they think that they have, you know, the 80%, and they have no ability to handle variance.

Desmond Patton (17:11)
Yeah.

Beth Rudden (17:16)
Desmond, the one thing that I hear loud and clear is that the people who have the knowing of the culture in their body should be the ones who are doing the labeling, not somebody who is disconnected from it, or even a ball of statistics. A question on, like, all right, so what was the realization, or...

Desmond Patton (17:27)
Yes.

Yeah.

Beth Rudden (17:38)
You know what I really want to know? Who were your heroes? How did you get to where you are? Who did you have in your life? Did you have, like, a librarian or a pastor? Did you have somebody that taught you how to do social work in the way that you're doing it?

Desmond Patton (17:51)
So I got my MSW at the University of Michigan, and I would have to say that the way in which I approach my work came from my MSW program. It really came from some of the first social work classes that I took at Michigan. I learned about reflexivity. I learned about trustworthy partnership and engagement. I learned about honoring the dignity and worth of every human being. These were just kind of

Beth Rudden (18:14)
Yeah.

Desmond Patton (18:18)
principles that structured my training as a social worker. And so those professors are my heroes and I honor them in my work. I honor them in how we do our work because it really is the foundation for my approach.

Beth Rudden (18:26)
Mm-hmm.

That is just a huge shout-out. I mean, in order to call yourself a social worker, the amount of education and certifications that you have to have is way different than going through a six-week bootcamp to create an algorithm that decides whether you get a loan. And it is so frustrating to me to have that dichotomy out loud and proud. I mean, you're just a beautiful, beautiful example.

Desmond Patton (18:55)
Yeah.

Thank you.

Beth Rudden (19:02)
What made this urgent for you?

Desmond Patton (19:05)
Yeah, I mean, kids are dying, right? And, you know, my first entree into this space is that I observed two Black men from the South Side of Chicago, very well-known rappers, kind of having a beef online. And one of them posted their location on Twitter, and he was murdered in that location within three hours. And that wasn't the beginning, but that was

Beth Rudden (19:07)
Right. Yeah.

Desmond Patton (19:30)
kind of earth-shattering for me. And when I went to the literature to figure out what was going on, there was no literature about this phenomenon. And so with my colleagues from UChicago, where I did my PhD training, we wrote the first paper on this phenomenon, which is called internet banging or cyberbanging. I don't think people realize that, to this day, people are dying because of

Beth Rudden (19:37)
Mm-hmm.

Desmond Patton (19:56)
their engagement on social media. And my hope in this work, and how we do the work, is twofold. One is that we want to eradicate these deaths, and reduce them substantially. But also, on the flip side, there are ethical issues to this work as well. And so one of the scary parts of this is that these processes can be misused. They can be used to hyper-surveil Black and Brown communities. They can be,

Beth Rudden (20:07)
Mm-hmm.

Mm-hmm.

Mm-hmm. Mm-hmm.

Desmond Patton (20:21)
you know, if you don't work with young people, if you don't work with communities, if you don't understand what they're saying, it's easy to make the wrong decisions. And we're kind of seeing similar approaches happening with ICE right now. There's a lot of: you did this, you said this, now we take you in, and then it's a wrap. And this is kind of what has been happening in the gun violence space for over a decade as well. So this is serious.

Beth Rudden (20:33)
Mm-hmm.

Yeah, recidivism also. What have you learned about how grief shows up differently online for Black and Hispanic youth?

Desmond Patton (20:47)
Yeah. Yes. Yes. Yeah.

Well, I think one of the important things to underscore is that we all grieve, and that is the starting point. And what I learned is that we all grieve based on our lived experiences and histories and cultures, right? And so, for the young Black folks that I work with, grieving happens in a multiplicity of ways. It happens through posting lyrics. It happens through video. It happens through

Beth Rudden (21:11)
Mm-hmm.

Desmond Patton (21:25)
memorializing pages that are constructed. It can be hyper-expressive and really communicative. There's a lot of expression and a lot of creativity that can happen in this space as well. And we may not always use the same words. We may not even say the word grief, right? And so I think what I've learned from this work, and the beauty of it, is that folks,

Beth Rudden (21:41)
Yeah.

Desmond Patton (21:49)
even when it's hard, right? Even when it's hard to, like, say the thing. I think we always talk about the negative aspects of social media, but what social media can also do, especially for people who struggle with expression, is allow for various forms of expression that are quite healthy, right? And this is really complicated when you live in a neighborhood with high rates of gun violence, and maybe in your lived experience, grief is often and frequent.

Beth Rudden (22:07)
Mm-hmm.

Desmond Patton (22:16)
And that can perhaps lead to prolonged grief challenges, right? But social media can create a space where folks can, in real time, whenever they want to, just say things, write things, draw things, create videos, create music, and share it, and get interaction and engagement. And I don't think we know enough about the utility of that kind of bi-directional experience in terms of

Beth Rudden (22:40)
Mm-hmm.

Desmond Patton (22:43)
healthy and promotive ways of showing up online.

Beth Rudden (22:47)
Super important.

Katie Smith (22:48)
It's such an important way for us to connect. You know, as a queer, nonbinary, neurodivergent person, we've talked about this before: the way that I had to find a date was to put an ad in the LA Weekly, and then it evolved into, you know, now we can actually meet each other online. But then it's crossed over, where the monetization of these platforms

Desmond Patton (23:02)
Hmm.

Mm-hmm.

Beth Rudden (23:10)
Mm-hmm.

Mm-hmm.

Katie Smith (23:11)
has corrupted

the experience, right? We're just not able to have those authentic experiences quite like we used to. You know, I've said before, I have friends who met in an AOL chat that are still married, you know? There are these beautiful experiences that have happened. It's just, we have to get back to that somehow.

Desmond Patton (23:14)
Yeah. Yeah.

Yeah.

Mm-hmm. Yeah.

Yeah,

yeah, I completely agree.

Beth Rudden (23:33)
That's the bellwether. And one of the questions that I have for you is: how have systems like law enforcement, public health, and education typically responded to expressions of grief that are different, or that are non-traditional?

Desmond Patton (23:50)
Yeah, it's so different, right? Because I think law enforcement systems are very much in tune with this from a very clear legal perspective: identifying bad actors, creating pathways for the collection of evidence to prove that you've done something wrong. And so your expressions are up for debate, they're up for interpretation,

and have now been cataloged as evidence without any rigorous adjudication of whether that interpretation is true. And in other spaces, I think there's some utility. So one of the things that I talk a lot about is how we might use these interpretations to help people get better treatment.

Beth Rudden (24:20)
Mm-hmm.

Desmond Patton (24:36)
If you want to get better, if you're struggling with grief and you go to a therapist, but they don't understand your nuanced ways of expressing grief, then you may not get the treatment or the support or the medication that you need, because there's a misalignment there. And I think the same can be applied to any other system. And so I think...

We are in a space where we need more work to help with explanation, to help with interpretation, and to really think carefully about how these things are applied in our society. But it's been hard to figure out that sweet spot, because in tech companies,

this need to focus on scalability oftentimes reduces some of the deeper connection that I think is really important. And then in academia, sometimes we move too slow, right? Or we don't have enough resources to actually do the kinds of things that we want to do, especially now with the lack of grants to apply for. So I...

Beth Rudden (25:33)
Thank

Desmond Patton (25:43)
I am hopeful, because I keep meeting great people like you all who ask these really deep and important and thoughtful questions. And everywhere I go... I've been doing this work for, gosh, 12, 13 years now, and there are definitely more people who are tuned in, who are listening, who are trying. And that's, for me, hopeful.

Beth Rudden (26:01)
Yeah.

Katie Smith (26:04)
Who is doing this work? Because, you know, one of the things that, as I've learned more about you over the time that we've known each other, is that this incredible discipline that you're developing with community, with these youth councils, is so useful to all the different platforms. And I know you advise, but how is it being applied? Who else is doing this work?

Desmond Patton (26:20)
Yeah.

And are you thinking about individuals or companies and organizations or all the above?

Katie Smith (26:31)
I'm thinking about how the work that you're doing is actually being applied to the major social platforms now. Like, are they actually seeking... I'm hoping they're seeking you out.

Desmond Patton (26:37)
Yeah.

Yeah, I have had the great fortune of working with almost all the major social media platforms. Back when Twitter was Twitter, I sat on the academic advisory, and there were some amazing people there at the time who were thinking about these issues. I'm currently on TikTok's content advisory group

for the US, and they are always asking and trying to think thoughtfully about: how do we bring more context to how we make decisions about content, and how do we work with youth and community members? I sit on Spotify's safety advisory, and we engage in really reflexive conversations around what safety means and how it applies to how we think about podcasts, if you will.

And I'm also on Axon's ethics and equity advisory. And I just came back from Seattle last week where we were having these really great conversations about some of the new products and how do we center the voices of community. And what I have loved about all of these experiences is that usually there is a C-suite connection. And so this isn't just conversation for conversation sake.

Beth Rudden (27:48)
Mm-hmm.

Desmond Patton (27:50)
They report directly to a C-suite-level person who has a budget and the power to actualize these things. I think oftentimes... I'm not trying to toot the horn of these companies, but in an era where DEI is just violently attacked, these companies are staying the course, and they're still putting money

Beth Rudden (28:07)
Mm-hmm.

Desmond Patton (28:10)
and energy and action behind these things. And I think that should be applauded.

Katie Smith (28:15)
Yeah, definitely. Yeah, I

agree with that.

Beth Rudden (28:17)
If you had to wave a magic wand, if you had the power to change these social media companies...

Is there... and this is gonna be leading, but, like, don't they realize that thicker, smaller, higher-quality data is far better for the people, and for a product, and to create a richer experience? Like, is there some understanding of that?

Desmond Patton (28:45)
Yes, and it doesn't make money, right? And so I think, to answer your question, these things have to be incentivized. They hire brilliant folks; it's not that people are necessarily unaware. But one of the crucial gaps that I see is the absence of true research alongside product development. And usually, when I am advising folks, I'm

Beth Rudden (29:05)
Mm-hmm.

Desmond Patton (29:10)
just basically giving them research 101, in particular, integrating participatory action research strategies and methods to make sure that if you truly care about community, then you start from the beginning. You don't wait until stuff has hit the fan, and now you have a safety concern, and now you want to talk to community. We start from the beginning, and it is this iterative process that moves throughout the product.

Beth Rudden (29:25)
Mm-hmm.

Desmond Patton (29:38)
life cycle. But that takes time. That is not always in alignment with priorities for companies. And companies move very, very fast, and they're always competing; they have to. And I think the chief challenge is that they're beholden to their stakeholders. And so I think we have to help them reimagine various incentive structures

so that they can then put this on the same level as their everyday products as well. And that takes, yeah, there you go. Start your own thing, yes.

Katie Smith (30:11)
Did you have a follow-up?

Beth Rudden (30:12)
I mean, what metric would you choose to change? Like, have you thought about... because here's the deal: it could make money, but the current models require large volumes of data, which drives the incentive for people to harvest data. So if they could give richer experiences using high-quality, thick data, which I love, by the way,

Desmond Patton (30:21)
Mm-hmm.

Beth Rudden (30:38)
How can we get that to be the new metric? And then that's better for the environment.

Desmond Patton (30:43)
Yeah, I think we need really strong use cases that show that when we have thick data, you actually are getting closer to the lived human experience. And hopefully you're opening doors so that everyone's identity and way of life is represented in the work. And that's probably going to lead to more engagement. I think a part of this is truly understanding what humans want and need. And I think

Beth Rudden (31:08)
There you go.

Desmond Patton (31:10)
they have tapped into that through advertising for sure. And yes. Yes. And for me, that doesn't exclude a way of approaching this work that is true to inclusion and diversity as well. And so I think both can happen.

Katie Smith (31:15)
They got really good at the thumb motion. They got very good at that.

Beth Rudden (31:32)
Mm-hmm.

Desmond Patton (31:36)
But it takes folks... I think it takes a reimagining of the work and how we do the work together. And so my friend and colleague, Courtney Cogburn, who you should think about talking to, talks a lot about these untethered imaginations: untethered from supremacy and toxic ideas, untethered from capitalism, just a freedom to reimagine possibilities that aren't connected to the now.

Beth Rudden (31:52)
Mm-hmm. Mm-hmm.

Desmond Patton (32:03)
And wouldn't it be wonderful if we could all just kind of spend some time doing that thinking, and then come back and think about what's possible? But I think so many folks... the train is moving so fast that they don't think they have the time to do that.

Katie Smith (32:20)
Hmm.

That's exactly the kind of conversations we want to have on the podcast. You know, like, this is it. Like, we don't have to buy into the narrative that's happening right now. You know, this is a moment where we can just go: nope, not that. What do we want to see? What do we need and want? What does that future look like? You know, this is why I like what Beth is working on. It really is a reimagining, and just a completely different, like, a shared narrative shift on AI.

Desmond Patton (32:24)
Yeah. Yeah.

Beth Rudden (32:24)
Mm-hmm.

Desmond Patton (32:29)
Yep.

Yep.

Yeah.

Katie Smith (32:47)
I know this is like an interview, but I want you guys to talk for other reasons.

Desmond Patton (32:49)
Yeah,

we love that.

Beth Rudden (32:52)
Yeah,

I mean, Desmond, what I know is that when I first started, people really didn't understand there's an alternative. Like, they didn't even know that you could show your work, or you could give the reason why the AI said this. And too many white dudes doing the training, I mean, that's the example I use all the time with Alexa.

Desmond Patton (33:00)
Mm-hmm.

Hmm.

Yeah.

Beth Rudden (33:13)
Like, why do you think it didn't work for women and young boys? This is so obvious to me. I'm also a trained anthropologist, so I did ethnography, and that was when I saw the data scientists shoving all the data, without even looking at it, into the model. I was like, there's something... there's so much more you could do if you validated the data

Desmond Patton (33:16)
Hmm.

Mmm.

Yeah.

Beth Rudden (33:35)
and used statistics and, like, qualitative techniques. What do you think?

Desmond Patton (33:37)
Yeah.

Beth Rudden (33:41)
What do you think the tipping point could be for people to really understand and slow down? Because something Katie and I talk about a lot is: we get one body, one life. As people get older, there are limitations to what you really want to do. And there is something where, when you have so much...

Desmond Patton (34:01)
Mm-hmm.

Beth Rudden (34:06)
It's like it's rested, it's corrupted, it's not good unless you're giving back to communities.

Desmond Patton (34:09)
Yum, yum.

Yeah. Yeah, I think it's exactly what you two are doing. I think we need to tell more stories about possibility, about imagination, and to tell stories that are complicated, that are problematized, that are not black and white. I think we need more of that. And we need that to surface to the top.

Because I'm really concerned about our digital life and how we show up in digital spaces. And one of the concerns that I have in particular is around the conversations that we have around the Palestinian and Israeli war. And I watch every single day on social media the articulation of language that says you're either left or right.

Beth Rudden (35:01)
It's so good.

Desmond Patton (35:02)
And

it is continuously pushed in a way that further exacerbates conversations that say you have to be here or there. And I think it doesn't make any sense to me why we can't have systems that help us to find commonalities in our connection. We don't always have to agree, but can we push commonalities? Can we push synergies? Can we push

Katie Smith (35:11)
Right.

right?

Desmond Patton (35:26)
points of connection? Because the polarization is harmful, and it's not solving any of the problems. I wish these platforms would think deeply about how they can use all the power that they have, all the resources that they have, to bring people together. And yet I don't see it. I don't... every single day it gets worse. So I'm really concerned about that.

Katie Smith (35:49)
Yeah.

It's been one of the... I mean, there are many reasons that I left Meta. That was a big one, because I'm like, I just don't even see myself in these conversations anymore. I was in the social justice movement for over a decade, and my whole feed is doom scrolling, the whole, everyone's upset and everyone's upset with you. It's like, oh no, we cannot do this. We have to... One of my favorite things to do is actually go play billiards, go play pool. And the thing about playing pool is that you will just show up and you will play with anyone.

Desmond Patton (35:58)
Mmm.

Mm-hmm. Yep.

Hmm.

Mm-hmm.

Yeah, yeah. Yeah.

Beth Rudden (36:19)
Mm-hmm.

Katie Smith (36:20)
you're gonna end up talking to that person

who you're not going to agree with, right? One of my favorite things to do is to reveal at the end that I'm queer and non-binary, and because they like me first, they go, shit.

Desmond Patton (36:25)
Yo.

Mmm.

Right.

Beth Rudden (36:34)
You got a

Katie Smith (36:36)
You don't look like the trans

people that I see on TV or whatever. I'm like, you know what? All of us are just human. We're just living our lives, man. Like, come on.

Desmond Patton (36:43)
Yeah, yeah,

for sure.

Beth Rudden (36:46)
Well,

and I think I learned so much from my own children and the channels that they're getting information from. My son, who is, like, obsessed with War Thunder and all these things... it's not for violent reasons, it's because he wants to know all the variations of the planes and all these things. But he listens to YouTubers like CGP Grey, who does explainer videos on

Desmond Patton (36:51)
Mmm.

Hmm.

Beth Rudden (37:12)
amazing things, like why people are polarized. And you know, then there are, you know, these... My niece works for a large gaming company, and she is a community manager. And she's like, Beth, there are easily five to six amazing things for every bad thing that is out there. There is that balance. So I think you were dead on, like, how do we incentivize those, like...

Desmond Patton (37:14)
Mm-hmm.

Beth Rudden (37:39)
Who was it? John Krasinski, who did Some Good News during COVID, where, you know, you're just trying to give the good news. And I told my friend that, yes... you know, yesterday, I live in Denver, so somebody's like, you know, did you see the terror in Boulder? Are you okay? And I'm like, I'm planting flowers, there are six neighbors that I said hello to today. I recognize that there is some person, but why does that news agency

Desmond Patton (37:43)
Mm-hmm, mm-hmm, mm-hmm.

Mm-hmm.

Beth Rudden (38:06)
have to blast that? I had somebody from Germany ask me about it this morning. And I'm like, hold on. You have, like, the person who is on the pedal of the news exacerbating all of the fear. And I limited my Meta Facebook thing way back in the day, because I am very aware of what that does

Desmond Patton (38:26)
Yeah.

Beth Rudden (38:31)
to my own body. It makes me feel gross and I can't live there. And this is where... I know all tech companies do not wanna hear this, but people are quietly quitting, like, all these platforms, because nobody can handle that level of stimuli for that long. It's unhealthy.

Desmond Patton (38:32)
Yeah. Yeah. Yeah.

Yeah,

very much so. I couldn't agree with you more. And that's why, maybe about six months ago, I started just posting about joy. I started with lessons from my class, but I really just wanted to offer something else: something else for people to read, something else for people to engage with. And I can't tell you how transformative it has been. I'm a gun violence researcher, and it

Beth Rudden (39:02)
Yeah.

Desmond Patton (39:15)
has shifted how people see me. People come up to me and talk about joy. And I love it. They see me as the joy person. I'm like, I have changed my whole identity just by being vulnerable and putting myself out there and trying to connect with other people. And I just came back from a human connection to AI summit at Lone Rock in Bailey, Colorado, and it was one of the most transformative experiences, because we focused on the human connection piece.

Beth Rudden (39:35)
Mm-hmm.

Mm-hmm. Mm-hmm.

Desmond Patton (39:43)
AI was

a through line, but it was not the center. It was about how do we help young people have a life where human connection is a part of the AI experience. I think I would love to see us continue work that promotes human connection in this space, and not just focus totally on AI.

Beth Rudden (40:05)
That is a new beginning. That's right. I mean, a friend of mine says AI is the electron microscope for the human condition, and we're using it wrong.

Desmond Patton (40:07)
Yeah.

Mmm.

Mm-hmm, mm-hmm, mm-hmm,

mm-hmm.

Katie Smith (40:21)
You know, your joy posts are just fantastic, Desmond. No, really. When I see them, and I want you to know I'm sure this is true for others, it gives me that little positive dopamine hit, you know, not the doom scroll. It gives me...

Desmond Patton (40:25)
thank you.

Katie Smith (40:36)
what are we thinking about with joy today? And what I love about your production of these notes, which are just very digestible, is that there is so much vulnerability in it. Like, you really do talk about what you're struggling with, and why the intentionality of joy is so important to you, and therefore should be important to all of us. I just want you to know that's actually had an impact on me, and I've changed some of my behavior because of that. Like, my dog, for example, is my joy plant.

Desmond Patton (40:57)
Thank you.

I love that!

Katie Smith (41:05)
And so I think about it a lot because of you, Desmond, and I'm sure that's true for

many people. So, like, the work you're doing with JoyNet, just the work you do and how you show up every single day... I have so much deep admiration for how you show up in the world. Thank you so much for what you do.

Desmond Patton (41:18)
Thank you.

Thank you. Thank you.

Beth Rudden (41:23)
Such a pleasure to have you on. I wanna see if we can figure out how to get those five good examples out faster, or better, or more prevalent. Because it's not gonna change unless we all decide to decide where we wanna put our energy and where we wanna put our attention. And I am...

Desmond Patton (41:32)
Mm-hmm.

For sure.

Beth Rudden (41:46)
I am desperate for that to happen sooner rather than later. It makes you feel better.

Desmond Patton (41:48)
Yeah. Yeah, I

know, I appreciate that. I really enjoyed this conversation. It's been such a refreshing thing to talk to people who also care about human connection. And I think there's a movement happening. It's not a big, flashy movement, but it's happening. And I think we just have to continue to find our people,

Beth Rudden (42:06)
Mm-hmm.

Desmond Patton (42:12)
find places of connection, grow, learn, share together. And I think we'll get somewhere.

Katie Smith (42:18)
Amazing.

So, where can people find you? Anything you want to just sort of shout out?

Beth Rudden (42:19)
Amazing.

Desmond Patton (42:24)
Yeah, so if you're interested in the work of my lab, it's SAFELab at UPenn. If you Google that, you'll find out more about our research and our efforts there. I'm also on Bluesky and LinkedIn, so feel free to connect with me there; it's just my name. And the joy plan, yeah, for sure.

Katie Smith (42:39)
and find the joy plan, everyone. It's

definitely something to look into. It works. Thank you so much, Desmond. Thanks for joining us.

Desmond Patton (42:47)
Thank you.

Beth Rudden (42:48)
Thank you so much