And We Feel Fine with Beth Rudden and Katie Smith

In this premiere episode of And We Feel Fine, Beth Rudden and Katie Smith crack open the big questions: Who owns your data? What does ethical AI look like? And how do we reclaim digital spaces for safety, joy, and human connection?

Through stories, sharp critique, and vision-casting, they explore how ontologies shape AI systems, the hidden harms of bias in tech, and why personalization can be a force for good—if driven by care, not clicks. They call for a future where technology supports thriving communities, not extraction.

This is not just a conversation about AI. It’s a conversation about us—what motivates us, what we value, and how we build a more humane internet from the ground up.

🔑 Topics Covered:
  • How Big Tech extracts and profits from your data
  • What ontologies teach us about understanding people and systems
  • Cognitive bias in AI—and how to design for cultural nuance
  • Data privacy, ethical marketing, and digital safety
  • Reimagining social media beyond toxic engagement loops
  • The future of work and the call for a care-centered economy
  • Everyday miracles, digital dignity, and collective power
📌 Key Takeaways:
  • Data isn’t neutral—it’s powerful, and it’s personal.
  • Cultural bias in AI can be mitigated with better ontologies and diverse input.
  • We need a new digital economy grounded in human connection and care.
  • The business model of the internet must shift from extraction to trust.
  • Personalization can be beautiful—when we drive it, not algorithms.
  • Reclaiming our online lives means rethinking the system itself.
⏱️ Chapters (Timestamps):
  • 00:00 Palantir as an Example of Big Tech's Data Extraction
  • 07:34 The Role of Ontologies in AI and Human Understanding
  • 21:00 The Future of AI and Its Societal Implications
  • 33:55 Safety and Moderation in Online Spaces
  • 40:17 The Pursuit of Happiness and Community
  • 46:52 The Need for a New Economy
  • 56:56 Building Genuine Relationships Online

Creators and Guests

Host
Beth Rudden
Pronouns: she/her
Beth Rudden is the CEO and Founder of Bast AI, where she’s designing explainable, personalized AI that puts human dignity at the center. A former Distinguished Engineer and global executive at IBM, Beth brings 20+ years at the intersection of anthropology, data science, and AI governance. Her mission: make the next generation of intelligence understandable, accountable, and profoundly human. She’s helped reshape tech in healthcare, education, and workforce systems by applying ontological natural language understanding—yes, it’s a mouthful—to build AI that reflects cultural nuance and ethical intent. Beth is the author of AI for the Rest of Us and a global speaker on AI literacy and the future of power. On And We Feel Fine, she brings curiosity, clarity, and contagious optimism to every episode. With Katie, she explores what it means to end well, begin again, and build something truer than what came before.
Host
Katie Smith
Pronouns: they/them
Katie Smith is the Co-Founder and CEO of Humma.AI, a privacy-first, empathy-driven platform training culturally competent AI through community-powered data. Their unconventional journey began in the online adult space, where they held executive roles at Playboy and leading video chat platforms—gaining rare insight into how digital systems shape desire, identity, and power. Later, Katie turned those skills toward public good—leading digital at the ACLU National and crafting award-winning campaigns for marriage equality and racial justice. Now, they’re building tech that respects consent, honors community, and shifts power back to the people. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a genre-defying exploration of trauma, identity, and transformation. A queer, nonbinary, neurodivergent thinker and builder, they bring systems-level thinking, futurism and humor to And We Feel Fine. Expect honest conversations about what’s ending, what could begin, and how we co-create tech—and futures—worth believing in.
Producer
Alexia Lewis

What is And We Feel Fine with Beth Rudden and Katie Smith?

At the edge of collapse—and creation—two unlikely co-conspirators invite you into a radically honest conversation about the future. This isn’t just another tech or self-help podcast. It’s a story-driven exploration of who we are, what we value, and how we might reimagine the world when the systems around us stop serving us. We blend personal storytelling, cultural critique, and deep inquiry into what it means to be human in an age of AI, uncertainty, and transformation. We’re asking better questions—together.

Because the world is changing fast, but maybe that’s precisely what we need.

Hosted by Beth Rudden and Katie Smith, two builders of systems and challengers of the status quo. Beth is CEO of Bast.AI and a globally recognized expert in trustworthy AI, with decades of experience leading data and ethics at IBM. Katie is the founder of Humma.AI, a strategist who drove innovation and revenue growth at major global brands before turning to human rights and technology for social good. Together, they make complex issues, such as AI and its impacts on everyday people, clear, personal, and impossible to ignore.

Beth Rudden is the CEO and Founder of Bast AI, a pioneering company building explainable, personalized AI for good. With over two decades of experience as a global executive and Distinguished Engineer at IBM, Beth blends anthropology, data science, and AI governance to create tools that amplify human dignity and intelligence—not replace it.
Her work spans healthcare, education, and workforce transformation, using ontological natural language understanding (NLU) to make AI transparent, accountable, and accessible. Through Bast AI, Beth is reimagining how organizations deploy AI that’s not only accurate but aligned with ethical values, cultural context, and cognitive well-being.
Beth is also the author of AI for the Rest of Us and a passionate advocate for AI literacy, epistemic diversity, and the right to understand the systems shaping our lives. She speaks globally on the future of AI, power, and social contracts—and believes we’re all stewards of the next intelligence.

Katie Smith is the CEO and Founder of Humma.AI, a privacy-first platform building community-powered, culturally competent AI. With over two decades of experience leading digital strategy and social innovation, Katie blends systems thinking, Responsible AI, and storytelling to create tools that serve dignity, not domination. Their work spans mental health, civic tech, and digital rights, using participatory AI to make systems safer, fairer, and more accountable. Through Humma.AI, Katie is reimagining how people and businesses engage AI that’s accurate, inclusive, and governed by consent and care. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a provocative exploration of identity, trauma, and transformation. They speak globally on the future of technology, power, and justice—and believe human empathy is the intelligence that will define our time.

Subscribe to our Substack for bonus content: https://substack.com/@andwefeelfine

Beth Rudden (00:39)
I've known who they are since the start because they're one of the few companies that use ontologies.

And the way that they use ontologies is very different than the way that we use ontologies. And it's so super clear. But when people ask me who is my true competitor or why people aren't doing this, I point them to Palantir and I talk about how much control you can have over a human being if you understand what makes them tick. And so if you understand when a

13 to 17 year old girl is feeling insecure, you can sell her beauty cream or whatever, you know, at that moment in time because, you know, she's feeling sad. So, you know, that is a very powerful tool. So do you want to use that tool to make people understand how they are feeling and maybe give them a vocabulary?

of how they are feeling or do you want to use that information so that you can sell it to advertisers who can sell them whatever they're advertising and make lots of money off of the, you know, the feeling of insecurity that a 13 year old might feel.

Katie Smith (01:52)
Yeah, I like this because I think this is where our missions, I mean, we're aligned on so many different levels, but I think this is when our missions sort of, like, you know, collide, is this idea that Big Tech is extracting all this data to then use it to extract more out of us. Right. And with Humma, we're saying, let's just stop that. Let's, you know... our data is actually gold.

Beth Rudden (01:59)
Yep.

Katie Smith (02:17)
And if we understood that value, we would treat it differently. We would treat our information differently, right? And we would expect more, right? So yeah, that's interesting. Tell me more about Bast and how they're different and how you all are different and better.

Beth Rudden (02:33)
So we've always wanted to use ontology and an ontology is a formal knowledge graph, but it's also the study of the nature of your reality based on the language you use. So it's like your worldview. And you'll hear a lot of times that like LLMs don't have a worldview. And so what we do is we ground a worldview based on good

solid information like a para rescue medical protocol or a construction manual or a set of EMT codes, paramedic codes or whatever. But it's highly baked, good information that many people have been trained on. And so we use that grounding for the LLM to have a worldview or context to understand against.
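
A minimal sketch of the grounding pattern Beth describes here, in Python. The toy triples and the keyword-matching retrieval are illustrative assumptions, not Bast AI's actual pipeline:

```python
# Hypothetical toy ontology: (subject, relation, object) triples drawn
# from a vetted source, e.g. a paramedic protocol.
ONTOLOGY = [
    ("tourniquet", "treats", "severe limb bleeding"),
    ("tourniquet", "is_a", "hemorrhage control device"),
    ("severe limb bleeding", "is_a", "life-threatening condition"),
]

def ground(query: str, ontology=ONTOLOGY):
    """Select triples whose terms appear in the query, so the model
    answers against vetted context instead of free association."""
    q = query.lower()
    return [t for t in ontology if t[0] in q or t[2] in q]

def build_prompt(query: str) -> str:
    facts = "\n".join(f"- {s} {r.replace('_', ' ')} {o}" for s, r, o in ground(query))
    return f"Answer using only these vetted facts:\n{facts}\n\nQuestion: {query}"

print(build_prompt("When is a tourniquet indicated for severe limb bleeding?"))
```

The grounded prompt is what would go to the language model: the ontology supplies the worldview, the model supplies the phrasing.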

What I think a lot of people don't understand or comprehend is that the large language models appear to have understanding, but they don't. They don't actually understand the meaning of the words. So it's tragic when a human being is building a relationship with this large language model

and/or Replika or any of these AI systems that are based on generative AI, because they're literally divorcing the data from the context and then using that data in a way that is inappropriate and inorganic. And just to give you an idea of, like, the alienness of the intelligence, a large language model

Katie Smith (04:14)
Mm.

Beth Rudden (04:24)
remembers everything that you've ever done, but it has no ability to specifically recall that memory. And so there's like a sensitivity in that all of your data is there at all times, but it takes kind of a forensic expert to pull out that exact level of specificity, but that is coming. And that's a lot of what

Palantir can do, because they use a similar notion of giving ontologies the rules in order to control what pieces of information are pulled up, or worse, what a human being is likely feeling through inference. There's transitive inference, there's abductive inference. It's inferences.

The best way that I know how to describe it is like these are the shortcuts that we make when we see somebody, we make a generalization about them. We have stereotypes. Those are shortcuts that we make. And those shortcuts are based on our own cultural, environmental, socioeconomic, our own understanding of our world, right? Back to what is the reality that we have.
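
The transitive inference Beth mentions is easy to make concrete. A toy sketch in Python, assuming a hypothetical is_a hierarchy (the fact set is invented for illustration):

```python
# If (a is_a b) and (b is_a c), infer (a is_a c): the kind of
# "shortcut" an ontology-driven system can chain automatically.
FACTS = {
    ("poodle", "dog"),
    ("dog", "mammal"),
    ("mammal", "animal"),
}

def transitive_closure(facts):
    """Chain pairs until no new inference appears (a fixpoint)."""
    closure = set(facts)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))  # newly inferred fact
                    changed = True
    return closure

for a, b in sorted(transitive_closure(FACTS)):
    print(f"{a} is_a {b}")
```

The inferred shortcut here (poodle is_a animal) is harmless; the same mechanics applied to assumptions about people is exactly the bias Beth is describing.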

Katie Smith (05:33)
Mm-hmm.

Beth Rudden (05:47)
So if we've only seen pictures of Middle Eastern men that have the gun belts on them, and then we encounter a Middle Eastern man in an advertisement, we're going to have a reaction that is predictable because we're going to activate our own inference or our own bias because we have only understood that Middle Eastern men have gun belts on them.

Katie Smith (06:17)
Mm-hmm.

Beth Rudden (06:17)
And

I mean, I'm talking about, like, maybe a five year old or something like this, but like that's how humans... I mean, you know, I'm not trying to say that's what it is. I'm saying that somebody who has never been exposed to a wide array of Middle Eastern men, you know, being around their families and cooking dinner and changing diapers and, like, you know, doing all the things that I know Middle Eastern men do. Oh my gosh, you know, sacrificing and, you know, going without food and...

Katie Smith (06:22)
No, we definitely do have interest in this.

Beth Rudden (06:47)
doing things for religion and the deep, deep understanding of poetry. There's a whole, whole different world when you have more understanding. And so Bast uses AI and ontologies for understanding, whereas with Palantir, you can codify rules that say: if somebody has never been exposed to...

anything else, then let's use this bias that this person has to really activate and outrage them so that they're going to engage with whatever social media, whatever advertisement, whatever they're doing. That is normal. That is, you know, something that

I mean, I don't want to say that it's normal, that everybody knows this, because people do not know this. This is not taught in schools. We should definitely teach this in school. Yeah, but there are 180 cognitive biases. And so think of that as, like, 180 shortcuts that we make. Think about the errors that we make, you know,

Katie Smith (07:43)
because yes, it's possible. Yes.

Beth Rudden (08:00)
literally every day, because we're blind. We have a cognitive bias that was ingrained in a way that is very difficult for us to see unless we are training ourselves to see it. And so a lot of my goal and mission and, you know, why I wanted to show people that there is a way to do this, is because there's a way to use it to help a mom

Katie Smith (08:13)
Yeah.

Beth Rudden (08:28)
who doesn't have the vocabulary, who has maybe four kids under 10, understand that she has the skill of negotiation. She has the skill of supply chain. She has the skill of logistics. She has the skill of critical thinking, understanding how to do conflict resolution, how to defuse situations. She might not have that vocabulary. And so why not use an ontology

Katie Smith (08:55)
Mm-hmm.

Beth Rudden (08:58)
that says, hey, I know enough about you with one sentence that you're a mom with four kids under 10. Here are the things that most moms with four kids under 10 have as far as skills. And then here's how these skills could translate to that job you wanna go get down the street. That's a way to use ontologies to understand or, you know.

bank of a river versus bank that you keep your money in, or... like, there are so many misunderstandings of basic English words. Imagine translating this, you know, with different cultures and different understandings. And what I found in the customer surveys and the work that we're doing is that

In emergency medicine, there's a golden hour of time where you have one hour to kind of assess a patient, make sure... and would you talk about this a little bit? I know that you're a little bit of an expert.

Katie Smith (10:08)
Not an expert. I was an EMT after high school and I haven't been recertified in many years. But I will say I like this analogy that you used earlier, which is sensitive, right? Because we're talking about a group of people. We're talking about Middle Eastern men. And it reminds me of when I was in Egypt. I took this wonderful trip, this DNA journey, you know, to 13 countries in 18 months. And I ended up at the Red Sea because I was like,

Beth Rudden (10:23)
Mm-hmm.

Hmm.

Katie Smith (10:36)
traveling to all my spots, I could track my DNA, and I have very distant Coptic Egyptian family, which I'm very proud of. It's a wonderful history. And this guy saw that I was reading this book; I had the goal to be reading a particular book, God Is Dead. And so I'm reading this, right? I was between research and I was, like, having dinner and reading this book.

Beth Rudden (10:58)
Mm-hmm. Nietzsche.

Mm-hmm.

Katie Smith (11:06)
And the guy, the waiter, comes up to me, and I had built a rapport with him. He comes up to me and he goes, that's an interesting book that you're reading. Can you tell me more about it? And I'm like, I would love to talk to you, cause I was actually interviewing lots of different people. I'm like, would you be interested in having, like, a detailed conversation after work? So he comes and he meets me after work; of course he doesn't drink, we're just having tea. And he's like, why do Americans hate us? And he says it with so much vulnerability. He's like, I just don't understand. He's like,

I am a father, I am a religious man, you know, like I care about my kids. He's like, I don't understand these movies, like we're always the villain, you know? And it's an analogy for large language models, because large language models are like a Hollywood movie. It's just nonsense. There's nothing true. And you know, some people say fiction is the most truth, but in this case, when we're talking about culture, when we're talking about like individuals,

Beth Rudden (11:38)
Mm-hmm.

That's right, that's right.

Mm-hmm.

Mm-hmm.

Katie Smith (12:06)
Like, he didn't see himself in any of the Western media, especially the movies and all the news, right? That was happening in the Middle East. He felt completely unseen. And so, like, is his story in the LLMs? Absolutely not. Of course it's not. And so there are these generalizations, which is how I think we've gotten this bias and the discrimination. But to your point, it's just baked into the models, because they scraped the internet and there's no context.

Beth Rudden (12:12)
We did a study and this was back when I was at IBM and I wanted to give people a way to truly understand how difficult it is to understand social bias. And when you're a data scientist and you go into Kaggle and you are putting together a bunch of the data and the models and all this stuff, one of the most used data sets is the Titanic data.

And because we were looking at things in a little bit of a different light, I had designers and artists who were part of the team. And I wanted them to create a ticket that showed whether somebody would be able to get on a lifeboat or not as they're entering into the Titanic. And so we had the data scientists doing the predictions, using the data, and then we had the...

designers designing these tickets that somebody would see. And it was beautiful, with, like, a woman and her kids or a man from first class. And so of course people got it, and they got it really quickly, because they're like, Beth, a man in first class is more likely to get on a lifeboat

Katie Smith (13:36)
Yes.

Beth Rudden (13:59)
than a woman from steerage with two children. And what is interesting about the data is... there's this really great, beautiful story about Dorothy Day. She's standing on the other side of San Francisco when the big San Francisco fire is happening. And she recognizes that there are no atheists in foxholes. And when things are really going to pot,

Katie Smith (14:02)
Right.

Beth Rudden (14:28)
like, human beings will help one another. So a lot of the women and children from steerage did get on lifeboats, but because of the social bias of the time, it was perfectly acceptable for a woman who just needed transport in steerage to be like, yeah, I have no chance of getting on, you know, the guy from first class is gonna get on the lifeboat before me. And that was just an acceptable understanding of the time. And everybody got it.
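
What the data scientists would have seen is easy to reproduce. A minimal sketch in Python; the handful of inline rows is a hypothetical stand-in for the real Kaggle file, which has 891 passengers:

```python
import pandas as pd

# Illustrative sample rows: (ticket class, sex, survived).
rows = [
    (1, "female", 1), (1, "male", 1), (1, "male", 0),
    (2, "female", 1), (2, "male", 0),
    (3, "female", 1), (3, "female", 0), (3, "male", 0), (3, "male", 0),
]
df = pd.DataFrame(rows, columns=["pclass", "sex", "survived"])

# Any model trained on these columns just memorizes the era's rules:
# "women and children first," but first class before steerage.
print(df.groupby(["pclass", "sex"])["survived"].mean())
```

A classifier trained on this data doesn't discover anything; it replays the social bias of 1912, which is the point of the exercise.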

Katie Smith (14:35)
and...

Mmm.

Beth Rudden (14:57)
because it's such a hard thing to think about the water that you're swimming in, right? The social environment that you're in. And so many of us are trained, when you're thinking about archeology, to take one artifact or maybe a set of artifacts and recreate a whole story. And the minute that you ask why, you're activating your creativity.

Katie Smith (15:27)
Hmm.

Beth Rudden (15:27)
And the

minute you activate your creativity is when you are introducing all of these inferences and biases and things that say more about who you are. And it's a reflection of what you're doing. And so LLMs, when they divorce data from the context and then throw it into the neural net, to have the neural net sort of guess the features or guess the patterns... and this is really reductive, because it's a lot more

Katie Smith (15:40)
naked.

Beth Rudden (15:57)
difficult than that. What happens is, like, it's a funhouse mirror type of reflection, as opposed to a true reflection that you could get with an ontology that says, you know, Katie, your experiences in 15 countries, you know, your genetic experiences, that actually increased your capacity to know about all of these different cultures. Look at how that influenced you.

Katie Smith (16:23)
Oh yeah.

Six times.

Beth Rudden (16:26)
And then

that could be reflected back, because we only carry a small memory of that, right? But alien intelligence remembers everything. And so you could activate that as part of your understanding of your reality. And that's the promise of AI: the hyper-personalization. But to get there, we need to talk about who owns the data.

Katie Smith (16:54)
Yeah, yeah, yeah. So that's interesting, because, you know, when you're swimming in the water, you just assume... there's that specific analogy, when you're swimming in the ocean, you just assume that you're a fish. Like, what's the analogy? What's that? You know what I'm talking about? You just referenced it.

Beth Rudden (17:10)
My children are rolling their eyes at me as I speak this. Okay. Here's the joke. It's not a joke. An old fish swims past a bunch of new fish, or a bunch of young fish. And the old fish says, hey kids, how's the water? And the kids go, wait, what's water? And it's...

Katie Smith (17:31)
Got it, thank you, yes.

Beth Rudden (17:33)
It's like the fox and the grapes. It's like an old, old kind of moral anecdote.

Katie Smith (17:44)
I like that analogy when it comes to big tech. We just assume big tech is it. Like, they're the ones. They're the ones building the infrastructure. They're the ones delivering the products. They're the ones that are going to save the world. It's all these white men at a round table that have all funded each other or know each other from previous projects.

Beth Rudden (17:49)
Mm-hmm. Yeah.

they're not working together. They're not exchanging ideas at a roundtable.

Katie Smith (18:10)
Well, what I mean is I'm pitching an image that I have in my head of how they're all connected and where all their money comes from.

Beth Rudden (18:10)
Yeah.

Oh yeah.

There's actually

a picture, it's called the PayPal Mafia. And there's a picture of them. Yes, okay, yeah.

Katie Smith (18:21)
Yes. Yes. Yes.

Yes. I know exactly what you're talking about. And we just assume, as the consumers of AI and the consumers of these products, like, well, of course it's them. They're going to do it. They're the geniuses. We put them on this high, high level. And the problem, and we've learned this the hard way, is that they're so disconnected from that man that I met in the Middle East, in Egypt. They're so disconnected from my life.

They're so disconnected from so many different people: marginalized communities, people with disabilities, religious minorities, people of color, LGBTQ+, neurodivergent people. I'm sorry, none of this is reflected in any meaningful way in any of their products. And we're having to deal with it now, right? So there's something to that, that we just, like... so how's the water? We just assume, well, what's water?

like, we're just in this extractive environment. We just thought it was okay to be in this extractive environment, because that's just the water we've been breathing, right?

Beth Rudden (19:28)
It's

also, so another story and the term is willful blindness. And a lot of it is wrapped up with our understanding that it's math and therefore it must be objective because it's mathematical. And math is hard and it takes a lot of work and those people who are doing math.

must be understanding what they're doing and must be checking their work, and math only has one answer, and all of these things. And then, you know, science... science is replicable. I want anybody to go out there right now and try to get the same answer out of the LLM. It is not science. I digress. Okay. It took 25 years after a woman proved that x-rays

Katie Smith (20:10)
Yeah.

Beth Rudden (20:23)
cause cancer in fetuses. It took 25 years for us to stop putting women who were pregnant under x-rays. And that is because there's a huge enamorment with technology, and a huge enamorment of next time it won't cause cancer. And because the cancer was so far removed... and it was bad, really bad.

And the woman who proved it even had men prove that her work was correct and that x-rays absolutely cause cancer. 25 years for humans to stop using this technology.

Katie Smith (21:05)
We don't have that time. We don't have that time

with AI. I know some people think, you know, it's going to radically change in two to three years. Like, I'm curious, where are you in that? Like, how quickly do you think it's going to completely change our lives?

Beth Rudden (21:26)
I mean, change your lives in a way that we can all, you know, just sort of integrate into our lives, much as we would the iPhone. I mean, for those of us who remember, like, dial-up modems and, you know, all of this, it was sort of like... you didn't have to, like, go to the hotel room an hour before your call to get online. You know, you could now take it from the subway or what have you. So, I mean, there's a lot of

things that, you know... yeah, I mean, I think, you know, for me and for a lot of, let's say, our generation, that is dealing with elder care as well as dealing with childcare, puppy care... like, you know, we have that kind of both, where we don't have shits to give, we don't have time. So, you know, anytime that I can use AI to get outside, I do.

Anytime I have to write an email to some, you know, bureaucratic organization, where I can, you know, more easily express myself? Heck yeah, I'm using an AI. You know, there are just too many things that I think are already here, where we are changing our lives with artificial intelligence. It's also being codified into a lot of the decision-making processes, which many of us have been trying to

get people to understand for a long time. I do think, yeah... well, hiring, loans, redlining, the military-industrial complex, there's gonna be self-driving cars. There is a lot that the slow-moving government agencies just are not

Katie Smith (22:55)
Yeah.

Beth Rudden (23:17)
prepared for. In some instances, I think that this is a little bit of a data apocalypse, that there's a lot of things being ripped out, a lot of knowledge systems that are being forgotten that will have to be reinvented. But there is nothing new about that. Nothing new at all.

Katie Smith (23:35)
Yeah.

So

I just came back from New York, as you know, for the All Tech is Human gathering and I treated myself to a nice brunch, a nice dinner. But one night when I was coming home after the gathering, I was so tired and I saw a Taco Bell. I was like, oh my God, I never do this, but I'm going in the Taco Bell. And there was no person to greet me. There was just a computer. And it made me think of my mom, because this is just a tiny glimpse of the future, right? Just the tiniest little glimpse.

Beth Rudden (23:43)
Yeah.

Mm.

Mm-hmm.

Katie Smith (24:07)
And so no human greeted me. It was just a bunch of screens. And I thought of my mother, because my mom worked at a company that at first was called Thrifty's and eventually turned into Rite Aid. And she worked there for over 30 years. And you know, when she first got that job, it was almost like a respectable job. Like you could, you know, almost, like, have a home and take care of kids. And then it progressively got worse, and it was very, very difficult for her. She made less than $30,000 a year raising two little kids. Very hard.

Beth Rudden (24:36)
Mm-hmm.

Mm-hmm.

Katie Smith (24:37)
And

I just think, what about that mom today? Where is she gonna work? So I love that analogy you were saying earlier, and, like, she had no time to think about any of these things. And I'm thinking of the mom today: there's no way that she has time, because she's just surviving, to think about these things. And I think about her every day. Not just my mom, but the person who is the equivalent of my mom.

Beth Rudden (24:50)
Mm-hmm.

That's right.

Katie Smith (25:04)
And, know, that's part of the reason why we wanted to build humorous, like, well, you're already on social media. What if we just like re-imagine it so it actually works for you and you can create better outcomes for yourself and through the small we've created, you know? ⁓ Anyways, that sort of gets to our why I think with this podcast too. Do you want to, do you want to speak to that?

Beth Rudden (25:16)
That's right.

Okay, so when I was leaving IBM, my sister was in a great deal of... she had done over 100 hospital stays in, like, the last six months, eight months, something like that. It was just... it was a lot, because she had, she has a dysautonomic condition,

she has POTS, and she finally got diagnosed, but she was in and out of the hospital a lot. And me, I had put together a whole mural board of, like, all of her hospital stays, what drugs she was taking, what were the connections with the drugs, and, like, you know, it was the story. And when you're going into a hospital, you're seeing a physician, and a physician only has 15 minutes to see you.

So how do you compress, like, three, four, five, 10, 15 years of health history into something that can get the physician to hear you, see you, understand what you have, and most importantly, get you out of pain? And pain management is a very, very difficult topic in a lot of places, because of the opioid crisis and the inability

to help people in a way that I think is humane. And so that's my why: I would show this big-ass mural board to physicians and they're like, holy crap, could you do this for all of my patients? Can I ask it questions? And it was just a way to use my talents, of, like, here's an ontology of all my sister's stuff,

and here is a way that you can comprehend it. And I tuned it, so I was like, you're a physician, you want to see... and, like, I was, you know, testing this all the time. And it's also my mother, too, who went back to work. And when she went back to work, she had been a homemaker.

And I remember her, like, going back and getting her degree and getting her CPA, and she was an accountant, and becoming a controller. And, you know, I remember her going to school, and I would be in the most beautiful library in the world, and I would sit with her and I would open up a book, I couldn't read it, and she's like, well, start looking up each one of those words. And, like, you know, how do you give people vocabulary? How do you give people

the time and the wherewithal to... because what you said about time is so poignant. She had no time, and she had no time to translate, you know, all of those things that I just rattled off, of, like, a mom with four kids under 10. And there's so much to our world that is cloaked in jargon. And I was done. I

was like, this doesn't need to be that difficult. What if everybody had a translator?

Katie Smith (28:34)
Yeah. And just as a quick side note, Homemaker would be a fine title as long as we fully understood what that meant.

Beth Rudden (28:41)
Well, how about pay them?

So, well, no, no, they are the conditions. My mother was, is, the condition for my father to be successful. She earns two thirds of that salary, not the husband. And so I always wanted to develop a model, you know, where, yes, I mean, where you're paying for conditions to be successful. So if you...

Katie Smith (28:45)
Hey moms, I like it. Go to your best.

Yeah.

Beth Rudden (29:09)
are a single non-binary human being, you need dog care, you need gym care, you need laundry care, you need all of those things. What if everybody had a stipend where part of their salary went to the conditions for them to be successful?

Katie Smith (29:27)
I love that. I love that. So, okay, so this is great because now we're getting into imagining. So tell me your why for the pod.

Beth Rudden (29:35)
Because you're so cool.

Katie Smith (29:38)
I

love that we just get to talk to each other for a couple weeks. That's really selfish, but tell me more.

Beth Rudden (29:45)
I think I am so tired of hearing about supermen and superhumans and superagency and, you know, cloaks and, you know... I'm like, capes? You should see women wearing capes. But, like, I just think, you know, we need to get back to being human. And I think that there's a lot of things that people are not able to say that

I am

Katie Smith (30:15)
I appreciate that.

you know, the why for Humma, it's interesting. It really started with this concept of safety, you know, as someone who's really good at patterns. Like, I could see that Facebook was... it was being sort of hijacked in a way

by messages that weren't serving us. I'll just say that. We'll go into this deeper, I'm sure. But there were a lot of messages that just don't serve us. And because of the ad model, you have to amplify this stuff, right? And then the polarization gets worse. And then, you know, then it becomes okay for a huge fragment of society to attack another group of society, right? It just turned into, like, a culture war. And I really think that had a lot to do with social media. And I just thought, well, how do we create safety? Because it's

Beth Rudden (31:04)
Mm-hmm.

Katie Smith (31:05)
Now

it's jumped offline. So, you know, and if you look at the FBI hate crime statistics, it's been spiking since 2016, if that means anything to anyone. And so I remember, you know, I was working at a research and policy shop at the time. And I said to my executive director, I'm like, I'm really scared. Hate crime is going to spike and it's just going to get worse. We have to create resources for people to find.

Beth Rudden (31:31)
Mm-hmm.

Katie Smith (31:33)
help if they need it. And he's like, well, we don't really do that, but if you feel strongly about it, go ahead and create a page. So I created a page, a very simple web page with just a ton of resources for our communities. It became the number one visited page. It had the highest time on page of any page in the entire history of the organization, because of the timing and because of the need. And it just struck me. It's like, okay, what I saw was real.

Beth Rudden (31:34)
Mm-hmm.

Mm-hmm.

Katie Smith (32:04)
And then I saw the AI companies coming. You know, my friends and I are nerdy; we've been talking about the singularity for a long time. It was, like, the fun nerdy conversation, but I didn't really think that was going to happen. But when OpenAI came out with ChatGPT, I was like, oh, shoot. Maybe we're closer to AI actually having a massive impact. And I thought, this is going to make our safety even more precarious.

Beth Rudden (32:10)
Sure.

Katie Smith (32:30)
because of just the nature of it. You understand this stuff better than me, but my instinct told me: this is gonna make us all less safe.

Beth Rudden (32:38)
When you were watching people go to bat online, wasn't it really obvious who was not engaging in those conversations?

Katie Smith (32:49)
Tell me more.

Beth Rudden (32:51)
Like, did you notice that those conversations were just being done by, like, one or two people who were just constantly, you know, spewing vitriol and engaging? Yeah. Well, it's...

Katie Smith (33:04)
It was bots. It was bots. I mean, I think you and I talked about this, but it was very

much bots at first. And then it wrangled... The bots created an atmosphere for real people then to speak their mind in a way that they hadn't spoken before.

Beth Rudden (33:16)
Mm-hmm.

Right, because there were no human consequences of being ostracized, because they could go online anonymously. And what I saw was a lack of leadership and a lack of community moderation. And the communities that were highly moderated were just inundated, with people trying their best to keep up.

Katie Smith (33:37)
absolutely.

Beth Rudden (33:49)
It was a full-time job. And no one was paying for that.

Katie Smith (33:51)
more than that, right? Yeah. Well, there's

like, lots of different levels of risk. The first level of risk, which is the hardest one, and this is triggering, but child porn online is an epidemic. And, you know, I think there's a lot of people just trying to fix that problem. And it's such a big problem, and I think it sucks in some of the best minds when it comes to safety and trust, just to fix that problem. And I'm so grateful for the work they do.

Beth Rudden (33:57)
Mm-hmm.

Mm-hmm.

Katie Smith (34:21)
And without the right tools, that can be very traumatizing, right? So that's one level of safety, for all of us and for kids. Then there's, like, the next level of just, like, the bullying, right? The shame. It's just, like, you don't look like us, you're not like me, therefore you're bad, or whatever it is, right? And teen suicide has gone up, especially for trans kids, right? So if you're in a vulnerable community, you're even more vulnerable.

And so there's, like, this catch-22: for some of us, the only way we find community is online, right? But, you know, that could be really dangerous right now on certain platforms. And then, like, Meta has just completely stripped out... any of the really smart, good people who were trying to do good moderation are just gone now. I mean, there's still a team there, but it's not what it used to be.

Beth Rudden (34:56)
Mm-hmm.

I mean, what you said earlier too about safety is that it's moved so quickly from online to in-person. And when it moves in-person, it is especially deadly for people to have things like child pornography normalized. And where there's no consequences, no accountability, no moderation, no ostracization, there's...

you're really handcuffing the people who are trying to help clean that up. And I think this is a beautiful why for Humma.

Katie Smith (35:52)
So it was based on safety, you know, it was

really just based on safety, knowing that the social networks are not going to get any better. In fact, you can see they're getting worse. And I predicted this like three years ago. I'm like, it's just going to get worse. And then as I've, you know, worked through the process of, you know, building the strategy, vetting the strategy and talking to amazing people across, you know, lots of different expertise and domains, I realized

there was actually a business model here: that if you strip out the ads, we actually can go back to what social media was supposed to be, which is just genuine, authentic connection. If we know who each other are, we're going to be more likely to be kind to each other. And then there are some other patent-pending tools that we've got in there, too, that are going to do some really interesting things to ensure it stays safe. But then also, for me, the why was, I'm sick of my data being extracted.

Beth Rudden (36:23)
Mm-hmm.

Katie Smith (36:49)
As somebody who's been doing digital marketing and business management of apps and some of the biggest websites and most popular websites in the world since the late nineties, I know all too well how we marketed through social media. I know all too well how we used data to make money. And I think at this point, I'm like, with AI, again, it's just going to get exponentially worse. And we have to just completely flip the paradigm.

You know?

Beth Rudden (37:21)
One of my whys for Bast, too, is that they're doing it so wrong that they can't... I mean, thank God they can't think like this. I mean, I wanna pin this for a second, because: do you really believe, with all of the digital marketing and everything that you had access to,

you still only had a couple artifacts of who a person is? And the more you would get to know that person from meeting them online, the more depth of relationship that you would share, and the more that you would be able to build into that. And my hypothesis is, like... I wake up every day and I'm like, they're doing it wrong. And I'm like, the reason is not because I want them to do it right, but because there

is a better way that we can use it, to connect the dots and to connect each other. And people who are using it, where they believe that they can extract data from you, and believe that that data is who you are authentically... they actually can't do it right.

Katie Smith (38:23)
Yeah.

So in e-commerce, you just get enough data so you know how to sell them the next thing. And I got really, really good at that. At Playboy, I knew exactly where somebody would start. And I had archetypes, or personas, and knew exactly where to take them every step of the way, until we made tons of money off one individual's lifetime customer value. I was really good at that. Really good. Legacy program. In politics, we do it. Cross-tabbing. You know, like, you...

You get voter information, you compile that with, like, e-commerce information and, you know, all the other data sets that you can get your hands on, so you know exactly who you're talking to, to give them the message so that they vote for your person. Right? So even the good guys, even the good guys, are extracting this information to do what they think are good things. And

You know, for the first 10 years of my career, I was in e-commerce, so I was learning how to just make money. And in the second half of my career, I thought I was doing the right thing by using that persuasion, data-driven persuasion, working to get people to find their truth, find their values, and do the right thing. And look, there are people who are doing this, and it does work. But I keep thinking to myself, as I've had to take a step back from both of these worlds, that

Beth Rudden (39:46)
Mm-hmm.

Katie Smith (40:04)
If the mom with the two kids that is like my mom that's living today knew that we were doing this.

Would she care? I think she would, but she would be overwhelmed, right? She would just be... this is what I meant by the water thing. It's like, well, I guess we just live in water. I guess this is just the world we live in. This is just it. And it's like... so my why for the podcast is: no, it doesn't have to be this way. It doesn't. You, single mom with two little kids, who are just surviving: we got you. We're talking about you, and we're building tools

Beth Rudden (40:32)
Mm-hmm.

Katie Smith (40:42)
so that you have agency and dignity, and your data is not being sold and resold, and people are not telling you what to do and what to buy next. There's a different way.

Beth Rudden (40:55)
I think if there was an EMP that took out all the electricity and we had to go back to growing our own food, like watching the next season of The Last of Us... So if we had to go back to this communal group setting, humans behave in a very predictable way. And the thing that I...

I get that you can understand enough about a person to sell them the next thing, or to get them to the next step, the next best action. By the way, I had kind of a group for recovering ad tech data scientists.

Katie Smith (41:36)
Mm-hmm.

Yes.

I might need to add myself to that group.

Beth Rudden (41:51)
Because, you know, it's very powerful, but it is not all-knowing. And the reason that it works is because there's not an alternative. And when you give people an alternative, and you start talking about what they could do, the possibilities of, like, community building, where you could pay somebody

Katie Smith (42:05)
Yes!

Beth Rudden (42:19)
in flowers that arrive at the local hospice every week, instead of, like, the money that would go to them... that maybe they would buy flowers for the hospice that they were attached to in whatever way. I mean, when we have more human connections being made with the community, and AI is able to, God forbid, further

the ability of the human race to be able to create better quality of lives for one another. Better quality of life.

Katie Smith (42:54)
Yes. Life, liberty, and happiness.

AI has to do it too.

Beth Rudden (43:03)
Well, why not ask an AI or point AI systems to help us create better quality of lives? And I guarantee that there are so many more things that we can do with data that we have consent for.

Katie Smith (43:22)
Yes. Now we're talking. So, one of the reasons why I like the name of our podcast... well, originally it was inspired by It's the End of the World as We Know It (And I Feel Fine) by R.E.M. And I love that song, that I'll never sing again, everyone. But I love that song because it's chaos. It's total chaos. He has a hard time remembering the words when he sings it live, because when he wrote it,

it was meant to be chaos. And I really liked that that was the inspiration for the name of our podcast, and that we switched it, right? And We Feel Fine. I think, for me, what that means is, we can get there. We can get there. There's a different world for us. There's a different path for us. And maybe we'll

Beth Rudden (43:59)
Mm-hmm.

Yes.

Katie Smith (44:18)
actually be okay, and we'll be fine. And then you also had this other thing, like, what FINE actually means in terms of an acronym. What did you say?

Beth Rudden (44:25)
Fucking irrational, neurotic, and emotional. I feel fine. Fine.

Katie Smith (44:30)
And

it was that. The concept with our podcast is endings and beginnings. And what I'm hoping is ending is this world of extraction, right? Where that single mom with two little girls just has no choice. And into this world of... I know this is, like, a very big theme for everyone right now, but abundance, right? We don't have to live in a world of scarcity; there's a world of abundance. And, like, I'm not naive either. When we talk about, like, what's happening with The Last of Us,

Beth Rudden (44:52)
Mm-hmm.

Katie Smith (44:59)
We're talking about, like, tribal warfare going on.

Beth Rudden (45:04)
But there are also so many more of these Dorothy Day moments, where we're not, you know... if it's, like, humans versus wolves, you know... like, I think we need to figure out how to celebrate the single mom who says, you know what, I cannot afford the mental tax of having Instagram anymore,

because I can't keep up with those people, and with the way that they're putting their lives out there, in a way that is a platitude that I have no desire to engage in anymore. And I see more people, across all socioeconomic spectrums, they are just checking out. And they're going back to...

Katie Smith (45:35)
Yeah.

So many people are checking out.

Beth Rudden (46:01)
magazines or books or, you know, TV without device time, you know, doing one thing. Yes, yes.

Katie Smith (46:09)
We're a job of platforms, frankly. People are going to other platforms,

yeah. Because there are smaller communities on other platforms, like multiplayer, like video games. People love that stuff. Discord, Twitch, you know?

Beth Rudden (46:21)
I would like to do a whole episode on how many times the data that you have is never the data that you need to solve the problem. And this is, like... why? Why is this not widely known? And, you know, I can give you anecdote after anecdote, story after story: every single time, when you actually want to try to solve a problem,

the data set that you need is often something that is so innocuous you would never have even thought about it. Again, because of that social bias and because of our inability to understand the systems and believe in the systems. And I think now everybody sees it like the truth in leadership, the truth in advertising. The truth of leadership is right in our face. These leaders, they are there because we have

allowed our systems to lift these people up into positions of power. What the fuck? Is that who you want your boy to grow up to be in life? Really? Are you proud of that human being? Like...

Katie Smith (47:26)
I mean, look at them. Is that who you want?

Oh my gosh, no, everyone. Yeah.

Although I think, with media today... and this is another reason why I think social media is not serving us. It's now... and in fact, Mark Zuckerberg was just in front of Congress saying, well, we're not really social media anymore, we just, like, we deliver media, right? And he's right.

Beth Rudden (47:57)
Like,

like dude, like, you know, dude, like, I don't.

Katie Smith (48:00)

He's taking too many weightlifting pills. So, you know, because he's got to be tough for his bros in his new wrestling.

Beth Rudden (48:11)
In the

chain, like, there's this gold chain thing going. Okay, I devolved, petty, sorry. Okay.

Katie Smith (48:17)
Yeah, yeah, yeah. So we're not, we're not.

sometimes we're in general. No, Tom Petty is amazing. One of my favorites of all time. Really true.

Beth Rudden (48:23)
Tom Petty is good.

Katie Smith (48:31)
good. So yeah, I mean, these platforms are not serving us. Like I was saying to you the other day, I was on Bluesky, and I just kept seeing... you know, I started following smart, good people who care about the world, across domains, you know, expertise. And I was just like, this is Instagram again; I'm back to doomscrolling, right? Nobody wants that. Nobody wants that.

Beth Rudden (48:54)
Yeah, nobody has time for this.

So I do think that I'm always a big fan of looking for the cure next to the cause. And I think that some things that have really helped me is I have a good understanding of how I feel in my body when I'm doing things. And when I feel in my body like I've just consumed a freaking chocolate cake and Cheetos, I'm...

I'm not going to go there anymore. I'm done. I have to delete it off my stupid phone and then a couple of months later, it'll find its way back. But I think that there is more truth in that cycle. And if that's happening, realistically, advertisers, how much are you getting out of that in actual sales? Or do you give a shit anymore because it's just part of your cost of goods sold?

Like I think the metrics are all false to create this false narrative of actual value when there is none.

I'd love to tee this up. I think we need a different economy.

Katie Smith (50:05)
Yeah, yeah, yeah we do. We need to flip it. It just needs to change. Look, you know, here's the thing that I do like about capitalism. I'll say one thing for capitalism. I think there is an innate human desire for freedom and innovation, right? To live our own lives the way that we want to live it, right? Freedom. And so when you think about that, when you think about our ancestors, we did trade, right? We've all.

Beth Rudden (50:19)
Mm.

Mm-hmm.

Katie Smith (50:32)
We've sold things. We've always marketed things. We've done this forever. The problem is monopolies. We're not on the Silk Road swapping goods anymore. We literally have just a very few number of companies who own our data, are profiting off that data, and then the wealth gap is increasing. So all of us who gave the data, we're becoming more and more poor, whereas the 1%

or the 0.0-whatever percent is becoming more and more rich. And it just has to end. And so I think the way that we do it is, we cut them off at the knees, so to speak, which is, like: take our data back.

Beth Rudden (51:14)
Hmm. Actually, I've been lying to any extractive AI, or any extractive platform, for about a decade. So, you know, there is that.

Katie Smith (51:25)
I like your rebellious nature. I appreciate it. I went the other way.

It wasn't until I got to the ACLU that I realized I should care about my privacy. I was one of those people that said, well, I'm not doing anything wrong, so I don't care. I don't care who looks at me. I don't care. I've got nothing to hide. Go ahead. And then I just realized, well, now I care, right? It depends on who's in power. When you close the door,

Beth Rudden (51:36)
Mm-hmm.

Well, yeah.

Katie Smith (51:55)
Close the door on your data. Nobody needs to see it.

Beth Rudden (51:59)
And that's... I mean, I really think that there is an aspect of privacy where people are indiscreet, and people are not respectful of places that have, you know, the ethos and the ethics to be able to engender that

privacy. And, you know, there are so many things where, when money can buy you privacy, everybody else has to be poor. If you play this out and really understand the implications of it, it's that the freedom that you talked about, of being able to do whatever you want...

When you have a large group of people who are controlling the narrative, of: you were born to somebody who did construction, therefore you will only ever do construction... that is the fear that I have, that people will control how high or how effectively people can actually earn,

you know, whatever version of that you want. But that version of capitalism is actually correct: that you have the freedom to be able to build capital, regardless of your stature and your nature. Going back to the Titanic example, right? Like, it was just understood that people were born into a class, therefore they were superior to other human beings. What the fuck?

Katie Smith (53:42)
who's doing construction to be able to start his own construction business. Right? That's what we want.

Beth Rudden (53:46)
Well,

and we want the child to maybe grow up and become a computer scientist or whatever they want to be. And I think that's the freedom, that ability, the pursuit of happiness, liberty.

Katie Smith (53:50)
whatever she wants.

Yes, yes. I'm

serious about this pursuit of happiness. Like I'm really, that to me is my North Star in all of this. Like how do we actually create happier, healthier lives?

Beth Rudden (54:17)
How do we create liberty? How do we create justice? I have a question that I'm putting into some of my keynotes.

What single everyday miracle would you most want AI to help preserve in our rushed world?

Katie Smith (54:41)
Hmm.

That's a really good one, I think, to give to our guests that we bring on and let them like think about it. So that by the time they get to us, they'll have like a really good answer because I don't have one right now.

Beth Rudden (54:52)
Yes.

No, I mean, I got it from... you do know Pablo Neruda? He has a book of questions. And so I got an AI to write a question in the world of: do you hear explosions of yellow in the middle of fall? I mean, just, God, his questions are just, like, gorgeous.

Katie Smith (55:04)
Yes, of course.

I love that.

I

I'm really concerned about the ocean right now. I'm concerned on so many... there are so many different things to be concerned about, but I feel like...

Beth Rudden (55:27)
Did you know

that LLMs take as much as they take? Did you know that?

Katie Smith (55:35)
Yeah, that's why we're a California Benefit Corporation. We actually have a fellow who was working on analyzing the climate impact of the different models. And then we're going to benchmark ourselves to be better and better and better, and more efficient and more sustainable, because we have to be. And constraint breeds innovation, right? So we're going to be very constrained, as a startup that doesn't come from big tech. And so we're going to figure it out.

Beth Rudden (55:48)
Good.

Katie Smith (56:04)
You know, maybe, you know, this is another good reason that Bast exists. I know that you guys have a lower...

Beth Rudden (56:10)
I

ran around the world trying to get people to understand that, like, a large language model, one training run, which they were doing hundreds of a day, was costing like 15 swimming pools full of fresh water. And when I was talking about this, so many people were like, you're telling me Santa Claus is not real. Like, they didn't believe it. They don't understand the cost, because so much of our business, capitalism,

has not accounted for the total cost of ownership.

Katie Smith (56:41)
Yes, thank you, Beth. Thank

Beth Rudden (56:44)
I would like everybody to understand, and I can give statistics as well as give you all of the studies, and it has been ignored in the same way that environmentalism was ignored in the 80s and 90s. It was illegal to do an environmental assessment before you built a building. Illegal. They were doing it... like, the hippie firms were doing it under the covers of the government.

Katie Smith (57:11)
Anything you want to say to wrap up today?

Beth Rudden (57:15)
It's good to see you. This is awesome. I'm so... I know, I'm so glad we do this. This is great. Thank you.

Katie Smith (57:17)
It's so much fun.

Beth Rudden (57:23)
Have a great day.