And We Feel Fine with Beth Rudden and Katie Smith

In This Episode
Katie and Beth explore how AI, genomics, and healthcare are colliding—and what it means for the future of prediction, ethics, and our most personal data. They also delve into personal aspects of work-life rhythms and reimagine what it means to live in communities built for connection, not isolation.

Why You’ll Want to Watch
  • How predictive analytics can go dangerously wrong without quality input
  • Why ownership and consent matter in genetic research and personal data
  • How built environments shape our sense of belonging
  • What ancient innovations can teach us about designing for dignity
  • And: Why your vacation might need a vacation
Meet Your Hosts
  • Beth Rudden – CEO and Founder of Bast AI, former IBM Distinguished Engineer, and author of AI for the Rest of Us. Beth builds tools that make AI explainable, ethical, and human-centered, spanning healthcare, education, and workforce transformation.
  • Katie Smith – CEO and Co-Founder of Humma.AI, and author of Zoe Bios: The Epigenetics of Terrorism. Katie blends two decades of leadership in tech commerce and social impact, building privacy-first systems rooted in equity, systems thinking, and human experience.
Top Takeaways
  • AI in healthcare is powerful—but it’s only as good as the data it’s built on
  • Predictive tools need consent, context, and cultural competence
  • Genetic data must be owned, not extracted
  • Community design impacts public health and connection
  • Offline spaces are still the future of meaningful interaction
Chapters
00:00 – Intro & Vacation Reflections
05:11 – The Intersection of AI and Healthcare
12:16 – Predictive Data and Bias
19:02 – Ethics in AI and Genetic Research
24:44 – Rethinking Scientific Breakthroughs
28:16 – Lessons from Ancient Infrastructure
30:37 – Longevity vs. Quality of Life
31:46 – The Role of Government in Research
34:30 – Consent and Ownership in Genomic Data
36:35 – How AI Affects Time Management
39:00 – Libertarianism and Public Good
40:00 – Transparency in Tax Systems
42:16 – Recycling and Community Ethics
43:37 – Cultural Shifts in Behavior
44:11 – Designing for Connection
47:41 – Co-Living, Shared Spaces, and Mental Health
51:27 – Healing Through Community
53:06 – Offline Belonging in a Digital World
56:30 – Outro

Creators and Guests

BR
Host
Beth Rudden
Pronouns: she/her Beth Rudden is the CEO and Founder of Bast AI, where she’s designing explainable, personalized AI that puts human dignity at the center. A former Distinguished Engineer and global executive at IBM, Beth brings 20+ years at the intersection of anthropology, data science, and AI governance. Her mission: make the next generation of intelligence understandable, accountable, and profoundly human. She’s helped reshape tech in healthcare, education, and workforce systems by applying ontological natural language understanding—yes, it’s a mouthful—to build AI that reflects cultural nuance and ethical intent. Beth is the author of AI for the Rest of Us and a global speaker on AI literacy and the future of power. On And We Feel Fine, she brings curiosity, clarity, and contagious optimism to every episode. With Katie, she explores what it means to end well, begin again, and build something truer than what came before.
KS
Host
Katie Smith
Pronouns: they/them Katie Smith is the Co-Founder and CEO of Humma.AI, a privacy-first, empathy-driven platform training culturally competent AI through community-powered data. Their unconventional journey began in the online adult space, where they held executive roles at Playboy and leading video chat platforms—gaining rare insight into how digital systems shape desire, identity, and power. Later, Katie turned those skills toward public good—leading digital at the ACLU National and crafting award-winning campaigns for marriage equality and racial justice. Now, they’re building tech that respects consent, honors community, and shifts power back to the people. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a genre-defying exploration of trauma, identity, and transformation. A queer, nonbinary, neurodivergent thinker and builder, they bring systems-level thinking, futurism and humor to And We Feel Fine. Expect honest conversations about what’s ending, what could begin, and how we co-create tech—and futures—worth believing in.
AL
Producer
Alexia Lewis

What is And We Feel Fine with Beth Rudden and Katie Smith?

At the edge of collapse—and creation—two unlikely co-conspirators invite you into a radically honest conversation about the future. This isn’t just another tech or self-help podcast. It’s a story-driven exploration of who we are, what we value, and how we might reimagine the world when the systems around us stop serving us. We blend personal storytelling, cultural critique, and deep inquiry into what it means to be human in an age of AI, uncertainty, and transformation. We’re asking better questions—together.

Because the world is changing fast, but maybe that’s precisely what we need.

Hosted by Beth Rudden and Katie Smith, two builders of systems and challengers of the status quo. Beth is CEO of Bast.AI and a globally recognized expert in trustworthy AI, with decades of experience leading data and ethics at IBM. Katie is the founder of Humma.AI, a strategist who drove innovation and revenue growth at major global brands before turning to human rights and technology for social good. Together, they make complex issues, such as AI and its impacts on everyday people, clear, personal, and impossible to ignore.

Beth Rudden is the CEO and Founder of Bast AI, a pioneering company building explainable, personalized AI for good. With over two decades of experience as a global executive and Distinguished Engineer at IBM, Beth blends anthropology, data science, and AI governance to create tools that amplify human dignity and intelligence—not replace it.
Her work spans healthcare, education, and workforce transformation, using ontological natural language understanding (NLU) to make AI transparent, accountable, and accessible. Through Bast AI, Beth is reimagining how organizations deploy AI that’s not only accurate but aligned with ethical values, cultural context, and cognitive well-being.
Beth is also the author of AI for the Rest of Us and a passionate advocate for AI literacy, epistemic diversity, and the right to understand the systems shaping our lives. She speaks globally on the future of AI, power, and social contracts—and believes we’re all stewards of the next intelligence.

Katie Smith is the CEO and Founder of Humma.AI, a privacy-first platform building community-powered, culturally competent AI. With over two decades of experience leading digital strategy and social innovation, Katie blends systems thinking, Responsible AI, and storytelling to create tools that serve dignity, not domination. Their work spans mental health, civic tech, and digital rights, using participatory AI to make systems safer, fairer, and more accountable. Through Humma.AI, Katie is reimagining how people and businesses engage AI that’s accurate, inclusive, and governed by consent and care. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a provocative exploration of identity, trauma, and transformation. They speak globally on the future of technology, power, and justice—and believe human empathy is the intelligence that will define our time.

Subscribe to our Substack for bonus content: https://substack.com/@andwefeelfine

Katie Smith (00:14)
Welcome back, everyone, to And We Feel Fine. I'm, you know, always blessed to be with Beth Rudden. How are you feeling today, Beth?

Beth Rudden (00:20)
I'm a little... frazzled is a good word, but ⁓ I did Almont, Colorado, which is a beautiful, beautiful place right outside Gunnison and has some of the best fly fishing in the entire world. And we went whitewater rafting, and we do this kind of camping-glamping because there's like a cabin, but it's still like no air conditioning and, you know, other things. So...

It was fantastic, but it was too short. Like, I feel like I can get in the mode and then I'm like, crap, we gotta go home.

Katie Smith (00:58)
Yeah,

my gosh, we always need like a vacation from the vacation. An extension of the vacation and then a vacation after the vacation. So at least a month.

Beth Rudden (01:02)
Absolutely. Yeah, but it was really...

Or we just need to be making a living, so then we don't have this... You know, all the way up, I was building my product design and what I think the user journey should be for what we're doing. And, you know, all the way down, I was thinking about how to apply some of the scenarios to what I was doing. So it's not like... and this is...

something I think a lot of people can understand: I haven't had working hours where I punch a clock. And I did at one point or another. I've worked for many different places where I had to punch in, you know, and then put your ticket up, and then you come out and you punch the clock, and you only get paid for those hours, with like, you know, a 20-, 25-minute break... my gosh, they had smoke breaks back then. Like you had to divide it up every time that you went in and out. ⁓

So it's like I don't have working hours, I have deliverables. And so there's this stream, especially owning your own business, ⁓ being the CEO, wearing all the hats and trying to figure out all the things. I'm sure you know nothing about this.

Katie Smith (02:23)
I worked through the holiday, as I do. But part of the reason why I worked through the holiday is because my little puppy, who I bring up all the time, is eight months. So this is her first Fourth of July, and like, I could tell as it started. So every single time the sound would happen, I would... because I'm trying to create language with her. It's like a simple language structure that she and I can repeat. So I'm like, bing bang, boom, boom, bad.

Beth Rudden (02:42)
Mm-hmm.

Katie Smith (02:48)
And she was like, okay, I get it. We don't like these sounds. I'm like, no, we do not like these sounds, but you're safe. But yeah, you know, I worked most of the holiday. I'm still dealing with the legalese, which is blowing up my brain, but yeah.

Beth Rudden (03:01)
Did you have fireworks?

Katie Smith (03:03)
No, I didn't even look outside for fireworks.

Beth Rudden (03:07)
Yeah, so that's typically where my dogs have been... they go crazy. But my puppies, I got in 2020, COVID, like real COVID, you know, just the beginning of COVID. And ⁓ we had a big dog that passed about a year later, and we took the puppies and the big dog to the same

place, in the same cabin at Three Rivers Resort, which is just phenomenal. So total shout-out for them. And ⁓ I swear they were so calm. They were like, we know the routine. Yeah, they like got there, they're like, you know... you just love... You know, I think dogs, ⁓ I heard this, they can smell ghosts. Like they knew the smell of dogs that have died. Right? And

Katie Smith (03:44)
Really?

interesting.

Beth Rudden (04:01)
Yeah,

And then also, you know, another kind of ghost thing is ⁓ my husband has amazing night vision goggles and all the things that are very cool for army people. But ⁓ my goodness, you could see all the satellites and the shooting stars, and the night was so clear because there's no light... ⁓

Lots of light... pollution. Thank you, that's the right word. And we were just outside, just watching, and you could see the satellites too, because they're so close to each other. And we could totally talk about that. But like, I mean, you know, true astronomy, and your neck hurts from looking up at the stars.

Katie Smith (04:25)
Pollution. Yeah.

Amazing.

Beth Rudden (04:51)
What a great problem to have. I always, yes.

Katie Smith (04:53)
Yeah, that's why, like, Anza-Borrego

here in Southern California. It's ⁓ a dark sky, like, you know, preservation area. Yeah, exactly. It's amazing camping. But yeah, so I protected the puppy this time just to, like, ease her into it. But the day after, when they were still going off, she would just look at me, and I'm like, we're okay. And she's like, okay, we're okay. So yeah, we got her.

Beth Rudden (05:01)
Mm-hmm. Mm-hmm. So, right, so no light pollution? Yeah.

Hmm.

She has her person.

Katie Smith (05:22)
Yeah, we

got through it. But yeah, I was working, but you were doing some product, and we were talking about AI, as we do. You know, part of what we know is going to be so big in AI is healthcare. And you know, what Google DeepMind is doing around this is sort of interesting, in terms of the genome program that they've got going on. I think you have some opinions about this.

Beth Rudden (05:26)
Mm-hmm.

As we do.

Katie Smith (05:46)
I like that we're sort of tackling different subjects and we're not solving any issues. We're like opening discussions, because we can have more guests and deeper discussion about every topic we have covered already. And so I think this is going to be another one where we could go deeper later, but let's start the discussion. Tell me what you think is interesting, or what's ending and beginning in terms of AI, healthcare, and this genome research, which looks promising but also has some implications.

Beth Rudden (06:16)
So I will caveat this for anybody who wants to listen. I had a wonderful boss who told me that he was going to send somebody behind me whenever I talked about a new subject, because we would always be inventing things. So apparently patenting is a lot of fun, but you know, that's something that, like, even though we are not solving all these things, having these conversations and putting things out there is

you know, in the hopes that somebody does start a patent and does start to think about, hey, what that person said on the podcast gave me this idea that I could then go and create a whole patent for, or invent something. And that's generative. That's what language is about. So when I was reading about the Google DeepMind article, ⁓ I think it had the word predict

I don't know, 470 times, something like that. And yeah, and it is something that I key off of a lot as a data scientist because we are always looking at, how did they predict it? What did they use to develop the target to understand what they wanted to predict against? What was the context?

Katie Smith (07:19)
Well.

Beth Rudden (07:39)
And none of that was in the article. And every time I hear the word predict, I'm like, they're using the data of the past to predict the future. And that is very dangerous in all ways, because really what happens is people tend to codify the past because they don't have ⁓ information that is varied enough.

Katie Smith (07:50)
Mm-hmm.

Beth Rudden (08:04)
in order to be able to truly predict, especially something like DNA and RNA, which are the base building blocks of all humanity... do you have African Aboriginal women as a part of your data set? I don't think so. So we have this notion that we understand, ⁓ especially at the DNA level, and with the COVID vaccine at the RNA level, some of how that works. ⁓

Katie Smith (08:18)
Right.

Beth Rudden (08:32)
We do not have a grasp of this. I am not an expert in any of this, other than helping my child pass her ninth-grade biology exams. But what I see and what I hear in that is a hell of a lot of hubris. And I question: what do their data sets look like? What is their variance? What kind of things are they predicting, against what,

you know, what past data set? And we've found out things... I've read anecdotally that there are some people in Africa who are completely immune to mosquitoes that ⁓ have malaria. And I'm like, there are some functional differences in bodies that have adapted, you know, in generations, in the same lifetime, across generations.

So what are we doing when we say we can predict these things and predict what will happen when we might not have even close to a two billion sample size representation of humanity? Is that what you thought I was gonna say?

Katie Smith (09:47)
Right. No, I think that... well, I mean, I didn't

have any expectations, but it makes sense. Garbage in, garbage out. It's the same old rule. It's always going to be the same. And even with synthetic data, it's like collapse at some point, you know; it doesn't work. And so we need quality data. And I think that's a really important point.

⁓ I'm thinking about 23andMe and all the different DNA tests that I have taken. 23andMe is really interesting right now too, because they just went through bankruptcy, and the CEO, I think, just started a nonprofit to buy the data back. So she's trying to protect it, which I think is admirable, but I don't know how it's being used, and in the sale, who got access to what, you know, to help pay for the bankruptcy.

Beth Rudden (10:26)
Mm-hmm.

Katie Smith (10:40)
It's interesting. The reason why I bring up 23andMe is because I found out through all the different... Because I checked all the boxes. I'm like, give me everything, I don't care. I was one of those people. My sister is like, I don't want to know if I have Alzheimer's. I'm like, okay, well, I'm going to find out. And I was like, we don't have Alzheimer's. So I was that person. But what we do have, or I specifically have, is a degenerative eye disease. It runs in my family. And so, like, my grandma went blind.

And so it's very possible that I could go blind. Now, I'm very healthy for my age, you know, biologically younger than my numbers, so to speak, but I could go blind, you know, because of this disease that I have, no matter how healthy I am. So could this, you know, DeepMind AlphaGenome project... are they trying to solve these types of issues? What do you think they're trying to get at?

Beth Rudden (11:37)
That wasn't apparent.

Katie Smith (11:38)
Yeah, it's not apparent, is it? It's like there's no transparency in what these big models are trying to actually achieve. Like, what I just described is the real thing. Like, I would love to know that there's some solution, right? That you can do, you know, DNA splicing, and you can fix this little, you know, issue I have in my DNA and my eye, and I will not go blind. That'd be great. You know, and apparently it happens to a lot of people. So it's a good... just, anyone out there,

Katie Smith (12:07)
Do that, please.

Beth Rudden (12:09)
There you go. I mean,

I delivered as promised. So I was on a panel a couple of years ago with a scientist who was young and lovely and wonderful and so, so excited. And they were working for a very large company, and they were saying, ⁓ my gosh, did you know that with

the visual cortex and the knowledge that we have within the visual cortex that we can guess gender. And I'm like, really? And I didn't.

Katie Smith (12:47)
Gender?

Beth Rudden (12:48)
Yes. And

I was like, very... and I just, I was on a panel, like, lots of people, and I was like, I hope that you are just saying that metaphorically, because that's not really the right thing to be able to predict in this world today. Because again, you're saying that you don't have the variance, but you're saying that there are some indicators, right?

Katie Smith (12:56)
Yeah.

Yeah.

Beth Rudden (13:16)
She's ⁓

...the right thing to do, right? So, yes. And ⁓ also, loss aversion: ⁓ many, many, many scientists are so desperate to have their thing work. And what working means is that they can say, well, I can tell in...

Katie Smith (13:47)
Yeah, right. The blind study dilemma.

Beth Rudden (14:11)
you know, with 66% certainty, with this data set, at this time, that these people will be this gender. What are you doing? You know... Right. Yes, there you go. And that is what I think people need to be asking. Why are people at Google DeepMind fucking around with DNA and RNA? Are they, like, you know, metabolic scientists who are trying to cure degenerative

diseases, which could help people? If so, you don't actually need to predict anything. What you need to do is look for an understanding of that individual edge case in the dataset and see if there are some genetic causes that are common among people in a different situation that could apply something similar. And then CRISPR, or gene splicing, ⁓

is not for the faint of heart. There is an entire Netflix show on how people are breeding glow-in-the-dark dogs, because you can do this with CRISPR. So again, it's like...

Katie Smith (15:18)
Yeah.

The implications are massive.

Beth Rudden (15:22)
Do I want to be in my own laboratory to create glow-in-the-dark dogs? Is there a market for that? And if there was, would I do that to puppies? What other kinds of situations are being caused by this? What is the first, second, and third impact of the science? And how would you do that?

Katie Smith (15:49)
Yeah.

Beth Rudden (15:50)
Another example, just again, incredible: you know, a woman that I was working with said, I really want to find out how many data scientists are women, so I'm going to upload all of the pictures from LinkedIn or whatever and ⁓ have the model guess which ones are women that have this title. And again, I'm like, there's actually a better way to do that. You need consent

Katie Smith (16:18)
Yeah.

Beth Rudden (16:19)
from people

on how they identify. And then you actually get data that is not only representative, you don't need a shit ton of it. You need people to consent to giving you that information.

Katie Smith (16:37)
This is why 23andMe is interesting, because they were a consent-based model. So they were constantly asking for consent for me to participate in another research study. And so, yeah, it was this sort of clever way to get us all to participate in trials. ⁓

Beth Rudden (16:59)
But that's... I mean, what they did not foresee is basic data ownership, which should come with an effective data strategy that identifies what data is owned by whom. And, you know, at Bast and IBM, all data and insights belong to the creator, full stop.

And if you can't write that simply into contracts... which, I would say that I employ a lot of lawyers, I allow them to make a living, because I need them to write things in a way that is very clear, where the data belongs to the person, the creator. And there is something, I don't know... what if your DNA changed when it was taken, with or without your consent?

Do we know?

Katie Smith (17:55)
Oh,

I'm absolutely sure it's been sold and resold and my genome is out there. It is out there, which is really trippy to think about. But yeah, it's a thing.

Beth Rudden (18:02)
Yeah.

It does say AI for the better understanding of the genome, which I think is a noble aspect, but understanding from whose perspective, from what context, for what reason, who's the user of this understanding? Is it for scientists to go look at the cool things that we can do with this science? Or is it, what is the sequencing

Beth Rudden (18:35)
Why are they predicting? Why are they using past data to predict what? That to me is the sure sign that somebody might not have read an anthropology or sociology study ever.

Katie Smith (18:40)
Yeah.

Or computer science people just going at it. ⁓ I'm sure there are lots of cross-disciplinary people on their team, and so we're not knocking them. But it's interesting that this is, like, one of the most well-funded AI labs since the beginning, and they still can't tell us what their goal is with this. So it does... like, you do this really well in your book. It's like, we have to ask these questions.

And when we ask these questions, we may not like the answers, you know, and that's the problem. You said that very clearly in the last episode.

Beth Rudden (19:22)
Mm-hmm.

Who's funding it?

Katie Smith (19:30)
Well, I think the company is... I think Google is just, you know, doing it as part of the R&D, yeah.

Beth Rudden (19:30)
Okay, great.

Wouldn't it be really great if we could trace, like, who's funding the study as a part of, like, the total cost of ownership? And, you know, not only who's funding the study, but why they're funding it and what they're doing. So I want to go back to this idea of loss aversion for a second, where people...

People will regret losing $100 more than they will enjoy finding $100. Right? Yes. And the world that we're living in today has an unfortunate aspect for science, which is that you should be finding null hypotheses 99% of the time.

Katie Smith (20:09)
Yeah, I've read about this study. It's interesting. Yeah.

Beth Rudden (20:27)
So you're going to regret not finding what you seek 99% of the time if you're doing it right. And so if you're using it for an explanation, you have to make sure that that explanation has a list of the assumptions you're making in order to say, this is 66% accurate against this data set at this point in time,

Katie Smith (20:35)
Yeah.

Right,

right.

Beth Rudden (20:54)
with

this data that was collected in this way, that was sampled in that way. It demands replication, too. It demands somebody else to say, you're right, I got 66% too. And that is where I just... I feel like it's questionable.
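
(A quick, hedged illustration of the point above, not from the episode: a reported "66% accurate" carries very different weight depending on how much data sits behind it. The Python sketch below uses made-up sample sizes and a standard Wilson score interval to show how wide the uncertainty around that single number can be.)

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion such as a reported accuracy."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical sample sizes; the "66%" figure is the one quoted in the conversation.
for n in (50, 500, 50_000):
    correct = round(0.66 * n)
    low, high = wilson_interval(correct, n)
    print(f"n={n:>6}: reported 66% accuracy is plausibly anywhere in [{low:.2f}, {high:.2f}]")
```

(With 50 examples the interval runs roughly from 0.52 to 0.78, which is why the data set, the sampling, the stated assumptions, and an independent replication matter more than the headline number.)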

Katie Smith (21:16)
Definitely. I just... we have questions. And so the other thing I think of with this project is, you know, with Race Counts, this project that I worked on at Advancement Project, we looked at racial equity across 58 counties, across seven key issue areas. And what we learned when we, you know, did little data walks with different communities up and down California is they said, well, that data doesn't actually tell the story.

And so the other thing I'm just wondering, too, as they're doing this work: when you're not in the communities that you're seeking to serve, what is the data really telling you? And this is, you know, going back to what you were just saying. It's like, you know, from their point of view... whose point of view? Follow the money. But also, whose point of view is getting baked into this whole methodology? And you can't solve a problem unless you know what the problem is you're trying to solve. And so this, like,

race to build... is that just what this comes down to?

Beth Rudden (22:16)
But ⁓ convolution, yeah... I will say that, like, in the article, they do have the assumptions, or like some of the current limitations, and the challenges that they're having in using these models to understand sequence-based prediction is that you are

Katie Smith (22:20)
but they're using it to harm.

Beth Rudden (22:46)
again, capturing the influence of very different regulatory elements. So these things rely on different systems in the body. There's all kinds of things that we just don't know yet. And ⁓ I think that funding science like this is so, so, important. I just wish we would celebrate the null hypothesis more than the supposed thing that

Katie Smith (22:59)
Yeah, sure.

Beth Rudden (23:15)
that would make it useful. And that's like a change in the, ⁓ maybe this is an ending that we could say that we could end this p-hacking to get to that 1 % of the 1 % of the 1 % of discovering something new. And what if this was the ending of discovering new things and the beginning of taking what you love, remixing it and making it relevant.

for what you want it to do for yourself or for somebody else or to do something that everybody else in the world can replicate and nobody would be harmed. Like that's what I want. Like I want to change science. I want to change scientific discovery that is, like I feel like the pendulum has swung too far in the direction.

Katie Smith (23:56)
Yeah.

Beth Rudden (24:14)
of everything is ⁓ bought or sold. It's not science. It is literally a study that supports somebody's funding or marketing strategy. And that needs to change.

Katie Smith (24:25)
Yeah, there is some purpose;

it's just not being... it's not clarified to us what it is.

Beth Rudden (24:34)
Well, and you and I also exchanged ⁓ some articles recently where it is very clear what the purpose is. And people are out there talking about, you know, openly using language that is for controlling large swaths of humanity. And, you know, that is,

like, we need to shine the light on that, because to me that's... do we want those people to use the studies that scientists are doing? Because scientists are genuinely, I think, really wanting to advance the understanding of what they can do with these new tools and with AI. I think that the ⁓

the people who then take that science and say, ⁓ hey, we could use this for a military strategy, or we can use this to control, or we can use this to... So really, really interesting story, and I will find the link so that you can link it. And this comes from a TED radio hour, and one of the TED talks in that TED radio hour, because they do sort of like a digest, and it's super cool, love the format.

And ⁓ one of them was talking about how GPS was discovered. And it was two scientists who were tracking Sputnik. And they were like with a pen and paper and they were like, you know, listening for Sputnik to go deet, deet, deet, deet all around the world. And a general walks in and says, what are you guys doing?

And they're like, ⁓ we're ⁓ mapping, you know, Sputnik going around the world. And he's like, interesting. And he's like, so can you predict where it's going to be? They're like, yeah, it's easy. You know, like on this schedule, it takes this much time. It's going at this speed. Totally predicted. General goes, could you predict where a missile might land if it's launched from an aircraft carrier on the sea? They're like, yeah.

Katie Smith (26:34)
Yeah.

Yeah.

Beth Rudden (26:51)
Totally, that's math, right? We can use math to do that. So just note... like, in my head, it was such a beautifully, beautifully told story about these people who were curious about, you know, mapping Sputnik and mapping where it's going, and then being able to predict with a high level of certainty, you know, a hundred percent, that it's gonna, you know, come up in this. And that's...

That's so interesting to me because like that creation of certainty is a good thing, but it involves understanding the context and then maybe using that in order to be able to use a satellite to give you driving directions so much so that we don't even know street names anymore or, you know, why urban planners put so much effort.

Katie Smith (27:25)
Hmm.

Beth Rudden (27:46)
into the creation of streets and communities, into the creation of streets and highways, and the creation of supply chains. And it's fascinating to me that we keep evolving the technology, but look at how much there could be to discover that we can improve... not find the novel idea, but remix and improve what we have, in order to... I mean,

Katie Smith (27:56)
Right.

Yeah.

Beth Rudden (28:15)
We're still using concrete that comes from the Romans mixing mud with salt water. I mean, right? Like, we're still using some of these things that not many people know the origin of. And when they do know the origin, they're like, wait, what? You know, and then you're like, well, yeah, it works. And then, you know, they had running water. They had,

Beth Rudden (28:44)
you know, springs that had biological and mineral deposits that were natural antiseptics. You know, there's so many things that we have to discover, to understand, but to do that, that's not often funded unless there is a way to control or hurt or harm. I, yeah, yeah.

Katie Smith (29:08)
Government is the biggest part. Yeah.

You know, it's been said, and it's worth repeating, that the AI race is largely because they know where the big funder is. They want to get funding from the US government. You know, that's the biggest funder in the United States for everything, really, right now. The biggest contracts come from the US government still, despite everything that's happening right now. So there's, like, the functional... it's like, how do we use this new

transition we're going into to make things more functional and better? That really resonates with me, because I think about the built environment, like you were just talking about the built environment. I think about my time, for example, in Jerusalem. And I write about this in the book, where there are just layers and layers of generations and thousands of years built into these hallways. And

It's... I mean, I've never felt anything like it. And speaking of ghosts, like, you can sense the ghosts in these areas. And anyways, it really marks the time, and all the different times. And then if you look at our built environment today, it's like, okay, this marks our time. What is the future? I've been thinking about this too. Like, so when I think about the genome work, I'm like, please help us focus on actually helping people live

better lives, not necessarily longer lives. I think there's this big movement around longevity, and I think that's great. But also, I just wish, to your point, there was a little bit more of a focus on creating higher-quality lives today, you know? So that's one thing that I think about. ⁓ But yeah, I've been really thinking about the lived environment too, but yeah, maybe that's a subject for another podcast. It's all related.

Beth Rudden (30:34)
Hmm.

Oh, I think so, I

do. You know, it says that our model predictions are intended for research use and haven't been designed or validated for direct clinical purposes. And, you know, it's all these warnings... I remember going to Ireland when I was young, and on the cliffs

there was no warning sign, there was no railing, there was none of those, you know, hey, dumbass, don't walk over the cliff kind of things, or what have you.

Katie Smith (31:23)
If the wind blows,

be careful. Yeah.

Beth Rudden (31:36)
Right. And I think that, you know, in some ways we live in a really litigious world and that's something that drives how we think about things and the ⁓ warnings that we have to put on. But I am positive that those scientists are trying to do things. Most people go into psychology because they want to solve a problem of their own. It's personal. Most people are

personally trying to solve problems. And, you know, I've talked about my sister and her being sick, and really wanting to help solve some of these problems. And you were using 23andMe because you wanted to have the knowledge of how to weather the weather in your world. Because we're not taking advantage of information that we have, and we seem to be seeking information from

sources that are not necessarily giving us what we need. They're just giving us the next headline. And that's something, yeah.

Katie Smith (32:42)
Yeah. I wish there was, like, a

big project like DeepMind that's like, we're going after Alzheimer's. Here's, like, the monthly update, right? Here are all the goal values, right? You know? Yeah. Yeah! You know, like, tell us what's going on. And then maybe we would trust AI more. To be like, okay, well, if you're using AI, show us... you say this all the time, show us your work. You know? Like, can you show it to us? Especially if it's in that realm. It's like,

Beth Rudden (32:50)
Yeah. Yes, we're going to the moon. Here's the update.

Katie Smith (33:11)
I know, yeah, maybe some pharmaceutical company is bankrolling it, and so they can't talk about it, because they want to be the first to patent the product and get it on the market. Yeah, no, it probably is something like that, which is sort of sad when you think about it.

Beth Rudden (33:26)
Well,

and that's not the way that genius works. It's something that strikes a number of people at a number of points that is stochastic. It's random, it's variable. It's the randomness that allows us to create the next thing, typically out of necessity too, which is why a lot of scientists go into their field because they want to understand how they can do better.

Katie Smith (33:31)
Yeah.

Beth Rudden (33:55)
and how they can help and they can solve problems. It's the funders that then reap the benefit because the company owns the patent, the company owns the code, the company owns the work product of a human being, just like the company owns the diamond. But it was...

Katie Smith (34:19)
That's the reason why the government should be funding these research projects, because what if the government actually owned it and then could just give it to the people? That sounds like exactly what our framers were looking for the government to do.

Beth Rudden (34:36)
Absolutely. So we're in that distribution of people talking about genomes who have no business talking about genomes. But I love that, because it gets us thinking about different ways of doing things that may not have been thought through, because we're not thought leaders in this area. And so having that diversity of ideas come forth from people who are in, you know,

Katie Smith (34:55)
Yeah.

Beth Rudden (35:05)
very different disciplines in life, doing different things. ⁓ I was asking a friend of mine this; I was like, how often do you use ChatGPT to give yourself 20 minutes? Because he has young toddlers. And he's like, I have never thought about that. How did you come up with that question? I was like, that's what my mom would tell me. She's like, buy yourself 20 minutes for yourself so that you can fill your cup before you have to go do the next thing.

You do this with Tomlin, right? You're like, okay, we've got to go for a walk now, but later on, mommy gets her time to do the podcast, and then, well, whatever. It's interesting, because with the way that I understand how AI works, I'm constantly asking AI to help me with something, or thinking back to my younger days

about how I could give myself more time, how I could give myself more grace, how I could provide something in a way that would allow me to go off and finish reading my book, which is typically what I want to do instead of what I have to do. So it's interesting. Like I think that that's...

So when you go for an STTR or an SBIR, like the government-funded ⁓ small business programs, the government does own a portion of what it funds. And it's a ⁓ nominal portion. But I don't know that there's a distribution mechanism where it's like, the government funded this research. And a lot of that research sat on shelves, and still does. You know, it's not

Katie Smith (36:32)
Yeah, yeah, yeah.

Yeah.

Beth Rudden (36:53)
distributed.

Katie Smith (36:55)
Even the earliest... you know, I'm on this libertarian kick because, one, I was raised by libertarians, so I understand them, I think, very, very well. But also, this country, I feel, is being overtaken by libertarian philosophy right now. This bill is, like, an example of that, minus the deficit. Real, true libertarians would not like the bill either. Nobody really likes the bill; it's a very unpopular bill. ⁓ Where was I going with this? Were you going to say something? Sorry, you were going to say something.

Beth Rudden (37:08)
Mmm.

You know, I

wanted you to define libertarian.

Katie Smith (37:24)
So, as I would understand it, it's very small government. Like, government is basically your minimum viable product government. If we're speaking in, like, startup terms, it's the minimum viable product. Like, it's for national security. It's for infrastructure, like roads and bridges. It's for, ⁓

Beth Rudden (37:48)
A cupcake,

not a wedding cake.

Katie Smith (37:51)
Well, but you know,

the underlying philosophy is personal freedom. Like, my grandpa would be like, look, I made my money. I want to pay the least amount of taxes possible so that I can decide what I want to do with that money. And what he decided to do with that money oftentimes was to help people in his community. But the problem is, not everybody's like my grandpa. But no, they believe in personal freedom. They don't want to pay taxes,

Beth Rudden (37:58)
Yeah.

Katie Smith (38:21)
or as little as possible; they will go wherever they can to avoid taxes. And personal freedom, in true libertarian fashion, really means, like, you know, your life is your life, my life is my life. You know, it's private... you have your privacy, right? So it's interesting: when you talk about true libertarianism, it's sort of easy to sign up for it. You're like, well, that sounds great, actually. But the problem is that the

Beth Rudden (38:43)
Mm-hmm. Mm-hmm.

Katie Smith (38:50)
way that it's being applied right now has starved education. It has created a really broken healthcare system. Everybody... you tell me, are you happy with your roads right now? Are you happy with your bridges? Like, I have broken roads and bridges all over the place. You know, things that look dilapidated; our built environment is so dilapidated. So the government... one of the basic things

that government is supposed to do, it has done really poorly. I shouldn't say that; in terms of developed countries we're okay, but we're not in the top 10. You know what I mean?

Beth Rudden (39:30)
Yeah, well, in healthcare, ⁓ you know, ⁓ reproductive rights, ⁓ lots of things going on there. But what I think is a solution for it all, seriously, this is my wave of the magic wand, is that everybody gets to see where their specific tax dollars go. Everyone.

Katie Smith (39:49)
How would you do that, though? I mean, that's like... it's a massive budget. Like, it would be meaningless, right? It would just be meaningless.

Beth Rudden (39:58)
No, no,

I think you can totally do it where you are. Yeah, absolutely. It's, you know, it's lineage and provenance of, like, I'm giving you a dollar from my paycheck; where is that dollar going, and to whom? Yes. Yeah. But in a way that I think...

Katie Smith (40:14)
So it's like crypto, it's like crypto government.

They're

going there now, Beth. That's the direction they're going. The crypto guys won. DOGE is purposely trying to use AI in crypto or blockchain.

Beth Rudden (40:35)
Again, it depends for what reason. What I would love to see is that transparency. So: Elon Musk, your tax dollars built this. Right? Seriously. Like, you know, this road was built by the tax dollars coming from the Santa Monica, like, you know, community. I think that would be... if you got a report on where your specific tax dollars go every... I mean,
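
(A toy sketch of the per-taxpayer report Beth is wishing for, not a description of any real system: the budget categories and shares below are invented for illustration, and an actual version would need published budget data plus real lineage tracking through appropriations.)

```python
# Hypothetical budget shares; a real report would pull these from published budget data.
BUDGET_SHARES = {
    "roads and bridges": 0.06,
    "community centers and gardens": 0.01,
    "public health": 0.25,
    "education": 0.15,
    "defense": 0.13,
    "everything else": 0.40,
}

def tax_dollar_report(taxpayer: str, amount_paid: float) -> str:
    """Trace one person's payment proportionally across budget line items."""
    lines = [f"Where {taxpayer}'s ${amount_paid:,.2f} went:"]
    for item, share in BUDGET_SHARES.items():
        lines.append(f"  ${amount_paid * share:,.2f} -> {item}")
    return "\n".join(lines)

print(tax_dollar_report("a Santa Monica taxpayer", 1_000.00))
```

(Proportional allocation is the simplest possible version; the "lineage and provenance" Beth describes would go further and tie each dollar to specific, named projects.)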

Katie Smith (40:44)
Okay. Yeah.

Well,

let's play that out. So I get where you're going. Like, behavioral change can happen in these transactions that we have with, like, our bank and our government, or with the Department of Water and Power, you know.

Beth Rudden (41:11)
Yep.

Katie Smith (41:18)
Smart meters were a really good example of behavioral change. When we would look at our bills, we would say, oh shit, I'm using more energy than my neighbor. Like, you feel a little bit of shame, and it would actually work: people would try to reduce their energy use so they would get that smart meter readout again and be like, okay, you're doing better. It worked. So are you thinking...

Beth Rudden (41:39)
Of course it worked.

Yeah, well, what I'm thinking too is, like, you know, recycling, right? Everybody desperately wants to recycle. And in my household, sometimes it's like, let's punish ourselves by trying to recycle the most amount. But realistically, if you've ever been to a recycling plant, you know... it's demoralizing. It's cognitive difficulty all the time, to think about how inadequate the recycling is.

Katie Smith (42:06)
Yeah.

Beth Rudden (42:06)
And

the recycling is inadequate because people put trash bags of trash in the recycling, and then people have to filter that out, because not everybody is doing it well. I've seen literal notes on my neighbors' recycle bins saying, we're not taking this, this is not recycling. And then, yeah. And then you think about Japan, where you're not allowed to have more than X amount of trash,

and everything has to be reused or recycled. Yes, and, like, if we lived with our own trash, Katie, this would have gotten a lot better a lot sooner. Instead, we shipped it to other places so we didn't have to see it. So we would consume, throw shit out, and then consume and then throw shit out. And I'm gonna go back to something I know I said that I think is just gonna be super interesting,

which is what happens when you change the impulse control of 30%, or a tipping point or more, of the population. And so people who are on Ozempic or Wegovy, and they don't have the impulse control anymore... it's gonna change our economy, because that entire economic system is based on: oh my God, I feel like crap, I wanna be beautiful, let me buy some eye cream, right?

Katie Smith (43:26)
That's really

interesting. That's a very interesting observation. I hadn't thought about that. That those weight-loss drugs... well, it's being applied to weight loss now... that that could actually change behavior and trade. Yeah. Oh my God. You may be right on that. Okay. Mark the moment, everybody. Remember what Beth just said. This is a good prediction. This is a good one. ⁓

Beth Rudden (43:29)
Yes!

Everything. I think it'll be bigger than AI.

Yes!

Well, this was... I don't

know, I read an article a couple of months ago about this. I can find that article as well.

Katie Smith (44:00)
So, you know, what do you think the built environment is going to look like next? This is why I was telling that story, from Jerusalem to today. There's a marked difference, where I don't feel like there's been a marked difference even from, like, the '50s. So what do you think the lived environment of the future is going to look like?

Beth Rudden (44:18)
I think, I hope it's more community centers, like, you know, more people who are in community living together because they are part of the same community. ⁓ less, so one thing I love about my family, it's hard, really hard, but when we're in small spaces together, we always come out feeling closer because we literally are.

And there are things that you have to learn to do, like put the toilet seat down or what have you when you're living in small spaces or make sure the rubbish is in the bin because otherwise the dogs will get it. Like, you know, all these things that you have to do, human factors that we spent, I don't know, 75,000 years doing. And I think that this, I really hate the tiny boxes that we live in.

And so many people are lonely and suffering in tiny boxes. For what? For why? For privacy? And I think in some ways, the built spaces are reflecting our anger of needs not being met, of wanting privacy, but we're not doing it right. And I think that we're causing a lot of harm.

by not having more things out in the open and for community. Like I ⁓ was chastised a little bit. I definitely had more sensitivity to this, but ⁓ I had free range children where I let my children range and other adults would come tell the children that they needed to be parentally supervised to which my children were.

Katie Smith (46:06)
my god.

Beth Rudden (46:08)
wonderful

and charming and did not want to engage with people who were going to tell them what to do, because their parents weren't even telling them what to do. So they were like, ⁓ I know what kind of parent you are, I shall stay away from you, I'm so sorry to bother you, let me go on. I think we would have less... we would have fewer questions about what kind of mom to be, what kind of

fur mom should I be, what kind of, like, adult should I be? Who am I? What do I do? How do I act? This just happened to my neighbor; what do I do in this situation? Why are we asking an AI those questions instead of just learning about it because we're living together? And I think that I always wanted to live with... ⁓

Katie Smith (46:54)
Yeah.

Beth Rudden (47:00)
certain people in my life, because their wardrobe and my wardrobe were, like, identical. And I'm like, we are buying the same stuff twice. We could be so much more efficient if we lived together. You know, same thing with, like, you know, making a big vat of macaroni and cheese instead of buying individual macaroni and cheeses. And I don't know. I want it to be together, Katie. I want it to be where we're sharing

the body stuff together because we're not doing that well. I think it's causing harm.

Katie Smith (47:38)
Yeah, you know, there was a conversation that I had with friends... I remember being in the kitchen, it was like a few years ago. And this is when I was thinking about just the very beginnings of Humma. And I was like, imagine if there was, like, a kiosk in, like, a community center, and you could just ask that kiosk anything.

Like, you know, it was just, like, a little happy character or whatever. You could just ask it anything, and it would help you and tell you where to get that resource: if you were hungry, if you needed housing, or if you wanted to find the queer group, or whatever it was, it would just help you. And the reason why I like the idea of this kiosk in these community centers, or just in transit centers, is because I really don't like the future where we're looking at our phones, or the idea that we're just talking to our glasses all day long

and, like, we're just recording each other. Speaking of privacy, you know, I don't even like it when my neighbors have Rings. I understand people feel safe with this stuff, but it's like surveillance is everywhere. And I just don't like the idea that we're going to live in that world, although I don't know if we'll change it. But if I could choose, we would remain, like, private, you know,

Beth Rudden (48:28)
yeah.

Katie Smith (48:50)
humans that could interact with community and not be like watched all the time, but had like quick access to basically anything we needed. Because if we're having four day work weeks, or some of us are just not working, to your point, we are going to need these like big sort of community centers, where there are sort of like these kiosks of like,

Who do you want to do crafting today? Okay, great. You know, the crafting is at that park this week or whatever the case may be, you know, or thinking about, well, Denver is a really good example. I feel like you guys have done a really good job with built environment with like bikes and like integrating nature. And so instead of.

Beth Rudden (49:27)
Mm-hmm. Mm-hmm. Yep. We have

10 miles of green space without crossing any major roads. So... I would definitely say that, you know, I grew my children up in this, like, don't-come-back-until-it's-dark kind of community. All their elementary, middle, and high schools are, like,

Katie Smith (49:36)
It's amazing. No, I love it.

Beth Rudden (49:55)
within walking or driving distance, so it's not a hated commute. Like, when I was younger, driving in a car for an hour, to work for nine hours, to drive home for an hour... I mean, that was brutal. I imagine... I knew a lot of very, very poor people who raised their kids in the car, because that's the only place

that you could do hair and do meals and do all the things that you need to do, the homework, and, like, you know, be safe. Like the playpen, you know, was the car. And so it's just... it could be done so much better if we had our tax dollars going to community centers and community gardens and, you know, community places to get fresh food. And

like, what if that was a prescription that your doctor gave you: you need to go to your community center and get a box of vegetables and learn how to cook them, to introduce better food into your diet. And that was literally just part of your prescription. Like, that wasn't... ⁓ I don't know, we're getting there. I think that the end of...

Katie Smith (51:03)
Mm-hmm.

Yeah.

Beth Rudden (51:17)
these ideas that everybody has to have their own little car and their own little box house and their own things, instead of public transportation, you know, public gardens, public ways to be in... One of the stories that I hold is ⁓ having an understanding of community healing. When somebody is sick in a community,

the social group that they're in, the community that they're in, is part of their healing journey. And what are we doing? You know, we decry the elderly, the sick, the weary, the people that are not on Instagram doing the things all the time. And like, I love to watch it, but I...

Katie Smith (51:53)
Yeah. ⁓

Beth Rudden (52:15)
I get really upset after a while just because I can't live in that space that I always have to do more. I have enough. I am enough.

Katie Smith (52:26)
Yeah.

Yeah. Even social could feel so different if it's in this context of community, you know? And of course we're working on that too, but, like, yeah. So I like this idea of thinking through what that jump is from, you know, online to offline. You know, 'cause we talk about, like, AI taking over, and it's like, you know, the algorithms and who's coding it and, like, whose point of view. It's like, also, just how do we get more...

Beth Rudden (52:33)
Mm-hmm.

Katie Smith (52:56)
Yeah, that's a big thing for me right now. It's just, like, building community offline. You know, I honestly haven't been very good at that. I have tons of friends all around the world, like, amazing groups of friends, but I have not ever been really good about building a community, or, like, just fitting into a community. And I think that's probably true for a lot of different people. And it's like, this would be a really good time for us to figure that out. Like, how do we... we need to get good at that again.

Beth Rudden (53:24)
I was always the one... I have a theory that you might be similar, but I always like to dabble in different communities. Not necessarily... I don't know if I... one seems like way too much commitment. I think, yeah, actually.

Katie Smith (53:45)

That's definitely my thing. Yeah, for sure. Yeah.

Beth Rudden (53:47)
Yeah, yeah,

I'm... and I'm also very introverted, so I have to come back to myself and do my things. ⁓ So I have somebody, and I will get them on here. But ⁓ I have friends who are building communities, and we should invite people who are... ⁓ My friend that I was thinking of is Kim. And she is reinventing theater, ⁓ in a way where it's like

you're experiencing theater as a community. Like I saw Hamilton in London versus Hamilton in New York. Totally different, totally different experience. Different audience, different experience. Laughed at different places. King George was much funnier in London than he was in New York. It was hilarious. Like, I mean, the whole thing was just, just brilliantly done. And she's taking that and using that to create community.

Katie Smith (54:24)
Yeah, yeah, yeah.

What? ⁓

Beth Rudden (54:43)
and give an understanding of how to reinvent theater so that it is both an in-person and interactive kind of thing, but also something that you can take with you and can build a community from. And I know other friends of mine that are creating pop-ups, where they're like, hey, I haven't gotten together with my ex-IBM friends in New York City. Come out here, Beth, can you come out? And I'm like, ⁓ I totally want to. How about we do, like...

Katie Smith (54:57)
love it.

Beth Rudden (55:12)
A traveling show, but it takes those people who know how to create community. I remember. Yeah, I was about to say I remember, and I'm sure you did too. Like, wasn't community a bad word for a while? Like, don't develop a social app because communities are dead. They don't work. And I'm like, of course they don't work like organically with a freaking product. It's not a product.

Katie Smith (55:21)
Very important right now.

Yeah, jump

on to your branded Discord. You know, maybe some of them do, like, yeah, which people?

Beth Rudden (55:50)
I think it has to be organic. Let's talk about community next and bring in some of this stuff.

Katie Smith (55:56)
Love that. And I brought up the Discord thing too, but it's like, it's people just selling to each other. Even when you jump onto these Slacks, it's just people trying to sell each other something. Like, the actual community piece is not clicking. Like, I don't see it clicking yet. And I see new platforms that are, like, popping up. So yeah, it's definitely a conversation to have.

Beth Rudden (56:18)
I love it. This has been lovely.

Katie Smith (56:42)
hi, my name is Katie Smith and I am the co-host of And We Feel Fine.

Beth Rudden (56:47)
Hi, my name is Beth Rudden and I am the co-host for And We Feel Fine.

Katie Smith (56:51)
This podcast is brought to you by Humma.AI. We are building empathetic AI made by and for community.

Beth Rudden (57:00)
This podcast was brought to you by Bast AI. We are building AI products that are fully explainable, fully transparent, and keep all of your data yours.

Katie Smith (57:09)
All right, have a good one. See you next time.

Beth Rudden (57:11)
Bye bye.