At the edge of collapse—and creation—two unlikely co-conspirators invite you into a radically honest conversation about the future. This isn’t just another tech or self-help podcast. It’s a story-driven exploration of who we are, what we value, and how we might reimagine the world when the systems around us stop serving us. We blend personal storytelling, cultural critique, and deep inquiry into what it means to be human in an age of AI, uncertainty, and transformation. We’re asking better questions—together.
Because the world is changing fast, but maybe that’s precisely what we need.
Hosted by Beth Rudden and Katie Smith, two builders of systems and challengers of the status quo. Beth is CEO of Bast.AI and a globally recognized expert in trustworthy AI, with decades of experience leading data and ethics at IBM. Katie is the founder of Humma.AI, a strategist who drove innovation and revenue growth at major global brands before turning to human rights and technology for social good. Together, they make complex issues, such as AI and its impacts on everyday people, clear, personal, and impossible to ignore.
Beth Rudden is the CEO and Founder of Bast AI, a pioneering company building explainable, personalized AI for good. With over two decades of experience as a global executive and Distinguished Engineer at IBM, Beth blends anthropology, data science, and AI governance to create tools that amplify human dignity and intelligence—not replace it.
Her work spans healthcare, education, and workforce transformation, using ontological natural language understanding (NLU) to make AI transparent, accountable, and accessible. Through Bast AI, Beth is reimagining how organizations deploy AI that’s not only accurate but aligned with ethical values, cultural context, and cognitive well-being.
Beth is also the author of AI for the Rest of Us and a passionate advocate for AI literacy, epistemic diversity, and the right to understand the systems shaping our lives. She speaks globally on the future of AI, power, and social contracts—and believes we’re all stewards of the next intelligence.
Katie Smith is the CEO and Founder of Humma.AI, a privacy-first platform building community-powered, culturally competent AI. With over two decades of experience leading digital strategy and social innovation, Katie blends systems thinking, Responsible AI, and storytelling to create tools that serve dignity, not domination. Their work spans mental health, civic tech, and digital rights, using participatory AI to make systems safer, fairer, and more accountable. Through Humma.AI, Katie is reimagining how people and businesses engage AI that’s accurate, inclusive, and governed by consent and care. Katie is also the author of Zoe Bios: The Epigenetics of Terrorism, a provocative exploration of identity, trauma, and transformation. They speak globally on the future of technology, power, and justice—and believe human empathy is the intelligence that will define our time.
Subscribe to our Substack for bonus content: https://substack.com/@andwefeelfine
Beth Rudden (00:00)
Like we're going to have the same thing five years from now, where we're going to see that there were a lot of people who were using text. And I just don't want to disenfranchise those who just got a tool that they can now use. This is literally a tool of the patriarchy. This is the ability for anybody to sound like a white man.
Hi, my name is Beth Rudden and I am the co-host for And We Feel Fine and I'm here with my co-host Katie Smith. Katie, how you feeling today?
Katie Smith (00:48)
I'm feeling all right. You know, I've been really reflective lately thinking about like, despite the nonsense that's happening in the world, I'm actually really, really happy. I am having a problem with my knee right now, but other than that, I'm okay.
Beth Rudden (01:02)
My ears are bugging me. So I have like swimmer's ear issues from when I did diving and popped stuff when I was a kid, and now I get ear infections, and I have a Q-tip issue. I don't know if that's like TMI, but like, you know, too much Q-tip. Like, don't fuck with your ears.
Katie Smith (01:22)
You did it. You're
one of those people that really dug in.
Beth Rudden (01:25)
Well, yeah, but
it's, yeah, it's bad. I need to go get some antibiotics probably, but I'm avoiding the issue.
Katie Smith (01:34)
Okay,
we can answer that question honestly. This is everyone.
Beth Rudden (01:40)
Yeah, I'm a fully grown adult human being. Of course I have like, you know, body issues, but yes, my ear's hurting. Other than that, I'm kind of like you, where I've gotten to a point where there's so much going on in the world, but I feel like I'm involved in ways that I can progress the plot or find people who have been, you know, doing amazing things and support them.
I think it's extraordinary times, but overall, I'm doing okay. It's almost surprising the amount that is being thrown at us, but in general, it's gorgeous to have a garden and to have friendships and to have good food and long walks and, in your case, beaches and surfing.
Katie Smith (02:34)
Yeah,
jumping into the randomness with you, Beth, is so much fun. And like all conversations in between too, you know. So we did a little prep for this session.
Beth Rudden (02:44)
what I really wanted to talk about is the
And I'll start with this understanding that I have about reading and learning how to read. And we spend six to seven years in developed countries teaching human beings that these symbols have this meaning in this context. And I know that there are so many different learning disabilities that we're likely discovering.
as we get more and more literacy within different cultures, different people, different situations. And I know that dyslexia and the study of dyslexia actually informed literacy, how some people with dyslexia understand spatial reasoning better, and I know you're dyslexic, because you have
better spatial awareness. And most of the people, in the work that I'm doing with ontologies and structures that provide scaffolding for an artificial intelligence to understand against that scaffolding, or against those sets of facts, that is something that I feel like, you know, folks with dyslexia, they have like a superpower almost, where they have this ability,
or, well, where through necessity they had to learn to communicate using different tools and different aspects. And what I'd love to talk about is this: there's a little bit of a backlash, because a lot of us who have been using generative AI for quite some time can kind of, a human being can point it out when somebody is using generative AI.
Katie Smith (04:47)
Grrr.
Beth Rudden (04:47)
And
what I'm concerned about is that there will be a divide of people who are highly literate and who are able to use generative AI to be more prolific, where they're writing more and pushing more out and publishing more. And that's a good thing. But what I'm more excited about is people who have dyslexia or people who have English as a second language using this generative AI tool
to get your ideas out in the world. And like literacy and like when we learn to read, it changed our brain. You can see how the brain structure is changed in people who learn phonics versus people who learn to spot read. And one is definitely harder than the other. And so I'm always, and part of why I want
Katie Smith (05:26)
Mm-hmm.
Beth Rudden (05:43)
Part of what I want to do with Bast is understand how our brains are changing as a result of interacting with artificial intelligence, and I wanted to, I don't know, use you for a little bit of field research. Could you tell me how you use generative AI and whether... Hi, Tomlin. Yes. And whether you ever feel like
Katie Smith (06:05)
I was like, yeah, that's the subject. Get into it.
Beth Rudden (06:13)
it's, you're being excluded, because people who are looking at your posts are likely understanding that you're generating them. And how do you feel about that? Because I work really hard to make sure that I don't come off as sounding like an AI, but I also read extraordinarily quickly.
So I can like read it and be like change this, this, this, this, this, this. And so it's like, I'm constantly kind of going back and iterating and changing how I'm asking, how I'm prompting, how I'm using it. And I, being the archeologist, I want to lay everything down for posterity because it's an evolution. Like it's something that I'm evolving. And so I'm showing my work on, when, you know, in the before times when I was using the API, I could tell.
totally tell you what was generated and what wasn't. And
like we're going to have the same thing five years from now, where we're going to see that there were a lot of people who were using text. And I just don't want to disenfranchise those who just got a tool that they can now use. This is literally a tool of the patriarchy. This is the ability for anybody to sound like a white man.
Katie Smith (07:37)
Yeah, wow.
Beth Rudden (07:40)
Help me unpack this.
I wanna ask:
How did you feel when you knew that you could now generate content and put content out there in a way that you might not have been able to do before because of your dyslexia?
Katie Smith (08:00)
At first it felt like cheating. It felt like, you know, like I need help, and it's like I'm sort of cheating. That was my first instinct. And then once I started doing it more, I'm like, no, this is like... I don't have to worry about people interpreting me through typos. A lot of people judged me just because of typos, right? Because of the elite environments that I was in, right? Especially, people were just like, you know, how did you not
Beth Rudden (08:18)
Hmm.
Absolutely, yeah.
Katie Smith (08:28)
get that typo, and it would become a conversation about a typo in my communications versus what I was saying. And so, the fact that Grammarly... and I've been an early adopter of Grammarly, like I've been paying for Grammarly since the beginning. It knows me really, really well. I knew that the AI was learning me, and it has helped me tremendously. I will say though, like,
Beth Rudden (08:41)
Mm-hmm.
Katie Smith (08:54)
I remember the before days and then, you know, and then it's like AI helps me write everything now. In the before days, you know, I wrote a book. I still have pain over the typos that are in that book. But what I like about my book is that that's me. That's my writing. That's my voice. And there's a part of me, even though it comes with typos, I like the sort of poetic fun way that I write.
Beth Rudden (09:14)
Mm-hmm.
Katie Smith (09:23)
And it's not grammatically correct. I never wanted it to be grammatically correct. I wanted it to have like a beat structure. So when you read it, it felt sort of poetic, you know? It had a rhythm to it. Whereas with professional communications, I can't write like that. I can't write like myself, you know? My creative writing does not translate to business. People don't get the message if you write it in a creative way. You have to write it in a particular way.
So when I started using Grammarly, and it's on my phone, it's on my browser, it is everywhere, I communicate with anyone. Yeah, it helps me actually communicate a message versus being criticized and judged for how I wrote the message.
Beth Rudden (10:11)
So I wrote a book, too, before ChatGPT. 2023 is when it came out, but like March, April, and we were writing it in November. And ChatGPT had no memory. And even some of the APIs were just very difficult. But I did have a generated portion in there.
And that was clearly marked as this is my conversation, you know, back and forth with the AI. And I think it's in an appendix or something. And I remember the struggle, the toil of getting to that rhythm, that intonation, that drumming, that, like, I wanted it to sound like a conversation and sound like, hey, I'm gonna...
draw you into this story, let me tell you a story, once upon a time, use story language and everything. And one of the things that I really hate right now about generative AI, and I am so sorry for those of you who have not figured this out, but one of the things that ChatGPT, or OpenAI, like even Claude does this,
is that it uses counterfactuals or contrasts. Like, it's not like music, it's like rhythm. You know, it's not like this. It's not just like this. It's more like this. And it's sort of one of those sure signs, like the dashes and the Oxford commas and everything. However, I...
Katie Smith (11:47)
However,
the Oxford comma is amazing. Okay, sorry, go ahead.
Beth Rudden (11:51)
Actually, why do you think the Oxford comma is amazing?
Katie Smith (11:54)
I think that gets back to the beats. You take a breath, like, you can make a point, like, it's clearer.
Beth Rudden (12:01)
How many people do you know that know a comma means take a breath?
Katie Smith (12:08)
I mean, people in communications know that. You know, I've been surrounded by communications experts for more than a decade, but outside of that, I don't know.
Beth Rudden (12:16)
Yeah. It's like the serif. But I think there is a... It's just, it's almost... It's very... It's supposed to be generative, right? But I do think that there is a messy middle that we're in at the moment where...
Katie Smith (12:30)
really.
Beth Rudden (12:40)
I love having it where I can say, give me a bibliography and I can throw it a bunch of stuff and it formats it for me. And I don't think that's cheating. I don't think that Grammarly is cheating. I think that's like using a calculator to check your work kind of thing. But I do think that, you know, sort of like taking a...
Do you take, or how do you write with your conversational AI? And I'd love for you to talk a little bit about the fact that you have trained your own model with Humma. And you've done a lot of work on making sure that this is something that does have your voice. Can you talk a little bit about the amount of work and time that you've put into that?
Katie Smith (13:31)
Yeah. So using AI to communicate for Humma, or using AI to communicate as me, right? As the leader of Humma. Well, look, you know, it's different than the days where I would toil, and I miss that toil. I hope I get to, you know, semi-retire and toil. I would really love that. But I just don't have that time anymore. So
Beth Rudden (13:55)
Mm-hmm.
Katie Smith (13:56)
Normally what's happening is I love Perplexity, really love Perplexity for research. And so I will get my sourcing and my news from Perplexity, and then I'll put it into my trained AI model, you know, and say, okay, I'll turn this into a Substack post, you know. And then I throw that into Grammarly and then I start editing.
Beth Rudden (14:02)
Mm-hmm.
Mm-hmm.
Mm-hmm.
Okay.
Katie Smith (14:25)
And so the final polish is me, a human, and I try to take out some of those telltale signs, because I think they're just annoying now for the reader.
Beth Rudden (14:32)
Yes.
And it's like, I'm so sorry for pointing them out, but that's the problem: our brains, we adapt. We're so pattern-making, or like we seek patterns everywhere. So the minute that an AI generates something, we're like, wow, that's amazing. That's awesome. It said it so well. We stick it in Grammarly. We're like, crap. Now you can see blah, blah, blah. And it's like you go through that critical thinking,
you know, about it. And so your final product is always you. And that's
Katie Smith (15:03)
Yeah. So
it is, it is always me, but I will say, like, again, it's different. It's not that sort of poetic, fun, creative writing style that is distinctly my voice. You know, one of the best pieces of feedback I ever got from a UCLA creative writing professor was like, you actually have a unique voice, Katie, like keep going, you know, keep going. And like, I need to get paid, poetry doesn't get paid, some people get paid, good on you. But
Beth Rudden (15:11)
Mm-hmm.
Katie Smith (15:31)
No, so what I have to do now is like, I want to share what I'm thinking so I can signal to stakeholders. Like, look, I want to be very clear with everyone what my values are as a leader of this organization and a contributor to this movement.
And so I want to make those signals really, really clear. So lately I've gotten a little bit more confident and a lot of that is because of the AI, because I just don't have time to toil. So if I go through this process, then I can at least get to something quicker and just get those pieces out. And so, yeah, I'm curious how it's landing because I could see some people saying, well, that's just AI, but it's like, no, actually there's a lot of thinking behind it in terms of even my conversation I have with the product.
Like I don't have to type. I just have conversations with Perplexity, and I'm constantly like that curious little... That's right. And you know, it's me asking the encyclopedia. I am now just having these deep, very curious conversations with Perplexity, and then I synthesize that, and then I take it to my model, and then I work it more, and then I get Grammarly in and I fine-tune it, and then it's out in the world. Like it is me.
Beth Rudden (16:18)
Yep.
using voice to text.
Katie Smith (16:46)
I just used AI to do the whole thing.
Beth Rudden (16:50)
So what would you say to an author that you respected? I mean, how would it make you feel if you found an author who said, I'm going to block anybody who puts a comment that is obviously generated by AI?
Katie Smith (17:08)
I would say, you know, be careful. You could be discriminating against people who really need that tool. You know, like, look, before
anything about Gen AI came out, and the impacts on climate and the swimming pools' worth of water that we use for training, the list goes on of all the harms we've learned about Gen AI. Grammarly was just like Google Search and Maps, like, in theory, it was AI behind the scenes. And it's like, nobody knew about it. And so I was already using it. But now that, like,
there's this coded way that AI works, it's like we're being judged for a new tool because the new tool is flawed. You know, it's like Grammarly at least just corrected the sentences, but you couldn't really tell it was Grammarly. It was just like, that's a very vanilla way of Katie speaking.
Beth Rudden (18:05)
Well, what I found with Grammarly, and I actually don't like the Oxford comma, but I rarely do lists of more than two, and definitely not at three. So it's so interesting to me, because the Oxford comma and a lot of the grammatical
you know, systems and syntax, it has a richer meaning in my mind because I've studied Noam Chomsky and linguistics. And it's so frustrating because there's such a little bit of a drop of understanding in the world about all of the things that, you know, computational linguists know and what.
people are studying as far as language. And it is fascinating to me because that's such a, those people are not computer scientists. So they're not in this world of building AI, but they are the very humans that have the dense, deep understanding of how these models should be trained to model the humanness of
you know, the language that gives you that turn of phrase, or that stochasticity, or the randomness that gives things rhythm. And the, you know, that tone and intonation, yes. Yeah.
Katie Smith (19:28)
Mm-hmm.
Jazz! No, poetry
to me is a little bit like jazz. Like, it's not going for perfection. It's just going like, how do you feel? You know?
Beth Rudden (19:41)
Yeah, hold on for one second.
An experiment. And this is what I'm going to do: I'm going to bring out The Life of the Mind by Hannah Arendt. Do you know who Hannah Arendt is? Yeah. Yeah. And so this is a two-part set. One is Thinking, one is Willing. Okay. And I want to try to start. I mean, I could probably start anywhere. But like,
Katie Smith (19:55)
Yes, of course.
Beth Rudden (20:15)
I'll just start at the beginning. "The title I've given this lecture series, The Life of the Mind, sounds pretentious. And to talk about thinking seems to me so presumptuous that I feel I should start with less of an apology than a justification. No justification, of course, is needed for the topic itself, especially not in the framework of eminence inherent in the Gifford Lectures. What disturbs me is that I try my hand at it, for I have neither claim nor ambition to be a philosopher or to be numbered
among what Kant, not without irony, called Denker von Gewerbe, professional thinkers." Literally in the first paragraph, she's quoting Kant. She is referring to things that very few people would have any idea what the hell she's talking about. And she's doing it in her rhythm. And it doesn't sound very presumptuous, you know, and it's so fascinating to me because
all of the authors and the people that we have as diverse voices, at least in my head, in my library, that I surround myself with, all of the wonderful human beings that I'm reading, it's like there is a... this is a juicy topic. So I am most definitely a sapiosexual.
Like I love intellectual thoughts and like totally get off on intellectual masturbation and whatever you want to call it. You know, like I am very much like out loud and proud about that. People who know me are like actually one of the biggest compliments I got recently is it took two weeks for somebody who knows me. He was like, Beth, I got to tell you something.
You really gotta dumb it down.
Like, or everybody could, everybody can come up. Yes.
Katie Smith (22:16)
Come on, one of the
things I love about talking to you is that you do have this knowledge. Preach! Go! Come on.
Beth Rudden (22:24)
But I think what I'm trying to figure out is like there's a level of effort that people are going through and that level of effort is decreasing for some and increasing for others, but it's changing how we are able to push things out in a way that is changing our own brain.
Why I wanted to talk to you specifically about this is you do have a disability, but it is also a superpower because we can learn from how you're learning to be able to use artificial intelligence to push things out. And like, what if you could do that with your rhythm, with your intonation, with your voice?
Katie Smith (23:18)
Amazing, and also scary, but amazing. Because I don't have time. Someday I hope to have time, but in lieu of that time...
Beth Rudden (23:25)
So,
well, let's talk about that time for a second because like we, both of us are CEOs, both of us are running businesses and doing the banal things of getting money and making sure payroll is there and all of the things that you do when you're running a company. But like the ability to reflect and to generate text and to think about things and think about how it's changing our brain, that's what
this wonderful time is, in this space. And I heard this. It was a Susan Cain post that is most definitely written by Susan Cain. She's such a lovely author. And it was on Substack. And in this post, she used a phrase twice that caught my attention. And the phrase is, I don't want to work. I want to make a living.
Katie Smith (24:22)
I saw you post something about that.
Beth Rudden (24:25)
And I was like, yes, that right there, like I want to make a living. What if making a living and the ability to have time to toil on the writing and have that effort, like what if that could be our vocation? Like what if that could be our calling? And what if we had that world and everybody's like, you're such a, you know, optimist, Beth, there's no way that's such a utopia.
But I'm like, wait a second, we do have enough food and technology in the world to know how to feed everybody. We do have enough understanding of governance to create governance systems where people are not killing each other. Did you read or did you listen to... I do it synonymously, my daughter only listens.
Katie Smith (25:01)
Mm-hmm.
Beth Rudden (25:22)
And yes, there will be a deficit in her world because she is not going to read as much, but she listens to two to three books a week. And there is no way I am taking that away from her. And, you know, she's putting in the time in order to get to the story. And yeah, I mean, Hannah Arendt and that entire world that, you know, we came from would be like, my gosh, that's such a poo poo. But.
Katie Smith (25:31)
⁓ wow. ⁓
Beth Rudden (25:53)
Sweet baby Jesus, could we please, I feel like there needs to be an elitist translation because for so long elitists have ruled the world and now I'm like, but what if they didn't? What if everybody could use the tool of the dominant paradigm and sound just like the dominant paradigm? What do we do? And how do we surface our own?
Just more of the same droll-sounding, overconfident fluff.
Katie Smith (26:32)
You know, there's interesting platforms, you know, like Substack, and you see this mad rush to Substack right now. You know, the Washington Post is looking at, you know, buying it. So that's very interesting. There's some dynamics there, but the reason why that's happening, right, is because media is so fragmented and people are looking for authentic voices. I mean, that's why I think this space is great. Like we are unfiltered, y'all. We are just talking, and we decided on the topic, you know.
Beth Rudden (26:57)
You
Katie Smith (27:00)
Yeah, loosely over the weekend, but maybe right before this. Although we've been talking about it a little bit. But you know, there's like, there's, I would say there's a need more than ever for people to use their voice. I mean, isn't that what a democracy really is all about? And so this idea that there's going to be this new crop of voices that are going to bubble up and look, our new talent is going to turn into superstars. It may not be us because we're such nerds. Who cares? We'll have our little niche.
Beth Rudden (27:04)
the
Hahaha!
Katie Smith (27:30)
I just get to carry it out with you, which is just fine with me. No, I think that there's platforms now to democratize our voices in ways that are very different than what Meta has done, you know, like actually giving us voice. And so I think YouTube does that too. The thing I would say is, versus audio, like even as someone who struggles to read, like I had to read so many legal documents yesterday.
Oh my God. You know, I have to have that reader, you know, like the adaptive sort of thing, like, just help me do every single line, you know, to read. It was so painful. God, I wish I could have just listened to that, but they didn't give me that. You know, I had to actually toil and read and edit this legal document. But
listening to podcasts... but actually I force myself to read books and I force myself to read documents, and it's painful and it's hard and exhausting. But I do think it actually is really good for us to read, too. Like Empire of AI. I'm not listening to that. I'm reading that, you know. Did I scan through it first to pick up some things? Yeah, but I'm reading it now. And it's great. It's just
Beth Rudden (28:19)
Mm-hmm.
Mm-hmm.
Katie Smith (28:43)
And I hope we don't lose that, because I do think there is some developmental piece of that that seems like it's important.
Beth Rudden (28:50)
You
know, the irony is that there were people having this conversation, definitely not on YouTube with all of the technology, but same conversation about reading, same conversation about radio, same conversation about television.
Katie Smith (29:02)
Right.
Beth Rudden (29:06)
There's this anhedonia, which is a Greek term, and I am studying the hedonists. So you have Socrates, Plato, Aristotle, and then, like, the hedonists, which were the people... There were sort of two camps. One camp was
all the pleasure, all the time. The other camp was, you know what, you kind of need to get to moderation, everything in moderation, including moderation. And the reason for that is something called anhedonia, where people become, like, immune to these short clips or immune to all of the writing that's out there. So they're just scanning things, looking for, you know, something that is interesting to them,
or worse. I mean, I remember, there's factoids. So there's an actual literal term, and the news titles and the news articles, they often include factoids, which are very biased
facts that are made to seem sensational. So you'll read the article. And so that's what the entire LLM is trained on, a bunch of factoids, which again, it's not an actual fact. It's a factoid. And there's this really great story about somebody at IBM, when they had Watson and they went to Japan, and they completely
lost the entire sales pitch, because in Japan, in Japanese, the word factoid was translated by the translator as propaganda. Which again, so somebody's just scanning headlines, they're just scanning factoids, or they're just scanning propaganda, which is not good for you. So actually opening up a book...
Katie Smith (31:15)
now.
Beth Rudden (31:19)
Is that like I just.
Katie Smith (31:21)
What the
fuck is resiliency? I feel like me reading right now and taking a step back from the news is like, okay, I see what's happening with the bill, right? The big loser bill.
Beth Rudden (31:31)
Yeah. Big
loser.
Katie Smith (31:34)
Loser bill,
what a loser. People who are voting for this and wrote it, like, big losers. But worse are the people who are going to lose their health insurance, maybe more than 16 million people, it's insane. So it's like, you know, the deficit, like future generations, like what we're doing to future generations. I don't even have kids; the people with kids, come on, you should care about the deficit. You actually should care. And the proposals from the Republicans are ridiculous. Anyways, I digress. But you know,
There's only so much that I can take. So it's really nice just to like have some music on and just read and just relax for as long as my little puppy will let me. It's actually been delightful even though it's a painful toiling experience.
Beth Rudden (32:24)
Mine is I have stacks of books everywhere. And I will pick books up and read just like I did earlier from whatever. And it just puts me in a different mood. And then I have the books that I'm actually reading. And I'm trying to get through, but I'm often interrupted. I do use the Pomodoro or the tomato timer kind of technique where it's like 25 minutes on.
Katie Smith (32:51)
Yeah.
Beth Rudden (32:54)
five-minute break, get up, run around, do stuff. But ever since I was a little kid, I just don't know how to sit still very well. Like sitting still or walking with a book was always hard, but that's why I love Audible. And you know, I definitely consume information differently when I'm listening to Audible. My opinion is that I feel like people should work as hard as they can to express themselves.
And if there is a tool out there that can allow people who have been... there's lots of studies that I'm sure people could do in order to showcase that people from India who had a British accent often got paid more than people from India without a British accent. I'm sure there's some supporting evidence out there, pretty positive, but I think that
That's where I was talking to a friend of mine in Germany this morning. And she was like, it's very interesting because the uptick of American companies using integrated AI is not as strong as European and Asian companies. And then from a personal use, India is off the charts. And I'm like, well, of course, they can sound like the white guy.
Katie Smith (34:14)
yeah.
Beth Rudden (34:17)
Why wouldn't you use this as a tool in order to be able to truly communicate what you're thinking in the language, you know, using this tool? So it's just, it's fascinating to me that like,
Katie Smith (34:28)
Yeah.
It can open
doors in these weird ways.
Beth Rudden (34:34)
It
also reminds me that the purists are probably kind of like the people at Facebook that were at the top, that never let their children use their own product. And what does that say? And the reason that I think America is very behind, and Europe has sort of a jump on AI integration, is they had to get their shit together for GDPR.
Katie Smith (35:04)
Yep.
Beth Rudden (35:05)
And in order to do that, you actually have to classify data. You have to know what you're doing. You have to have some form of a knowledge base that you can understand what it was. You can't just, you know, take all the shit that you find on the internet and throw it into some neural nets and see what you got.
Katie Smith (35:25)
More nodes, more nodes, more compute, more compute. It's like, god, so.
Beth Rudden (35:28)
Yeah, I'm done.
I heard, well, all right, I'm not done. I heard some stats today. Something like 13% of all the power in the world is being used by AI. By 2030, that's going to be 27%. And that just makes me... I have learned from Glennon Doyle and Abby Wambach and Sister that
anger is a signal of your need not being met. And so if you get to rage, it means that you are starving for that need to be met. And I am rageful at this ridiculous space race for bigger and bigger. They're building literal data centers. And you know, if you
Think about them as like churches. To what? To consuming power for something that isn't even able to do tone and intonation in a way that makes language sound like a human? Here is a prediction. What if in five years or so,
Katie Smith (36:30)
Yeah.
you
Beth Rudden (36:53)
we started to have people like my mother, who reads an extraordinary amount and can read very, very quickly, and was an English teacher, and who figured out different ways to teach people who were visual or who had dyslexia or who were on the spectrum. And she taught me a lot about my own disability and being able to sit still enough to read. But what if
there was a job where it was a human reader. So the AI could generate text, but you would want to pay more to have a human reader go through and make sure that it was read by a human. And that job could be done by anyone of any age.
Katie Smith (37:45)
I like that. It's a great idea. I like the thread that you're weaving: there's the judgment. Every time a new media comes out, the old version, the old guard is like, no, that media, no to that. It's the end of all things. And then it just cycles. And so now it's like AI. So I will say, like, has AI helped me
Beth Rudden (37:57)
Yes. Yeah.
Katie Smith (38:14)
you know, in the past few years, even launch my company? Yes. I'm not even sure I would have been able to move as fast as I did, because I was able to communicate so quickly to various different people because of these tools. And it went beyond Grammarly, right? It went beyond what Grammarly was able to do for me, you know? And so, look, there's definitely ways that it's really useful for us. However,
would I have, if I knew then what I know now about the compute and the long-term impacts... I don't know if I would have built my model and my memory quite the way that I did, you know? You know, this is why I really think what Bast is doing is so important, and we need to talk more about the alternative ways, you know? I think you can use Gen AI to a point. Like, look, I still am a proponent of open source. We already have it, it's already there, give us the model,
Beth Rudden (39:12)
Mm-hmm.
Katie Smith (39:13)
we'll do good things with it. And then evolving into, how do we now merge with something like Bast to do the explainable AI piece, so that moving forward, at least it can be explained. Because the foundational models have been scraped. They scraped everything. Read Empire of AI, it's horrible, they scraped everything. I know whatever the courts are saying now, that's one thing, just use your own judgment. They scraped everything. Those people did not get paid.
Most of them didn't. Yeah, so it's like, how do we just pick? Yeah.
Beth Rudden (39:48)
Well, let's
talk about what I see for the future and what I'm doing with Bast, which is: would you like a $200,000-a-month AWS bill because you're constantly having to process large volumes of data to get to statistical certainty? Or do you want to pay like $20, $30 a month to AWS because you know exactly what data you're processing for what problem you're solving? It's so simple,
like, and it's so ridiculous. But the problem is, you know, it's like the "cigarettes are good for you" marketing. It is absolutely a huge amount of hype that tells the entire world that you absolutely need big compute and big data to build AI. Some AI systems can benefit that way.
But most of the time, we are bringing down, and even OpenAI is bringing down, the cost of doing this. I just decry the space race. That being said, there was this other thing, I actually put it in a blog, and I remember doing this, where before Empire of AI came out, I wrote a whole book in 2023 with my co-author,
talking about all of these things, and people still... AI for the Rest of Us. We want, we, yeah, we were really watching, you know, The Last of Us, and we were thinking about all of these things, you know, the mushrooms, and that show is wonderful and terrifying in and of itself. But we were thinking through how can we get the word out?
Katie Smith (41:13)
Yeah, very exceptional. What's the name of your book?
Great work, all right.
Beth Rudden (41:41)
because nobody knew this. Nobody still knows this. When I was on the road, I remember so many VC conversations where I would start with, did you know that an LLM takes the equivalent of 15 swimming pools full of fresh water to train it once? And they'd be like, but they only have to train it once, right? And I'm like, my gosh, one training cycle, they would do, like,
of those a day. I'm like, this is insane amounts of power. And they just could not understand what I was saying. And I couldn't figure out how to dumb it down anymore, because it was like telling them that Santa Claus wasn't real. And they didn't want to believe me. A man does not understand something his salary depends on him not understanding. So we wrote this whole book and we still get
reviews and people coming to us saying, oh my God, what you guys are saying is so relevant for today. And that's because we applied principles like parenting. How do you use parenting to train your AI model? Go read the book. It's fantastic. And to your point, it's human-written. It's going to become more precious, because the amount of AI-generated
Tower of Babel shit is going to become the noise. And so how do you find those signals? AI for the Rest of Us, go read the book. What I purported in that, and what I predicted, is exactly what I'm thinking we're going to see with this. How did we grow up an entire generation of human beings that thought that the Internet was theirs to use? Like, how did we even
create human beings in this situation where they thought that they could take all of it? And now there is a court case saying that Anthropic doesn't have to, that you can buy the book and then you can use that book legally to train your models. And that is devastating for us authors who toiled, right? And so what I think
Katie Smith (43:57)
Yeah, totally.
Beth Rudden (44:00)
we're gonna see is some forensics, almost like a forensic, data-and-information understanding of where it comes from, so that you can actually trace it back to the source. And this is what I've been literally banking on: I believe that when you can prove origin, you can prove provenance and lineage,
it will make whatever you have more valuable.
Katie Smith (44:32)
Yeah, we're working on it. Let's go.
Beth Rudden (44:34)
Because, and I don't have as much conflict with generative AI for words, but I definitely do with Midjourney. And that's because I can't draw. And I can't draw like I want to draw. And then I can go and I can use Midjourney. And the prompts that I've always sort of done with Midjourney somehow turn out exactly in the area that I want it to turn out.
I also am so conflicted because so many artists' stuff was stolen. So I'm waiting for the day that my technology or something like my technology can help artists get reparations. I don't know, we're not very good at reparations.
Katie Smith (45:21)
We're working on that actually, so we should talk offline. I can see someone who's really trying to do that. Anyways.
Beth Rudden (45:24)
Yeah. Okay.
But think
about it, like when you can prove that you have origin or you have provenance, and, you know, I use my journal because I journal every single day. It's like somebody could say, oh, you could have written that this morning just because it has today's date on it. How do I know it's real? And then you could say, well, here's the entire book with all of the dates. And then here's the 17 other journals that go with that book. Would I have really generated all of that?
Katie Smith (45:54)
you
Beth Rudden (45:59)
And that's where I think that we are going to value things that are crafted, that are human-created, far more. So how do we use that with remixing that idea, or remixing people's ideas and taking what we love and making it relevant for us to be able to use, in a way that doesn't create
this conflict of having to use the Midjourneys, where, you know, they just scraped all of these artists' work without really... And you, you don't use as much AI-generated pictures at all. Why, why, why?
Katie Smith (46:42)
No, I really avoid it. Yeah, I love
the authenticity of real photography. Even if it's not great, it's still better. I love photography. And I do, yeah, okay, so a couple of reasons. One, I recognize, you know, I'm friends with many artists, and my artist friends are just horrified by the idea that their art was scraped and now a company
can repurpose that, commercialize that, they're making money, the artists are getting nothing. You know, the artist communities are really struggling right now. So I just feel like it goes against my community of friends to use that type of stuff. With Humma, we've avoided it completely. We're not doing, you know, that sort of computer vision on that side, or even the image generation. You know, if we can get there by partnering with artists, we're thinking about that. And then we would repurpose their art. So we're exploring, as you know, like a
Beth Rudden (47:18)
Mm-hmm.
Mm-hmm.
Mm-hmm.
Katie Smith (47:37)
creative storytelling campaign right now that I can't talk about, but I'm really excited for when we can, because it leans into a responsible way to do this with an artist, with artists. But yeah, it just doesn't feel good. I think they don't look good. They're shitty. You know, like you've created beautiful images, but just like you said, there's telltale signs in tech we can talk about.
Beth Rudden (48:03)
Mm-hmm. yeah.
Katie Smith (48:05)
AI images just look blah. They just, like, I don't like how they look. And I, you know, good or different, can dabble in little photos I can find throughout the years and just throw those in. I'd rather have that, even if it's not perfectly aligned with the story.
Beth Rudden (48:23)
Okay, so I think we both have our taste, and I'm so glad we talked about this, because my community of people are authors, and the authors are... and that's why I have such angst in using any sort of generative AI without
making sure that it's like a separate section, or people know that this is purely generated AI versus something that I toiled with. And you are the same way. You have a similar distaste or disgust for AI-generated art.
Katie Smith (49:10)
Yeah. And I feel like when it comes to the propaganda angle, like, look, sure, there's text, people can write nonsense all day long and try to persuade you on something, but there's nothing more powerful than an image or a video. So the idea also that AI is being used to propagate propaganda, right, is sort of terrifying to me. I don't want to have anything to do with it until that's regulated and controlled.
Beth Rudden (49:14)
Mm-hmm.
Katie Smith (49:37)
I never want any of my images ever to look like AI. Yeah. And even on Humma, I would really like people to stick to real images and not AI.
Beth Rudden (49:40)
Mm-hmm. Yep.
How do you signal, I mean, just by doing it, we're not talking about these protocols. These are unwritten expectations. These are unspoken expectations. How do we kind of continue to go on when it's just, it's such a paradox where like I...
love the images that I generate, but I hate knowing the pain that the community is going through. I can do something about it with language, because I have an affinity for being able to read and write very quickly, but not to draw. And then you have that with, you know, drawing, with your artist community, and you can kind of...
use that, and that allows you to kind of do that purity in the same sense that I'm doing that purity with language. God, I want everybody to talk about these unspoken expectations of how they are defining their taste, out loud and proud, and how they are expecting the people with similar tastes to get the signal. Like you and I have never talked about this, even for this podcast.
I was like, I love the boho art. Remember, we had this conversation. Yeah. Goodness, Katie. This is cool. Like there's all these ways that human beings are expressing themselves to show what they love, what their taste is, and what they value. Like Hannah Arendt, quoting German philosophers and saying how they don't want to be like
Katie Smith (51:27)
but they value, right?
Beth Rudden (51:37)
German philosophers. It's so fascinating. And we are barely scratching the surface of what it looks like to have a good understanding of ourselves, of why we know what we know, why we do what we do, why you don't want to use generated art. That is, I think, incredible. And I think that's where,
Katie Smith (51:39)
you
Beth Rudden (52:07)
in some ways, I think we have to really think about... we have to think about the tools that we're using, and we have to think about who is creating those tools, who is benefiting from those tools, what is the data that was used. And this goes back to the book that Phaedra and I wrote. We gave all these people all these questions because we're like, if you ask these questions, you're not gonna like the answers.
Katie Smith (52:32)
What?
Beth Rudden (52:34)
So we're not going to tell you the answers. Just go find out for yourself and ask these questions. What were the data sources used? And that's what Karen Hao did in Empire of AI. She's like... because the New York Times and the Washington Post and many journalists did such a great job of forensically uncovering what data sources were used and the telltale signs of all this kind of stuff. I think this is such a...
It's an interesting time to be alive. And I like this conversation so much better than the stupid how-do-I-make-my-life-more-productive-with-AI conversation. Because fuck that shit. I want to make my life more interesting to other people by explicitly sending my signal out to the world to find my other sapiosexual people. Right?
Katie Smith (53:27)
Yeah, well, I
100% agree. And I think people will be craving a more human experience as we get just inundated with this AI, you know, marketplace that's going to become more and more automated and more and more efficient and more and more agents. I mean, we're going there whether we like it or not. All those... like, what is powering that? Is it
the masterminds doing the puppetry today, or is it someone like Bast? I vote for you. And in terms of products, we're really focused on creating a community network that creates authentic connections and gets people offline. We want people to be able to have a human experience, like build community, meet in person, use this platform to share experiences and interests, and I won't get too much into it, but like,
Beth Rudden (54:03)
You
Mm-hmm.
Yeah, that's right. That's right.
Katie Smith (54:25)
Yeah, we're banking on more humans.
Beth Rudden (54:29)
It has to be this way. It's, you know, in all of human history, there is only one crystal ball, and that is going to be literal demographics. So if we have all of these AI, agentic kind of systems, those are going to be part of our demography. They're going to be part of our demos. They're going to be part of our
city, they're going to be part of our...
Katie Smith (54:59)
It'll
be associated with identity. It will. It's funny because we're like saying, I don't think politics is... it's like your agent is going to signal who you are. It's going to be...
Beth Rudden (55:12)
Whether you like it or not. I think that the choices that we have today, where we can be finicky with not copying and pasting from generative AI onto a blog or not, or doing whatever we're doing, or using Midjourney to generate art, that sends a signal. And I want to have more conversations
about this, because I feel like it's the only way to introduce maybe a new way to be curious about what and why people do the things they do, and what their culture says about data and about using somebody else's data, or using a system that we know took somebody else's data without their consent.
I'll be damned, we totally live in this beautiful age of transparency, where it is affording us the understanding
to learn more about who you are and what makes you human.
Katie Smith (56:19)
explore your ideas. I'm really having fun with that right now.
Well, that looks like the end of that episode. Thank you so much, Beth.
Beth Rudden (56:27)
Thanks for watching and listening to And We Feel Fine. It's available to stream on YouTube, Spotify, iHeart, and Apple Music. Don't forget to follow, subscribe, rate, and leave a review. We'd love to hear from you.
And We Feel Fine is sponsored by Bast.ai, where we are pioneering full stack explainable AI.
Katie Smith (57:09)
And We Feel Fine is also brought to you by Humma.ai. We're a California benefit corporation creating empathetic AI made by and for community.
Beth Rudden (57:18)
See you next week.