The WorkWell Podcast™ is back and I am so excited about the inspiring guests we have lined up. Wellbeing at work is the issue of our time. This podcast is your lens into what the experts are seeing, thinking, and doing.
Hi, I am Jen Fisher, host, bestselling author and influential speaker in the corporate wellbeing movement and the first-ever Chief Wellbeing Officer in the professional services industry. On this show, I sit down with inspiring individuals for wide-ranging conversations on all things wellbeing at work. Wellbeing is the future of work. This podcast will help you as an individual, but also support you in being part of the movement for change in your own organizations and communities. Wellbeing can be the outcome of work well designed. And we all have a role to play in this critical transformation!
This podcast provides general information and discussions about health and wellness. The content is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Always seek the advice of your physician or other qualified health provider with any questions you may have regarding a medical condition. Never disregard professional medical advice or delay in seeking it because of something you have heard on this podcast. The podcast owner, producer and any sponsors are not liable for any health-related claims or decisions made based on the information presented or discussed.
Caroline Chubb Calderon
Jen Fisher: [00:00:00]
As AI and intelligent machines become increasingly integrated into our lives and workplaces, questions about human well being, purpose, and mental health take center stage. How do we preserve our humanity while embracing technological advancement? What role should AI play in supporting well being and mental health?
And what are the risks that we need to consider?
This is the WorkWell podcast series. Hi, I'm Jen Fisher, and today I'm thrilled to be talking with Caroline Chubb Calderon. She's a futurist, humanist, and the founder and CEO of Hello Humanity. A company dedicated to reimagining humanity in the age of machines. [00:01:00] Caroline works with leaders worldwide to create a future where technology enhances rather than diminishes our human experience.
This episode of the WorkWell podcast is made possible because of our friends at Lyra Health. Lyra Health is a premier global workforce mental health solution trusted by leading companies like Starbucks, Morgan Stanley, Lululemon, and Zoom. Lyra provides personalized care to over 17 million people with fast access to evidence-based providers and tools that deliver proven results, including faster recovery and reduced healthcare costs.
This season, Lyra and the WorkWell podcast are teaming up to bring you more insights on how to build a thriving work culture for today and the future. We'll be bringing you cutting-edge data and research on workplace mental health and wellbeing, and we'll have some Lyra experts occasionally join us to share their perspectives on [00:02:00] workforce mental health and creating psychologically safe and effective work environments.
Find out more at lyrahealth.com/workwell. Thank you to Lyra for helping us elevate this season of the WorkWell podcast.
Jen Fisher: Caroline, welcome to the show.

Caroline Chubb Calderon: Thank you so much, Jen. It's such a pleasure. Honestly, I can't wait for this conversation. I'm so delighted. Thank you for putting this work out into the world.

Jen Fisher: Let's talk about that. Your work focuses on the intersection of humanity and technology, and that means a lot of things. So can you explain that to us, but also the way we're living and the way we're using technology, and the impact that it's having on human wellbeing?
Caroline Chubb Calderon: So, yeah, um, rich question. I have spent a lifetime thinking about how to unlock human flourishing and human potential within organizations predominantly, twinning with another [00:03:00] field, which is around disruption, right?
How do we unlock human potential so that we can disrupt markets, so we can disrupt futures, so we can really be the leaders of emerging futures? You'll hear me often say I'm a futurist and a humanist. The futurist part studies the impact of autonomous and intelligent machines on business and society to understand those disruptions.

And the humanist part is around understanding how we can lead for an emerging future of greater human and planetary flourishing. Those twin areas, back in 2015, believe it or not, I'm a little bit aged in this area, led me to recognize that there was a profound shift in what was happening in the marketplace.
And that was led by the emergence of early artificial intelligence. And I, of course, because my work is in futurism, I could foretell that we were going to have artificial intelligence that would be dramatically [00:04:00] disrupting, not just the future of work, but the future of life and the future of society. So I have been hard at work over the last nine years, really exploring alternative visions for what the role of a human will be in this newly recalibrated world of artificial intelligence.
And. My personal hope and my personal vision is a hopeful one. I'm extremely optimistic. A lot of people have dystopian ideas of what this future would look like, but I actually believe that if we ushered this moment with wisdom, with care, with utmost ethics and responsibility, which this moment is calling for, I think we could create a future civilization of more freedom.
More compassion, more humanity, because the labor of our work, the labor of our lives, the [00:05:00] things that keep us so locked down would be bequeathed to machines that are going to be able to do it better than us. And we get to do what humans do best. We get to be human again. We get to reimagine a civilization that's more centered around human and planetary flourishing.
And that's exciting. That's really exciting. So I started Hello Humanity with the mission of reimagining humanity in the age of AI, and doing multiple things: reimagining the economy, reimagining what it means to lead in this moment, reimagining what human intelligence really looks like.

What is that space that's outside of artificial intelligence? And reimagining what planetary and human flourishing would look like in that future. So we're actively talking to a number of different organizations. We're in front of conferences all the time, really seeding the potential for this future. [00:06:00]
Jen Fisher: I think many of us either just don't know or we've forgotten what it means to be human, because we've lived and worked in a world that in many ways has kind of pitted us against machines, or told us that we need to be able to keep up with machines, and measures us through our productivity and things that are kind of machine-like measurements.

And so in this time, in this future, what does humanity look like?
Caroline Chubb Calderon: I love this question, and I want to answer it by giving a little bit of context. Humans, over the last 200 years, through the industrial revolution, we've really been trained to operate more like machines.
It has been a necessity. [00:07:00] It is a hundred percent driven by our economic model. Capitalism is a fantastic economic model, but it has led to diminishing returns. It has led us to be highly, highly extractive with the principle of growth. It's frail. It's a very frail system, because we can't have, as the saying goes, infinite growth on a finite planet.

So already there, we can't have that. Same goes with human beings: we can't have infinite productivity from human beings that are fundamentally cyclical and need the building of resilience in order to sustain themselves. So the economic model is the reason why we are where we are, why there are so many mental health issues, why we are separated, why we live in echo chambers.
You know, just tie it all back to how capitalism is really driving the [00:08:00] attention economy. All that's happening right now, if you really look at it from a sober point of view, is, Jen, we want your eyeballs on the screens as long as possible. We want to scrub all your data to know as much as we can about you so that we can commercialize you.
We want to expose you to as many imprints of potential sales, points of sale. And we want to extract from you as much as we can for our own benefit. That's not sustainable, especially in an environment where we have exponentially smarter artificial intelligence that can do that so perfectly that we end up losing our own agency.
We end up only being manipulated all the time. And I think we're feeling that already. So there's a rupture. It's happening. We're experiencing it, we're seeing it, and we need to fundamentally realign our ideas, our [00:09:00] economy, our values of what it means to be human toward something that's not a little, but much more nourishing.
And so where that takes me is: if this economic model of capitalism is toward the end of its life, then what is the new economic model that we can give birth to in an environment of artificial intelligence? And how do we modulate it in a way that leads to true human and planetary flourishing?
And so that leads me to lots of research. There's a number of different economic models out there. There's, you know, post growth, green growth, you know, dah, dah, dah, dah. The one that really stands out to me as a potential, a real potential realignment and flourishing for the future is around regenerative economics.
And so what regenerative economics asks is for an economy to be in right relationship with living systems. And [00:10:00] I think that's a really important piece that's going to now lead me to: what's the worth of a human being in the future? Part of what I see is that a lot of our mental health issues stem from a true rupture from life.

From the idea that we're actually living organisms. We've really moved away from what it means to be alive, what it means to have a life force, what it means to be part of the natural world, what it means to be pulsing, energetic human beings. So if we were able to birth, or welcome in, a regenerative economics model that puts us back in right relationship with life and living systems, and I'll say more about that if you're interested, then what we can do is imagine a world where artificial intelligence does the mechanical work of providing [00:11:00] goods and services, but human beings do the life-giving work and the life-affirming work of what it means to live in right relationship with life itself. And so the purpose of human beings becomes not to be producers, but stewards. It becomes human beings being the ethical stewards of our collective humanity and of more-than-human beings.
How do we ethically steward that future for everyone's flourishing, not just our own? It becomes human beings as stewards of a future civilization that is promising for an infinite number of future civilizations, not just itself, right? So really thinking about entire systems, not just their own benefit, and really [00:12:00] thinking about long-term flourishing.
And then on a personal level, it becomes human beings recommitted to our community and our relational bonds. Here's how I think about it, and tell me if it resonates with you, Jen. I know I've been talking for a long time, but tell me if this resonates for you. We've been creating a world of ease, where we live in an economy where everything is driven toward improving the ease of life. Well, that's led to an efficiency focus, where we're looking for mechanical solutions to physical problems. It's led to aesthetic enhancements, right? We want to design for pleasure, individual pleasure, and for our own comfort. It's also led to an emphasis on entertainment, right?
Where we become passive consumers and we look for digital diversions and we want instant [00:13:00] gratification. And that's led us to where we are now. But what would it look like to move from an ease economy to a regenerative and meaning ecology, where everything we design is to enhance our collective meaning?
So then that opens a possibility for what I call relational ecologies. The space between people is what heals. The space between people is what makes for human wellbeing. That human-to-human connection, the more-than-human relationships, the natural world integration, all of those really vital pulses.

It leads to community integration, right? It leads to a re-attention to the relational field of a community and the place that we live in. It leads to contemplative practices where we source meaning from [00:14:00] seeking wisdom, from developing our own ethics, from spending our life learning and exposing ourselves to different ways of being and knowing, including ways of being and knowing that we have completely marginalized in our current environment.

Cultural systems, Eastern practices, indigenous wisdom, even just the knowledge of animal intelligence. If we just opened our eyes, we would recognize that we're really not at the top of the pyramid like we like to believe. So that's the future: a future of humans re-humaning.

And it's a necessity, because our current model is simply not sustainable.
Jen Fisher: Yeah, gosh, it resonates in so many ways. And I think in particular, you talk about the way that life is set up, or has been set up, for ease. And intellectually, I get that. [00:15:00] In terms of the way I live my life and how easy certain things are, I get that too. But it's also created so much loneliness, so much anxiety, so much depression.

And so there's a part of me that's like, okay, yeah, maybe it's a life of ease, but is it really, when we're all feeling these things? It's not just certain populations of people. I think it's all of us. Certainly it's me. And then you've mentioned mental health, mental illness, and I've been reading a lot, and have some excitement and some angst, around this notion of [00:16:00] using AI for mental health support. So I guess, in this world that you're describing, what opportunities are there?
What are the risks in this space? Because I feel like we are marching so quickly toward this idea that AI is going to solve so many things for us. But are we seeing this in the right way? Because I hear you saying that, yes, there is this role that is going to allow these super machines to do all of these things that have led us down the path of being not human.

I don't know that I feel like we're moving in the right direction. Does that make sense?
Caroline Chubb Calderon: I think what I'd love to provoke your listeners with is, when we do a lot of disruptive innovation thinking, we use this technique called revolution, which is we take the rules of the current system and then we challenge them; we intentionally [00:17:00] turn them upside down.

So what possibilities emerge if these weren't the rules? I'd love to ask your listeners to really think about how we could see the world around us with new eyes.
Jen Fisher: That's what I'm struggling with, I guess.
Caroline Chubb Calderon: So, to your question. First of all, I just want to think back to: we've hyper-optimized.

We have gotten the gold star on the economy of ease. We've done really well at that. And as you pointed out very appropriately, we are very sick. We need deep healing, and that healing, what I'm trying to say, only comes from a depth of meaning and a depth of relational ecologies. We need each other, and we need to be more attuned to the fact that we are living, breathing beings.

We're not mechanical. So we're going to have to make a pretty important [00:18:00] boundary between synthetic relationships and human relationships. So let me answer your question about how AI is being used for mental health. And I'm going to say this provocatively: we are fundamentally off base if we think that we can deploy artificial intelligence to heal human beings' mental wellness.

I think artificial intelligence can be deployed for clinical analysis and clinical assessments. We've already seen a number of different areas in which that is happening, and that's exceptionally exciting. I think that's very healthy and very worthwhile. But when we start deploying, for example, digital twins or generative agents and so on, that becomes really worrisome for me.
And I'm a big fan of Sherry Turkle. I don't know if you know her work. [00:19:00] And I think she has it spot on. Her question is not, how can we use AI to develop mental health or to improve our empathy and so forth. Her question is the right one, which is: what happens to us when we start using AI for our own mental health and our own development of empathy?
And that is really the area that we need to start thinking more about. Because we are addicted to this idea of accelerating improvements through technology, and we have completely lost focus on what is true healing. Oftentimes when I'm in front of audiences, the people in the room are usually Gen Xers.
And I end with saying something to the effect of, you know, we are the last generation that knows what a regulated nervous system feels [00:20:00] like in our bodies. And we can't demand a new generation to figure this out. We can't get tired. We can't, um, assume someone else is doing it. We have to be the ones that really stand up and take charge right now and help usher in this new world.
And I really mean it, because you can't invent something if you haven't felt it, if you haven't imagined it. And we're looking at a whole generation of people who have been addicted to technology, who believe in synthetic relationships that exist through social media. That was the portal, by the way; social media was the slippery slope that led us to believe that online interactions were sufficient. They're not.

And now we're looking at chatbots and digital twins and even [00:21:00] agents, with Replika, et cetera, that are replacing that and making it worse, because we're not even talking to human beings anymore. We believe that these bots are showing care for us, are showing love for us, are showing empathy for us, but synthetic care, synthetic love, synthetic empathy will never be the same as real care, love, and empathy.
And we need it. These are not soft, idealistic ideas of, oh, wouldn't it be great if we had more love, care, and empathy in our lives? No, we actually need these things in order to be regulated, in order to be balanced, healthy, flourishing human beings. And the scales have tipped way in the opposite direction.
And you've seen it, like, if you look at what's been happening in the news, I know you've seen these, right? A 14 year old kid took his life in order to be with his AI girlfriend. He truly believed that if he took his life, he would be with her.
Jen Fisher: Heartbreaking [00:22:00] stories like that. And the fact that people are seeking these types of relationships is perfectly human. But shouldn't the signal to us be that people are seeking these types of relationships on an enormous scale through digital technologies? To me, that's a signal that there's something really wrong with the way we're humaning, because we are looking for what is very human in something that's not human at all.
Caroline Chubb Calderon: I'm going to go back to the framing of: we have been conditioned and trained and living in an economy of ease, and [00:23:00] now we have to make the dramatic shift to an ecology of meaning. I have no data to prove this, but I believe, and I think it's logical, that people are shifting toward these bots because the relationship is frictionless.

Because they are getting some type of validation, because it doesn't require the muck and the vulnerability that is needed in a human relationship, the unpredictability, the vulnerability. And you know what that all speaks to me of? It's a lack of courage.
Jen Fisher: Say more about that.
Caroline Chubb Calderon: So it's a conditioning of ease, and it's also a lack of courage.

And it's a belief that synthetic emotions are a sufficient proxy for real ones. And I'm afraid, if we really do allow ourselves to go down this path, we're going to live a shallower and shallower life. So, lack of [00:24:00] courage. I've done a lot of work with Brené Brown.

I'm one of the people that she selected to help lead her Dare to Lead program. And the reason why I did that work is because I saw, and I still see, that a lack of courage is one of the reasons why we are never going to be able to challenge and shift the system. Because what I'm talking about with you, what people who are listening to this podcast must recognize, is that if you're at all interested in mental health, in the future of work, in human flourishing, in planetary flourishing, then right now you must rise to the platform of system change leaders.
Because our current system is not sustainable, and it's unhealthy, and it's a dead end. It's going to lead to a finite outcome. So we must all level up and become system change leaders. [00:25:00] And in order to do that, we need an enormous amount of courage: to stand up in those meetings and say, the way we've done it is no longer working, and we need to break the system and do something new.

We need to have the courage to challenge conventions, to risk, if that's what it's going to take, some of our profit margins. We've got to be the ones that are going to stand outside the system and be daring in creating a different future. Because as long as we operate within the system, we're never going to change it.

So, courage. Brené will be the first one to tell you: the only way to courage is through vulnerability. And that means the ability to stay in the unease. And back to our relationships, right? The depth of my relationship with you, Jen, is only as [00:26:00] good as my desire to be connected with you, but also my ability to hold whatever will be difficult between us.

But you know what? Once we go through that unease together, the bond between us will be that much stronger and that much more meaningful. So we need more courage. We need more meaning.
Jen Fisher: Yes, to all of that. And one of the other things that I know you've talked about is that we're experiencing a global creativity crisis.

And so I'm just thinking, yes, we need the courage to stand up and say we're not going to do things the way that we've always done them. But if we're in the midst of a creativity crisis, I feel like that's part of the problem too, that people have, I [00:27:00] don't know if given up is the right word, and this is not a negative comment. But how do we get that creativity back, so that when we do have the courage to stand up and say we're not going to do this the way that we've always done it, we have creativity flowing and can come up with new ideas and solutions, recognizing that they may not always work, but at least they're not the way they've always been?
Caroline Chubb Calderon: I have so many answers to give you, but I think I'm going to start with this one; it just feels like that's what my heart's calling to speak about. I think part of the problem is just, we're all so tired. We just are overstretched. There's just no bandwidth for anything else. So the instinct is: just let me get my work done.

Just let me move on. [00:28:00] I'm just going to check the boxes. I'm just going to do what I'm asked to do. And then I'm going to go home and just collapse, and do what I need to do to take care of my family and my health, and just keep putting one foot in front of the other.

That's how it feels to me. I don't know, how does it feel to you?
Jen Fisher: That's absolutely how it feels to me. Yeah. Let me just keep my head down, get my work done so I can get through the day. And perhaps for many people also, even if they feel differently or believe something different, I think in many cases people are scared to say it, because there's a lack of trust or a fear of getting laid off. Let's just put it out there, you know? So people are like, let me just do what I'm asked and not create any waves, and collect my [00:29:00] paycheck, and do what I need to do.
Caroline Chubb Calderon: That's so real. I mean, not to be too provocative, but let's really call it what it is, and start asking ourselves if this is really what we want to participate in. It's survival. It's: you do everything that I ask you to do for money that's going to give you shelter and food.

And the bandwidth for your freedom, for your ability to create a free life, is locked, because I've got you under my thumb. And what I actually hope this does is offer some clarity, and ask people to ask themselves: who am I? Whose future am I advancing? And what quality of future [00:30:00] am I advancing?

And am I proud of that future? And I think part of it will be a need for us to reclaim our rights and reclaim our desires. Really do an honest assessment: are you living the life that you want to lead, or are you living the life that you've been told to lead? There's a difference.
I mean, if you start looking with sober eyes across the communities that you might interact with, you're going to start seeing people that are shaping pockets of that potential future. I'm very optimistic, because I look at Millennials and Gen Zs, and they're starting to [00:31:00] stand against the system, and they want to do something completely different.

I think it comes from an existential angst and this feeling of just not-enoughness: we have a lot of abundance, but actually not much value, not much worth. So we're starting to experiment, and that gives me a lot of hope and possibility. And I hope that even just naming it, even though it's awful and hard, helps; as in a meditation, when you name it, it loses its grasp on you.
I hope that if we can name it and call it out, that we can actually be more courageous with it. Um, and let ourselves be the designers of our future.
Jen Fisher: Yeah. I hadn't actually planned on bringing this up during this podcast, but it's the topic of one of my [00:32:00] newsletters, because there's something that I've been sitting with for several weeks now that is really bothering me. And I know we've talked about social media. There is this increase that I'm seeing in posts on social media of people thanking their employer for allowing them to disconnect and be present during, you name it, their wedding, the birth and care of their child in the first several weeks of the child's life, the last breath of their dying parent or spouse. And it is incredibly and increasingly troubling to me that we have gotten to a point where we now [00:33:00] believe that being able to disconnect from work and be present for these experiences that are so uniquely and importantly human is a corporate benefit that we need to be grateful for.

So I'd love to hear your thoughts and reactions to that.
Caroline Chubb Calderon: While I absolutely bristle at the idea that we have to be grateful to companies for allowing us to be in these monumental moments in our lives, if I look at it from a little bit of a distance, I actually think that's healthy. Let me say more: companies aren't going to shift or change unless they believe that there's going to be value coming from that. [00:34:00]

And when we are grateful for that higher-order behavior, that behavior that's more aligned with higher ethics and morality, we are reinforcing a system of higher order and morality. If we say nothing, well, of course it should be so, but it's not so, right, Jen? Businesses don't operate like that.

So we can't expect them to be different without some transition. The only way to help the transition along is to give validity to the small behaviors that lead us to that higher-order transition. Do you see what I'm saying?
Jen Fisher: I do. I mean, I guess where my head is going is: imagine if the opposite were true, right?

Imagine if we weren't, quote unquote, allowed. [00:35:00] Do you know what I mean? I 100 percent know where you're going, and I don't disagree, but it still eats at my core that we even think that we should thank organizations for allowing us to be present in these very human experiences that we all need in our lives.
Caroline Chubb Calderon: 100%.
Jen Fisher: Yeah.
Caroline Chubb Calderon: And how do you make change happen? Right. So I'm just giving you a small perspective so that perhaps...

Jen Fisher: So in other words, we should all take to social media and start thanking our organizations for all the things that they're allowing us to do to allow us to be human.
Caroline Chubb Calderon: I think we should, uh, celebrate, uh, an emergence of a new future.
Yeah, yeah.
Jen Fisher: Authentically, very authentically.

Caroline Chubb Calderon: Very authentically. Yeah. Yes. And, you know, but [00:36:00] also...
I often talk about how we think in horizons. This is really about change theory, but a lot of our thinking is locked in horizon one thinking, which is basically an evolution of our current system. It's: how do we improve our profits, our growth, our technology? How could we use AI? Paper AI over everything that we do: AI-informed lighting, AI-driven cars, AI whatever; take whatever field and just put it on top of it. That's horizon one thinking. Horizon two thinking is that emerging thinking around: okay, what's the potential future? Those are kind of bridges.

But what we have a dire lack of, which I think you're pointing to, Jen, is horizon three thinking, which is: what is that [00:37:00] emergent, completely different societal design that puts work and companies in right relationship with all of the rest of life? And we do not spend enough time thinking about it. We do not have any clear visions.
In fact, if I ask you to think about visions of the future with AI, I'm 100 percent going to bet that you probably have visions of dystopianism and human manipulation and all the stuff that we've seen on screens and in books and so on. We need more beautiful visions of what the future looks like in order to even birth it.
We can't, if we can't even imagine it, we can't even birth it. So what you're seeding and what this, hopefully this entire conversation is seeding is how do we birth a future where human beings. Get to have [00:38:00] meaning, get to reinvest and revalorize human relationships and relationships with nature and relationships with more than human beings as co stewards of this, right, as co habitants of this planet.
And how do we put work in its rightful place? And what does work even look like when machines can do most of the stuff that we've named work? What I'm trying to say to you is: what we have named work, let that be led by machines. Let our work be in that relational field: in the community building, in the caretaking, in the relational ecologies.
Jen Fisher: This is why I am such a huge fan of you and your work, because I do believe that the majority of the narratives that we hold [00:39:00] around this future with AI are ones based in fear, and not ones based in, you know, beauty or possibility, or ones that are actually human-centered.
We use the words human-centered, but what we're seeing and feeling and experiencing is not human-centered at all.
Caroline Chubb Calderon: Can I tell you, I'm actually getting really allergic to the word human-centered.
Jen Fisher: I am too. There's a lot of words I'm allergic to these days, and that's one of them.
Caroline Chubb Calderon: I don't mean you, I don't mean you saying it. I think you pointed to the exact right point, but I'm saying, why? Why do we keep saying that everything has to be for the benefit of the human? Why are we not recognizing that we are just part of the web of life? And really, a higher-order design for a future civilization is one that is supporting life.
In fact, uh, not to plug my work, but I will just say it.
Jen Fisher: Please [00:40:00] do. That's why we're having this conversation.
Caroline Chubb Calderon: So a lot of the work that I do is in AI ethics. And I was, you know, one of the first that wrote the IEEE Ethically Aligned Design work, which became the basis and foundation for a lot of the ethical standards that you see around the world.
It was the first of its kind in the world. Um, and I'm very proud of that work, but where my thinking has led me now is that that's not enough, because what that work predominantly does, and a lot of that thinking around AI, is around: how do we minimize harm? How do we ensure privacy? How do we ensure that there's, you know, a lack of bias in the system, et cetera, et cetera. All extraordinarily valiant efforts.
And so the more I've thought about it, what I realized is our fundamental standards for AI must be aligned with what allows life to survive. So I'm moving from human-centric AI [00:41:00] ethics to life-centered AI ethics. What does it mean to create systems that are coherent with living-system principles?
Because if we're not intentionally designing AI for life and with life, we are unintentionally designing AI against life, where we perpetuate on an exponential scale these extractive paradigms, these mechanistic paradigms, these separating paradigms. Instead: how do we regenerate, how do we create healing? Regenerative is basically saying, we have put the world into a crisis where our planet can no longer sustain life.
How do we not only stop the crisis and the damage, but heal it? That's why I use healing all the time. So how do we heal the world in crisis? How do we heal the human in crisis? That's what I'm trying to [00:42:00] communicate. And so how do we do that? How do we get AIs, in everything that they do, to be in support of moving from this idea of we must improve the individual
to we must improve the entire system? And that's actually a huge possibility that AI opens up, because it is capable of thinking through multi-order effects in systems thinking, whereas human beings can't do that that well. And if we were able to design AI that way, and design AI's recommendations to us to think about multi-order impact, then I think it also does an important shift for our own worldview. Part of the reason why we're so sick is because we live under the illusion of separation: we believe we're separate from nature, we're separate from each other, and we've become extraordinarily [00:43:00] narcissistic.
It's all about my benefit, my value, me, you know, my survival, my success, my profits, et cetera. We are not a single-cell organism. We live as a multi-cell organism, and we need to see that all the work that we do has to actually be in service of the entire system elevating to a higher order. And I'm hoping that if we develop AI with the same principles of life (which, by the way, why aren't we?), it actually also will drive a shift in how human beings valorize themselves, and the worldview shifts from individualism to a more relational web.
Jen Fisher: There's so much here. And you and I both know we could keep talking on this forever. We may need a part two, but I want to take some [00:44:00] of these really big concepts and bring them down to the day-to-day, week-to-week, month-to-month lives of people in the workplace, whether they are our colleagues, managers, or leaders. Give me some practical, pragmatic steps that we can all start taking tomorrow to support a better world of work, but also just a better world.
Caroline Chubb Calderon: Um, I want to give you two ideas.
The first is: if you have an AI agenda, you have to have a human intelligence agenda. It is a hundred percent in the realm of an organization, and in its responsibility, to ensure that we're leading to human and planetary flourishing. It would be [00:45:00] irresponsible, unethical, immoral, and unjust not to. So everything that you do needs to be reprioritized, because of the exponential curves that we're under, around human and planetary flourishing.
And so what are the ways that we can reinvest in human worth and human value at work? I have a framework, um, I call it the four I's. Human value at work really hinges on these four areas, and you can challenge me on it, but I think it's pretty solid. The first is around our capacity to imagine, and these are the human-centric innovation skills that can help us reimagine a radically different society. And we need active contemplation around that horizon three that I was talking about.
So all the skills around curiosity, vulnerability, courage, resilience, innovation, those we need to teach at scale. [00:46:00] The other area is around insight. That's the second I. And for me, this is really around the ethical skills that are required to ensure that our actions and all of our solutions are for the ultimate benefit of human and planetary flourishing.
So here, we really need to focus on moral and ethical reasoning. It's everybody's job to be an ethical steward of the future, not just the AI ethics teams or the future work teams. It's everybody's job. It's recommitting to teaching empathy and compassion because we know it's at the lowest level ever recorded in the last 40 years.
And without empathy and compassion, we don't have a chance for a better future. A, um, a caring future. It's committing to teaching mindfulness, for example, as metacognitive skills that allow us to see when we are leading the life that we intend to lead versus the life that we're being manipulated to lead.
Uh, we really have to get good at that. And then the next I is around [00:47:00] inspiration. And here it's about the leadership skills that will help us mobilize, align, and inspire all of us to become the highest versions of ourselves and to shift that system. So it's about systems thinking. It's even about peacemaking and community building.
It's these skills that we really are more and more fragile around, because we are so disconnected from each other. And the last I is really the center of integration, which is: what are the skills that we need to enhance human thriving in the context of autonomous and intelligent machines? So how do we find agency?
How do we create digital wellness? How do we rediscover joy and play? And how do we revalorize wonder and awe, not just because they're, you know, delightful experiences, but because through the portal of wonder and awe, we rediscover health. We reconnect with nature. We [00:48:00] rediscover those elements of humanity that give us purpose and meaning.
So those are the four, um, areas that I would say every organization needs to invest in. And then the second thing is for the individual. I really love this provocation. I've been doing it for myself, and it's been so eye-opening, Jen, I can't tell you. What I'm calling it is a life assessment. And what I mean by that is: if we recognize that we are fundamentally a living system, part of living systems, and that currently part of our illness and, uh, hurt is because we are so disconnected from life itself,
what can we learn from the way life works to realign our work and our family and our individual lives with those principles of life? And here's where it gets interesting. There are eight principles of life, Jen, [00:49:00] and once you start looking at the world through them, things start to shift. It's really interesting.
So, for example, one of the very first and most important principles of life is that life is fundamentally life-affirming. That means that life always creates conditions that support and enhance life's continuation. So if you just look at that one, and you say: what am I doing at work that is life-affirming, that enhances life's continuation?
Or how am I creating a life myself that is life-affirming? In what ways are my daily actions creating conditions that create systemic enhancement or ensure long-term vitality, not just for me, but for all living beings? That becomes a really interesting lens. So that's the first one: just, how is my life [00:50:00] life-affirming?
How is my work life-affirming? How's my technology life-affirming? Just fill in the blank before the words life-affirming, and it starts opening up a whole new field. The second principle of life is that life is interdependent. It means that life is always in nested relationships and cross-scale interactions.
It's that systems-view perspective that I was talking about. So, ask yourself: how is the way that I'm living or working impacting and influencing the web of life with which I am also interdependent? And that opens a whole different way of being. It opens a level of ethics that, um, we are not even thinking about.
It's fundamentally Ubuntuism, if we were to, you know, really peel the layers of that. But if I recognize that I am not an I, but I'm an interbeing, as my Zen teacher Thich Nhat Hanh used to [00:51:00] say, um, it would shift the way we move in the world. It moves us from that individualism to, you know, interdependentism.
So again, I think that's a beautiful question, and it's a beautiful exploration. And I think it brings us into a future that's more aligned with the playbook of life, right? Life has lived 3.8 billion years on this planet. It knows what it's doing. We just haven't learned the lessons from her yet. The next one, just to finish, Jen, because I want to make sure your listeners can work with these: um, the next principle is that life is always in right relationship.
So this means that it's always in right relationship with everything else. There's a reciprocal exchange. There's, um, an equality and an ethics to our interactions. It's always seeking to cause no harm and to be of benefit to the people and the systems that it's interacting [00:52:00] with. So again, just think about how you yourself are moving, how you're interacting with the technology.
Is your relationship with technology in right relationship? Or is technology overwhelming your lifeness, let's call it that? Um, does that make sense, Jen?
Jen Fisher: It does. Yeah.
Caroline Chubb Calderon: So then, just a couple more, there's only eight of them. The next one is that life has robust feedback loops, and it's really important, because that's the only way that we can adapt, right?
That's the way life adapts in a system. So how are we sensing whether or not we're adapting well or poorly in this current system? What is giving us the robustness of that information? So again, that might invite us to think more broadly about who we are sensing from, what information we're getting,
what channels we need to attune to. Um, and, you know, again, you can see it either from a work [00:53:00] perspective or an individual life perspective. I think that's a really interesting question to ponder. And the last few: life is emergent. Um, so, and this is so beautiful, life is always in service of your highest potential.
How do you move as a leader? Do you move as being in service of unlocking the highest potential of the people that you're with? How do you interact with technology? Is the technology in service of unlocking your highest potential as an individual? How are you creating your community? Is your community in service of unlocking
and manifesting its highest potential? Um, again, why don't we think about the world through the lens of life? If we start thinking about it this way, we could manifest so many more creative [00:54:00] possibilities. Um, next: life's resilience is diversity, and not to say too much about it, but variety is important for stability.
We need to expose ourselves to different approaches, different types of relationship, different ways of thinking, different ways of being. That's what leads to resilience. And for some reason, even though there's a lot of work around DEI, for example, it all feels so functional. We've lost the plot. It's not just a functional thing.
It is fundamentally needed in order for us to be resilient. Um, so I love that even just life teaches us that lesson, right?
Jen Fisher: Right.
Caroline Chubb Calderon: Um, there's a principle of ecotones in living-system science, where two biomes meet. And that meeting place is called an ecotone. And [00:55:00] in that ecotone is where the most lushness of life happens.
There's all this cross-pollination happening, all these new, different life forms emerging. And that is really the essence of it: life's success comes only when it is as diverse as it can be, because if one system dies, it still has something else that it can lean on. Um, so again, how could we create technologies that really align with this principle?
How could we create AIs that are diverse and not just so centralized in a couple of, you know, key players? How do we ensure that we're creating resilience and not just a singular system, for example? Um, and the last two. I will start with this one: life is cyclic and seasonal. And I think you probably sense this so easily because of the work that you do.
We are living beings that are cyclic, and we operate through seasons. But for some reason we [00:56:00] have completely abandoned that idea, and we think we need to be high-performing all the time. I think at least there's a door opening now in conversations, right? People are recognizing the need for rest. And there's a little bit more, especially after COVID, there's a little bit more permission around taking important, you know, recovery time.
But if we really understood that that's how life works, our entire civilizational design would shift, right? Our interaction with technology would shift. Our economy would shift. We would recognize that we can't keep growing ad infinitum; we'd have to find ways to be cyclical and seasonal with it.
Um, the cyclical part is important because it points to, um, you know, regeneration in terms of using materials in material cycles, making sure that we make new from old, that we keep materials flowing and useful, and not just the [00:57:00] use-and-throw-out mentality, which is what we've had.
That's required in order for us to sustain. And the last one, I love finishing with this one. The last one is that life honors community and place. Life always adapts locally and discovers collective wisdom. So it's about the wisdom of the community. It's about attuning to your bioregion. It's about stewarding your local resources.
So it's a coming back into place. And I'll tell you why this is important for me. Part of our problem is a really dire crisis of truth. And if we don't have an understanding of what information we can actually trust, because we are just infected with online information all day long, which more and more is getting created by AI, by the way,
we are going to end up with a tremendously schizophrenic society that's living in [00:58:00] delusional spaces, right? We're already seeing that. What we need to do is a radical reorientation to community and place, where trust and truth are interrelational. It's palpable. It's between you and me. So we need to spend more time attending to that, because we're capitulating. We're falling prey to an information system that's, you know, really dangerous.
Um, so I love that that life principle can even help guide us in that direction.
Jen Fisher: Yeah, I do too.
Caroline Chubb Calderon: That's the way life works. If we don't align with it, then we're causing our own destruction. So thank you for listening. I wanted to make sure I got that in there, because it's eye-opening.
Jen Fisher: It is very eye-opening. And I'm glad that you got that in. I feel like we could have a whole other episode going deep on all eight of those principles. So, [00:59:00] Caroline, thank you for this conversation. Thank you for the work you do.
Thank you for all of your wisdom in trying to help make us all better in this world that we're living in. I know I'm deeply grateful.
Caroline Chubb Calderon: I am too. Thanks, Jen. And, um, you know, again, I'm so excited about what the future holds. Can you imagine a more life-centric, flourishing future for people and planet? That gets me really excited.
Jen Fisher: The WorkWell podcast is excited to bring you another Lyra Lens, where I chat with a Lyra expert about today's episode. This conversation with Caroline really got me thinking about how AI is going to change, improve, and potentially increase access to mental health care. I see so much opportunity here, but I also see some real risks if we don't approach this thoughtfully.
So I phoned a friend at Lyra Health to get her perspective. Jenny Gonzalves is the Chief Technology Officer at Lyra Health. Jenny, [01:00:00] thanks for being open to answering my questions. But first, tell us a little bit about yourself and your role at Lyra.
Jenny Gonzalves: Absolutely. Hello, Jen. It's so great to be here today.
Thank you so much for including me. Um, so as the Chief Technology Officer at Lyra Health, I oversee all our technology efforts, including engineering, data, AI, security, and IT. The whole gamut, if you will.
Jen Fisher: In my conversation with Caroline, she talked about the risks of deploying AI for mental health support, and she made this distinction between using AI for clinical analysis versus creating synthetic relationships.
So what are some of the ways in which AI is actually being integrated into mental health care right now?
Jenny Gonzalves: Excellent question. So I think I'd first start off by saying, you know, here at Lyra, we embarked on integrating AI into mental health care well before it became the latest trend, right? It's all done very thoughtfully.
So first and foremost, I think what AI enables, in a way that technologies prior to it couldn't, [01:01:00] is really personalizing care for the patient. Because with AI in place, what you can do is use the algorithms on the other end to come up with the best treatment course, right? Prior to AI being in the picture, there were more manual ways of doing this, and they were just not optimized enough.
But not only that: once you know what treatment recommendation works best for the patient, where you can further use AI is matching them with the best provider, and all of this can be optimized with the best patient outcome in mind, right? So that is one way in which AI has been really, really impactful in the mental health care space.
The other thing that we're seeing, and these are more recent trends in terms of AI in the mental health care space, is augmenting and assisting providers. This specifically took off in the last couple of years with the emergence of Gen AI technology. There you take away, um, the mundane tasks, right? Tasks which just take up their time and aren't really [01:02:00] adding a lot of value in the care delivery or the care journey for the patient.
So things like, um, note summarization, if you will. At the end of every session, providers spend anywhere from 25 to 30 minutes just sort of, like, making notes on that session. We've got really unique ways of, you know, taking that and auto-summarizing it. But of course, it's not entirely on autopilot.
You're still going to let the providers, like, look at it. And our providers absolutely love it. Like, this is one of those features where they're like, oh my God.
Jen Fisher: I can understand why.
Jenny Gonzalves: Exactly. And, you know, it saves them anywhere from 5 to 10 minutes per patient per session, which is incredible. So really, assistance to providers, and there's just a lot more we can do there, too.
You can also then use AI to proactively flag clients to providers, right? So, historically, what providers would have to do is take a bunch of content and kind of look through the insights themselves. But you can use AI to curate all of that for them and push it to the provider. So again, they're [01:03:00] focusing really on the patient and treating the patient in the best way possible.
And then you're also seeing AI play a part in just overall efficiencies around the care delivery aspect of mental health care itself. You know, there's some, um, there's some, like, innovation and exploration happening on that front as well.
Jen Fisher: You've already answered this in some ways, but again, Caroline kept mentioning this idea of synthetic care, synthetic love, synthetic empathy, and how this will never be the same as real care, real love, and real empathy.
And so you've already been talking about this a little bit, but I think that the fear, or the question, is really around: how do we not lose that human connection in patient care, between the provider and the patient?
Jenny Gonzalves: You know, when you're incorporating technology, whether AI or not, right, but specifically with AI,
like, I think it's best to start with the exact problem statement and some goals in mind. What is the exact problem [01:04:00] and what do we think this is going to help with? So I think that's, like, first and foremost. Now, you know, having said that, it sounds obvious, but you would be shocked at how that isn't necessarily broadly applied or even followed.
And then, once you've identified that exact problem statement that you're hoping to solve, it's ensuring that you don't create that technology in a silo, where the technical team is just kind of, okay, we know the problem, but now we're just going to solve it and we're not talking to anybody else.
That doesn't make anything better, right? You know, like at Lyra, for example, all of our AI developments are created hand in hand with our clinical team. And then lastly, you want to make sure you build it iteratively. So I'll give you an example with the provider summarization that I talked about.
There, for example, we partnered very closely with our clinical teams, and we consistently got the providers' input as we were building and working on that technology. And when we rolled it out, we rolled it out iteratively, so you had a chance to, you know, incorporate feedback from the cohort [01:05:00] that had the feature live, and so on, before you rolled it out fully.
Uh, and then we're monitoring it closely, right? It's not like one-and-done and you forget about it. So really, there are humans always involved in this process, right? It isn't just a technical silo where you're doing something in a vacuum.
Jen Fisher: Yeah, I think that's really helpful.
But I guess I keep going back to this really necessary human element in mental health care. Caroline, I think, talked about how technology makes things very frictionless but ultimately not fulfilling. And we do see news stories and things like that that show the frictionless nature of these relationships, or perceived relationships, that can be built with AI.
And so how [01:06:00] do we make sure, how do we convince people that the real relationship is with other humans, that there's this necessary human element in mental health care, and that the technology, like you said, plays a role to help augment the way that humans get the work done,
but it's not a replacement for having a human care provider?
Jenny Gonzalves: Yeah, absolutely. You know, at Lyra, our motto has always been technology with the human touch, precisely to ensure that human connection is at the core of everything we do. So, uh, I don't think you'd ever take that away, right?
Jen Fisher: I love that. Thank you for joining us for Lyra Lens.
I'd like to thank Jenny for sharing her expertise with us today. For more information about Lyra Health, visit lyrahealth.com/workwell. Until next time, this is the WorkWell Podcast. [01:07:00]
I'm so grateful Caroline could be with us today to help us understand how we can preserve and enhance our humanity in an increasingly AI-driven workplace. Thank you to our producer and our listeners. You can find the WorkWell podcast by visiting various podcatchers using the keyword WorkWell, all one word, to hear more.
And if you like the show, don't forget to subscribe so you get all of our future episodes. If you have a topic you'd like to hear on the WorkWell podcast series, or maybe a story you would like to share, reach out to me on LinkedIn. My profile is under the name Jen Fisher. We're always open to recommendations and feedback.
And of course, if you like what you hear, please share, post, and like this podcast. The information, opinions, and recommendations expressed by guests on this podcast series are for general information and should not be considered professional advice, diagnosis, or treatment. Always seek the advice of your physician with any [01:08:00] questions you may have regarding a medical condition.
The podcast owner, producer, and any sponsors are not liable for any health related decisions made based on the information discussed. Thank you and be well.