The WorkWell Podcast™

Mind the (Future) Gap: Preparing for What's Next in Mental Health 
Special Live Episode from Lyra Breakthrough 2025

In this special live episode of The WorkWell Podcast™, recorded at the Lyra Breakthrough Conference, Jen Fisher hosts a dynamic panel discussion exploring how AI, shifting demographics, and evolving expectations are reshaping mental health support in the workplace.

Panel Experts:
  • Dr. Tom Insel - Former Director of the National Institute of Mental Health and visionary behind the bold statement that "AI is to mental health what DNA was to cancer"
  • Briana Duffy - Market President at Carelon Behavioral Health, witnessing mental health become a mainstream conversation across generations
  • Dr. Alethea Varra - Senior Vice President of Clinical Care at Lyra Health, pioneering the integration of technology and clinical excellence in modern mental healthcare delivery
Episode Highlights:
  • Why AI represents a transformational force in mental healthcare, offering precision in diagnosis and treatment like never before
  • The critical difference between AI as a "GPS system" and autonomous "Waymo" therapy - and why we're not ready for the latter
  • How predictive algorithms can identify individuals at risk for self-harm up to five months in advance
  • The challenge of responsible AI implementation: why human oversight is essential to prevent dangerous "drift" in AI responses
  • Young people now listing "been in therapy" as a requirement on dating profiles - and what this means for workplace expectations
  • Why 70% of students prefer community-based care over traditional one-on-one therapy
  • The generational divide: younger workers prioritizing mental health support versus older workers' "tough it out" mentality - and how to leverage both perspectives
  • The shift from "mental health" to "mental fitness" - expanding the conversation beyond crisis care to preventative wellness
  • Value-based care revolution: paying for outcomes and results rather than time spent
  • Real ROI data: 30% reduction in overall healthcare spend for engaged members in sophisticated care programs
Quotable Moments:
"AI is like the number one use of therapy. Is that a good thing or a bad thing? I put this into a timeline where I think about how we did navigation... we had these paper maps to go on a trip, and now we use GPS. The question is, are we ready for Waymo?" - Dr. Tom Insel

"My job as a therapist so very often is to sit down with a human in front of me and to tell them something that is actually not going to make them happy. Generative AI tends to drift, and we've seen examples of that." - Dr. Alethea Varra

"If this (therapy requirements in dating apps) is the new mainstream norm in the dating world... it's not going to look materially different in the workplace." - Briana Duffy

Resources:
This special live episode of The WorkWell Podcast™ is made possible by Lyra Health, a premier global workforce mental health solution trusted by leading companies like Starbucks, Morgan Stanley, Lululemon, and Zoom. Lyra provides personalized care to over 17 million people with fast access to evidence-based providers and tools that deliver proven results.
Learn more at Lyrahealth.com/workwell.
 

What is The WorkWell Podcast™?

The WorkWell Podcast™ is back and I am so excited about the inspiring guests we have lined up. Wellbeing at work is the issue of our time. This podcast is your lens into what the experts are seeing, thinking, and doing.

Hi, I am Jen Fisher, host, bestselling author and influential speaker in the corporate wellbeing movement and the first-ever Chief Wellbeing Officer in the professional services industry. On this show, I sit down with inspiring individuals for wide-ranging conversations on all things wellbeing at work. Wellbeing is the future of work. This podcast will help you as an individual, but also support you in being part of the movement for change in your own organizations and communities. Wellbeing can be the outcome of work well designed. And we all have a role to play in this critical transformation!

This podcast provides general information and discussions about health and wellness. The content is not intended to be a substitute for professional medical advice, diagnosis, or treatment. Always seek the advice of your physician or other qualified health provider with any questions you may have regarding a medical condition. Never disregard professional medical advice or delay in seeking it because of something you have heard on this podcast. The podcast owner, producer and any sponsors are not liable for any health-related claims or decisions made based on the information presented or discussed.

Jen Fisher: [00:00:00] Hey everyone. This is a special live episode of The WorkWell Podcast, recorded live at the Lyra Breakthrough Conference. Throughout this episode, you'll hear our guests reference "Jen's talk" or "Jen's keynote," which refers to the opening address delivered by Jennifer Schultz, Lyra Health's CEO, at the Lyra Breakthrough Conference.
ANNCR: We're about to start a live recording of The WorkWell Podcast, live here at Breakthrough 2025. Join us in welcoming your moderator and host of The WorkWell Podcast, Jen Fisher.
Jen Fisher: Well, hello everyone. Welcome to beautiful Lake Tahoe, says the Florida girl. How are you all doing? Louder, louder, louder. You [00:01:00] guys are awesome. Wellbeing at work is the issue of our time, and the future of workforce mental health is not some distant horizon. It's unfolding right now in our care systems, benefit structures, and comprehensive solutions.
And it's not one size fits all. This is The WorkWell Podcast series. Hi, I'm Jen Fisher, and today we're asking: what will mental healthcare look like in the future, and how can we seize its tremendous opportunities? Please join me in welcoming our panel of experts. Dr. Tom Insel is the former director of the National Institute of Mental Health, who has boldly envisioned that AI is to mental health what DNA was to cancer.
Briana Duffy is market president at Carelon Behavioral Health, who's witnessing mental health become a mainstream [00:02:00] conversation across generations. And last but certainly not least, Dr. Alethea Varra is Lyra's own senior vice president of clinical care, pioneering the integration of technology and clinical excellence to transform modern mental healthcare delivery.
Let's give them a huge round of applause.
Together we're going to explore how AI, shifting demographics, and evolving expectations are reshaping what people need and expect for mental health support, and what it means for all of you right now.
This episode of The WorkWell Podcast is made possible because of our friends at Lyra Health. Lyra Health is a premier global workforce mental health solution trusted by leading companies like Starbucks, Morgan Stanley, Lululemon, and Zoom. Lyra [00:03:00] provides personalized care to over 17 million people with fast access to evidence-based providers and tools that deliver proven results, including faster recovery and reduced healthcare costs.
This season, Lyra and The WorkWell Podcast are teaming up to bring you more insights on how to build a thriving work culture for today and the future. We'll be bringing you cutting-edge data and research on workplace mental health and wellbeing, and we'll have some Lyra experts occasionally join us to share their perspectives on workforce mental health and creating psychologically safe and effective work environments.
Find out more at lyrahealth.com/workwell. Thank you to Lyra for helping us elevate this season of The WorkWell Podcast.
So, Tom, I'm starting with you. You've made this bold, forward-looking statement that AI is to mental health care what [00:04:00] DNA was to cancer, and as a cancer survivor, that's a bold vision and a fundamental paradigm shift. Can you share why you believe AI is such a transformational force and what makes you so optimistic about its potential?
Dr. Tom Insel: Oh yeah, you bet. Big question there. So first of all, thanks for having us. I think this is a great moment and a great place to have this conversation. And Jen, it's terrific to do this with you especially. If you think about what genomics and DNA have done for other areas of medicine, it's pretty transformative.
Oncology is maybe the best example, where we've moved to precision diagnosis. So it's not just that you have breast cancer; you have breast cancer with a particular genomic type, and that matters because it determines what kind of treatment you'll get. So it's changed diagnosis, and it is increasingly changing the nature of the treatments [00:05:00] that are available. Mental health doesn't have that. It's really been, in so many ways, an imprecise science, very subjective, which is a good thing sometimes, but it lacks the kinds of biomarkers or tests that help you know what the 17 different kinds of depression are and how those relate to the treatments you'd need for each kind.
Now, the hope we had 20 years ago was that genomics or imaging would give us that precision. So far, it simply hasn't delivered. But there is this possibility that AI as we know it already, and especially natural language processing, which was kind of the earlier version of what we now call AI, can really help us by giving us a kind of precision about language, about behavior, about emotion, even about cognition, that we haven't had. [00:06:00] And if you think about what that means practically, we go from telehealth 1.0 to telehealth 2.0, and Jen really laid the foundation for this, right? In any meeting, you're collecting real-time data about sentiment and coherence; you're getting information about behavior, both in the session and maybe through some kind of rating of sleep, activity, how much time someone is on screens, all of that. There's a tremendous amount of data that can begin to inform what somebody's actually struggling with, and then help determine the right treatment. So I think it will have a massive impact on diagnosis, although it hasn't yet, really shifting us to think much more about behavior, emotion, and cognition.
What about treatment? Jen put up that kind of amazing slide that says the number one [00:07:00] use of AI is therapy. Is that a good thing or a bad thing? I tend to put this into a timeline where I think about how we did navigation when I was growing up: we had these paper maps to go on a trip, and now we use GPS. And I guess the question is, are we ready for Waymo? I'm not sure about that. I think it's gonna take a little while and a lot more study to get to the point where we have therapy in the sense that Waymo is a self-driving car.
But what we do have today are the GPS systems: the things that can help therapists, families, and patients navigate their care, fill in the time between sessions, improve people's access to care, and improve the quality of care, because they can train people up to give high-quality interventions. It's a whole bunch of [00:08:00] things; maybe we can talk more about that. But I'm pretty hopeful. That's not to say there won't be challenges, but I do think this is the transformative technology we've been waiting for. And of all the areas of healthcare that AI is going to impact, I think it's mental health where it will have the greatest impact.
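As a purely illustrative sketch of the signal set this telehealth 2.0 picture implies, here is a small Python example. Every field name and threshold is hypothetical, invented for illustration rather than drawn from any vendor's actual schema.

```python
# Hypothetical sketch of "telehealth 2.0" measurement: real-time,
# session-level language signals plus passive between-session data.
# Fields and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class SessionSignals:
    sentiment: float              # -1.0 (negative) to 1.0 (positive), from NLP
    speech_coherence: float       # 0.0 to 1.0 coherence score from language analysis
    sleep_hours_avg: float        # passive data collected between sessions
    activity_minutes_avg: float   # daily movement
    screen_time_hours_avg: float  # daily screen use

def flag_for_clinician_review(s: SessionSignals) -> bool:
    """Illustrative rule: surface the case to a clinician when several
    signals trend poorly at once. The thresholds are invented."""
    concerns = [
        s.sentiment < -0.5,
        s.speech_coherence < 0.4,
        s.sleep_hours_avg < 5.0,
    ]
    return sum(concerns) >= 2
```

The point of the sketch is the shift Dr. Insel describes: diagnosis informed by continuous, measurable data about behavior, emotion, and language rather than by subjective impressions alone.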
Jen Fisher: Briana, building on that: mental health care, as we all know and as we've heard, is deeply personal. It's deeply relational. So share your perspective, building on what Tom said. Working with employers across all these industries, what are organizations doing to find the sweet spot between the deeply human aspects of mental health support and AI?
Briana Duffy: Yeah. Following Tom's speech, I guess maybe what we need to do is put the sweet spot in the GPS so we can all find it, 'cause I'm not sure how else we're gonna get there. I'm kidding. We're figuring this out. I do think what you've talked about in terms of what we're able to do through AI gives [00:09:00] us technology advancement that, for someone who's been in the field for 25 years, is just amazing. This is not something that we've had in the past. And yet, Jen, to the point that you made, there is such an aspect of the work that we do that is so deeply human, and that human connection is so critical when it comes to a care journey, and to developing the ability to feel that wellness is possible for someone who's not feeling great. That wellness journey today really does ultimately rely on human capacity, so the intersection of AI and humans is the magic we need to figure out until we can put that dot on the map. At Carelon, we've been working a bit on how we can better predict individuals who are likely to have a chronic behavioral health need.
What does that really mean? An example would be suicide prevention and preventing [00:10:00] self-harm. We now have the ability to predict, within five months, someone who is likely going to have a self-harm or suicidal ideation event. That's pretty powerful. We know that when individuals seek behavioral health treatment, part of the intake process often includes asking about risk factors. Those risk factors, while we ask as many detailed questions as we can, typically surface intent to self-harm or suicidal thinking only about 30% of the time. So it's a great example of where technology can layer on top of what we know we can do from a human capacity, to better tell us that someone is likely going to have this event and allow us to be more proactive.
The reality is, though, once you have that information and you have a risk registry, moving from having this actionable list to actually doing something about it is incredibly challenging. You're not gonna go into your patient portal and say, [00:11:00] hey, guess what, turns out we've identified you as someone who may need to come back and have another appointment. We're not gonna send a text and say, geez, this is what we've uncovered. We need to have an incredibly therapeutic intervention with that person, grounded, as we all know, in trauma-informed, compassionate thinking, using motivational interviewing in a way that engages individuals when they have that level of need.
And the goal when someone enters behavioral health treatment: Jen and I were talking about this. She's got the word hope on her t-shirt. I heard Jen talk about it this morning; it happens to be my favorite word. What we're trying to do as an organization, as an industry, is to make sure that when people raise their head and say, I need some help, we're able to help them in a way that actually brings them some relief, so we can remediate the distress they're feeling and create hope that there is a different possibility. So I still believe humans are part of the game. AI [00:12:00] is definitely gonna make it better, but we're looking for that spot on the GPS.
Jen Fisher: Yeah. I think, Tom, I share your hopefulness, especially as I think about suicide. I mean, if we can save one life, shouldn't we at least try?
Dr. Tom Insel: Yeah, absolutely.
Jen Fisher: So, Alethea, we've talked about this, and it came up a few times already this morning: the HBR article showing that the first and second most common uses of generative AI are therapy and companionship. I wanna talk about this emotional side, the emotional connection people believe they're forming with their AI systems, and a couple of things: how that trend is evolving across age groups, but also, really, the responsible AI practices. How do we ensure, truly ensure, privacy protection [00:13:00] and ethical guidance when the genie's already out of the bottle?
Dr. Alethea Varra: Yeah. You know, there's a lot that's happening, and we're learning so much, so very quickly. We talk sometimes about the generational differences, and there are some: the younger generations are just more comfortable with technology, because it's been ingrained in every part of their lives since the moment they were born, so they're more comfortable with things like AI as well. But I don't think it's purely generational. I think there's a piece of this that is a moment in time, a historical context that we as humanity are in, where we've all become much more comfortable with AI overall. That's partially because we've become used to Google providing an AI summary of our search results. We've become used to Amazon giving us an AI summary of product reviews, and my cousin has gotten used to Walmart giving him the [00:14:00] AI-generated logistics for the most efficient way to load the truck before it goes out, right? It's just part of our lives now. It's part of the fabric and the white noise of our day-to-day lives, and we've become a little bit numb to it and a little bit accustomed to it.
So we're much more comfortable with AI being part of things, and that's great, for all the reasons that Tom talked about, and for all the pieces around safety as well, where we can really use AI to make providers more efficient, to take away some of the grunt work that is just not fun when you're a clinician. Nobody signs up to be a clinician because they love to document. We can do all of that and create safer, more efficient, and more effective systems and interventions, and that's wonderful. But then you get to this point of, okay, what about the therapy, or the coaching, or the support, and some of those areas?
And the truth is, with AI and technology, we've had tools that utilize those types of interventions for years [00:15:00] now. There's been computer-delivered CBT for years, and it's been pretty effective. But the difference is that those interventions were built on more traditional platforms, with more traditional AI, which was more rule-governed. I'm completely oversimplifying this, and our tech team is probably gonna hate me right now, but it was more if-then: if we are in the third session, then here's the content we should cover; if the client says this, then here's the response. It was very much contained within all of those rules and all of that structure.
And that's shifted, because we've shifted to this generative AI process. Again, oversimplifying, I'm a psychologist, not an engineer, but it's learning as it goes. We can give it the parameters of what we would like it to do, but over time that generative AI's ultimate truth [00:16:00] criterion is: is the human in front of me happy with this answer? So we can give it those parameters, but it often starts to drift. And the problem here is that my job as a therapist, so very often, is to sit down with a human in front of me and tell them something that is actually not gonna make them happy, right? I'm asking them to challenge their own beliefs, to challenge their own thoughts, to challenge sometimes very deeply rooted patterns of behavior in their lives. And I can vouch for the fact that in that moment, sometimes that human doesn't like that response, right? Now, I can handle that, because I know it's a process, and I know that over time this is actually what's really good for the human in front of me. But generative AI does tend to drift, and we've seen examples of that.
The most infamous one at this point is the AI chatbot that was helping to [00:17:00] support people around disordered eating. They gave it all the parameters. They told it exactly what to do and what was an okay way to coach individuals. But over time it began to drift, until eventually it was actually providing feedback that would exacerbate their disordered eating. That's a problem, right? When you just tell the person what they want to hear, that's a problem. And, to be fair to the AI, it's not that there aren't human therapists out there who will just tell you what you wanna hear. They're not very effective. They don't work for Lyra. But they all have a line, too, because they're fellow humans. So they say: I'll tell you what you wanna hear, except for when it starts to get dangerous, when I'm starting to tell you something that could be harmful. AI has no qualms with that. It doesn't draw those same lines, and so you can cross those lines and get into places that are dangerous.
So when we talk about therapy, [00:18:00] should we let that be the thing that says no, you shouldn't do it, let's give up, we don't know how to control it? At this point, my answer is no. What you're hearing here, our answer at Lyra, is no. There's so much value here, so much we can get out of this. But if we're gonna be ethical and responsible, we have to make sure that there is a human being providing that oversight, intensively monitoring all of those interactions, so that we know we're preventing that drift, preventing that exit from what actually is helpful and effective. In so many of the models out there in the world, there's none of that oversight. And if there is oversight, it's not done by an expert who really understands the difference between "that sounds good" and "that's BS," because sometimes they sound pretty close. So you need an actual experienced clinician to do that oversight and to make [00:19:00] sure that it remains safe and effective. And when we can do that, when we have confidence in that, we should absolutely use it, for all the reasons we've talked about.
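To make the rule-governed versus generative distinction above concrete, here is a minimal, purely illustrative Python sketch. It is not Lyra's system: the session plan, the keyword safety rule, and the oversight helper are all hypothetical, but they show why an if-then design cannot drift while a generative one needs an experienced human reviewing its output.

```python
# Hypothetical sketch of the two designs described above.
# Names, rules, and content are invented for illustration only.

# 1) Rule-governed ("if-then") delivery, like classic computer-delivered CBT:
# session content is fixed by explicit rules, so it cannot drift.
SESSION_PLAN = {
    1: "Psychoeducation: how thoughts, feelings, and behaviors connect",
    2: "Identifying automatic thoughts",
    3: "Challenging cognitive distortions",
}

def rule_governed_reply(session_number: int, client_says: str) -> str:
    if "hopeless" in client_says.lower():
        # Hard-coded safety rule: escalate, never improvise.
        return "ESCALATE: route to a human clinician immediately."
    # Otherwise deliver whatever the protocol prescribes for this session.
    return SESSION_PLAN.get(session_number, "Review and relapse prevention")

# 2) A generative model optimizes for user approval, so its answers can
# drift toward telling people what they want to hear. The mitigation the
# panel describes is human oversight: an experienced clinician reviews
# transcripts and flags replies that left the evidence-based protocol.
def drift_detected(transcript: list[str], clinician_flags) -> bool:
    """Return True if the reviewing clinician flags any reply as unsafe
    or off-protocol; clinician_flags is a str -> bool review function."""
    return any(clinician_flags(reply) for reply in transcript)
```

The if-then version is rigid and limited, but its failure modes are bounded; the generative version is far more flexible, which is exactly why the panel argues it needs expert human monitoring rather than abandonment.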
Dr. Tom Insel: Alethea, I think that's just great. That's one of the best summaries I've heard of the challenge ahead of us. And just to condense what you said, because I believe so strongly that this is really important for people to hear: if the task is to do what we would call manualized therapy, like getting over a phobia using cognitive behavioral therapy, it's a matter of acquiring a skill. Maybe the analogy is, if you're getting in the car and you've got the GPS and you know exactly where you need to go, the car will get you there and the GPS works great. But if you are looking for a kind of therapy where you actually don't know what to put in the GPS, where something is deeply wrong in your life and you need to explore that, AI as we know it today, with the [00:20:00] foundation models we have, is exactly the opposite of what you need, right? When I was trained as a therapist, I was taught to listen for what somebody didn't say. So you'd hear somebody talk for 30 minutes about their mother, and you'd say, why are we not hearing about your father? The foundation models for AI that we have today don't do that. They're the opposite of what you need for growth therapy, dynamic therapy, psychoanalytic therapy, whatever you wanna call it.
So the challenge is to build that new foundation, and there now are startups trying to do that. It's really hard. Again, another analogy: when we wanted to do self-driving cars, it took a huge, huge amount of time. I have a 16-year-old grandson who's learning to drive: 24 hours of instruction and he'll get a license. He may not know how to drive, but he'll get a license. For Waymo, it's been a hundred thousand hours, and they're [00:21:00] still not able to run Waymo from downtown San Francisco to the airport, right? So it's really, really hard to build these autonomous models for something like therapy. I'm not saying we can't do it, but we're nowhere close. We'll need a hundred thousand hours of that kind of therapy to become the way in which we train the next foundation model.
Jen Fisher: I will say, Alethea, that when my husband doesn't give me an answer that I like, he tells me to go ask the AI, 'cause it'll give me an answer that I like. So it's real.
Dr. Tom Insel: Not your therapist. Yes.
Jen Fisher: But he's like, if you want an answer you'll like, go ask the AI.
Dr. Tom Insel: That's why we call it a copilot. Yeah.
Jen Fisher: Exactly. So, Briana, when we talked earlier, you made a really interesting observation, and Jennifer alluded to this too, about young people [00:22:00] putting mental health preferences on their dating profiles. You have a story around this. So beyond just being more open, how is this generation, in your view, positively transforming workplace culture, specifically around mental wellbeing?
Briana Duffy: Sure. So I guess the backstory here is that we have what we call sushi night at my house, when I'm in town and my son is in town, and some of his friends tend to stop by. I like my status of bonus mom, and I wear that title proudly. It's a common event where people come into the house and we catch up on what's happening in their lives. One of his friends is well experienced in the dating app world; I will just say that's probably the way we would describe it. He is typically the king of no more than six dates; that's sort of become his identity. He'd gone beyond date six at this point, so I said, you're kidding me, this is great, tell me about this young woman. And he pulled [00:23:00] open his phone, and it was the dating app.
I admit I don't have a lot of experience when it comes to dating apps, and my son, knowing what I do for a living and my, I guess, identity as a behavioral health geek, said, show my mom what's sitting here in this profile. And common in the dating app world now, and many of you may already know this, is a sort of requirement in your ideal mate that says "been in therapy" or "know who you are." What they're really looking for, I guess, is folks who have figured out a little bit of that mental fitness and know what they're bringing to the table. I thought it was absolutely fascinating, and it made me think: if this is the requirement now in dating apps, and we're saying we're gonna weed out who's got it together already and who needs to spend a little bit more time, it would save a lot.
It would save a heck of a lot of time. But if this is the new mainstream norm in the dating world, that means it's the same expectation at home, and it's not gonna look materially different in the workplace. I think Jen did a nice job in her keynote this morning [00:24:00] describing the future workforce. When we think about the millennials and the Gen Zers already being willing to talk about "been in therapy" and what they're looking for, Gen Alpha and beyond are gonna be looking for something probably even beyond "been in therapy." So this idea of mental fitness is becoming more of a norm, and as an employer, it's what you need to be thinking about.
I think this has pretty widespread ramifications. As an employer, thinking about the definition of mental fitness and how that comes together changes a lot about what we're offering, not only for a safe and comfortable workforce; it's also an incredible intersection with leadership, and how you lead someone to make sure you're leading [00:25:00] and encouraging mental fitness. I think that really has the ability to transform organizations: who we hire into leadership roles, how we identify whether people are doing a good job, how we recruit people into the workplace. I just think this is the beginning of the start. Is it gonna be on our job applications? Yeah. Right.
Jen Fisher: So, Tom, you also shared a really interesting stat from some of your research: 70% of the students you work with prefer community-based care over this idea of one-on-one therapy or one-on-one care, which I think is really exciting and interesting. But we also know that in older generations there's a preference for more private, individualized approaches, because there's still that sense of, you know, we're not as open and willing to share about it. So how can employers create mental health systems that harness the strengths and differences of these generations and bring them together?
Dr. Tom Insel: Yeah. I think in the keynote, Jen said something like, we're there's too many
Jen Fisher: Jens? We were talking about that back. I'm like, wait, I didn't say that.
Dr. Tom Insel: [00:26:00] Uh, but what did she say? Five generations? Yeah. Currently in the workforce. Which is kind of mind boggling, and I'm not even sure where the separations are between them.
But it's really clear that this next generation coming into the workforce is looking for something different. A former company that I was advising provided mental health services for students at UC Berkeley, and the situation was that there was a long waiting list to get into the counseling center. It was like three months, which is basically a semester, and it wasn't really working. So they brought this company in to provide immediate care, and there was a whole series of different options people could get. One of them, though, was just being able to be in a group with people just like me, an asynchronous group, and find people who were struggling with the same thing. And what the company did was give people some basic skills, like [00:27:00] motivational interviewing skills and a few others, so that they were empowered to help each other. And that's what everybody wanted. At the end of the day, they didn't want therapy.
They didn't want the one-on-one stuff. They didn't want what was kind of the traditional treatment for mental health. They wanted this opportunity. And it was so interesting, because when we interviewed them and asked why it was helpful, what they said was: oh, it wasn't about getting help, it was about giving help. That was actually really empowering. That's what really gave me what I needed, and it was also very engaging for me. And whether it was loneliness or anxiety or depression, whatever it was they were struggling with, they felt they were able to overcome it within this community. I think that's a really interesting opportunity to think about. I love it. It's easy enough to set up. You have to have guardrails; you've gotta make sure this doesn't run off the rails because of trolls or whatever, so you want to have [00:28:00] someone experienced. But I think in that case we had one experienced person per thousand members of the community, and we very quickly learned that there were really big upsides for people, especially of that generation, in being part of something like that.
Jen Fisher: Yeah. I mean, the secret to happiness is giving help, not getting it.
Dr. Tom Insel: Yeah, who knew? Which might be why we do what we do. We just don't expect everybody else would wanna do that. But who knew?
Jen Fisher: All right. So, Alethea, as we're talking about this younger generation: you had talked about younger workers having a different threshold for workplace stress. Similar to the dating apps, they're more likely to prioritize jobs that they feel support their mental health. But many older workers say, well, we bring resilience from a different perspective; we [00:29:00] toughed it out, we pushed through. There are complementary strengths here, even though it feels like there aren't. How do we leverage that? I keep coming back to this: how do we bring generations together on this topic of mental health care?
Dr. Alethea Varra: Yeah, it's absolutely true. It's something we see in our own data, and it's something we hear from all of you and all of the HR leaders we work with: the newer generations have come into the workplace with a fundamentally different relationship with work overall. They came in with objectively very healthy, or healthier, approaches to their work, where they said, okay, I would like a little bit of work-life balance, there's only so much stress that I think is a good thing, and if I'm struggling, I would actually like support.
It was interesting when that first started to happen, because I do think some of the older generations grumbled about it. You used to hear, well, who do they think they are? They're entitled, right? You'd hear the entitled word. Or: well, I made it through that, and I had to tough it out, and I only slept four hours, and I liked it. [00:30:00] I'm sure you've heard this from your coworkers and others. And I think it's partially because the older generations didn't know what was possible. Suddenly, when those younger generations were actually changing workplaces and making them a little bit more healthy, some of those older generations went, oh, it could be better.
Right? And so they started to really learn from those younger generations: what should we be asking for, and what is positive in all of this? And that's a great thing. The way we make that possible is for all of us to be willing to listen and willing to learn, and to create opportunities to have those conversations where we can actually understand the [00:31:00] perspective of the person and what they're asking for, not just make fun of it.
And it goes both ways, because the thing about those older generations is that they've seen a lot. They've experienced a lot, and based on that, they have built resilience. They understand what to do when there's a lot of change in the environment or a lot of challenges. First, they know you can make it through, because they've made it through before. So part of what they give the younger generations is hope. They can say, hey guys, we're gonna make it through this together, and here's how we can do that together. But again, it requires a level of open communication and respect, and we all have the power to facilitate some of those conversations.
Jen Fisher: Yeah. I would say, as a fellow Gen Xer, I think we did ask for things, but we were told no. And then we were just like, okay, we'll do our work then. Right?
Dr. Alethea Varra: [00:32:00] Exactly.
Jen Fisher: You know? But thank you for saying no. Right. We know we were grateful that they said no, so, so Tom, back to you. Um.
You again, you kind of, you're, you're, you're bold. You go make these bold statements. And so you've advocated for evolving the current mental healthcare system, suggesting that access alone isn't enough and that a value-based approach is what could really revolutionize this type of care. Um, and that obviously challenges conventional models.
So tell me what this value-based revolution would look like in practice. Um, and then what opportunities does it create?
Dr. Tom Insel: Yeah. Well, Briana and I can maybe do this together. This is really your wheelhouse; this is what you were talking about last night, right?
Briana Duffy: Yeah.
Dr. Tom Insel: Yeah. And, you know, I do think it's an important conversation to have: whether the system that we've built is the system that we need. Frankly, most [00:33:00] patients and families would say, nah, it's just not working. It's too hard to find care. It's easier with Lyra, but outside of this, it's really not great. I work a lot in the public mental health system with people who have serious mental illness and are on Medicaid, and that's not great either. And yet the benefits there are often actually better than for those who have commercial insurance, which is even more amazing. So there's a real struggle, and again, Jen spoke in her keynote about her mom's work and how difficult it is for people. My interest has been in changing that entire ecosystem in some way. And when I ask how we do that, it's usually: follow the money. Healthcare in the United States, not the rest of the world, but here, is still largely a business. What gets delivered is what gets paid for, and the more it's paid for, the more likely it will get delivered.
What we deliver in mental [00:34:00] health care is largely through what's called a fee-for-service system: you get paid by the amount of time you spend with a patient or a family. Think about that for a moment. Is that the right way to do this? We wouldn't pay for surgery that way: the longer the surgery, the more money you get. Take your time! Right? For most other kinds of healthcare, it's not usually based on time spent. And in the public system, I know you have to document what you do every five minutes or something like that, which is a great use of AI, by the way, a great way to offload a lot of that documentation.
So the concept is to really shift the way payment is done, so that you're paying for results, you're paying for outcomes, you're paying for what we would call value. How you do that is a little tricky; I think we have to get really smart about it. We've had more success, I think, in maternal care and somewhat in orthopedics. [00:35:00] Here it's a little harder, but you could imagine paying for people staying sober for periods of time, so that becomes not just the patient's goal but the provider's goal, and actually the payer's goal as well. You can see how you might align incentives toward health outcomes rather than outcomes based on time spent. But here's the expert. Do you wanna add on here? She's really the person who's thought so much about this.
Briana Duffy: Sure. I mean, I think what we talked about last night, and what I'm really hoping we can collectively bring to the market outside of Lyra and into the larger ecosystem, is high-quality simplicity. What does that mean? It means that when we know people need care, we [00:36:00] know the services that will become available to them are high quality, and that they're easy to navigate. I hate to break it down to something so simple, but that's really what we're after. It means being able to say to someone: this is happening in your life right now, and you're a person who has serious and persistent mental illness, so we are going to connect you with these resources, with the goal of helping you no longer face, and I'll pick a few examples, homelessness or potential homelessness; being unemployed when you wish to be employed; not having good relationships with your family members because of your mental illness and the difficulties in your interpersonal relations. Say those are the three things we're trying to solve for. If we can go to a provider offering care that specializes in individuals with serious and persistent mental illness and say, we will define the quality metrics of what's going to make the person we just described better, by the way he or she defines it, and we will give an added [00:37:00] bonus payment when those outcomes have arrived in that person's life.
It really changes the trajectory. It's not about saying: you have to be seen twice a week, you need to do medication reconciliation every time you're there, and please make sure you've documented a really superb note. It's about how and when you flex the benefit based on that person's individual needs. We typically pay for that in some form of a case rate; it's like a DRG in medical healthcare. That really changes the dynamic of paying for the individual components, of having a housing stability worker, of having someone working on employment rehab, of having someone providing more traditional therapy. It's all in one bucket.
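As a purely illustrative sketch of the payment shift described here, a fee-for-service model versus a case rate with an outcome bonus, here is a small Python example. Every rate, bonus, and outcome name is invented for the illustration; none of these are Carelon's or Lyra's actual figures.

```python
# Hypothetical illustration of the two payment models discussed above.
# All rates, bonuses, and outcome definitions are invented for the example.

def fee_for_service(sessions: int, rate_per_session: float) -> float:
    """Fee-for-service: revenue scales with time spent, not with results."""
    return sessions * rate_per_session

def case_rate_with_bonus(base_case_rate: float,
                         outcomes_met: dict[str, bool],
                         bonus_per_outcome: float) -> float:
    """Value-based: one bundled payment covering the whole care team, plus
    a bonus for each outcome the member themselves defined and achieved."""
    return base_case_rate + bonus_per_outcome * sum(outcomes_met.values())

# Under fee-for-service, 20 sessions at $150 pay the same whether or not
# the person's life actually improves:
print(fee_for_service(20, 150.0))  # 3000.0

# Under a case rate, the payer funds the bundle (housing stability worker,
# employment rehab, therapy) and rewards the member-defined results:
print(case_rate_with_bonus(
    2500.0,
    {"stable_housing": True, "employed": True, "family_relationships": False},
    400.0,
))  # 2500 + 400 * 2 = 3300.0
```

The design point is that the provider's incentive now attaches to the member's goals (housing, employment, relationships) rather than to billable minutes.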
Dr. Tom Insel: Yeah, it's such a great model. Especially for serious mental illness, we know a lot of the interventions that matter most are in this general bucket of what we call psychosocial rehabilitation. The problem is we don't pay for any of that, and so it doesn't get done. [00:38:00] It's very analogous to physical rehab after you break your leg, but that does get paid for: it takes about six to nine months, and all of it is covered by insurance. All of those same kinds of resources for rehabilitation from a psychotic episode, whether that's a clubhouse, or an ACT team, or a coach who comes to the house, mostly don't get paid for, and so they really don't get done. People end up in this terrible situation where they have a first break, then a second break, then a third break, then they're homeless, then they're incarcerated. It's incredibly costly for society, and it's just tragic for the individual and their family.
Jen Fisher: We've talked about this, and I know at Lyra you all have introduced this concept of mental fitness. Tell us a little bit more about that, and why it has become part of the programming you offer at Lyra, [00:39:00] something potentially more upstream than what we were just talking about.
Dr. Alethea Varra: Totally, yeah, absolutely. We were talking about this a little bit yesterday. When you talk about physical health, people include the whole continuum in their heads: it covers whether I'm healthy and well, or whether I'm not. When I talk about mental health, most people don't include the whole continuum; they really think about mental health challenges and focus in on that. So what I like about talking instead about mental fitness is that it causes us to take a couple of steps back, really broaden the aperture, and include things like preventative care and ongoing mental health practices. You know, we don't wait until we have a toothache to start brushing our teeth; we shouldn't wait until we have an anxiety attack to start practicing mindfulness. But in order to do that, we have to be open to it. And we talked about how different generations sometimes will be [00:40:00] hesitant to talk about their own challenges. So if we can shift that language and make it more inclusive, so that we're talking about your own mental fitness, hopefully that can bring everybody into the conversation and help introduce that change.
Jen Fisher: Yeah, absolutely. Briana, I wanna touch back on something you mentioned before, which I know a lot of people in this audience, or perhaps their leaders, are interested in: ROI data. Can you talk a little bit about the types of returns companies are seeing and discovering when they invest in these more sophisticated types of care?
Briana Duffy: Sure, happy to. This is one of my favorite topics. Going back to the example I gave earlier, the algorithm we have for self-harm and suicide prevention: one of the things we know, as Dr. Insel said, is that people are interested in cost savings. So part of what we measure for ROI is always a monetary component; we consider [00:41:00] those the hard savings. In the algorithm we're talking about, for every engaged member we work with, we have seen approximately a 30% reduction in overall healthcare spend. That in itself is great for someone who's paying attention to the beans and wanting to know how the money flows. But we're also incredibly invested in the human side of what that includes, and those are more of the soft savings. So, hard savings and soft savings: when it comes to ROI, we think of the soft savings as the human aspect, the safety aspect, the cultural aspect for employers.
I'll give an example. One of the trades that tends to purchase the type of solution I've described is the construction industry. For those of you in the construction industry, you're more expert than I am in what I'm about to say, but I know enough to be dangerous: think about the amount of physical exhaustion that individuals performing manual labor experience, and what that does to someone's [00:42:00] body, and the fact that individuals are often working job to job, so a steady paycheck can sometimes be in question. There are both the environmental situations that impact individuals and the human aspect of what actually occurs to someone's body as a result, and there's an anxiety that can be produced by both. So we see that there can be more reliance on pain medication. I can go on and on, but you understand that this is a very tough constituency to support.
Being able to move upstream and actually intervene with those individuals, what we have seen is a reduction in overall pharmacy spend and a reduction in overall healthcare utilization. People are taking better care of themselves, because they're learning that they are both mentally and physically fit. That can be an incredibly powerful tool. So this is where you start to take some of the [00:43:00] components we've talked about, AI and prevention, and know which specific population cohorts need something different. How do you match that to the populations we're serving, and how do you pay in a way that brings both monetary and humanistic reward? That's really what we're trying to do.
Jen Fisher: Awesome. All right, we are down to the home stretch here. I was gonna bring a disco ball, 'cause I found out backstage that the two of you like to dance. I'm not sure if you do, but it didn't fit in my suitcase. So you don't have to dance, but I want you to be concise and visionary. In five words or less, what's one traditional idea in mental health care that could be reimagined for greater impact? Alethea, let's start with you.
Dr. Alethea Varra: Okay, five words: we can absolutely measure that. And that is because I'm tired of mental health being the special exception where we just have to trust the process and not measure anything. I want every vendor, every healthcare system, and every health [00:44:00] plan to have some accountability. Okay, that was more than five words.
Jen Fisher: But I'll give it to you. Briana?
Briana Duffy: Prevent disease and condition exacerbation. We can do better sooner.
Jen Fisher: Got it. All right, Tom.
Dr. Tom Insel: Oh, those are hard to beat. Those are awesome.
Jen Fisher: You could dance.
I'll,
Dr. Tom Insel: I'll do it with brevity. Three words. Okay. People, place and purpose. Mm. Those are the three words for recovery.
Jen Fisher: That was a mic drop moment. All right. Thank you to our remarkable panelists. Love it.
Thank you to our remarkable panelists for these bold insights and for illuminating the exciting future of mental health. What's clear is that transformation isn't just coming. It's already here, driven by AI innovations, generational insights, and a growing recognition that we have unprecedented opportunities to reimagine mental health support.
I'd like to thank everyone who helped make this [00:45:00] first-ever live recording of The WorkWell Podcast possible. And of course, thank you to Lyra Health for creating space for these conversations through your support of The WorkWell Podcast.
I hope today's discussion has inspired you to think differently about how your organization approaches mental health, not just for today's workforce, but for generations yet to come. Thank you to our producer and our listeners. You can find The WorkWell Podcast on various podcast platforms using the keyword WorkWell, all one word, to hear more.
And if you like the show, don't forget to subscribe so you get all of our future episodes. If you have a topic you'd like to hear on The WorkWell Podcast series, or maybe a story you would like to share, reach out to me on LinkedIn. My profile is under the name Jen Fisher. We're always open to recommendations and feedback. And of course, if you like what you hear, please share, post, and like this [00:46:00] podcast. The information, opinions, and recommendations expressed by guests on this podcast series are for general information and should not be considered professional advice, diagnosis, or treatment. Always seek the advice of your physician with any questions you may have regarding a medical condition.
The podcast owner, producer, and any sponsors are not liable for any health related decisions made based on the information discussed. Thank you and be well.