AI: Voice or Victim?

Confused by how AI actually works in healthcare—or worried it’s going to replace your job? You’re not alone.

In this episode of AI: Voice or Victim, we bring in a guest who’s been on every side of the healthcare system: nurse, techie, strategist, C-suite executive, and now AI transformation leader. Stacy Olinger breaks it all down in a way that’s practical, human-centered, and urgent.

You’ll learn:
  • How AI saved time and lives in the middle of COVID

  • Why most AI tools fail in healthcare—and how to fix that

  • The real reason frontline workers resist AI (hint: it's not fear of tech)

Whether you’re a clinician, a hospital exec, or just someone navigating the healthcare system, this episode will shift the way you think about artificial intelligence—and help you start asking better questions.

CTA
 👉 Don’t forget to subscribe, leave a review, and share this episode with someone navigating the AI revolution.

Subscribe to AI: Voice or Victim for more conversations that move you from AI anxious to AI curious. Hosted by Erica Rooney and Greg Boone aka AISerious™, we're helping people and organizations embrace AI ethically, strategically, and with humanity at the center.
Follow us and join the movement to shape the future — before it shapes us.

🔗 Follow us and dive deeper:

Greg Boone on LinkedIn: https://www.linkedin.com/in/gregboone
Erica Rooney on LinkedIn: https://www.linkedin.com/in/ericarooney/

© 2025 Walk West Production


What is AI: Voice or Victim?

A podcast that explores how AI is transforming careers, businesses, and industries. Hosts Greg Boone and Erica Rooney deliver real-world use cases and actionable AI strategies to help professionals stay ahead of the curve.

Stacy Olinger: Everyone thinks about ChatGPT, and so they're seeing everything through the lens of what's most familiar to them, but there's this large landscape otherwise known as AI that needs to be considered.

And so how do you actually train the nurses, who can be amazing patient advocates, to advocate for their patients but also be involved in the change and design the change, and be able to speak up to help train the model?

Greg Boone: AI isn't the future.

It's now, and whether you're in HR, sales, operations, or leadership, the choices you make today will determine whether you thrive or get left behind.

Erica Rooney: Today on Voice or Victim, we are joined by a powerhouse in healthcare transformation, Stacy Olinger. And y'all.

She's my girl, so I'm gonna give her a lot of shout-outs. But from working as a home care aide to becoming a C-suite healthcare exec, Stacy's journey is nothing short of extraordinary.

She brings this rare blend of
clinical insight, strategic

foresight, and operational
excellence to every room she enters.

Stacey is passionate about creating
systems that are not just financially

sustainable, but human centered,
and she believes that AI is a

critical part of that evolution.

Today, we are digging into the role of AI in healthcare, which I am fascinated to learn about.

I wanna hear what's up, what's
real, and what decision makers

at every level need to know.

So Stacy, welcome to the podcast.

It's so great to have you here.

It's great to be here.

Oh my gosh.

Well, I gotta kick it off.

What got you into healthcare?

Stacy Olinger: I had an opportunity of a lifetime to grow up in Alaska, so I graduated from the University of Alaska Anchorage, and paying my way through nursing school, I took a position as a home health aide. And it was the experience of driving out into the middle of the forest, where there were absolutely no road signs, and giving a woman a bath with water that I heated on her stove, that taught me that healthcare is deeply personal and people have the, um, need to have healthcare delivered in the way that's most important to them.

And they invite you into these most
intimate and sacred moments of their

lives that you have an opportunity to
trust together with them and build a

bond that is really transformational.

And that is what sealed
my love for care at home.

Erica Rooney: Oh my God.

First of all, it's just
an incredible story.

It paints such a beautiful picture, but gimme a little bit of your background and kind of coming up, because as a non-techie person, right, I hear healthcare and AI and I'm like, I'm gonna need someone to draw these parallels.

So just give me that red thread,
walk me through your story and

then we'll take it from there.

Greg Boone: Yeah.

Erica Rooney: Well I think, um,

Stacy Olinger: so I'm a Gen Xer, so I actually grew up coding, doing work in the computer. Um, that was one of my other side jobs, working my way through high school. And so I've always had more of a techie side. I call myself the 50/50, which is 50% clinical, 50% techie, which also, um, creates, uh, my own unicorn space.

I, um, have an absolute passion
to leverage and bridge the

gap between clinical and tech.

One of my, um, really probably deepest personal experiences was taking a trip to Japan, where I was able to be trained in the Toyota Production System, um, partnered together with Virginia Mason, who was a leader in this space. What I learned there was that there is joy to be found in the work regardless of what the work is.

Whether or not you're a taxi driver
or you're a restaurateur or you're

an engineer, having joy in the
work is what makes it meaningful.

And a leader's job is to remove any burden
that prevents that joy from happening.

And I saw that in healthcare.

When you're, when you're a provider
in healthcare, you don't go into

healthcare to fill out documentation.

I've never once heard a provider or a
nurse say, I would absolutely love to

spend my day typing in the computer.

I would love to spend my evenings at home, uh, charting.

Um, they go into the work because they
wanna make a difference in people's lives.

And so they know the difference between making that impact and that connection that's deeply personal with each and every person that they interact with in healthcare.

And what's standing in the way is all of
the regulatory burdens and the technology,

um, that was unintended, but has happened,
and being able to bridge that gap.

So fast forward: after spending over two decades as a C-suite executive.

So I've held the roles of a chief
nurse officer, chief operating officer.

I've also stood up my own clinical
informatics team, so I guess you would

call me a Chief Nurse Information Officer.

There's a lot of alphabet
soup in healthcare.

Also chief strategy officer
and revenue officer.

I've really worn all the hats, and so this gives me a very unique perspective: not only am I a nurse, so I know how to insert a catheter, but I understand the technical part of how you actually write the code to be able to document the catheter and to get the data out when you're looking for your catheter-acquired infections.

And at my last organization, BJC and Washington University, we had an opportunity to implement machine learning in the middle of the pandemic.

And while, um, any tech person that you talk with will say, well, AI isn't actually new, machine learning's been around for a long time, what was novel about it is how we took that machine learning and applied it to the clinical setting.

So in the middle of COVID, imagine this. Everyone's running around, right? ICUs are absolutely full. Everyone's backing up in the ED. We, uh, had actually already been working for the last year and a half with the Department of Informatics on this machine learning; it just happened to be ready in the middle of the pandemic. I guess in some ways you could say thank goodness. So we were able to predict within 24 hours of admission if a patient had a mortality risk, um, that was high.

That would signal that, um, they needed some additional attention. They needed to be asked what their plans were. And the power of being able to allow a patient and their family, before they were intubated in the ICU, to make the decisions about what was most important to them.

Did they wanna go to the ICU?

Did they wanna die at home?

Giving them the opportunity to be in
the driver's seat is what AI enabled.

And most importantly, we were able to understand how the clinicians needed to be able to work together with the technology to make it transformational.

Most technology innovations in healthcare are: here's your new toy, sorry, you have to use it, and you have to use it starting tomorrow. And by the way, you still have to keep up the same production. So good luck with this. And it's clunky. So sorry.

What we were able to do is actually
work together with the providers and

with the nurses and get their feedback
and actually learn from them and

actually put in the clinical decision making and the clinical controls so that they had the ultimate veto power. So we were able to get over 87% adoption, in the middle of COVID, in an ICU, implementing AI.

And what I realized is that by
knowing how to be able to do that

and transform care, this is really
what healthcare needs today.

And so that's what caused me to launch out and found Caleb Healthcare Groups, so I can help more healthcare organizations bridge the gap between clinical and technical and advance AI.

Erica Rooney: Alright, Stacy, here's
what I love about everything that you

just said, and this is, I think, very
pivotal for a lot of people when they're

thinking about AI, is you started to talk about, you know, keeping the joy in work.

And I thought immediately, okay,
people, what's not joyful is

the paperwork, it's the filing.

So yes, I see where this is going,
AI's gonna take all this away.

And I was like, okay, cool.

But then you took it sixteen steps further, which is really about the patient care and how it could really be so impactful, especially in the last moments of somebody's life.

And for me, that is, you know, really that aha moment about AI and about how it just rapidly changes.

Like we went from just, like, optimizing some of the boring, mundane tasks to really making a massive difference in someone else's life.

So, I mean, thank you for taking me
on that wild ride in like 30 seconds.

But Greg, your thoughts?

Greg Boone: Yeah, I mean, I, I
thought a lot the same way, right?

Like, so on a personal level, about 20 months ago, I lost my mother to cancer that we didn't know she had, and it was widespread, and we had about, I don't know, two and a half, three weeks.

Right.

Uh, you know, so there were a lot of thoughts that I had then. And then a couple months back, I was in a, uh, healthcare-related AI, um, discussion at our event, and, uh, Dr. Michael Jabor from, uh, Microsoft was talking about healthcare and talking about AI.

He talked about his own
personal story, right?

And how when his father passed away, you know, he said there were five doctors around and they couldn't figure out what was going on, blah, blah, blah.

But he said later, you know, I don't know how much time was in between when that happened and, you know, how, uh, machine learning and AI had progressed. But he said he basically took those same types of symptoms and all of those, uh, medical records, and through different systems, they all came back with the same right conclusion within two minutes.

Right.

And so to your point, and to Erica's point as well, I think part of it, and people kind of get lost in it, is the idea that it's about saving time here, and it's about this case, but there's also the end result of giving people back time.

So it doesn't change anything; nothing was going to save my mother, right? But if I had more time, if I had more recognition and understanding, like, this is what's going on, you know, I have a lot of regrets on the back end of, man, I should have done more of this, I should have spent more time there.

Right?

And so, not to say that, you know, when people talk about AI, they don't think about the humanity behind it.

So thank you for sharing that because I
do think more people need to understand

it's not just about what can it do
for me individually, how can it help

other folks, whether it be consumers
or members or patients, right.

Or the family.

So thank you for that.

The other comment I was gonna make, to your point about the, uh, you know, we've been using AI for a while, and the early days of the pandemic, how, you know, thankful you were that you'd already put those things into place.

And I tell people this all the time. It's like, hey, I know ChatGPT had a moment in, you know, November of '22, right? And that gave a lot of air cover for a lot of executives.
I said, but the real air cover started with the pandemic, where people could come out of the woodwork and say, you know, it actually doesn't take us two years. We can actually, you know, use some of these tools.

But it was the first time I think, as a
society, people were like, okay, go ahead.

You can push that button now.

Right.

And I think that has
been very, uh, impactful.

And I need more people to understand, or I hope more people understand, that it's not just the ChatGPT moment of November of '22.

Like the pandemic allowed us to finally
say, let's unleash some of these things

that we can do, and to your point, to
actually give something back to humanity.

So I really appreciate it.

I just wanted to tell you that.

Stacy Olinger: Yeah, I
really appreciate that.

I think one of the things that there is a lot of resistance from the medical community about is, what about that one time that there's a misdiagnosis? And the advocates, who are also providers, are speaking up about, what about the ten that we would've had an earlier diagnosis or a more correct diagnosis for?

I think anyone listening to this podcast has probably either known someone or has had a personal experience where they saw several specialists or had a delay in the right type of diagnosis and they didn't get the help. Providers would need to actually study full-time to be able to assimilate all of the medical research coming out.

But that's where AI can then actually help to assist: not to make the decision for the clinician, but rather provide them with all the most up-to-date research so that they actually can apply what's more widely known and more widely practiced and proven to the individual patient that's sitting in front of them. And that's what's missed.

I also have a very personal story.

My brother, about a year and a half ago, died unnecessarily from a, from a pulmonary embolism at the age of 45. And, um, from seeing three different providers, I know that if each of those providers would've had his medical information at hand, they would've made different decisions.

I don't blame the providers at all.

They were working really hard, but
there is so much cognitive load that

the providers have to remember of
what to do and what to document.

They have full patient schedules,
they have stuff going on at home.

And, um, this is where I think we
have the biggest opportunity to move

forward, but to move forward in a way
that recognizes that there is risk, that

there are ethics, and that there is bias.

That all needs to be addressed.

There needs to be governance, but that
should not slow down the innovation

and should not slow down our ability
to implement in a very human way.

Greg Boone: So, one, one more thing, just as a follow-up. I'm glad you brought it up. At this same event that I went to, there were about a hundred people in the audience, and they were all healthcare- and education-related. And so I was that guy, and I asked a question. They said, we only got time for two questions. I was like, Ooh, pick me, right?

And the first thing I said was, what do you do if your only governance is from the legal community that just says don't?

And, and so Dr. Jabor again had a very
great response 'cause he had already

shown some statistics about, you know,
uh, AI and their ability to diagnose

correctly versus the average doctor.

And what he said was, I think, pretty profound, to your point. He said, in the next five years, it's more likely than not that it will be illegal for a doctor or a surgeon to, you know, provide any level of healthcare without AI. More likely than not, he said.

So what I would say to that legal community is, this is going to happen, you know, and it should happen, and it's for the betterment of humanity.

So I just wanted to call that out.

It's like, I think people are starting to recognize that this is where we're going, right? And so that's why, when we talk about AI: Voice or Victim, either ride the wave or get crushed by it, but this is where we need to go.

Erica Rooney: Well, I'm glad you brought up bias, because that brings me immediately to all of the women in healthcare who are Black, Brown, or any other color other than white, who are all completely ignored in the healthcare system.

I mean, the data is out there
that, you know, if a black woman

goes to the hospital with pain,
that she's dismissed, right?

Immediately we're talking, yes, there will be bias in AI, but there is just bias in people, and that's causing massive, massive issues. So, I don't know about you, but, like, I would rather have a computer with a little bit of bias than the person who was supposed to be looking out for my health and wellbeing.

Greg Boone: I mean, I, I told you
the story, you know, my mom went

to the doctor many, many times.

They treated a lot of
different symptoms, right?

It wasn't until she was literally on her deathbed that they ran her through the scan, and they were like, she lit up like a Christmas tree.

I mean, she was at my house June 23rd.

She was gone July 31st.

Right.

There was plenty of opportunity.

Right?

But to your point, we don't know
right now, my wife, that side of my

family is uh, um, Hispanic, right?

Her two sisters are both nurses, right?

My wife is a dental
hygienist by trade, right?

So they're very into health.

But she said to me, like, while this was going on, she's like, you know that this happens a lot, right?

Especially with, you know, people
of color, like they're just

treating the symptoms right.

And they're just viewing as,
oh, she's got a little pain.

She's just coming in here.

Right.

And so I, I also agree, like now I
also understand that there's bias.

AI is trained on the whole world
and the world has bias, right?

So it's not, uh, it wouldn't be
fair to say that there is none.

But to your point, what AI
doesn't do is get tired.

Right.

It doesn't have, you know, family drama
that is impacting your medical decision.

It doesn't have a golf game or create

Erica Rooney: experiences, right?

Because whatever lens they're looking through, it's, it's from their own personal experiences, and whatever that was is now tainting or helping how they care.

Right.

It just depends.

Stacy Olinger: Yeah.

The best healthcare organizations
are actually using a level of

transparency and recognizing that bias.

So there have been some great studies showing that if the provider is actually shown how the model has been trained and the potential bias that exists, they will always make the right decision.

And you also have to watch for the data.

So if you are just taking past medical history, right, of, let's say, you know, a large data set of Medicare patients, and you're recommending treatment protocols based on the treatment that's been provided to those Medicare patients for the last 10 years, then there's bias in how those Medicare patients have been treated.

So you can, once you recognize
that, you can train to that.

And I think that's where we are seeing some of the models really well developed, but it's not about out-of-the-box LLMs being correct.

And that's what I'm really passionate about: providing education and training and support to the community of healthcare professionals so that they can be educated and ask the right questions, and then, as stakeholders, as leaders, implement those solutions in their organizations in a very responsible way, because things are really shiny right now.

It looks really great, but if you're not educated, then how do you know to ask the questions and make sure that you're supporting the right level of governance, all the way to implementation?

Erica Rooney: Yes.

And you're doing an AI webinar
series soon, so tell me about that.

Stacy Olinger: Yeah, this was really generated out of, uh, the recognition that, um, Greg, to the point that you mentioned, everyone thinks about ChatGPT, and so they're seeing everything through the lens of what's most familiar to them.

But there's this large landscape otherwise known as AI that needs to be considered.

And so how do you actually train the
nurses who can be amazing patient

advocates to advocate for their patients,
but also be involved in the change

and design the change and be able
to speak up to help train the model?

Same thing with providers.

How can we be developing more research and, um, developing our systems and structure? You have to be able to educate, uh, whether it be the administration, the providers, or the frontline staff, so that they know what it is.

So it makes it a little less scary, and it also helps 'em to recognize that there's a difference between generative, agentic, and also just simply AI that helps to tie together the data from, you know, whether it be tens or hundreds of thousands of different medical systems that are out there.

Greg Boone: Yeah.

And, I mean, yes, there is a, you know, significant difference. And I would say that, you know, machine learning is what we're talking about today in a lot of ways, and generative AI is a subset of machine learning, right?

But there is a difference, right?

And to your point, um, a lot of people have been more familiar with the gen AI side of the equation, and there's some value, significant value, there, right?

Now, what Erica and I talk about, both on the show and offline, though, is that a lot of times we're in this bubble and we believe that everybody has used ChatGPT or knows about it. But I would say 99% of the world actually doesn't know what generative AI is and what we're talking about.

And that's one of the challenges I think that we're all facing as a society, and especially even within corporate America. Like, this is the first technology that impacts every single person on the planet.

Right?

And so you and I both have,
uh, backgrounds in, in

software and in development.

And you talked about, um, kind of the,
the lean or Six Sigma or, you know,

type of approach from a transformation.

I tell people all the time. I posted something just yesterday on LinkedIn; I was commenting, you know, on someone, and I was like, you keep talking about use cases, and you keep talking about X, Y, and Z and how you're using AI as a business, but the point you're missing is, 90-something percent of your organization, all they hear is, you're replacing me.

Right?

Right.

And they don't know what AI actually is.

What are you talking about?

Is it a robot, like, Stacy?

What are we using?

What is ChatGPT?

I've heard of it.

I mean, people are like,
yeah, yeah, I know what it is.

I'm like, alright, well, what are you, you know, how are you using it? I don't really use it. I'll say, you actually don't know, because if you knew, you would use it.

Erica Rooney: there's not gonna be a robot coming in to put in your catheter, let's just say.

Greg Boone: I mean, I already, like, had a bad visual earlier when she kept talking about that, but now you just put in robots. And Stacy was the one who

Erica Rooney: started talking
about catheter, so I was just

pulling the red thread through.

I was like,

Greg Boone: and then that made it worse.

Right.

Just so we're clear, the red thread's a really good visual. Like, can we move back to Stacy? Could you ask a, you know, professional question? A thought-provoking question?

Erica Rooney: Okay, here's a thought-provoking question, okay?

Because we've talked a lot about
the front liners, but I think that

it's gotta start at the top, right?

Because you've gotta lead by example.

We've gotta show people.

So let's talk about the
C-Suite for a second, right?

What should healthcare execs really be thinking about when it comes to AI?

Stacy Olinger: Think about what problem you're trying to solve.

Uh, there's a lot of noise out there.

Sometimes you, um, actually
don't need AI right now.

You actually need data governance.

So if you haven't done the basics,
um, you need to start there.

And you also need to clearly
define what is the problem you're

trying to solve and who is the
problem you're trying to solve for.

And then from there, be able
to then partner with the right

solution, and most importantly,
involve the users in the design.

And so it's also important for the leaders to understand what the problem is they're trying to solve. What is the ROI?

What is the impact, not only on the
clinical teams and the patients, but

what, how are you going to be moving
forward to sustain and implement?

One of the misconceptions that leaders who don't have a lot of experience with AI have is they think of it like their electronic health record system.

Which is: I buy it, I implement it, and then I just wait for the electronic health record company to come up with an update. And then I'm at the mercy of whenever the upgrade comes, and I just used to grumble. And if it doesn't work, I don't use it.

AI isn't like that.

You need to be prepared to actually,
uh, retire an AI model that isn't

working and move to a different one.

It has to continuously be iterated, and
if you don't have that mindset, you're

not prepared or you maybe don't have
the infrastructure prepared to really

be able to, to move along with it.

If you look at where we've come in just the last six months or two years, it's a completely different trajectory than what the electronic health record companies have done in the last two years.

And so changing your mindset.

But also most importantly, um,
thinking about the workflow.

So I love that you brought up the
point that people are gonna think

that it's replacing them, but in
fact it's going to augment them.

So I'll just talk about coding.

There's a lot of wonderful solutions
that are helping to automate coding.

That doesn't mean that the
coder is going to be replaced.

That means the coder is then going to use the AI to more quickly be able to validate and be able to do more differently, not more with less.

But if you don't train the coder how to use the AI, then they're gonna go back, because either they don't trust it, or they still want their job and they have to show that they need so much time to review a chart.

They're gonna go back to the
chart and review it the way

that they've always done.

So workflow, and workflow retraining, and being able to honor the human experience and augment and help the teams to be able to support that is actually revolutionary in healthcare.

It's not something we've done, um,
or have necessarily patterns to do.

Greg Boone: Yeah, I mean, those are such critical points. I know recently something went viral about the, uh, CEO at Shopify, and basically he said, this is gonna get leaked, so I'm just gonna put it out myself.

Right.

He said, because my
email's gonna get leaked.

And basically what he said, paraphrasing, is: he's mandating now. He had made it optional to use AI, or he encouraged using AI; now he is mandating using AI, to the point where it's gonna be a part of people's performance reviews.

Right.

And they wanna see how
people are using it.

Right now there's a lot of, uh, folks saying that that's the wrong approach, and this and that. Like, I understand why he's doing it, right?

Part of it is he's trying to get folks to recognize that he doesn't want to become the victim. He doesn't want the company to become the victim, which then they would become, and that's part of the point that people are missing.
Right.

The other part is the expectations, right?

Shopify's e-commerce is radically
different than when we're talking

about members and patients, right?

But if you think about the job to be done
that you kind of described earlier, right?

Imagine what kind, how much more
time you can give back to families.

Right, like, it's about the patients and that experience. It's not just about, you know, you as an employee, but if you can create that through line, right, uh, from, you know, what the customer or what the patient needs back to the coding person, the person, no matter what your role is in the organization, if you can create that through line, that can be super powerful.

When we talk about workflows.

I tell people all the time, AI will absolutely replace a lot of processes.

Right?

And if your main job is just to pass the paper from one machine to the other, then yes, there is some risk in that. But I can't imagine, one, anyone signed up for that, or two, any leader is like, yeah, that's what I want you to do every day.

Right.

So yes, to augment makes a lot of sense.

I think the, the thing is, I agree that it's from the top down, uh, but if you've been in any type of transformation, you also have to find internal champions.

You have to find pilot projects.

So I guess I'll, I'll ask this question, 'cause I've seen this come up a lot of late, which is: what do you do in a highly regulated, highly, uh, invested company that's been spending the last 10 years investing in centers of excellence, centers of analytics, big data, and now you're telling folks there's a different way, right?

So are you seeing resistance
because of the tech debt and

what people have invested in?

Is that holding folks back, do you believe, in the healthcare space, or am I missing the mark?

Stacy Olinger: I think for the listeners
out there who aren't very experienced

with healthcare, healthcare is not like
the finance and banking industry in

which the data sets are standardized.

I can go anywhere in the world and go to an ATM and access my funds. I cannot go anywhere in the world and access my health record.

And so one of the biggest challenges is that when Meaningful Use was put out, there was not a requirement to have interoperability.

And so there are health record systems
out there that don't interface and

don't have interoperability, so there's
not a way to get the data in and out.

Um, I work with some organizations
that are using, um, RPA to be

able to get the information out.

That's crazy.

Greg Boone: Can you define that for, for the audience? I know what RPA is, yes. But,

Stacy Olinger: um, it's, uh, process automation. So it's using a bot, um, to be able to do what a human would do, in terms of, it's a robotic...

Greg Boone: Process automation or something.

Stacy Olinger: That's right. Robotic process automation.

And so it looks like training the computer to do what a human would do: log in, get the information, and then move it over to a different program or screen.

And so recognizing that's some of the
barriers is just simply how the data

architecture is and on the limitations
of the electronic health record companies

and the number of different inputs.
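For listeners who want to picture what that RPA bot is actually doing, here's a minimal sketch in Python. The system names and record fields are invented for illustration; real RPA platforms drive actual login screens and UIs, where here two dictionaries stand in for the two record systems.

```python
# Hypothetical sketch of the RPA pattern described above: a bot "logs in"
# to one system, reads a record, and re-keys it into another system that
# has no interface or interoperability with the first.

legacy_ehr = {
    "patient_001": {"name": "Jane Doe", "allergies": "penicillin"},
}
new_system = {}  # the destination system, initially empty


def rpa_transfer(patient_id: str) -> dict:
    """Mimic what a human would do: look up the record on one
    screen and re-enter it in the other program or screen."""
    record = legacy_ehr[patient_id]        # "read the screen"
    new_system[patient_id] = dict(record)  # "re-key it into the other system"
    return new_system[patient_id]


rpa_transfer("patient_001")
print(new_system["patient_001"]["allergies"])  # penicillin
```

The point of the pattern is that no API or data interface exists between the two systems, so the bot imitates human keystrokes rather than exchanging data directly.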

The second barrier is around HIPAA, the Health Insurance Portability and Accountability Act, which allows you to say who you want to share your information with.

There isn't a really elegant way to provide the level of authorization that you want, as a patient or family member, to anyone you may encounter, whether it be an urgent care, or a pharmacy, or the hospital that you go to, or two different hospitals that you go to if you're on vacation.

So that's another challenge: by trying to protect your health information, it also creates a barrier to the data moving around and helping to support some of these efforts.

So those are two things I think about.

And the third is just simply some of the risks that we talked about before, and legal considerations, and the traditional glacial speed that some healthcare organizations move at.

Some of it is due to caution, and it does allow for the time needed to get the results and the accuracy. However, it is a balance, and I think that's what COVID showed us, right?

When we're able to speed things up and have appropriate guardrails, we can actually see innovation reach the patients faster.

Most organizations have actually rolled back their COVID policies and approvals, and so I think that's where I'm seeing some of the slowdown in innovation as of today.

Hmm.

So interesting.

Erica Rooney: I'm also interested in, like, how are the frontline workers feeling about AI?

Because I think in my experience,
those are the people who I see are

like the most anxious about it.

What's your opinion?

Stacy Olinger: I think, as with anyone you would talk to, you have the early adopters and you have the late adopters. So you'll talk to some providers who absolutely love it.

It's, it's absolutely been a game changer.

It's allowed them to be at home,
at dinner with their families.

It's allowed them to focus on what matters
most and being able to spend the time

that they want to with their patients.

There's others that don't understand it, or find so much friction with it that they don't use it at all.

So I'm seeing a really wide spread in adoption, and some of it is directly related to the implementation, or the type of AI.

Um, and some of it is actually, um, due
to the comfort level of the provider.

So I'm seeing a really wide range right
now, and I do expect that to change

over time, but I think that's what we're
seeing is that normal change curve.

Erica Rooney: I know we talk a lot about psychological safety, and when you're talking about change and implementing it, I imagine in the healthcare setting it's hard to say, oh, it's okay to fail, like that's what we're all doing here. It's not like they have this safe place to go and fail.

What advice would you give to frontline workers who are anxious, who want to lean in, but are terrified of the failure?

Stacy Olinger: There's an arc going from novice to expert, and if you have been a provider and you're now 20, 30 years into practice, you do not want to walk in front of your patient, whether that be into their house, or into the clinic room, or up to the hospital bed, and not act like you know what you're doing.

And so you have to decrease that barrier.

So in the story I shared earlier about the machine learning we implemented within Epic, what we learned is that even though we could signal that the patient had a high likelihood of mortality, we then had to address a secondary barrier: was a hospitalist or internist prepared to have the really difficult conversation with the patient and family about the trajectory of the disease, and to help them go through the decision-making process?

And what we found out was no.

So it's not just about the tech, it's also about the communication.

But this is where AI can help, so that providers have a safe space to practice. There are models out there. I mean, right now you could talk to ChatGPT, you can even train it as the model, and say, all right, I want to have this type of communication with my patient.

And we actually did this without using AI.

We put them all in a classroom and we did roleplay. We actually had trained professional actors that the providers could interact with, to get comfortable having these conversations, so that once we had the technology, it was actually able to make an impact at the frontline. And now technology is able to do that.

The technology could be that trained actor that we actually had to pay people for, but you could do all of this in the comfort of your living room, and have a conversation to help you feel like you are ready, and now expert, so that when you walk into that patient room the next day, you can have the conversation with confidence. And then you don't get bad reports for poor bedside

Erica Rooney: manner.

Greg Boone: Yeah, I mean, you touched on so much there.

One of the things I was playing around with last weekend: I was using HeyGen and I was using Synthesia to create digital avatars, right? Basically to do these types of workshops. And I fed it some information.

I was just amazed at the virtual digital twin, or whatever you want to call this avatar, that was speaking back to me with all the stuff that I prompted, using emotion and things.

But to your point, it's a great way to train.

Another thing I tell folks all the time is that I use the roleplaying aspect of it too, right? So I set up some prompts for HR and for other folks.

It's not just asking a question, it's having a conversation. The reason why it's called ChatGPT is that it's a conversational engine. It's not like a traditional search scenario.

So the roleplaying aspect, I think, if people understood it... again, there's a lot of education that has to go in, a lot of demystification.

The other thing is, you said the word Epic. That's the number one. Every time someone in healthcare reaches out to me, it's like either they love it or there's some eye roll. It's like, oh, we're implementing Epic right now, do you guys know anything about this? I'm like, man, I don't know anything about that, but that's like the platform, right?

But what I would say is, you already have this. We got comfortable with doctors and PAs and nurses and folks walking around with tablets, right? So you're already looking at something, right?

And so I would say it's not a big stretch to then just have that something be an AI.

I'd rather you be looking at that, because when I actually look at the screen now, it's just a bunch of checkboxes and dropdowns. And you're asking me about my medication? I'm like, man, you should know. You shouldn't be asking me. Right?

And I always tell folks this, regardless of what industry you're in: as the consumer gets used to, as society gets used to, using a technology. If you go back to the internet, people said there's no way someone's gonna put in their credit card and buy anything, in 1998. I remember sitting at IBM and people saying that, and I'm like, never say never in tech, right?

But as people start to adopt these tools, as it becomes more commonplace, it's also gonna be the expectation, right?

And so if I'm sitting there, and someone doesn't have what I perceive to be a clear understanding of what's going on with me, I'm going to start looking at it, and now I'm judging you.

I had this just happen to me recently. I've been dealing with certain issues related to my knee, and finally, in December, I was using Gemini. It's when Google dropped their Deep Research, if you had the Pro version of Gemini. Everybody keeps talking about, oh, Deep Research is new. I'm like, I've been using it since the beginning of December.

But I started to dig around there, and I was like, this is what's going on, what is the issue? Within a minute, it came back and told me what was actually going on. I've been going to the doctor for three years.

Erica Rooney: So what you're saying
is you've become one of those people.

Greg Boone: Like, this is not WebMD.

This is a, uh, democratized intelligence.

Erica Rooney: It's elevated how you're using it, trying to self-diagnose.

Greg Boone: So here's the thing, Stacy. You kind of touched on this earlier, and I'm not trying to diminish the conversation, 'cause I'm gonna turn it into a restaurant analogy, right?

And what I tell folks... you were saying something about someone that has 20, 30 years of experience and this and that, right?
It's still contained in what they know as an individual.

And so, what I was telling my friends: we were sitting at dinner, and they were asking our server about wine. I said, you understand that she can only give you information based on what she knows, what they're selling, and what she believes you can pay for.

Right.

I was like, so there's this idea that we are getting the best of humanity, with that context and those guardrails, versus I can go on my phone and ask the world what types of wines I should be trying. I don't understand the argument. I understand the argument that people make, but I also don't get it, right? Because now I'm relying just on what you know as an individual, and I'm putting all of my faith in that.

And that's just wine.

Now you put healthcare in the mix. Now I am subject to that, and that's why people, you know, would go get a second, third, fourth, fifth opinion.

Alright, I'm gonna get a billion opinions and I'm gonna get them to coalesce.

And then how close is it to
what you're talking about?

So sorry.

Erica Rooney: No, all I was gonna say is, we've been conditioned to trust and believe our healthcare provider, regardless of anything else, right? How much experience, anything.

And so I think what this is gonna do is
just give us a different option, right?

To say, Hey, I still want that
one-on-one human interaction in times

that I'm talking to a person and they
can use their judgment and their care.

But also I have more knowledge at my
fingertips, which I think is really huge

Greg Boone: and I can go
have dinner with my family.

Right.

You said that, and I've heard that even in, like... there was one on the financial industry, financial services. They were talking about young folks going to work on Wall Street. They're like, hey, I finally have something. Now I can actually make it home in time to eat dinner.

Right.

And that's amazing for me.

And I think that's one of those
things people don't understand.

I always talk about flipping
the time equation, right?

We always talk about this as moving away from mundane tasks to, you know, in our world, the mind-blowing creative of the marketing kind of world, right?

But the idea of being able to give that time back. It still goes back to, how, Greg, how do I? Because I also have heard from some folks, and you mentioned this earlier: if you just tack it onto the end of an already existing process, you've just added more work. That's how they're viewing it.

Instead, if you had a 10-step process, use AI to reimagine the process, and maybe it's three or four steps, or maybe the agent, or the system, handles it.

So we've talked a lot. What's your reaction to this?

Stacy Olinger: Well, first of all, thank you for raising the issue. I've enjoyed the conversation with you about the human element, because that's truly what's most important. I think a lot about that.

I love the container idea that you just expressed, that you can only give out of what you already have. But what AI enables is actually a whole other world.

So let me talk about end of life for a minute, since we were talking about your mom.

Everyone has different life experiences
and different things that are

important to them and different values.

And just because you're from a certain country, or have a certain color of skin, does not mean that you have a certain religious belief, or a certain culture, or that the things that are important to you are things the provider actually knows about.

But what if care could
be that individualized?

We talk a lot about precision medicine. We talk a lot about the genome, and how AI can help enable and specifically recommend treatments based on your genetics, which I also think we haven't talked about yet, and which is really gonna be great. We're gonna see that in the future.

Greg Boone: Like CRISPR or we talking,
what are you talking about here?

Stacy Olinger: Well, if you have a certain genetic code, certain medications are going to affect you in a different way. Certain chemo agents will affect you in a different way. Being able to match those up, without having to go through 10 different trials to figure out which one works, is going to be really great.

But back to the other human element of it: if your provider actually knew what was most important to you, and could build that care plan and those recommendations based on what was important to you, that you were able to share with your provider, and that was then shared with the entire care team, would that not be transformational? Would that not meet what's missing in healthcare today?

So we think a lot about what we want it to do, and you're talking about the one step or the 10 steps. But I think about, what if we just completely reimagined?

A lot of times when you ask the frontline staff, how could AI impact you, what could it do, it's like, well, just make me not have to do this paperwork. But what if, in addition to not having to write things down on paper or check all the boxes, it also allowed you to get back to the heart of medicine, why you got into it in the first place?

Erica Rooney: Yeah.

I think the question needs to be shifted, not just to, how can AI help me with this problem, but how can AI help me with the emotions of my clients, right? Like, how I want them to feel.

Because when you started talking about emotions, I thought, if I were a doctor, I'd want people to feel good when they leave me. I want them to feel like they're in control of their bodies and their health and their lives, right?

And I'm sure that's probably most physicians, right? So how can we use AI to really pull those emotions in, to serve our clients?

And I think that's huge.

We talked a lot about
using AI in healthcare.

I...

Greg Boone: One thing, if I can. Oh, go ahead. 'Cause the parallel to what you said, on the marketing side, is hyper-personalization at scale.

Right.

And so people have been talking for many, many years about, you know, selling one-to-one. But you're talking about hyper-personalization in healthcare, right? That's radically different. Like, you need to know a lot more about me, not just my symptoms. 'Cause we all fill out the same kind of questionnaire on the intake: are you a smoker, do you do this, blah, blah, blah.

Right?

But to your point now, we have actual
tools where you can give a level of

healthcare with hyper-personalization.

Right, and how meaningful
would that actually be?

And I also think that it would transform
how people think about healthcare.

'Cause a lot of times, a lot of folks will just avoid it.

I know in my family and in my culture,
and I can't speak for all people of color,

but I know a lot of my family members have
a very anti healthcare view because they

feel like it's a very cold experience.

And so, if you could actually elevate this... I think we have tools now, for the first time in humanity, that can make this more realistic.

So I just wanted to comment on that.

'Cause I loved how you phrased that. And if more people could understand the outcomes of it, versus just, to your point, this is not to go do your essay or homework for you. Imagine what level of care you can give these people now, right? And their experience. And then more people would want to go into the healthcare system, which would help drive down costs, and you're doing more preventative care and things like that, you know?

I really appreciate all of this; I'm sorry, it's a great topic for me. Sorry. That's...

Erica Rooney: ...all.

Listen, my job is also to
keep this train on the tracks.

Okay?

Right.

And so one part of our podcast
is we love to make things

actionable for our listeners.

So I've got two questions left for you.

The first one is gonna be: if you walked up to somebody and you were like, this is my one piece of advice for you to take in the next 24 hours to move yourself forward on this AI needle, from anxious to curious, and then eventually to AI Serious, what would that action be?

Stacy Olinger: The action is that not
making a choice is making a choice.

Oh.

And so be active.

Be active.

Take your responsibility, attend a
webinar, educate yourself, be involved

in the future that you wanna see.

I

Greg Boone: love it.

So I love that you said that, right?

The webinar piece. You're, like, only the second guest to say that. Everyone else says, oh, just be curious, and again, they don't know what you're talking about. And one guest said, go watch some videos.

So I was like,

There it is.

You have to see this.

Go to the webinar.

Someone needs to see it. And my hope is that in your webinar, you at least show something, because a lot of people talk about it, but they're talking from the perspective that someone already knows what this looks like.

Right?

They already know what
the experience is like.

I'm telling you, 90 something
percent of the world does not

know what we are talking about.

We are in a bubble.

Erica Rooney: I mean, this
conversation alone though, has

shifted my perspective on what AI
could possibly do in healthcare.

So.

If you're listening and your mind
hasn't been shifted, I don't know

what to tell you, but what you
have shared has been incredible.

We also love for people to see how this is used in the real world, right? I...

Greg Boone: ...play our game.

Erica Rooney: And this is where we play our game.

This is called Last Chat.

All right?

And this is where you have to pull up your ChatGPT, your Gemini, your AI of choice on your phone, and share with us: what was your last chat?

Oh, and don't worry, we
don't make you do this alone.

We'll dive in here too and tell you what our last chat...

Greg Boone: ...did. I have to participate?

Erica Rooney: Absolutely.

We're going all in.

Greg Boone: Oh boy.

Erica Rooney: So, okay, alright, what was yours? Oh, we're going first. We're gonna go... can I go last? You can go last, but if you need to give a little context for the prompt, you may go

Stacy Olinger: ahead.

Well, oh, I was actually having ChatGPT help me with an AI implementation guide.

Greg Boone: Without disclosing any confidential information, how did you phrase that prompt?

Or how did you ask that?

Stacy Olinger: So I actually first loaded up my own original content. I said, here is what I do with organizations. Now I want you to structure this in a six-piece framework. I want you to give me a title, give me a one-sentence takeaway, and then tell me what group it's focused on.

And so it did.
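As a rough sketch of how a prompt like that might be assembled, here's a small Python example. The wording, content, and framework fields below are illustrative stand-ins, not Stacy's actual prompt.

```python
# Hypothetical sketch: load your own original content first, then ask the
# model to structure it into a framework with a title, a one-sentence
# takeaway, and the group each piece is focused on.

def build_prompt(original_content: str, num_pieces: int = 6) -> str:
    return (
        f"Here is what I do with organizations:\n\n{original_content}\n\n"
        f"Now structure this into a {num_pieces}-piece framework. "
        "For each piece, give me a title, a one-sentence takeaway, "
        "and tell me what group it's focused on."
    )


prompt = build_prompt("We run AI-readiness workshops for frontline clinical teams.")
print(prompt)
```

You'd then paste that assembled prompt into ChatGPT or send it through an API. The idea is that you're supplying your own context plus a desired output structure, not typing a four-word search.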

Greg Boone: It's great.

And so if I dissect that, because one of the things I try to get across to folks is that this is not your old-school Google search, right? You know, the average Google search last year was like four words. The average AI search was over 22 words, right? People need to understand that I'm having a conversation. You weren't searching for something, you were asking it to go do something for you.

You gave it greater context.

You loaded it up so that it understood exactly what the mission is, understood more about you and what you're trying to accomplish.

Right.

And a lot of people don't understand that it's multimodal, so you can give it images, text, video, whatever you need, to give it greater context.

And then you structured
how you wanted the output.

Right.

Again, this is not just an old-school search where I'm gonna get 10 blue links.

Right.

And that's where I'm saying the power of these tools comes in, and I use it in a very similar way, you know? But now, we also have tech backgrounds, right? And so I don't know if that's the nature of how we grew up, right, or our experiences.

But that's one of the things too. I tell people it's natural language processing, meaning that you can have a regular conversation with it. Just ask it in those terms. 'Cause I'm certain someone's gonna be listening and wonder, well, how did she write that, and this and that. No, you asked it the way that you would say it. You're just talking to it.

So I, I appreciate that.

Alright, Erica, what's yours?

Erica Rooney: Alright, this just makes me cackle a little bit.

Oh, because this has happened on another episode: my husband hijacks my ChatGPT account, so I can kind of see what he's doing in real time.

So is Dan aware? Listening aware? No, he is not. Well, he is aware that he shares this with me. He is not aware that we take this game and go public with it.

So apparently he's interviewing today, and so he was asking, what are some good interview questions? And I followed through. He's...

Greg Boone: ...interviewing someone?

Erica Rooney: He is interviewing someone to be on his team, and so I followed through the interview prompts.

And it finally got all the way down to, how do you answer the question about work-life balance? And then I see that he went in further to say, hey, refine the following statement about work-life balance. And then he goes in to describe the culture and what he thinks that balance is, to get a very short, sweet, succinct answer to give to the candidate when they ask.

So that was his last chat.

Now I know what he's up to today.

Greg Boone: Well, and then he could take that a step further and say, can you give me a roleplaying exercise in which I do this? He probably didn't...

Erica Rooney: ...account for that much time. He's probably gotta get on that call, like, now. So you're saying he probably...

Greg Boone: ...did that while the question was coming up? Maybe.

Erica Rooney: On the interview, you know. But what's your last chat?

Greg Boone: Oh, so I gotta go? Your turn. I thought I had, like, BSed enough that I didn't have to go. Yeah, like when you're in class, you try to put your head down so they don't call on you. Yeah, that never works.

Um, so mine, I'm gonna give some context; it's specific to my knee situation.

I said: I have two small tears in my meniscus. I have osteoarthritis. Why does my knee, and I know I'm old, you tell me this a lot, why does my knee feel worse and swell some after icing it? Should I just stick to heat?

Erica Rooney: See, you don't need to ask ChatGPT that. You could've hit up Erica, and I would've just said, you're old, quit playing basketball so much.

Greg Boone: See, that's why I don't ask you for much advice. But what it did was talk about the likely reasons.

And now, one of the other things I choose to use a lot of times is some of the deep research; there's different models. So what people don't also understand is, if you want a fast answer, you know, folks will have a Flash model or a basic one. If you want deeper research, where it's going to go pull back, you know, 50 different websites, those things take sometimes anywhere up to half an hour. This is not just a fast response, right?

And so that's a whole different thing; we need to have an episode talking about the difference in how people are using these models.

Right.

But it gave me the likely reasons. It gave me, you know, heat therapy. Obviously it's also telling me, you should probably still go talk to your doctor, like, I am not, you know...

But the other thing is, because I give it so much context, it talks in ways that are familiar to me.

Mm-hmm.

Now, which is good or bad.

I've told people on other episodes, the other problem is that because I've trained it, and it's gotten used to me, every time, at the end, it gives me a monetization strategy for how I can turn this into a business opportunity, no matter what I ask it.

I was like, I know. Hey man, this is healthcare. Please stop. I'm just talking about my personal health. I'm not trying to sell anything.

Erica Rooney: Mine always gives me, like, a let's-do-this.

Greg Boone: Yes, 'cause she's trained it, so it talks like her, right?

Erica Rooney: Funny.

Greg Boone: Yeah.

Spicy.

Erica Rooney: Yeah, spicy.

Well, Stacy, thank you so much for coming on, for talking about AI and healthcare and your vision of it for the future, and everything that you're doing with this webinar series.

Where can people find you, so they can come check out these webinars when you have them?

Stacy Olinger: Oh, fantastic. So we'll put, hopefully, a link in the show notes. But you can also go to calebhd.com; that's the website for Caleb Healthcare Group. You can also follow me at Stacy Olinger

Erica Rooney: and she's on LinkedIn.

She lives there a lot too.

So thank you so much, Stacy.

Thank you.

Thanks for joining us on AI: Voice or Victim.

If you want to stay competitive
in the AI age, start now.

Take one insight from today's episode and
put it into practice in the next 24 hours.

Make sure to follow us, share
your thoughts, and subscribe

for more actionable AI insights.

See you next time.