In the Interim...

In this episode of “In the Interim…,” host Dr. Scott Berry examines the challenge of communicating complex statistical concepts to non-statistical audiences. Drawing from firsthand experiences in agriculture, professional golf, and clinical development, as well as examples involving historical and scientific figures, Scott reflects on why technical rigor alone often fails to influence. The discussion focuses on the consequences of mismatched language, the importance of empathy, and the utility of simulation when bridging the gap between analysis and stakeholder understanding.

Key Highlights
  • Illustrated barriers to statistical communication using stories from farming, golf, and early career encounters.
  • Examples involving John Glenn, Ada Lovelace, and Charles Babbage show how communication, not just science, determines impact.
  • Insights from Alan Alda on empathy as a foundational tool for scientists presenting technical ideas.
  • Clinical trial simulations revealed knowledge gaps—such as misunderstanding of power—when communicating with decision-makers.
  • Emphasizes the necessity of translating analytic outputs into operational, financial, or clinical language for meaningful impact.
For more, visit us at https://www.berryconsultants.com/

Creators and Guests

Host
Scott Berry
President and a Senior Statistical Scientist at Berry Consultants, LLC

What is In the Interim...?

A podcast on statistical science and clinical trials.

Explore the intricacies of Bayesian statistics and adaptive clinical trials. Uncover methods that push beyond conventional paradigms, ushering in data-driven insights that enhance trial outcomes while ensuring safety and efficacy. Join us as we dive into complex medical challenges and regulatory landscapes, offering innovative solutions tailored for pharma pioneers. Featuring expertise from industry leaders, each episode is crafted to provide clarity, foster debate, and challenge mainstream perspectives, ensuring you remain at the forefront of clinical trial excellence.

Judith: Welcome to Berry's In the
Interim podcast, where we explore the

cutting edge of innovative clinical
trial design for the pharmaceutical and

medical industries, and so much more.

Let's dive in.

Well, welcome back to In the
Interim. I'm Scott Berry, your

host, and welcome everybody back.

We're at another interim analysis.

So today's topic, it is, uh, a really hard topic to sort of, um, talk about.

But I, uh, I'll, I'll introduce
it by introducing the first

time I met my father-in-law.

You'll, you'll wonder where this is going.

So I was, um, dating my wife and we went down to the farm in Iowa.

He is a farmer in Iowa, crops and hogs.

And the first time I met him and I
was, I was at, uh, just graduating

college and just making plans to go
to graduate school in statistics.

And he's a farmer.

And the first time he met me, the very first question he ever asked me, it wasn't anything about whether I was gonna be good to his daughter.

And we were, by the way, we were not,
we were not engaged at this point.

Um, the very first thing he
asked me was, why would a farmer

in Iowa care about statistics?

And to many of you out
there, you'll recognize this.

This is a layup question.

Um, this is a question we dream about now.

I was an undergraduate who was good
at probability and I was pretty good

at math despite having a, a father who
was a statistician, I, I was naive.

I, I didn't know a lot about statistics.

I certainly knew that the history and the development of statistics was largely through agriculture.

Much of Fisher's work was in agricultural experiments.

An important part: I'm in Iowa, where Iowa State was a huge academic center and developer of statistical methods.

So this is a layup question, but now I'm explaining to somebody who's a farmer.
He's going out and he's combining and he's planting and he's raising hogs, and, you know, he doesn't care about Bayesian, he doesn't care about frequentist, he doesn't care about hypothesis testing.

What does it do to a farmer?

Now, I'm sure I fumbled the answer.

I remember the question.

I don't really remember my answer, but if I had the chance to re-answer the question now, I think I could do a much better job with it. But I'm sure I fumbled it.

Now, I didn't fumble it enough that, uh, I wasn't able to eventually marry his daughter.

Um, and, and I've talked about
Tammy on this show many times.

By the way, if you're watching on the video, she hates that I put the headphones on with one ear open like that.

So maybe I'll do this.

So she thinks it doesn't
look quite as bad.

Um, but the point of this was, I'm now very technical, and I was good at math and I was good at this.

Now I have to explain to a farmer what statistics does for him, and he wants it in his language: the cost to plant, how much reward you're gonna get from that come time to combine and reap, the size of the reward at the end.

The price for this.

Uh, what crop to plant, uh, how
to plant it, how to rotate it.

Lots of things that, uh, have been tremendously experimented on, and we understand that.

And how do we learn what
the right answer to this is?

I think I could talk to him in his language, um, about what it means to him with dirty boots, and how a statistician could be helpful.

Similar discussions I have with
my brother, my older brother,

Don, uh, same name as my father.

He's a professional golfer.

And many times we have this conversation: what could a statistician do to help a professional golfer?

It's kind of interesting because
there's been a huge explosion

in the quantification of golf.

These guys all know all their numbers.

They know their shots gained.

Tremendous.

There's been a huge explosion. Fifteen, twenty years ago, the statistical analysis of golf was poor.

It was quite poor.

Um, but now it's tremendous, and I think it's a much easier discussion. But explain to somebody who's hitting a golf shot what you can do for them, in their language.

So coming to this question
about statistical communication,

how do we communicate with people? And I did a podcast with Jenny Davenport, uh, from Roche, and she spends a huge amount of time on this communication, and we talked about it: how do you talk to C-suite people?

So I, I, I want to dive into this
topic and talk a little bit about it.

And I, I will say up front, this is a hard topic.

It's one I continually strive to do better at.

I, I don't know if I'm good at it.

Uh, I, I try hard at it.

It's incredibly important.

We can do incredible things statistically or quantitatively, but if we can't communicate them or get buy-in for them, then it doesn't do any good.

And so you could be an incredibly
talented mathematical statistician.

And if you can't communicate what it means to a farmer, to a golfer, what does it mean?

So I, I want to talk a little bit about this.

Um, it's a scary thing to talk about, and as a statistician, okay, I'll refer to myself, there are things I'm very self-conscious about, or have been very self-conscious about, over time.

And probably three things that a lot of people share.

The first is how we write.

Writing is really, really hard,
and I don't know if anybody

really thinks they're good at it.

I think my father Don's
a tremendous writer.

Uh, I think he works really, really hard at it. The second is how we communicate.

So how we teach, how we present something, we're very self-conscious about that. And the third is how we code.

I think we're really uncomfortable that if people read our code, they might figure out we don't actually know what we're doing.

Or, or there, there's a logic to
coding and your logic is bad and

you, you could do this much better.

These are things we're very self-conscious about.

I, I don't want to come across as arrogant in any way, as if I'm a good communicator.

These are things I think about and strive for.

Um, and also introducing this topic.

So I'm gonna present a couple
examples that have kind of hit me

that, that, that I think present
this, this whole idea of, of

communication, scientific communication.

And part of it is, is the kind of books I read, I, I tend to read. I'm in decent health, uh, generally, but I don't necessarily sleep well.

So, uh, one of the habits I've gotten into, which is a good habit: sometimes my mind is racing and it's hard to sleep, and so I tend to read before bedtime.

And I, I tend to read nonfiction, uh,
not necessarily super exciting things.

Part of this is, uh,
these are things I like.

I'll read history.

I'll read, uh, you know, the story of Wyatt Earp in the 1800s, or different periods of history.

The Boys in the Boat. They're essentially non-fiction things, and I'm learning something, uh, about it.

So I read a book by Alan Alda.

The actor, uh, famous for his role in M*A*S*H.

He wrote a book called If I Understood You, Would I Have This Look on My Face?

He has spent a tremendous amount of
time teaching scientists communication.

And he writes a book about it.

I, he, he really has spent years,
uh, with academic scientists, other

scientists in the art of communicating,
and it's a really interesting book, a

fairly straightforward, simple read.

Written very well, as you can imagine.

And he talks about the role of
empathy in this, and I think that's

largely understanding your audience,
having empathy for them, and you're

presenting something that they're
going to get something out of this.

Am I presenting everything I know, showing I'm a super smart guy?

You know, what does it mean to the person consuming that?

Or am I presenting it in a way that they would enjoy listening to?

So I, I referred to my father; Donnie's a wonderful writer, and once I asked him about writing, and he said, you know, I tend to try to write something that people enjoy reading.

And there's a huge part of
that empathy for your audience.

And, and he talks about this, he
also talks about doing improv.

That, uh, scientists should do improv and
it, it adds to your ability to communicate

the role, the position you're in.

Really interesting.

I've never, I've never tried that.

Um, but, but a really interesting thing.

But he talks about the, the timing
of when the book was written.

He talks about the 2016 election.

Uh, with Trump versus Hillary Clinton,
Donald Trump versus Hillary Clinton.

And he, he contrasts the way each of
them speak and communicate and really the

different way in which they're perceived.

And I, for example, I appreciated Hillary Clinton.

I thought she presented things; I understood her. But he presents it very much that a lot of people didn't necessarily understand what she was saying.

And so she didn't appeal to them, as in: what do I get out of this?

What does it mean to me, the
farmer with the boots in the

ground, what does it mean?

And, and she didn't appeal, or, or present it that way. It was presented overly technical, where Donald Trump is not in any way, shape or form overly technical, and had a different kind of appeal for that.

Another wonderful example of
this is a book I just finished.

Uh, so I very much recommend that book. It was a, a very different kind of book, a book called The Wingmen, written by Adam Lazarus.

And it's a story of John
Glenn and Ted Williams.

And the, the subtitle of it is The Unlikely, Unusual, Unbreakable Friendship Between John Glenn and Ted Williams.

So, John Glenn is the,
um, renowned test pilot.

He flew combat missions in, in
World War II and, and Korea.

Ted Williams flew combat
missions in the Korean War.

He was in the middle of his, his prime for the Boston Red Sox, and in World War II he had learned to fly.

He was a pilot.

He never had combat missions,
but he learned to fly.

He was called up by the military, and he became John Glenn's wingman in the Korean War.

They flew in pairs, and his, his role was to be the wingman for John Glenn.

They're very, very different
people, and it goes through

the history of both of them.

And it's a fascinating read.

I, I'm a sports fan, uh, but it's a, a
really interesting read about both the

men and the contrast between them, um, and
the, the unlikely friendship of the two.

So he goes through a story where John Glenn, he's a renowned test pilot, uh, by all accounts very much a Boy Scout.

Uh, just good. A good person who does well by others.

Um, he is in the military, uh, a hero.

He then becomes a Mercury Seven astronaut, one of the original seven astronauts who are setting up to, to go to the moon.

This is early on, the first rocket ships to go into orbit and to orbit the Earth.

In Friendship 7, in 1962, he's the first astronaut to orbit Earth, the first US astronaut to orbit Earth.

We were actually slightly behind the
Russians, and that was the big rush that

we have to beat the Russians to the moon.

Um, and everybody's attention was on this, and there were actually some issues in Friendship 7.

He successfully lands and he
becomes one of the most well-known

people in the United States.

He's a hero to everybody.

He's, he's what everybody wants to be.

The book The Right Stuff by Tom Wolfe, I also read that, a very interesting read.

Um, it talks about this whole thing, the Mercury Seven astronauts and test pilots and the role of pilots in the military.

It's a fascinating read, and
then it becomes a movie and

the movie comes out in 1983.

So the interesting thing, what is the whole point to this? He runs for president in 1984.

He becomes a US Senator from Ohio.

Uh, a very respected politician.

He's an American hero, an astronaut, by all accounts a Boy Scout.

Uh, what everybody wants their, their sons to be, what they aspire to be.

A politician with incredible potential.

So in 1984, he's running, he's a Democrat and he's running for president, and he doesn't do very well at all.

He ends up, uh, in Iowa, the first primary, he ends up fifth. On Super Tuesday, he ends up fourth.

He ends up dropping out of the race, and Walter Mondale wins the Democratic nomination for President, uh, within this, and he gets essentially destroyed by Reagan in the election.

Um, but he wins the Democratic primary.

Second is Gary Hart, uh, who you can go back and look at; he ends up, uh, being caught with a mistress, despite daring people to, to find out if he's, uh, unfaithful, and then they do.

Uh, but, but John Glenn, this incredible American hero, ends up fifth, and ends up, uh, sorry, fourth overall in this.

So how does this happen?

How does somebody who is such a perfect candidate, uh, to be president end up, uh, you know, not even competitive within this?

And so you can look, and this book goes through it and talks about it. By the way, he asked Ted Williams to, um, to endorse him.

And Ted Williams is a staunch Republican, and he doesn't. And Ted Williams late in life says it's one of the biggest regrets of his life, because he's a huge fan, and, and John Glenn is a hero.

And he didn't endorse him because he was a staunch Republican and a Nixon fan.

And it's sort of a
really interesting read.

Um, but largely the book says it's because of his communication, and even Glenn, uh, acknowledged this at the time.

He, he's giving a talk in Alabama, and his son is an anesthesiologist, and he says, my son puts people to sleep; like father, like son.

So he, he admits that he's not a
very good speaker, and largely this

book goes through and says he's very
much a technician and an engineer.

Very smart guy, and he's
describing the details.

He's like a scientist describing
all of the details of policy.

And none of it appeals to the voter: what does that mean to me?

Where Reagan is very different than that. Reagan appeals to the voter, and they understand what it means to them as the voter.

It's a very similar situation, and so in many ways he was a bit of a poor speaker; he didn't communicate well to his audience, which, in running for president, uh, at the time, is the voters.

And he's essentially playing this role
of the scientist, uh, in all of this.

And we can all sort of take some lessons from that.

Are we talking about the technical
details where somebody says, you

know, if I had this look on my face,
would I understand what you're saying?

Do I understand what that means to
me and the decisions I have to make?

The last example also comes from a book, uh, a very interesting book called, uh, Ada's Algorithm.

And it's a story of Ada Lovelace and
her relationship with Charles Babbage.

And, uh, she has a really interesting history, uh, with, with, um, uh, Bill, uh, William Blake, and it's a really interesting book about, uh, her being sort of the forerunner of software, even though there is no computer at the time, and that's, that's sort of part of the story.

She was apparently brilliant
and understood the ramifications

potentially of the Babbage
machine more so than Babbage even.

So it's a very interesting read about her role in that.

So I, I take something out of this where he had, he had built this idea for the Analytical Engine, after what, what was the Difference Engine.

And he wanted to build this Analytical Engine, which would've been, as they refer to it, Turing complete.

Essentially the first computer ever built, ever, which would enable the use of software.

And this is, uh, you know, in the mid-1800s; he had that ability.

He needed funding to build this Analytical Engine, which would've been unbelievably powerful at the time.

And I think, you know, maybe he didn't understand what it would mean to other people.

So he had the opportunity to get funding.

So on November 11th, 1842, he meets with the Prime Minister of the British government, Sir Robert Peel.

And interestingly, at the time, 1842, it's amid widespread poverty and hunger in Britain.

The, the struggles, and the Prime Minister's dealing with that.

And that's of course forefront in his mind: poverty and hunger in Britain.

And he's going there to ask for money to build the Analytical Engine, which could be the forefront and the start of the technical revolution, maybe a hundred years before it actually happened.

Um, uh, within that.

So he writes about this meeting as an
unmitigated disaster, and I'm largely

quoting from this book, by the way.

The, the author of this book is James Essinger, uh, within this, so, um, this is from the book.

It says, Peel was in no mood to meet Babbage at all, let alone in the mood for a stressful confrontation with a mad scientist.

So Babbage, he writes, first of all, conducted the meeting in a defensive, sullen, bad-tempered, querulous, self-centered and self-pitying manner that would only irritate and alienate Peel.

But this is the part that I,
I found really interesting.

So, but Babbage, in his bitterness and haste to justify himself, tried quoting to Peel a comment that the mathematician Plana had made: that the invention of the Analytical Engine would provide the same control over the executive department of analysis as we have hitherto had over the legislative.

Now, I don't fully understand this, but largely it means we would have the ability to have this computing machine that we can provide instructions to, and it's gonna compute everything we want it to compute, and it describes this as having control over the executive function.

A rather famous quote, and this is what Babbage is telling Peel, who's dealing with hunger and famine in 1842 Britain.

Uh, he's not, uh, telling Peel the impact that this would have on the people of Britain.

What, what would happen if we have this ability, what it would do to the economy, the impact this would have. And, and the writer says he should have been to the point, pleasant, and done his utmost to explain his work to Peel in language that presented the practical function to Britain's economy of his invention.

Talking to the farmer with his boots in the mud, talking to the golfer hitting golf shots, talking to the Prime Minister about what it does to the economy of Britain.

That's, that's explaining this, good communication.

John Glenn, being able to explain to the John Doe voter what it would mean to him if he became president: he was unable to do that.

This in many ways is the
role of us as a scientist.

We're doing this very technical
work, and in this case, Babbage

had this unbelievable machine.

The real, the first Turing-complete machine wasn't built until 1942, I think it says in there, but he had the possibility, a hundred years earlier, of potentially having this.

And it was because of communication
that it didn't happen, or at

least hypothesized by this.

So, what does this mean now?

What does this mean to a scientist?

What does it mean to a statistician?

What does it mean to
be a good communicator?

I, I harken back a little bit to what, uh, Alan Alda talks about: empathy.

Empathy is understanding who is my audience. Am I talking in their language?

When I talk to somebody who's a clinical development lead, am I talking about the advantages of a Bayesian posterior probability over a p-value?

Of which there are many, but he or she doesn't care.

Does it change them? Am I communicating to them?

I don't want to have that
conversation with them.

And by the way, occasionally there'll
be a statistician who will ask that

question when I'm really trying
to communicate with the clinical

development lead about why this adaptive
design is better than a fixed design.

Sure.

It's using Bayesian mechanics.

It's doing longitudinal modeling.

It's really neat stuff.

I'd love to share the details of it, but that, that doesn't help him.

Him, or her, in that scenario. I need to explain in their language what this means.

I have to play in their sandbox.

I have to be the scientist who understands
the quantitative thing, and I do all that

work, but that can't be what I communicate.

Now I have to be able to defend that and talk to a statistician, a regulatory statistician, uh, a referee at a journal, or, or any of that, and be able to be the best at that.

But I have to be able to explain
to my audience and have empathy for

them, what does this mean for them?

Now, in order to do that, I've gotta understand: what is their language?

You know, what, what does
the farmer care about?

What does the professional
golfer care about?

What does the voter care about?

What does the clinical development lead care about in that setting?

And I have to speak in that language.

Uh, I have to understand what
resources are we measuring?

What are the constraints?

What is the science of the disease?

I, I can't ask them to know all that and then come play in my sandbox, and have them understand power and understand statistics.

Or should we do an MMRM or a t-test or a slopes model? That, that's not what they're experts at.

And if I'm having that conversation
and we're talking about that, I've

lost, I'm not communicating well.

I want to talk about
timelines for approval.

I want to talk about picking the right dose, for go/no-go.

I want to talk about how this
trial helps fund the trial.

I want to talk about, um, uh,
what it does for the disease or

the endpoints and all of that.

I'm, I'm learning that and
I'm talking in their sandbox.

Um, uh, in the setting.

So this first kind of came to me
where I was, I was designing a trial.

It was early on in Berry Consultants, and it was a, you know, really nice

trial, Bayesian trial, adaptive design,
longitudinal modeling, how we, how

we figure out an alpha adjustment.

Bayesian, Bayesian does nice work in here, and presenting really nice simulations of the trial.

Multiple trials, we could
do this, this, and this.

And the person I was presenting to asked
me a really simple question that was a

bit jarring, said, what would you do?

And why it was jarring was, I
didn't know the answer to that

and it wasn't part of my slides.

Think about how bizarre it is that
I'm presenting multiple designs and

I'm sort of being the statistician.

I'm in my sandbox, I'm doing really
cool stuff, and I'd love to tell a

lot of people about how cool it is.

The person doesn't care how cool it is.

What, what development plan,
what design should we pick?

Why it was jarring is I, I, I
recovered a little bit and I thought,

and I said I would do this design.

But in order to do that, I've
gotta understand so much about

what are the goals of the trial,
why would I pick that design?

I ended up going to slide seven and
showing, look, this design, this

is what it does in this scenario.

And then I go to design 14 and I
say, if you saw when I presented

this over here on Design 14, and
you compare that to, to slide seven.

Look at how much better this
does on resources and time.

It gets the right answer.

And then you go to slide 21 and
you say, look what it does here.

So I would, I would pick this one on
slide 14 'cause it does better on this

and realize I don't have that on a slide.

How do I not have a slide
that shows this design?

This is what it does for you.

Clinical development
lead in your language.

I should have that slide.

I should be able to answer
the most simple question.

What would you do?

And have the slides presented to do that.

And that's what I think communication
is when I am able to have that

discussion in their language.

If they're on my side of the
field, I'm doing it wrong.

They're trying to do my science. And, you know, that's a bit of the fear, because in some ways we

can really learn and understand
what it is they're trying to do.

And again, the goals, uh, of that.

But our presentation needs to be
succinct and presented in their

language for what they want.

You know, we're, we're not candidates
for, uh, uh, uh, president.

But the message is very much similar.

That, that, what do I do for them, and what does it do for them?

What are the resources?

What is the performance?

Now I've gotta know what the
right thing to present is.

What, what performance metric do I present in the design?

Is it power under a set of deltas?

Is it the probability of making the right dose selection and then moving on and getting success, and which endpoint is clinically meaningful?

I have to understand the disease.

I have to understand the constraints
about the different arms so that I can

have that conversation in their language.

If I haven't done that work or done that
part, I can't have that conversation.

And I, I'm speaking in a stale way of,
of math and then I'm asking that person

to interpret it and, and make decisions
based on a bunch of facts that I've

given him, which is really, really hard.

And that's the stuff we should be doing.

Now, this is hard to do, but it's the wonderful part, and it's the challenge of this: if our role is entirely that they do all the work.

They calculate, they, they think of
a delta, they think of an endpoint,

the number of arms in the trial,
and I just need you to calculate a

sample size that's in a vacuum and
I give them a number, uh, you know.

I don't think we're doing a good job.

I don't think we're communicating well.

We're asking everybody else
to be statisticians and all

we're doing is calculating.

So there are certain things to learn, to try to get there.

So for example, if I'm ever asked what sample size do I need in this scenario, the thing I never tell them is 240.

I think it's the right answer to the question they don't actually have, but I'm trying to learn a little bit about what are the, the things that could go wrong.

What are they thinking about?

How sure are they of this delta?

How did they come up with that?

I might present a graph that
shows power over sample size

for multiple effect sizes.

And then I circle right there and say, 80% power for that is 240, but look:

Different effect sizes.

Here's your power.

Are you worried about this?

Are you worried about that?

A bigger sample size, 90% power.

It does this.
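To make that concrete, here is a minimal sketch, not from the episode, of the kind of graph being described: power as a function of total sample size for several assumed effect sizes, using a simple normal approximation for a two-arm comparison of means. The alpha, standard deviation, effect sizes, and sample-size grid are all hypothetical placeholders.

```python
# Minimal sketch of power versus total sample size for several assumed effect
# sizes, using a normal approximation for a two-arm comparison of means.
# All inputs (alpha, sigma, deltas, sample sizes) are hypothetical.
import numpy as np
from scipy.stats import norm

alpha = 0.05                       # two-sided significance level (assumed)
sigma = 1.0                        # assumed common standard deviation
z_crit = norm.ppf(1 - alpha / 2)

def power(total_n, delta):
    """Approximate power of a two-arm z-test with equal allocation."""
    se = sigma * np.sqrt(2 / (total_n / 2))   # SE of the difference in means
    return norm.sf(z_crit - delta / se)       # P(reject | true effect = delta)

sample_sizes = np.arange(80, 401, 40)
for delta in (0.25, 0.36, 0.50):              # hypothetical effect sizes
    row = ", ".join(f"n={n}: {power(n, delta):.2f}" for n in sample_sizes)
    print(f"delta={delta:.2f} -> {row}")
```

Plotting those curves is the picture to circle on: under the middle assumed effect, 240 patients sits near 80% power, and the other curves show what happens if the true effect is smaller or larger.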

And a lot of times now, all of a
sudden, they tell me what's going on.

They tell me about their, their concerns, the potential things that maybe the design could mitigate.

And maybe it's about
the patient population.

They don't understand the variability.

They told me the standard deviation is three, and they almost create this sort of, I know everything.

It's this, this, and this.

Just gimme the sample size.

When really I'm trying to
learn about all these parts.

Now you have to have that interaction.

You can't do this where they
throw something over the fence

and you give a sample size.

It's hard to be a good statistician and, and a good communicator that way, because when I present to them, I don't know what performance is.

I don't know what the metrics are.

I don't know anything about
the financial constraints.

I don't know anything about the competitive landscape.

Are we racing somebody else? All, I mean, all of these things should go into, when I present multiple designs, here's what it does on those factors important to them.

But I have to know
what's important to them.

So I, uh, maybe it's a whole separate
podcast of trying to learn what's

important to them, trying to get at anticipated regret after the trial, what success is, and all of that.

But it's having these conversations
in the language, uh, that

they speak, not statistics.

And yes, an example happened to me where, um, somebody came to me with a design: for a certain delta and a sample size of 240, they had 80% power.

And yep, this is a good design and we're
presenting alternatives to that design.

And I simulated that design under that delta, under that assumption.

And I showed them simulation results, and the, the individual said to me, wait a minute.

If I have that blockbuster
effect, 20% of my trials fail.

Now, I think most people
listening to this will understand,

well, that's what power is.

But that person didn't
actually understand.

To them power was, oh, that's just
something that tells me my sample size

and it's a good trial if it's powered.

They didn't necessarily understand that that meant, if I have that effect, delta, the probability we lose this trial is 20%, which is a huge, uh, error rate within that, for them, on something they care about.

But showing them simulations,

It was blatantly clear in that setting
what it meant, and now all of a sudden

I'm doing, I'm speaking much more on
their side of the fence when I'm showing

those simulations and I'm doing that.
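As a rough illustration of that conversation, here is a minimal Monte Carlo sketch, my own construction rather than the design from the episode, that simulates a fixed 240-patient trial under an assumed "blockbuster" effect sized for roughly 80% power and counts how often it still fails; that failure rate comes out near 20%.

```python
# Minimal Monte Carlo sketch: simulate a fixed two-arm, 240-patient trial under
# an assumed true effect sized for ~80% power, and count how often it fails.
# The effect size, SD, and seed are hypothetical illustration values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
n_per_arm, delta, sigma = 120, 0.36, 1.0
n_sims, alpha = 20_000, 0.05

fails = 0
for _ in range(n_sims):
    treatment = rng.normal(delta, sigma, n_per_arm)
    control = rng.normal(0.0, sigma, n_per_arm)
    _, p_value = stats.ttest_ind(treatment, control)
    if p_value >= alpha:
        fails += 1          # the trial "loses" despite the true effect being there

print(f"Trials that fail under the assumed effect: {fails / n_sims:.1%}")  # roughly 20%
```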

And we don't want them doing the power.

We can make fun of them.

They don't know power,
but that's our fault.

That's our fault for being mathematical, for being, uh, way too far on the other side and not communicating, and not working, not working on their science.

So, you know, it, it doesn't do any good
to say, oh, that person should know power.

We don't wanna ask them to do the
statistics, don't do presentations

that ask them to do that work,
work on their side of the science.

Uh, within that, now you have
to understand your audience.

Are they statisticians
that you're talking to?

Are they regulators?

Are they clinicians?

Is it the VP of development?

Are they funders that the biotech is asking you to talk to about the design?

If your presentation is the same
for all of those, you're not

doing it right because all of them
have a different understanding.

They have different language, they have
different science, and we can do that.

Being good communicators, being
able to speak their language,

not asking them to speak ours.

And the more you can
work on that, it's hard.

And it's really hard and I don't
get it right and I get it wrong.

And I told you some of the cases
where I, I, I did it wrong.

But I'm striving to try to do
that, uh, in those scenarios.

And I'm, I'm, I'm trying to get
to that point, um, within it.

Now, to me, there's a huge value in this in clinical trial simulations. And why is it a value?

I can present a design, and analytically it has type 1 error control, and I can say 240 is your sample size, but it doesn't communicate well with that other person.

For example, that person who didn't understand power didn't know that 20% of the trials fail on that.

If I show them simulation results
of what effect size comes out of the

trial, and every one of those is a
trial, they understand the result of

the trial, the person I was speaking to.

So here's a trial where
it finishes, it does this.

And it makes this decision, and this
is what it means clinically to you.

This is how long it took, and
this is the resources it took.

Here's the result.

This is what it means to you economically.

That's incredibly insightful.

That's their language.

And then I can show them the simulations, and I can do it in any detail they need, to speak their language.

And simulations are a great tool to do that.
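Here is a small sketch of what that translation can look like in code, with every operational number (enrollment rate, per-patient cost, follow-up time) a hypothetical placeholder rather than anything from the episode: take one simulated trial and report it as duration, cost, observed effect, and decision instead of as a test statistic.

```python
# Minimal sketch: describe one simulated trial in operational language
# (duration, cost, observed effect, decision) instead of statistical language.
# Enrollment rate, cost per patient, and follow-up time are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_per_arm, delta_true, sigma = 120, 0.36, 1.0
patients_per_month = 20            # assumed total enrollment rate, both arms
cost_per_patient = 45_000          # assumed cost in dollars
followup_months = 6                # assumed follow-up before readout

treatment = rng.normal(delta_true, sigma, n_per_arm)
control = rng.normal(0.0, sigma, n_per_arm)
effect_hat = treatment.mean() - control.mean()
_, p_value = stats.ttest_ind(treatment, control)

months = 2 * n_per_arm / patients_per_month + followup_months
cost = 2 * n_per_arm * cost_per_patient
decision = "success: advance the program" if p_value < 0.05 else "failure: no-go"

print(f"This trial enrolled {2 * n_per_arm} patients over about {months:.0f} months, "
      f"cost roughly ${cost / 1e6:.1f}M, observed an effect of {effect_hat:.2f}, "
      f"and ended in {decision}.")
```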

I can say, here's an adaptive
design that's a promising zone

design, and it has this power.

They don't understand what that means.

Many people don't understand
what that design means.

Now you show them simulations of the decision it makes, the effect size at that time, what the DSMB would do, what the result is.

All of a sudden: oh, I don't like that, or why does it do this, or why does it do that?

They understand it.

It's, it's in their language.
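Here is a minimal, illustrative sketch of that kind of "movie": a handful of simulated trials from a promising-zone-style design, each printed with its interim effect, a conditional-power calculation, the DSMB-style decision, and the final result. The thresholds, sample sizes, and assumed true effect are hypothetical choices for illustration, not the design or operating rules discussed in the episode.

```python
# Illustrative promising-zone-style simulation: print each simulated trial's
# interim effect, conditional power, adaptation decision, and final outcome.
# Thresholds, sample sizes, and the assumed true effect are hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
sigma, delta_true = 1.0, 0.30
n1, n_plan, n_max = 60, 120, 180        # per-arm sizes: interim, planned, expanded
alpha = 0.05
z_crit = norm.ppf(1 - alpha / 2)

def z_stat(effect_hat, n_per_arm):
    return effect_hat / (sigma * np.sqrt(2 / n_per_arm))

def conditional_power(z1, n_final):
    """Conditional power of the final test if the interim trend continues."""
    drift = z1 * np.sqrt((n_final - n1) / n1)
    bound = (z_crit * np.sqrt(n_final) - z1 * np.sqrt(n1)) / np.sqrt(n_final - n1)
    return norm.sf(bound - drift)

for trial in range(1, 6):                # a short "movie" of five trials
    t1 = rng.normal(delta_true, sigma, n1)
    c1 = rng.normal(0.0, sigma, n1)
    interim_effect = t1.mean() - c1.mean()
    cp = conditional_power(z_stat(interim_effect, n1), n_plan)

    if cp < 0.20:
        decision, n_final = "stop for futility", n1
    elif cp < 0.80:
        decision, n_final = "promising zone: expand", n_max
    else:
        decision, n_final = "continue as planned", n_plan

    if decision == "stop for futility":
        outcome = "failure"
    else:
        n2 = n_final - n1
        t = np.concatenate([t1, rng.normal(delta_true, sigma, n2)])
        c = np.concatenate([c1, rng.normal(0.0, sigma, n2)])
        outcome = "success" if z_stat(t.mean() - c.mean(), n_final) > z_crit else "failure"

    print(f"Trial {trial}: interim effect {interim_effect:+.2f}, CP {cp:.0%}, "
          f"DSMB: {decision}, final n/arm {n_final}, result: {outcome}")
```

In a real design the adaptation rule and the final test would be specified to control type 1 error; this sketch is only about showing the decisions and outcomes in plain terms.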

So clinical trial simulations, there
are a lot of statisticians out there

that, oh, this is for the weak-minded.

Because they're really good
at analytical calculations and

they're probably better at it.

But does it enable you to
communicate exactly what it is

you built as well as you can?

Somebody who's gonna build a house
for somebody shows them pictures of

what the house is gonna look like.

Clinical trial simulations are that tool.

I show you a picture.

I can show you a movie
of the trial you built.

Which is hugely valuable in communicating on their side of the field, not my side, which is the math.

And we can talk about conditional power and why, below 50%, there's no inflation of type one error.

You know, we're, we're doing it
wrong in that circumstance, but

if I'm showing them trial results,
that's a great communication

skill, uh, within that setting.

Yes.

Um, it, it's a little bit like, um, I'm a candidate talking to a voter.

I'm talking about what it means to them.

I'm not talking about the details of my
healthcare policy, and this is why it's

good and it's Bayesian versus frequentist,
and, and you should like that.

They're gonna look at me like, okay, what does it mean to me?

They don't, they don't know what it means to them, and that's asking a lot of that person, to, to understand and then consume what it means to them, rather than saying: this is what it means to you.

That's our goal.

Hard.

But I believe that's good statistical
communication, um, uh, to have impact.

It also gets at this, this idea that we all strive for, to get a seat at the, the table.

Excuse me.

A lot of people say we want to be at
the table when the decisions are made,

but if you get an opportunity to get
to that table, how do you behave?

Do you earn a right at the table?

And if you're having that communication, and the science, and the, the needs of the people at the table to make that decision, when they say, what decision would you make, are you stuck with, oh, I don't know, and I don't have the right material to show you?

Or do you have that?

Um, that that's really the ability
for us to have a bigger impact.

And I think we have a lot of solutions that can have better impact if we can have that communication with them.

So I'm not running for office.

I, I would be a terrible
candidate, by the way.

Um, I, I'm not sure I could communicate to
the voters at all, uh, in that scenario.

I'm striving to be a candidate
statistician, trying to communicate

better and hopefully, hopefully you
can get something from this 'cause

you are my audience today, which is a
little bit ironic and a little bit hard.

Until next time, we'll
be here in the interim.