A podcast on statistical science and clinical trials.
Explore the intricacies of Bayesian statistics and adaptive clinical trials. Uncover methods that push beyond conventional paradigms, ushering in data-driven insights that enhance trial outcomes while ensuring safety and efficacy. Join us as we dive into complex medical challenges and regulatory landscapes, offering innovative solutions tailored for pharma pioneers. Featuring expertise from industry leaders, each episode is crafted to provide clarity, foster debate, and challenge mainstream perspectives, ensuring you remain at the forefront of clinical trial excellence.
Judith: Welcome to Berry's In the
Interim podcast, where we explore the
cutting edge of innovative clinical
trial design for the pharmaceutical and
medical industries, and so much more.
Let's dive in.
Scott Berry: All right.
Welcome everybody.
Back to, in the interim, uh, I'm
your host, Scott Berry, and I
have a, a guest today, Kaspar Rufibach.
And I, I, I'm, I don't know
how close I got to that, but
I, I gave it a, gave it a shot.
Uh, so Kaspar, welcome to In the Interim.
Kaspar Rufibach: Thank you Scott, and,
uh, you, you pronounce it quite well, so
very good approximation to my last name.
Scott Berry: Okay.
So, so Kaspar is currently a, uh,
statistician, and I wanna make
sure I get your title correct here.
What is your title at Merck?
Kaspar Rufibach: So I'm, uh,
co-head of a group which is called, um,
Advanced Biostatistical Sciences.
So I'm the co-head of Advanced
Biostatistical Sciences at Merck.
Scott Berry: Yeah.
And we'll, we'll go into a
little bit of, of your history.
Uh, uh, Casper's having a lot of impact
on, uh, ICH, uh, European leadership
within the statistical community,
within pharma, uh, within regulators.
Uh, and has a, has a
really interesting history.
So I'm, I'm thrilled to have a
discussion with Kaspar of all things
science, leadership, statistics.
So let's start with, uh,
how you got into statistics.
Maybe a little bit of your, your
background, your educational background.
Kaspar Rufibach: Um, yeah.
That's funny.
In high school, I mean, I was
drawn to math and physics and
these, these quantitative things
I really was interested in.
Uh, but I wasn't sure
what, what should I study?
And, and in high school, or
what we call here gymnasium, we
had these descriptions of jobs.
And there was one description
of an insurance mathematician.
And, uh, he was describing how
he insures kind of oil tankers
and what you need to do for that.
And, and that I found really
interesting and I didn't want to
study pure math, for certain reasons.
So what this insurance guy did study
was, uh, a subject at the University
of Bern, quite close to where I grew
up, which is called, uh, Mathematical
Statistics and Actuarial Science.
So I then went for that and
started, and I had a lot of
insurance mathematics.
I learned about statistics and it
really hooked me, and I thought, I will
go into the finance industry, because in
Switzerland you have a lot of insurers
and banks, at least at the
time when I studied.
And uh, then when I started my PhD.
Scott Berry: Well, let,
let's back up a little.
So in the US I think we have
actuarial, uh, exams, for example,
somebody becomes an actuary.
Did, did you, did you go
down that route at all?
Kaspar Rufibach: No.
So I did then a master's in, uh,
mathematical statistics and
actuarial science.
And I think these exams, they would then,
or I would take them once, I would have
entered industry and worked for a while.
I think this is how, how it works here.
Then you'll take these exams
and then you become an actuary.
Um, technically, um.
So, yeah, I then started my PhD
right after the Master's, and, um,
at the university, always, uh, one
of the PhD students would kind of
be rented out to a collaborative
group, the Swiss Institute
for Applied Cancer Research.
Like you would then work three days
on the PhD, and two days you would
get exposed to biostatistics there.
And, uh, that job became available
and I talked to my office mate who
did it before and said, oh, is, do
you think that'd be something for me?
And then he said, oh, I
always found it interesting.
Just give it a shot.
And, uh, so I started there and that's
how I ended up in biostatistics.
I, so I started kind of designing
and analyzing phase two and phase
three cancer trials back in 2002.
And, uh, yeah, this really hooked
me, and I found it really very
interesting, and, uh, stayed
with that and never looked back.
Scott Berry: Oh, very interesting.
So, so there, you're at
the University of Bern.
You complete your PhD at that point.
Um, and then you do a
year postdoc at Stanford,
Kaspar Rufibach: That's right.
Scott Berry: In the US. Yep.
Was that a biostatistics,
uh, associated or not?
Kaspar Rufibach: No, no, my PhD, so
technically I have a PhD in mathematics,
but it was, so it was a, a topic in
mathematical statistics, kind of semi
or non-parametric density estimation.
And um, yeah.
Then, in Switzerland you had this
opportunity to apply for postdoc grants.
So it's the Swiss government
who, who gives you some money
to basically go anywhere.
Um, and I took that opportunity
and, uh, in Stanford, I, uh, I
worked on, on related topics,
uh, compared to, to my PhD.
So it's not like I pursued something
biostatistical, but I mean, one of the
perks of going to Stanford, of course,
and as a postdoc anyway, is you're free.
I mean, you get the money upfront,
you basically can do what you want.
So I got in touch.
I, I did what a lot of people do.
When I was at Stanford, I, uh, I looked
for: what Swiss people are around here?
And then I met a few of those, uh, a
professor of neurology and, uh, postdocs.
And they were all from Switzerland,
so I met them because they were Swiss.
But then we started to collaborate.
So I then worked with them on,
on doing some very early machine
learning stuff in, uh, in Alzheimer's
disease, kind of building
prediction models.
Um, so I, I kept a foot a little
bit in this biostatistical arena.
And then when I came back, um,
there was an opportunity at the
University of Zurich, at the, uh,
Institute for Social and Preventive
Medicine, in the biostats
department, if you want.
Um, where I could continue
to do that research.
But then also, uh, one of my
tasks was consulting of medical
doctors of the university hospital.
And that's where, I mean, before at
this collaborative group, I did clinical
trials, but then with this consulting, and
I guess everybody who has done that for
a while in their professional life, they
know how this is, it is, I mean, you just
sit at your desk, you have these doctors
coming and they have all these problems.
And every, every 30 minutes you get
exposed to a very different problem.
And that's where I, I think, broadened
my horizon and my knowledge about
statistical methods, uh, dramatically.
And I also kind of complemented the
thorough theoretical background that
I have through my PhD and postdoc.
Then really, I, I really learned
about applied statistics.
Um,
yeah.
Scott Berry: A, a, a
very interesting thing.
It sounds almost like drive-through
consulting, if you will, um, kind of thing,
and, um, uh, without, perhaps,
I dunno, maybe you had the ability to be
on an extended grant and work on it and
really get into the details of that.
But, uh, a, a nice hit-and-run,
uh, look at consulting.
So you do that, and then the
decision is what to do.
I don't know if you're, you're
entertaining academia at this point, if
you're entertaining different places, but
that's the point where you go to Roche.
Kaspar Rufibach: That's right.
So you are perfectly right: at some
point you need to make a decision.
And, uh, I always kind of, my, my
kind of personal deadline was my
35th birthday, that after that I
would know which direction maybe to go.
So, um.
And I mean, there were a few things.
One thing was this theoretical research.
I came to the office every morning, and,
and it, it started to feel like: what
problem that no one has do I solve today?
And uh, and I got a
little bit tired of that.
And the other thing is I was teaching
in a master program in biostatistics,
and basically I was telling
students how to run clinical trials.
And of course you can
learn that from books.
I had this experience from the
collaborative group, but at some point
I felt maybe I should have a look myself
first before telling others how to do it.
And, uh, of course, I worked in
Zurich; Basel is one hour away.
And if you are a biostatistician
in Switzerland, it's quite
obvious at some point.
I mean, you know a lot of people in
the ecosystem in Basel anyway already.
And if you are interested, that's it:
it's very obvious to knock at
the door maybe at some point.
And that's what I did.
And then, uh, joined Roche as
a trial statistician in 2012.
And I, I started one week
after my 35th birthday.
So I was well within
my, uh, my own deadline.
Scott Berry: Okay.
Uh, very interesting.
Um, I, I'm still trying to figure
out what I want to do in life, and
I, I blew past my 35th birthday,
so that, that's true.
So, um, uh, Basel, with Roche and
Bayer, is that a hub for pharma
within Switzerland?
Are there others that I should know?
Kaspar Rufibach: I think there are
others you should actually know.
Scott Berry: Okay.
Okay.
Okay.
Kaspar Rufibach: I mean, uh, Basel I think
is the only city globally that is the host
of two of the biggest pharma companies.
The other one being Novartis.
Scott Berry: Oh, Novartis.
Yes.
Yep.
Kaspar Rufibach: they're both
headquartered in, um, Basel
and around those two companies.
A huge ecosystem has grown over time.
Pharma, biotech in general, but
also biostatistics specifically.
There are a lot of companies, like
you mentioned, Bayer, who has expanded
in Basel, uh, recently quite a bit.
There are some mid-sized pharmas.
There are biotechs.
Um, and Switzerland is very small,
as I said.
There's a one-hour train ride to Zurich,
with ETH and then this biostats
department at the University of Zurich.
Then it's about one hour to Bern,
with other biostats capacity.
It's about 90 minutes to Lausanne,
with the other ETH, EPFL, and, uh,
statisticians, uh, one hour to Fribourg.
So everything is very close.
Coming from America, you could basically
say it's virtually all in the same city,
because if you think about a 60-minute
train ride like a 60-minute drive
somewhere, that's within the
same city, basically.
So there's a huge ecosystem of
statistical capacity and it's
also, there's a huge sense of
collaboration, especially in Basel.
So for example, we have the Basel
Biometric Society, and you would
organize seminars and trainings.
And they're just open to anyone
from all companies, from academia.
We have Swissmedic, the
regulator, which is also in Bern.
Um, and then you would just
convene and have discussions.
And I think this contributes to a very
attractive workplace for statisticians,
uh, that Basel is, and, and has been.
Yeah.
Scott Berry: Yeah.
That's awesome.
And, and that's where
you currently reside?
Kaspar Rufibach: So I, as you see
in the background, I live about two
and a half hours away from Basel in,
Scott Berry: Oh, okay.
Kaspar Rufibach: In, in my hometown.
Um, but I, I'm, I mean, my
professional ecosystem, uh, is Darmstadt.
That's where Merck is; internally,
a lot of my stakeholders are there.
And, uh, Basel, where there
is this kind of, uh, critical
mass of statisticians,
where all these events are,
and a lot of my collaboration partners.
So, uh, I, I regularly go to Basel
just to meet up with, uh, people.
Scott Berry: Hmm.
Yeah.
Very nice.
Okay, so, so let's go back.
So you go to Roche, uh, you
become a, a statistician at Roche.
Uh, I, I've never worked
within a pharma company.
Tell me what life is like as a
statistician, and it doesn't have to
be specific to Roche and, and nothing
confidential, but what is life like
as a statistician within pharma?
Kaspar Rufibach: So I, I mean, I came
with this academic background, a, a
very thorough education in theoretical
statistics, applied statistics,
and I entered Roche with, I mean,
it was my explicit wish, and it was
also the clear kind of communication
from those Roche people who hired me.
They basically said, you
have all this background.
We see your potential, but bottom
line, you have no clue about
clinical trials and drug development.
So you learn that first and
uh, this is what I did and
this is what I truly enjoyed.
For about five years, I was a trial
statistician, so I was running trials,
designing trials, and enjoyed that a lot.
You learn a lot.
You come in and you have the idea
that the biggest piece that you
contribute is your statistical
expertise.
And of course that's, you
do, that's what you do.
But I think the even bigger piece, the
more senior you get is you help teams
identify their questions, be precise
about what they want to know, what they
need to know, what is the bigger picture,
kind of this structured thinking that
you learn when you write a PhD thesis.
You have this huge problem.
You have no clue how to solve.
And then you chop it up in small
pieces, you solve all the small
pieces and you put it back again.
And that drug development is,
is, is very similar to that.
And the more you get exposed to
that, the more you learn about that.
So, um, I found it very, very,
yeah, I found it.
I mean, you asked about
how, how's life at pharma?
I found it very, very interesting.
Immediately I found I could bring all
this statistical expertise to the table.
You have the collaboration
with all these other functions.
You have to understand the inner
workings of clinical trials.
Um, make sure that when people from
other functions wanted to change
something about your trial, always in
your head you checked against, okay,
do I still get what I need at the
end of the trial when they start to
change and tinker around with things,
um, I found this very, very interesting.
Scott Berry: yeah.
So, so to some extent, um, I, I
guess it, it, how do you define what
statistics is, what the science of it is?
I mean, in some ways this,
this is kind of our expertise.
The clinical trial science is,
you know, clinical trial is asking
a question of mother nature.
You're, you're posing a question to Mother
Nature and the whole part of this, and you
know, what are we trying to, to answer?
What questions are we asking?
How are we asking them?
How does it fit in within the development?
I mean, to some extent, this is the
science of the clinical trial,
which is statistics.
Now, it, it sort of depends; it's
much more so than calculating a sample
size or that sort of thing, and, and
trying to figure out the whole puzzle.
What is the puzzle is, is kind of a
strength of statisticians, if we jump
into the full team of all of that.
I mean, to some extent it is statistics.
You may not learn that in
a textbook, but somewhat.
That's our science.
Kaspar Rufibach: No, I completely agree.
And, uh, the way I phrase it, when, when
I tell younger people, for example, what,
what to expect in pharma industry, I
tell them, you enter as a statistician,
you know, all these methods, um, you
have a good quantitative understanding.
Uh, but you have to evolve
into a drug developer.
What you described, you said
it's statistical science.
Maybe you can say it's drug
development, and that's much,
much broader than statistics.
But what you learn as a statistician
and what you bring as a statistician
is paramount and absolutely
crucial in drug development.
And, uh, and you, I think
statisticians should not shy away.
From assuming these responsibilities
and assuming that impact
that you can actually have.
And, uh, sometimes I say that some
statisticians have the tendency to
hide behind technicalities,
because that's what we have learned
and that's where we feel comfortable.
But the reward is, is huge if you
broaden your knowledge, if you
understand what other functions are
doing, what they need, so that
you can help them get what
the statistical bits and pieces, uh,
remain intact so that when the trial's
finished and, uh, has been successful,
that you can proceed and, and, and
actually get the license for the drug.
Yeah.
Scott Berry: Yeah.
Yeah.
Uh, okay.
So you, you described it as though
for five years you kind of jump in,
you become a trial statistician.
What, what happens after five years then?
Then what?
Kaspar Rufibach: Um, of course when
I started at Roche and, and I did
this trial work, it's not like I
forgot about, about all my background,
so I, I talked to a lot of people.
And, and each time somebody came to me
and said, oh, I have this challenge here,
of course I would read the
paper, I would talk to someone else,
I would just give a
talk in the department.
I, I would continue to do research.
At the time, uh, Roche started to
establish a quantitative decision
making framework around assurance,
probability of success.
I got sucked into that and saw that
there are a lot of open methodological
questions, which I started to tackle
and work with other people on them.
And, uh, yeah, a couple years later.
Uh, Roche had a kind of
a methods group, but did not
really commit to it at that time.
And at some point there again,
was kind of this ambition to, to
build or to have a methods group.
And, uh, they hired a couple of very
senior people, or, or kind of took very,
very senior people. But somehow, through
all the work I already did, I was
formally not a methodologist in a
methods group, but I was behaving like one.
I mean, I tried to, to make sure I
did my work on the trials efficiently.
And then you always have half a
day you can do something else.
And, and so I used that time.
Um, well, so I, it was then
somewhat obvious that I should
become a member of that group.
And the concept at the time was
that nobody was a full-time member.
You always worked 50% on projects
and 50% in that group in order
to make sure there is connection
and you are not kind of like an.
Internal academic thing that
is disconnected from the,
from the actual business.
So we called that a rotation, and then
that's how I ended up in that group.
I stayed on the project, just
kind of cut down a little, a
few of my responsibilities.
And the other 50% I was then
formally part of our methods group.
Um, and then over time, the
concept was changed a little
bit, because it became very
difficult to stay on projects,
because then project timelines
would always dictate, of course.
And it was difficult to find
the other 50%. And by the fact that
a large piece of the work, even in
the methods group, was and is
consulting for project teams,
there was no risk you lose the
connection to the business anyway.
So then, uh, at some point I just
became a hundred percent member of that
methods group, and, uh, you know, the
perk is, you then get a lot
of freedom to do,
Scott Berry: So, yep.
Kaspar Rufibach: yeah, you go ahead
Scott Berry: So, so let's,
let's back up a little bit.
So you, you gave a word, you
said assurance, um, uh, and
predictive probability, which, um,
has a Bayesian nature to it.
Uh, in that, would you
consider yourself a Bayesian?
Kaspar Rufibach: Um, so first,
maybe we can first have a discussion
what the Bayesian nature of assurance is.
I mean, you have this unknown... well,
because that's a question that
often comes up.
Uh, what is Bayesian?
Does Bayes' theorem need to be involved?
Is it just kind of, you have an
unknown parameter over which
you assume a distribution?
Uh, that's always a funny discussion.
I mean, I don't think, I mean,
first of all, I don't care so much.
I have a problem.
I need to solve that problem and whatever
helps me solve that problem that I use.
Uh, but I, I don't think, I
mean, I have a couple of papers
that have Bayes in the title,
um, primarily around assurance.
Uh,
Scott Berry: Uh, okay.
Yep.
Kaspar Rufibach: I, I don't
think people would consider me
a Bayesian, and I, I do neither.
Sometimes I even say, if you look
at what I did in my PhD, I'm a
dyed-in-the-wool frequentist.
Um, but,
Uh, yeah, whatever works.
But I also, I mean, it's not like
I can open Stan and then
implement something just from scratch.
It would really, it
would make me scratch my head first.
Uh, so it's not like this is
routine stuff for me, but, uh, I'm
very open to, to whatever works.
And what I, of course, find tempting is:
Bayes gives you the interpretation many
stakeholders, clinicians, think they
get from frequentist statistics.
And, and I think this is, uh,
what makes it tempting, of course.
Scott Berry: Yeah.
Yeah.
Uh, specifically within this, I, I know
you list in some of your interests,
uh, predictive probability of success,
which is sort of, which is a, a
Bayesian thing. I think Bayesians have
had to use assurance because frequentists
took all the, the nice words, and we
have to come up with terminology to
mean similar things, whether it's a
credible interval or, uh, uh, assurance.
But, um.
If, if you're employing methods within a,
a pharmaceutical company predicting the
likelihood of success in a trial, it's
incredibly natural thing to integrate over
the uncertainty of the various parameters
rather than picking a single value and
saying, if Delta's true, this is our.
This is our power.
So, uh, it, it feels like an incredibly
natural way, as you described,
to solve a problem of interest
is using Bayesian type things.
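The contrast described here, conventional power at one assumed effect versus assurance that averages power over a prior on the effect, can be sketched in a few lines. This is a minimal illustration with made-up numbers (sample size, prior), not the framework discussed in the episode:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

n_per_arm = 100   # patients per arm (illustrative)
sigma = 1.0       # known outcome standard deviation
alpha = 0.025     # one-sided significance level
z_alpha = norm.ppf(1 - alpha)

def power(delta):
    """Power of a two-sample z-test when the true effect is delta."""
    se = sigma * np.sqrt(2.0 / n_per_arm)
    return norm.sf(z_alpha - delta / se)

# Conventional power: pick a single value and say "if this delta is true..."
power_at_point = float(power(0.35))

# Assurance: integrate power over the uncertainty about delta,
# here a normal prior delta ~ N(0.35, 0.15^2), by Monte Carlo.
deltas = rng.normal(0.35, 0.15, size=200_000)
assurance = float(power(deltas).mean())

print(f"power at delta=0.35: {power_at_point:.2f}")
print(f"assurance:           {assurance:.2f}")
```

With the prior centered on the planning value, assurance comes out lower than the point power whenever that power is above 50%, which is exactly the sobering effect that makes it useful in quantitative decision making.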
Is that commonly done in pharma?
Kaspar Rufibach: Yes.
I think yes.
So, uh, so at Merck, for example, we
are still in the process, or, um,
I mean, that had already
started before I joined a year ago.
I want to give full credit to
the people who were there; they
developed a really nice quantitative
decision making framework, uh,
which is based on assurance.
Uh, so you have like a trial design,
you have a definition of success,
which is very important, which
effect you want to beat, and then
you, you try to build your prior.
Such that it synthesizes or
integrates all the evidence you have.
And that evidence can come
from different places.
Internal clinical data,
real data, benchmarks.
I call it benchmarks.
Uh, you have these huge databases with
trials and you just count how many
made it within a certain indication.
That gives you some indication,
and then you have uncertainty that
maybe is difficult to quantify, but
you still want to feed that into
your assessments; or we have kind
of a framework where, where you
can somewhat formalize that.
Um, but ultimately you get a number
out of it because that's what you need.
It feeds into decision making.
It helps evaluate your pipeline.
Uh, but for me, the biggest
asset in such a framework is
it incentivizes transparency.
Because whatever you can, whatever
you want to quantify, you have to
put your cards on the table and
make a very transparent assessment.
What evidence is out there?
How do we treat it?
Why do we not take it into account?
Why do we take it into account?
Decision makers don't like the small
probability of success that you compute.
Okay?
Then you have this list of
criteria and they can pick which
one do we want to mitigate?
Okay, this one's 2 million.
This one's 5 million.
I mean, of course I'm exaggerating
how easy this is, but.
This is the direction in which this goes.
And, uh, I think this
is very, very useful.
And then we use Bayes in dose finding;
for example, Bayesian logistic
regression is a very established
tool in, in dose finding.
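Bayesian logistic regression for dose finding can be illustrated with a small grid-approximation sketch. The doses, cohort sizes, toxicity counts, and priors below are all hypothetical, chosen only to show the mechanics, not any company's model:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical dose-toxicity data: doses tried, patients per cohort, toxicities.
doses = np.array([1.0, 2.5, 5.0])   # mg (made up)
n_pat = np.array([3, 3, 3])
n_tox = np.array([0, 1, 2])
x = np.log(doses / 2.5)             # centered log-dose covariate

# Model: logit P(tox) = a + b*x with b > 0; priors a ~ N(0, 2^2), log b ~ N(0, 1).
a_grid = np.linspace(-5, 5, 201)
lb_grid = np.linspace(-3, 3, 201)
A, LB = np.meshgrid(a_grid, lb_grid, indexing="ij")
B = np.exp(LB)

log_post = norm.logpdf(A, 0, 2) + norm.logpdf(LB, 0, 1)  # log-prior
for xi, ni, ti in zip(x, n_pat, n_tox):                  # + binomial log-likelihood
    p = 1.0 / (1.0 + np.exp(-(A + B * xi)))
    log_post += ti * np.log(p) + (ni - ti) * np.log1p(-p)

post = np.exp(log_post - log_post.max())
post /= post.sum()                                       # normalized grid posterior

# Posterior mean toxicity probability at each dose
p_post = []
for d, xi in zip(doses, x):
    p = 1.0 / (1.0 + np.exp(-(A + B * xi)))
    p_post.append(float(np.sum(post * p)))
    print(f"dose {d:4.1f} mg: posterior mean P(tox) = {p_post[-1]:.2f}")
```

Because the slope is kept positive through the log-normal prior, the fitted toxicity curve is monotone in dose, the property that makes this kind of model a workhorse for dose-escalation decisions.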
So I think pharma, pharma has to
solve problems and whatever helps, um,
Scott Berry: yeah.
Kaspar Rufibach: you
Scott Berry: Yeah.
So yeah.
Yeah.
Your description of assurance, calculating
predictive probability of success, the
synthesis of information across things,
was, was right out
of the Bayesian textbook.
So, so we'll welcome you in
anytime, uh, uh, you want to;
you, you speak the language.
So it's, so, it's interesting you
gave this description of, uh, uh,
trial level work where you're,
you're working within a, a, a team,
probably with more well-defined, um,
goals and needs of designing this
trial, carrying a drug through.
And then you described almost going into a
methodology sort of group, and maybe cross
teams, larger impact. Your, your career thus
far, uh, seems to be that you either get
excited about or get
pulled into leadership roles.
Um, and within this and, and the
methodology aspect of it, it's really
interesting, because I find sometimes, if
I sit down for six hours and I write,
uh, programs to calculate a model,
I feel like I made huge progress; but
when I do things that are more sort of
broad within it, it's hard to judge:
have I had an impact in that?
You, um, so you, you have these
very well-defined trial, uh, things
where you're making progress, and then
you have much more broad type impact,
trying to have an impact on statistics.
Do you personally find like you can
measure and have that impact
on a broader scale?
You enjoy that, it seems, at
least from your activities.
Kaspar Rufibach: That's an
interesting question you're asking.
I'm just thinking about
how I look at this,
um, and you said, I mean, I, I took
these leadership roles and that's,
maybe, that's maybe not even true.
I mean, I, uh, I just
fill gaps that I perceive.
For example, I, with a colleague
from Novartis, co-founded
the EFSPI/PSI special interest
group Estimands in Oncology.
And I mean, I gave a talk very early
in the estimand days, and then Evgeny
approached me and said, well, we are
discussing the same problems at Novartis.
Maybe we should, we should have a beer.
So we had a beer and then we
said, oh, maybe there are others.
So we put a call out and like one or two
years later we had like, I think about
eight subteams and a hundred members.
And uh, of course we started
it, so we were kind of supposed
to, to lead the whole thing.
And with the EFSPI methods
leaders, it was similar.
I, I always felt there was
an under-appreciation of
these more technical aspects.
There is an EFSPI leaders group; these
are the department leaders in pharma
companies, and they discuss very pertinent
aspects of, of that role, kind of
careers for, for statisticians, and, and
how should we interact, uh, among, uh,
ourselves, and, and how do we position
statistics departments in companies.
That's more than a
full-time job for everybody.
And then I felt maybe the, the methods
piece should deserve more attention.
And then again, I just, I mean,
I'm well connected in Europe, I
just called a couple of people
and said, well, what do you think?
Should we start something like that?
And then we, we just
started something like that.
So, and then of course, because I
initiated it, I now end up kind
of being perceived as the head, but
this is just a group of people having
regular discussions amongst each other
and trying to come up with an agenda,
which are the topics where everybody
profits if we tackle it together.
And yeah, it's, it's not like I'm,
I'm looking for this, these things,
I just try to, to solve problems
or, or to, to help things progress.
Um, I think that's my ambition.
Scott Berry: Yeah, Yeah,
that, that's great.
And, and additionally, you
are now, uh, finding yourself
solving problems with ICH E20,
which is the, uh, uh, ICH, uh,
guidance on adaptive designs.
Uh, you are now becoming a
deputy topic lead as part
of your EFPIA role in that.
So more leadership.
And I know, I know we don't want to
dive into that since this is, um,
um, your, you're working on this,
um, so finding many leadership
opportunities, which is great.
Now you're at Roche.
I think it was about
Kaspar Rufibach: Not anymore.
Scott Berry: No, no, I know you, I'm
Kaspar Rufibach: Ah.
Ah, sorry.
Scott Berry: So, so, yep.
So chronologically, uh, you're, you're
finding yourself solving lots of problems,
many of them becoming leadership roles
within EFSPI, uh, within EFPIA, a
number of these also at Roche.
Now you leave Roche about a year ago.
Um, and now tell us about
what you're doing now.
Kaspar Rufibach: So it's a, a
very similar role, but in a, in
a somewhat different environment.
It's Merck.
Uh, so the German Merck is a smaller
company, a smaller department,
uh, a smaller pipeline.
But one, one aspect that, that I found
very, uh, exciting about Merck is, uh, the
vision of, of the leader of the department,
basically, to try to integrate all
quantitative functions in one shop.
And of course, every company tries
to do that to certain extent,
but here it was, it it is really
an, uh, an explicit attempt.
To bring together quantitative
pharmacology, real world data statistics,
statistical programming, and then you
have data science and, and, uh, a group
that does more the platform stuff.
It's all in one department.
And with the hope that adding this
up is more than the sum of its parts.
Because, I mean, this is something
which always made me struggle:
for me, I mean, a lot of people
still make a distinction between a
statistician, an epidemiologist,
a real-world data scientist, and of
course they don't know all about everything.
But I mean, ultimately you have
data and, and, and, and you want
to make quantitative statements.
You want to quantify,
or assess, the biases.
You want to quantify uncertainty.
As somebody who was exposed to so
many different questions during those
four years of consulting for medical
doctors, from the first day when
I joined the industry,
this felt somewhat odd to me.
Um, and so for me, this, this trying to
converge this, and of course bringing all
the specialties and all the different
kinds of knowledge to the table.
I don't know as much about data quality
and registries and how to set these
things up as a trained epidemiologist.
But I, I don't agree when people
say, well, we are epidemiologists,
we do causal inference.
I have read quite a few books and
papers about causal inference lately.
So, uh, yeah, I, I can talk about
this, I teach about this, and,
and, and I would
expect an epidemiologist who works
in drug development that he or she
knows something about clinical trials.
And at some point it's just
different flavors of the same thing.
Um, so this is one thing that,
that I found very attractive.
Um, and then, yeah, uh, I have the,
the opportunity to work a hundred
percent from home, which at that time
was kind of something that I enjoy.
I live in a very, uh, very nice
area, uh, so that is also
something that allows me to kind
of bring a few aspects
of my life, uh, together.
That's also something that I found attractive.
Scott Berry: Oh, very nice.
It, it seems a common theme of yours to,
um, uh, collaborate, bring together
multiple groups, uh, uh, trying to,
uh, you know, raise the boat for
everybody, that everybody benefits
from this collaboration, and, uh,
silos within Merck, uh, trying
to get rid of that.
So that, that's fantastic.
I, I, I wonder if I could ask you a couple
things about your sort of viewpoints and,
and partly I'll ask you things that I'm
asked that I don't have a good answer for.
Um, so now, now you're
bringing together all these
quantitative groups within Merck.
Um, what, how are you positioning
Merck in this role with AI?
And I know AI, in and of itself,
is a, is a big group of things, from,
uh, you know, uh, language models
to, to programming, to other things.
How, how are you, you
know, dealing with AI?
Kaspar Rufibach: Um,
so first I look at it as, as, I
mean, as Bayes versus frequentist:
if it helps me solve a problem,
so be it.
That's fine.
Um, and I actually personally, because
I think this is our duty, our ambition
as well, I poke myself very, very
regularly: for all the things I do all
day long, could, I mean, we have
our internal version of ChatGPT,
could ChatGPT help here?
Scott Berry: Yeah.
Kaspar Rufibach: And what I find myself is that it rarely does. I mean, I have my style of presentations. When I write a talk, I sit down, I take a piece of paper, I sketch what I'm going to do. I tried it a couple of times, but there is no way AI can help me with that. It's my way of doing things. And, uh, so that's one thing for my daily work. And I mean, I don't care so much about an email. I am not a native English speaker, but that's fine. Maybe one or the other sentence might sound odd to a native English speaker; I'm fine with that. I don't need to check. I mean, I didn't do it before when it was not available, and I will not do it now either, so, um,
and also, I mean,
Scott Berry: So that's daily, that's daily tasks, and it's a little bit different. I agree with you on all of that. And in some ways, over, you know, 25 years, I've sort of become, and I have to be careful because I could say, pejoratively, set in my ways: I attack problems a certain way. And then other people come along. I can watch my son, who's a statistician, utilize these tools and make himself more productive, and maybe that's sort of his way. But what about getting away from the process part of this into the quantitative part of this, into modeling, into trial design, moving into the statistics realm? And I don't wanna say getting rid of statisticians, but doing some of those tasks.
Kaspar Rufibach: Um, maybe I don't know enough about it, but at Merck I have been involved, I mean, worked with the teams on doing a very comprehensive probability of success for a phase three, or kind of preparing an FDA meeting for a phase three, and then dealing with the feedback from FDA. And I mean, if you design a phase three, of course at some point you need a sample size and you have a couple of assumptions; maybe AI can do that. But that is maybe this much of the way until you then have a protocol that works. It is so much more, and I don't see it: there are so many discussions behind it, so many different things. You need to make trade-offs, you need buy-in from other functions, from stakeholders, until you really have a design. And then you go to the regulator, and they have their own ideas, and then you come back.
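The sample-size step Kaspar describes as the mechanical part of a phase three design really is a few lines of arithmetic. Here is a minimal sketch of the standard Schoenfeld approximation for the number of events a log-rank test needs; the hazard ratio, level, and power below are illustrative assumptions, not numbers from any actual trial:

```python
import math
from statistics import NormalDist

def required_events(hr, alpha=0.025, power=0.9, alloc=0.5):
    """Schoenfeld approximation: number of events a log-rank test needs
    to detect a hazard ratio `hr` at one-sided level `alpha` with the
    given power, with fraction `alloc` allocated to the experimental arm."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil((z_alpha + z_beta) ** 2
                     / (alloc * (1 - alloc) * math.log(hr) ** 2))

# Illustrative assumptions: HR 0.75, one-sided 2.5% level, 90% power, 1:1
print(required_events(0.75))  # 508 events
```

As he says, this much AI can certainly do; the weeks of trade-offs and stakeholder alignment that turn the number into a workable protocol are the hard part.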
I mean, my kind of prime example to illustrate the challenge I see with this: we now hear a lot of talk about how AI can write a statistical analysis plan. Well, that's maybe fine, okay. Or say a CSR, a clinical study report. You have all the outputs and you have the protocol, so it seems like an automated thing to write a clinical study report. Maybe, okay. But when I then think about how a team writes a clinical study report, it takes maybe, I don't know, say two months. Of these two months, in the old days, a medical writer would write an 80% draft in about a week, and then for seven weeks the team would discuss: what is the messaging, what do we want to say, oh, this should go here, and oh, this is not accurate, blah, blah. Seven weeks. Now with AI, maybe the first week you cut down to three hours, but I'm not sure the remaining seven weeks are affected by what AI does at all.
Maybe it is. But this is, and I sometimes feel this about those people who are trying to push AI into drug development: if you really think about what we are doing all day, it can make certain things more efficient. But the struggle of the pharma industry, and kind of the business model that somehow dries out, I don't think you can solve that by further optimizing processes. And I'm not sure AI, where it is today, can really get out of that corner of just optimizing one or the other process. But yeah, maybe I'm too old already and don't have enough imagination and enough experience; I don't exclude that. Um, but this is how I see things currently.
Scott Berry: No, I agree with you, and I worry that I become set in my ways, sort of thing. And I wanna learn, and I wanna understand this, but I agree with you as well. If you go into AI and say, give me a trial design, what it presents doesn't do a great job of understanding the full synthesis of what we know and what we are trying to answer. That aspect of it is the statistical intelligence that we started this with, that part of it, to some extent.
So, uh, so interesting. So let's maybe, with a few more minutes here: what are the things you're interested in now, going forward, the future of this? Now you have this huge leadership role in things, in tackling problems. What kinds of things excite you, maybe in the next five to 10 years, that aren't AI?
Kaspar Rufibach: I still, I mean, I think everybody working in this business has this folder on their desktop, papers to read. Uh, and you subscribe to these tables of contents and you find them all. And I mean, it keeps amazing me how interested I remain in just sitting down: oh, here is this thing this guy has been talking about, I didn't know what it exactly is, so let me read this paper. So this kind of curiosity, it doesn't stop for me.
Um, so currently, I mean, I really try to read up on causal inference, and it's now kind of entering drug development. I mean, yeah, if you look at the covariate guidance from the FDA, it has standardization and kind of a different way of estimating effects. And I think, in my role, I am the one in the company, together with people in my team, who needs to trailblaze that and bring the other statisticians up to speed and make them understand what changes and when we need to implement different methods.
So I think this kind of marrying of causal inference with the clinical trial world, which is a very disparate separation to start with, but that's just how things happened. And the ICH E9 addendum tried to kind of halfway fix this, but I think we need to bring these together. A randomized trial is a randomized trial on the first day, and then it starts to move in the direction of an observational study, because you have these, what we now call, intercurrent events, post-randomization things that happen, which you cannot just resort to randomization all the time to deal with. And I think we overlooked a lot in that direction in the past. So, kind of bringing the causal inference methodology to randomized trials. And yeah, maybe in certain instances it will not change much, but sometimes I think it can really help. Um, that's one thing.
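The standardization that the FDA covariate guidance describes can be illustrated in a few lines. This is a toy sketch, not any Merck or FDA workflow: with a single binary stratum and a saturated model, g-computation reduces to weighting stratum-specific risks by the pooled covariate distribution. The stratum names and event counts are made up for illustration:

```python
from collections import defaultdict

def standardized_risks(rows):
    """Nonparametric standardization (g-computation with a saturated model):
    estimate the marginal risk under each arm by averaging stratum-specific
    risks over the covariate distribution of the pooled trial population.
    `rows` holds (arm, stratum, outcome) tuples with arm, outcome in {0, 1}."""
    counts = defaultdict(lambda: [0, 0])  # (arm, stratum) -> [events, n]
    strata = defaultdict(int)             # stratum -> pooled n
    for arm, s, y in rows:
        counts[(arm, s)][0] += y
        counts[(arm, s)][1] += 1
        strata[s] += 1
    n_total = sum(strata.values())
    risks = {}
    for arm in (0, 1):
        # stratum-specific risk, weighted by the stratum's pooled share
        risks[arm] = sum(
            counts[(arm, s)][0] / counts[(arm, s)][1] * n / n_total
            for s, n in strata.items()
        )
    return risks[1], risks[0], risks[1] - risks[0]

# Toy data: 10 patients per arm per stratum, made-up event counts
rows = ([(1, "high-risk", 1)] * 5 + [(1, "high-risk", 0)] * 5
        + [(0, "high-risk", 1)] * 7 + [(0, "high-risk", 0)] * 3
        + [(1, "low-risk", 1)] * 1 + [(1, "low-risk", 0)] * 9
        + [(0, "low-risk", 1)] * 2 + [(0, "low-risk", 0)] * 8)
r1, r0, rd = standardized_risks(rows)
```

With continuous covariates the same idea uses a fitted regression model: predict every patient's outcome under each arm and average the predictions.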
And then, I mean, I remain interested in making trials more efficient. We did some research with multi-state models, and we continue to do that. I mean, again, this might seem a very old-fashioned, well-understood piece of survival analysis, and maybe it is, but it has not been used to the extent it can bring benefit to clinical trials. Maybe here I'm hiding behind my own technicalities and my own hobbies and interests, but I don't mind.
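For readers who have not met multi-state models: here is a minimal illness-death sketch with constant hazards, showing how PFS and OS fall out of one joint model. The transition rates are purely illustrative assumptions; nothing here comes from the paper Kaspar mentions:

```python
import random

def simulate_patient(rng, h_prog=0.10, h_death=0.02, h_death_after=0.15):
    """Illness-death multi-state model with constant (exponential) hazards,
    time in months. States: stable -> progressed -> dead, plus a direct
    stable -> dead transition. Returns (PFS time, OS time)."""
    t_prog = rng.expovariate(h_prog)     # stable -> progressed
    t_direct = rng.expovariate(h_death)  # stable -> dead, competing risk
    if t_direct < t_prog:
        return t_direct, t_direct        # died without progressing
    # after progression the death hazard changes
    return t_prog, t_prog + rng.expovariate(h_death_after)

rng = random.Random(7)
sims = [simulate_patient(rng) for _ in range(20000)]
mean_pfs = sum(p for p, _ in sims) / len(sims)
mean_os = sum(o for _, o in sims) / len(sims)
print(round(mean_pfs, 1), round(mean_os, 1))
```

The efficiency gains he alludes to come from modelling the progression and death transitions jointly, instead of treating PFS and OS as two unrelated endpoints; here, for instance, mean PFS is 1/(h_prog + h_death) in closed form, and OS inherits structure from the same transitions.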
I mean, we just worked up a paper together with academic collaborators; they worked up a new method in this context, and it comes out that you can maybe save about 5% of events for a trial when you do this correctly. So, I mean, this funds my salary over my entire career, if there's one clinical trial that saves 5% of events. And I think there are still many opportunities: covariate adjustment, or kind of this broad topic of how to best use prognostic information.
This is a huge topic: stratification, adjustment, super-covariates. There you land again, potentially, with AI, but maybe you can build a covariate based on another database and then use it in your trial. How can we make that happen? What do we need so that at some point maybe regulators are comfortable with these kinds of methods?
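The gain from covariate adjustment that Kaspar points at can be made concrete with a toy simulation. This is a sketch under made-up assumptions (a continuous outcome, one prognostic covariate with correlation rho, equal allocation), not any real trial:

```python
import random
import statistics

def estimator_sds(n=200, rho=0.7, effect=0.3, n_sims=500, seed=1):
    """Compare the SD of the unadjusted difference in means with the
    ANCOVA-style estimate that corrects for chance covariate imbalance.
    Outcome: y = effect * arm + rho * x + noise, so x is prognostic."""
    rng = random.Random(seed)
    unadj, adj = [], []
    for _ in range(n_sims):
        arm = [i % 2 for i in range(n)]
        x = [rng.gauss(0, 1) for _ in range(n)]
        y = [effect * a + rho * xi + rng.gauss(0, (1 - rho ** 2) ** 0.5)
             for a, xi in zip(arm, x)]
        m1 = statistics.mean(yi for yi, a in zip(y, arm) if a == 1)
        m0 = statistics.mean(yi for yi, a in zip(y, arm) if a == 0)
        x1 = statistics.mean(xi for xi, a in zip(x, arm) if a == 1)
        x0 = statistics.mean(xi for xi, a in zip(x, arm) if a == 0)
        # pooled within-arm slope of y on x (the ANCOVA slope)
        xc = [xi - (x1 if a else x0) for xi, a in zip(x, arm)]
        yc = [yi - (m1 if a else m0) for yi, a in zip(y, arm)]
        beta = sum(u * v for u, v in zip(xc, yc)) / sum(u * u for u in xc)
        unadj.append(m1 - m0)
        adj.append(m1 - m0 - beta * (x1 - x0))
    return statistics.stdev(unadj), statistics.stdev(adj)

sd_unadj, sd_adj = estimator_sds()
```

With rho = 0.7 the adjusted SD shrinks by roughly sqrt(1 - rho^2), about 0.71, i.e. the same precision from a noticeably smaller trial, which is exactly the kind of free efficiency he means by using prognostic information well.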
Uh, uh, yeah. It's not a very fancy, exciting list, but I still have fun understanding the technical details of these things and then making them digestible for my stats colleagues, maybe, and even a broader audience.
Scott Berry: Yeah, so, to be clear, that is a very exciting list, and I think the people who listen to this podcast find all of those things exciting. So, uh, your enthusiasm is contagious.
Uh, and welcome. I appreciate you coming on In the Interim. By the way, have you ever been a part of an interim analysis, a data safety monitoring board?
Kaspar Rufibach: Um, so the first trial I worked on at Roche, it was on purpose by leadership. They put me on that trial, I think, because there was some probability it would stop at an interim analysis for efficacy. So I shepherded it through two futilities and then prepared the efficacy interim. And in fact, it stopped. So then, based on that efficacy interim, we filed the trial.
Uh, yes. So my answer is yes, and it is very interesting. And, uh, I'm also a big proponent, I mean, at Roche and now at Merck, of futility interims. So I try to educate teams on that.
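A futility interim of the kind Kaspar champions is often operationalized via conditional power. Here is a minimal sketch under the standard Brownian-motion approximation; the drift value and the idea of a 10% futility threshold below are illustrative conventions, not Roche or Merck policy:

```python
from statistics import NormalDist

def conditional_power(z_interim, info_frac, drift, z_final=1.96):
    """Probability that the final Z-statistic exceeds `z_final`, given the
    interim Z-value at information fraction `info_frac`, assuming `drift`
    (the expected final Z under the design alternative). Uses the standard
    Brownian-motion approximation B(t) = Z(t) * sqrt(t)."""
    t = info_frac
    # increment B(1) - B(t) is Normal with mean drift*(1-t), variance 1-t
    remaining = NormalDist(mu=drift * (1 - t), sigma=(1 - t) ** 0.5)
    needed = z_final - z_interim * t ** 0.5  # distance left on the B scale
    return 1 - remaining.cdf(needed)

# Illustrative: a design powered at 90% has drift ~3.24; at half the
# information with Z = -1.0, conditional power is low, suggesting futility
cp = conditional_power(-1.0, 0.5, 3.24)
print(round(cp, 3))
```

In practice teams compare this against a pre-specified threshold (often in the 10% to 20% range), and may also compute it under the current trend rather than the design alternative.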
Uh, I have never been on a data monitoring committee as a member of industry; I think that's not obvious to be part of. Yeah.
Scott Berry: Right, right.
Wonderful.
So you were part of interims that ended with success, and I think this one was incredibly successful too. I enjoyed it, enjoyed having you on. Thank you, thank you, Kaspar, for joining us here in the interim.
Kaspar Rufibach: Yeah.
Thank you, Scott.
I enjoyed it as well.
Thank you very much.
Scott Berry: Wonderful.
So thank you everybody,
until the next interim.
Thanks.