Up One cuts through the noise of health and wellness trends. Hosted by Sawyer Stone and longevity scientist Dr. Bill Andrews, each episode unpacks peer-reviewed research on supplements, therapies, and diagnostics - no fluff, just facts. If you're serious about the science of aging and human performance, it's time to take it Up One.
Sawyer Stone: [00:00:00] Welcome to Up One, the podcast where we take a deep dive into the science behind supplements, therapies, and diagnostics. I'm Sawyer Stone, your guide through the maze of health claims, here to ask the big questions.
Dr. Bill Andrews: And I'm Dr. Bill Andrews. With decades of experience in medical research, I've dedicated my career to uncovering the real science behind disease, aging, and human health.
On this podcast, we don't just skim the surface of scientific studies. We conduct a critical meta-analysis, separating credible research from misleading conclusions.
Sawyer Stone: There's a lot of noise out there, conflicting studies, bold claims, and endless marketing. Up One is here to cut through it all and bring you science-backed insights that you can actually trust.
Dr. Bill Andrews: We're talking prevention, diagnostics, treatments, and the big questions shaping the future of health.
Sawyer Stone: If you're [00:01:00] serious about understanding the science of health and longevity without the fluff, let's take it up one.
Dr. Bill, how are you today?
Dr. Bill Andrews: Pretty good, and good morning to you. Actually, it's morning for me, but I think it's afternoon for you. So I'm doing good. And you?
Sawyer Stone: I'm doing pretty well. We've got some really lovely fall weather over here in North Carolina that's teasing us a little. Crisp, like upper fifties, lower sixties in the morning.
It's pretty great.
Dr. Bill Andrews: Yeah. Last year and the year before, it was in the hundreds here this time of year, but now it's gonna be a high of 70 today, which is pretty low for us.
Sawyer Stone: That's so much better, for sure. Well, Dr. Bill, I've been listening to our podcast episodes as they come out, just to get a good point of view about what's going on and how it all fits together.
But I feel like we could play a drinking game, [00:02:00] like take a shot every time I hear somebody say "critical meta-analysis." And I think we might be a little drunk if we did that, 'cause we are saying critical meta-analysis, critical meta-analysis, critical meta-analysis, like, a lot. And it's not the easiest phrase to say; it's pretty bulky.
But maybe we should just say CMA, even though that's not something people really say, C-M-A, CMA. Well, we should leave the drinking game to our listeners so we can stay cognizant of what's going on on the pod. But I have a couple of questions about critical meta-analysis for you, if you're up for answering them.
Dr. Bill Andrews: Yeah, go ahead.
Sawyer Stone: Alright. Why is a critical meta-analysis, or CMA, stronger than a standard meta-analysis, and what should we be looking for when one is done well versus poorly?
Dr. Bill Andrews: Well, typically you don't read the results of a critical [00:03:00] meta-analysis, so it's not really a question of what to look for.
It's the kind of thing you'd want to do yourself. But it requires a lot of skills in a lot of areas, so sometimes it's better to find somebody who's skilled at it and get them to do it for you. I can name about ten people I know of who are really super good at critical meta-analysis.
But the real problem is that even in the scientific peer-reviewed literature, there's a lot of, excuse the expression, crap that falls through the cracks, and you can't believe everything you read. And yet people do believe it; that's the problem. These authors are rhetors, and the definition of a rhetor is somebody skilled in the art of persuasion.
[00:04:00] So people can get publications published because they fool the reviewers, or sometimes the reviewers aren't qualified to do the review. And a lot of times papers are rejected because the reviewers don't know what they're doing and feel like they'll lose their job as a reviewer unless they find problems with something that doesn't have any.
There are all kinds of different things, but the problem is that you can find anything you wish to be true in the scientific peer-reviewed literature. I see a lot of companies marketing products, and they'll say, well, we need a scientific peer-reviewed study that we can say supports our product.
And it turns out you can find a way of getting almost anything to slip through the cracks. I've seen this happen a lot of times, especially in my own field [00:05:00] of telomere biology and telomerase. There's one case in particular; people can go to YouTube and look for videos of me speaking at medical conferences and sometimes to general public audiences.
I like to show a really good example of two different scientific peer-reviewed studies in the same journal; I think the year was 2000. The titles are essentially the same, and what they did was exactly the same, but the results are exactly opposite.
The studies were only two weeks apart from each other, in the same journal. I use this example because it's near and dear to my heart, and because of my expertise in the area of [00:06:00] telomeres and telomerase, I was able to go and do my own analysis.
Because I said to myself, they both can't be right. They used the exact same cell lines, they did the exact same assays, and so on. But when I did the critical meta-analysis, I went and actually looked at the data, and I found that one group of scientists totally misunderstood
how the protocols work. They were measuring telomerase activity and measuring telomere length, and, you know, I'm the inventor or co-inventor of almost every assay in that field, and I could tell that one group didn't know anything about telomeres.
They just bought kits from a vendor, ran the kits, got the data, thought they understood what it meant, and they got the wrong results. One study said [00:07:00] telomerase does not lengthen telomeres, and the other one said telomerase does lengthen telomeres. That was pretty much the gist of it. You'll see that I include those in the YouTube videos when I talk about this.
And it's definitely true that telomerase does lengthen telomeres, so that was the one that was right. I'm surprised the other one even got published, but it shows that the reviewers didn't know the subject matter either, and that's how it got approved. It's a real big problem. As I said, you can find anything in the scientific peer-reviewed literature that you wish to be true.
So if you're trying to market a product with a claim like "black grapes will make your hair grow," you can probably find support for it. Well, you probably can't find that one, because I don't think anybody's tried to market black grapes as a hair growth product. But...
Sawyer Stone: Like you were saying about the Diet Coke thing, [00:08:00] you were saying that you could not find evidence that aspartame was carcinogenic.
Dr. Bill Andrews: Yeah.
Sawyer Stone: And so we were talking about that.
Dr. Bill Andrews: Yeah. There's also the phosphorus that's in Diet Coke that's considered to be harmful to your health. But so far I have not been able to find any peer-reviewed studies, actually; let's say I can find peer-reviewed studies, but I can't find any studies that pass my critical meta-analysis.
I mean, they'll have a big publication, and sometimes they'll just reference some other study and then say, this study suggests that Diet Coke or ibuprofen or things like that are bad for you, when I really can't find that in either case. Those are my favorite examples, ibuprofen and Diet Coke.
Because every time somebody has a best-selling product, you're [00:09:00] gonna find all kinds of critics trying to discredit it, or dismiss its properties, just so that people buy their products instead. And that's a real problem.
And I don't know how to fix it. Another thing: even in the research labs, there's this publish-or-perish kind of phenomenon, where scientists publish even when they don't believe their own data, even when they can't reproduce their own data later.
And this is, again, a problem where you have to be skilled at critical meta-analysis to really figure out what's what. Some of the titles of my talks are things like "Separating Fantasy from Fact," because that's what I'm really good at: figuring out what's real and what's not.
Sawyer Stone: So if these critical meta-analyses are written by [00:10:00] the wrong people, they can be sort of false. How do we know, in your humble opinion, which of these critical meta-analyses of peer-reviewed studies, which of these studies, are trustworthy?
Dr. Bill Andrews: That's really, really hard. And that's why these people who are good in the art of persuasion, the rhetors I was mentioning before, right?
They can make it so that the study looks really, really good, and it takes somebody who really understands the field to be able to determine that. So the general audience can be fooled easily. This is a mission of mine, to really educate people on how to do this, but a lot of times it comes down to telling them to find somebody who's good at this and get them to advise you.
Because you have to know things like how to [00:11:00] design the experiment. Was the experimental design of the study done properly? Did they understand statistics and statistical theory? Did they really test enough samples? Did they analyze the data correctly?
And this is a big problem, because a lot of people branch out of their own field when they're publishing papers. They want to do a certain test, so they just buy a kit from some company and run it, and then they totally misinterpret the test, or they think the results mean a lot more than they really do.
There's also: do they come up with reasonable conclusions? Is there common sense and logic involved in those conclusions? There are videos where I actually give demonstrations of me doing critical [00:12:00] meta-analysis. I'll show a meta-analysis, the number of papers that show one thing and the number of papers that show the opposite, and then I go through and do my critical meta-analysis, and I can show how the numbers in one column start decreasing while the other column doesn't decrease.
And then you're left with the papers that pass the critical meta-analysis, and they all say the same thing. If people do a YouTube search for a talk I gave in Tokyo in 2017 to an audience of about 500 doctors, that's probably the best example of where people can find that.
But again, you have to be obsessed like I am with trying to...
Sawyer Stone: It's a special skill.
Dr. Bill Andrews: Yeah, trying to be really good at it. But people can learn it. In fact, all through my schooling I never pursued a PhD; they just handed me one, because all I was trying to do was learn everything I could [00:13:00] about
what is really healthy and what isn't, and what extends lifespan and what doesn't. So yeah, long answer, typical of what I usually do.
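[Editor's note: The column-by-column tally Dr. Andrews describes, counting papers for each conclusion and watching one column shrink as quality checks are applied, can be pictured with a small sketch. This is purely illustrative; the papers, criteria, and pass/fail flags below are invented placeholders, not anything from his actual analyses.]

```python
# Toy illustration of the "two columns" tally described above: count papers
# by conclusion, then re-count after applying quality checks.
# All entries are invented for illustration only.

papers = [
    # (conclusion, good_design, adequate_stats, sound_interpretation)
    ("lengthens telomeres",         True,  True,  True),
    ("lengthens telomeres",         True,  True,  True),
    ("does not lengthen telomeres", False, True,  True),   # flawed design
    ("does not lengthen telomeres", True,  False, True),   # too few samples
    ("lengthens telomeres",         True,  True,  False),  # over-interpreted
]

def tally(rows):
    counts = {}
    for conclusion, *_checks in rows:
        counts[conclusion] = counts.get(conclusion, 0) + 1
    return counts

print("before critical meta-analysis:", tally(papers))

# Keep only the papers that pass every check; one column shrinks,
# the other holds steady.
surviving = [p for p in papers if all(p[1:])]
print("after critical meta-analysis: ", tally(surviving))
```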
Sawyer Stone: That's okay, I like a long answer. Well, okay, so then if you were gonna design one of these studies and make it the perfect study, in your opinion, on some new anti-aging therapy, your favorite, what choices would you have to make about the design, like randomized trials versus observational trials? What would make or break the credibility of the results?
Dr. Bill Andrews: Well, first it would depend on a lot of different things. Like, is it a nutraceutical or a pharmaceutical or a diagnostic that's being tested?
Is it...
Sawyer Stone: You pick.
Dr. Bill Andrews: Yeah, okay. So [00:14:00] the easiest one is probably a pharmaceutical rather than a nutraceutical. So...
Sawyer Stone: Wait, while you're talking about it, can you tell us what the difference is between a nutraceutical and a pharmaceutical?
Dr. Bill Andrews: A nutraceutical is anything derived directly from plants without making any changes to it.
As soon as you isolate an ingredient, a chemical inside of a plant, and you modify it to make it better, it's no longer a nutraceutical; it becomes a pharmaceutical. So to maintain its status as a nutraceutical, it has to be something that's directly purified or extracted from plants.
And so when the FDA was first created, there were some rulings that forbid the FDA from putting regulations on nutraceuticals, plant extracts, because they got a [00:15:00] classification as generally regarded as safe. And that's because people have been taking some of these plant extracts for 10,000 years, and people haven't really noticed any dangerous side effects.
Let me just say that the problem with that is that there are a lot of plant extracts that do have a lot of side effects, and typically, not through evolution but over time, over the last several thousand years, we've just learned not to eat those plants. Because if somebody died suddenly from eating a plant, everybody else would say, well, I'm not gonna eat that plant.
But that doesn't cover the category of plant extracts that kill you slowly or cause damage slowly. If it's a buildup that's gonna kill [00:16:00] you twenty years later, nobody's gonna know that. And plus, humans aren't going to evolve a way to tolerate something like that, because once you've raised your young, there is no evolutionary advantage anymore to living longer.
So nutraceuticals could be a problem. But the reason I think it's easier to talk about pharmaceuticals than nutraceuticals is that you don't really need to do clinical studies of nutraceuticals. And plus, every time I hear of somebody doing a study on a nutraceutical, it takes so long and costs so much money that when they get done, they find out that the scientists in their company have already come up with the next generation, and now nobody's taking the product they just tested.
They take the new one, because it's so fast to get nutraceuticals onto the market. So is it a [00:17:00] transitional nutraceutical or is it a final nutraceutical? If it's a final nutraceutical, which is actually really rare, then it becomes something worthwhile doing a big clinical study on.
But every time I've heard of anybody doing a clinical study on a plant extract, it's pretty much worthless by the time the study gets done, because somebody's found something from another plant that's even better. So, pharmaceuticals. Okay, in the longevity field, there's the question of safety versus efficacy.
Is the product safe, versus does the product work? And then with pharmaceuticals there's also the question: is it an animal study or is it a human study? Because it's sometimes nearly impossible to do human studies. [00:18:00] Sometimes a person skilled in statistical theory will calculate, well, I'm gonna need 10,000 people for this study,
and they all have to be identical twins, randomized, twins that have grown up the same way their entire lives. So it's sometimes really hard to do clinical studies of pharmaceuticals in humans, but animal studies are easier. Okay, so then there's another question: is it my study or is it somebody else's study?
Because personally, I don't...
Sawyer Stone: It's your study, 'cause you're designing it.
Dr. Bill Andrews: Yeah, okay, good. I don't believe in mouse studies. In fact, I've invented a lot of cancer drugs; you hear about me in aging, but I actually have a strong background in cancer too. I was actually a National Inventor of the Year for my cancer research. I've developed a lot of cancer drugs that work beautifully in mice.
They fail tremendously in humans. [00:19:00] Again, there are YouTube videos where I talk about this, especially on a subject called Best Choice Medicine, where I talk about how animal studies are not necessarily translatable to human studies. I forget the exact numbers, but it's something like 5% of what works in animals actually works in humans.
It's a very low percentage, especially when it comes to things like cancer drugs. There are a lot of reasons for that, and I definitely don't have time to go into them right now. But I would do my animal studies on pygmy marmosets.
Sawyer Stone: Okay.
Dr. Bill Andrews: And that's because pygmy marmosets are pretty much the size of mice, so you can study a lot of them. I'm also very opposed to... so, animal studies are called preclinical studies. I would be [00:20:00] opposed to doing any preclinical studies that brought harm to the animals.
I'm very much an animal rights activist, which is probably...
Sawyer Stone: ...rare, as also being a scientist.
Dr. Bill Andrews: Actually, not true. I mean, at every company I've worked for that did animal studies, 90% of the scientists were appalled. It was the clinical affairs group doing all these things, and they would send them out to other companies, like Charles River, to do the studies,
to dissect the animals. I've worked in hospitals where I would walk up and down the hallways on floors where they were doing animal studies, and I was just appalled; it made me nauseous. So I believe that if we're gonna do a study on longevity in animals, the animals have to benefit from it.
A lot of the studies [00:21:00] that you do with animals supposedly require dissecting and things like that, but you don't really have to. The question is, if you're studying longevity: did the animals live longer? Did they live healthier? Period. That's all you really have to
measure. When I talk about my preclinical studies with pygmy marmosets, which I haven't done yet, I've already identified places in southern Colombia that would do it. I would have the pygmy marmosets all still in the wild, in a netted area, and just have tags on them,
and just have food for them and everything like that, protect them from predators, and just keep track of when one quits moving. Pretty much.
Sawyer Stone: With...
Dr. Bill Andrews: ...with the study.
Sawyer Stone: Yeah.
Dr. Bill Andrews: And that would be the way to do the study. And it wouldn't be because of the treatment; well, hopefully it wouldn't be because of the treatment.
But if suddenly the treated mice [00:22:00] or treated pygmy marmosets started not moving as fast, I would instantly quit using that treatment. Quantitative data is completely worthless without qualitative data, and that's why, in a lot of these studies, I'm a big believer in qualitative measures.
When you hear me talk at conferences, you'll hear me talk about the Betty White test, even in companies that I work with. The Betty White test: you can look at a picture of her at 25 and at 85, and you know which picture was taken first. Why? You can't say exactly. Is it quantitative?
No. Is it qualitative? Very much so. We actually use a term here called QQ scores, which means qualitatively quantitative.
Sawyer Stone: Mm.
Dr. Bill Andrews: So what we do is, if we're trying to find out whether a skincare product makes your skin look [00:23:00] better, we do measure wrinkles, and we do measure
blister formation and things like that. But more importantly, I think the most important test is to have people look at a computer screen and see two pictures, then say which picture looks younger and click that one. Then two other pictures: which picture looks younger? Click that one. Which picture looks younger?
Click that one. That kind of thing. And that's all qualitative, but I think it's so much more important than anything else, because when you buy a skin product, you don't want to know that your telomeres are getting longer, or your DNA methylation patterns are changing, or your IgGs are getting glycosylated in different ways, the different markers of aging and longevity that people use.
You want to know that your skin looks better, and you want to know that when you walk down the street, [00:24:00] people are thinking, gosh, she looks great. That's the most important thing of all. And so, yeah, I've written clinical protocols.
A lot of them. I have some that are like 200 pages long, for looking at various techniques, mostly pharmaceuticals, because I wouldn't spend that much money and time on a nutraceutical, especially when we're gonna have a better one coming along before the study would be done. But a lot of it is qualitative.
And in the clinical protocols, I identify exactly how to take a before-and-after photo: no makeup, similar lighting, do it in a dark room where the only lighting source is a light that can be kept consistent from time to time, no smiling, and if you insist on smiling, make certain it's the same smile each time.
Lots of different things, because the qualitative is far more important. But still, in designing the perfect experiment [00:25:00] and clinical study, let's say we've already been through animal studies and now we're doing humans, I would be doing a hundred different biomarkers.
Because I don't believe any biomarker alone tells you that you have reversed aging or affected aging, or even slowed it. Well, first of all, it's impossible to measure the slowing down of aging except in animal studies. You could do it if you had 10,000 pygmy marmosets in one group and 10,000 pygmy marmosets in another group.
You can measure the slowing down of aging there, but you just can't do it in humans. I mean...
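[Editor's note: The "which picture looks younger?" test described above is a two-alternative forced choice: raters see a subject's before and after photos side by side and click one. As a rough sketch of how those clicks could be turned into a qualitative score, with made-up vote counts rather than anything from his 200-page protocols:]

```python
# Minimal sketch of scoring a "which picture looks younger?" comparison.
# Each trial shows a subject's before and after photos in random order;
# the rater clicks the one that looks younger. Vote counts are invented.

votes = {
    # subject_id: (times the "after" photo was chosen, total comparisons)
    "subject_01": (41, 50),
    "subject_02": (27, 50),
    "subject_03": (48, 50),
}

for subject, (after_wins, total) in votes.items():
    share = after_wins / total
    # 50% means raters can't tell the photos apart; well above 50% means
    # the "after" photo is judged younger more often than chance.
    verdict = "judged younger after treatment" if share > 0.5 else "no clear difference"
    print(f"{subject}: {share:.0%} chose the after photo -> {verdict}")
```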
Sawyer Stone: Well, then how do we know that TLO Vital is working?
Dr. Bill Andrews: It's faith. Okay, it's faith. You have to go on faith regarding the [00:26:00] biology of the product. We do know that
lengthening telomeres does reverse aging in animal studies. We know that it reverses aging in human cells grown in a Petri dish. We know that it reverses aging in human skin grown on the back of a mouse. And we can even measure slowing of aging sometimes in those cases, if we do enough samples.
But to do the studies on humans, that would take an incredible amount of work. And so...
Sawyer Stone: And time.
Dr. Bill Andrews: I always say we resort to what are called phase four clinical studies. You hear all the time about phase one, phase two, and phase three clinical studies, but you rarely hear about phase four.
Phase four, and this can be done with nutraceuticals or [00:27:00] pharmaceuticals, is after the product has already been marketed. In the case of pharmaceuticals, it's by prescription only, so the doctors prescribe it to their patients, and when their patients come to them with adverse side effects, or serious adverse side effects, adverse events is what they're called, they'll record it and let the manufacturer know.
And that's why, when you buy a pharmaceutical, you see this list of all the different things that can happen; it comes from those doctors getting feedback from their patients. That's how a phase four clinical study is done. You can do that with nutraceuticals too, but then it becomes more anecdotal, because nobody's actually doing the measurements.
And to make it even more complicated, there isn't a [00:28:00] chemical or ingredient from a plant extract on the planet that somebody isn't sensitive to. Our immune systems are so different from each other. I often talk about the V, D, and J regions of our chromosomes, which are the regions that are shuffled to create our antibodies.
And everybody's are different, so there's always gonna be somebody who's sensitive to something. So it becomes a numbers game: if 0.1% of the people taking the product have a serious adverse effect, you just tell those people to quit taking it.
Nothing else you can do. So, in terms of testing a pharmaceutical, I would be looking at blood biomarkers, especially homocysteine and insulin and [00:29:00] a host of others; I think there are like 50 of them in my clinical protocols that I would be looking at, because a shift in just one means nothing.
Just affecting one means nothing; it just means you're affecting that one particular marker. You've got to see all the changes that are associated with aging. So, a lot of blood markers, and that includes telomere length. In my clinical protocols I use more than one method of measuring telomere length, because telomeres are so difficult to measure. DNA methylation changes are easier to measure,
so I include that, and also IgG glycosylation. Those are the three biggest ones. Well, there's also total RNA sequencing, which is super expensive, but it's the best way to test for epigenetics. People call DNA methylation "epigenetics" a lot, but it's actually not; it's an epigenetic mechanism.
It's a way of controlling epigenetics. But total RNA sequencing is the best way to measure epigenetics. And [00:30:00] epigenetics is the study of genes being turned on and off, because that's the biggest thing about aging: you have about 25,000 different genes in every cell of your body, except your red blood cells, and your appearance, your health, your ability to do things are all affected by the relative proportions of how much each of these genes is turned on or off.
Every gene has, like, a dimmer switch next to it that turns it up and turns it down. So that's epigenetics: the study of how much a gene is turned on and how much it's turned off.
Sawyer Stone: Is that like how some people get blue eyes and some people get brown eyes? It's either turned on or turned off?
Dr. Bill Andrews: Yeah, in those cases it's permanent, so it's not a dimmer switch; it's more of an on/off switch, and sometimes it's permanently on. But the genes that affect aging are all dimmer switches. [00:31:00] There are a lot of genes that are correlated with aging. Even back in the 1990s, when we were studying aging in things like skin grown on the back of a mouse, we would find, from essentially doing the equivalent of total RNA sequencing, which wasn't available yet, at least 50 different genes that changed with aging.
There are a few publications that I'm an author on that came out in the 1990s about these things. There are a lot of genes that change, and those genes turning off is what gives us our wrinkles and our hair color changes and the other things that are associated with aging.
We wanna measure all those things, and the best way to measure them is total RNA sequencing. Now, we also want to be testing physical things like balance. Balance is an extremely important thing. There's a Dr. Joe Raphael in New York who [00:32:00] just did a big publication about that, which I was very impressed with, about how balance may be one of the best markers of aging there is.
But there's also, let's see, vision, cognitive skills, hand grip. Hand grip is one of the most important ones. It's hard to do hand grip studies with animals, but there are ways of doing it, by making them climb up through a hole to get their food and things like that: how good are they at...
Sawyer Stone: ...doing that?
Dr. Bill Andrews: So I would be doing heart exams, cardiovascular exams, brain exams, carotid artery exams, all the different organs, and looking for what are called liver transaminases. The liver is one of the first things that fails when you get older, and when the liver starts failing, you [00:33:00] get a release of transaminases from the liver, and you can measure those.
Those are the things that are included in all the studies. But these clinical studies are super expensive, and we only do them when we actually have a final product; we're not gonna do them if we're still developing a new product. And a lot of times it's impossible to raise the money.
Sometimes it costs a million dollars to test one person after treatment. But the big problem, especially with pharmaceuticals, is that a lot of pharmaceutical companies cut corners and...
Sawyer Stone: Sure.
Dr. Bill Andrews: I've been...
Sawyer Stone: ...they're trying to do it the cheapest and fastest. Yeah.
Dr. Bill Andrews: There are at least six or seven different drugs that I've invented, well, a lot more than that, but six or seven that actually got approved through clinical studies.
But there are others I invented that didn't get approved, especially the cancer [00:34:00] ones. I was never actually part of the clinical studies; I just invented the product. But I would always attend the clinical affairs meetings, getting updates on the testing, and I was always appalled at how many things got dropped, either because it was too expensive or because somebody forgot to prepare for it.
All of a sudden, you're doing your clinical study, I call it the oops factor. A clinical study is being done, you've got 50 patients sitting in hospital beds with IVs in them and stuff like that, it's time to measure homocysteine levels, and all of a sudden the nurse comes up and says, oh, we forgot to order the kit.
That happens a lot, and so they say, okay, let's move forward anyway. So in my clinical protocols, I'm the orchestra [00:35:00] leader. I'm not an MD, I'm a PhD, but I teach continuing medical education all the time. And no offense to MDs;
they're so busy treating patients. Sometimes I have to refer to it as cattle herding: one patient out, next patient in. They don't have time to learn all this stuff, and when they do learn it, they learn it from sales reps who are obviously biased. So there are these conferences, continuing medical education, where the doctors go and actually listen to the lecturers, and I teach at these things all the time.
They listen, then they're given a test, I'll have written the questions for the test, and they have to pass it to get a certificate showing they participated, and they get to hang it on the wall. And now, all of a sudden, they have a new field they can work in.
Sawyer Stone: Randomized controlled trials are often called the gold standard of [00:36:00] trials. But when it comes to longevity research specifically, are there blind spots or limitations to a randomized controlled trial?
Dr. Bill Andrews: Well, they're really hard to do. But I do believe that the best kind of measurements come from a double-blind, placebo-controlled, randomized trial. The critical thing is that it's hard to randomize humans, because every human's lifestyle is different and things like that, but it's...
Sawyer Stone: ...already randomized.
Dr. Bill Andrews: Yeah.
And so, in order to be able to say that you have two groups that are pretty much identical in terms of certain traits, you have to have a lot of people in each group. In animals it's a lot easier to do. But I would never say a [00:37:00] double-blind, placebo-controlled, randomized study is a waste of time.
But it does come down to qualitative as well as quantitative. I try to include as much qualitative stuff as possible. There have been times when I've actually done it publicly, like placebo-controlled studies run right at a conference. I remember one in particular at an A4M conference: there was a booth where somebody had a product they were trying to promote based on energy levels.
And I do believe there are products that affect energy levels, but I also believe there are a lot of quacks and charlatans who take advantage of certain tricks to fool you into thinking their product does different things. So [00:38:00] there was this group that had a bracelet you put on your wrist, and they would show that if you don't have the bracelet on your wrist, it's easier to tip you over than if you do.
So I challenged them; I did a double-blind, placebo-controlled experiment. I first said to them, does it have to be on your wrist? And they said no. I said, okay, well, let's put it in a brown paper bag and just have the person hold onto it. And so we got some things that felt exactly the same as the wristband.
I had a circle of people standing around wanting to watch the study, and I had somebody else randomize it, randomize the bands. Then we gave several different people bags, and the people [00:39:00] selling the product would then try to tip them over.
Afterwards we opened the bags to find out who had what, and it turned out the bracelet was completely useless. As soon as we showed this, the person running that booth at the conference just started making a scene, calling me a fake and a fraud and things like that.
And I just had to walk away from it. But I think I made my point, and nobody was questioning it. That was the advantage of double-blind placebo control: nobody could accuse me of being biased in this study.
Sawyer Stone: Right. 'cause you were also blind to which one had the wristband.
Dr. Bill Andrews: But it's rare that you can do that. And in terms of quality, you know, being able to push somebody over, that's a qualitative thing
more than a quantitative one anyway. So it's hard to separate them, but qualitative is the most important thing.
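[Editor's note: The bracelet demonstration is, at its core, blinded random assignment followed by a simple comparison of outcomes. The sketch below only illustrates that logic; the live test used paper bags rather than software, and the names, group sizes, and outcomes here are hypothetical.]

```python
import random

# Sketch of the blinded bracelet test: identical-feeling bags, half holding
# the real bracelet and half a dummy, shuffled by a third party so neither
# the tester nor the vendor knows who holds which.

def assign_bags(n_participants, seed=None):
    rng = random.Random(seed)
    contents = ["bracelet", "dummy"] * (n_participants // 2)
    rng.shuffle(contents)                      # done by the third party
    return {f"person_{i}": item for i, item in enumerate(contents)}

assignments = assign_bags(10)

# The vendor tries to tip each person over and the result is recorded
# before anyone knows the bag contents. Outcomes here are random stand-ins.
tipped_over = {person: random.random() < 0.5 for person in assignments}

# Only after all outcomes are recorded do we unblind and compare groups.
for group in ("bracelet", "dummy"):
    members = [p for p, item in assignments.items() if item == group]
    rate = sum(tipped_over[p] for p in members) / len(members)
    print(f"{group}: tipped over {rate:.0%} of the time")
# If the bracelet did anything, its group should resist tipping noticeably
# better than the dummy group; in the demo he describes, it didn't.
```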
Sawyer Stone: Yeah. [00:40:00] Okay. So when researchers are pulling all of the evidence together, a critical meta-analysis isn't just pooling the results. How does it actually weigh the strengths and weaknesses of different studies to get a more trustworthy conclusion?
We sort of talked about this earlier, but what is the critical meta-analysis actually examining in a study to arrive at the best result and spit out what might be the most trustworthy conclusion?
Dr. Bill Andrews: Well, you don't actually do a critical meta-analysis during the study. Critical meta-analysis happens
after the study is published, or even not published, when somebody else reviews it. But if you're somebody who wants to do a really good study, you have to be skilled at doing critical meta-analysis so that you know your study will pass the test. That's why my clinical protocols are so expensive and time-consuming:
it's because I design all these things into them. [00:41:00] So you've got to know your animal. I remember in graduate school there was always this big emphasis that when you're gonna do any study on animals, you have to know all about that animal.
You have to know everything, in order to know what to look for. But it goes beyond that; you also have to know everything about the particular test you're doing. Let's say you're measuring telomere length: so many people do it wrong.
So many people misunderstand the limitations of the various protocols. That's why I recommend DNA methylation and IgG glycosylation instead: because telomere length is so hard to interpret, and it's also so easy to make the results be exactly what you want them to be. [00:42:00] Because...
Sawyer Stone: Right.
Dr. Bill Andrews: It is, I mean, because you're...
Sawyer Stone: ...running the study.
Dr. Bill Andrews: I wanna say in the 1990s, I actually gave a presentation where I showed how I could change the results. There were two different methods I was picking on: one was what's called PCR measurement of telomere length, and the other was Q-FISH measurement of telomere length. And I actually showed how I could make the results be anything I wished them to be,
just by slight changes in the salt concentrations or slight changes in temperatures. So it's all the more important to be double-blind and placebo-controlled, because everybody is affected by biases.
Sawyer Stone: Right.
Dr. Bill Andrews: Subconsciously or consciously, it's impossible to rule out. Even though I've been through so many clinical studies, I still find myself prone to possible biases.
A lot of times scientists [00:43:00] will say, well, shoot, this result doesn't make sense, I better repeat it. Well...
Sawyer Stone: Right.
Dr. Bill Andrews: That's exactly the big mistake that gets made: you keep repeating it until it makes sense, and then all of a sudden the result is biased. And not always because the person doing the study is trying to market a product.
Sometimes they're trying to market themselves. And I can tell you, there are scientists I know who have faked data just to increase their own popularity in the field. There's something called histrionic personality disorder. You know, I also have a degree in psychology, mostly because that was the best place to learn about statistical theory and statistics in general.
But histrionic personality disorder is a disorder where people have to have attention. They have to say, look what I've done, look at me. [00:44:00]
Sawyer Stone: I thought that was narcissism.
Dr. Bill Andrews: Yeah, pretty much the same thing. Narcissism is usually about "look how great I look" or something like that. But there are so many studies out there from scientists who feel like they're useless unless they can get something published that gets in the press and gets a lot of attention.
Sawyer Stone: Well, that must make them biased about their results and which studies they choose to do, if they're looking only for recognition.
Dr. Bill Andrews: Which actually brings up another problem: even when doing a meta-analysis, you don't get a paper published by trying to publish something that everybody else has already done
and shown. You only get it published if your study shows something different from everybody else's. So most of the new publications tend to be the ones that say something different. But what about all the ones that were identical? You [00:45:00] suddenly start ignoring those, and the next thing you know, there are press releases:
somebody has just proved that something we believed all along is not true.
Sawyer Stone: Right. They're wrong.
Dr. Bill Andrews: They're wrong. That's where my critical meta-analysis, the CMA, comes in, and I can convince myself at least. I will try to do this at medical conferences and things like that, but I've gotten a lot of cease-and-desist orders.
So that makes it really hard for me to do. I tend to leave it to, you know, good friends and family and things like that. Sometimes it's impossible, but I try not to name names in terms of products and things like that.
Sawyer Stone: Yeah. Well, okay, so you just said this: when we get a big headline.
Just this week, and I don't know when this is gonna come out, but just this week the New York Times reported that all of us who [00:46:00] have been paying attention to our astrological sign for, like, millennia probably have the wrong sun sign, because there was a 13th astrological sign once upon a time,
and also the sun's apparent position shifts over time, so the constellation that was behind the sun on your birthday changes, and all this stuff. And it disrupts this whole universe that's been created around the idea of astrology. All of these books that have been written, are they now wrong?
Is the New York Times just doing an op-ed, or is astrology so immeasurable that it comes down to opinion? So we get these breakthrough studies and these headlines, and we can't replicate them; you can't really replicate where the sun was on your birthday 20 years ago. In aging research, how common is that, and [00:47:00] what can a meta-analysis reveal about why replication sometimes falls apart?
Dr. Bill Andrews: Well, okay, so you said headlines about new discoveries and things like that, and that's where critical meta-analysis comes in really handy. Now, I have not seen this headline you just talked about, about changes in astrology, but I can say that I don't know if astrology works or not; I'm a big believer in the possibility that it does.
And I respect people's beliefs in astrology. But it would depend on what factors actually changed. Did the people who just published whatever it was you were talking about actually show that the sun really was in a different position? And has there been [00:48:00] peer review to prove they were right, and things like that,
before changing anything? There are all kinds of things that happen like that. So it comes down to: is it reproducible? The person who published that might just be somebody who suffers from histrionic personality disorder, or who's narcissistic, and they just needed to get a publication.
And it's not necessarily histrionic personality disorder, which I sometimes call HPD, or narcissism. Sorry, I'm tapping my fingers again. It's also the publish-or-perish phenomenon that exists in the scientific community, especially the academic community, where in order for people to get grants and other funding sources, they have to demonstrate a certain number of publications.
And so sometimes scientists will resort to [00:49:00] anything to get a publication. And again, it's hard to get a publication when you're just showing that everybody else is right; you've got to show that everybody else is wrong to get the publication. This happens a lot. So it'd be interesting if I were to do a critical meta-analysis of this astrological stuff; I'd be very curious what I'd end up finding.
Um, yeah.
Sawyer Stone: Yeah.
Dr. Bill Andrews: I mean, I've done things like that on other related questions, like: did Methuselah really live to be 900-plus years old? Things like that.
Sawyer Stone: Yeah.
Dr. Bill Andrews: And it's really impossible to tell, right? Because we weren't there. There's a lot of evidence that they were counting years by full moons, you know?
Which changes everything. And can we prove it? No. But okay, so, things that have made headlines that really turned out to be false: all this stuff [00:50:00] about coconut oil being a cure for Alzheimer's. Remember that?
Sawyer Stone: Oh my goodness, I have not heard that.
Dr. Bill Andrews: Oh, shoot, check back on this one. There was a woman, and I don't even think she was a doctor, but she claimed she had cured her husband of Alzheimer's with coconut oil, and she got all over the news. She wrote a bestselling book on using coconut oil for treating Alzheimer's. Trader Joe's, the grocery store,
started having mountains of jars of coconut oil. When you walked into the store, they were in your way; you had to walk around them. People were buying them like mad, because people believed this rhetor, this person skilled in the art of persuasion.
She never mentioned that her husband had died from Alzheimer's, [00:51:00] even before the book became a bestseller. But people started figuring out that they'd all been had, and now Trader Joe's doesn't market coconut oil like that. There was no big press release saying it was all fraud;
they just made it disappear. And this woman made a ton of money at everybody's expense. That's one of my favorite examples. Another example...
Sawyer Stone: It'll be a Netflix documentary in like five years.
Dr. Bill Andrews: Yeah, an Elizabeth Holmes type...
Sawyer Stone: Yeah, exactly.
Dr. Bill Andrews: ...Netflix thing, which I thought was a really great series.
Um, another good example of stuff that really got a lot of press was all these pharmaceutical companies coming out with these monoclonal antibodies that would attack beta amyloid. I know I'm using terms here that, so, so there was, so if you, if you were to look at [00:52:00] like monoclonal antibodies for treating Alzheimer's, again, it's another Alzheimer's thing.
Everybody thought that because of the fact that when somebody has Alzheimer's, they have this appearance of things in their brain called beta amyloid plaques. And so, okay. The question was, was it a result of Alzheimer's or was it a cause of Alzheimer's And people like me, I, I was doing research on Alzheimer's back in the 1990s.
Uh, and I, I pretty much was 99% sure that beta amyloid plaques and, and all the scientists I worked with, true, we were, we were sure that these were results of Alzheimer's, not a cause of Alzheimer's, but still a lot of people kept thinking this is what's causing Alzheimer's. So all these pharmaceutical companies, big, big race to come up with monoclonal antibodies, which is a kind of like immune, uh, [00:53:00] treatment that will attack the beta amyloid and cause it to disappear from the brain.
And a lot of us were thinking, what a waste of time; these companies should know better. But millions and billions of dollars were spent trying to do this, and five or ten different companies came out with these monoclonal antibodies all at the same time.
None of them worked. A lot of the scientists were just fuming mad because of all that waste of time and expense and marketing when it was pretty obvious it was never gonna work. Those kinds of things happen all the time. Oh yeah,
a new possible treatment for Alzheimer's is lithium. I just heard about it.
Sawyer Stone: Oh yeah, I've heard about this as well.
Dr. Bill Andrews: Lithium, but combined with a particular type of salt that allows it to get through the blood-brain barrier. [00:54:00] I forget the name of the salt.
Sawyer Stone: I don't know about lithium crossing your blood-brain barrier.
That feels like we're going back in time to, like, treatments at mental facilities back in the day.
Dr. Bill Andrews: Yeah, well, lithium has gotten in the news a lot lately because it's also being hyped as a telomerase inducer to lengthen telomeres. But it could all be false, and this is why I do critical meta-analysis:
because I'd like to follow up on these things. I'm very anxious to; probably as soon as I get done with this podcast, I'm gonna go do some critical meta-analysis of this new lithium version that somebody has published, because the other forms don't get through the blood-brain barrier, so they don't really reach the brain.
With this new one, you can use a thousand-fold lower dose and get it across the blood-brain barrier, and it's affecting Alzheimer's. But I'd like to see the data, because so far I've only seen the press releases. I think it was late last night that I saw it.
Sawyer Stone: Yeah. Well, this sort of brings me to my next [00:55:00] question.
You've kind of answered it, but I'm gonna ask the question anyway, and then you can go back and fill in the blanks. Can you give us a real-world example of where a supplement or a therapy looked exciting at first, like this lithium thing, but once the data was aggregated, the story completely changed?
I mean, I think you were saying that about the coconut oil, and maybe about this lithium, depending on what it turns out to be. It just seems crazy to me that we get all excited, we get it all out there, everybody's hopes go up, and then the scientific research isn't done properly and the expectation falls apart.
Dr. Bill Andrews: Well, the problem really is that this is very common in the nutraceutical field, because with nutraceuticals, scientists can take advantage of the generally-regarded-as-safe rule and still get publications. But I'm going blank; I mean, [00:56:00] there are lots of different things that have been in the press a lot lately, and the best way to find them is to ask: was it in the press,
and is it no longer in the press? That's the best way.
Sawyer Stone: Like they were talking about how you could get rid of all of your mouth cancer if you did oil pulling with, I think it was coconut oil, a couple of times a day, and never brushed your teeth again.
I mean, it's crazy the stuff that comes out that's like, this is the new best thing that's gonna change your life. Like, cottage cheese is on the biggest high of its whole life right now because people are so obsessed with protein, but in a few months cottage cheese will just go back to being what it is.
It's just these fads that go in and out.
Dr. Bill Andrews: There are so many things that come into that category; I can't remember them off the top of my head, but if you can think of any... I can tell you one: eggs. [00:57:00] I mean, look at eggs. All the press about eggs being bad for you because they give you cholesterol, and none of that was true, right?
The cholesterol in eggs gets totally broken down in your stomach and intestines and things like that, and it doesn't contribute to the cholesterol load in your body. Sugar is actually what does that. I mean, I think sugar is a permanent one; we're never, ever gonna say
lots of sugar is good for you. No.
Sawyer Stone: But...
Dr. Bill Andrews: ...this whole thing with eggs, look at the bad rap that eggs got for so long, and now...
Sawyer Stone: ...they really do, they're...
Dr. Bill Andrews: ...the greatest things there are. And part of it is because of the protein.
Sawyer Stone: Yeah.
Dr. Bill Andrews: One of the things about protein is that, well, okay, there are two different categories.
If you're just trying to get protein for the amino acids, to help build all the other proteins in your body, [00:58:00] and you can tell when I get excited about this, I start tapping my fingers, and I'm sure that's messing up the recording, then really, all proteins are pretty much the same.
All you've got to do is look at the amino acid composition, because you want a well-balanced amino acid distribution, and it turns out, from an evolutionary perspective, the best amino acid composition to allow us to live long enough to raise our young is pretty much the amino acid composition of meats.
But you can get the same thing from plants. That's why a lot of plant protein products are a mixture of different proteins: they're trying to make the amino acid composition equivalent to whey or meat or things like that.
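[Editor's note: The point about plant protein blends, mixing sources so the combined amino acid profile approaches that of whey or meat, amounts to a weighted average. The sketch below uses rough, made-up composition numbers purely to show the arithmetic; it is not real nutritional data.]

```python
# Weighted blend of two hypothetical plant proteins compared against a
# hypothetical whey-like reference profile (grams of amino acid per 100 g
# of protein). All numbers are illustrative placeholders only.

pea       = {"lysine": 7.2, "methionine": 1.0, "leucine": 8.4}
rice      = {"lysine": 3.0, "methionine": 2.5, "leucine": 8.2}
reference = {"lysine": 9.0, "methionine": 2.2, "leucine": 10.5}

def blend(a, b, weight_a):
    """Profile of a mix containing weight_a of protein a and the rest b."""
    return {aa: weight_a * a[aa] + (1 - weight_a) * b[aa] for aa in a}

mix = blend(pea, rice, weight_a=0.6)   # e.g. 60% pea, 40% rice
for aa in reference:
    print(f"{aa:10s} blend {mix[aa]:4.1f}  reference {reference[aa]:4.1f}")
```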
Sawyer Stone: Alright, I just have a couple more questions for you,
because I know you're booked and blessed. [00:59:00] Can you walk us through a specific example where researchers mistook correlation for causation in aging science, and what the consequences were for the field or even for the public?
Dr. Bill Andrews: A lot of them. I mean, all of them: DNA methylation, IgG glycosylation, even telomere length, they're all correlations.
None of them have been shown to be causative. I mean, telomeres, I think, come out the best at showing that telomere length, when telomeres get short, does cause the diseases of aging. But the jury is still out until we can start finding ways to lengthen telomeres in humans and show the results, because the scientific method involves certain tests.
You have to [01:00:00] first be able to reverse some effect that you see. So if you think something is a cause, you've got to reverse it and show that the effect goes away, and then you have to do something called extinction, where you allow it to go back to the way it was to show that it causes the disease again.
Then you've got to reverse it again. That's part of the scientific method protocol. But none of that's been done with any of the biomarkers of aging, because nobody's ever passed what I call the Betty White test, which would really be a cure for aging. Now, longevity can also be defined...
Sawyer Stone: So wait, sorry to interrupt you. You said earlier in the podcast that you can look at two pictures and know which one was taken before the other. Are you saying that to pass the Betty White test, you'd show someone the same two pictures and they'd say [01:01:00] the picture taken later was actually taken first? So you want the result to be reversed: the person really does look younger in the photo taken later in their life.
And that's what makes them pass the Betty White test, is that what you're saying?
Dr. Bill Andrews: Yes, but it wouldn't be just looks; it would be look, feel, and behave. So there would be several different studies, and they would be qualitative. But that's what I mean by it, because unless you measure something like 500 different biomarkers and show they all reversed in line with things that correlate with aging, you really can't say whether you reversed aging or just reversed the biomarker.
Now, longevity can also be defined as just living longer. An anti-cancer drug is a drug that increases longevity, because people live longer by taking it, but that's a whole different thing. Focusing on just [01:02:00] longevity as a result of aging, there's nothing that isn't a correlation.
There's no cause until somebody actually demonstrates reversal of aging by what I call the Betty White test. We don't have any measurement except correlation.
Sawyer Stone: Yeah, okay. Well, when you're planning a study, how do you determine the sweet spot for a sample size: enough people to see meaningful effects, but not so many that you start finding results that don't matter?
And when you're thinking about that, what's the biggest risk of a sample size being too small or too large when you're interpreting longevity science?
Dr. Bill Andrews: That's a major problem. It's like tossing a coin, or, well, I'll say a die. How many times do you have to toss it to get a six? A lot of scientists who don't have any [01:03:00] background in statistical theory will say six times. But to be 99% certain that you'll get at least one six, there's a formula you can calculate, and typically, when the numbers get big, it always comes out to about 4.6 times that number.
So you figure out the probability that an event will happen, say, that a particular drug will cure a certain disease, and you set a minimum on how many people you expect to see cured in the worst-case scenario.
Let's say you expect to see one in 20. Well, then you have to test 4.6 times that many people, and it turns out to always be about that number, [01:04:00] unless that number is really low. If you expect 50% of the people to respond, then you only have to multiply by 3.3.
These are numbers I've worked with so many times that I just remember them. But typically, when you get into things that have a low chance of success, say lower than one in 10, it's always 4.6. So to design a study well, if you think the worst case is parameter A, where one in A people will show a positive sign, and if they don't it's a waste of time to do the study, then you need 4.6 times A people in your study.
I just can't believe how many scientists who have worked for me will come give a report at one of our scientific staff meetings and say such-and-such doesn't work: "Look, I tested [01:05:00] it 10 times and got no hits." And everybody would be all over that person, because they tested about five times fewer samples than they should have. That's a really, really big problem. So many studies have been misleading because of that.
But again, it's hard to get the right number of people into a clinical study. So you have to do the animal studies, and a lot of times it's perfectly okay to do a clinical study with just one person, if you have good markers, especially since the cost of doing these studies is so high.
Then, after you do the one person, you do another person, and then another, and you build it as you go. You're not gonna be able to do double-blind, placebo-controlled at that point, but you have to have good markers to make certain. It's a tough question.
A lot of studies are just meaningless because they didn't [01:06:00] test enough samples.
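[Editor's note: here is a minimal sketch of the sample-size arithmetic Dr. Bill describes above, assuming his 99% certainty threshold. The helper name trials_needed is ours, not from the episode; the formula is the standard "at least one success" calculation, n = log(1 − confidence) / log(1 − p), with his die, one-in-20, and 50% examples plugged in.]

```python
import math

def trials_needed(p_success: float, confidence: float = 0.99) -> int:
    """Smallest n with P(at least one success in n independent trials) >= confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_success))

# Die example: a six comes up with probability 1/6 per roll. The naive answer is
# 6 rolls, but 99% certainty of seeing at least one six takes 26 rolls.
print(trials_needed(1 / 6))    # 26

# "One in 20" worst-case response rate: roughly 4.6 x 20 participants.
print(trials_needed(1 / 20))   # 90

# 50% response rate: the multiplier drops to roughly 3.3 x 2, rounded up.
print(trials_needed(1 / 2))    # 7
```

[For small response rates, −ln(0.01) ≈ 4.6, which is where the "always 4.6 times that number" rule of thumb comes from.]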
Sawyer Stone: Yeah. And I'm sure other things play a factor too. Like, you get a sample of 50 people and they're all white women, age 32, who grew up in North Carolina, and they still all have different genetic markers.
I don't have a gray hair, but I have plenty of peers my age who grew up in North Carolina in similar circumstances and do have gray hair, so our results are gonna be different. I'm sure that plays into it too, where you go, oh, this person isn't gonna give useful data because of this part of their genetic makeup.
Dr. Bill Andrews: People in the biological field do not get proper training in statistical analysis and statistical theory. That's why I'm so glad I got a degree in experimental psychology, because everything you were just describing is exactly what [01:07:00] experimental psychologists take into account and try to factor out as much as possible.
And sociological factors too. Even in psychological studies you have to account for demographics and things like that. That's extremely important, and it's something 95% of all biological scientists don't understand. So that's one of the reasons why, in my quest to learn everything I could about why we age, how we age, what aging is, and how not to age,
I chose experimental psychology as one of the fields I needed to really do well in.
Sawyer Stone: Yeah. Well, Dr. Bill, I know that you have got to pop off and do another podcast 'cause you're famous. So I just wanna say thank you for having this little chitchat with me today and I look forward to doing it again soon.
Dr. Bill Andrews: All right, well [01:08:00] thank you and, uh, see you next time.
Sawyer Stone: See you next time.
Thanks for joining us on Up One. If you found today's conversation valuable, be sure to subscribe and share this episode with someone who's curious about the real science behind health.
Dr. Bill Andrews: Have a topic you want us to break down? Send us your questions. We're here to help you separate fact from fiction.
Sawyer Stone: Until next time, stay curious, stay informed, and let's keep taking it up one.