Talking Biotech with Dr. Kevin Folta

In the age of an internet full of false information, how do we tell the real from the fictitious? Jon Guy introduces his new book, Think Straight: An Owner's Manual for the Mind.

Show Notes

We are bombarded by claims, and have instant access to more information than at any point in human history. How do we sort it out? What is real and what's not? Who do we trust? These are major questions today, and they affect everything from public health to the foundations of democracy in the USA. From the news to the internet to the dinner table, we are immersed in suspect information. How do we recognize and address conspiratorial thinking? What mistakes do we make when analyzing a problem? How can we recognize disinformation? Jon Guy has written a new book, Think Straight, that addresses these questions by providing a toolkit for dissecting information and claims. He covers a broad range of topics in what might be the most complete work on the subject of critical thinking and skepticism.

Pre-Order on Amazon Here 

What is Talking Biotech with Dr. Kevin Folta?

Talking Biotech is a weekly podcast that uncovers the stories, ideas and research of people at the frontier of biology and engineering.

Each episode explores how science and technology will transform agriculture, protect the environment, and feed 10 billion people by 2050.

Interviews are led by Dr. Kevin Folta, a professor of molecular biology and genomics.

346 - Jon Guy
===

Kevin Folta: [00:00:00] Hi everybody, and welcome to this week's podcast. Today we're going to delve back into the ideas of critical thinking and skepticism, because it's a critical component of how we look at and digest all of the science we explore on a weekly basis. And today we have a really cool guest. We're speaking with Jon Guy, and he is the author of the upcoming book

Think Straight: Critical Thinking for the Future. And when I go through this book and look at the table of contents and all the things that are within it, it is an extremely academic deep dive into science and skepticism, and something that we really needed. I almost feel like it's an owner's manual for the brain: if you are someone who's thinking about current issues and the mistakes that we make, this is a book to buy, and I'm really excited about this.

And so we're talking to Jon [00:01:00] Guy today. Jon, how are you doing?

Jon Guy: I'm doing great, Kevin. Thanks for having me.

Kevin Folta: Yeah. This is a lot of fun. We've been trying to do this for a very long time. Indeed.

Jon Guy: Yeah, we've been running into several scheduling conflicts over the last couple of months.

Kevin Folta: Yeah, but all good that we were able to finally do this. Because, you know, I've had folks on who deal in science and skepticism and this idea of critical thinking, because I want podcast listeners not just to be good with the facts and understand the guests and the new science; I want them to be able to apply it, and then think about it and appropriately criticize it. And so, you know, you're a perfect guest for this particular job.

So tell me a little bit about, you know, you being the author of this book. Tell me how this came to be something you needed to do.

Jon Guy: Well, first off I have to correct you: the subtitle of the book is actually An Owner's Manual for the Mind. I think what you had [00:02:00] was a draft copy, and I had changed the subtitle since then.

And the publisher went with An Owner's Manual for the Mind. So it's kind of cool that that's what you thought it was.

Kevin Folta: Well, I have both. I looked at that old one earlier today, and I have the other one too. The one is a little more complete. But I love the idea of the owner's manual for the mind, because I have owner's manuals for, you know, a 1972 Ford tractor.

I've got owner's manuals for pieces of equipment I own. And I don't have one for my brain, but this one fits.

Jon Guy: Yeah, well, that's perfect. I appreciate that. So I think what kind of inspired me to write it originally was I had watched Steven Novella's Great Course called Your Deceptive Mind.

And he covered a lot of ground in that course. And I thought, well, I know this topic pretty well. I should write something about this, maybe like a small curriculum or something, you know, for [00:03:00] educators to teach other people. And so I started writing it and realized pretty quickly that I hardly knew anything about the topic.

I guess that's kind of par for the course, right? So I just started reading everything I could get my hands on: Skeptical Inquirer, Skeptic magazine, everything by, you know, Dr. Shermer and Carl Sagan and all the big names in skepticism. And yeah, I just kind of thought that this is something that needs to be taught.

Like you were saying, understanding the facts is one thing, but understanding how to get to the facts is another. And this book is kind of an instruction manual for how to get to the facts, how to sort out what is real from what isn't.

Kevin Folta: And that's a really important part of this. I'm teaching a class on this in the fall, and I hope that your book is out by then so that I can weave this into the curriculum because there's so many good nuggets in this that I want students to be able to understand, [00:04:00] you know, exactly how to find good information and how to think about it critically.

How did I get on your radar, though? Because you had asked me to review the manuscript originally. How did that come about?

Jon Guy: So when I started doing actual research for this project, I think, to write this book, I probably read somewhere in the range of 250 to 300 books.

Who knows how many papers and articles. But one of those books I picked up was Pseudoscience: The Conspiracy Against Science. And you had a chapter in that book, I think it was called Food-o-science Pseudoscience. And I read that chapter and was blown away by, I mean, the whole thing: you know, the writing style, your background knowledge, just the way you presented the information.

And I thought, this is a guy who needs to be looking at my work and telling me if I know what I'm talking about.

Kevin Folta: That's pretty funny, because when I turned in the first [00:05:00] draft of that chapter to the editor, I apologized and said, this thing is the most dry, horrible thing I've ever written; would you mind if I punched it up a little bit and made it funny? And when I read it now, because I can't remember exactly what I wrote, I break out laughing at some of the stuff that's in there, because there are some real good nuggets in there.

But I'm glad that that was inspirational for you, because this book is, I think, 320-some pages. I mean, this thing is a monster, and it kind of covers everything. But when we start talking about the idea of science and skepticism, let's go back a little bit to the roots of this.

It's changed through time. How old are you now? 37? Oh, you're 37, you're young with this thing. I'm 55. I was born in the 1960s, and I went through the 1970s, when pseudoscience was all about Bigfoot. It was about the Bermuda Triangle. [00:06:00] It was about Area 51. It was about Loch Ness monsters. And we had to sit and critically evaluate these claims.

And I was always the wet blanket in my neighborhood who told the other kids they were nuts. But now we have really serious political and social issues: whether a pandemic is a hoax, whether vaccines will kill you, whether or not an election was stolen. These are things where modern-day skepticism and modern-day critical evaluation of evidence really have tremendous impacts.

So what caused that shift, and why are critical thinking and skepticism more important today than ever?

Jon Guy: Well, I don't know exactly what caused the shift. There are a lot of ideas out there about what has caused it, right? You mentioned, you know, the UFOs and the Bigfoot and all that.

And I mean, if you go back even [00:07:00] further, we get to, you know, succubi and werewolves. And I think there's a tendency of humans toward this sort of magical thinking, right? It's called magical ideation. And it's basically our tendency to think that things have, you know, some sort of magical presence, or

you know, there's some energy there in the universe that controls everything. And that also leads into, you know, another form of ideation called conspiracy ideation. And these things are deeply rooted within our evolutionary psychology. We have, you know, reasons to think that there are patterns there, because most of the time, when we are able to identify a pattern,

we benefit from that, right? And then these days people have tons of access to information, but they also have tons of access to misinformation. And the way that big corporations like [00:08:00] Google and Facebook and Twitter feed us this information insulates us into information bubbles.

And we can look at some, you know, terribly misleading information and treat it as if it's evidence to support some crazy idea, like, you know, vaccines are poisonous or the election was stolen, as you alluded to.

Kevin Folta: And that's a really important point: so many people are willing to abandon critical evaluation of a claim and accept it if it comes from a trusted source.

And so this is the big deal, whether you're talking about elections or vaccines: how do we begin to change the minds of people? And this goes back to your book. I mean, how do I make them read it, right? I guess, what is the way for those of us who think critically about these [00:09:00] issues to try to begin to influence others, to share our enthusiasm for evidence?

Jon Guy: Yeah, that's a tough question. In my personal opinion, and there's tons of research that supports this, critical thinking and philosophy for children need to be taught from, you know, grade school on. There are ways to teach young children critical thinking skills in a way that's fun and exciting for them.

There's a way to teach middle school and high school kids critical thinking in a way that's enjoyable and has long-lasting effects. So I think starting with the education system would be, I mean, the best thing we could do is try to implement it into early education, so people start thinking critically, you know, from a young age.

And you mentioned that people kind of abandon their critical thought if they find some information that supports a source that they trust. But I don't think that's exactly it. I [00:10:00] think that people aren't taught critical thinking from the beginning, so they don't have the background knowledge or the skill set to evaluate information properly.

So getting that more deeply ingrained into the school system, I think, is a much more effective approach than trying to throw facts at people.

Kevin Folta: Oh yeah, I agree with you a thousand percent. Those are my biggest science communication failures personally. And even now, teaching, and I've been teaching in the university system for 20 years, or even longer if you count time as a grad student, for me it was always, let me bury you in the facts of the discipline. But now it's much more,

I give you some information, and then I test on it: you go out to the field, and here's what you see. What do you think you should do? And my students freaking hate it, because they'll say, can't you just give me a multiple choice exam? [00:11:00] I am asking them to put together the data and then make a decision and then write an essay about it.

And for me, the worst part is they'll say, well, what's the right answer? And I'll say, I don't know, there's no right answer. I just wanted to see what you would think. And, you know, from the class compilation of answers, I found a few that were really good. So this is now the way we think about critical thinking and analysis and synthesis of data.

And I agree with you. I mean, all the information in the world is at our fingertips in terms of the facts and the evidence. So what it boils down to, maybe, is how do we get people to understand what is reliable evidence versus BS?

Jon Guy: Yeah, I agree completely. And I think the evolution of critical thinking and skepticism is really leaning towards that.

If you go back 20, 30 years, to, you know, when [00:12:00] Carl Sagan was publishing The Demon-Haunted World, when you get back into that era, critical thinking, or scientific skepticism rather, was mostly about scientific literacy. That's kind of evolved, because scientific literacy isn't all there is to critical thinking.

You have to understand a myriad of topics in order to kind of grasp the subject matter, right? So, like, these days people are weaving in, you know, the latest studies from psychology, the latest studies from neuroscience, media literacy, information literacy, you know, the strictures of logic, the foibles of memory. All these things are part and parcel of critical thinking.

I mean, a good critical thinker should have some background knowledge in, you know, the workings of memory, should have some sort of basic understanding of logic and [00:13:00] argument, should be familiar with, you know, a couple dozen logical fallacies and cognitive biases, and with what's going on in the current state of psychology.

Why do we do the things we do, and how do our brains trick us into believing that there's some reality that's not really there?

Kevin Folta: That's perfect, and a very good framing of how we look at issues, from, you know, politics to science: we have silos of people that are gathering around current media,

not necessarily trained in media literacy, or being in a certain media ecosystem not to question the media but rather to reinforce what they already know. And so how much does that kind of bias play into our ability to think about issues critically?

Jon Guy: I mean, a hundred percent, maybe. Yeah, we have a long list of cognitive biases that are going to [00:14:00] make us do that.

Right. There's, you know, the confirmation bias, which I called in the book the mother of all biases, because, I mean, it's the single most influential driving force of our cognition. Because that's exactly what we do: we look for information that confirms what we already believe and what we already think.

And we suppress and ignore information that is discordant with what we believe. So if we have some news source telling us that, you know, the vaccines are dangerous or that Donald Trump is the one true president, I mean, if we trust these sources, we can look to that and say, see, look, you know, this trusted source said it.

And the underlying problem there is, what is that source doing to convince you, right? What is that source doing to make you think that they're trustworthy? Is it because, you know, they have similar religious beliefs as you, similar political ideologies, the [00:15:00] same social circles? All these things can influence how we think about any given piece of information.

So if we break that information down and look at it critically, we can say: oh, is the argument sound? Is it valid? Are there any logical fallacies involved in this? What are the interests of this person? Could those be influencing how they're making the argument to me?

What are my influences? How could I be interpreting that based on what I already think and believe? So critical thinking is like slowing down that whole process of kind of veridically accepting information, and giving it a second look with a set of skills that allows you to evaluate that information.

Kevin Folta: You just used a word that I had to look up that was in your book.

What does veridically mean?

Jon Guy: Oh man, I wish you hadn't asked me that. One of the guys who helped me write the book was a [00:16:00] professor of English and linguistics for like 30 years, and we went back and forth on this term. And veridicality means something different in psychology than it means in philosophy.

And my definition doesn't really fall in either category, but it's close enough. Veridical typically just has to do with something's truth value, right? It has to do with truth values, or how you think about information. So in my book, I define veridicality as kind of the blasé acceptance of information.

Kevin Folta: Veracity veracity

Jon Guy: is similar. Yeah. We're asking similar veracity is, you know, what is its true value? And I actually used a metaphor in the book to further explain what I mean by vertical, just to kind of paint a picture. And so basically what I did in the book is. I use vending machines as an example.

So a vending machine has [00:17:00] programming instructions to be able to tell whether or not legal tender is in fact legal or counterfeit. So if you put a bill into the machine, it accepts it no matter what; it'll pull it into the machine, right? It's veridically accepting that bill. Now, it isn't until something is obviously wrong, right,

that that bill is rejected. And so when we use this as a metaphor for veridically accepting information, we can look at it like this: we veridically accept information because of all the biases and cognitive foibles that we have. We accept information veridically, and it isn't until something is obviously wrong with that information that we start to question it.

So with the vending machine metaphor, we all know that if you take, you know, a good dollar bill, legal tender, that's rejected, we can smooth it out a bit and feed it back into [00:18:00] the machine, and sometimes it'll take it, right? And to use that analogy, misinformation and disinformation can be smoothed out by, you know, these influence technicians and fed back to us in a way that makes us think it's real, because they have really sophisticated ways of doing that.

So kind of one of the biggest points in my book is to try to understand when these influence technicians are smoothing out that bill.

Kevin Folta: That's really cool. That's a great analogy, because it really is a case of, once we have a nugget of information, what are the processes we use to analyze that bill?

And what are the ways that we look for the flaws that indicate or flag that it's counterfeit? And this is where we really start to lose our lens because of media and politics, and it's an unfortunate residue of the times that [00:19:00] we're in. And this is where training in skepticism is so valuable, as is, you know, graduate education.

One of the things we learn as PhD students, and are tested for, usually in our thesis defenses and our, you know, candidacy defenses, is how much are you willing to be wrong? And I'll tell you, the best feeling in the world is saying, I have no idea, I don't know. Because there are so many people that dig in their heels on information that they will stand by and defend.

Or, as I've seen today, people who are actively fooled and deceived by others and accept it, even though they know they're being deceived.

Jon Guy: Yeah, I agree. And one of the things I said in the book is that certainty is the enemy of discovery. We only learn things when we are wrong or when we don't know.

So it's out of, you know, ignorance or wrong-headedness that we actually learn something. And if [00:20:00] you prefer to stay in your little bubble of information and believe what you want to believe, then that's your business. But when that business starts affecting other people, you know, that's when I, myself, and other science communicators get involved, because we feel like it's kind of a moral responsibility to correct that information, because it does have an impact.

If you think water fluoridation is terrible, if you think climate change is a hoax, if you think vaccines are dangerous, these things have, you know, public health impacts beyond personal beliefs. And when you take those personal beliefs and you put them out there into society, especially on social media, where you might have an influence, that can have some devastating effects.

Kevin Folta: No, that's perfect. We'll take a break here. We're speaking with Jon Guy. He's the author of Think Straight, and it's an owner's mind, or an owner's [00:21:00] manual for the brain, or... what's the new tagline? I didn't get it right.

Jon Guy: You'll get it one of these times. An Owner's Manual for the Mind.

Kevin Folta: An owner's manual for the mind, which I absolutely adore. This is the Talking Biotech podcast by Colabra,

And we'll be back in just a moment.

And now we're back on Colabra's Talking Biotech podcast. We're speaking with Jon Guy, the author of Think Straight: An Owner's Manual for the Mind. And I was really impressed by the depth and scope of this work. It's well-written, well-researched, and has a lot of historical references.

You can read everything from, you know, Socrates on down in here. Very, very well done, and something that I think is an excellent addition to any skeptical [00:22:00] book collection. There you go, I put you up there with James Randi and Carl Sagan. I hope you don't mind.

Jon Guy: No, I love it.

Kevin Folta: Keep it from going to your head, you know, don't go increasing your hat band, and let's keep going. So the other big thing is, you mentioned this idea of memory, and that's one thing you have in your book that isn't necessarily an element of most other guides to skepticism. So if we talk about that for a second: what role does memory play in our perceptions?

Jon Guy: That's a good question. So people typically think that memory works like a recording device. You know, we form a memory, and then when we want to recall that memory, we basically go get it from the files and hit play. And that's not how it works at all. Memory is a very, very complicated part of our cognition, and it's wrong a lot of the time.

You know, one of the things that I stress in the book is [00:23:00] that memory is a cognitive process. And if we've learned anything over the years, it's that most cognitive processes are faulty to some degree, right? And memory is no exception. So when you form a memory, and there are a lot of different theories about how memories are formed and how they're stored, but basically when you form a memory, it's...

How do I articulate it properly?

Kevin Folta: Well, maybe it's a manifestation of our biases, right? That when we perceive something, we're now putting it through a filter, where our cognitive biases shape what remains.

Jon Guy: Yeah, that's exactly right. I mean, there's research on memory that shows that eyewitnesses are better at identifying people who are their same race, their same age, their same gender.

And, you know, those are huge biases right there. If I witnessed somebody who's Asian [00:24:00] robbing a store, I'm going to have a much harder time identifying that person than I would a white male, because I'm a white male and it's easier for me to identify them. And, you know, memory is clogged with all these different, you could call them biases.

I think we can just call them flaws. But like you said, when we form a memory, we're not just forming the memory of something that happened. We're forming the memory of what happened in addition to our internal narrative of reality. So everything that we think and believe about something is getting stored with our memories, because memories are associative.

They are associated with everything else in our brain. So when we form a memory, we're not just stamping the memory of the event; we're stamping the memory of the event plus everything that happened before it. And that all plays a part in how we remembered it in the first place, how we recall it [00:25:00] later on, and how it changes over time.

Kevin Folta: Yeah. But it's all part of this idea of self-deception: how we make mistakes by believing our own interpretations at times and not considering them critically. And, you know, there are so many examples of this these days, and it drives me crazy, because I think that I've got a pretty clear lens as someone who's trained as a scientist who comes into a conversation or discussion.

But I probably have it wrong, you know. I mean, I really love it, because I really understand that I know a lot about a tiny little sliver of the universe, and I'm free to admit that for everything else, you know, everyone else is the expert. And I love experts. I defer to experts. I freely admit having the humility to say, I don't know very much about anything.

So when I am [00:26:00] encountering a situation, yeah, I really don't have a good monitor for self-deception. Yet I see so many other people who are readily fooled just because of something shiny that happens to match what they believe. And so how much of an issue is that issue of self-deception, and how do we guard against it?

Jon Guy: Yeah, I think that's the low-hanging fruit, you know. When we see something that is in line with what we already believe, we want that to be true, because we tend to have a kind of ownership of our beliefs. Like, they're ours, you know? I have my opinion, I have my beliefs.

We're really possessive about them, and if something's challenging that, we kind of feel like they're challenging us. And I think the whole idea of going into something as if you don't know anything is an extremely respectable approach, and I really [00:27:00] admire that you take that approach. I've taken a similar approach in my arguments on social media, as most of us have.

Right. I tend to only get into discussions about things where I do know what I'm talking about. And I remember a friend of mine making a comment, like, well, you know, this is just what you think; other people think different things. And I responded that I think what you're doing is confusing what I think with what I know. And I know a lot about a little, by virtue of having researched this book and having, you know, read hundreds of books on the topic and tens of thousands of articles. That's how you gain knowledge. And a lot of people don't have expertise in anything.

But they feel like they're free to weigh in on everything. And that's just not the case. Like you said, we need to defer to experts [00:28:00] and stay in our lane, because outside of that lane, we really don't know.

Kevin Folta: But that's the hard part: how do you identify who an expert really is? And I'm going through this right now with some things that I'm studying. You know, is this real advice?

I have someone I'm interviewing next week for the Science Facts and Fallacies podcast. It's someone I've interviewed before, someone I really appreciate. I love his work, but he's making some claims that I don't know that I'm really able to judge, or that I'm the person to really take apart critically,

like I would be in a discussion on molecular biology. This is diet and other things like that. So, you know, it makes me a little bit uncomfortable to be the interrogator, because I really have to say, I'm not qualified to judge this at that level. But what I am saying is, here's what doesn't make sense to me.

And, you know, with full humility, saying I'm not an expert in this [00:29:00] area. And I think if all of us went into these conversations saying, let's talk about the evidence and come to a common plateau of truth, or at least head in that direction, we'd be much better off than with these, you know, knock-down, drag-out arguments about why you're wrong and why I'm right.

Jon Guy: Yeah, I agree completely. One of the things that was really moving for me on my journey to skepticism and critical thinking was a story in one of Richard Dawkins' books. I don't remember the details verbatim, but somebody had come into one of his college classrooms and given this big, long lecture about some aspect of biology.

The lecture conflicted with what their professor had been teaching them, and at the end of the lecture, his teacher walked up to the gentleman, shook his hand, and said, thank you, I've [00:30:00] been wrong all these years. And that level of intellectual humility was really moving for me, because, like, nobody can do that.

It's so hard for us to do that, because of our possessiveness with our beliefs. But back to your question: how do we know who's an expert? I actually covered that quite a bit in the book. You know, what is an expert? How did they get their status? And I think it's not one thing, right?

For one, you have to have current expertise. If you're an expert in molecular biology from 1960, your expertise is not current, so what you knew then is going to be vastly different from what we know about molecular biology now. You should also have, you know, a specialized field of study, because we're not all polymaths; there are very few polymaths.

And when you have a specific or specialized area of expertise that you've spent years understanding, reading [00:31:00] about, writing about, and researching, then that gives you kind of the imprimatur of all the information that came before you, right? I think you should also have experience

being right. You should have kind of a track record of not being wrong about things in your field. So if you look at it the other way: if somebody is an expert in, you know, geology and continually publishes research that's just wrong, why should we believe that person? We shouldn't, because their evidence and their research are not valid.

They don't comport with current theories in geology.

Kevin Folta: No, that's a really good point, because people who are always on the fringe tend to go away or not be taken seriously unless they have legitimate evidence. And as scientists, we love game-changers. Over the years, people have [00:32:00] argued with me about the safety of genetically engineered crops and all this stuff. They'll say, well, you just go along with the company line, because you just support genetic engineering no matter what.

I say, you know what, if there's one lab on this planet that breaks the story that says it's dangerous, I hope it's mine. Because I'll tell you what, it's easy to publish another paper in a low-impact journal, a paper that never gets cited, that says, yeah, this is safe stuff. But you give me the opportunity to show that there's some facet of danger in basically 70% of the products in the grocery store

That is unique because we've added a gene to the crop. You know what? I would have grants for the rest of my life and probably get a Nobel prize. Yeah. I, I,

Jon Guy: Yeah, that's one of the things I mention in the book: scientists get their status by making novel discoveries. If you made this novel discovery, I mean, you would be putting [00:33:00] decades of research into genetically engineered crops, or one particular field of it, into serious question.

And if you can show that the 400 scientists who came before you were all wrong, and demonstrate that empirically, of course you're going to have tenure. Of course you're going to have grant money. Of course everybody's going to want you doing research for them, because you had the smarts and the wherewithal to figure out what other people couldn't.

Kevin Folta: And that's exactly it. But there's a sort of inertia: when you're trying to change what four people had demonstrated before, versus 40, versus 400, versus 4,000, we get to a point, when there is a consensus of 40,000 or 400,000 scientists, where it really is hard to shift, because you would have to have such extraordinary evidence to overwhelm everything else. That's what really turns [00:34:00] a hypothesis into a theory, and it shows that we've made that transition.

And this is where I'm blown away, and it's a topic you cover in the book: I'm blown away by the confidence, or at least the expression of confidence, of the people who wish to fight science. I mean, I went through it today over and over again with GMO-free Florida, where basically all they say is that you're a shill spreading misinformation.

You know, these folks are out there just to cloud and obfuscate the scientific landscape, and they add nothing to a conversation. But where does it come from? What's their motivation for lying to the public about, you know, vaccines or genetic engineering or, you know, flat earth?

Jon Guy: Yeah, I mean, there are going to be tons of motivations behind that, depending on who's doing the talking, whether it's Fox News or CNN or former president Trump or, you know, current president [00:35:00] Biden, or your friend on the street.

The motivation behind it is going to vary drastically. In the anti-vaccine movement especially, there's big money, particularly if you're a big name in the anti-vaccine movement. RFK Jr., I think he went from like a hundred thousand followers pre-pandemic to like 1.3 million before he got his account shut down. And these people make a profit off of selling this information.

They don't have their names in respected, high-impact, peer-reviewed journals. That's not where they're publishing their information, and that's not how they're getting, you know, respect from their peers. They're not doing any of that. They're making their money from social media advertisement revenue, selling books, making appearances.

There's big money in that. Look at, you know, how many people knew who John Ioannidis was three years [00:36:00] ago compared to now. How many people knew who Mike Yeadon was, or Dr. Robert Malone? People didn't know these names. They didn't know who they were, and now everybody knows their names.

Everybody who's involved in the conversation knows, because these are the big names that come out on Joe Rogan or Fox News or, you know, wherever. And people tend to think that these are the people who have the wherewithal to come out and tell everybody what the real truth is. But they're not doing that by scientific methodology.

They're doing science by press release, basically.

Kevin Folta: And how much of this is dependent upon this issue of memory? Not the biases that come into shaping a memory, but just us forgetting. It seems like if something isn't in the headlines and in our face, we seem to forgive and forget egregious false information.

I think the best example in my mind is the lumpy rats of Séralini. In [00:37:00] 2012, the day that paper came out it was dead on arrival; people criticized it appropriately and showed its flaws. And yet it changed everything. There were governments that canceled their programs in genetic engineering based upon this paper, despite its flaws.

And here we are ten years later, where nobody really remembers what that paper was about. People still put the rats on posters and parade around with them, showing them as evidence of danger. It's like nobody ever holds the liars accountable. What's the deal with that?

Is it that we will never hold them accountable, or that we need to educate people to the point where they never have any traction in the first place?

Jon Guy: Yeah, I mean, education is one of the keys to doing that. And I really liked the discussion you had on Science Facts and Fallacies, it might've been a few months ago, [00:38:00] where they were talking about whether or not to cancel Joe Rogan for, you know, bringing on guests who spread misinformation. And I think you had the most sensible response to that out of anybody who I heard talking about it. I think you were absolutely right that the answer isn't to cancel them, because that just amplifies them.

Those people got on Joe Rogan because they were canceled, and Joe Rogan took notice of that and said, okay, well, let's get you over here. You know, I've got 10 million listeners, and you had a hundred thousand followers over there; let's amplify your voice. So it never has the intended effect.

And back to your question, how does memory play into this? There are all kinds of ways. There's something called the continued influence effect, where we hear some information and later learn that bits of it, or all of it, was false, [00:39:00] but we continue being influenced by it as if it were true.

So there are tons of reasons to take things like memory seriously: to understand the processes of how memories are formed, how they influence our decision-making, how they impact our beliefs, how our beliefs impact our decisions, and how our decisions impact our lives.

Kevin Folta: Well, let's go back to your book for a minute, in terms of its content and who the target audience was, because I see it having a rather broad appeal. It works for me almost as a textbook for college, because it has a lot of depth and a lot of weight to it, and it has a lot of evidence, all cited.

Who is your target audience here? I mean, is this something that you want to put in the hands of climate change deniers and anti-vaxxers? You know what I mean? Who is the person who you really thought would buy this and [00:40:00] consider it carefully and really use it as a way to reevaluate their own behaviors?

Jon Guy: Well, I have to admit that I completely failed at what I intended to do in the first place. Like I said earlier, I started writing it because I wanted some small curriculum that you could give to educators to teach people, basically people who have no idea what scientific literacy is, people who have no idea what critical thinking is.

And as the manuscript developed, I started thinking that there's just too much to the subject, too much to boil down into, you know, easy, short lesson plans. And as I was researching it, I was learning more as I went, and I'm thinking, ooh, I want to incorporate that; oh, people should know about this.

So I guess the target audience now is broader. I tried to take as many different [00:41:00] concepts in the field of critical thinking, whether that's how the brain works neurologically, how memory works, logical fallacies, conspiracy theories, as many different subjects as people cover in critical thinking, and put them into one text that's written for a lay audience.

And I know it's a difficult task to do that, and I don't know if I've accomplished it; I haven't gotten a lot of feedback yet. But I think it's enough for the motivated adult to be able to sit down, read through it, work through some of the concepts that are in the book, and come out at the end of it, if they finish the book, thinking, hmm, maybe he's onto something. Then I've planted that seed, and they have some skills that they learned in the book [00:42:00] to let that seed grow and become something other than conspiracy beliefs, or, you know, being constrained by political ideologies, or whatever it may be that's keeping somebody trapped in their little information bubbles.

Kevin Folta: So one of the concepts I really enjoyed was the idea of the manufactroversy. Oh yeah, this is a totally real thing: manufactured risk. And it's something I see all the time, whether it's the Dirty Dozen or vaccines, this group of people who are creating risk where there absolutely is none.

And why is it that risk, or a sense of risk, is so persuasive where scientific information isn't?

Jon Guy: Well, we're wired that way by nature. If you have read Kahneman's work, you know that we're risk averse. We don't like taking risks; we will try to avoid them. The whole idea of the manufactroversy is that it's basically a controversy that's completely made up. There is no controversy, right? The earth is [00:43:00] not flat, period, hands down; empirically and logically, the earth is not flat. But you have a minority of people who think the planet is flat, and then there's a sense of false balance that gets given to them. I guess false balance

isn't quite the right term in the case of flat earthers, because they don't get a lot of media attention. But it gets presented as if there's a controversy over whether or not the earth is flat, and nobody with any kind of empirical research has demonstrated that the earth is flat. And yet there's a significant portion of people in this country who actually believe that.

Now, if we take a less absurd example and delve into anthropogenic climate change, or the safety and efficacy of vaccines, or water fluoridation, or genetically engineered crops, the science on those has [00:44:00] continually been reinforced through empirical research, over and over and over.

Genetically engineered crops are the most studied foods in human history, hands down. And there's never been any sort of link between a food's genome and some adverse effect in a human being. So foodstuffs that are genetically modified are just as safe as foods that aren't.

And the conspiracy that they aren't is crazy, because, think about it, there are companies who sell terrible food and make millions off of it every year, food that is positively bad for us, and they make millions off of it. The idea that there's some conspiracy to suppress information that this food is bad for us is bonkers, because there's food that we know is bad for us, and it still makes millions of dollars.

Kevin Folta: Yeah, and my comeback has always been, you know, these are companies that are worth bajillions of [00:45:00] dollars. And if company A had any inkling that company B's product was poison, you would see it on every single commercial. You would see them using it as leverage in the marketplace to gain more market share. You know, people are always trying to let the air out of the opponent's balloon.

But you're exactly right: when we start to look at some of these major issues and the manufactured false information around them, it makes it really difficult for us, collectively, to sort the reality from the non-reality. And this is why this book is so important: it helps people understand the process of skeptical evaluation of claims, and the mistakes we make, and the biases we have.

And I really think that's one of its strengths as a resource, an owner's manual for the mind. I think it's fantastic. [00:46:00] So when is this actually going to be available, and if people wanted to pre-order it, when and where do they do that?

Jon Guy: You can pre-order it on Amazon.com, Barnes & Noble, probably anywhere books are sold.

Yeah, pre-ordering is one of the best things you can do for authors, because first-week sales are super important. They give information to the publisher about how popular the book is going to be and how they want to market it, and it gets the word out and opens opportunities, especially for new authors. That couldn't be more true in the case of a new author like myself.

Kevin Folta: And I'm happy to say I'm really impressed by the work, and I'm really excited that it's going to be available for everybody. I'll use it in my class too, at least for some of the early parts of my course. I'm teaching a class on critical evaluation of medical and agricultural scientific claims, and we're going to go back through [00:47:00] claims about these things. My whole first third of the class is about critical thinking and deception, how we deceive ourselves and how we avoid that. So I'm really excited I'm going to get a chance to do that. Now, if people want to follow you on places like Twitter or Facebook, how do they do it?

Jon Guy: You can follow me on Facebook; I'm Jon Guy on Facebook. I'm also at Skeptic Jon Guy on Twitter. And I recently started writing for thinkingispower.com, which is a website dedicated to promoting critical thinking, and you can find most of the stuff that's in my book at thinkingispower.com. And you can pre-order the book on Amazon, Barnes & Noble, or wherever you want to get your books.

Kevin Folta: And just for those listening, it's Jon Guy, J-O-N G-U-Y. So don't put an H in there, because that John Guy is like a real estate guy.

Jon Guy: That's the first time I've ever not had to explain that.

Kevin Folta: No, I just wanted to make sure our [00:48:00] listeners get it right. The one that's worse is Jonathan versus Jonathon. But Jon Guy, J-O-N G-U-Y, that I can do. So, Jon Guy. Thank you so much for writing this book.

I think it's incredibly useful, and maybe this is a little bit flattering, but I've got a bookshelf full of James Randi and Carl Sagan, and I think this fits right up there with them. It encapsulates a lot of what they were leading us toward, at a very different time.

So thank you so much for doing this, and thank you for being a guest.

Jon Guy: Well, thank you, Kevin. I really appreciate being here; I had a wonderful time. And I'll tell you, to hear you say that you're going to teach my book in your class and use it in your curriculum, and to be compared to Carl Sagan and James Randi, that's a monumental compliment, and I couldn't appreciate it more.[00:49:00]

Kevin Folta: Well, you did a lot of really hard work here. And as somebody who's tried writing books and given up around, you know, page 80, I really appreciate what you've put together here. So, good stuff. And I really do encourage the listeners of the podcast to take a good look, if you're interested in critical thinking.

The book, again, is Think Straight, and it's written by Jon Guy. And this is a Colabra Talking Biotech podcast. If you haven't tried Colabra's products, please give them a shot. The idea of having organization within a laboratory around a common format is something that really makes a difference in our overall performance, and something that I came to late in the game. So it's something I would encourage you to take a look at. This is a Colabra Talking Biotech podcast, and we'll talk to you again next week.