Chasing Leviathan

PJ and Dr. Alaina Talboy discuss the ethics of industrial research, social media, and consumer products. They also explore the problematic implications of a Metaverse without regulation.

Show Notes

In this episode of the Chasing Leviathan podcast, PJ and Dr. Alaina Talboy discuss the ways in which numbers have ethics built into them, particularly as seen in the field of industrial research ethics. Dr. Talboy also provides practical ways to avoid misinformation and malinformation on social media. 

For a deep dive into Dr. Alaina Talboy's work, check out her book: What I Wish I Knew: A Field Guide for Thriving in Graduate Studies 👉 https://amzn.to/3weSjfW 

Check out our blog on www.candidgoatproductions.com 

Who thinks that they can subdue Leviathan? Strength resides in its neck; dismay goes before it. When it rises up, the mighty are terrified. Nothing on earth is its equal. It is without fear. It looks down on all who are haughty; it is king over all who are proud. 

These words inspired PJ Wehry to create Chasing Leviathan. Chasing Leviathan was born out of two ideals: that truth is worth pursuing but will never be subjugated, and the discipline of listening is one of the most important habits anyone can develop. 

Every episode is a dialogue, a journey into the depths of a meaningful question explored through the lens of personal experience or professional expertise.

[Unknown2]: excited about this today so i tell

[Unknown1]: yeah

[Unknown2]: me a little bit like how so we're talking about what is the importance of ethics

[Unknown2]: and in industrial research tell

[Unknown1]: yes

[Unknown2]: me a little bit how you got into this right i mean

[Unknown1]: oh

[Unknown2]: you took kind of like you went into ai for a little bit it kind of came out of

[Unknown2]: that but a lot of it has to deal with statistics a lot of it has to do with the

[Unknown2]: right way of asking questions that sort of thing

[Unknown1]: yeah

[Unknown2]: what guided you on this journey

[Unknown1]: oh man alright so

[Unknown2]: well

[Unknown1]: i

[Unknown2]: i

[Unknown1]: alright so when i was doing my phd my

[Unknown1]: my research expertise is in judgment and decision making

[Unknown2]: hm

[Unknown1]: and more specifically probabilistic reasoning so how do

[Unknown2]: i

[Unknown1]: we use numbers to make decisions and i am that person that absolutely loves

[Unknown1]: numbers i will talk about them all day i taught statistics for six years like it

[Unknown1]: is one of my favorite things in the world and i know like people hate statistics

[Unknown1]: but i love it

[Unknown1]: but you know it it comes out in so much of our lives like we deal with statistics

[Unknown1]: almost every single day and we deal

[Unknown2]: m

[Unknown1]: with probabilistic reasoning almost every single day and i'll give you a quick

[Unknown1]: example of when you drive to work in the morning you know you have any number of

[Unknown1]: routes that you can take and you may not even realize you're doing a mental

[Unknown1]: calculation of what's the probability that i'm going to make it to work on time if

[Unknown1]: i take this back road versus sitting on I-4 in orlando

[Unknown2]: yes

[Unknown1]: you know

[Unknown1]: there is that calculation happening in our head that is

[Unknown2]: that

[Unknown1]: a type of probabilistic reasoning when you think about the

[Unknown1]: weather now florida is its own case because it rains every day

[Unknown2]: of california

[Unknown2]: i don't

[Unknown1]: but in other places when the weather says there's a thirty percent chance of

[Unknown1]: showers what does that actually mean like

[Unknown2]: right

[Unknown1]: is it thirty percent of the area is it thirty percent of the day is it thirty

[Unknown1]: percent of every hour you know there's

[Unknown2]: right

[Unknown1]: any number of interpretations we know there's a right

[Unknown2]: right

[Unknown1]: one but

[Unknown2]: right

[Unknown1]: most people don't think about that

[Unknown2]: right
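
[Note: the mental math described here can be made explicit. Below is a minimal Python sketch, with all travel times invented purely for illustration, comparing two hypothetical routes by their probability of arriving within a 45-minute budget. For the forecast example, a "thirty percent chance of showers" is usually defined as the probability that measurable rain falls at any given point in the forecast area during the forecast period.]

```python
# Hypothetical commute comparison: the "back road versus I-4" trade-off is a
# comparison of two on-time probabilities. All numbers here are invented.
import random

def p_on_time(mean_min, sd_min, budget_min, trials=100_000):
    """Estimate P(travel time <= budget) assuming normally distributed times."""
    on_time = sum(random.gauss(mean_min, sd_min) <= budget_min for _ in range(trials))
    return on_time / trials

# The back road: slower on average but predictable.
print("back road:", p_on_time(mean_min=38, sd_min=4, budget_min=45))   # ~0.96
# I-4: faster on a good day but far more variable.
print("I-4:      ", p_on_time(mean_min=32, sd_min=14, budget_min=45))  # ~0.82
```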

[Unknown1]: and so these are the the areas that got me really interested in understanding how

[Unknown1]: probability fits into our daily lives and how it influences the decisions we make

[Unknown2]: h

[Unknown1]: and you know we talked a little bit about like data representation with covid

[Unknown2]: hey

[Unknown1]: that is a huge part of what i harp on a lot is when you represent data there are

[Unknown1]: right ways to do it and there are

[Unknown1]: other ways

[Unknown1]: yes

[Unknown2]: research

[Unknown1]: yes

[Unknown2]: so talk to me about that transition from just like the love of numbers

[Unknown1]: yep

[Unknown2]: to you know what numbers have ethics built into them and

[Unknown1]: they do

[Unknown2]: that's really fascinating to me

[Unknown1]: it is because so as a a formally trained researcher i

[Unknown2]: hey

[Unknown1]: went through an entire research ethics series

[Unknown2]: hm

[Unknown1]: and one of the cornerstones of this foundational research is the belmont report

[Unknown1]: and it's talking about

[Unknown2]: about

[Unknown1]: some of the atrocities that happened in the name of human research and

[Unknown2]: oh yeah

[Unknown1]: it's it's ugly like scientists have done some pretty awful things in the name of

[Unknown1]: science

[Unknown2]: oh yeah yeah

[Unknown1]: yeah and so

[Unknown1]: you know we get a lot of training in this in formal phd programs where we talk

[Unknown1]: about how do we respect persons how do we minimize harm and maximize benefit what

[Unknown1]: is it that we're doing that is ultimately going to help humanity

[Unknown2]: here

[Unknown1]: in ways that we hope it does you know we really have to care about

[Unknown2]: yeah

[Unknown1]: the impact this is going to have not only on our research participants the people

[Unknown1]: actually engaging in our studies but on the larger community in which this

[Unknown1]: research is going to go out into and so

[Unknown2]: well

[Unknown1]: as you can tell i'm i'm pretty uh hyped up about this topic i

[Unknown2]: yeah

[Unknown1]: get very involved and invested because it's something that's so important to me

[Unknown2]: yeah

[Unknown1]: and then i go to industry research

[Unknown1]: and this is where i leave academia and join industry and start conducting research

[Unknown1]: in this user research space so how do

[Unknown2]: what do you

[Unknown1]: we improve our products for the customer how do we improve these experiences

[Unknown1]: and i start to look at the work that's been done in some of these spaces and the

[Unknown1]: the biggest example you'll see is like facebook currently rebranded as meta

[Unknown1]: oh

[Unknown2]: oh my god

[Unknown1]: my gosh they're in the news every other day for doing something completely

[Unknown1]: unethical and it's a big problem you know and and facebook is an easy target

[Unknown2]: right right

[Unknown1]: to say like you know but they're not the only ones there are um you know we

[Unknown1]: have digital consultants misusing the data of millions of users to push

[Unknown2]: what would not

[Unknown1]: or nudge people to make decisions they may or may not want to make and they're not

[Unknown1]: informed and they're not told about these things there was in twenty seventeen

[Unknown1]: someone created an ai powered gaydar app to figure out who in your

[Unknown2]: what you

[Unknown1]: friend circle is gay that you might be able to date not realizing the ultimate

[Unknown1]: harm that it is doing to an already

[Unknown2]: oh

[Unknown1]: marginalized community

[Unknown1]: these are problems these are big problems and we need to think about

[Unknown2]: yeah that couldn't be misused at all

[Unknown1]: right not at all

[Unknown1]: never gonna be a problem like but this is the this is the issue is that when

[Unknown1]: you're talking about industry research you're talking about doing

[Unknown1]: and creating products that are going to be immediately available

[Unknown1]: which in my mind means you need to be even more on top of the ball of ethics and

[Unknown1]: ethical considerations than even in academia because academia moves at the pace of

[Unknown1]: a dinosaur like it's slow industry is like boom boom boom let's go let's go

[Unknown2]: right

[Unknown1]: you need to stop and think and so this is this is kind of where i got into this

[Unknown1]: space and started talking about the ethics of industry research and why it is so

[Unknown1]: vital to think about these problems

[Unknown2]: yeah execution over planning often

[Unknown2]: one of the unseen costs is often ethics right and

[Unknown1]: yeah

[Unknown2]: that's a very an industry forward type of thing

[Unknown1]: yeah

[Unknown2]: i mentioned that my day job is digital marketing and

[Unknown2]: once you understand

[Unknown2]: really one of the things that is a big deal to me and i want to have a guest on

[Unknown2]: to talk more about it is how we are literally tinkering with people's

[Unknown2]: neurochemistry with these apps and

[Unknown1]: yes yeah

[Unknown2]: like and like they don't even realize what they everyone kind of knows but like

[Unknown1]: yeah

[Unknown2]: the the the introduction of infinite scroll and the fact

[Unknown1]: yep

[Unknown2]: that everyone's going to it in order to compete with each other and

[Unknown1]: yeah

[Unknown2]: what that does to the consumer without the consumer being asked is

[Unknown1]: yep

[Unknown2]: it's unbelievable

[Unknown1]: it is and in twenty twenty the

[Unknown2]: one

[Unknown1]: creator of the infinite

[Unknown2]: of

[Unknown1]: scroll actually went on a speaker tour to apologize

[Unknown2]: yes

[Unknown1]: for the impact his work had and to bring awareness to the psychological and

[Unknown1]: societal ills of this technology like this is the person who created this stuff

[Unknown1]: and they are out here apologizing for it you know and

[Unknown2]: yeah

[Unknown1]: and again facebook or meta is an easy punching bag they were literally just in the

[Unknown1]: news april twenty twenty two for again

[Unknown1]: six months of purposeful manipulative

[Unknown2]: i

[Unknown1]: psychological harm to their user base

[Unknown2]: e

[Unknown1]: and nothing's going to happen from it

[Unknown2]: yeah

[Unknown1]: like nothing

[Unknown2]: well

[Unknown1]: nothing's gonna happen to facebook or meta it's gonna happen to the customers or

[Unknown1]: the people using the platform

[Unknown2]: yeah

[Unknown1]: that's

[Unknown2]: i mean they're gonna make and this is one of those things that our society kind of

[Unknown2]: struggles with is that they make so much money off the harm that even

[Unknown1]: yeah

[Unknown2]: if they are legally penalized they can pay the money for it and still come out

[Unknown2]: with a profit so it's still worth it to them

[Unknown1]: and

[Unknown2]: right

[Unknown1]: that is what we see in technology it's not how much money can we invest up front to

[Unknown1]: avoid these things

[Unknown2]: mm

[Unknown1]: it's how much do we have to pay when people find out what we've been doing

[Unknown2]: yes

[Unknown1]: and that dichotomy

[Unknown1]: is so personally infuriating and it's just like i i want to shake the person and

[Unknown1]: be like what are you thinking like we're harming people we need to stop doing this

[Unknown1]: like it's not a slap on the wrist for the people who are harmed it's

[Unknown2]: mm

[Unknown1]: literally changing their lives

[Unknown2]: yes yes i used to be a youth pastor and i used to be a high school teacher and in

[Unknown2]: both those cases as tiktok came out i wanted to engage with it in digital

[Unknown2]: marketing right and so i used

[Unknown1]: yeah

[Unknown2]: it and i personally was like why am i on this for two to three hours at a time and

[Unknown2]: if you look at the way it's used that's what people like well they're twenty to

[Unknown2]: thirty second videos the

[Unknown1]: yeah

[Unknown2]: average user uses it at least an hour at a time that's like the like

[Unknown1]: help

[Unknown2]: if not more and it's

[Unknown1]: yeah

[Unknown2]: like and which is actually which is funnily enough youtube has longer videos but

[Unknown2]: youtube isn't used like that because

[Unknown1]: but

[Unknown2]: it doesn't give you that just insane

[Unknown2]: starts with it starts with the d the

[Unknown1]: as a dopamine surge yeah yeah

[Unknown2]: endorphins dopamine yes it's it's exactly what's going on

[Unknown1]: yeah you're hitting that reward center in your limbic system over and over and

[Unknown1]: over again and you're getting a hit every twenty or thirty seconds

[Unknown2]: e

[Unknown1]: it is literally the foundation of an addiction now i'm not

[Unknown2]: yeah

[Unknown1]: a specialist or an expert in this area but that is where that goes

[Unknown2]: oh everyone like i don't think anyone is going to debate with you that social

[Unknown2]: media is addicting i think everyone knows that i haven't talked to a single person

[Unknown2]: who's like no it doesn't do that to me like

[Unknown2]: like everyone's like i just need to spend in fact i would say almost every single

[Unknown2]: person i've talked to unless they're not on it says i need to spend less time like

[Unknown2]: if you did a

[Unknown1]: what

[Unknown2]: poll to listeners like you'd know what your immediate reactions like should you be

[Unknown2]: spending less time on social media always yes

[Unknown1]: absolutely

[Unknown2]: everyone everyone says yes so it's

[Unknown1]: yes

[Unknown2]: not that this isn't like what's interesting about this is that we're just on the

[Unknown2]: cusp the beginning of this new technology and

[Unknown1]: yeah

[Unknown2]: we are and so everyone knows what's happening but we haven't figured out what

[Unknown2]: we're going to do about it yet and so

[Unknown1]: so

[Unknown2]: what go ahead

[Unknown1]: so i'll add one more wrinkle to this for you

[Unknown2]: sure

[Unknown1]: so we know the emotional cost we

[Unknown2]: eh

[Unknown1]: know the time sink we know that we feel bad after using it and we know

[Unknown2]: yes

[Unknown1]: that the social media sites are purposely manipulating content

[Unknown2]: hm

[Unknown1]: to increase the appearance of negative stuff to increase engagement because

[Unknown1]: negative

[Unknown2]: hm

[Unknown1]: gets more hits than positive

[Unknown2]: right

[Unknown1]: let me add one more layer here

[Unknown1]: where is your data

[Unknown2]: oh yeah

[Unknown1]: and who's accessing your data and how much data do they really have something

[Unknown1]: that people don't actually realize is this thing called off facebook data or

[Unknown2]: see that

[Unknown1]: even on twitter off twitter data

[Unknown1]: these social media sites are tracking your movements across the internet even when

[Unknown1]: you are not on their website

[Unknown2]: yeah

[Unknown1]: all right and now you start

[Unknown2]: right

[Unknown1]: to think about things like the metaverse and what is considered data in the

[Unknown1]: metaverse is it your eye movements did you know you can be identified by a

[Unknown1]: combination of different

[Unknown1]: virtual reality movements like this is all your personally identifiable

[Unknown1]: information and do you even know that it's being collected

[Unknown1]: um

[Unknown2]: yeah

[Unknown1]: it's a lot

[Unknown2]: oh no like literally i think it was like six or seven years ago and

[Unknown2]: again it's kind of the backbone to things we already know the value of personal

[Unknown2]: data surpassed the value of oil

[Unknown1]: yep

[Unknown2]: like it is the biggest resource in the world now the ability to

[Unknown2]: control people's attention and to know exactly what people want

[Unknown1]: yep

[Unknown2]: and

[Unknown2]: but i so these are the harms right this is this

[Unknown2]: is where we're talking about industrial research why is it important we are

[Unknown1]: these are the harms

[Unknown2]: dealing with this every day

[Unknown2]: i've been floating kind of this

[Unknown2]: reformulation of machiavelli to go back to what you said about negative content it

[Unknown2]: is easier to scare someone it's easier to scare than to inspire

[Unknown1]: yeah

[Unknown2]: yeah and i think because to inspire someone it's actually more powerful but it's

[Unknown2]: harder to do it's like it's it's better to be loved it's better

[Unknown1]: yeah

[Unknown2]: to be feared than loved right

[Unknown1]: yes

[Unknown2]: it's very similar and that's exactly what we see

[Unknown2]: even what's interesting is social media's kind of lateral effect if i could put it

[Unknown2]: that way on journalism and how journalism has become more clickbait in order to

[Unknown2]: compete right so it's there's

[Unknown1]: yep

[Unknown2]: all these things where we're releasing this and we're just doing what's effective

[Unknown2]: uh and it's guided by guided by the dollar so how do you

[Unknown2]: so how do you make sure

[Unknown1]: yep

[Unknown2]: that you're asking the right questions when you do industry research what

[Unknown2]: questions should people

[Unknown2]: in these technical spaces be asking

[Unknown2]: instead of maybe like how long can we keep users on

[Unknown1]: yo

[Unknown2]: like what questions should what questions are they asking and what questions

[Unknown2]: should they be asking

[Unknown1]: yup so as as the researcher in the room i am there to represent the customers i am

[Unknown1]: the voice of the customer in a sense you know because they they don't have a

[Unknown1]: direct seat at the table often you know that's what research is there to do so as

[Unknown1]: that stand in for the customers i want to know is this gonna help me is this gonna

[Unknown2]: hm

[Unknown1]: harm me is this going to

[Unknown2]: um

[Unknown1]: do anything for my family and what is it going to do and then as the researcher i

[Unknown1]: need to ask does this maximize benefits does it minimize harm what is at stake are

[Unknown1]: we considering all the people who are potentially going to use this as well as all

[Unknown1]: the people who might not

[Unknown1]: and do we

[Unknown2]: know

[Unknown1]: care about who's in the room versus who is not in the room you know and

[Unknown2]: and

[Unknown1]: as a researcher and my expertise is in human biases and heuristics and how we make

[Unknown1]: decisions

[Unknown2]: eight

[Unknown1]: i own right up front i'm human i am biased i

[Unknown2]: right

[Unknown1]: have mental shortcuts that i use that is just the way brains work i need to

[Unknown1]: acknowledge that because i need other people to acknowledge that too you know

[Unknown2]: yeah

[Unknown1]: we all have these shortcuts that we use it's only through working together and

[Unknown1]: really hammering out these details and really identifying our assumptions that we

[Unknown1]: can undo some of the bias that gets built into these products

[Unknown2]: hm

[Unknown1]: and so you have to have a person and it's typically the researcher who comes in

[Unknown1]: and says like what are your assumptions here like who do you expect to use this

[Unknown1]: who do you not expect to use this and that's a really important consideration that

[Unknown1]: research brings to the table

[Unknown2]: yeah absolutely so

[Unknown2]: really struck by like most of this seems i wouldn't say common sense but just

[Unknown2]: really solid right that you're saying like it's not you know while i understand it

[Unknown2]: is controversial cause it's not done it's not really controversial right like

[Unknown2]: everyone's

[Unknown1]: yeah it's

[Unknown2]: like yeah that would be nice that would be those are nice questions to ask but

[Unknown2]: there is one that's interesting

[Unknown1]: everyone i talk to says that like yeah of course we do these things

[Unknown2]: oh yeah

[Unknown1]: of course and it's like but

[Unknown2]: yeah

[Unknown1]: do you

[Unknown2]: yes no yeah i man we just had well yeah i probably shouldn't talk about clients

[Unknown2]: yeah so we've definitely had like everyone pays lip service to these and then

[Unknown2]: they're like hey can you do this for us and we're like no that's actually illegal

[Unknown2]: i know it doesn't seem like it would actually hurt anybody but you that's not your

[Unknown2]: information to just take like

[Unknown1]: yep yep yep

[Unknown2]: and as soon as that was explained to them they were fine with it but they didn't

[Unknown2]: think about what they were doing right like it's just like

[Unknown1]: yep

[Unknown2]: it's it's easy to do that it's like yes it is and we should not the

[Unknown2]: but you the the phrase you used there that was really fascinating to me is

[Unknown2]: thinking about the people who may not use it can

[Unknown1]: thanks

[Unknown2]: you give me a couple examples of what that is and what that question represents

[Unknown1]: so in the product space usually in industry research research is a reactive

[Unknown1]: process the people come to the research team and say hey we need data to

[Unknown1]: support this

[Unknown1]: and it's it's a very different way of working from academia where you get to ask

[Unknown1]: questions that no one has thought about before it's very

[Unknown2]: hey

[Unknown1]: proactive it's trying to push the envelope so industry research is a bit more

[Unknown1]: reactive and you know i'm making a blanket statement here it's not

[Unknown2]: right right

[Unknown1]: this way all the time but in general so you'll get a research question that says

[Unknown1]: hey

[Unknown1]: i need to

[Unknown2]: i don't know why why

[Unknown1]: know why my customers really love this product that's not on the market yet

[Unknown1]: because they need to create marketing materials

[Unknown2]: yes

[Unknown1]: and it's like well do you know they love them

[Unknown1]: and so the question that comes up that i often ask my partners is hey what are you

[Unknown1]: going to do if the research comes back and is not supporting your belief that

[Unknown1]: people love this like are you going to change

[Unknown2]: hm

[Unknown1]: direction are you going to go with what the customer says and

[Unknown1]: sometimes they do

[Unknown1]: you know sometimes they don't and i'm just

[Unknown2]: yeah

[Unknown1]: being honest here

[Unknown2]: yes yeah

[Unknown1]: like it's not always the

[Unknown2]: yeah

[Unknown1]: easiest job in the world um but you have to be willing to like stand up for your

[Unknown1]: customers and say like no this is not what they want

[Unknown2]: yeah

[Unknown1]: and it takes a bit of a thick skin to get through some of these conversations so

[Unknown2]: yeah oh absolutely uh

[Unknown1]: that

[Unknown2]: is there something

[Unknown2]: hmm

[Unknown2]: yeah i think uh what what are some common things that customers can look out for

[Unknown2]: or should be aware of when they are making decisions in what they're

[Unknown2]: buying and what they're choosing where they're choosing to spend their time

[Unknown2]: obviously like the data side of it is important

[Unknown2]: but are there are there other ways that they can improve their own decision making

[Unknown2]: so that they can make better decisions um

[Unknown2]: and maybe in some way provide like a an ethereal seat at the table because they

[Unknown2]: they're able to

[Unknown1]: yeah

[Unknown2]: know a little bit more how things are

[Unknown1]: yes

[Unknown2]: are handled

[Unknown1]: yes so when when people are using data to make decisions

[Unknown2]: hm

[Unknown1]: or they're trying to decide if they want to buy a product and they're looking at

[Unknown1]: the data that goes with the product there are

[Unknown2]: so

[Unknown1]: a lot of

[Unknown1]: really clear cut questions but i will warn you it takes time

[Unknown1]: and it takes mental effort and it takes a bit of reflection and these are things

[Unknown1]: that get very exhausting very quickly

[Unknown1]: we are limited in our cognitive resources but if it's an important decision we

[Unknown1]: really do need to slow down and take that time to consider it and so some of the

[Unknown1]: questions i ask myself is what is represented in the data

[Unknown1]: what am

[Unknown2]: what

[Unknown1]: i actually seeing

[Unknown2]: hey

[Unknown1]: um where does this data come from

[Unknown1]: who generated this data can i even find that information

[Unknown1]: is all of the relevant information available do i

[Unknown2]: what

[Unknown1]: have a full picture do i have a small slice of the reality that they're presenting

[Unknown1]: to me

[Unknown1]: another big one is this confirming something i already believe and

[Unknown2]: oh yeah

[Unknown1]: this is huge for fake news it's like it is so much easier to see something and go

[Unknown1]: yeah i already knew that of course that's true it's not like this

[Unknown2]: yes

[Unknown1]: confirmation bias is massive so is this something i really believe and should i

[Unknown1]: believe it do i need to

[Unknown2]: like

[Unknown1]: go find something that goes against my beliefs and see both sides of the picture

[Unknown2]: the

[Unknown1]: and the last one is

[Unknown2]: that's really interesting sorry go ahead

[Unknown1]: oh go ahead

[Unknown2]: oh i was going to say what's really interesting about that is like even your belief

[Unknown2]: can be true and your evidence that is confirming your true belief could

[Unknown2]: be false which

[Unknown1]: correct

[Unknown2]: is which is like kind of mind blowing for some people like i know this is true

[Unknown2]: and it's like you're right but that's not that's not what you should be using

[Unknown1]: but it's not it's not so you know what you believe and your opinion is not the

[Unknown1]: same thing as fact and they should not be treated equally and people tend to

[Unknown1]: conflate the two things

[Unknown2]: yes

[Unknown1]: so i can have an opinion about a set of facts and my opinion can go against those

[Unknown1]: facts but that doesn't make me right

[Unknown1]: that just makes me have an opinion

[Unknown2]: and you were about to give a last question though i i sorry i did not mean to i

[Unknown1]: oh no that's right so so yeah so this goes back to the the social media as

[Unknown2]: thought you were done so yeah

[Unknown2]: yeah

[Unknown1]: does this evoke an emotional response

[Unknown2]: hm

[Unknown1]: do i feel good at reading this or do i feel good learning this or do i feel angry

[Unknown1]: or do i feel sad and again going back to those negative emotions those negative

[Unknown1]: emotions increase engagement which in turn increases ad revenue for whatever

[Unknown1]: company it is you're looking

[Unknown2]: yes

[Unknown1]: at and so the more angry you are the more upset you get the more money you are

[Unknown1]: generating for someone else

[Unknown2]: yes

[Unknown1]: so if you feel your emotions coming into it step back take a look take a breath

[Unknown1]: and think about

[Unknown1]: what your

[Unknown2]: what

[Unknown1]: actions are actually doing

[Unknown2]: yeah if it's free you are the product right yeah

[Unknown1]: if it's free you are the product

[Unknown2]: yes

[Unknown1]: so every single one of these social media platforms it's reddit it's twitter it's

[Unknown1]: it's every single social media app out there you are the product coz you are

[Unknown1]: not paying for it

[Unknown2]: right right

[Unknown1]: yep

[Unknown2]: the um yeah spoiler alert i am not on facebook anymore

[Unknown2]: but it's that this is in response to your last question which i think is such

[Unknown2]: a good one because

[Unknown2]: during the

[Unknown2]: clinton and trump presidential race

[Unknown1]: yeah

[Unknown2]: i would for like two or three weeks i

[Unknown1]: yeah

[Unknown2]: would get on facebook for about half an hour every night i would type something

[Unknown2]: out and i had very firm beliefs that facebook was not a good place to have

[Unknown2]: arguments and so for two or three weeks i got on every night i typed something out

[Unknown2]: and then i deleted it and i just like i was just really angry the rest of the

[Unknown1]: yeah

[Unknown2]: night and that went on for two three weeks and i was like

[Unknown2]: why

[Unknown1]: yes yeah

[Unknown2]: it's not gonna change anything like all i'm doing is just being angry so i just

[Unknown2]: deleted my account and i you know i cheat like if i need to get a hold

[Unknown2]: of someone my wife still has her facebook because she wants to keep track of baby

[Unknown2]: pictures and so you

[Unknown1]: yep

[Unknown2]: know but do you know how many times i've used my wife's facebook account to get in

[Unknown2]: touch with someone i thought i would use it

[Unknown1]: oh

[Unknown2]: at least a couple times never i've never used it it's been five six years and i

[Unknown2]: just look at that and i was like well you know you're like how will i get in touch

[Unknown2]: with people and it's like

[Unknown2]: the truth is you will find a way like our

[Unknown1]: yeah

[Unknown2]: you just don't like you don't need to know and find all these people only through

[Unknown2]: facebook if the only way you know someone is through facebook then you

[Unknown2]: probably aren't going to reach out to them and so it's just i

[Unknown2]: that uh was a key question for me

[Unknown2]: in deciding to just like just to leave it and my life has been better ever since

[Unknown2]: doesn't mean i haven't experimented with other social media

[Unknown2]: and

[Unknown1]: really

[Unknown2]: you know had the same exact problems but at least facebook

[Unknown1]: doubt help

[Unknown2]: the original sin is god

[Unknown1]: y

[Unknown2]: no

[Unknown1]: so

[Unknown2]: go ahead

[Unknown1]: i still have social media accounts

[Unknown2]: yeah

[Unknown1]: i have a facebook page i have linkedin twitter reddit

[Unknown2]: well you have a book so

[Unknown1]: and a lot of it i have a book yeah so

[Unknown2]: yes

[Unknown1]: i i published a book and so i have to have a way to reach people

[Unknown2]: right

[Unknown1]: but it's funny that you bring up that you just use your your wife's social media

[Unknown1]: because there are times where i'll be like arguing online and i i am terrible at

[Unknown1]: this i am so terrible there's someone wrong on the internet and my partner will

[Unknown1]: look at me and go why are you arguing on the internet just

[Unknown1]: like

[Unknown2]: yes yeah

[Unknown1]: it's like i know that they're wrong and like it there are times where i like let

[Unknown1]: myself get carried away with that and then i have to like rein it back in and go

[Unknown1]: no this is dumb i'm not fighting on the internet i'm gonna stop

[Unknown2]: yeah i i oh it's so easy

[Unknown1]: but it's easy it's easy to get sucked into it

[Unknown2]: oh yeah that's actually i built a twitter account met some nice people through

[Unknown2]: like small groups like there were certain chats and stuff um

[Unknown2]: but uh literally figured out like on twitter and this happens on most social

[Unknown2]: media platforms but the way that twitter operates because every

[Unknown2]: platform has its own quirks the way

[Unknown1]: yep

[Unknown2]: twitter especially operates the way everyone has access to everything there are

[Unknown2]: literally

[Unknown1]: yeah

[Unknown2]: i figured out there's a person and i know there's just lots of people like

[Unknown2]: this who literally spend six to nine hours a day just going around and looking

[Unknown2]: for people

[Unknown2]: to

[Unknown2]: kind of police incorrectness and so i've just i'm like what a miserable

[Unknown1]: it doesn't seem like a fun way to spend a day

[Unknown2]: existence

[Unknown2]: oh i like i was like this person is so miserable you know um and i i'm not here to

[Unknown2]: make fun of that person i it was uh actually i think they were on disability so i

[Unknown2]: think they're just at home they can't they don't go out right

[Unknown1]: yeah

[Unknown2]: and they just and so

[Unknown2]: that that's what they do and

[Unknown2]: that was kind of the like i ended up deleting my account on there too i still have

[Unknown2]: other ones i do i

[Unknown1]: hi

[Unknown2]: do like reddit and stuff like that but i try not to spend a lot of time

[Unknown2]: a lot of the reason i started the podcast is in response to

[Unknown2]: the lack of long form discussion and the value of just like if you're going to

[Unknown2]: have an argument you know

[Unknown2]: and really get somewhere that's gonna take a long time

[Unknown2]: and the odds of you doing it and understanding someone's intonation through text

[Unknown2]: is

[Unknown1]: hm

[Unknown2]: it's like no no it's like this

[Unknown1]: so

[Unknown2]: isn't gonna happen and

[Unknown1]: this

[Unknown2]: especially with other people chiming in and like

[Unknown1]: yell

[Unknown2]: and people get on social media to release steam they don't get on to actually like

[Unknown2]: debate and so

[Unknown1]: yup

[Unknown2]: uh and for me

[Unknown2]: this isn't even a debating forum like just that idea of like

[Unknown2]: i can what i want to do is create a platform where people can listen to someone

[Unknown2]: all the way through

[Unknown2]: because

[Unknown1]: k

[Unknown2]: it's amazing how often the conversations break down like on social media because

[Unknown2]: before someone can even finish their thought someone else jumps in and just

[Unknown2]: creates this magnificent straw man that is just like

[Unknown2]: so yeah sorry i obviously you know this is near and dear to my heart i've really

[Unknown2]: enjoyed the conversation before we go any further i do want to make sure i mention

[Unknown2]: this cause you know you you

[Unknown2]: i know this is part of it right why do you have a facebook page because you have a

[Unknown2]: book and i think this is an important book like i as someone who who got their

[Unknown2]: masters what i wish i knew a field guide for thriving in graduate studies would have been

[Unknown2]: very timely for me

[Unknown1]: yes yes

[Unknown2]: yeah yeah talk to me a

[Unknown1]: so

[Unknown2]: little bit about the book

[Unknown1]: yeah so the book actually came out of uh very similar uh background actually so i

[Unknown1]: was having these conversations on twitter i'm very active in the academic twitter

[Unknown1]: and science twitter communities

[Unknown1]: and i was spending a lot of time talking to people and you know that two hundred

[Unknown1]: and eighty character limit really doesn't allow much

[Unknown2]: no

[Unknown1]: room for nuance

[Unknown2]: no it doesn't

[Unknown1]: so i started having what i called

[Unknown2]: i

[Unknown1]: friday afternoon coffee chats and this was fifteen minutes or a half hour where

[Unknown1]: someone could book my time completely free

[Unknown2]: three

[Unknown1]: i i'm a huge fan of giving back to the community that gave me so much and so i

[Unknown1]: want to make sure i'm helping other people in the same way that i got help and so

[Unknown2]: that's awesome

[Unknown1]: yeah so i i still do these conversations today you can go to my website and book

[Unknown1]: them right now um sadly i'm booked out for two months so maybe not right now but

[Unknown1]: like in the future

[Unknown2]: what a surprise yeah like someone's giving free advice about graduate school

[Unknown2]: please

[Unknown1]: yes

[Unknown2]: yeah that makes sense

[Unknown1]: so i started having these conversations with these graduate students who were you

[Unknown1]: know struggling with their workload or they're

[Unknown2]: h

[Unknown1]: dealing with a really crappy advisor who never answers email or even worse they

[Unknown1]: are the abusive advisor that you hear these horror stories about

[Unknown2]: oh man

[Unknown1]: and then there's the group of academics who are at the

[Unknown2]: i

[Unknown1]: end of their degree

[Unknown1]: and they swallowed this line hook line and sinker that they're going to get a tenure

[Unknown1]: track position when in reality only three percent of academics will you know

[Unknown1]: it is and it's hard and it's it's one of those conversations of like well i did

[Unknown1]: all this stuff now what and

[Unknown2]: yeah

[Unknown1]: you know there's no

[Unknown1]: formal training

[Unknown2]: wa

[Unknown1]: for how to go from industry you know from academia to industry and

[Unknown2]: she

[Unknown1]: so i saw this

[Unknown2]: i thought

[Unknown1]: huge gap and just started helping people

[Unknown2]: public school

[Unknown1]: figure out how to do it

[Unknown2]: yeah

[Unknown1]: and and i started writing it down um and so now it's in a book

[Unknown2]: because you are tired of saying it over and over again yeah you're like i

[Unknown1]: well i said it so many times

[Unknown2]: oh yeah

[Unknown1]: and i went through it myself and oh my gosh my friends i love my friends they are

[Unknown1]: the most phenomenal people because they heard me and listened to me agonize about

[Unknown1]: this decision to

[Unknown2]: yeah

[Unknown1]: stay in academia or leave and i talked about it for weeks and like you know so

[Unknown1]: shout out to my friends and my partner for being amazing people for that and

[Unknown1]: helping me and just letting me kind of go through the process and come to my own

[Unknown1]: decision um

[Unknown2]: oh yeah

[Unknown1]: yeah so

[Unknown2]: so that you that's why that's why we need community though that's

[Unknown1]: yeah

[Unknown2]: my my wife is so patient with me my kids are patient with me they like

[Unknown1]: yeah

[Unknown2]: i mean uh i i could tell like my son is seven yeah i have a seven year old a four

[Unknown2]: year old and one on the way and uh

[Unknown2]: already i i just like dad dad we know like you you're dad you don't have to

[Unknown2]: the the

[Unknown2]: it's such a it's such a blessing

[Unknown2]: it's great to hear you say that um

[Unknown2]: that's that's really cool

[Unknown1]: yep

[Unknown2]: uh so that kind of came out of those conversations starting with you in your own

[Unknown2]: personal journey and

[Unknown1]: it did

[Unknown2]: then moving on to these

[Unknown1]: yeah

[Unknown2]: uh apparently still free conversations if you're going to grad school in the next

[Unknown2]: six months as you know i'm sure you want to book out

[Unknown1]: a lot of people have

[Unknown2]: um

[Unknown2]: that's that's really awesome

[Unknown2]: kind of returning to our our kind of main topic i did want to touch on that

[Unknown2]: because i did think that was fascinating um

[Unknown2]: when you talk about those decisions you know you talked about confirmation bias

[Unknown2]: what are some other data related decision making issues that people struggle with you

[Unknown2]: know i automatically think of i know just the basic like correlation versus

[Unknown2]: causation right like

[Unknown1]: correlation versus causation yep

[Unknown2]: it's so um but

[Unknown2]: are there some really common ones you think are really important and do you have

[Unknown2]: any clear examples of like just that people can think about practically

[Unknown1]: y

[Unknown2]: that would help them

[Unknown1]: yes so i

[Unknown1]: so there there's two different directions i'll go for this one

[Unknown2]: sure

[Unknown1]: so one is uh a human bias i just published a paper on in frontiers in

[Unknown1]: psychology that i named the value selection bias and so

[Unknown2]: good

[Unknown1]: i am a big fan of practical and pragmatic so you know no sexy names here it's just

[Unknown1]: a very simple

[Unknown2]: how about you

[Unknown1]: value selection bias

[Unknown2]: yeah not the monty hall problem you know you

[Unknown1]: i know

[Unknown2]: don't have like that

[Unknown1]: so this is a situation where you are trying to

[Unknown1]: reason through a bunch of difficult information

[Unknown1]: you've likely never encountered this type of problem before

[Unknown2]: hmm

[Unknown1]: you don't really know how to solve the problem

[Unknown1]: very common in mathematical situations so where numbers are involved of course

[Unknown2]: yes

[Unknown1]: coming back to my

[Unknown2]: sure

[Unknown1]: love and joy is when we encounter these situations

[Unknown1]: we often rely on the information that is presented to us as

[Unknown2]: i

[Unknown1]: it is presented and we use that

[Unknown2]: why

[Unknown1]: as our answer to the problem

[Unknown1]: and so if you think about a situation where it says like

[Unknown1]: you have

[Unknown2]: that

[Unknown1]: a seventy percent chance of

[Unknown1]: having this

[Unknown2]: you

[Unknown1]: thing go wrong

[Unknown1]: and

[Unknown2]: yeah a whole bunch of

[Unknown1]: there's a whole bunch of other information provided in the problem but

[Unknown2]: that

[Unknown1]: that's the salient bit of information that you get out of it there's a seventy percent

[Unknown1]: chance this is going to go wrong doesn't matter what the other

[Unknown1]: context is it doesn't matter whatever information is there you

[Unknown2]: contact with

[Unknown2]: you are

[Unknown1]: are more likely to use that number as like yeah there's a seventy percent chance

[Unknown1]: something is going to go wrong doesn't matter anything else like nothing else is

[Unknown1]: getting encoded there so you rely on the most salient information that's presented

[Unknown1]: to you and you might not think about like is there missing context here or maybe

[Unknown1]: there's another consideration to take into account

[Unknown2]: oh what

[Unknown1]: or maybe this other data that's presented with this salient point should be

[Unknown1]: evaluated

[Unknown2]: o k

[Unknown1]: and it might not be so

[Unknown2]: yeah uh can you give a can you give a concrete example of that cause i i think i

[Unknown2]: know what you're talking about but i'm i'm just struggling to grasp it a little

[Unknown2]: bit

[Unknown1]: right so a very simple example of this is take a home pregnancy test all right you

[Unknown1]: know that when you take this test there is

[Unknown2]: you

[Unknown1]: some likelihood that is going to come out with a true positive or a false positive

[Unknown1]: or a true negative or a false negative so you know actually being pregnant when

[Unknown1]: you are the test tells you you're pregnant but you're not the test tells you

[Unknown1]: you're not pregnant but you are or the test tells you you're not pregnant and

[Unknown1]: you're not so there's a whole set of outcomes here right

[Unknown2]: yes yes

[Unknown1]: so

[Unknown1]: so when we consider this type of a problem from a mathematical standpoint and we

[Unknown1]: ask people what's the likelihood that

[Unknown2]: you watch

[Unknown1]: you're actually gonna be pregnant if the test tells you you're gonna be pregnant

[Unknown1]: if you've never encountered the situation and the salient information that's given

[Unknown1]: to you says you know seventy percent of women who are pregnant get

[Unknown1]: a positive result you're going

[Unknown1]: to think yeah that's an absolutely accurate test i'm gonna get a pregnant result

[Unknown2]: wa yeah that

[Unknown1]: and i'm gonna be pregnant like or it's not it's gonna say i'm not pregnant and i'm

[Unknown1]: not pregnant and we take that result at face value and we ignore the other context

[Unknown1]: of those um those conditional statements of like maybe you are but you're really

[Unknown1]: not you know the test says one thing but in reality it's something else

[Unknown2]: hm

[Unknown1]: so these are this is where the value selection bias comes in is that you're given

[Unknown1]: a piece of salient information

[Unknown1]: and that decision is based solely on that information ignoring the rest of the

[Unknown1]: context
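
[Note: the "rest of the context" that gets ignored is exactly what Bayes' theorem forces you to use. A minimal sketch: the 70% sensitivity figure comes from the conversation above, while the 10% base rate and 5% false-positive rate are invented purely for illustration.]

```python
# The salient "70%" answers P(positive | pregnant); the question actually being
# asked is P(pregnant | positive). Bayes' theorem connects the two.
def posterior(prior, sensitivity, false_positive_rate):
    """P(pregnant | positive test)."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Assumed numbers for illustration: 10% base rate, 5% false-positive rate.
print(posterior(prior=0.10, sensitivity=0.70, false_positive_rate=0.05))
# ~0.61 -- noticeably less conclusive than the salient 70%, and it drops
# further as the base rate falls.
```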

[Unknown2]: kind of oversimplifying situations to the point where it's good

[Unknown2]: enough you know

[Unknown1]: yes

[Unknown2]: seventy percent that's good enough

[Unknown1]: it's good enough and it it brings to mind a question of you know how much

[Unknown1]: information is enough to make

[Unknown2]: hm

[Unknown1]: an informed decision

[Unknown2]: yep

[Unknown1]: do you need every single piece of data to make an informed decision or

[Unknown2]: why do you think

[Unknown1]: is there like an mvp of data knowledge you know like pulling an industry term in

[Unknown1]: here you know

[Unknown2]: yeah yeah

[Unknown1]: but it it's an interesting question of you know if we do

[Unknown1]: rely on surface level information in situations where we're not familiar with

[Unknown1]: everything that goes into it

[Unknown1]: how

[Unknown2]: i

[Unknown1]: bad is that and it can be really bad and it's really bad in the case of things

[Unknown2]: no

[Unknown1]: like

[Unknown1]: um disinformation campaigns and misinformation and malinformation and when we get

[Unknown1]: these kind of superficial debates on social media bringing it back around to this

[Unknown2]: yeah sir sir

[Unknown1]: topic um

[Unknown1]: we we've never encountered this before you know social

[Unknown2]: right

[Unknown1]: media is still so new in the history of humanity we had like five

[Unknown2]: five

[Unknown1]: to ten to twenty years of this and that's it and so a lot of these problems that

[Unknown1]: we're encountering are still very novel to us

[Unknown2]: yes

[Unknown1]: and we don't know how to think about what's not there we can't know what we don't

[Unknown1]: know

[Unknown1]: and so this kind of superficial decision making where we rely on what is presented

[Unknown1]: to us is a serious problem

[Unknown2]: yes

[Unknown1]: something we should be aware of

[Unknown2]: yeah i it's something i kind of about once a month i just stop and think about it

[Unknown2]: and i think it's a good thing to think about that it's all encompassing now and

[Unknown2]: it's everywhere now but ten years ago social media didn't really exist

[Unknown1]: yeah

[Unknown2]: like it was like a weird thing on the internet you know

[Unknown1]: it was

[Unknown2]: it's like it's like i i

[Unknown1]: yeah we had we had yahoo chat rooms and icq

[Unknown2]: and so what's interesting is people are talking about ai and how it's going

[Unknown2]: to be important

[Unknown1]: can

[Unknown2]: but i don't think people think of social media that way because they think of it as

[Unknown2]: a hobby an entertainment a way to blow off steam they don't take it seriously and

[Unknown2]: i don't think they realize that that is the revolution right like

[Unknown1]: but there's also

[Unknown2]: this is like we are literally manipulating people's brains because that's like

[Unknown1]: yeah

[Unknown2]: it's rewiring people's brains like when you look

[Unknown1]: it is

[Unknown2]: at um are you familiar with nicholas carr's uh the shallows

[Unknown1]: hmm

[Unknown2]: okay so he like they did a study where they took people i don't know where they

[Unknown2]: found these people but this is impressive they found people who didn't use google

[Unknown1]: oh

[Unknown2]: and they did and they had them use google

[Unknown1]: uhuh

[Unknown2]: and they had people who regularly use google and they did

[Unknown2]: it's an fmri uh brain imaging

[Unknown1]: okay

[Unknown2]: while they were doing it and the people whose brains were accustomed using google

[Unknown2]: the place where they used it was like ah the that part of the brain was like twice

[Unknown2]: the size or something like that

[Unknown1]: oh wow

[Unknown2]: uh and it was like the it like the area that was lighting up was like twice the

[Unknown2]: size something like that

[Unknown2]: i don't remember exactly how it goes he's talking about how like neuroplasticity

[Unknown2]: is something we think is only for kids it's actually for adults too

[Unknown1]: it is

[Unknown2]: they repeated the experiment after having the people who hadn't used google and

[Unknown2]: this is not social media this is just google search

[Unknown1]: just google

[Unknown2]: two weeks

[Unknown1]: yep

[Unknown2]: and they had the same

[Unknown2]: they had the same brain function that's all it took to rewire their brain

[Unknown2]: i'm like and google search is in no way as powerful as social media i'm like i can't

[Unknown1]: well oh oh i will actually i will

[Unknown2]: imagine well maybe not

[Unknown2]: okay

[Unknown1]: argue you on that point because

[Unknown2]: okay yeah

[Unknown1]: when you think about google it

[Unknown2]: yes

[Unknown1]: is an ai driven it's an

[Unknown1]: algorithm and it learns what you prefer based

[Unknown2]: yeah yes

[Unknown2]: right

[Unknown1]: on your personal search history

[Unknown2]: yes

[Unknown1]: so not only when you tell someone to go google it

[Unknown2]: yes

[Unknown1]: you're telling them to go confirm their own biases because the way the algorithm

[Unknown1]: is generating these results is based on your previous search history so

[Unknown2]: ah

[Unknown1]: you're not going to find things that go against your beliefs because the algorithm

[Unknown1]: is designed to confirm your beliefs so telling someone to go do your research on

[Unknown1]: google is not it's not doing your research it's not like stop using that phrase
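
[Note: below is a toy feedback-loop simulation, not Google's actual system. It shows how a ranker that learns only from clicks drifts toward showing whatever the user already agrees with; the 70/30 click rates are assumptions for illustration.]

```python
import random

# Ranking weights for two kinds of results; the ranker learns from clicks only.
weights = {"confirming": 1.0, "challenging": 1.0}

for _ in range(1000):
    # Show a confirming result with probability proportional to its weight.
    p_conf = weights["confirming"] / (weights["confirming"] + weights["challenging"])
    shown = "confirming" if random.random() < p_conf else "challenging"
    # Assumed user model: clicks confirming results 70% of the time, 30% otherwise.
    if random.random() < (0.7 if shown == "confirming" else 0.3):
        weights[shown] += 1  # reinforcement: a click makes that type rank higher

print(weights)  # "confirming" comes to dominate what gets shown
```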

[Unknown2]: oh man

[Unknown2]: oh yeah

[Unknown2]: oh yeah there's yeah it's so interesting

[Unknown2]: that brings up a whole nother layer of things i had marisa zalabak

[Unknown2]: she's on the

[Unknown2]: board for ai ethics for IEEE i

[Unknown1]: okay

[Unknown2]: had her on and um

[Unknown2]: uh she said you know we were talking about i was like what are some more resources

[Unknown2]: you know for learning how to deal with ethics and ai she's like well i mean just

[Unknown2]: start by googling it and i'm like seems like there's

[Unknown1]: oh

[Unknown2]: see i was like i literally like in the middle of it was just like doesn't

[Unknown2]: that seem like there's a compromised interest what's the word conflict of

[Unknown2]: interest here i mean

[Unknown1]: cons like the address yeah give

[Unknown2]: i'm like i'm pretty sure google's gonna have its own say on uh ai ethics

[Unknown1]: that's true

[Unknown2]: but yeah really that's really fascinating i i think what i meant is the dopamine

[Unknown2]: effect of google versus social but i could be wrong in that too that's just my

[Unknown2]: that's just my intuition which as we know

[Unknown2]: could definitely be right or wrong right so um when i said that about that yeah i

[Unknown2]: know like google's algorithm i mean

[Unknown2]: just the sheer amount of information i think one of the things and you know you

[Unknown2]: mentioned academia coming along at a dinosaur's pace like media critical skills uh

[Unknown2]: media

[Unknown1]: yeah

[Unknown2]: critical thinking skills need to very

[Unknown1]: yeah

[Unknown2]: quickly become one of the most important things

[Unknown1]: yeah

[Unknown2]: like whereas for a long time we were it was about teaching people things now it's

[Unknown2]: about how to learn and how

[Unknown1]: yes

[Unknown2]: to critically think is going to be more and more important because at this point i

[Unknown2]: mean i home school my son and like we do like a science curriculum but to be

[Unknown2]: honest it's nothing compared to like he just asks me questions and whereas in the

[Unknown2]: past we'd have to like i remember getting like opening an encyclopedia now

[Unknown1]: yeah

[Unknown2]: he just like he's just like hey dad can you youtube this and

[Unknown1]: yo

[Unknown2]: it's like there's like eight different documentaries that he can immediately watch

[Unknown2]: on whatever it is and he comes back and he's like squids have blue blood and they

[Unknown2]: have teeth in their arms i'm like well that's gonna give me nightmares but thanks

[Unknown2]: um not that you i'm sure

[Unknown1]: yeah

[Unknown2]: you don't experience any of that yeah

[Unknown2]: but

[Unknown2]: sorry i you know again i'm having a great time this is so fascinating to me

[Unknown1]: yeah

[Unknown2]: is there something that you wanted to share i feel like you were you were about to

[Unknown2]: say something

[Unknown1]: yeah so you brought up the idea of media literacy and this

[Unknown2]: he

[Unknown1]: is actually the the next book i'm working on is the different types of literacy we

[Unknown1]: need to actually

[Unknown2]: see

[Unknown1]: be informed decision makers in the modern day and you know media literacy is one

[Unknown1]: of them you

[Unknown2]: yeah

[Unknown1]: know we have in the u s we have a curriculum focused predominantly around stem

[Unknown1]: english and writing skills and

[Unknown2]: and

[Unknown1]: in a

[Unknown2]: water

[Unknown1]: modern day we need not only those regular you know kind of core literacy we need

[Unknown1]: additional literacies like statistical literacy where data is being manipulated

[Unknown1]: not only by news agencies but political organizations health

[Unknown2]: oh

[Unknown1]: care organizations we need computer literacy

[Unknown2]: yeah

[Unknown1]: how much time of the day do we spend on computers you know and the pandemic threw

[Unknown1]: this into such stark relief that

[Unknown2]: he

[Unknown1]: computer literacy skills are lagging and there are some cases where our kids know

[Unknown1]: more about computers than we do as fully grown adults you know and

[Unknown2]: yeah

[Unknown1]: i grew up in the eighties growing up with computers and

[Unknown2]: i know

[Unknown1]: there are things my son can do on his computer that i just look at him and go

[Unknown1]: what did you do you know and i worked at microsoft like

[Unknown2]: yeah

[Unknown1]: i should know you know but like there's computer literacy

[Unknown2]: yeah

[Unknown1]: there's media literacy how do we understand

[Unknown1]: what

[Unknown2]: what did you

[Unknown1]: is controlling our media who has a say in it why

[Unknown2]: one thousand two

[Unknown1]: has it become this clickbait type of news

[Unknown2]: mm

[Unknown1]: outlet

[Unknown1]: one of the questions i didn't bring up earlier when you asked about what should

[Unknown1]: people be asking is

[Unknown2]: yeah

[Unknown1]: who's paying for this

[Unknown2]: oh yeah

[Unknown1]: and this is so important in media literacy when you think about the different news

[Unknown1]: organizations that we have and you start to go up the hierarchy of who owns these

[Unknown1]: companies

[Unknown2]: the different yes

[Unknown1]: different news outlets and so these are all

[Unknown1]: types of literacy that we need to understand and we need to understand that not

[Unknown1]: only are all of these literacies required we need to understand that as

[Unknown2]: i

[Unknown1]: a language

[Unknown2]: hm

[Unknown1]: science and scientific terminology means something very different

[Unknown1]: than it does to someone who's not trained in science and so there's all these

[Unknown1]: issues going forward that we really need to take a strong look at our curriculum

[Unknown1]: and how we teach people these different things because it's not just

[Unknown1]: stem and english anymore it's so much more

[Unknown2]: and even stem it doesn't really provide literacy right i mean that's like i

[Unknown1]: no it does it gives you the language base it teaches you

[Unknown1]: the words but like the way i use the word hypothesis in doing my research and

[Unknown2]: yeah

[Unknown1]: having my assumptions and biases is going to be very different from someone on

[Unknown1]: facebook saying well go do your research you know these are two very different

[Unknown1]: things

[Unknown2]: yeah yeah i just had

[Unknown2]: uh doctor chris alfa and he was talking about how science grows and one of

[Unknown2]: the things that just got highlighted

[Unknown1]: sorry

[Unknown2]: and this is one of the things you know

[Unknown2]: covid and the way that people just don't understand how science works even like

[Unknown2]: and i know it's a surprise but even politicians not understanding how science works

[Unknown2]: you know even going

[Unknown1]: yeah

[Unknown2]: back to like

[Unknown2]: our world is rapidly becoming more complex and we have not been given the

[Unknown2]: skills to deal with it like that was so evident even like we were talking about

[Unknown2]: facebook and the facebook hearings listening to the congressman ask those

[Unknown2]: questions to mark zuckerberg

[Unknown2]: you're like so like how do you make money it's like surely surely you could have

[Unknown2]: been briefed on that like

[Unknown1]: yeah

[Unknown2]: yeah but again and this is really fascinating you mentioned this your son and

[Unknown2]: you're very computer literate

[Unknown1]: mm hm

[Unknown2]: i'm yeah

[Unknown1]: yeah

[Unknown2]: your son does things that you don't understand on the computer and this

[Unknown1]: yep

[Unknown2]: is something where so now we have these politicians who someone mentioned this

[Unknown2]: we've only had one president who uh went to a non segregated school

[Unknown2]: which is a really

[Unknown2]: crazy thing to think about for a second yeah and when you think about the changes

[Unknown2]: that have happened

[Unknown2]: in this short amount of time it's like

[Unknown2]: you have these politicians who have risen to the top over decades of experience

[Unknown2]: and then they get to the top and then they don't have

[Unknown2]: the i mean they've worked so hard and given so much to get to the top

[Unknown2]: and all the skills that got them to the top have not prepared them for the

[Unknown2]: decisions that they have to deal with

[Unknown1]: yeah

[Unknown2]: and that's that's a charitable reading of it there's also some other less

[Unknown2]: charitable ways of thinking about it but surely that's some of it right like

[Unknown1]: right

[Unknown2]: we all have grandparents or well not all of us some people have grandparents

[Unknown2]: who are in their sixties seventies eighties

[Unknown1]: yeah

[Unknown2]: and when they face technology

[Unknown2]: that's a different road than someone who grew up in the eighties with computers

[Unknown2]: like that's just i

[Unknown2]: it's a different thing

[Unknown2]: i'm obviously talking too much all right i want to make sure that

[Unknown2]: i am asking you the right question one of the questions i

[Unknown2]: really want to ask is what are the most common ethical pitfalls you encounter in

[Unknown2]: industry research

[Unknown1]: alright

[Unknown1]: so i

[Unknown2]: if that doesn't get you in trouble i realize i'm

[Unknown2]: not

[Unknown1]: yeah no so i did a lot of healthcare research and health outcomes

[Unknown1]: research and you know how do

[Unknown1]: people make decisions using data when it comes to their personal health and this

[Unknown2]: hm

[Unknown1]: is something i am

[Unknown1]: super outspoken about is

[Unknown2]: okay yeah

[Unknown1]: that i'll give you an example to frame it up for you so everyone knows the most

[Unknown1]: common heart attack symptoms they are tightness in your

[Unknown1]: chest pain shooting down your

[Unknown1]: arms fatigue you know those kinds of symptoms but usually when you get that sharp

[Unknown1]: pain in your chest that's like your big indicator that you're gonna have a heart

[Unknown1]: attack right

[Unknown1]: no

[Unknown1]: so this is true

[Unknown1]: for men all right and this is the problem with health care research is that we

[Unknown1]: have had so much of the research done using what i refer to as the default male

[Unknown2]: yeah

[Unknown1]: expecting that everyone is going to conform to the default male but when you start

[Unknown1]: to look at women and see how much

[Unknown1]: heart attacks are underreported in women and how they actually pass away more

[Unknown1]: often because the symptoms are not the same symptoms for women are things like

[Unknown1]: indigestion and nausea and tiredness and full body aches

[Unknown1]: sounds like a stomach flu to me right you

[Unknown2]: right right

[Unknown1]: know but that's that's a symptom of a heart attack for a woman

[Unknown2]: no

[Unknown1]: and so these are gender differences in health care that are just completely swept

[Unknown1]: away and now it's starting to gain steam and we're starting to get some exposure to

[Unknown1]: these things

[Unknown1]: and that's

[Unknown1]: just one example of it so i actually pulled up something because i wanted to talk

[Unknown1]: to you about this

[Unknown2]: sure yeah

[Unknown1]: is uh

[Unknown1]: give me one second

[Unknown2]: let me actually if i could just add and i could be wrong in this i should you know

[Unknown2]: i won't say just google it right but i am pretty sure like a good example of

[Unknown2]: this was when they first brought out air bags they

[Unknown1]: yes

[Unknown2]: tested them on the average male weight and then they

[Unknown1]: yeah

[Unknown2]: put kids up there and

[Unknown1]: yep

[Unknown2]: kids died right because they didn't think about

[Unknown1]: yeah yes and you put women up there and women die

[Unknown2]: yeah yeah

[Unknown1]: because the height difference is there

[Unknown2]: yes yes

[Unknown1]: it's the same exact thing

[Unknown2]: yes

[Unknown1]: and so you know women have an entirely different set of medical issues and this is just

[Unknown1]: one example

[Unknown2]: mm right right

[Unknown1]: and medicine has used the default male for so long that most of our medical

[Unknown1]: findings are based on men and so this is an issue when you start to think

[Unknown1]: about like women's health issues you know women have an entire reproductive system

[Unknown1]: that men don't have and it causes problems and women are just told like it's

[Unknown1]: normal to be in pain a week a month for the majority of their life

[Unknown1]: if i ever went to you as a man and said like you're gonna be in pain a week a

[Unknown1]: month for the rest of your life you would look at me like i'm crazy like

[Unknown1]: it's not normal so why are women expected to deal with this pain and so okay so there's

[Unknown1]: gender differences

[Unknown2]: sure

[Unknown1]: now let's think about cultural differences and

[Unknown2]: sure

[Unknown1]: think about this in terms of mental health problems so in the u s we treat mental

[Unknown1]: health problems as a failing of an individual and

[Unknown2]: hm

[Unknown1]: we don't treat it as a neurochemical imbalance in their brain

[Unknown1]: it is a

[Unknown1]: physiological problem that we have you know it's a neurochemical problem

[Unknown1]: but that's in the us when we go to other cultures that are more collectivist in

[Unknown1]: nature that operate on a familial model

[Unknown1]: and you care about your local community and you care about your society they don't

[Unknown1]: treat it as an individual problem they treat it as a community problem that this

[Unknown1]: person needs more support and more help and maybe we need to change the way we're

[Unknown1]: doing things and so

[Unknown1]: we miss this entire cultural separation that is caused by being individualistic

[Unknown1]: versus collectivist or uh community minded in some ways

[Unknown2]: hm

[Unknown1]: and so

[Unknown1]: yeah

[Unknown2]: yeah no it's really important i don't think people understand

[Unknown2]: generally speaking how often this stuff just happens right it's

[Unknown1]: yeah

[Unknown2]: because it gets masked behind well the statistics say and people never

[Unknown2]: feel comfortable arguing with statistics and it's actually

[Unknown1]: yeah

[Unknown2]: very easy to

[Unknown1]: it is

[Unknown2]: argue with statistics

[Unknown1]: it is you know mark twain

[Unknown2]: so when you're faced with good yes

[Unknown1]: oh

[Unknown2]: i love this quote yes

[Unknown1]: so but here's the thing

[Unknown1]: the mark twain quote gets cut off it says that

[Unknown2]: oh please say it i love it it's so good

[Unknown2]: okay

[Unknown1]: there are lies damned lies and statistics and that's the part that gets quoted

[Unknown2]: yes

[Unknown1]: but let me pull out the actual full quote it says figures often beguile me

[Unknown1]: particularly when i have the arranging of them myself in which case the remark

[Unknown1]: attributed to disraeli would often apply with justice and force there are three

[Unknown1]: kinds of lies lies damned lies and statistics and the thing that sticks out to me

[Unknown1]: about this

[Unknown2]: hm

[Unknown1]: is he is actually talking about how information is presented

[Unknown1]: and how it can be so easily manipulated

[Unknown2]: yes

[Unknown1]: when you have no clue what goes into these statistics

[Unknown2]: yes

[Unknown1]: and that is my argument about data ethics is that you need to know these things

[Unknown1]: you need to know what goes into it and you need to know how manipulation happens

[Unknown1]: in the presentation of this

[Unknown2]: yes so what are some questions as we kind of wrap up here i want to be

[Unknown2]: respectful of your time um so thank you for your patience as we kind of walk

[Unknown2]: through this

[Unknown1]: this has been a great conversation

[Unknown2]: what are some great questions yeah no i've had a great time thank you

[Unknown2]: what are good questions to ask when you're faced with that blanket well

[Unknown2]: seventy percent of men do this or

[Unknown1]: yep

[Unknown2]: eighty percent of everybody does this and it's like

[Unknown2]: oh what are the

[Unknown2]: questions we need to ask

[Unknown1]: yeah now

[Unknown2]: what are the questions that help you pierce that veil and say oh this is actually

[Unknown2]: a good statistic or you

[Unknown1]: yeah

[Unknown2]: know there's a little bit more nuance here

[Unknown1]: yeah like the seventy percent of statistics are made up on the spot

[Unknown2]: yes

[Unknown1]: you know

[Unknown2]: the classic yeah

[Unknown1]: it's a classic it's true it's true you know there are statistics made up on the

[Unknown1]: spot all the time so the question

[Unknown1]: i ask is where did this data come from

[Unknown1]: who generated the data are they trustworthy

[Unknown2]: hm

[Unknown1]: who's paying for the data is

[Unknown1]: there a purpose to this data what is that purpose and again going back to that am

[Unknown1]: i having an emotional reaction to this data

[Unknown2]: mm

[Unknown1]: and you'll see

[Unknown2]: interesting

[Unknown1]: this a lot in graphical presentations of data so everyone's familiar with a

[Unknown1]: pie chart that doesn't add up to a hundred percent

[Unknown1]: or you know maybe there's a graph

[Unknown2]: wow

[Unknown1]: where your y

[Unknown1]: axis isn't shown so you have no idea if the increase or decrease shown in your

[Unknown1]: graph is big or small

[Unknown2]: right

[Unknown1]: or maybe you have data on two different scales presented on the same line

[Unknown1]: with no

[Unknown1]: differentiation of how those scales line up or why they are lined up the way they

[Unknown1]: are you know there is always the question of what is the purpose of this graphic

[Unknown1]: and what message is it conveying to me

[Unknown1]: and really again going back and taking the time and being

[Unknown1]: reflective and going is this confirming my bias
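
The truncated y-axis trick described above is easy to see for yourself. Here is a minimal sketch in Python with matplotlib; the data, labels, and axis limits are all invented for illustration, not taken from the episode. The same two numbers are plotted twice, and only the y-axis limits change.

```python
# Hypothetical illustration of the truncated y-axis trick discussed above.
# The data and labels are made up; only the axis limits differ between panels.
import matplotlib.pyplot as plt

groups = ["last year", "this year"]
values = [50.0, 51.5]  # roughly a 3% difference

fig, (ax_honest, ax_misleading) = plt.subplots(1, 2, figsize=(8, 3))

# Honest panel: the axis starts at zero, so the bars look nearly identical.
ax_honest.bar(groups, values)
ax_honest.set_ylim(0, 60)
ax_honest.set_title("y-axis starts at 0")

# Misleading panel: the axis starts just below the smaller value,
# so the same 1.5-point gap fills almost the entire plot.
ax_misleading.bar(groups, values)
ax_misleading.set_ylim(49.5, 52)
ax_misleading.set_title("y-axis starts at 49.5")

plt.tight_layout()
plt.show()
```

Same bars, same data; the second panel simply hides most of the axis, which is exactly the kind of presentation choice the questions above are meant to catch.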

[Unknown2]: hm

[Unknown2]: yeah now that's really good i uh are you familiar with storytelling with data the

[Unknown2]: book

[Unknown1]: i believe i have it on my shelf actually

[Unknown2]: do you think is that a good uh resource obviously until your book comes out right

[Unknown2]: that'll be the definitive work

[Unknown2]: but yes

[Unknown2]: uh

[Unknown2]: but until then because that's what comes to mind um

[Unknown2]: uh i think and then there's a book called

[Unknown2]: how to lie with statistics it's like a very small book that's uh like red

[Unknown2]: and white stripes on the cover whatever i bought it on amazon just trying to think of

[Unknown2]: where people could find more information on this besides just telling them you

[Unknown2]: know go google it

[Unknown1]: yeah

[Unknown1]: go google it

[Unknown2]: what are some what are some good resources those are the two that come to mind

[Unknown2]: from me but what are

[Unknown2]: good resources for someone who wants to dig deeper into this until your book comes

[Unknown2]: out

[Unknown1]: so there are some really great books by daniel kahneman that he's written

[Unknown1]: there's a book called noise

[Unknown1]: um and then

[Unknown2]: yes

[Unknown1]: it just came out recently it's a great book

[Unknown2]: you talking about

[Unknown1]: about how we make decisions based on information

[Unknown2]: yeah

[Unknown1]: he also wrote another book that really helps you learn the difference between

[Unknown1]: what's called type one and type two reasoning or

[Unknown2]: hm

[Unknown1]: thinking fast and slow which is the title of the book

[Unknown1]: and it's how we make decisions using very

[Unknown1]: fast quick thinking that's kind of automated in a lot of senses versus that slower

[Unknown1]: type two thinking where we deliberate and we try to make you know

[Unknown1]: very informed decisions and so those

[Unknown2]: hmm

[Unknown1]: two books are phenomenal for that

[Unknown2]: gotcha

[Unknown1]: then there's another book by gerd gigerenzer and it's how people use data to make

[Unknown1]: decisions specifically in health care contexts and i cannot

[Unknown1]: think of the title of that one off the top of my head

[Unknown2]: gerd

[Unknown1]: gigerenzer g i g

[Unknown2]: what's the last part gigerenzer

[Unknown1]: e r e n z e r

[Unknown2]: i'm sure if i put that into google google will figure out what i want so

[Unknown2]: that's

[Unknown1]: yeah

[Unknown2]: awesome well i just want to say thank you so much i think this has been really

[Unknown2]: helpful is there any last thoughts that you want to leave our listeners with

[Unknown2]: before we end today

[Unknown1]: ah yes so there are

[Unknown1]: three tenets to scientific thinking and these

[Unknown1]: are things that i try to live by the first is that you

[Unknown1]: question everything

[Unknown1]: including experts like you know

[Unknown2]: hm

[Unknown1]: we're human we have our own problems and we have our own lens that we bring to the

[Unknown1]: table question

[Unknown1]: us you know but also be willing to accept answers when we bring you evidence

[Unknown1]: but question everything um

[Unknown2]: sure

[Unknown1]: importantly question yourself and your assumptions and your biases

[Unknown1]: and also know that you can't know what you don't know and so there's always room

[Unknown1]: to learn even when

[Unknown2]: yeah

[Unknown1]: you are the expert in the room

[Unknown2]: right

[Unknown1]: um the second is maintain a healthy dose of open skepticism you know tempered by a

[Unknown1]: willingness to believe based on evidence and so update your beliefs when you are

[Unknown1]: presented with evidence that might go against what you want to believe and know that

[Unknown1]: your opinion is not the same thing as a fact these are different things

[Unknown2]: yes

[Unknown1]: and finally the third core part of that is practice intellectual honesty

[Unknown1]: and that

[Unknown1]: is simply saying it is okay to be wrong and

[Unknown1]: it is okay to admit when you are wrong it is not a failing to be wrong it's

[Unknown1]: actually how we learn and we can't learn if we're never wrong you

[Unknown2]: yeah

[Unknown1]: know so be okay

[Unknown1]: being wrong sometimes and be willing to adjust and change and learn

[Unknown2]: yeah i think you're saying that very graciously you know it's okay i mean i think

[Unknown2]: it's important to be wrong sometimes like if

[Unknown1]: yeah

[Unknown2]: you're never wrong then you're probably doing something wrong

[Unknown2]: like that means you're never learning and so i love that i

[Unknown2]: think that really coincides with the mission of what i'm trying to do with this

[Unknown2]: podcast so again thank you for coming on it's been a real pleasure

[Unknown1]: thank you so much for having me
