Chasing Leviathan

PJ is joined by AI ethicist Marisa Zalabak. Together, they discuss the emerging world of AI and its consequences.

Show Notes

In this episode of the Chasing Leviathan podcast, PJ and Marisa Zalabak discuss the complex ethics of artificial intelligence and why, despite the potential hazards of AI, Marisa remains a dedicated optimist regarding the emerging future of humans and our increasingly self-aware technologies. 

For a deep dive into Marisa Zalabak's work and IEEE, check out these sources: 
Open Channel Culture 👉 https://openchannelculture.com/ 
Institute of Electrical and Electronics Engineers 👉 https://www.ieee.org/

Check out our blog on www.candidgoatproductions.com 

Who thinks that they can subdue Leviathan? Strength resides in its neck; dismay goes before it. When it rises up, the mighty are terrified. Nothing on earth is its equal. It is without fear. It looks down on all who are haughty; it is king over all who are proud. 

These words inspired PJ Wehry to create Chasing Leviathan. Chasing Leviathan was born out of two ideals: that truth is worth pursuing but will never be subjugated, and the discipline of listening is one of the most important habits anyone can develop. 

Every episode is a dialogue, a journey into the depths of a meaningful question explored through the lens of personal experience or professional expertise.

[Unknown2]: Hello, and welcome to Chasing Leviathan. I'm your host, PJ Wehry, and I find myself here today with Marisa Zalabak, founder of Open Channel Culture. She is an educational psychologist, adaptive leadership coach, TEDx and keynote speaker, researcher, author, facilitator, and AI ethicist. As a coach and consultant for adaptive leadership and organizational culture, she focuses on human potential, sustainability, and essential skills for the emerging future. Marisa currently serves as a co-chair of a committee expanding global AI ethics education with IEEE.org, the world's largest professional organization advancing technology for humanity. She is a contributing author of IEEE's recommended standards for the ethical design of artificial intelligence, as well as a co-author of proposed approaches for transdisciplinary collaboration in the development and mobilization of ethical AI. She's a member of MIT u.lab with the Presencing Institute's co-creation innovation labs, building capacity in programs and action research worldwide to support societal innovation, and she also serves on multiple global leadership teams for the advancement of the UN Sustainable Development Goals and peacemaking. You can learn more about her work at openchannelculture.com, and she can be reached personally through the site as well. Marisa, thank you so much for coming on today.

[Unknown1]: Welcome. That was a lot to chew on, PJ.

[Unknown2]: That's fine, I love it. I mean, it's amazing, the breadth of what you do.

[Unknown2]: And so today, obviously, there's a lot we could talk about, and one of the things we discussed when we had a kind of pre-podcast phone call was what topic to cover. We kind of both came to the conclusion that... for some reason the title Clear and Present Danger kept coming up in my mind, but it's not necessarily a danger, right? But what are the clearest ethical dilemmas and problems that we are facing in the AI world today?

[Unknown1]: Yeah, well, there could be a lot. One of the things I really want to remind myself, as often as I can, to keep circling back to, is that this is coming with something that is very exciting and wonderful, right?

[Unknown2]: Yeah.

[Unknown1]: When I first got involved... it's I Triple E, by the way, which, you know, I had no clue about either when I started.

[Unknown2]: That makes my life so much easier. I think I said four E's the first time.

[Unknown1]: Right, but it's all good, it's all good. The thing is, I didn't know about it either, because I'm not an engineer. It's an engineering organization.

[Unknown2]: I have seen it before, I just never talked to anyone about it. But yes, go ahead.

[Unknown1]: Well, the thing is, it's a huge organization, right? It's the largest professional organization of engineers in the world, and they do multiple things. One of the things that I got recruited for and became part of was a committee developing recommended standards for the design of AI related to human well-being. So, for example, if I'm going to have a flying car, which has its benefits, right... I mean, we can talk about the downsides, but if it has benefits...

[Unknown2]: Yeah, yeah.

[Unknown1]: What are the ethical considerations that we have really taken into account? What are the considerations, what are the legal ramifications? One of my favorite phrases when I began working on this stuff was "intended and unintended users."

[Unknown2]: Right... oh, users. Okay, that was not where I thought that was going, but yeah.

[Unknown1]: Right, intended and unintended users. So, you know, recently we've had these situations. I mean, I don't want to point to Tesla, but it's just an example.

[Unknown2]: Sure.

[Unknown1]: Recently we've got an example where the self-driving capabilities have been found not to be stopping for stop lines, stop signs, as much as they need to be.

[Unknown2]: Sure.

[Unknown1]: Well, one would say they would need to always. So the thing is, the intended user, who's the person driving that car, or even the passengers, you could say is arguably an intended user. But if it's children riding in the car, they are not intended users. They were brought into the car by someone else.

[Unknown2]: Right.

[Unknown1]: And the person who might be harmed by the car going through a stop sign becomes, in a sense, an unintended victim.

[Unknown2]: Yes.

[Unknown1]: Right. And then there are people who are involved in it in all kinds of other ways, in peripheral ways. So it can be exhaustive, and it is exhaustive, to go through it. There's an iterative process, a repeating process, that a person considers when designing an autonomous machine. Because that's really a more appropriate term: autonomous intelligent systems, AIS. Most people don't use that term, but it really is kind of a more accurate term.

[Unknown2]: Well, I was going to ask you this, and I'd love for you to take a quick detour, because I was going to ask: when we're talking about AI today, it's important to differentiate. I've done a little bit of machine learning, I do code. I was looking at doing machine learning and I realized I didn't have time between homeschooling my kids and running a digital marketing agency, but I do know a little bit about the different kinds. So we're not talking about, like, general intelligence. That's not something that you're focused on, right?

[Unknown1]: No, and in fact, well, it's something we will eventually be, because even with general intelligence, which we're going to talk about, there is design that's coming into place right now. And once we get into general intelligence, then we really are talking about autonomous systems.

[Unknown2]: Right, right, yeah. So for our users, general users... well, I'm already lapsing into code-speak. No, for our listeners: general intelligence is that idea of what most people think of when they think of AI, which is the talking robot that passes the Turing test and can do all the human things.

[Unknown1]: Exactly.

[Unknown2]: One of the more disturbing iterations, right? But yes, I, Robot, all those kinds of considerations. Whereas honestly, AI is present in our lives right now, at that machine learning level, at this autonomous, you know, level. So I just wanted to clarify that so people knew what we were talking about: yes, phones, Netflix. I mean, every time you get a recommendation, there's not some person sitting behind a screen somewhere. That's AI. And think about how often we get recommendations now. So yeah, just to give people a little bit of that scope.

[Unknown1]: Right, and I think the film The Social Dilemma did an extraordinarily good job of really revealing what has happened, right, by people's engagement, by our engagement, with artificial... with autonomous intelligence systems. Facebook and Instagram. By our engagement, we are engaging with it. So people go, "I don't have anything to do with AI," and you're like, "Well, yeah, you do. Are you using your GPS? Yeah, you do." Right? So the thing is, it's the degrees to which. And one of the reasons I find myself in a particularly good position... I don't know if "good" is the word, but an important position for myself in this, is that I was not somebody who did machine learning or coding or any of those things. I was somebody who, with AI, was like, "Really?" You know. And it's relatively new for me. I ended up being in the right place at the right time and asking the right questions.

[Unknown2]: Yeah, tell us a little bit about that. You told the story to me, but I think it's good for our listeners to understand it, and we'll move from there. But I think that story is very helpful.

[Unknown1]: Well, I ended up going to a conference on AI for social good, and there are tons... there are some amazing things. There are AI programs, AIS programs, that reduce human trafficking, that clean the oceans. There's something called the Crisis Hotline, which is posted in most schools across the country, in the U.S., and in fact has saved... I mean, when I first got introduced, they had saved something like thirty-six thousand lives, and that was several years ago already.

[Unknown2]: Wow.

[Unknown1]: So they are using AIS as a triage system for young people who are in distress. And it is brilliant. It's a beautiful marriage of the AIS system assisting human beings who will eventually get to these people, right, somebody who's texting in crisis. But if we're receiving hundreds and thousands of those a day, which we are, we can't staff that, not with enough qualified staff. So what they have done is they have a system that triages: these are the flags, and they know the flags in the tone of a text, in the words and letters used, and their accuracy is pretty high. What happens is that initial system flags who needs to be sent to the next level, and that's going to be another system, and then eventually, very quickly, it goes to a panel of human beings who are reading them and responding to them, and eventually asking the person, "Do you want help?", and they are able to then make that happen in real time. So to me, that's kind of one of the best demonstrations of AIS that I know, right?
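A minimal sketch, in Python, of the kind of tiered text-triage pipeline described above: messages are scored against flag phrases, and the score decides whether they go straight to the human panel, to a second automated pass, or to the standard queue. The phrases, weights, and thresholds here are invented for illustration and are not the real system's model.

```python
# Toy tiered triage for incoming crisis texts (illustrative only).
from dataclasses import dataclass

# Hypothetical flag phrases and weights; a real system would use a trained model.
FLAG_PHRASES = {"hurt myself": 3, "no way out": 3, "can't go on": 2, "alone": 1, "scared": 1}

@dataclass
class Message:
    sender_id: str
    text: str

def risk_score(msg: Message) -> int:
    """Sum the weights of any flag phrases found in the text."""
    text = msg.text.lower()
    return sum(w for phrase, w in FLAG_PHRASES.items() if phrase in text)

def triage(msg: Message) -> str:
    """First-level pass: route by score, escalating the riskiest messages fastest."""
    score = risk_score(msg)
    if score >= 3:
        return "human_panel"        # humans read, respond, and offer help in real time
    if score >= 1:
        return "second_level_pass"  # another automated look before humans
    return "standard_queue"

if __name__ == "__main__":
    # a1 -> human_panel, a2 -> standard_queue
    for m in (Message("a1", "I can't go on, I feel so alone"),
              Message("a2", "hey, what's up")):
        print(m.sender_id, triage(m))
```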

[Unknown2]: Yes.

[Unknown1]: Yeah, I mean, to me that's... and I'm watching this stuff and I'm seeing it. I was watching AI systems, this one that's amazing, about controlling sounds in a hospital so that you can heal, because, you know, all those horrible beeping sounds...

[Unknown2]: Oh, yeah.

[Unknown1]: ...to kind of ameliorate that. Anyway, those were great. The thing is, I said to the person sitting next to me, "What do you do?" And they were at I Triple E, and they were saying, "Well, we're working on AI ethics," and I said, "Well, whose ethics?"

[Unknown2]: Right.

[Unknown1]: That was the first question, and then I asked a series of questions. Because I asked those kinds of questions, that's how I got invited. That, plus the fact that I'm a psychologist, so I was coming from that perspective. I'm also an educational psychologist, and it's not to make that any smaller, but it's coming from the human-consideration perspective as well as a pedagogical one: how are we going to teach this, how are we going to communicate this? So that's how I got involved. But because I was not somebody who had a previous love affair with AI, I was able to provide questions that were very valuable in the process.

[Unknown2]: Right.

[Unknown1]: Um, yeah. I thought of AI as being like Dr. No, right? For anybody who's old enough to know what that is. It was, like, the evil villain who's running the big machine. So, yeah.

[Unknown2]: Kind of... I mean, I think a lot of people think of Skynet, right? That's the initial... yes, even as you're talking about an AIS handling a crisis hotline, I know some people would have that immediate reaction. You know, I had it: shouldn't they, if someone's in trouble, shouldn't they immediately deal with a human? But if they're texting one, they're not necessarily going to be able to tell the difference. And what people don't realize is, like, "Well, don't they miss some people?" If humans were handling the text messaging, to be honest, what would be more likely to happen is they would miss them, because they don't have time. But then the other side of it is that humans would miss flagging the correct ones too, right? Whereas you can continually get better with the AI and then pass on the problematic ones. Because the truth is, when dealing with any kind of open phone number like that, you're going to have crank calls, right? You're going to have kids who are just like, "Hey, what's up," you know, and it has a small conversation with them and then maybe figures something out. So I think just providing that kind of context matters, because a lot of times when we talk about the error... so for instance with Tesla, that stop sign thing is a big deal, and they're going to of course fix that. But the truth is, I can't tell you how much worse drivers have gotten in the last ten years. I think everyone knows, right? I mean, if I drive forty minutes, I guarantee you I will see at least two people with their heads down, looking at their phones, just driving on the highway. And, yeah, like the...

[Unknown1]: And in fact, yeah, well, it's an interesting thing... am I interrupting?

[Unknown2]: No, no, no.

[Unknown1]: It's an interesting thing because I just read this morning, I want to say... I forget if it was the Post or the Times. There are now studies coming out, you know, because there are many psychological studies of the effects of COVID, right, and the pandemic, but there has been a societal effect in that accidents, vehicle accidents, have risen exponentially. And one of the reasons they're attributing it to is that there is a social distancing. People aren't considering where somebody else is. People are not as socially... not that people were fantastically socially aware before, just to say, but they're less so, and there's a detachment, a societal detachment, that they are attributing it to. Because they've been looking at the reasons, right, like what was happening in that accident, and they're finding that that is one of the... it's a consequence. Which, in fact, might be a place that AIS could help.

[Unknown2]: Yeah, yeah.

[Unknown2]: Yeah... before we get too far, I did want to talk about this, because you mentioned things like The Social Dilemma, and you talk about unintended users, and I had just seen a study presented in a video, a YouTube video. I think it was the Washington Post, but I could be getting it wrong. A couple of reporters did a double-blind test on how fast you can get to graphic and hate-crime content on TikTok. And it's actually not very hard. What people don't understand is, you don't have time to go through every single video that's posted. You can't compete, right? It's too big. So they rely on users reporting and on some basic AI filters, and it's very difficult. Like, even people mess up humor; to expect an AI to understand humor is not going to work. And it was within a hundred videos, which sounds like a lot, but on TikTok that's one to two hours. You could get to brutal antisemitism, graphically violent videos, and Nazi symbolism.

[Unknown1]: Yeah.

[Unknown1]: Well, there's also... right, I mean, the thing is, it can become a vehicle for hate. It can become a vehicle for anything that's toxic, because, you know, people who are determined to do that are going to use any means that they can, right? And to me, it's a faster vehicle.

[Unknown2]: Right. Well, and sorry, that's actually... this is what I wanted to ask about, because I have experienced this. I have experienced a few things popping up, and I'm like, "Oh no." And what people don't realize is how much the algorithm on TikTok drives it. That's the AI part of it: if you watch something because it's interesting, the algorithm thinks you want to watch more of that, in a more extreme way.
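A toy model, in Python, of the feedback loop PJ is describing: the recommender reads watch time as interest and keeps serving the same topic at higher intensity. The topics, the intensity counter, and the update rule are invented for illustration; this is not any real platform's algorithm.

```python
# Toy engagement-driven recommender (illustrative only, not a real platform's system).
from collections import defaultdict

class ToyRecommender:
    def __init__(self):
        self.interest = defaultdict(float)  # accumulated "interest" per topic
        self.intensity = defaultdict(int)   # how extreme the next item on that topic is

    def record_watch(self, topic: str, watch_fraction: float) -> None:
        """Watching, even out of curiosity, is read as interest."""
        self.interest[topic] += watch_fraction
        if watch_fraction > 0.5:
            self.intensity[topic] += 1      # finished most of it? push something stronger

    def next_recommendation(self) -> str:
        topic = max(self.interest, key=self.interest.get)
        return f"{topic} (intensity level {self.intensity[topic]})"

feed = ToyRecommender()
for _ in range(3):                          # a curious viewer keeps watching
    feed.record_watch("confrontation", watch_fraction=0.9)
print(feed.next_recommendation())           # -> confrontation (intensity level 3)
```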

[Unknown2]: And so when you talk about unintended users... I apologize for the length of this question, I guess.

[Unknown1]: No, it's okay.

[Unknown2]: When you talk about unintended users, I think about how my oldest would look at something and be like, "Oh," but my youngest, anything aggressive on TV, he immediately gravitates to it. He's four. I mean, I don't even know where he comes up with some of this stuff, you know. He's watching, like, Doc McStuffins, and someone pushes somebody, and his instinct is not to watch the whole thing with the "we don't push" lesson, you know. He's like, "Oh, I push people." He's geared toward aggressiveness. And I know that when we're talking about kids, which TikTok is, in a lot of ways, marketed at, children, that curiosity... I mean, as an adult, we know not to watch graphically violent stuff because of what it does to us, or we should know, but we assume responsibility for that. But for unintended users, how do you handle kind of that curiosity from unintended users?

[Unknown1]: Well, I think this is where the whole purpose of the education committee that we are working on now comes in. I mean, first we're focusing on engineers, because what most people don't realize... I didn't realize; I thought, of course engineers get AI ethics in school. They do not get AI ethics in school.

[Unknown2]: No.

[Unknown1]: They will, more and more, because it's coming in now, but it hasn't yet. So that's first. The second thing for the committee is really looking at the non-engineers, the ones who are not training in schools.

[Unknown2]: Yeah, right, making it a part of boot camps.

[Unknown1]: You've got a fourteen-year-old... right, and not to forget boot camps, but you've got a kid in the middle of the Sahara who figured out how to get on, who has now just developed three apps.

[Unknown2]: Yeah. That's how I learned to code. I didn't take a class, yeah.

[Unknown1]: There we go. So how are we reaching those burgeoning engineers, the self-educated? I mean, that's Steve Jobs. Steve Jobs didn't, right?

[Unknown2]: Right.

[Unknown1]: Although he had access before that. But there are many informally educated engineers. Then we reach business people who are producing it and mobilizing it, who are saying, "Oh, my IT people deal with that," and who are not, or "my legal team," who are not. And then we begin to go to daily users who are adults, and then K through twelve. I mean, it's a scaffold, and we have to start with the people who we know are wanting to design it, in schools. The thing is, and I'm going to just circle back to the users, or unintended users: there was another thing that just recently came up. There have been several cases, because now we're going to see more of them in the metaverse.

[Unknown2]: Yeah... oh.

[Unknown1]: Right, so the metaverse. What happened recently: there were two cases, one of which was a woman who was a beta tester for a game, and in the game somebody was sexually abusive to her, in the game. And while she was in the game, in her goggles, she was in the game with her husband and her brother-in-law, who were actually watching this, witnessing this happen, until they realized what was going on and everybody just got off.

[Unknown2]: Yeah.

[Unknown1]: And the response... this was, I think, from Facebook, I'm almost positive. I would have to double-check that; I don't want to be litigious. But the response from the producer of the game was, "Well, you know, she didn't check off all those safeties she was supposed to check off." She was the beta tester; she didn't check off all those safeties. So it's blaming the victim in this.

[Unknown2]: Right.

[Unknown1]: Right, blaming the target. I try to stay away from victim talk, so let's say the target. The thing is, then, just this week, I think it was this week, there was a case with an adult game, a very much adult-content game, whatever it is, and somebody went, "How old are you?" Three people showed up, and they went, "How old are you?" And it was three kids: one was six, one was five, and one was somewhere in that same age range. And thank goodness it was a responsible person who said, "What are you doing in here?", because they reported it. They asked, "What are you doing in here? Where did you get..." and the kid went, "I got my mom's goggles." Because, you know, the smart five-year-old, the smart six-year-old, is going to do that.

[Unknown2]: Oh, yeah. My son learned the combination to the iPad without asking me. I think he was, like, sneakily watching me, and he bought... he rented a couple of videos on Amazon Prime. So now we have to watch Lego Batman at some point.

[Unknown1]: Right. Well, and there are lots of consequences, and those are both very serious consequences, and then there are lighter consequences. But some of them feel like lighter consequences and are not so much. For example, and to me this is a big deal: we have very small children ordering an adult woman's voice around. "Alexa, get me that. Alexa, pick up my clothes. Alexa, open the door." Alexa, Siri... it's all women's voices. It's all adult women's voices.

[Unknown2]: That's the default, right? Yeah.

[Unknown1]: Right, and we don't think about that. The thing is, when we go, "Well, what can we do about that?", well, we are the consumer. We are the consumer. And as a business person, any business people listening, as a business person, I guarantee that there are several things in your business that you may use that you haven't thought about. And it's not saying blame, because who knew, right? Like, I didn't know. How would I know? I hadn't thought about that. Most people go, "I hadn't even thought about that." So it's not about blame. It's like bias, right? We all have bias, but once we learn about our bias, we go, "Oh, well, now I have a choice." I can either continue operating from that bias, or I can decide I'm going to do something else. And that's where people really... people have gotten complacent. There's a term for it... what is it? It's a paradox, a paradoxical term, something like "pleasure paradox." Anyway, I'll think of it in a second; my brain just went empty.

[Unknown2]: Go ahead and explain the concept. Maybe I can help you.

[Unknown1]: Right, so the idea is that we push a button even though we know it's not good. People, after watching The Social Dilemma, after watching other things, go, "I know I shouldn't just press accept," or "I know I shouldn't do that," but we do it because it's easy. We do it because, you know, it's just easier; it just makes my life easier. And so there is a lot of that, and we have gotten to be a very complacent world in a lot of ways. And then what happens is people get angry afterwards, saying, "Well, so-and-so has taken advantage." It's kind of like, well, we all do have a responsibility here, whether we have exercised it or not, and many of us don't perceive that we have any power.

[Unknown2]: When we have so much more power than we realize. Yeah.

[Unknown1]: We do. And this is where, you know, when we talk about even introducing this into K through twelve: if we can give a kid a batch of Legos and say, "Build a building," we can say to that kid, "Think about who's going to live in that building." You know, think about, like, the Sims games, or any of those. Part of the instructions is not "think about the consequences to the people," right? And it could be. Because if we really began it earlier on, we might... it's not a guarantee, because there are people who are like, "Yeah, yeah, human nature is going to do what human nature does," and that's true, that's true to a certain degree. Also, though: if anybody has ever met a child who has learned about recycling, they know, right? Children who learn about recycling become, like, the recycling police.

[Unknown2]: Right, right. Yeah, they get so excited about it, yeah.

[Unknown1]: And so if we can begin to learn: how are we operating? What responsibility are we taking for the things that we use every day, that we do, that we buy? And especially, one of the reasons I love it, because I'm an adaptive leadership coach, you know, with businesses, it's like: you are contributing. So however it is you're contributing, maybe you're already doing it, and it's great, and what more can you do? Knowing that just around the corner there is going to be an exponential jump in what we use.

[Unknown2]: Right. I mean, we're going to see a huge difference between fake Facebook accounts and fake metaverse accounts, right? Those are two very different things.

[Unknown2]: Something you said... you were talking about the woman's voice, and immediately what I thought of, because I've been part of these discussions at a lower level, in marketing, is that that decision was not made based on "should we have a woman's voice." That decision was made like this: they did about ten to twenty different voices, and they had customers listen to them, and then they asked which one made the customer engage more, which one made the customer happier, without any regard to the ethics of using this particular voice.
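A toy illustration, in Python, of the selection process PJ is describing: candidate voices ranked purely by a measured engagement score, with nothing in the objective that asks whether the choice is ethical. The voice names and numbers are made up.

```python
# Hypothetical A/B-test results: engagement score per candidate voice (made-up numbers).
engagement_by_voice = {
    "adult_female_1": 0.81,
    "adult_female_2": 0.78,
    "adult_male_1": 0.74,
    "neutral_synthetic_1": 0.69,
}

# The "decision": maximize engagement, full stop. No ethics term appears anywhere.
chosen = max(engagement_by_voice, key=engagement_by_voice.get)
print(chosen)  # -> adult_female_1
```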

[Unknown2]: And so just getting people to understand that, to introduce that ethical element: is this just because everyone likes it, or because it draws more people in? Is it the right thing to do? I mean, it's like the infinite scrolling thing. Everyone's switching over to it because it's effective, right? That's slightly different from AI, but it's all these different decisions that are made by effectiveness. And what they run into... you said this about the exponential thing, and this is where I'm curious what the solution is, because the people who decide based on marketing might actually beat out the people who do things ethically, because it's more about bringing people in. Does that make sense?

[Unknown1]: Yeah, I think it does. I think that, you know, I refer to myself as a dedicated optimist.

[Unknown2]: Sure.

[Unknown1]: I really am, philosophically. I'm a dedicated optimist, and I'm not fond of the... I'm also not fond of the kind of toxic-positivity world where we're all rosy. No, to me, that actually requires denial, right? That requires a certain level of denial. I understand that if I'm dumped in the middle of the ocean during a tsunami, it's probably not so good.

[Unknown2]: Fair point.

[Unknown1]: Right, but I'm going to look for, because I'm going to believe that I'm going to find, a piece of wood that I can hold onto, and that I'm going to get to that shore somehow, or I'm going to get to some place where I can be safe. To me, that's dedicated optimism. It's like, I know I'm in the quicksand, I know I'm in the quicksand, so what am I going to do? So I believe very much there is a place where they will meet. I believe that. To me, I feel like, you know, I've learned over the years from teaching and from public speaking that it's always about the invitation. It always comes down to the invitation, rather than a demand. I mean, I can say, and I can pretty clearly say, and you know I believe, that we really need to do this. Simultaneously, I can tell anybody that they need to do something and they're probably not going to do it, right? That's just human psychology. That's human nature, not just psychology; it's on a lot of levels. But the idea is that people really can be invited to their best selves.

[Unknown2]: Yes, yeah.

[Unknown1]: And I also believe that there's probably more best self than worst self in the world. I know it doesn't feel like that a lot of times, but if I believed otherwise, I don't know if I'd get up in the morning. So, right.

[Unknown2]: Right, right, right. I love this answer, by the way. Yeah, I think it's key.

[Unknown1]: Right, and I have... wait, yeah, I have a daughter, right? My daughter's twenty-two years old. She's got her whole life in front of her. And her friends, and all the children... I don't even want to call them children, but all the young people I've taught over the years. I mean, I've had a career for four decades, so a lot of them are fully adults. But I want them to flourish. I want them to live a life and do those things. So to me, my time is better spent not in denial, but in saying, okay, where are the cracks? How can we heal? How can we look for what the cracks are telling us?

[Unknown2]: Right, what's under stress.

[Unknown1]: Right, what's there, and what do we need to get rid of? Because there are some things we need to just get rid of, right? Sometimes the foundation is rotten, so you get rid of the foundation. And sometimes the cracks are just things that tell us what needs to get filled in. I can be metaphoric with this, but I really believe that human beings can do this, and in fact, it's the way of the future. You know, at the Presencing Institute, the MIT Presencing Institute out of Otto Scharmer's work, they talk a lot about the emergence, right, the emerging future. We're in the emerging future. It's not like the future's over there; we're in it right now. And so what will we do with this emerging future? I mean, I often think about my mother and my father. My mother was born in nineteen eighteen; my father was born in nineteen oh nine. And at ninety-five years old, watching my mother sit there with an iPad in her hand, looking at me like, "You've got to be kidding me"...

[Unknown2]: It's crazy. No, it really is.

[Unknown1]: In her lifetime, they went from taking the battery out of a car and putting it into the radio in the house so that you could listen to the radio, and then taking it back out of the radio to put it into the car. That happened in one person's lifetime.

[Unknown2]: Right, right.

[Unknown1]: star trek is happening she was on the ipad having a facetime conversation with my

[Unknown2]: right

[Unknown1]: daughter and i'm

[Unknown2]: right

[Unknown1]: looking at her and she's looking at me like what are you

[Unknown2]: right

[Unknown1]: and she thought it was great

[Unknown2]: yeah

[Unknown1]: she thought it was great and she

[Unknown2]: yeah

[Unknown1]: was also sitting there going and that's going to happen again i use the word exponential a lot because i think

[Unknown2]: yeah

[Unknown1]: we're just going to keep having these

[Unknown2]: right

[Unknown1]: huge leaps you know

[Unknown2]: yes

[Unknown1]: the time it took to go from radio to having a phone in the neighborhood to having a phone in your house

[Unknown2]: to having a phone in your hand

[Unknown1]: right to having a phone in your hand

[Unknown2]: yeah

[Unknown1]: so right all of those things each one of these things is going to keep jumping

[Unknown2]: yeah

[Unknown1]: and which means that we will need to i mean again i really try to stay away from the need it's like i invite people to consider

[Unknown2]: right

[Unknown1]: we can be that flexible we may not feel like it but we can be that flexible and i think education is part of it at least for me and i invite people to consider what i've learned from my own life i think that's all we can do right by learning i knew nothing about ai but by learning about it and

[Unknown2]: hm

[Unknown1]: really learning because i went into the deep end of the pool but by really

[Unknown2]: yeah

[Unknown1]: learning about it i had a completely different experience

[Unknown2]: yeah

[Unknown2]: yeah yeah

[Unknown1]: you know i'm sorry go ahead

[Unknown2]: no no no

[Unknown2]: first off

[Unknown2]: you know i didn't mean to cut you off in the middle of that fantastic thank you so much such an optimistic answer and i appreciate that and my original question wasn't meant to be negative it was just like it's an obvious crack right it's like one of those things that we're looking at like how do we deal with this

[Unknown1]: yeah i didn't view it as negative in any way actually

[Unknown2]: yeah yeah oh okay good it was just like i love that optimism

[Unknown1]: i viewed it right i viewed it as being our primal human response that's my primal human response right that when

[Unknown2]: yes

[Unknown1]: when i first got invited it was like mm right

[Unknown2]: yeah well i mean you just have to ask like okay this is definitely a problem we have to fix because you do get a competitive advantage you know

[Unknown2]: i think of the psalmist saying why do the wicked prosper you know it's like why are the people who are not caring about the consequences advancing and that's not always the case you know we have examples of people who are making clear

[Unknown2]: but we have to make that concerted effort and i think a lot of this that discussion of acceleration that discussion of how it's going to be exponentially more i mean i think about i remember getting a facebook account when you had to have a dot edu email address right and that was fifteen years ago and now facebook is a dominant force in many people's lives i'd say the majority of people's lives you know and that is an ai thing right the algorithm makes a big difference in people's lives little tweaks google very similar what people search and what comes up and they make real decisions and ethical decisions and i think that's where if i could ask you to go in this direction you had that initial question that in many ways led to you getting this position whose ethics how do we find that ethic and how do we find an ethic that works for the common good but also establishes common ground so it's not just like someone's ethic that hey it does work for everybody but it's one that everyone agrees works for everybody which isn't always the same thing

[Unknown1]: no and i thank you for that and thanks for redirecting us there because that's really a huge part of the work of the committee it's an international committee

[Unknown2]: yeah

[Unknown1]: right so we've got people in israel people in dubai people in africa people in japan we've got people all over the world who are in this committee considering just that because when we begin to consider ethics we have to consider governance we have to

[Unknown2]: yeah

[Unknown1]: consider regulations whose regulations cause if we're talking about human rights

[Unknown1]: even what does

[Unknown2]: right

[Unknown1]: human rights mean in russia what does human rights mean here what does human rights mean in china what does human rights mean in south america depending on which part of the place in south america

[Unknown2]: yeah i know

[Unknown1]: right what does human rights mean in africa depending on which place in africa so right

[Unknown2]: i mean even in america you have

[Unknown1]: yes yes

[Unknown2]: uh it's a volatile subject yeah

[Unknown1]: yeah the ethics yes so you know as you know as a philosopher discussing ethics is always right it's a movable feast

[Unknown2]: it's the easiest type of discussion always yeah no one ever gets heated

[Unknown1]: right it's a movable feast so the thing is a big part of it is one of the tenets that we're using because you have to have through lines right

[Unknown2]: hm

[Unknown1]: and one of the through lines or i call it a through line is

[Unknown2]: yeah

[Unknown1]: sustainable development which was not initially an original part of the committee but now is more and more because it becomes evident that if you're beginning to discuss human rights right if you're going to talk about like the sdgs the sustainable development goals for the un

[Unknown1]: right

[Unknown1]: human rights are part of that and so

[Unknown2]: you would hope so yes

[Unknown1]: right and equity and equitable behavior and those things should be part of ethics one would hope

[Unknown2]: yes right right

[Unknown1]: not assume but hope

[Unknown2]: yes

[Unknown1]: and then we begin you know to me it almost becomes you know one of the challenges is not getting lost in the swamp so much of debating

[Unknown2]: hm

[Unknown1]: which one

[Unknown2]: right

[Unknown1]: to choose

[Unknown1]: but which ones we can start with

[Unknown2]: right

[Unknown1]: which ones we can agree on right those are like collaborative agreements right so

[Unknown2]: yeah

[Unknown1]: in my work as a leadership coach right that's one of the things i spend a lot of time doing and people start with what we don't want it's like forget what we don't want start with what we want

[Unknown2]: yeah

[Unknown1]: right and let's build from there and so you know part of our work has been gathering first of all what does ai mean in different places to different people because it means different things in different places to different people just fyi

[Unknown2]: yeah

[Unknown1]: and then and that's just globally right so

[Unknown2]: right

[Unknown1]: we've got all these definitions we're going to use right and now we're going to come up with our operational definition because you have to find a common language

[Unknown2]: right

[Unknown1]: in semantics

[Unknown1]: and then we begin to look at ethics in the same way how can we find commonalities to begin with

[Unknown2]: hm

[Unknown1]: you know we know that this is a big part of the work of the world economic forum right those are the kinds of organizations the un unesco those are all the big organizations there are huge meetings about this at places like the hague where they're you know

[Unknown2]: uh

[Unknown1]: they're really debating how can we find commonality

[Unknown1]: so that

[Unknown2]: yeah

[Unknown1]: we can find some way

[Unknown1]: to regulate this runaway train which is what it feels like sometimes

[Unknown2]: right

[Unknown1]: so it constantly goes back to whose ethics okay we said this at this meeting all right anybody else and then somebody else is going to come in and we go alright well we're going to consider that and everybody's going to consider that we're going to see what we can come up with

[Unknown2]: that's good

[Unknown1]: that is as common a goal right

[Unknown2]: right

[Unknown1]: and present that as us

[Unknown1]: understanding that we're also in development on this because

[Unknown2]: right

[Unknown1]: there is no final expert i mean there are people like stephen araki who is arguably one of the biggest experts in the field but even he would say you know there are dozens of organizations working on this and

[Unknown2]: yeah

[Unknown1]: it's you know ieee is one and it's huge but there are many organizations working on this and really wonderful people you know that's one

[Unknown2]: yeah

[Unknown1]: of the other things i try to tell people when they get very depressed about it

[Unknown1]: it's kind of like there's actually a lot of people working on this

[Unknown2]: yeah yeah

[Unknown2]: yeah

[Unknown1]: you know

[Unknown2]: working very hard yes

[Unknown1]: so yes but i love the question of whose ethics and that is one we have to continually come back to

[Unknown2]: hm and

[Unknown1]: because also something that i might've thought was ethical yesterday

[Unknown2]: right

[Unknown1]: how are we seeing it now right

[Unknown2]: yeah

[Unknown2]: yeah as new information comes in as blind spots are revealed i'm sorry continue

[Unknown1]: well the thing is and we know this like in biomedical ai right so things that are saving people's lives can

[Unknown2]: can also be

[Unknown1]: also be used nefariously so what are the things we

[Unknown2]: can you give me an example of that i'm sorry i i can't think

[Unknown2]: of one off the top of my head but maybe i don't want to know but still

[Unknown1]: well we can do gene editing gene

[Unknown2]: ah yeah

[Unknown1]: editing could save a child's life

[Unknown2]: yeah

[Unknown1]: and gene editing could also do some really messed up stuff

[Unknown2]: right right

[Unknown1]: right

[Unknown1]: there are many different there are many many

[Unknown2]: yes yes i think it's francis fukuyama i read his book on you know this idea of could we end up in a world if we're not careful where rich people are literally editing their kids to be more advanced at different things and like i mean imagine the poor and rich divide but at a genetic level right 'cause they have access to these technologies

[Unknown1]: wait well there's a lot of right and there's a whole argument about transhumanism which i will not

[Unknown2]: right

[Unknown1]: go into today because the thing is transhumanism is a whole thing that's like

[Unknown2]: yeah

[Unknown1]: martine rothblatt's work which

[Unknown2]: yeah

[Unknown1]: is amazing

[Unknown2]: yeah

[Unknown1]: amazing you know martine was the founder of sirius xm

[Unknown2]: right

[Unknown1]: but you know her work in robotics that does amazing stuff

[Unknown2]: yeah

[Unknown1]: yeah but there's a lot of different people who are moving in a direction of transhumanism that is a whole

[Unknown1]: another layer

[Unknown2]: what you don't have another two hours

[Unknown1]: and it's also you know really there's so many belief systems that come into play

[Unknown2]: right right

[Unknown1]: right and now you crossover into religion so right there's a lot of stuff there

[Unknown2]: so many metaphysical questions yes no

[Unknown1]: right so we can like pull that down but i think it's really fascinating

[Unknown2]: yeah

[Unknown1]: my reaction used to be

[Unknown1]: oh my god no

[Unknown1]: and now i go hmm

[Unknown1]: what do

[Unknown2]: right

[Unknown1]: i think about that

[Unknown2]: right

[Unknown1]: because

[Unknown1]: it's

[Unknown1]: changing what i believe and what i think is changing

[Unknown2]: yeah

[Unknown1]: um

[Unknown1]: what i value

[Unknown1]: uh has not

[Unknown1]: in

[Unknown2]: right

[Unknown1]: many ways right my values are kind of embedded in my beliefs in different places although

[Unknown2]: yeah

[Unknown1]: i have some values that have risen higher to the top

[Unknown2]: right

[Unknown1]: but i think that this question of what is ethical and what we want is one i really wish we could get more people talking about all the time i mean i know

[Unknown2]: yeah

[Unknown1]: that's like every philosopher's dream is to like have a world where people go what do i think about that right

[Unknown2]: right

[Unknown1]: what do we think about that because wouldn't you i mean

[Unknown2]: i mean that's why i'm having you here on the show right i mean that's why i'm doing this show i wanted to talk about this just so that people start asking these questions so thank you again for coming on i mean i just think this is so essential

[Unknown1]: oh and i know well this is also how it works you know i have these brain lateral slides it's the privacy paradox

[Unknown2]: hm oh yes

[Unknown1]: when i was talking before about the paradox i have like these lateral slides in my brain right it's the privacy paradox

[Unknown2]: yeah yeah yeah so

[Unknown2]: people want to press the but uh press the button go ahead

[Unknown1]: people wanting to press the button and give up their privacy

[Unknown2]: yeah

[Unknown1]: and if you said do you know you gave up your privacy they would say yes

[Unknown1]: if they were being honest they would say yes

[Unknown2]: yeah oh yeah

[Unknown1]: but i did it

[Unknown1]: cause it was kind of easier

[Unknown1]: right now and the thing is

[Unknown2]: i wanted the article so i accepted all the cookies i accepted all the cookies

[Unknown1]: the thing is i've done it we've all done it and i think when we can

[Unknown1]: you know we can kind of take the finger pointing

[Unknown2]: hm

[Unknown1]: out of it and we can take

[Unknown2]: hm

[Unknown1]: right blame out of it and start saying okay we all do it now what do we want to do

[Unknown1]: right if we know that we're all dumping

[Unknown1]: garbage in the street and that the streets are becoming unlivable


[Unknown1]: i know we've all been doing it but do we really want to be doing that or do we

[Unknown1]: want to make a change right and

[Unknown2]: yeah

[Unknown1]: i think that people

[Unknown1]: when invited to their best selves again i know it's very optimistic of me but i

[Unknown1]: really believe that

[Unknown2]: yeah yeah

[Unknown1]: will can not always will but can change i think people can you know

[Unknown1]: and it and maybe it's not

[Unknown1]: even a change maybe it was there all along it just wasn't rising to the surface

[Unknown2]: you know this reminds me so much i just had a conversation with dr lewis gordon like a week ago and we were talking about his book fear of black consciousness and it's on racism but the solution was very similar your discussion of these difficult societal problems that we don't know how to wrestle with is very similar and it was one just committing to a better future is really important and that's the way he talked about it was committing to the future for you you say dedicated optimist very similar and the idea of opening up possibilities and not just shutting things down and so when you're faced with a difficult question opening up possibilities and inviting people to be better i mean he talked about he was a professor in indiana and he would walk in and he was often the first black person this is that many of the students had seen and some students would immediately walk out but some students would stay and they would talk to him and they'd say wow i've never talked to a black person before i thought it was supposed to go like this but it's going like this and so it's not will but it can and when and if you get enough can you can solve these problems in meaningful ways not completely not perfectly but in meaningful ways and so yep

[Unknown1]: well also and i'm sorry you know it's funny that you

[Unknown1]: mentioned that because so much of my training was

[Unknown2]: yeah

[Unknown1]: in doing social equity and racial equity trainings i used to

[Unknown2]: hey

[Unknown1]: facilitate those

[Unknown2]: right

[Unknown1]: and

[Unknown1]: that was


[Unknown1]: where i learned

[Unknown1]: better skills

[Unknown2]: hey

[Unknown1]: at having ethical conversations

[Unknown1]: because that's what it comes to those kinds of conversations when we're talking about our biases when we're talking about our stereotypes our fears the way our brains and our hearts operate

[Unknown1]: people come and there's so much defensiveness because people are terrified they'll be wrong

[Unknown2]: right

[Unknown1]: right

[Unknown1]: well what happens if we were wrong well you know

[Unknown2]: well i mean that's actually it's so interesting i have grown a lot in my own life i grew up independent fundamental baptist christian right it was very exciting should women be able to wear pants right this was like a real discussion that like adults had and

[Unknown2]: like now looking back

[Unknown2]: in wisconsin in negative twenty degree weather like are you serious like of course they should be able to wear pants

[Unknown1]: oh i'm also from wisconsin

[Unknown2]: it's cold like yes

[Unknown1]: it is cold

[Unknown2]: um but at the end of the day i learned to ask like to present them with the future it's like would you rather continue to be wrong just so you don't have to admit it or would you rather just be part of the change and start doing what you know you've really started to realize is right

[Unknown1]: well and there are so many layers to this because part of it is philosophical

[Unknown1]: right and cerebral

[Unknown2]: hm

[Unknown1]: and so much of it is emotional

[Unknown2]: yeah

[Unknown1]: so much of it is both emotional and a societal tendency in the west

[Unknown2]: hey

[Unknown1]: toward

[Unknown1]: a kind of

[Unknown1]: a punitive

[Unknown1]: you know it's that very kind of like mea culpa right i'm going to

[Unknown2]: right

[Unknown1]: beat myself for having been a part of something that was awful it's like we've all

[Unknown1]: participated in something that's awful you know when people go i've never

[Unknown1]: bullied anyone you go you know what i'll bet your sister probably would argue with that right so the thing is none of us are angels although my mother would have

[Unknown2]: right

[Unknown1]: argued that she was a saint the fact is that none of us right none of us are

[Unknown1]: angels

[Unknown1]: and

[Unknown2]: right

[Unknown1]: none of us also i mean i also have been a trainer for what's called social emotional learning in schools

[Unknown2]: hm

[Unknown1]: and in businesses um and i always

[Unknown1]: say you know no one gets a certificate of completion in social

[Unknown2]: right

[Unknown1]: emotional learning it's like oh i'm finished right

[Unknown1]: no right we're not

[Unknown2]: right

[Unknown1]: finished we're not finished being a better person you know because what is social

[Unknown1]: emotional intelligence it's self awareness the ability to regulate ourselves the

[Unknown1]: ability to relate with others and handle conflicts the ability to make good

[Unknown1]: decisions i mean or careful decisions or responsible decisions

[Unknown2]: hm

[Unknown1]: if anybody tells me they're finished with all of those things i'd like to know who

[Unknown1]: they are

[Unknown2]: right yeah

[Unknown1]: because i mean really who i mean because

[Unknown1]: it's an ongoing thing you know

[Unknown2]: hm

[Unknown1]: i very often say i've done a lot of trainings during the pandemic

[Unknown2]: hm

[Unknown1]: and one of the things when i talk to people about is when we reflect on what were you like before and what are you like now

[Unknown1]: one of the most common not the only one but one of the most common reflections is i've really become a more patient

[Unknown1]: person

[Unknown2]: hm

[Unknown1]: and

[Unknown1]: that was something that happened probably not because people thought oh i really

[Unknown1]: wanna become a patient person

[Unknown1]: it happens because guess


[Unknown1]: what you can't go anywhere you can't do anything

[Unknown2]: hmm

[Unknown1]: um

[Unknown1]: i'm sorry

[Unknown1]: right

[Unknown2]: right

[Unknown1]: i know it really sucks but

[Unknown1]: and

[Unknown1]: as a result not everyone but a lot of people a lot more people

[Unknown1]: found that in themselves

[Unknown2]: hm

[Unknown1]: and that's the development of a skill now when things

[Unknown1]: speed up

[Unknown1]: they may say

[Unknown1]: well that was then this is now right and so it's not a one time thing like i learned it once

[Unknown1]: and now i'm gonna be great at it forever

[Unknown2]: hm

[Unknown1]: and i think that's coming back to ai and ai ethics coming from that

[Unknown1]: i know it's a cliche but the


[Unknown1]: lifelong learning perspective coming

[Unknown2]: hm

[Unknown1]: from this perspective i mean i'm like the poster child i keep going back to school

[Unknown1]: but

[Unknown1]: this

[Unknown1]: lifelong learning of what am i learning about myself what am i learning about the world what am i learning about how i interact with the world and things in the world because autonomous systems are more than simply

[Unknown1]: things

[Unknown2]: hey

[Unknown1]: they carry much more they carry many more consequences

[Unknown2]: right

[Unknown1]: right and so the student who used to bully in the hall or you know the bully

[Unknown1]: behavior

[Unknown2]: right right

[Unknown1]: right in the hallway

[Unknown1]: now

[Unknown1]: has a tool


[Unknown1]: that they can use

[Unknown1]: and be

[Unknown1]: infinitely more damaging

[Unknown1]: and so right when we begin to think about how this escalates going back to that

[Unknown2]: right

[Unknown1]: where you know you were talking about escalation before this is this

[Unknown2]: hm

[Unknown1]: you know that's when we can say okay things are faster

[Unknown2]: yeah

[Unknown1]: so what do i have to do differently

[Unknown1]: right what do we have to do differently and i know people are like no i want it done yesterday or i want somebody else to do it for me

[Unknown1]: and i would just like to invite them again the invitation i'd like to invite us

[Unknown1]: all and not them us all myself included right

[Unknown1]: did i think about


[Unknown1]: pushing that button

[Unknown2]: mm

[Unknown2]: yeah and we push so many buttons throughout the day and we just don't even think about it right

[Unknown2]: and i love that kind of as like a summary of what we've been talking about as we kind of wrap up here one thank you

[Unknown2]: uh two if you as a listener have been enjoying this please like share and subscribe so that more people can hear this discussion because i think this is so important

[Unknown2]: as we close out here besides your own work which obviously i think would be a good place to start with some of this what would you recommend is there a book a lecture a video a movie that you'd say is a great place for people to learn more about ai ethics

[Unknown1]: yeah absolutely there are many courses online right now coursera

[Unknown2]: hm

[Unknown1]: has a couple of them and there are many many more burgeoning there are a lot of them i would say look on

[Unknown2]: okay

[Unknown1]: linkedin i mean if it's business people or even if it's not business people look on linkedin under ai ethics google ai ethics and classes right because

[Unknown2]: seems like a conflict of interest but

[Unknown1]: well it is it is but the thing is

[Unknown2]: no no it's okay i

[Unknown1]: it could be it could be but it's also really really important that we begin to

[Unknown2]: yeah no no it's okay

[Unknown1]: do this and

[Unknown2]: hm

[Unknown1]: there are many there is a whole

[Unknown2]: right

[Unknown1]: list of books which actually what i will do rather than cite them all right now because there's

[Unknown2]: yeah yeah yeah

[Unknown1]: i tend to do that which is not only interesting to me so what i will do is i will give you that list and if you want you could put it into the notes for the show

[Unknown2]: absolutely i will put that list

[Unknown1]: right because there's a long list of books that are really

[Unknown2]: yes

[Unknown1]: beginning and new reports that are coming out more

[Unknown2]: hm

[Unknown1]: importantly reports that are coming out that are really beginning to

[Unknown1]: bring this idea to the front because even in small businesses even if you

[Unknown2]: yeah

[Unknown1]: have two employees three employees right

[Unknown1]: it's really an important thing and it's certainly important for us to think about in our families

[Unknown2]: oh yes absolutely as you're talking about

[Unknown2]: whether we have the decision to make the default voice we do have the decision to change it right you know just to go with a small example like that to think about it

[Unknown1]: right or even to say did you ask siri nicely

[Unknown2]: yeah yeah yeah

[Unknown1]: right so there are so many options

[Unknown1]: because

[Unknown2]: yeah

[Unknown1]: right am i speaking to siri sarcastically am i speaking to siri in a bossy voice am i

[Unknown2]: yeah

[Unknown1]: speaking to siri i mean one of my favorite things i think ever was years ago when siri first started and i yelled about something

[Unknown1]: and she went i don't appreciate that

[Unknown1]: and it made me laugh so hard i loved it i

[Unknown2]: yeah

[Unknown1]: loved it but i don't know if they have removed that even

[Unknown1]: now from those

[Unknown1]: but

[Unknown2]: yeah

[Unknown1]: i think it's really important or if you're using those did you check

[Unknown2]: hm

[Unknown1]: your privacy settings right what are your privacy settings on right because you

[Unknown1]: know really

[Unknown2]: yeah

[Unknown1]: you know so anyway those

[Unknown1]: are the things that we can do

[Unknown1]: i don't want

[Unknown1]: people to get i mean i really worry about people getting paranoid

[Unknown2]: hey

[Unknown1]: i'd like to step away from paranoia with it we'll all do the best that we can do right we're all

[Unknown2]: right

[Unknown1]: doing the best we can do but you know a little increase in


[Unknown1]: consciousness goes a long way

[Unknown2]: it's very true thank you so much appreciate it

[Unknown1]: really such a pleasure pj thank you so much