Disruption Now

Step into the digital arena with John Kavanagh, a renowned digital privacy and safety expert, on this episode of "Encrypting Freedom." Uncover the hidden pitfalls in your digital practices and learn how to fortify your data against the encroaching eyes of AI technologies. John doesn't just outline the problems; he arms you with robust, actionable strategies to take control of your digital identity.

This episode isn't just about listening—it's about engaging. Discover resources, communities, and tools that will empower you to join the vanguard of digital defenders. "Encrypting Freedom: AI & The Data Battle" isn't just a conversation—it's a call to action. Tune in to transform your understanding and approach to digital privacy in the age of AI.

Website https://bit.ly/2VUO9sf
Apply to get on the Podcast https://form.typeform.com/to/Ir6Agmzr?typeform-source=disruptionnow.com
Facebook https://bit.ly/303IU8j
Instagram https://bit.ly/2YOLl26
Twitter https://bit.ly/2KfLaTf

What is Disruption Now?

A podcast to disrupt common narratives and constructs to empower diverse communities. We provide inspirational content from entrepreneurs and leaders who are disrupting the status quo.

the digital world, it scales differently.

So you don't need to call one latchkey
kid at a time;

with AI, with being online,
you can replicate things very,

very quickly,
and it can be a one-man shop or operation.

So there's a huge difference in scale
when we're looking at this online.

If you believe
we can change the narrative.

If you believe,
we can change our communities.

If you believe we can change the outcomes,
then we can change the world.

I'm Rob Richardson.

Welcome to Disruption Now.

Welcome to Disruption Now.

I'm your host and moderator,
Rob Richardson.

Have you ever thought
about how your data is used?

Or better yet, have you thought about
how data of your kids is being used?

Do you have any idea about
how much is happening behind the scenes,

how that can be manipulated,
how it's being manipulated?

You know,
I don't think we thought much about it.

We entered... I'm going to tell my age here.

We entered the social media era,
and just everything was free.

We were able to connect.

It was all free. It was all great.

But it's not free.

Nothing's free in this world.

That data is not free.

You are the product.

You and your kids are the product.

The question is,
are you comfortable with that?

Does that matter to you?

Well, we're here to actually talk about
why data privacy matters. It's not,

you know,

it's not just people

that are just liberals out here
saying we need to protect our data.

It's actually affecting all of us because

whether you like it or not,
you are a part of the digital economy.

And that's just going to amplify
now with artificial intelligence.

So with me to talk about
how we can have a future that's actually

more about transparency and freedom,

and understanding how you can protect
yourself, is John Kavanagh.

He is the founder of the Plug Foundation.

And he's gonna

tell you a story about his journey,
because it's a very interesting one.

But I want you to understand
that you have the ability

to protect yourself, to protect your kids,
but you have to know where to start.

And we hope that by the end of this
episode, you'll actually have that.

But before we start, make sure you like,
make sure you subscribe.

That's how we're going
to keep the disruption growing.

We appreciate you listening.

And now I have the pleasure of introducing
John Kavanagh.

John, how you doing, brother?

Good, Rob.

Thank you so much for the introduction.
Appreciate it.

Hey, thank you so much.

So you had an interesting journey
to get to data privacy.

I like to start like this:

we met not so long ago.

And to me, it was clear
that you had a clear mission and focus.

And I have to say,
I consider myself pretty informed.

But after I left our conversation,
I became even more concerned

about data privacy.

I was like,
oh, this is worse than I thought.

And, I'm just curious, how did you get
into the world of data privacy?

Like, how did this become
your kind of central "why"?

Yeah.

So everybody in privacy,
if you ask, they all have a wild story.

Very few people just started out
passionate about this.

And like you mentioned in the introduction,

We all thought the products we were using
online were free and great.

And yeah,
there was some advertising behind it.

No problem.

You see that
with television and stuff,

but there's a dark side to it
and people learn that at their own pace.

So a few people on my board... one was
a lawyer,

and he was just doing contract law,

and he saw a drone go over his house
while he was with his kids and family.
and he was with his kids and family.

And it started that question, like,
you know, whose drone is that?

Right?

Why do they now have video of me
and my family?

Right.

So we all have those interesting ways in.

For me, it started back in undergrad,
where I made a website.

It was called Slate Up,

and it was just a place
as a pre-med student

where I wanted to meet other students
that were taking

the MCAT or were already in med school,
because it's a huge decision.

It's like half

a million dollars in debt to figure out
if you like being a doctor or not.

Yeah.

So it was a lot of cost
to figure that out.

So I built a website
to connect people similar

to like the old school version
of Facebook. Yeah.

And that was their mission:
connect the world.

Yeah.

So I wanted to meet and connect
with people, especially professionally

and within school.

And it grew very fast.

Within the first three months,
we had one-fourth of UC on it.

Yeah.

And it grew in the Midwest,
Ohio State, Middle Tennessee,

Clayton State University, lots of schools
in Ohio, Kentucky, Tennessee, Indiana.

And as we were growing,
we got, research grants

to keep it running, but eventually
we needed a series A funding.

Sure.

And we weren't
keeping any data on our users.

We were like, oh, let's just

make sure

that we're taking care of them
and that everything's protected.

But we just want to make sure that
the people that are using our products

feel good about,
you know, the environment that they're in.

Right? And our privacy policy
was, hey, we're not selling anything.

We're not pushing any ads.

And we developed, a way to make money
by selling to universities.

So we figured that would be our pipeline
and our revenue to make money,

which everybody agreed upon.

But when we were looking at that Series
A funding,

every single investor, 50-plus
that we talked to,

every single one was saying,
hey, what are you doing to sell the data?

Because that's another revenue
stream, right?

And you can't ignore it to be competitive
in this day and age with tech.

And I started learning
about what that really meant.

So when looking through the details
of what data mining was, how it's used,

who it's sold to, you enter this huge dark

side of the internet
that people are vaguely familiar with.

But when you really dive
deep down about it,

you understand how they're using various
parameters about your life

to nudge you to a decision.

It can be a political decision, right?

It can be in buying products
at a certain time.

It can be understanding
your mental health state.

Are you depressed?

Are you sad?

Well,
you tend to buy more in this direction.

And we were primarily
focused on college students,

who have, by default, pretty bad

financial literacy skills. Yes.

So one thing was like, hey,
a lot of these students are getting a lump

sum that is debt, essentially,
and spending it on beer and pizza

on the weekends.

But if we have an opportunity
to slip our products in there as well,

where they're paying 12% interest
on whatever

T-shirt company it is,
then that's good for business.

But in the long run, it's
not sustainable for our society.

And I had a lot of objections.

So, me and the core group,
we decided to close down the organization

and build a nonprofit.

So you mentioned me as the founder,
but there are...

And I want to say, back up,
back up for a minute, okay?

People just have to I want to make sure
they absorb what you just said.

A Series A funding is a big funding round.

Yes, right.

There's pre-seed, seed, Series
A. You're talking

minimum $5 million, probably.

I don't know what it was then,
but I'm guessing it's around that like so

very clear. Like

millions of dollars was offered. Yes.

And because you weren't comfortable
with the direction

of where the investors wanted
you to go with the data.

You turned it down
and you started a nonprofit.

Just make sure people understand that.

So like it's one thing to say
that you have these principles.

It's another thing
to actually have done it.

I have massive respect for that. Right.
Thank you.

Couldn't have been easy, by the way. But,
like, it's...

So I want to talk about...
I want you to go down the nonprofit path,

but I'd like to just deep dive into,
like, what

really kind
of sparked you to say no to that.

Because it sounds like you weren't against
the idea of ads in general.

It sounds like there was something there.

Were there other things that deeply
disturbed you, to make you go from

one extreme, saying, no, no,
I won't do this, to starting a nonprofit?

Like, what were, like, the top things
that really just, like, stuck out?

There have to be some things,

things that the investors,
the people, wanted to do,

that just didn't sit right with you.

Like, what? What were those things?

Yeah.

So our primary focus was understanding
how students could feel at home

at a university. Yes.

And we were helping with
reducing melt rate, where people said,

hey, I want to go to the University
of Cincinnati, for example,

but then at the
beginning of the school year

they wouldn't come, because they went
to Ohio State or something, as an example.

a lot of that is because they just didn't
feel a sense of community.

So we were helping facilitate
that sense, where the resident advisors,

the people that live in the dorms,
and the orientation

leaders
would connect in a better facilitated way.

Right. So people felt more at home.

And then, you know,

to be honest, I was against ads
because the business model...

Yeah, the business model that we had.

Right. Didn't need it. Right.

So everything else was like,
hey, it was a steep

curve of more
and more invasive maneuvers.

Right. Which was against the philosophy.

It's like, hey,
we have a good revenue stream.

Universities are willing to pay for this.

We had letters of intent to buy
and everything, so we were good to go.

And they wanted you to be...

I feel where you're going. Is that... yes,

you didn't like the ads, but I also think
it was the level of what you would have

had to compromise,
it feels like, for selling.

That would have meant that you would
have gone away from your mission,

it feels like. Exactly.

And here's another thing:
I'm convinced, in the tech world,

if you are a for-profit tech company,
unless you're a billionaire

and you can fund everything yourself, at one
point you're going to be cash-strapped.

So if it's not the Series A
that we compromise on,

it's going to be a Series B,
100 million or something like that.

And what's going to happen is, companies,
they want, or investment firms,

they want somebody on the board
that makes decisions.

And unless you are a super galactic

unicorn like Zuckerberg,
who has all of the voting rights...

But even... but
he still had to do that.

And I don't think he has any problem
with it, actually.

You know, like, I don't even think
he shares your moral dilemma.

I just think we've got to be honest. Right.

But he's like... he did share that.

But you're right. But also,
he's following the model that they want.

Yeah.

And you know, he does too. Like, builders
have a tough job.

But I do think that they don't do enough
on policy to protect people.

And what you're saying is,

if you would have done
that, you'd have had no ability

to protect the people
that you're fighting for. Yes.

And it can only get worse.

Yeah. And I totally respect that.

Like, to have been able to do that? Amazing.
This is why this is amazing.

This is why I was moved to have you on.

So sorry to interrupt you,
but go ahead. Good to go.

Thank you, I appreciate it. But,
if I died tomorrow... let's say I'm running

it, and I have decision power. Like, even
best case scenario, I have full decision power.

I die tomorrow.

Whoever's taking my place,
there's nothing... there's

no law protecting the original mission
that we have.

Somebody can come in and say,
this is our business strategy.

Whereas a nonprofit has

articles of incorporation
that the government enforces.

Right.

So our founding team... by the way,
our founding team is amazing.

There are four of us.

If our founding team all dies tomorrow,

whoever upholds it is legally bound
to uphold our constitution.

And to change that is extremely difficult.

And the people that want to change
it have to look at our mission

and see if the change follows our mission.

So, that's why I made the shift to
if we're doing something

in the tech space,
it needs to be a nonprofit.

Okay. That's amazing.

So, you started the Plunk Foundation.

Now, tell us... you moved from,

you know, working to help the community
of college students.

What is the foundation doing now?

What is its mission?

Yeah.

So, the mission is to wake up
as many people as possible, in the same way

that I was woken up
when I did this deep dive. It was like

being pulled out of the Matrix: like,
oh, this is really what's going on?

And it's to do it
in a way that creates awareness.

So the first of the four main principles
that we're doing is creating awareness.
We want to build education off of that.

we want to create technology
that solves for it.

So it's great to say, hey, guess what?

You're screwed, Rob.
I don't have any solutions for you. Right?

Okay. That doesn't help anybody. Yeah.

It helps if we have tools and technology.

So, technology and tools.

That's the second piece, the second half.

So: awareness,
education, technology, and tools, okay,

to give people the steps
that they need to protect their privacy.

Okay.

The second part of that is that we're
not only focused on privacy,

but we're also focused on digital safety, okay.

And that's more encompassing.

So similar to when we were kids,
our parents would say,

don't talk to strangers or look both ways
before you cross the street.

What does that mean in an ever
changing online landscape?

What does it mean
when it comes to virtual reality?

When it comes to AI, when it comes
to just using our phones or our laptops?

And how do we inform our children,
how do we inform families, and keep that

as a thread, so that we are on the pulse
of whatever advancing technologies come.

Yeah.

Give some examples,
like, so people can picture digital safety.

And you gave me
some really great examples.

But, like, people don't...
people don't appreciate what that means.

Tell me, why should people care
about digital safety?

Yeah,
I think people have some idea of privacy,

and that feels easier, because it's like,
all right, you can... you can...

You get to choose
what level you want.

Do you want to have your information
out there? Do you not?

And that needs to be done in an informed
consent way,

where we actually understand
what you're telling us. Yeah.

Not a 10,000-page terms and conditions

where your information
can be sold to anybody.

Are you comfortable with that
or not? Yeah.

And then people need to really have
that, versus whatever we have now.

It's not informed consent.
I believe that's pretty dumb.

I think overall people understand what
privacy means and why that's important.

What do you mean by digital safety?

Yeah.

So digital safety has a lot of
general threads.

So it can be a mix of cybersecurity,
such as: when you are creating a password,

are you creating a long password
or a short password?

Do you have authentication, multi-factor
authentication,

so that if somebody finds your password,
you get sent something on your phone

saying, did you approve
this sign-in?
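
For listeners who want to see the mechanics behind those one-time codes, here is a minimal sketch of how an authenticator app derives a six-digit code from a shared secret (the TOTP scheme of RFC 6238). The secret below is a made-up example, not anything from the episode or any real account.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical shared secret; a real one is issued via the service's QR code.
print(totp("JBSWY3DPEHPK3PXP"))
```

The point of the second factor is exactly what John describes: even if someone finds your password, they still can't produce the code without the secret stored on your phone.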

There are little things,
like very basic things.

Are you updating your devices regularly?

there are things

such as what are the next scams
that are happening on TikTok or Facebook?

we now know with AI there have been
phone calls that are replicating

somebody's voice, because all you need
is a six-second sample.

So those are the types of things
that I need to understand:

that if I'm a mom
and my kid calls me from a phone,

and it sounds like them,
but it's not registered, I don't know

that it's their actual phone number.

And they're saying, hey, mom,
I need money, right?

You need to have some street smarts.

Yeah, some digital street smarts.

Yeah, that's a good way to think about it.
Yeah, exactly.

That's a good that's a really good.

That's a really good example.

Like, I remember growing up,
my parents and I had...

I had a password within the family
that only 3 or 4 people knew.
that only 3 or 4 people knew.

That was so you would know, actually,
if somebody said

that they were trying to pick you up.
Because, this is going to age me again,

but we were latchkey kids, right?

So, like, sometimes we were...
which never happens now.

Like, you'd get picked up by people,
aunts and uncles,

and we would only know that was okay
if they knew a password.

Hardly
anybody ever picked me up, by the way.

Usually I just went home myself.

But they would tell us that,
because kids were getting kidnapped, right?

When people were saying,
oh, your father...

I'm a friend of your father's, right?
And he told me to take you home.

Yeah.

And, like, so kids
sometimes wouldn't know the difference.

They'd know the father's name.

They'd say, you know, I know this,
I know this about your family.

So you assume that kids,
who haven't been through the level of

exposure and experience that adults have,
are more trusting.

So we had, like,
passwords that we all had to know.

And it's funny, like,
I feel like that needs to come back now.

Like... people didn't...

People don't think about that as much now,
but that's

because strangers don't

pick up the kids, mostly.
We have helicopter parents now.

Our moms and dads, like,
everything is, like, encapsulated.

But it doesn't really work that way
in the digital world, does it?

It's not.

And the digital world,
it scales differently.

So you don't need to call one latchkey
kid at a time;

with AI, with being online,
you can replicate things very,

very quickly,
and it can be a one-man shop or operation.

So there's a huge difference in scale
when we're looking at this online.

Yeah.

I also remember an example
that we talked about when we first met,

yeah, about how digital safety affects
the most vulnerable populations.

I'd love for you
to talk through that example.

We talked about, for example, women

that are recovering from
violent acts and domestic abuse.

Yeah.

Women that go to organizations like Women
Helping Women.

You know, you really talked about
how digital safety can affect

people like that, in ways that I never thought
about. Talk about, like,

why even the
most vulnerable populations

could actually be more at risk as well.

Yeah.

So what we're seeing a lot
now in the landscape, through our research,

is that there are various
areas... there's human trafficking,

there's intimate partner
violence and domestic abuse,

where we're seeing
a lot of these things happening.

And then there's school
bullying and sextortion.

So, let's go to the first example:
in women's shelters,

there's research that's been done
in Europe that found that 79% of women

who went to a shelter for domestic abuse
were tracked in some way or form online.

Wow.

But the National Network to End
Domestic Violence, which is here

in the US, you know,
they found that it was 100% of people.

Now, they surveyed
probably like a couple thousand fewer,

but either way, the survey was over
3,000 people.

So it is a significant amount of people
who are saying that online,

I'm being tracked.

And this could be,
if my Snapchat location is on,

for those of you that use Snapchat. It could be
my car

that is being tracked. It could be my phone.

It could be more,

like the Apple AirTags,
which are very commonly slipped into a car.

I just talked to a young woman
who had that happen to her.

She has no idea who it's from.

So there are various ways
of tracking and stalking.

That's... that's done.
So how does Snapchat tracking work?

You can turn your location on.

They have a piece
that's called Snapchat Maps.

Right. And you can turn your location on.

You can see where people are at at all
times. Yeah.

And it's pretty common for people
to have that on.

Some are very selective about

which friends have it,
but some partners may demand

that you have these types of things on,
and if you don't, you will get in trouble.

As an example.

And what type of tools do they have
in place? Like you said, like, this is...

Snapchat should probably do

a lot more to warn kids, or
maybe prompt, turn that on and off, to say: do you?

I know, like, Apple
at least started doing that.

Like, do you want to keep sharing,
or do you want to stay on?

But I think with kids
there probably needs to be

another level of protection and safety.
Absolutely.

And kids are using it
all the time to figure out,

like, who's at what party
or where they're going, and things like that.

That's tough. Like,

so how do you balance this?
When you think about, like,

how do we balance, obviously,
the innovation, and having the right

balance with what, I guess,
policy or regulation needs to look like?

Obviously, you and I are both believers
in policy and regulation. Some will hear

that and say, like, you create regulation,
it's going to kill innovation.

Yeah, right.

That's the line.
What's your response to that?

Yeah, that's a good question.

The way that I think about it
is that most organizations

that are doing
this have good intentions,

they're trying to make good products,
and I don't blame them at all.

The areas that I'm really looking toward,

is when it comes to bad actors.

So when it comes to corporate
policy and privacy,

there is this culture of surveillance
capitalism. Yes.

The Plunk Foundation doesn't
focus on that,

although that does happen;
we're more focused on how we can help

with bad actors... help
fight against bad actors,

sorry.

But when we look
at the corporate area,

the way that I see it,

and this is the more personal take,
but the way that I see it,

is that they have been using a technology
before regulation.

So everything before regulation
is sort of a gravy train for them.

But there are...

But then the consequences that they have

will not allow for what they're doing
to be sustainable in the long run.

Right?

If they're taking everything
and bastardizing it and sucking it in

and churning it out,

you're going to have a lot of problems
with the quality of data that exists.

You're going to have a lot of problems

with just having
massive amounts of siloed data,

so that when cyber attacks happen,
now you have data

you shouldn't have collected to begin with,
or you should have purged.

And now it's
a national security risk.

So, the Biden administration
actually released,

around October,
a bill for cybersecurity

where they're funding
people to get jobs in cybersecurity.

And this does tie into privacy
and digital safety.

But they want to, because all of the top
businesses are not equipped for,

for example, what China has

when it comes to their ability to hack.

As an example,
when it comes to missile launching,

when it comes
to, let's say, if P&G shuts down,

you know, because of cyber
attacks or something along those lines,

or a coordinated effort
that shuts down multitudes of hospitals.

So we're looking at this from a national
security and defense, type of thing.

And businesses in the United States
have a lot of freedom compared

to a lot of other countries to navigate,
but we need some type of unity of

how are we protecting our citizens,
which in turn protects the country.

So it is a national security risk,
per the Biden administration.

That's very, very interesting.

It's a national security risk.

And so you kind of answered
one of my questions, like, how do you see

navigating this?

If you were,
I guess, president and ruler for a day,

where you had control of Congress
and the Senate, and you're president,

what laws would you pass to

help us with both privacy
and digital safety?

Yeah.

So I think mimicking the GDPR is probably
the best place to start in America.

Tell me what the GDPR is. Yeah.

So that's
the European Union's privacy law.

And it's basically EU-wide.

So it's like... imagine
it being a federal law here.

And it controls

what amount of data
a company can collect, the limits.

And
it's very complicated and very thorough.

but the main thing is
that your privacy is a fundamental right.

And to have access, as a company
or as an organization, to a person's data,

the person needs to give very clear
consent, saying yes.

Exactly.
And they can revoke that at any time.

And they can say, hey, never mind,
I don't want you to have my data.

The only exceptions...
there's a few exceptions,

and I'm not an expert on GDPR,
but there are a few exceptions,

like when it comes to a court case,
as an example.
Yeah. So fairly reasonable exceptions.

Right.

But the most important philosophical
thing, the difference, is that in

the United States, privacy is an asset
that you trade for services.
That's good.

Yeah.

And that's bad,
but it's a good line.

So yeah, it's a bar.

Yeah. It's a bar.

It's a bar. But, like, what it says is bad. Yes.

Say that again one more time,
because I think that's important.

Drop the bar again, okay?

So, in the United States, privacy
is an asset that you trade for services,

convenience, whatever that is.

In Europe, it's a fundamental human right.

And the United Nations, actually,
they have a declaration of fundamental

human rights.

Article 12 is about privacy
being a fundamental human right.

And to be frank, in the developed world,
the US is very behind on this. Yes.

And we have patchwork laws.

So, like, California has the CCPA,
which is the California Consumer Privacy Act,

and it's emulated from the European Union's
GDPR.

Right. So they're using that.

But that's only in California.

Right.

And Colorado has a few things
Virginia has a few things.

Kentucky is passing a bill.

And it's like a slap on the wrist
for organizations that violate it.

But it's a start. Right?

So there are various...
we call it a patchwork framework.

Right.

But there's nothing federal
that's happening.

And that's something that would
ideally be the best place to start.

Oh, no, that's
very well said. So...

All right. So we don't have policy now.

The US is obviously the wild,
wild West.

Privacy is an asset.

It's not a right in the US,
unfortunately.

What are some steps people can take?

Like, what are some practical steps?
For you,

you talked about having multiple
forms of authentication.

What are some basic things people can do,

or places they can go, to learn about steps
they can take to protect themselves?

Yeah. So I think that when it comes

to individuals,
just being aware is half the battle.

Yeah.

Because when you're aware of
what's actually happening... There are a few

resources.

The Plunk Foundation's
website, it's plunkfoundation.org.

Plunk,

yes, by the way, is

an acronym for peaceful, loving,
uplifting, nurturing, and kind.

So that's why Plunk. There you go.

So plunkfoundation.org
has resources on this.

There are other things,
such as the Center for Humane Technology,

the Mozilla Foundation...
DuckDuckGo has a blog.

It's like Google, a search engine
that's privacy-centric.

And there's the International Association

of Privacy Professionals, the IAPP.

Those are all really good resources.

The EFF has really, really good
write-ups on things such as, like,

police surveillance and how that's
being used, and things you can do about it.

And they also have a ton of lawyers
on their team

that actually go to battle in court and
try to win cases when it comes to privacy,

those are all really good things
in terms of practical, actionable steps.

We are building a curriculum
at Plunk to do this,

and to put it

online and make it available.

But, extremely basic,

I'll give you a few,
like, right off the bat.

Yeah. Extremely basic.

One is:
make sure your devices are updated.

Even if you have good privacy measures
in place, if the device isn't updated,

it's easy to get into.

So make sure you update your device,
hands down.

That's the best thing. Two
is that you want to make sure

that you have at least a junk
email address, where, if you...

if you go to a store
and they ask for your email address

for 20% off, 30% off, whatever that is,
typically I would say don't do it.

But if you want to make it easy,
just have a junk

email address with information
that's not relevant to you.

it doesn't have your address, stuff
like that.

That's a good way to go about it.

And here's a really interesting one:
if you go to the doctor,

or if you go pretty much anywhere,
and they want to scan your ID,

they don't

need to scan your ID. They can verify that
it's you, and you can say, hey,

you can verify
this is me, but you don't need to scan it.

Right?

Because a lot of companies take that,
they scan it,

and now they have a ton of information
about you that they don't need,

and they store it, or they resell it or
share it with other third parties.

And you don't need to do that.

So if you have a kid,

or anything along those lines, too, they
don't need to do any of that scanning.

You don't need to give your social
in a lot of places.

So it's always good, when somebody says,
may I have your social,

you can ask: is that required for us
to continue this transaction?

Right.

Even at a doctor's office? It's not...

it's often not, so
they don't need any of that information.

So doctors sell your stuff too?

So yeah.

I know
this isn't going to be a clickbait moment,

but there's a law, or
there's an act, called HIPAA,

which covers health information
privacy. Yes.

So they can do that.

So what happens is
there's a barrier that you have to meet.

You have to be some type of person

or entity that crosses this,
this barrier of health information.

Once you do,

which means you meet
a lot of the federal regulations...

you have to meet
all these federal regulations.

Once you do,
you have the ability to transact

with people
who have also reached that barrier.

Right? Yeah.

So you can share,
or they can share, information.

And some of that... like, there's a part of me
that is conflicted on that.

What I'm not conflicted about is, I understand
that even when things are used for good

purposes, people find ways to use them
for nefarious purposes. I get it.

The other part of that, though,
is, of course, that the sharing of information

helps to prevent diseases, helps
you learn about causes.

So, I mean,

figuring out the balance of what that is,

because I do think that's important
and I don't have the answers, is important.

I mean, I...

So I do think it has to be, like...
because with the sharing of that data,

the problem is people don't trust
what you'll do with it.

Like, what shouldn't be shared is,
if somebody has a

condition, that shouldn't be shared
with the insurance company

to figure out ways, right,

so they can maybe not...

or they're not supposed to be able
to not give you insurance,

but we all know they can find ways
to make it difficult for things

to be covered, still.

yes, that's the worry I have.

But I'm like...
I also know the sharing

of information, for things
like understanding health patterns...

You introduce things like doppelgangers,

where people
have similar profiles to you.

Like, what do you think is the balance
there,

in terms of figuring out how we share data?

But it's not...

It's probably regulation that will find out
the answer to my question on that. But...

Well, it's
actually a pretty simple solution.

Okay. And Facebook can do this
with advertising.

Lots of organizations can do this.

It's just, don't attach
personally identifiable information.

Oh yeah. To it. Anonymize the data.

Right.

Like, hey, you know, maybe there's 7,000
cases in Cincinnati of this thing.

We don't need to be able to trace it back
to who those people are.

It's anonymized data.

And, I know, for the nerds out there,
they're going to say, well,

AI has the ability to de-anonymize data
right through that.

There are techniques that brilliant people,
even in the university...

Yes. Yes.

And, well, they add noise, so you can't
identify exactly who has what condition.

And so there are brilliant people
coming up with these solutions.
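
The "noise" being described here is, in many real systems, the Laplace mechanism from differential privacy. As a hedged illustration, not Plunk's or any specific vendor's implementation, here is how a count like the hypothetical 7,000 Cincinnati cases could be released with enough noise to hide any one individual:

```python
import numpy as np

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Laplace mechanism: adding or removing one person changes a count by at
    most `sensitivity`, so noise scaled to sensitivity/epsilon hides any single
    individual while keeping the aggregate statistic useful."""
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical: release the Cincinnati case count with a privacy budget of 0.5.
print(round(private_count(7000, epsilon=0.5)))  # e.g. 6998: close, but deniable
```

Smaller epsilon means more noise and stronger privacy; the design choice is exactly the balance Rob asks about between useful data and individual protection.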

And to go to your question about
how we have

innovation, you know, and privacy:

there's so much innovation happening
in this realm that protects us, too.

There's so much innovation
that's going on.

I think, to answer my own question out
loud, I think we do have to have policy.

And we start incentivizing investment
the same way we started doing

sustainability. Right, right.

So now you're seeing all of this happen,

and all this has happened all across Ohio,
but really all across the country:

there's just solar power farms
being built everywhere.

Of course, there are people
that are trying to be against it

because they want to keep the money train
the same way it's always been.

I'm sure there will be people
that want to keep data the way

it's always been,
because that's how

they've always made their money.

Yeah, but the point is, there's lots of
jobs being created

that way, lots of opportunities
being created that way.

Yeah.

If we start having a renaissance toward,
okay, we can protect people's privacy

and still innovate,

and we want to incentivize
that type of activity...

That to me
is one of the main purposes of policy.

Yeah.

Policy is actually...
it is a guiding principle to say:

this is the type of society we want to be.
And we never get it perfect.

We can't overdo it.

But I like the concept of,
how do we incentivize people to innovate

in a way that protects privacy?

We have the...
you talked about what you do in terms of

the solution to scrub out the noise
using AI.

I mean, there are also blockchains
involved, and zero-knowledge proofs.

You're able to hold data, like, collect
data,

that's totally private,

but then you can still know about the data
without knowing the data.

Like, I know it sounds weird, but it's the
same type of thing you're talking about.

Like, it's a way of proving things
without knowing the details of a person,

which I think shows
we have the technology to do this.
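
As a rough sketch of the "proving without revealing" idea Rob is describing, here is a toy Schnorr-style zero-knowledge proof: the prover convinces a verifier that they know a secret x behind a public value y = G^x mod P without ever disclosing x. The tiny numbers are for illustration only; real deployments use much larger groups.

```python
import hashlib
import secrets

# Toy public parameters (real systems use 256-bit curves or 2048-bit primes).
P = 2039   # prime modulus, P = 2*Q + 1
Q = 1019   # prime order of the subgroup
G = 4      # generator of the order-Q subgroup mod P

def prove(x: int) -> tuple[int, int]:
    """Prove knowledge of x with y = G^x mod P, revealing nothing about x."""
    r = secrets.randbelow(Q)
    t = pow(G, r, P)                                                  # commitment
    c = int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % Q  # challenge
    return t, (r + c * x) % Q                                         # response

def verify(y: int, t: int, s: int) -> bool:
    c = int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % Q
    return pow(G, s, P) == (t * pow(y, c, P)) % P   # G^s = t * y^c iff s = r + c*x

x = secrets.randbelow(Q)   # the secret
y = pow(G, x, P)           # the public value
t, s = prove(x)
print(verify(y, t, s))     # True: the verifier learns that x exists, never x itself
```

The verifier checks a relationship between public values only; the secret never leaves the prover, which is the "knowing about the data without knowing the data" property from the conversation.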

It's just, what problem
are we trying to solve as a society?

And right now we've just said nothing is
more important than just how fast we can

make profits, without any type of
thought about:

should we do all of these things,
and how should we do them?

Yeah, totally.

And one thing,
you know... this is an issue

that is sort of bipartisan,
you know, because you will talk to

a ton of different people
from both sides of the aisle.

And, you know, it's
something that needs regulation,

when people on the right are saying, hey,
I don't even believe in regulation.

But this... they know they're in deep.

They know that there's...

Something that they know:
their kids are vulnerable.

Yes. And another thing, too,

is that a lot of what we're seeing
is that you could be an affluent family,

but this could still happen to your kid.
All in all, it takes just one moment

where: hey, mom, hey, dad, I hate you.

You don't understand me.
And then they put this online.

I used to talk crap about my parents
under a tree house, right?

Right.

But the digital tree house exists.

Yeah, where you're

spewing that. And it just takes somebody
who can identify,

through sentiment analysis,
through using AI, and pinpoint

exact locations of who may be having fits
with their parents at this moment,

to understand.

Okay.

These are vulnerable children. Yeah.

And it just takes one connection.

You can buy their data for five bucks
to understand what their interests are.

You can say, okay,
it just takes this one connection

with this child who's having a bad moment
with their parents online.

And that's all it takes to start
something like that.

So people are understanding that, like,
both sides of the aisle,

all rungs of the economic ladder,
there are people that are affected by this.

What happens if we don't get this right?

Well, I think it's going to be sort of
like climate change.

It's going to get a lot worse
before it gets better.

But I do see hope.

So it's hard to predict
how AI will make this worse.

It's hard to predict how bad actors...
because they're very smart.

I'm asking you to predict.

What do you think that looks like?
If we don't get it right.

I think

it's going
to be a lot more like Brave New World.

Have you seen Brave New World? Yes.

Or, I mean, read the book.

It's essentially
where people kind of understand

that this is happening,
but they don't care that much.

And there will be large
corporations or large governments

that are using that to understand
everything about you and nudging

how you think and what you think.

Over time, there will be bad actors

who are exploiting this to get their way
with whatever they want.

and this could be not even
just from a kids and,

you know, vulnerable population
perspective, but from policy.

Absolutely.

so it looks very bleak,
to be honest with you.

So, yeah,

that it's a world where the algorithms
know us better than we know ourselves,

and we don't even know
that that's happening.

Like you said.

And to take it further,

like, we can get to the point, and it
sounds really freaky, right,

where algorithms

own most things, instead of people
and corporations and nations.

Now, that sounds weird until you realize

that most holders of land aren't
people; they're organizations.

So imagine if we're okay with algorithms...

because it is very possible
that algorithms can eventually determine

who owns land, how policy is written,
all those things, and we wouldn't even know.

Like,
I know it sounds like science fiction,

but it's not science fiction
when you understand that

we don't have a guiding map
for what we want in society,

for how we're using artificial intelligence
to augment us, right?

Augmented intelligence,
not artificial intelligence.

And AI, everybody, is not new.

This is algorithms
and machine learning.

All that has been going on
for a long time. Now

we have another level with generative AI.

And so like I'm with you, I think there's
a lot of hope and potential.

But there's there's huge reasons
to be concerned.

Yeah.

It's hard to overstate
how important it is,

but it always sounds like tinfoil-hat
stuff, in a way.

So it's... and it's impossible to predict,

because this is the thing I struggle with.

We are generating so many billions of data
points,

just like per second with AI,
and how that transforms

without much regulation
could go any direction.

And it's hard to predict
just how bad that could be.

Yeah.

and, you know,
if we don't look out for this

at all,
we don't know how fast it can affect us.

It'll go faster than we can imagine.

Yeah, it's going to be exponential.

It already is. It already is. Right.
Like we're growing.

That's hard for

people to understand. Yeah.

As you know,

exponential growth is not something
the human brain can actually understand.

So people say, well, like,
what does that mean?

It literally means it's growing so fast
you don't understand it.

Yeah. Right.

From the human side... like, we understand,
as nerds, what exponential growth looks like.

We can see it on a chart.

Yeah, but we really can't visualize
exponential growth.

So it's moving so fast that, like,
we need to... this is why, you know,

you're going to be at Midwest Con,

and this is why, like, we're really focused
on what policy innovation looks like.

Because we think... I think you'd agree
with that...

having policy doesn't mean
that we don't have innovation.

It means that we are building trust
so we can build better innovation,

have a flexible enough policy,
but a policy that sets clear rules

so we understand
how we're operating with one another.

Like I just think it's very important.
Agreed.

Like, we still need
to get to deepfakes

and all those things that are really, like,
that are really concerning people.

So, like, before we leave,

I've got a couple of lightning
round questions. But before I do that: like,

if you had
to say something that would surprise people

the most about the lack of privacy
or digital safety,

can you think of a story or an example
that exemplifies

that?

What surprised you the most?

Yes. On average,
there are 3,192 data points about you.

And this is much more than what people
that are close to you, your friends,

may know about you.

So this is being sold
and traded every single day,

and being aggregated every time
you accept the terms and conditions,

right, that invade your privacy.

And I remember
the one thing that you told me about,

when you send out a picture, the metadata.
Please tell people about that...

The wild part.

When you send out a picture,
the metadata behind it...

what just one
picture can do. Yeah. True.

If you're uploading to Instagram
or Facebook or various sites,

you can right-click on a computer
and inspect the element,

and it shows
what time of day this was taken,

what camera it was taken with,
GPS coordinates.

All of that stuff is not obfuscated
on a lot of websites.

So you're able to track people
based on where they took a picture.
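
As an illustration of the metadata being described, here is a minimal sketch using the Pillow library; photo.jpg is a hypothetical file, and which tags actually survive depends on the camera and on whether the hosting site strips them.

```python
from PIL import Image, ExifTags  # pip install Pillow

img = Image.open("photo.jpg")    # hypothetical file
exif = img.getexif()

# Top-level tags: timestamp, camera make/model, software, and so on.
for tag_id, value in exif.items():
    print(ExifTags.TAGS.get(tag_id, tag_id), value)

# GPS coordinates live in a nested IFD (tag 0x8825).
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(ExifTags.GPSTAGS.get(tag_id, tag_id), value)
```

If the GPS IFD is populated, those latitude and longitude values are exactly what lets a stranger track where a photo was taken.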

Yeah. So very scary stuff.
So just think about this.

All right?

So I want to get to a couple of lightning

round,
lightning round questions about you.

All right.

So you have a committee of three,
living or dead,

to advise you on business life, digital
privacy, safety, whatever you want.

Tell me who these three people are
and why.

Moxie Marlinspike
would be the first one, who created

the end-to-end encryption
that's used on WhatsApp and Signal.

So that would be one. My grandpa,
who was a judge here in Cincinnati.

Fantastic.

And a really levelheaded person.

What's his name?

Moxie Marlinspike.

No, no, your grandpa.
Norman Murdock.

Okay. Yeah, he was a county commissioner,

judge for a while. Wow. Yeah.

And then, you know, the third one,

I would probably.

And there are so many good people...

I would probably look toward

somebody named Steve Shahan,

who is a very interesting musician.

Yeah, but he went to pretty much every
single country and learned about cultures,

and it would be just so good
to get that perspective.

because he's still alive now
and has seen how cultures have evolved.

And he's extremely intellectual
in that way.

And I think you're going
to need that perspective.

But I think those are the three
that I would go for.

All right.

What's an important truth you hold that
very few people agree with you on?

I like a hot take.

Yeah.

I don't like the Otter AI

that is in every meeting.

And I'm gonna write something about this.

But you're guilty. You...

That's what I... But...

Yeah, but, I get emails about it
all the time.

Every meeting I show up to.

And then there are people
that don't show up to the meeting

that have it,
and then I'm not in charge of the meeting.

And now I see, like,
all these things all over.

I've always... Otter AI is not going to be a plug
for validating meetings.

No, no, I was talking to John
Salisbury about that, yeah.

Even just like when we have a meeting
or a presentation.

making people turn off their phones, like,
if it's in person.

Yeah, like.

Demanding people turn off their phones
and just, like, creating attention.

I also think Otter, on one hand, excuses
people from really paying attention,

because they think they can go back to
something. Where it's like... I want, like,

if we're having a meeting,
I want full focus,

I want good participation.

And, you know, this is a fleeting moment,
so you've got to pay attention.

So I like that a lot.

No, I think that's really important.

And I'm guilty of what you said.

Right.

And but,
but I respect it and I know it's true.

Right.

Because, in the last interview
I had right before you,

we were talking about the need
to make sure people understand that

artificial intelligence
is not going to replace intelligence.

You need to still be intelligent.

You need to understand how to do this
and have focus.
Right?

Because those who are able to

communicate their authentic intelligence,
they will be the winners

in the age of artificial intelligence.
Totally. Right. Yeah.

That takes, though,
focus. That takes presence.

Yeah.

That takes still grinding it out
and learning those hard parts.

And no amount of artificial intelligence
will replace the need for authenticity.

Totally.

Like so I completely agree
with you on that.

All right.

So, a time

you failed in your life
and how that made you better.

I fail literally every single day.

Every single day.
Welcome to the club. Yeah.

It's always made me better.

I think there was one point, back
when I was first starting the Slate

Up, you know, related stuff,
where I wanted to quit.

And I talked to my team about it,
and they gave me a week, and then

I saw how much work my team was doing,
and I wasn't there to lead them.

And I felt really bad about,
you know, wavering.

Yeah.

At that moment,
I realized that leaders need

extreme control of their emotions,

and they need to step up
and not be that person that wavers.

So it was either

that I should not be a leader
or I need to step up in that context.

And that was a good path
for me to like, really understand that.

That's a great point.

I'll say this from one leader
to another, because I've felt it too.

You also need a safe...

a safe place to be vulnerable

and to talk about it,
because you're going to have doubts.

You're going to have difficult times.

Now, that mentor may not be your team.

Maybe, maybe, at some point,
maybe it can be.

But having people
that have been through it,

because it's always hard.
Like, it always is hard.

Like, even when you get to where,
where you think you want to be,

new problems come about.

Yeah. Right.

And it's very hard
for people to relate

to what you're doing.
Like, your family can't.

They can't.

That's why
you need to talk to other leaders as well.

So I tell you, totally,
you need to get support

and advice from people. It's important
to have them pour into you.

You'll help them too.

Because you're going to be...
you need to be vulnerable,

because if you feel like
you always have to take it on,

yeah, you might break too,
which also hurts your team.

True.

But I agree with you
in terms of regulating your emotion.

You know, I've had to learn that too.

And I'm still learning.

It's a constant process,
regulating your emotions.

But you have to be in a place
where you can also be vulnerable.

and sometimes you have to show your team
vulnerability, too.

I think all of this stuff is balance,
and I think your team showed you

that they had your back,
and that sounds like that inspired you.

Yeah, definitely. Yeah.

All right.

Final, final final lightning
round question.

What's your slogan?

Your... the one
that'll be on your grave.

What is it?

Probably something like, we're all going to die.

Enjoy it.

That's true.

You know, don't take it too seriously.

All right, brother, good to see you. Good
to see you, too. Pleasure having you on.