A podcast that explores how AI is transforming careers, businesses, and industries. Hosts Greg Boone and Erica Rooney deliver real-world use cases and actionable AI strategies to help professionals stay ahead of the curve.
Molly Mae Potter: And with DeepFakes,
I would highly encourage families
to have a special passphrase
that no one knows out there.
Because Erica, I mean, if I, if I got a
video from you and you're like, Molly,
like I am in super trouble right now.
I really need you to
send me a thousand bucks.
I'm gonna say, alright Erica, you
know, what did you have for breakfast?
You know, at this conference?
If you're just like, oh,
I had a banana muffin.
I'd be like, Nope, not, not Erica.
Right?
Yeah.
But little things like
that, that, that you can do.
Greg Boone: AI isn't the future.
It's now, and whether you're in hr,
sales, operations, or leadership, the
choices you make today will determine
whether you thrive or get left behind.
Erica Rooney: Welcome to the AI
Voice or Victim podcast y'all.
We are live at the AI Powered
Women Conference here at MIT.
I am so excited.
We have none other than Molly
Mae Potter here with us.
She is a security and product
development executive.
Y'all with several really weird career
shifts and those are her words, not mine.
Really weird career shifts in her life.
So y'all, this is going to be fun.
Molly, welcome to the show.
Molly Mae Potter: Ah,
it's so great to be here.
This has been an amazing event
and thanks for having me on.
Erica Rooney: Can I also tell you
why I was like, this is gonna be so
fun for me, and you're gonna laugh
when I say it because homegirl sends
me an email and I'm like, oh my
God, my matron of honor is Molly.
And she goes, thank God you
said that because everybody
else says their dog is Molly.
Everyone's
Molly Mae Potter: got a golden doodle
Erica Rooney: named Molly.
Molly Mae Potter: I'm convinced.
Yeah.
Well, you're,
Erica Rooney: you're my bestie girl, so,
Molly Mae Potter: awesome.
Erica Rooney: Tell me a bit
about you, your background,
why you're so excited about ai.
Molly Mae Potter: Yeah, so.
Right now AI really is like
this new revolution and where
are we gonna play in it?
What are we going to do?
I actually started my career off
in aviation, so I was a flight test
engineer in the Air Force flying in
the back of F sixteens, testing a.
All the cool new
technology, the total badass
Erica Rooney: is what
you're trying to tell me.
Molly Mae Potter: Um,
Erica Rooney: well, yes.
And
Molly Mae Potter: um, used to being in
a very male dominated field and rolling
with it and hitting all the punches.
And, um, in 2013 I left the military
and had no idea what I was going to do.
Like I worked at a running shoe
store, so I went from flying in F
16 to like selling running shoes.
And I was like, what am
I gonna do with my life?
Fell into high tech.
So I worked for a very large
tech company, um, and then kind
of evolved my career in fixing.
Like messes, like organizational chaos
and mess, and anything that just needed
structure to it, I would just go do it.
So the jobs that no one else wanted to do
is where I fell into in like large tech.
And wherever there's security,
there's usually a mess, and
so therefore about four.
Four, maybe five years ago I started
becoming a security leader, and I use it
very loosely because I fell into security
because it needed organizational cleanup.
And then I ended up going to another
very large like cloud hyperscaler
and leading their security
and compliance and realizing.
Oh my gosh.
No one really knows what they're
doing in the age of AI right now.
And so it's all about just figuring
things out as you go along in life.
And so I've had all these weird career
pivots, and it's all just been because
there's been opportunities where there's
been a void and a lack of leadership,
and no one to really organize it.
Erica Rooney: Oh my gosh, yes.
And you have talked
about an oh shit moment.
Yep.
An oh shit moment where
you felt like an AI victim.
You wanna talk about that?
Yes.
All right.
Tell me what happened.
Molly Mae Potter: Um, so there I
am VP Molly Mae Potter Security
and Compliance for a very large
like Fortune 100 tech company.
And, um, I am sitting with who I
consider like my work husband at the
time, and he's running infrastructure
and our boss has mandated that we.
Do ai, ai, right?
For get out there, go do ai,
go do ai, and do some ai.
Go do ai.
But there was a, um, a cloud product
that could come in and really help
us with optimizing cloud costs,
which is a huge driver for ai.
And, um, I'm sitting there and I'm
talking to this product company and
it all seems great and I'm like.
I, I don't know what questions to ask.
Like I know the basic cybersecurity,
how do you manage access?
Do you have a SOC two
type two certification?
Right?
But this is, but oh my God.
Like I am, I don't know what to ask.
I don't know what to ask.
All of these products that we are
bringing into our state, and I
look over at my other executive
partner, right, and he's like.
And he's got 30 plus years
in like this industry.
And he's like, you're doing a good job.
Like, I dunno.
And I thought, oh my God, I am sitting
here and I have no idea what to ask.
What do I do?
And so that was the moment where I was
like, okay, I gotta go figure this out.
AI is fundamentally changing the
entire landscape of cybersecurity.
What we have been doing is
not going to take us forward
and I don't know what to ask.
And so instead of going, uh, and just
leaving it at that, I hired an executive
coach, probably the only person that I
could actually say as an expert in ai, Dr.
Funta guns, 25 years PhD in the subject.
And I said, I'm hiring you.
You need to teach me like the
technical ins and outs of AI so that
I at least know the basic questions
to ask, and then I just need to
pull my full network together.
And then I started realizing.
There was no complete set of questions.
There was no standard
framework out there, right?
This is the wild, wild
west for cybersecurity.
And oh, by the way, the frameworks that
are popping up right now that normally
cybersecurity experts would like lean
on, they start to contradict each other.
So it's getting really, really
interesting, and that's when
I decided, okay, I'm gonna
go be a voice in this space.
As we figure this out together,
Erica Rooney: oh my goodness.
It makes me think a little bit of
like Facebook and how it started as
this cute little fun way for college
kids to, oh my gosh, stay connected
and, and now is like being accused
of, you know, getting the President
Trump elected and all these other
things and has become this big thing.
And you've got Mark Zuckerberg in Congress
and doing this, that, and the other.
And I feel a little bit like that.
Yes.
With ai, where it's like, okay, chat.
It's so great.
It's so cool.
And everybody's leaning in and it's
just growing into something at a
rate that we can't yet That's it.
Control it in the way that we like to Yep.
Molly Mae Potter: And
And need to and need to.
And the whole new surface of AI is
also a whole new surface for attackers.
And I don't want to be doom and gloom
about it, but we just need to be
very aware of a shifting landscape.
Also means shifting cybersecurity concerns
not only for society, but also our
personal selves and how we view the world
and knowing what's real and what's not.
In social engineering and things
that used to take very sophisticated
hackers, you kind of think of them as
the guys in the hoodies with their like
monster energy drink in the basement,
like typing away is that can now be
done in a very, very simplistic way.
And then nation state actors, I mean
that those are in industries like
professional hacking and on the AI
level, oh my gosh, it just scales
everything from social engineering.
Down to the deepest bowels of
like the NSA type of level.
Yeah.
Erica Rooney: So let's go
a little deeper on that.
Let's talk a little bit about how
AI is transforming the way companies
think about cybersecurity, both
for the better and for the worse.
What are you seeing?
Molly Mae Potter: What I am primarily
seeing is that very large companies are
saying, we're doing security for ai.
And then they don't
have policies in place.
And I mean, I'm talking
Fortune 100 companies.
It's kinda like crossing
Erica Rooney: your fingers and just
hoping nothing goes wrong, right?
Molly Mae Potter: Like how do you use ai?
But the security and these very large
security companies that, um, have
also focused on supporting enterprises
are built for what we've been doing,
which is really, um, react, respond.
Rather than, and it's kind of more
of a discover and then you patch
something where it really needs to
shift towards predictive, right?
And, and being more proactive
in that predictive model.
And so the companies that are
starting to adopt more of the
predictive, I think are going to have.
Much further advantage and an edge
where a lot of legacy companies are
still stuck in the mud with, we have
this legacy security organization, this
is how we've been doing, detect and
respond in a very reactive way, and
they're really pushing for, we've gotta
get all of our AI features out rather
than looking at, well, how do we balance
the security and tech debt with that?
And the companies that are looking at.
Strengthening their security with the
features are driving customer trust
at a much greater adoption rate.
But those are more of
the smaller companies.
Those are the ones that you're starting
to see come up really, really fast.
Like they're doing it in a very
different way because they've been
able to reshape and reframe, oh, like
Erica Rooney: smaller
companies can move faster.
Right.
Those big ships are harder to turn.
Right.
Molly Mae Potter: So it's,
it's that double-edged sword.
Right.
AI is making it so smaller companies can
actually have a stronger security posture.
Yeah.
Because they can adopt all these tools,
but then some of the larger companies
are still like turning a barge.
And um, the last company I worked for
was a massive tech hyperscaler company.
And just seeing the, like, the scratching
of the heads and security and some of
these meetings, I thought, oh my gosh.
Like this is a really big.
Oh shit.
Light bulb moment for me.
Erica Rooney: Yeah, I can imagine.
So we know AI gives us speed.
We know it gives a scale.
Yeah, I use it.
You use it.
But it does open up blind spots.
Yes.
So what are some of those blind
spots that you are seeing?
What are those hidden risks
that we need to be aware about?
Molly Mae Potter: Yeah, and, and again,
it's like knowing the ingredients
that's in the bread you're about to eat.
Um, so Agen AI is really
about the architecture.
So the more that you know
about what's in your solutions.
And that architecture, the more you're
going to be able to protect it, and
the more you're going to be able
to understand when things go awry.
So it's really about knowing the recipe.
So what's in it?
What are you consuming?
And I also tell people that
security's a lot like playing.
Playing on the playground.
So you have a sandbox who's
invited in the sandbox, who's not.
So access is huge.
Huge for that.
So maintaining access and identity,
like knowing where your toys are
stored for the right level of
kids on the playground as well.
Where's your data stored?
Who needs access to it?
Right?
We don't want to give
machetes to five-year-olds.
Right.
So it's idea about Idea, yeah.
Think about your data and
where it's being stored.
And also I think being really kind to
people during this time, security can have
a tendency to beat people over the head.
Erica Rooney: Well, and
I think it's scary too.
It's, but because nobody
wants to do the wrong thing.
Exactly.
But they find that they're
often like they do it
Molly Mae Potter: and,
and so if you are really.
Vulnerable as a leader to say, we
are all figuring this out together.
I'm not gonna beat you up.
We gotta just figure this out.
And it's not just the security team's,
like it's everybody's responsibility
and we gotta work together on this
cross-functionally because if HR wants
to use this and it's gonna be personal
data, and then finance wants to use this,
the security team and the engineering
team needs to network that together.
So what I am seeing from the landscape
shift is when you start looking at.
The different layers of ent, there's all
sorts of crazy things that can happen.
It's loss of control, it's hallucinations.
And, and even, um, I think last week
or the week before, you know, we're
seeing even with Gemini, hidden
text prompts in, um, in images that
then get consumed and then the, the.
Basically the poison injection prompts
will ask or the hackers to send your
Gmail information and your calendar.
So what you are connecting
to, so again, that's access.
What are you as a leader
going to allow access to?
And what is that risk that
you're willing to consume?
So access, knowing what's the recipe
is, um, and understanding your
vendors and where they come from,
asking those hard questions that I
didn't know how to ask before, right?
Um, what are you doing
for security, right?
How often are you updating your model?
What data are you using?
Are you purchasing someone else's data to.
Like understanding these really basic
things will actually go a long way.
Erica Rooney: Hmm.
It's so interesting to me too when
we're thinking about this, 'cause
usually when you have a new tech
come in, it impacts one department or
maybe two, and then the IT department.
'cause they're in charge
of all those things.
And you involve security too
because Okay, we're all good.
Yeah.
But this is something that
encompasses the entire organization.
Yep.
And so I'm a big fan of bringing HR in
at the top of it, but to, because it's
also this whole change management thing
that we have to get people on board with.
But it's also the security thing.
Yes.
It's not just, Hey, this is our tool.
It's how are we using it?
What's okay to use it?
You know what's going to, who's using it?
Who's using it?
Yep.
All of the things you need to think about.
I love that.
All right.
The other thing I really
wanted to ask you is.
With individuals.
Right.
Because I hear from people like my mom.
Yeah.
They are so terrified.
Anytime a piece of
their data is given out.
Yeah.
Right.
Then you've got the other end
of the spectrum where I'm like,
my data's everywhere already.
Molly Mae Potter: Yeah,
Erica Rooney: yeah.
You know?
So how do we put that individual back
in the driver's seat so that they
know how their data's being used?
Yep.
They know what they should be
thinking about before they click.
Oh yeah, I can track me anywhere.
Yeah.
Well, what do we do?
Molly Mae Potter: So we're
living in a digital age.
Your data is out there.
So even if you think no one has my
data, I'm sorry, they have your data.
And um, so you also have to just be
aware of what you think you're sharing.
Even things like your writing tone
can be then multiplied into other
LLMs and we're off to the races.
Um, and.
So when we start thinking about DeepFakes,
when we start thinking about even just
all the chat bots that are out there
that you're putting in, so you're
talking to a chat bot at your bank,
well, you gotta start thinking about
where that data's going to be used.
Um, it's about having those critical
conversations with your family.
And I tell people to actually
take an AI inventory.
I actually know what AI my family uses.
Ooh, right.
So that it is all interconnected.
So as a family, my husband's
an architect, right?
He has his systems, I have mine.
Are we combining them together?
Are we not?
Is he using this for work?
Is he using this for personal?
What am I using?
And then also teaching the
critical thinking for, I've
got a 6-year-old daughter.
So she asked questions to chatbots, but
what questions are we actually asking and
how does that reflect back on our family?
And with DeepFakes, I would highly
encourage families to have a special
passphrase that no one knows out there.
Right.
Um, because Erica, I mean, if I, if I got
a video from you and you're like, Molly,
like I am in super trouble right now.
I really need you to
send me a thousand bucks.
Yeah, I'm gonna say, alright Erica, you
know, what did you have for breakfast?
You know, at this conference?
And if you're just like, oh,
I had a banana muffin, I'd be
like, Nope, not, not Erica.
Yeah.
Right.
Yeah.
But little things like
that that, that you can do.
So it's out there.
How are you responsible with it?
And, and it's also just
being a good human.
What are the things that you're gonna
teach your teenage kids that they
should and should not do from even
a social engineering perspective?
'cause now we have to think about social
engineering as well on all of this.
So have those conversations.
Know what AI you're using, know as
a family, what you're willing to put
out there, and what you're willing
to let the world learn on about you.
And what are those things that are
never, ever, ever going to hit a
digital platform that's only between
you and your spouse, you and your kids.
And make sure that that's in
the back pocket at all time.
Yes.
Erica Rooney: I love the idea of a safe
word, but I've gotta tell you this story.
Okay.
So the other day, a mom who
lives in my neighborhood Yep.
That I don't know super well, but she
lives in the neighborhood so well enough.
Needed me to pick up her kids.
So I was like, sure.
She goes, here's the safe word,
they'll trust you after that.
And I was like, cool.
So the school gets everybody in my
car, 'cause it's carpool pickup and
there's like six of them in there.
And these kids don't know me very well.
They know all the other kids in the car.
So I think they feel a little safe.
Yeah.
And I'm like.
The safe word is, and they look
at me like whatcha talking about?
Because ain't nobody used
that safe word in 27 years.
And I'm like, oh Lord.
So make sure your kids know what
the safe word is on a regular basis.
You can't teach 'em once in a walk away.
Yeah.
I love that.
Okay.
I wanna talk a little
bit about leadership.
Yeah.
I'm gonna take a little left
turn here because I know so many
leaders are overwhelmed with ai.
Yeah.
I know me as not a techie person,
not a cybersecurity person.
Yeah.
It definitely overwhelms me.
Yep.
'cause you know, maybe I
accidentally put in client data.
Yep.
I don't know, like, yep.
What are those immediate steps that can
help you get comfortable, reclaim your
voice, and really understand the wrist?
Molly Mae Potter: I love
a good old SWOT analysis.
I, I don't know, like there's
just something about starting
with a basic assessment.
Start with the basic assessment
of your, of your organization.
Start with the basic assessment of the
tools they're using of their, their
digital interfaces, the conversations,
the key meetings, the the threats
to AI to your current business.
Like, just lay it all out
and involve your team.
Right?
And I think starting with an
assessment, you'll start learning
very quickly what are the big gaps.
From an AI risk perspective or, um, even
just a, a business risk perspective,
because if you have a business, I hope
that you already know right now at
least what your key business risks are.
So start protecting that.
There will, it will evolve over time
and there will be new risks that will
show up in the digital landscape.
But if you at least know your current
business risk and your biggest
opportunities, just start there.
Right.
See, I think most
Erica Rooney: people are only honing in
on the opportunities For a lot of it.
Yes.
'cause they're like, it's so
exciting, it's gonna scale.
We're gonna grow, we're
gonna go fast, fast, fast.
And they either minimize or
they completely ignore them.
I mean.
How do we get those people?
I mean, I know you said do
the SWOT analysis, but like to
just I know it's going fast.
I know it's exciting, but slow down.
Molly Mae Potter: Yeah.
And I, I, as the security
practitioner in, uh, and Signature
authority and an organization
that was like, go, go, go, go, go.
And I was like being left in the dust.
Um, I started.
Really partnering with other
stakeholders in the company that
had a bigger voice up to the board
where we could start reporting
digital risk in a different light.
Working with the CIOI had a great
relationship with the last two CISOs
as well, so I was in the product
team, but I was reporting to the ciso.
I see this as a huge risk.
This isn't just a risk in a cloud product,
it's a risk across the board that we
don't have a risk framework for this.
We need to go do this
and go build that out.
Talking to the audit committee,
I had a great relationship.
Most people wanna stay
away from internal audit.
I loved working with internal
audit because it's a big
stick that you can carry.
And so when you start working with
all of these partners, you also start
seeing how the interdependencies across
your organization tie back to security.
And you at least highlight it as a risk.
Now it's up to the business
to decide if they're gonna
do something about it or not.
But when we started gaining momentum on
this is now a new risk vector, we need
to start making sure our tools and our
infrastructure, and as we're looking at
our tool chain across the board, how do
we integrate that as a new risk factor?
You at least get the conversation going.
So it does not mean that it has to be
perfect, but it gets the conversation
going when you start with the reporting
factors and you start with those with
the tool chain, and you start with those
with customer data and you start having
conversations with customers that now are
like, but how are you securing your ai?
And I tell you what customers drive it.
So if you can't answer basic
questions on customer trust.
It's gonna force you to really think
very long and hard about what you're
doing, not only internally with your own
operations and where things are being
developed and maintained, but also the
product itself and how you actually
secure the product for the customer.
Customer trust you need to
Erica Rooney: take the time.
Now you have to.
You need to do it now, right?
So
Molly Mae Potter: that in the future
you're not having to redo it all.
Internal operations, take an assessment.
Know what your customers are asking.
But also help them along the way.
Yes.
Right.
Um, let the customer be the voice and
your customer should also help guide
your roadmap and what you're going to
be doing and, and to go ahead with this.
So don't just assume I saw a
lot of features being released.
Oh yes.
That customers I don't
think are gonna use.
And then those tend to rot in
the infrastructure and they
don't get maintained and they
don't get care and feeding and
they become big security cysts.
Yes.
I love that you're bringing that up.
Erica Rooney: 'cause we had a conversation
earlier that really talked about.
Understanding what the
client wants and needs.
Yep.
And is looking for and
paying attention to that.
Yep.
And guess what?
You can use AI to help you with that.
Molly Mae Potter: It's a
great augmenter for it.
Yep.
And, and it's also a great way, as I
said, that double-edged sword, right?
Smaller organizations can now have a lot
more comprehensive security coverage.
Yes.
They call it sock in a box
security operations command.
Right.
Like in a box.
And there's, and they're
becoming more predictive.
So get on that bandwagon,
get, go, do it, right?
Um, it all connects and is it perfect?
No, but is it great progress as to where
we were as someone that's been a security
leader, he is always been reactive and
you never know when things are just
gonna hit the fan at one o'clock in the
morning on a Saturday night, which they.
Always do during vacation.
Um, now I could be more predictive.
Erica Rooney: I was like, in the HR
world, it's always 4 45 on a Friday.
There
Molly Mae Potter: you go.
Yeah.
Oh
Erica Rooney: my
Molly Mae Potter: God.
Friday, like, yeah, security is
always 8:00 PM It's always at night.
Well, that's when the shit
Erica Rooney: goes down.
Yeah.
But, all right.
What is next for Molly Mae
Molly Mae Potter: Potter?
Yeah, so I actually just launched,
uh, a company called Salem, CSO,
um, and we are a conglomeration,
it's really a partnership based
network of security, um, minded.
And, um, leaders that have companies
that, um, how do I put this?
We are all in a partner based
organization, so we're a menu.
So we'll do the end-to-end tech stack
for you for full technology solutions and
product solutions, but all with companies
that are led by really strong, amazing
women with a strong security mindset.
And together we partner to
put solutions forward that not
only customers can trust, but.
Have security and integrity in them and
are also really leading the way with where
we're going for compliance and regulatory.
So being that vocal voice upfront.
So that's Salem CISO right now.
And um, some of our
partners are, um, in the US.
Some of them are overseas.
I think it's really important to
start getting some of your overseas
forces like really engaged in that
conversation so it's not a North
America based conversation for security.
Um, and, um, really empowering, uh, women
leaders globally in the security space.
I, for ai.
Erica Rooney: I love that.
Oh my gosh.
Molly, thank you so much for
being, being here today, y'all.
We are live at the AI Powered
Women Conference here at MIT.
Thank you so much for joining
us for AI, voice or Victim.
We'll see you next time.
Thanks for joining us.
On ai, voice or victim.
If you want to stay competitive
in the AI age, start now.
Take one insight from today's episode and
put it into practice in the next 24 hours.
Make sure to follow us, share
your thoughts, and subscribe
for more actionable AI insights.
See you next time.