Threat Talks - Your Gateway to Cybersecurity Insights

One unlocked phone can unravel the defenses of a billion-dollar enterprise—because in cybersecurity, small mistakes don’t stay small for long. Attackers can read notes, steal IDs, or impersonate you on WhatsApp. A reused password can launch a remote tool that looks completely legitimate.

Rob Maas (Field CTO, ON2IT) and Luca Cipriano (Cyber Threat Intelligence Program Lead, ON2IT) reveal how poor cyber hygiene erodes trust, endangers partners, and weakens enterprise defenses.
CISOs, CIOs, and IT managers, remember: in a Zero Trust world, your weakest link might not even be inside your organization.

  • (00:00) - Why your cyber hygiene affects others
  • (00:28) - Meet the speakers (Rob Maas, Luca Cipriano)
  • (00:47) - Cyber hygiene defined for CISOs
  • (03:00) - Unlocked phone → passwords in notes, WhatsApp fraud, ID photos
  • (05:53) - SOC case: contractor email compromise → remote tool drop (ConnectWise)
  • (09:40) - OSINT: 19 breaches + iterative password reuse
  • (17:01) - What to fix now: MFA, vaults, device lock, breach monitoring
  • (20:24) - Final takeaways & resources

What You’ll Learn (From Real-Life Example Discussions)
• How a stolen phone quickly turns into identity theft, impersonation, and scams targeting your contacts.
• A real SOC case: a contractor’s reused password allowed attackers to hide a remote access tool inside normal IT activity.
• How OSINT and dark web data reveal how password reuse spreads risk across accounts.
• Why shared tools like Google Docs can quietly multiply breaches when one user slips up.
• Simple upgrades—MFA, password vaults, breach alerts, and secure devices—that cut your organization’s exposure fast.

Related ON2IT Content & Referenced Resources
• ON2IT: https://on2it.net/
• Threat Talks: https://threat-talks.com/
• AMS-IX: https://www.ams-ix.net/ams
• WatchYourHack: https://watchyourhack.com
• Have I Been Pwned: https://haveibeenpwned.com

Guest and Host Links: 
Rob Maas, Field CTO, ON2IT: https://www.linkedin.com/in/robmaas83/ 
Luca Cipriano, Cyber Threat Intelligence Program Lead, ON2IT: https://www.linkedin.com/in/luca-c-914973124/

If this helped, subscribe to Threat Talks. Share this episode with your partners and contractors—stronger cyber hygiene across your ecosystem protects everyone. 

🔔 Follow and Support our channel! 🔔
=== 
► YOUTUBE: https://youtube.com/@ThreatTalks
► SPOTIFY: https://open.spotify.com/show/1SXUyUEndOeKYREvlAeD7E
► APPLE: https://podcasts.apple.com/us/podcast/threat-talks-your-gateway-to-cybersecurity-insights/id1725776520

👕 Receive your Threat Talks T-shirt
https://threat-talks.com/

🗺️ Explore the Hack's Route in Detail 🗺️
https://threat-talks.com

🕵️ Threat Talks is a collaboration between @ON2IT and @AMS-IX

What is Threat Talks - Your Gateway to Cybersecurity Insights?

Threat Talks is your cybersecurity knowledge hub. Unpack the latest threats and explore industry trends with top experts as they break down the complexities of cyber threats.

We make complex cybersecurity topics accessible and engaging for everyone, from IT professionals to everyday internet users, by providing in-depth, first-hand experiences from leading cybersecurity professionals.

Join us for monthly deep dives into the dynamic world of cybersecurity, so you can stay informed, and stay secure!

00:00:00:04 - 00:00:03:23
What if your cybersecurity hygiene
causes issues to others?

00:00:04:13 - 00:00:09:02
The subject of today is about
how one person’s bad digital habits

00:00:09:07 - 00:00:15:20
can lead to significant risk to
colleagues, friends, family, etc.

00:00:16:07 - 00:00:18:02
Welcome to Threat Talks.

00:00:18:02 - 00:00:21:03
And today we dive deep
in cybersecurity hygiene.

00:00:21:18 - 00:00:24:15
Let's get on to it.
Welcome to Threat Talks.

00:00:24:15 - 00:00:27:20
Let's delve deep into the dynamic world
of cybersecurity.

00:00:28:22 - 00:00:33:10
With me today is Luca Cipriano, who is
our CTI and Red Team Program Lead.

00:00:34:00 - 00:00:35:18
Welcome, Luca. Thank you very much.

00:00:35:18 - 00:00:38:18
And my name is Rob Maas,
Field CTO at ON2IT.

00:00:39:09 - 00:00:42:09
So, Luca, today's subject
is cybersecurity hygiene.

00:00:43:12 - 00:00:47:07
Can you elaborate a bit on
what do we mean by cybersecurity hygiene?

00:00:47:09 - 00:00:48:00
Yeah, sure.

00:00:48:00 - 00:00:53:18
So, well, in general, what we mean are all
the best practices that we should follow

00:00:54:02 - 00:00:58:16
when it comes to, for example,
password, digital footprints or,

00:00:59:04 - 00:01:05:13
everything that involves, like,
data that should be protected.

00:01:05:13 - 00:01:09:02
So things like, don't re-use passwords,

00:01:09:02 - 00:01:11:12
the same password
for every website is a start.

00:01:11:12 - 00:01:14:11
Don't use a simple password.

00:01:14:11 - 00:01:17:13
Like, for example... yeah,
I don't know, your first name.

00:01:17:13 - 00:01:20:17
[ ] Welcome1, I think is
still the most used one. Yeah.

00:01:20:17 - 00:01:26:01
So unfortunately, that’s sad,
but I've seen like, welcome, or

00:01:27:09 - 00:01:29:22
different versions of welcome
in different languages.

00:01:29:22 - 00:01:36:02
Welcome2025. Yeah, with some extra numbers
and people will just keep using it and,

00:01:36:05 - 00:01:39:07
maybe they just, change the number,
like one number,

00:01:39:07 - 00:01:42:07
or one extra also, which is
really easy to guess,

00:01:42:17 - 00:01:46:08
like, for example, other things
that could be like good

00:01:47:07 - 00:01:53:18
hygiene is like, don't save credit card
details or password on browser.

00:01:53:18 - 00:01:57:18
They can be stolen, if you get infected
by info stealer, for example.

00:01:57:23 - 00:02:00:12
So you should use, well, password vaults.

00:02:00:12 - 00:02:02:17
Don't give away everything...

00:02:02:17 - 00:02:06:05
Oh, yeah.
We'll get some more into that towards the end.

00:02:06:05 - 00:02:08:09
Yeah. But, you say okay,
it's about protecting data.

00:02:08:09 - 00:02:10:12
It's your cybersecurity hygiene. Yeah.

00:02:10:12 - 00:02:12:20
What would you say to people
who say I have nothing to hide?

00:02:12:20 - 00:02:15:05
Because I hear that quite often.
Yeah.

00:02:15:05 - 00:02:18:21
That's the problem is like, a lot
of people, they always say, well,

00:02:19:13 - 00:02:24:11
if I got hacked, I don't care, I don't
have money to be stolen, I don’t

00:02:26:02 - 00:02:27:05
I have nothing to hide.

00:02:27:05 - 00:02:30:19
Even if they get in my email,
I don't care what they can read.

00:02:30:19 - 00:02:34:22
But the problem is, like, it's
a little bit selfish, I think, but also,

00:02:35:02 - 00:02:39:17
maybe because people don't really
think through to the extent that

00:02:40:05 - 00:02:44:06
this can create problems not only to you,
but to the people around you.

00:02:45:10 - 00:02:48:11
Because, like, I can give you an example.

00:02:48:11 - 00:02:51:11
There's a person which
is quite close to me.

00:02:51:23 - 00:02:56:05
I have a lot of people
that have this problem.

00:02:56:05 - 00:02:59:11
I'm not going to say any names,
but like, in the specific

00:03:00:02 - 00:03:04:09
case, this person is quite close to me,
and we have kind of a work relationship.

00:03:04:09 - 00:03:09:20
So, the problem is that this person doesn't

00:03:09:20 - 00:03:12:23
or didn't, because then I said
they had to do that.

00:03:12:23 - 00:03:17:11
They didn't use the lock screen on their phone,
so anybody could just unlock the screen.

00:03:17:14 - 00:03:21:20
There was no facial recognition,
no fingerprint or no pin code, anything.

00:03:23:07 - 00:03:26:07
And the phone got stolen from the car.

00:03:26:17 - 00:03:30:15
And of course, I was helping this person
and want to say like, yeah,

00:03:30:15 - 00:03:32:15
but you should use something to lock it,

00:03:32:15 - 00:03:34:21
because otherwise
they can access all the data.

00:03:34:21 - 00:03:36:05
The answer was the usual.

00:03:36:05 - 00:03:37:12
Yeah, I don't have anything to hide.

00:03:37:12 - 00:03:40:03
They cannot do anything with that.

00:03:40:03 - 00:03:41:21
But then I said, because I know

00:03:41:21 - 00:03:45:00
the habit of people that are not
well versed in cybersecurity,

00:03:45:00 - 00:03:48:21
I said, okay, I'm pretty sure
you write down passwords.

00:03:49:10 - 00:03:52:05
You write down passwords on
just a note on your phone.

00:03:52:05 - 00:03:54:09
Right? And they were
like, oh yeah. Yeah.

00:03:54:09 - 00:03:56:08
So then now they can
access all your password

00:03:56:08 - 00:03:59:15
because you don't use a vault
and they're just on a normal note.

00:03:59:15 - 00:04:02:21
The footprint on the phone
is much bigger. It's really big.

00:04:02:21 - 00:04:06:18
And again, the answer was like, yeah,
but even if they access my email,

00:04:06:18 - 00:04:09:03
I don't have anything to hide
or there's nothing that they can do.

00:04:09:03 - 00:04:11:18
But that's not true,
because like, for example,

00:04:11:18 - 00:04:16:10
you're putting me at risk as well because
now they have access to your WhatsApp,

00:04:16:19 - 00:04:20:11
they can send me a message and say,
hey, listen, my bank account

00:04:20:14 - 00:04:24:04
changed to this, and,
please send me the money here.

00:04:24:23 - 00:04:27:01
And, yeah, I mean, how do I know?

00:04:27:01 - 00:04:30:01
Like, I can be as good as I want
in cyber security, I can be

00:04:30:02 - 00:04:33:19
as careful as I want, but it’s coming
from your phone, from your legitimate...

00:04:34:03 - 00:04:35:11
And you trust this guy.
Yeah.

00:04:35:11 - 00:04:38:02
And you can't even inform me
because you don't have the phone anymore.

00:04:38:02 - 00:04:41:13
So, like, how are you going to tell me
that your phone got stolen?

00:04:41:13 - 00:04:42:15
You don't know my number,

00:04:42:15 - 00:04:47:06
and you don't have the phone to inform me.
If the actor was quick enough,

00:04:47:11 - 00:04:49:09
knowing that you are not informed,

00:04:49:09 - 00:04:52:19
he could basically abuse your
trust with this person.

00:04:52:19 - 00:04:53:00
Yeah.

00:04:53:00 - 00:04:55:17
And that's just one of the things,
because I'm pretty sure

00:04:55:17 - 00:04:59:03
that a lot of people take
pictures of their passport or

00:04:59:11 - 00:05:04:05
driving license, because sometimes maybe
they need to send an email or [ ], so probably

00:05:04:05 - 00:05:06:00
if they start to look
through the pictures,

00:05:06:00 - 00:05:09:23
they will find also this kind of pictures
with all your data, which is even worse

00:05:09:23 - 00:05:12:23
because you can really like
steal identity at a certain point.

00:05:13:07 - 00:05:16:18
And I think that one of the biggest issues
that people don't understand

00:05:18:00 - 00:05:22:11
how important is your digital
footprint and how many things

00:05:22:11 - 00:05:26:01
you put in your phone and you put online
nowadays. And especially your identity,

00:05:26:01 - 00:05:29:21
I think, especially today where
we have deepfakes, for example,

00:05:29:21 - 00:05:31:11
we did an episode about it.

00:05:31:11 - 00:05:34:11
How easy it is, if you only have
a few pictures of a person to

00:05:34:13 - 00:05:39:17
impersonate that person, not only
by text, but also by video or photos.

00:05:39:18 - 00:05:44:08
Yeah. With AI, it's even like
the imagination is the limit.

00:05:44:09 - 00:05:47:11
You can do so many things nowadays
that, is even worse.

00:05:47:15 - 00:05:53:14
I think this example is a great bridge
to how we came to the idea to do this episode.

00:05:53:14 - 00:05:57:12
Because we also, here at the SOC
last week, had an example.

00:05:58:16 - 00:06:01:08
So what I want to do
is go over that example

00:06:01:08 - 00:06:04:08
that triggered us to
record this episode.

00:06:04:12 - 00:06:07:09
So, can you tell us what happened?

00:06:07:09 - 00:06:07:20
Yeah.

00:06:07:20 - 00:06:13:01
So, we received, an alert at the SOC,
I was helping with the investigation and,

00:06:14:03 - 00:06:21:05
basically one of our customers received
an email which seemed quite legitimate.

00:06:22:01 - 00:06:25:17
And this email was something,
you know, one of the usual, like, please

00:06:25:21 - 00:06:29:21
check this document and return them
signed, as soon as possible,

00:06:29:22 - 00:06:35:18
or I don't remember the exact content,
but that was more or less the type.

00:06:35:18 - 00:06:40:10
And, the document was just, a fake
document, but it actually was a URL.

00:06:40:14 - 00:06:44:04
So when you clicked,
it wouldn't download any document.

00:06:44:11 - 00:06:47:16
But it was just a picture with [ ]
A magic trick.

00:06:47:20 - 00:06:50:03
Yes, indeed.

00:06:50:03 - 00:06:55:14
The URL, actually, it passed
some commands

00:06:55:14 - 00:06:59:10
in the URL and what it was doing,
it was just downloading

00:06:59:17 - 00:07:05:19
ConnectWise, which is one of those
software that is used like,

00:07:05:19 - 00:07:10:18
for example, think TeamViewer, right?
[ ] remote access tool.

00:07:11:01 - 00:07:11:08
Yeah.

00:07:11:08 - 00:07:15:14
A remote access tool that they use for
remote access. TeamViewer, Connect-

00:07:15:14 - 00:07:18:00
Wise, you have a bunch of them.
Yeah, exactly.

00:07:18:00 - 00:07:24:23
So, and it was, automatically connecting, as a client,
to an IP that was also legitimate.

00:07:24:23 - 00:07:26:15
So there was a legitimate IP.

00:07:27:23 - 00:07:32:11
Of course, the user realized that
something was wrong because, well,

00:07:32:16 - 00:07:36:15
it didn't open any document and something
happened on their computer, so they-

00:07:36:22 - 00:07:39:22
I'm not sure if you say quite
easily: ‘Of course’,

00:07:39:22 - 00:07:44:01
I think a lot of people would ignore
it or just don't pay attention to it.

00:07:44:01 - 00:07:45:21
We’ll go over that with the tips.

00:07:45:21 - 00:07:49:20
Let's say that they got suspicious because
there was no document. In this case,

00:07:50:00 - 00:07:53:03
this was probably the reason
the customer contacted us.

00:07:53:04 - 00:07:56:21
Yeah, yeah. This happened on my PC,
which looked suspicious. Yes.

00:07:57:02 - 00:08:01:06
Please help us. Also, because the
the thing, the difficult part of this

00:08:01:06 - 00:08:05:11
is that these kind of software are
also used legitimately in companies.

00:08:05:11 - 00:08:08:19
And it happened that that company
also used that software for some

00:08:09:15 - 00:08:16:19
IT remote session with the support,
so they can blend in with normal tools.

00:08:16:19 - 00:08:20:16
Maybe they already figured that out
with the reconnaissance phase.

00:08:20:16 - 00:08:22:18
Yeah, yeah.
This company is using this software,

00:08:22:18 - 00:08:24:20
so let's make it as legitimate as possible.
Yeah.

00:08:24:20 - 00:08:33:03
They did. So it should ring fewer bells,
if it gets detected because it's something

00:08:33:03 - 00:08:36:03
that is normally used,
within the company.

00:08:37:06 - 00:08:41:19
So that's why they use this kind of tools,
and sometimes it'll be difficult to detect,

00:08:41:19 - 00:08:45:11
if the user doesn't really say, hey,
something is wrong here.

00:08:45:11 - 00:08:45:22
Yeah, yeah.

00:08:45:22 - 00:08:48:01
So the customer called us. Yeah.

00:08:48:01 - 00:08:49:19
Please help us. What did we do?

00:08:49:19 - 00:08:52:01
So well we started an investigation.

00:08:52:01 - 00:08:55:17
First of all, we tried to find
what happened on the host,

00:08:55:21 - 00:08:57:23
if there was any trace.

00:08:57:23 - 00:09:01:16
The normal kind of, some
light forensics, let's say,

00:09:01:22 - 00:09:05:14
just to take a look at what
was going on, on the host.

00:09:05:17 - 00:09:09:02
But one of the things that I was helping
with, was trying to understand

00:09:09:02 - 00:09:12:17
if the... because the IP where it was connected,

00:09:12:20 - 00:09:16:08
it seemed really legitimate
and the domain also and

00:09:16:12 - 00:09:22:15
the IP was registered to that domain,
and it looked really like, it was quite weird.

00:09:22:15 - 00:09:25:00
It didn't look like it was spoofed.

00:09:25:00 - 00:09:31:05
So I started to do some investigation,
some little bit of OSINT, so

00:09:31:05 - 00:09:38:07
some open source intelligence and I
checked on the dark web if there was

00:09:38:16 - 00:09:42:15
anything related to data breach.

00:09:42:15 - 00:09:50:16
And I noticed that this specific email address
that was associated with the domain was

00:09:51:02 - 00:09:55:00
registered to a person
and it had something

00:09:55:00 - 00:09:58:22
like 19 breaches
and I could retrieve

00:09:59:13 - 00:10:06:16
six different passwords that were
in this collection of data leak.

00:10:06:19 - 00:10:09:02
And they were also reused in different-

00:10:09:02 - 00:10:12:22
So you saw 19 leaks. 19 leaks.
[ ] different passwords

00:10:12:22 - 00:10:15:12
but also a lot of reused passwords.
A lot of reused,

00:10:15:12 - 00:10:18:07
and the passwords
were really simple password.

00:10:18:07 - 00:10:19:18
So not really complex.

00:10:19:18 - 00:10:22:12
Maybe some letters were changed with numbers.

00:10:22:12 - 00:10:23:13
But like I mean this

00:10:23:13 - 00:10:27:08
it was like really pretty basic and short passwords.
They could be guessed by an attacker.

00:10:27:09 - 00:10:29:07
Hey, this will be probably
the next password.

00:10:29:07 - 00:10:30:02
Yeah, indeed.

00:10:30:02 - 00:10:35:16
When you have like, a few, different
iterations of password.

00:10:35:16 - 00:10:39:00
And, you know, what is the logic of that
person using the password,

00:10:39:00 - 00:10:43:03
then you can easily create dictionaries
which more or less,

00:10:43:16 - 00:10:46:16
go through all the variation
that you could use next.

00:10:46:18 - 00:10:50:03
So because you know
the logic with,

00:10:50:08 - 00:10:53:07
that you use when
you choose a password.
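
The iteration-guessing Luca describes can be sketched in a few lines of Python. This is a minimal illustration, not real attack tooling; the mutation rules (year suffixes, single-digit bumps, capitalization) are assumptions chosen to mirror the "Welcome2025" examples earlier in the conversation.

```python
# Minimal sketch of the "dictionary of variations" idea: given one
# leaked base password, enumerate the simple next guesses an attacker
# might try. The mutation rules here are illustrative assumptions only.
def guess_variations(base, years=range(2020, 2027)):
    guesses = {base, base.capitalize()}
    for year in years:
        guesses.add(f"{base}{year}")               # e.g. welcome2025
        guesses.add(f"{base.capitalize()}{year}")  # e.g. Welcome2025
    for n in range(10):
        guesses.add(f"{base}{n}")                  # "just change the number"
        guesses.add(f"{base.capitalize()}{n}")     # e.g. Welcome1
    return sorted(guesses)

if __name__ == "__main__":
    print(len(guess_variations("welcome")))
```

A real cracking dictionary would be far larger, but even this toy set shows why predictable iteration makes the "next" password easy to guess once a few leaked versions are known.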

00:10:53:07 - 00:10:56:09
So, it turned out when we passed the..

00:10:57:04 - 00:11:00:10
all the information, of course,
we didn't give the private details

00:11:00:10 - 00:11:03:12
or the password that we found,
but we said, like, hey,

00:11:04:05 - 00:11:07:08
it seems, we think that
the email is legitimate.

00:11:07:14 - 00:11:10:14
And actually, we found the
password and a lot of details,

00:11:10:14 - 00:11:13:06
and a lot of leaks
for this person.

00:11:13:06 - 00:11:17:21
So that is most likely
that maybe they used,

00:11:18:10 - 00:11:22:00
somebody is using his account
and that was the case.

00:11:22:06 - 00:11:24:03
So that in the end turned out,

00:11:24:03 - 00:11:26:08
you know, when you
have a big company,

00:11:27:09 - 00:11:29:14
you have a lot of contractors,
for example.

00:11:29:14 - 00:11:32:21
Of course, the person that
works at IT or the SOC,

00:11:32:21 - 00:11:35:21
doesn’t know all the contractors
that are legitimate.

00:11:36:02 - 00:11:37:03
And then when they checked,

00:11:37:03 - 00:11:42:07
indeed, it was one of the legitimate
contractors that got compromised.

00:11:42:07 - 00:11:46:21
So this contractor probably
had reused his password for -

00:11:46:21 - 00:11:49:01
Yes, most likely.
- at least for his email.

00:11:49:01 - 00:11:50:07
And therefore the attacker

00:11:50:07 - 00:11:53:07
could read all his email,
but also send on behalf

00:11:53:18 - 00:11:56:16
or not even on behalf,
impersonating, the attacker,

00:11:56:16 - 00:11:59:23
and really send email
as if he was the contractor.

00:11:59:23 - 00:12:00:15
The contractor, yes.

00:12:00:15 - 00:12:03:21
And this contractor had his own domain.

00:12:03:21 - 00:12:07:01
So the domain was a contraction
of the domain and his last name.

00:12:07:01 - 00:12:08:21
So it was his private domain.

00:12:08:21 - 00:12:11:16
So probably it also had
a server hosting something.

00:12:11:16 - 00:12:16:15
And, well, they compromised many things.

00:12:16:15 - 00:12:21:12
And now due to all of this, the email that
the customer got looked really legitimate.

00:12:21:20 - 00:12:23:19
Therefore he was [ ]

00:12:23:19 - 00:12:25:12
It was sent by somebody else.

00:12:25:12 - 00:12:29:15
But, even the link could be in
this circumstance, be valid.

00:12:29:19 - 00:12:32:11
So he clicked on it and
that was triggering him,

00:12:32:11 - 00:12:36:05
because you expect it's
kind of a document and.. okay.

00:12:36:05 - 00:12:36:18
Like that.

00:12:36:18 - 00:12:40:09
So it was not spoofed, but it was
something that probably was caused by

00:12:40:16 - 00:12:44:12
poor hygiene because of reused
passwords, simple passwords.

00:12:44:16 - 00:12:51:00
And then you're a vector for
spear phishing and then,

00:12:51:04 - 00:12:54:22
yeah, you can say I don't
have anything to hide,

00:12:55:04 - 00:12:59:11
but you now are a liability for a company.
Because maybe you're a small contractor,

00:12:59:11 - 00:13:03:10
so you say, yeah, but you do some work
for a company. Yeah, as an attacker

00:13:03:16 - 00:13:07:17
you are basically using or abusing
the trust that exist between people.

00:13:07:21 - 00:13:09:05
Yeah, indeed.

00:13:09:05 - 00:13:12:05
And that's really, really good
for them, because, I mean,

00:13:12:12 - 00:13:15:08
if they do some research, they know,
okay, they use this software,

00:13:15:08 - 00:13:18:23
this is legitimate, then it can slip through.

00:13:18:23 - 00:13:21:12
And this could be a really
good entry point to start

00:13:21:12 - 00:13:24:12
a much larger... This was quite simple,
and we noticed it early.

00:13:24:12 - 00:13:27:14
But this could be the entry point
for a much larger attack.

00:13:27:16 - 00:13:27:23
Yeah.

00:13:27:23 - 00:13:31:06
And this is also I think,
why identities

00:13:31:06 - 00:13:34:06
are worth money, especially
on the dark web.

00:13:34:14 - 00:13:36:04
Yeah, yeah.

00:13:36:04 - 00:13:37:18
Do we have some other examples?

00:13:37:18 - 00:13:42:08
I can imagine, maybe some, if your social
media account is being compromised.

00:13:42:08 - 00:13:43:00
Yeah, yeah.

00:13:43:00 - 00:13:47:11
I mean, to a certain extent, that goes
back to what we were talking about before,

00:13:47:11 - 00:13:50:22
I mean, even if you're not a contractor,
but you have relationships

00:13:50:22 - 00:13:54:05
maybe with other people
or you do some small works

00:13:54:05 - 00:13:57:22
for somebody that not necessarily
is like your main work.

00:13:57:22 - 00:14:00:22
But for example, what you
mentioned, social media.

00:14:01:04 - 00:14:06:01
I had it also from people, close to me
that they got compromised.

00:14:06:01 - 00:14:09:01
Their social media account
got compromised,

00:14:09:01 - 00:14:14:00
and then they started to post
things, you know, the usual scam.

00:14:14:07 - 00:14:16:01
[ ] they're making a lot of money with this.

00:14:16:01 - 00:14:17:17
With this crypto. Invest.
Yeah.

00:14:17:17 - 00:14:21:21
Click here, you get a bonus or whatever.
Yeah, or strong political

00:14:22:00 - 00:14:26:11
standpoints, especially if they’re
more famous people, then you see...

00:14:27:03 - 00:14:30:11
... often used for political statements.
Yeah.

00:14:30:20 - 00:14:31:12
Okay. And yeah.

00:14:31:12 - 00:14:36:23
So that's also another thing that
can put at risk [the] people around you.

00:14:36:23 - 00:14:39:23
So it's not only about you,
but it's also the people around you.

00:14:40:08 - 00:14:45:01
I remember, I don't know if it happens
again nowadays still, but in the past

00:14:46:07 - 00:14:49:11
they used to send this private email,
wasn't on Facebook,

00:14:49:11 - 00:14:51:17
could say, hey, is this
you on this video?

00:14:51:17 - 00:14:55:15
And then there was a link and you
receive it from a friend of yours

00:14:55:15 - 00:14:57:23
and just like, oh, what is that?
Then you click. Yeah, you click.

00:14:57:23 - 00:15:00:16
Yeah. And then you get
phished as well.

00:15:00:16 - 00:15:04:23
Phishing is still one of the most used
attacks to gain at least the initial access.

00:15:04:23 - 00:15:05:18
Yeah, indeed.

00:15:05:18 - 00:15:07:23
Indeed. Do we have some other examples?

00:15:07:23 - 00:15:08:03
Yeah.

00:15:08:03 - 00:15:11:17
I think, other things that,
unfortunately I had something

00:15:11:17 - 00:15:14:17
that happened to me as well, once.

00:15:15:00 - 00:15:17:18
I don't know really how, but,
I got something

00:15:17:18 - 00:15:21:10
opened in my name that was not mine,
I didn't do that.

00:15:22:22 - 00:15:26:10
I'm Italian, it was in the past,
like a couple of years ago.

00:15:26:10 - 00:15:27:17
It was in Italy.

00:15:27:17 - 00:15:31:01
Well, it was really easy there
to do weird things.

00:15:31:09 - 00:15:38:14
Just over the phone you could open,
for example, an electricity account,

00:15:38:14 - 00:15:42:23
and there were not a lot of checks,
but somebody got somehow

00:15:43:04 - 00:15:49:01
hold of my ID number.
Social Security number.

00:15:49:01 - 00:15:52:12
Yeah, it was like the ID number,
I don't know how,

00:15:52:18 - 00:15:56:00
but they opened an electricity
account, in my name.

00:15:56:07 - 00:15:59:08
And then I was like, yeah,
I don't live there,

00:15:59:08 - 00:16:01:15
I don't know, it was there
and that's not mine.

00:16:01:15 - 00:16:07:02
And it's a bit annoying that
then you need to go through

00:16:07:02 - 00:16:08:09
all the phone calls to understand,

00:16:08:09 - 00:16:10:10
hey, that was not me,
you need to go, like,

00:16:10:10 - 00:16:13:12
and then maybe they
got my ID number

00:16:13:12 - 00:16:18:16
from some leak or somebody else
that didn't handle my data in a good way.

00:16:18:16 - 00:16:24:02
So it was not really like something
that you can prevent, but it can create..

00:16:24:02 - 00:16:27:02
Yeah, another example
I think, that especially

00:16:27:09 - 00:16:31:10
holds true today is shared work accounts
or at least a work environment.

00:16:31:10 - 00:16:34:10
So for example, Google Workspace,
you can log in at the same time,

00:16:35:03 - 00:16:37:23
if we, for example, both work
in the same document,

00:16:37:23 - 00:16:40:22
and your account gets compromised
because you lack, of course, a lot of

00:16:40:22 - 00:16:42:21
cybersecurity hygiene, then,

00:16:42:21 - 00:16:45:21
I will trust everything you write down
in the document that we're working on.

00:16:46:06 - 00:16:46:22
Yeah.

00:16:46:22 - 00:16:49:19
Or maybe you just use it to read
all the information

00:16:49:19 - 00:16:52:13
and use it in a more sophisticated
spear-phishing attack.

00:16:52:13 - 00:16:54:16
Yeah. There are all kinds of
directions you can go with that.

00:16:56:04 - 00:16:57:23
Yeah. So plenty of examples.

00:16:57:23 - 00:17:01:04
Also, unfortunately a lot of
real world use cases. Yes.

00:17:01:12 - 00:17:06:07
I think it's a good point to go
over some of those cybersecurity

00:17:06:12 - 00:17:10:17
hygiene basic things that
everyone should do or

00:17:10:17 - 00:17:12:23
at least also be able to do,

00:17:12:23 - 00:17:15:23
to prevent or at least make
this kind of attacks much harder.

00:17:16:05 - 00:17:19:04
Do you have some good tips for us?
Well, yeah.

00:17:19:04 - 00:17:24:02
The one we mentioned earlier,
I was a bit fast at the beginning,

00:17:24:02 - 00:17:28:19
but like, well, some things we said
like, don't use short passwords.

00:17:29:02 - 00:17:30:17
They are easy to guess.

00:17:30:17 - 00:17:34:14
If you don't want to remember your
password, use like sentences, maybe.

00:17:34:19 - 00:17:37:10
Yeah. That's always a good idea
and also makes it longer.

00:17:37:10 - 00:17:40:06
Yeah, makes it longer. Like, you
can say like ‘ashortgrayalien’,

00:17:40:06 - 00:17:42:05
something like weird
that you will remember

00:17:42:05 - 00:17:43:09
because it's a funny sentence.

00:17:43:09 - 00:17:46:04
You put like a dash in...
And then, of course, important,

00:17:46:04 - 00:17:50:13
don't reuse that short gray alien.
Don't reuse it.
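
The sentence-style password Rob and Luca suggest can be generated with Python's `secrets` module. A minimal sketch: the short word list below is a stand-in assumption; a real generator would draw from a large diceware-style dictionary of thousands of words.

```python
import secrets

# Toy word list (assumption for illustration); a real passphrase
# generator would use a large dictionary, e.g. a diceware list.
WORDS = ["short", "gray", "alien", "violet", "anchor", "maple",
         "comet", "harbor", "tulip", "granite", "otter", "prism"]

def passphrase(n_words=4, sep="-"):
    # secrets.choice gives cryptographically strong randomness,
    # unlike random.choice.
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

if __name__ == "__main__":
    print(passphrase())
```

Length is the point: a few random words joined with dashes is both longer and easier to remember than something like Welcome1.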

00:17:50:13 - 00:17:53:03
So use a different password
for every service you have.

00:17:53:03 - 00:17:58:15
And especially if it comes to
personal and professional services,

00:17:58:15 - 00:18:05:17
don’t do it, ever, but if you do it, at least make
a differentiation between those two. And also,

00:18:05:17 - 00:18:09:18
one thing could be like,

00:18:09:20 - 00:18:14:05
don't save any credit card details
or passwords in the browser.

00:18:14:11 - 00:18:18:00
Like, info stealers can steal
that information, but

00:18:18:06 - 00:18:20:18
use like a password vault.

00:18:20:18 - 00:18:23:06
I think they're quite cheap
nowadays, if you want...

00:18:23:06 - 00:18:24:06
Yeah. You have free ones.

00:18:24:06 - 00:18:26:23
You have free ones.
You have also like, really cheap ones.

00:18:26:23 - 00:18:30:05
And, I use, I use a couple
of password vaults,

00:18:31:06 - 00:18:34:23
but like, one of those provides
also a service that will check for

00:18:34:23 - 00:18:41:19
you if one of your email has been seen in any
data leak or something like that. [ ] this service,

00:18:41:19 - 00:18:43:11
have I been found, as well. Yeah.

00:18:43:11 - 00:18:46:14
I think they use APIs from Have I
Been Pwned, something like that.

00:18:46:15 - 00:18:46:22
That's it.

00:18:46:22 - 00:18:48:07
That's a good one to check your email,

00:18:48:07 - 00:18:51:07
and then you'll probably see
that you also end up in some

00:18:51:10 - 00:18:54:09
of the data leaks, because there are
quite a few of them.

00:18:54:09 - 00:18:54:23
Yeah.

00:18:54:23 - 00:18:58:15
And some password vault will inform you
as soon as something new comes up.

00:18:58:15 - 00:19:01:16
Then you can, well, change the password.
Yeah, change the password.
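
The breach check mentioned here can be done against Have I Been Pwned's Pwned Passwords range API, which uses k-anonymity: only the first five hex characters of the password's SHA-1 hash are sent, so the password itself never leaves your machine. A minimal sketch (the network call is left out; only the hashing and response parsing are shown):

```python
import hashlib

# Sketch of the k-anonymity lookup used by Have I Been Pwned's
# Pwned Passwords API: send only the first 5 hex chars of the SHA-1
# hash to https://api.pwnedpasswords.com/range/<prefix>.

def hash_parts(password):
    """Split the uppercase SHA-1 hex digest into (prefix, suffix)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def count_in_response(suffix, body):
    """body is the API's plain-text response, one 'SUFFIX:COUNT' per line."""
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix and count.strip().isdigit():
            return int(count)
    return 0

if __name__ == "__main__":
    prefix, suffix = hash_parts("Welcome1")
    print(prefix)  # this prefix is all you would send to the API
```

To complete the check, fetch `https://api.pwnedpasswords.com/range/<prefix>` (e.g. with `urllib.request`) and pass the response text to `count_in_response`; a nonzero count means the password has appeared in known breaches and should be changed.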

00:19:03:12 - 00:19:09:00
I think the other easy-to-implement feature
nowadays is multi-factor authentication.

00:19:09:05 - 00:19:12:00
Yeah. Actually it’s surprising,
that still some people,

00:19:12:00 - 00:19:16:09
when you mention it casually, they say
like what is that? I do have it- Yeah,

00:19:16:09 - 00:19:17:15
or they say it's annoying.

00:19:17:15 - 00:19:21:02
Yeah. Yeah, but...
The trade off. Yeah.

00:19:21:02 - 00:19:24:19
And there are still some websites
that don't allow you to do that.

00:19:24:23 - 00:19:26:13
They don't allow-
Yeah, but there are also websites

00:19:26:13 - 00:19:30:09
that don't allow you to create a password
longer than 12 characters, for example.

00:19:30:09 - 00:19:30:19
Yeah. Yeah.

00:19:30:19 - 00:19:35:06
So it's not only on the people, it's also
on the provider of services sometimes.

00:19:36:06 - 00:19:38:16
So yeah, for sure multi-factor authenticator.

00:19:38:16 - 00:19:38:21
Yeah.

00:19:38:21 - 00:19:43:06
But you already mentioned in your first
example, make sure if you have a phone,

00:19:43:20 - 00:19:48:16
that it will be locked automatically,
use your fingerprint or your face

00:19:48:16 - 00:19:53:00
recognition or at least a long
PIN code, six digits or more.

00:19:54:00 - 00:19:54:09
Yeah.

00:19:54:09 - 00:19:55:08
Indeed, indeed.

00:19:55:08 - 00:19:57:12
So there are a lot of good tips.

00:19:57:12 - 00:19:59:00
We also have a good website.

00:19:59:00 - 00:20:01:07
It's not of us, but it is,
you can find it on the internet

00:20:01:07 - 00:20:04:15
it’s watchyourhack.com.

00:20:04:22 - 00:20:10:12
So it is a great tip if you need some
low-hanging-fruit, easy

00:20:10:12 - 00:20:14:00
to implement measures to hopefully prevent
these kinds of attacks.

00:20:14:00 - 00:20:17:00
That's definitely worth a visit.

00:20:18:00 - 00:20:21:00
So anything else that we
want to conclude with?

00:20:21:10 - 00:20:24:10
Well, I think that the message here

00:20:24:15 - 00:20:28:14
should be like, sometimes, maybe
think about other people as well.

00:20:28:14 - 00:20:30:00
Don't think only about yourself.

00:20:30:00 - 00:20:32:03
I understand you don't
care that you get hacked,

00:20:32:03 - 00:20:35:03
but you should care about
the people around you

00:20:35:03 - 00:20:38:12
because, you can become a
liability for other people.

00:20:38:12 - 00:20:42:01
So it's not only about you, but it's
also about all the other people,

00:20:42:15 - 00:20:45:01
that they might be affected by that.

00:20:45:01 - 00:20:46:21
Yeah, I think that's a very good conclusion.

00:20:46:21 - 00:20:49:21
So cyber hygiene isn't just about you.

00:20:49:22 - 00:20:52:20
It's about everyone who trusts you.
It’s about everybody.

00:20:52:20 - 00:20:55:19
And if you do a bad job,
then you might affect them.

00:20:55:19 - 00:20:58:13
So I think that's a very good conclusion.

00:20:58:13 - 00:21:02:04
So to you, listeners, think
about who will trust you.

00:21:02:13 - 00:21:05:14
What do you do about
your cybersecurity hygiene?

00:21:07:11 - 00:21:10:10
Save this episode if you
want to spread the word

00:21:10:10 - 00:21:13:15
and give your family, friends,
colleagues, also some tips.

00:21:13:23 - 00:21:18:02
So give them this episode or
the website we just mentioned.

00:21:18:02 - 00:21:21:02
So, watchyourhack.com.

00:21:21:02 - 00:21:24:17
And if they do a better job,
they will also improve your security.

00:21:25:05 - 00:21:27:03
So with that, thank you.

00:21:27:03 - 00:21:30:02
And, if you like what you saw,
please like and subscribe.

00:21:30:02 - 00:21:31:19
See you next time.

00:21:31:19 - 00:21:36:23
Thank you for listening to Threat Talks,
a podcast by ON2IT cybersecurity and AMS-IX.

00:21:36:23 - 00:21:39:07
Did you like what you heard?
Do you want to learn more?

00:21:39:07 - 00:21:43:02
Follow Threat Talks to stay up to date
on the topic of cybersecurity.