Talkin' Bout [Infosec] News

This episode covers a range of cybersecurity and AI-related news, including how Pokémon Go players may have unknowingly helped train delivery robots using massive image datasets. The hosts also discuss the Pentagon’s reported plans to train AI systems on classified data and the potential risks of exposing sensitive information. Additional topics include major data breaches (such as a third-party breach impacting Crunchyroll user data), ongoing challenges in cybersecurity practices, evolving AI security concerns, and real-world examples of exploits and vulnerabilities affecting mobile devices and organizations.

Join us LIVE on Mondays, 4:30pm ET.
A weekly podcast with BHIS and friends. We discuss notable infosec and infosec-adjacent news stories gathered by our community news team.
https://www.youtube.com/@BlackHillsInformationSecurity

Chat with us on Discord! -
https://discord.gg/bhis
🔴live-chat


Chapters
  • (00:00) - PreShow Banter™ — Easier Than Printers
  • (05:20) - Pentagon Plans to Train AI With Classified Data – BHIS - Talkin' Bout [infosec] News 2026-03-23
  • (06:38) - Story # 1: Sears Exposed AI Chatbot Phone Calls and Text Chats to Anyone on the Web
  • (07:38) - Story # 1b: ALT Link - Sears Exposed AI Chatbot Phone Calls and Text Chats to Anyone on the Web
  • (15:35) - Story # 2: Federal cyber experts called Microsoft’s cloud a “pile of shit,” approved it anyway
  • (24:31) - Story # 3: The Pentagon is planning for AI companies to train on classified data, defense official says
  • (34:04) - Story # 4: CISA Urges Endpoint Management System Hardening After Cyberattack Against US Organization
  • (37:50) - Story # 5: Warning: Your AI-Generated Password Is a Major Security Risk. Here’s What to Use Instead
  • (42:21) - Story # 6: CISA warns of active exploitation of Microsoft SharePoint vulnerability (CVE-2026-20963)
  • (49:57) - Story # 7: Massive China Data Leak: Hackers Access 10 Petabytes of Weapons Testing Data
  • (51:28) - Story # 8: Anime fans' credit cards might be stolen from Sony streamer Crunchyroll
  • (55:03) - Story # 9: The Proliferation of DarkSword: iOS Exploit Chain Adopted by Multiple Threat Actors






🔗 Register for FREE Infosec Webcasts, Anti-casts & Summits 
https://poweredbybhis.com

Brought to you by:
Black Hills Information Security 
https://www.blackhillsinfosec.com

Antisyphon Training
https://www.antisyphontraining.com/

Active Countermeasures
https://www.activecountermeasures.com

Wild West Hackin' Fest
https://wildwesthackinfest.com

Creators and Guests

Host
Hayden Covington
Hayden Covington joined Black Hills Information Security (BHIS) in the Summer of 2022 as a SOC Analyst. He chose BHIS after hearing many great things over the years and seeing the quality of work, as well as finding people who have the same passion for the field as he does. His favorite part of the job so far has been the community. Previously, Hayden worked in a SOC for a Naval contractor, where he also served as their SOAR project manager and SME, as well as insider threat lead. When he’s not working, Hayden can be found doing anything athletic (like triathlons!), as well as enjoying video gaming and Formula 1.
Host
John Strand
John Strand has both consulted and taught hundreds of organizations in the areas of security, regulatory compliance, and penetration testing. He is a coveted speaker and much loved SANS teacher. John is a contributor to the industry-shaping Penetration Testing Execution Standard and 20 Critical Controls frameworks.
Host
Ralph May
Ralph is a U.S. Army veteran and former DoD contractor who supported the United States Special Operations Command (USSOCOM) with information security challenges and threat actor simulations. Over the past decade, he has provided offensive security services at Optiv Security and Black Hills Information Security (BHIS) across various industries. His expertise spans network, physical, and wireless penetration testing, social engineering, and advanced adversarial emulation through red and purple team assessments. Ralph has developed several tools, including Bitor (set to release in January 2025) and Warhorse, which enhance efficiency in penetration testing infrastructure and operations. He has spoken at numerous conferences, including DEF CON, Black Hat, Hack Miami, B-Sides Tampa, and Hack Space Con.
Host
Wade Wells
Wade Wells has been working in cybersecurity for a decade, focusing on detection engineering, threat intelligence, and defensive operations. Wade currently works as a Lead Detection Engineer at 1Password, where he helps build and mature scalable detection programs. Outside of his day-to-day work, Wade is deeply involved in the security community through teaching, mentoring, podcasting, and running local events.
Guest
Alex Minster "Belouve"
Alex Minster is a cybersecurity professional with a passion for Open-Source Intelligence (OSINT) and a desire to use his technical skills to make a meaningful impact on society. With nearly twenty years of experience in cybersecurity, and a current role in Threat Intelligence for a global financial corporation, Alex remains very active in numerous cybersecurity groups including DC608 and Black Hills Information Security. Beyond his professional accomplishments, Alex is an avid old-school gamer who enjoys arcades, retro gaming, and tabletop games. He brings his passion for adventure and his commitment to helping others to everything he does, both in and out of his professional career.
Guest
Bruce Potter
Bruce founded Turngate to build the tool he always wanted but wasn’t available during his stints as a CISO (pronounced “ciz-oh,” a hill he’ll die on). He’s been doing cybersecurity things for nearly 30 years and fills his spare time exploring hobbies by breadth (not depth).
Guest
Chadd Watson
Producer
Ryan Poirier
Ryan Poirier began his time at Black Hills Information Security (BHIS) as the Video Producer and Editor in August 2020. Ryan polishes and perfects every webcast, podcast, and workshop on the BHIS, ACM, and WWHF YouTube channels. Prior to Ryan's time at BHIS, he worked for one of the largest public schools in the United States, conducting their video production and live broadcasting. He joined the BHIS team because he felt like it would be a great group of people to work with, and he couldn't pass up the perfect next step in his career. Outside of his time with BHIS, Ryan does freelance photography, attends Cars & Coffee events, and expands his knowledge of audio and video.

What is Talkin' Bout [Infosec] News?

A weekly podcast with BHIS and friends. We discuss notable infosec and infosec-adjacent news stories gathered by our community news team.
Join us live on YouTube, Mondays at 4:30 PM ET.

John Strand:

I've got a couple of good ones today.

Wade Wells:

Pokémon Go players unwillingly trained delivery robots with 30 billion images? Oh. Oh, yeah. Like, am I surprised?

Ralph May:

No. Is

Chadd Watson:

That was amazing. The way they did that was great, and that's been a story, I think, five or six times in the last couple of years. It's really funny.

Wade Wells:

Yeah.

Alex Minster "Belouve":

I I thought I've heard this story before.

Chadd Watson:

Yeah. Yeah.

Hayden Covington:

I still just like the Delve stuff. That just cracks me up so bad.

Ryan Poirier:

We're live. Well, that's good.

Ryan Poirier:

I see a stream.

John Strand:

We'll get this whole this video and audio thing down someday.

Wade Wells:

How long has it been, John? Like

John Strand:

Since 2008. It's been a while.

Wade Wells:

I believe in you. It's easier than printers now, at least. Like, I remember days, like, trying to get a webcam to work was so rough. Like, trying to use my PS five my my PS2 webcam on my computer.

Ralph May:

That was a that was a fun one. You had a PS2, like, the port, like, the serial port webcam? Like, that was good enough?

Wade Wells:

I don't remember. I wanna say it was USB.

Ralph May:

Yeah. I would I believe that one.

Wade Wells:

Yeah. Yeah. And I I It was for one of, like, the one of the videos where they actually tracked you and you could play games. And then we never played it. And I was like, I wonder if this can work on a computer and slowly went down the rabbit hole.

Ralph May:

What was the Microsoft, like, video chat program, the very first one called? Dang it. What was it called? I'm gonna now, you'll look it up, I remember using that. It was it was bad. It was

Wade Wells:

I used

Ralph May:

was like Windows 95 bad. Like, that's we're we're going back there.

Wade Wells:

I'm not gonna mention the websites I was on. It was not

Ralph May:

the best.

John Strand:

At those times?

Wade Wells:

No. This isn't the golden times of the Internet. Right? This is

Ralph May:

I mean

Wade Wells:

Stuff was still dark.

Ralph May:

Stuff was still dark. I all I was on was AOL in the dial up days. Like, this was, like, before the transition to, like, the full Internet, you know? And there's yeah. It it was debauchery.

John Strand:

I still I still maintain the Internet was better back then.

Ralph May:

It was. You used one ISP. So much niche stuff that, like, you just can't find anything like that anymore. I mean, I remember thorough blueprints of, like, very out-there Star Trek, you know, ships.

Alex Minster "Belouve":

And it's like nowadays, it's like, well, if you want that, like, it's gonna be like a, you know, $9.99 a month subscription and maybe for a premium you can get access to the blueprints of this one ship that appeared from this episode. It's like, that

Ralph May:

was all free back in '95, somebody with way too much time and like a CAD program.

John Strand:

Well, the internet sucks. It's I know. Is that the episode? Is that the is that the

Wade Wells:

So I will admit, I would I was reading the news and I didn't realize I had opened the AI part and I'm just going through it. I'm like, why is every article just AI? Like, I'm like, I wanna talk about something other than that. And then I'm like, no, okay.

Ralph May:

Yeah. It's just all emojis and em dashes from here, my friend.

Wade Wells:

There's there's And bullet points

Ralph May:

of things.

John Strand:

Three, every, it's like, why are there always three bullet points in everything? Uh-oh. That

Wade Wells:

that's a good I'm gonna have to go change my slides. I'll be right back

Ralph May:

All my slides are ruined. We can just trade slides, Wade. I'm sure we we both

Hayden Covington:

probably almost wrote the other topic.

Wade Wells:

We could. You that would be that would be pretty good.

Hayden Covington:

That'd be really funny.

John Strand:

It's one of those, like, when you're doing cons, like, slide roulette can be one of the best things at the con, and it can also be some of the worst that happens. Bruce, I think it was at ShmooCon years ago, Kevin Johnson got women of the TSA, and it was just nothing but like the x-ray pictures. And it was so epic. He did such a great job of that. And there were so many people that were so offended.

John Strand:

But most of the time what happens at a con is someone just looks at the slide and they're like,

Ralph May:

Yeah. I mean, credit credit for at

Alex Minster "Belouve":

least making it work. Like, even if it is Yeah. You you didn't make everybody in the room happy, you at least did something more than, you know, stare at it.

Ralph May:

Yeah. Never say die. Never say die.

Bruce Potter:

Yeah. It's a real art to, I mean, honestly, like, I've seen, you know, 90% of them have just been awful. And you gotta go into them with a game plan. Like, whatever. It's just like media training. Like, you answer the question that you wish they'd asked you.

Bruce Potter:

You you Yeah. You give a slide presentation you wish they had given you, and then you just tie it in and it works out. And that's that's better than trying to work the slides.

John Strand:

We did one at BSides Charm where it was all AI-generated slides. This was, I wanna say, four years ago.

Ralph May:

Oh, man. The AI generated right there.

John Strand:

The AI-generated slides were flat-out terrifying. Like, they were putting in prompts to come up with slides like multiple reasons why all humans should be exterminated. And AI was generating the pictures with the justification. It was like, oh, yeah. Now we got guardrails.

John Strand:

Alright. Guard rails. Everybody. Let's get started. Ryan, bring out the crooked finger.

John Strand:

Let's get rolling. Here we go. Hello and welcome to another edition of Talking About News. My name is John Strand and I'm joined by a whole bunch of other people. This is in Zoom.

John Strand:

So the coordination is a little bit different for us because we're trying a new platform here. But we have a number of different stories, from federal experts officially declaring that Microsoft's cloud is a pile of shit, to AI companies training in classified environments, what can go wrong, to BHIS finding all kinds of, like, AI chatbot data exposed. We have Intune issues, more CISA stuff.

John Strand:

I'm gonna rant on CISA and their KEV, their Known Exploited Vulnerabilities catalog. We've got a lot going on. Let's go ahead and get started. Ralph, do you wanna pick the first story, sir?

Ralph May:

Oh. Wow.

John Strand:

Because you were talking,

Alex Minster "Belouve":

you had one that you

John Strand:

thought was pretty pretty good. So do you want because there's a lot here. So I'm kicking it over to you, man. What do you wanna start with?

Ralph May:

Oh. Let's do you do you actually wanna start off with your own with your own, like, BHIS little

John Strand:

Let's talk about let's talk about oh, okay. You wanna talk about Jeremiah?

Ralph May:

Yeah.

John Strand:

This newest thing? So, yeah. Basically, Jeremiah spends a lot of time. So Jeremiah Fowler has been with us for a little while at BHIS. And he spends a lot of time specifically going through and trying to identify exposed like like databases and things that might be useful, in what's going on in Ukraine and Russia because he has family ties there.

John Strand:

And he stumbles across some pretty terrifying things every once in a while. And this one, he was able to identify a bunch of phone calls where the audio was recorded and was exposed to the open internet. And then he started, you know, kind of working with that. And this one kinda scares me. The whole like whenever you're calling in and it's like, this will be shared for quality assurance purposes and also marketing of a security firm at some point in the future.

John Strand:

I

Ralph May:

I was gonna say the rest of the Internet will be shared with rest

John Strand:

of Internet. Will be shared. It's like they say, whatever happens in Vegas stays in Vegas, isn't it? It's the Internet. Shared with everybody.

John Strand:

I just and it's like this. Jeremiah Fowler was surprised because it takes a lot to surprise Jeremiah these days. But is this really all that much of a surprise? I mean, the type of data it is, the chatbot data, I guess it's kind of novel in that respect. But what is it?

John Strand:

It's 1,400,000 audio files and then the plain text transcripts Yeah. Were basically out there as well. Well, is this?

Ralph May:

So it looked like, just reading between the lines here, this was a web interface that went to a or a CVS file or CSV file, excuse me. And in that file were a bunch of URLs for audio files and Mhmm. Chat history. Right?
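As an aside, the exposure pattern Ralph describes here, a web-reachable index file whose rows point at audio recordings and chat transcripts, is a common misconfiguration. Below is a minimal sketch of how an analyst might inventory such a leaked index; the column names `audio_url` and `transcript` and the CSV layout are illustrative assumptions, not details from the report:

```python
import csv
import io
from urllib.parse import urlparse

def inventory_exposed_index(csv_text: str) -> dict:
    """Summarize a leaked index CSV: count rows that point at
    externally reachable audio files vs. rows carrying chat text.
    Column names here are hypothetical, for illustration only."""
    audio_urls, transcripts = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        url = (row.get("audio_url") or "").strip()
        # Only count entries that are real, fetchable web URLs.
        if urlparse(url).scheme in ("http", "https"):
            audio_urls.append(url)
        if (row.get("transcript") or "").strip():
            transcripts.append(row["transcript"])
    return {"audio_files": len(audio_urls), "chat_logs": len(transcripts)}

sample = """audio_url,transcript
https://storage.example.com/calls/0001.mp3,Hello I need help with my order
,Agent: how can I help you today
https://storage.example.com/calls/0002.mp3,
"""
print(inventory_exposed_index(sample))  # {'audio_files': 2, 'chat_logs': 2}
```

In the incident discussed, the index reportedly referenced roughly 1.4 million audio files and 3.7 million chat logs; a script like this only tallies what an exposed index references, without fetching anything.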

John Strand:

Mhmm. And let's not forget the most surprising thing about this is apparently, Sears still exists.

Alex Minster "Belouve":

Yeah. Yeah. There's that. There there was also the the ambient audio right there that. Yeah.

John Strand:

Yeah. The ambient audio was

Alex Minster "Belouve":

You you thought the call had ended and then you're still

Wade Wells:

I don't

Ralph May:

Something is still Yeah. Okay. Hold on.

Hayden Covington:

Was it a phone call

Ralph May:

or was it over like a web browser? No.

John Strand:

No. It's a phone call. You could literally hear the ambient conversations. And I don't know if you all have noticed this, but there's a certain percentage of people that when you're talking to them on the phone, they expect you to hang up.

Ralph May:

Oh, yeah. That's If my

John Strand:

you just don't hang up, they'll set their phone down and walk away. I don't know what percentage of the population that actually is. But apparently, there was a handful of those like that where they just stopped and they expected the Sears chatbot to hang up.

Hayden Covington:

What a life to live. I know. Right?

Ralph May:

I did think it was funny though, because like the one thing it did say in the article was that, like, we're reading into some of these conversations, the bot would be or the AI would be like, I totally can help you. You don't need an agent. And then fail at helping them and then say, we're gonna send you to an agent.

John Strand:

And people got increasingly more angry and aggressive with Samantha saying, I want to talk to a person. I need a technician.

Wade Wells:

How how much data was this? Like, it doesn't say the size. You would have

John Strand:

3,700,000 chat logs and 1,400,000 audio files.

Wade Wells:

Which is a lot. Like, I'm just surprised that you just let storage overflow that much. Is that just me? No? Like,

John Strand:

a long time there.

Wade Wells:

Sears is hurting for money for sure. Like, no one's watching those storage fees.

Ralph May:

They're just in cold storage. Well, obviously, nobody's watching anything over there.

Hayden Covington:

That's it.

Ralph May:

Like, nobody nobody care. Like, set it up, they sold it, and they were like, okay, well, just forget about it, I guess. Right? I mean, that's Well,

John Strand:

in that but okay. So that gets into a larger issue. Like, you know, I talked about this, I think, in previous episodes, talking about the coming SaaS apocalypse. Right? Because anybody can develop any well, not anybody, but any medium to large sized organization can look at the SaaS products that they have and they can very quickly develop their own products utilizing AI and many times poorly.

John Strand:

So now, instead of somebody buying and like everybody kinda coalescing around one SaaS offering, now all of a sudden people can just start creating their own crap SaaS offerings and these things don't die. They don't go away. They don't get patched. They don't get updated. They don't get looked at.

John Strand:

And that's one of the reasons why when we're looking at computer security, I hate a lot of people in security like, oh, AI. It's the end of security. I'm like, are you out of your freaking mind? It's just moving the technology profile a little bit more. And there's gonna be a lot more of stuff like this.

John Strand:

Because you know there's a bunch of custom software underneath this that was just absolute garbage.

Ralph May:

I mean, essentially, what what you're saying is from a high level, you can develop quickly and you're just going for your customer. Not to say you should or shouldn't. Right? Just that you can you like, most of these founders have an idea and they're like, let's build this as fast as possible and then they can build it really fast and then they get it out there, they get those customers, but security is kind of like, well, what?

Hayden Covington:

Yeah. It's like a trend with all the AI providers right now: they all have some product that's brand new and very expensive that does security audits and security reviews. Like, Anthropic came out with one recently, and it was, like, atrociously expensive per PR review because they're running so many different agents on it. But at a certain point, like, you can build anything. Like we talked about some last week. But also, should you?

Hayden Covington:

And I guess it depends: is it more costly to get breached, or more costly to buy this tool? Like, which one is worse?

Ralph May:

Well, I'll

Wade Wells:

take a

John Strand:

And I wanna bring Bruce in on this. Right? Because I remember years and years ago, there was lots of conversations, especially in the early web days where it was like, no. Anytime you get an urge to say, I'm gonna redevelop an entire package from scratch and not go with an open source product, you were told, don't do that. Lay down on the ground and then take a breath and wait for the urge to pass.

John Strand:

Now all of a sudden, those guardrails are gone, man. Like, it's just flat out so fast that people can do this. And I don't think we can fight it. And also, going back to what Hayden was just talking about, I don't know if the code for, like, traditional companies, SaaS companies that are out there, is gonna be that much better than the crap that people are gonna produce anyway.

Bruce Potter:

Yeah. I I think that's the unfortunate reality is like, you know, software development has not evolved all that far in the last twenty five years when it comes to software assurance and code quality. I will say, like, I I think what what's interesting is, like, if you can vibe code an app, you can use that same technology to, do the assessment for you and try to help you do it more securely and that kind of thing. And and the models have been trained reasonably well against that stuff. And and I've seen at least with the the people I've talked to, pretty good results.

Bruce Potter:

But you have to be thinking, like, I have to care about not just the functionality, but the security of the thing. And this is where, like, the eighty-twenty rule kicks in. Like, people are like, I got 80% there. Yeah. It works.

Bruce Potter:

I'll ship it. We'll see what happens. And that 20% is the part that's gonna kill you.

Hayden Covington:

Right. And that's the big crossover. Like, there's probably not a huge crossover between the people that are like, I'm gonna spin up Claude Code and build an app real quick, and the people that actually wanna do security, which may not be instant and very quick either. Like, that's not a big Venn diagram.

Wade Wells:

You just you just That's me. That's me.

John Strand:

No. But I'm gonna say that I'm

John Strand:

gonna say that security is still hard. And the reason why I'm gonna say that is, like, an app is not just its code. It's the ecosystem that it's running in. Right? Like, a lot of these vulnerabilities are more like infrastructure-style vulnerabilities that come back and bite someone in the ass.

John Strand:

Or, like, your code can be really good, but if you're shuttling the data that it produces off to somebody else's platform, we see time and time again where third parties are getting breached. Right? And the other point about it is, let's say you code something fantastic today. Right? It's done perfect.

John Strand:

You lock it into a time box. And then you open it up two years from now, it's gonna be hideously insecure. And it comes back to the hygiene associated with security is still very much missing. Right? We're still back into the it works.

John Strand:

This developer that's a cousin of mine put this thing together. It looks great. Let's get it out there. The hygiene of making sure that the libraries are up to date and that no new security vulnerabilities are showing up in the infrastructure is still not part of it. Like I said, people can make bad decisions faster now than they ever could before.

Hayden Covington:

Yeah. And I think I mean there's more to review them.

Bruce Potter:

Yeah. And I think the non-functional requirements of the system have always been the ephemeral thing that requires, like, you know, skill and experience and whatever to figure out. And the functional requirements are, like, I made an address book; you know, people can build that and understand it. But performance and security and scalability, that still requires expertise. Even if the AI is writing the code, it requires the guidance from someone who's been there and done that before to guide the system to the right place.

John Strand:

Yep. Alright. I got another story. This one cracked me up. And I there's a couple of really painful stories this week.

John Strand:

I just shared it in chat, Ryan. This one is federal cyber experts called Microsoft's cloud a pile of shit and approved it anyway. The reason why this hurts for me is I've been part of these conversations in the government. And what they're predominantly talking about is they kept asking Microsoft questions. Like, how does end-to-end encryption work?

John Strand:

Is it actually end to end encrypted? Can you prove it? Can we get some documentation? Can we see how it's like works on the back end? And Microsoft could not answer a whole bunch of these different questions.

John Strand:

And refused to answer these questions. And then provided documentation that was subpar, and it was just bad all the way through. And having worked on the government side and DOD, I've seen this again and again and again, where you sit down and you ask a very large vendor. Back in the day, it was Oracle for me. Oracle was hideous to work with on this stuff.

John Strand:

It was like, screw you, we're Oracle. And they just refused to answer any questions. And then once again, no one gets fired for hiring IBM. No one gets fired for hiring Oracle. No one gets fired for hiring Microsoft.

John Strand:

And it went through. And I'd like to get you guys' take on this particular story and like your experience on this as well.

Hayden Covington:

I have a quick point I wanna make: the headline is very sensational. It is in, like, the first couple paragraphs where it mentions that when they call it a pile of shit, they're talking about the documentation that Microsoft gave them and the answers to their questions. I will withhold my opinion on Microsoft's platform, but that is the part they're calling a pile of shit.

Ralph May:

I I feel like alright. We talked about, the two things about programming that are difficult. Right? Like the design and then scalability. So I think and I believe that whenever you get at the scale of Microsoft, doing security at scale gets really difficult as well.

Ralph May:

Right? Especially when we're dealing with tons of data. And at the end of the day, they're looking at how do we make money. So they're gonna take shortcuts to allow that to happen while, you know, hoping they have enough security, like, enough at scale. And that's probably what we're looking at here too, with the documentation not having all of this stuff, you know, different groups and people all working on it, and it kinda got broken into pieces.

Ralph May:

So Yeah. Yeah. It's good to

Hayden Covington:

hear that even Microsoft has problems doing their diagrams and everything too.

Bruce Potter:

Well, Microsoft is a 50 year old company.

John Strand:

Yeah.

Bruce Potter:

I mean, seriously,

Ralph May:

I saw it the other day and I I They love Visio.

Bruce Potter:

It's crazy. They love Visio. And, you know, the tail that they have and they brought with them to the cloud is really long, you know. And I think it shows when you compare them to the other large cloud infrastructure providers; like, you can see the age in the system, from the documentation all the way down to the technology.

Ralph May:

Well, the other thing too is that most people probably don't go through the whole process of the FedRAMP stuff. Right? So, like, they spend all this time building out all of this, like, essentially documentation and, like, how it should be secured. Like, Microsoft probably didn't do that when they built this. They were, like, you know, just trying to build it and then eventually got to that point.

Ralph May:

And now that they get all this documentation, they're like, oh, shit. I don't think anyone's gonna ask these questions.

Wade Wells:

I think I've got a couple companies now that have tried to go FedRAMP and the keyword there is tried. Right?

Ralph May:

Like Yeah.

Wade Wells:

From my experience with smaller companies, it is a difficult thing. So to hear that, like, it's not surprising that Microsoft got passed on it, but I'm sure, like, the bar is high. I guess, John, you would know that more than I, because I never actually got to work at a FedRAMP organization.

John Strand:

But I have a friend that's trying to develop; he's a very small shop. It's basically him and a couple of other developers. And they're trying to do, like, all the stuff FedRAMP compliant, and it's conflicting. It will break a lot of the stuff that's in the cloud. And he's pretty much given up on it, but it's become like a pet project of his.

John Strand:

He's like, nah. I'm gonna get this. But his takeaway from that is, anybody comes to you and says that they're completely FedRAMP compliant, they're liars. And that's how they got FedRAMP compliant in the first place.

Ralph May:

Oh, I just remember doing the STIG hardening guides for systems. Right? And, like, the reason I bring this up is because if you go through and apply everything in the guide, the thing just won't work anymore.

John Strand:

Yeah. So they had something a long time ago called the DISA gold disk. Yeah. Yeah. You put in this ISO, or you burnt this ISO to a CD.

John Strand:

And then you ran the DISA gold disk for Microsoft or whatever, Linux or Solaris. And it literally had this wonderful button that was like Apply STIGs. And you could click that button, it would apply all the configuration, and you could not log in to it. And I was on a project, I'm gonna talk about the, it's kind of weird, but the next story that's gonna be coming up, you know, training AI on classified data. And we had a DAA, it was a designated approving authority, and a PA, a programming authority, rep come in, and they basically sat down at all the systems.

John Strand:

And I went through and I tuned all of this crap, made sure that the system still worked. And these two guys showed up and they just hit that button and they nuked our systems. Right? Like nothing worked. And it turned into this huge thing.

John Strand:

It was like, you know, I got blamed initially. Well, he didn't secure it hard enough. I'm like, it doesn't matter. By default, out of the box, it bricks it. And I had to demonstrate that.

Ralph May:

Yes. And then you had to make justifications. And I remember on those that you also had, like, some of the supplemental test plans, and that's where it seems so, you know, relevant here, that it's, like, it's difficult to

Alex Minster "Belouve":

push back against that momentum, against what the standard is. Because even with the test plans, you say, hey, here's this thing that is not secure. Like, it's a new thing, it's an emerging thing, you know, it's a problem. Here, I can show you on the server that we don't have this secured. If it wasn't part of the current test plan, they're like, don't worry about it.

Alex Minster "Belouve":

It's

John Strand:

Yeah. But that was all

Alex Minster "Belouve":

And then, lo and behold, like, four months later, they're like, this is now part of the current test plan, drop everything and fix this. And you're like, that's the thing that I talked about four months ago, when I said, hey, as long as we're in here, why don't we patch this thing? And it's like, no, don't be an instigator, Alex. Well, now it's coming back around this time with the, you know, hey, this stuff is, like they said, a pile of shit. And it's like, well, don't be an instigator, just sign off on it. Don't be the first person to push against the norm.

Alex Minster "Belouve":

The norm is, just accept this, go with it.

John Strand:

Well, kind of piggybacking on top of what Bruce said: Microsoft has this huge amount of baggage it carries with it. And a lot of it, it doesn't even know about. One of my favorite stories is the people that were coming up with Samba years ago. They were trying to come up with something that would be compatible in communicating with Windows systems. And they had no idea how, like, LAN Manager was working.

John Strand:

Right? And they had no idea how NetNTLMv2 worked and all of this different stuff. And they were talking about going through, like, a memory dump of a Windows system, because they were trying to find a key that LAN Manager was using, basically. They were trying to figure out how it was using DES to encrypt. It was using your password as a key to encrypt a string, and they didn't know what that string was.

John Strand:

And they're going through memory, and they saw KGS!@#$%. And they're like, is that it? And it turns out it was. And rumor has it that KGS was the initials of the guy that wrote that protocol at Microsoft. And the point was, no one knew, and they think that was his password, that he literally hard coded it.
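As an aside for the curious: the LM scheme John is describing is well documented today. The uppercased password is padded to 14 bytes and split into two 7-byte halves, and each half becomes a DES key that encrypts the constant `KGS!@#$%`. A minimal sketch of just the key-handling step, with the full DES encryption omitted; the function names here are mine, not from any library:

```python
def des_key_from_7_bytes(key7: bytes) -> bytes:
    """Expand 7 key bytes (56 bits) into the 8-byte form DES expects,
    with the low bit of each byte set to give odd parity."""
    assert len(key7) == 7
    bits = int.from_bytes(key7, "big")
    out = bytearray()
    for i in range(8):
        septet = (bits >> (49 - 7 * i)) & 0x7F  # next 7 key bits, MSB first
        byte = septet << 1                      # leave bit 0 for parity
        if bin(byte).count("1") % 2 == 0:       # force odd parity
            byte |= 1
        out.append(byte)
    return bytes(out)


def lm_key_halves(password: str) -> tuple[bytes, bytes]:
    """LM prep: uppercase, pad/truncate to 14 bytes, split into two
    7-byte halves; each half becomes a DES key for 'KGS!@#$%'."""
    p = password.upper().encode("ascii", "replace")[:14].ljust(14, b"\x00")
    return des_key_from_7_bytes(p[:7]), des_key_from_7_bytes(p[7:])
```

With a real DES implementation on top, each half independently encrypts `KGS!@#$%`, and the two 8-byte ciphertexts concatenate into the 16-byte LM hash. The uppercasing and the two independent halves are a big part of why LM hashes fall so quickly to brute force.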

John Strand:

So it used your password to encrypt that string using DES for LAN Manager. With NetNTLMv2, going back to that again, there's a whole bunch of fields in that protocol, and no one knows what the hell they do. And I was talking to some Microsoft engineers. They're like, look.

John Strand:

If we have to figure out how something of ours works, we go look at other people's documentation that has reverse engineered it, because we don't have that documentation in play. And going back, Alex, you know, we talk about things being ignored. I keep telling people all the time: every security standard that exists on the face of the planet, right? I don't care if you're working with CIS or you're working with NIST.

John Strand:

They say absolutely no clear text authentication protocols. And Microsoft's NetNTLMv2 is, in effect, a clear text authentication protocol. If you look at how it works and how it actually runs, you have a challenge, an eight-byte challenge, that is sent in the clear. The response, computed with the password hash, is sent, and it uses the password hash as the verification mechanism. All of that is reversible.

John Strand:

All of that is sniffable. But it's one of those things where you have to be like, we're just gonna pretend everything's okay, because of the legacy technologies. Because if you start trying to peel that back and fix it, you're basically gonna shut Windows out of a lot of these things completely. So this is tough: a legacy technology that has this huge amount of baggage that it continues to carry forward.
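The challenge/response flow John describes matches Microsoft's public MS-NLMP documentation: the server's 8-byte challenge and the client's blob (timestamp, nonce, target info) travel in the clear, and only the NT hash stays secret. A rough sketch using Python's standard library; the precomputed `nt_hash` argument is my shortcut, since MD4 (used to derive it from the password) is often unavailable in `hashlib`:

```python
import hmac
import hashlib


def ntlmv2_proof(nt_hash: bytes, user: str, domain: str,
                 server_challenge: bytes, blob: bytes) -> bytes:
    """Compute the NTLMv2 proof (NtProofStr) per MS-NLMP.

    nt_hash is MD4(UTF-16LE(password)), passed in precomputed here.
    """
    # NTOWFv2: HMAC-MD5 keyed with the NT hash over UPPER(user) + domain
    ntlmv2_key = hmac.new(
        nt_hash,
        (user.upper() + domain).encode("utf-16-le"),
        hashlib.md5,
    ).digest()
    # The proof covers the server's 8-byte challenge plus the client blob;
    # both cross the wire in the clear -- only the NT hash stays secret.
    return hmac.new(ntlmv2_key, server_challenge + blob, hashlib.md5).digest()
```

Because everything except `nt_hash` is visible on the wire, an attacker who sniffs one exchange can re-run this computation over candidate passwords offline until the proof matches, which is the "sniffable" property being complained about here.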

Ralph May:

So Microsoft is junk. But we have a solution. We have a solution, and that is to use AI for classified data.

John Strand:

Yep. I thought

John Strand:

you were gonna say just Oh, god.

Wade Wells:

I thought you were gonna say just lie on audio.

John Strand:

Hey, Ralph, what was he going

Ralph May:

to? I

John Strand:

know. Right? I I think my word count on this episode is already exceedingly high. Do you wanna take this one?

Ralph May:

Yeah. So the Pentagon is planning for AI companies to train on classified data. Right? So that's the article. And, you know, the wild part about this is, supposedly they're supposed to be using classified data centers for this, obviously, to, you know, house this classified data, and then they're gonna train on it.

Ralph May:

Right? That's the big idea. But it seems like the beginning of Terminator. I just feel like that was, like, the whole thing: they gave the machines access to the military and, like, all of this information. I know I'm kind of joking around about that, but, I guess, you know, obviously, they have, like, a Claude Gov and other things to isolate this out.

Ralph May:

But yeah. What do you guys think about this, or even the idea of classified information with AI? Right?

Bruce Potter:

I think there's a real challenge around access control. I mean, you kind of ignore the moral issues of, like, what they could do with it and whatever for a second. But, you know, classified information access is really geared around need to know, and it is far and away the most complex tagging and compartmentalized universe that's ever been created by mankind. And at the end of the day, like, these LLMs have shown that if you ask them the right questions, they will regurgitate the training material that they were given. Right?

Bruce Potter:

And so I think there's this question of, like, how do you protect need-to-know classified information if you can just ask the LLM the right thing? How are they gonna actually constrain that? I don't know that there's, at least publicly, been a lot of discussion around how they're gonna enforce that.

Wade Wells:

There's a couple of products out there right now that take all your data, all your notes, and all your information, and then supposedly give it back to you if you have access to it. Think about it as, like, a one-stop shop for your entire org. I have seen it both work and, of course, not work, and give notes on a particular document that a user wasn't supposed to have access to. But then also the same thing with, like, Slack messages and the group channels and stuff like that. Right.

Wade Wells:

I'm kind of interested to see what they do with the security, because I think if they can implement some, like, exact access control policy, it would trickle down to the public sector. It would be pretty cool.

Ralph May:

Just to level the field, right, before we get into it: the government has been doing AI-adjacent stuff with classified information. Right? Like, just databases full of this, to try to analyze it as fast as they can. So none of this is necessarily new. I think the new part is just how good these models are, and the fact that they weren't developed specifically in the government.

Ralph May:

Right?

Hayden Covington:

And it has to be, like... well, maybe I'm giving them too much credit.

Ralph May:

I was gonna say it has to

Hayden Covington:

be, like, a calculated risk: some of our theoretically most valuable information is going to be classified in some way, and we wanna use these models for warfare. So if we can give them access to this to train on it, whatever bad could come out of that is theoretically offset by the good we can do with a model that knows how to, you know, overthrow governments and things. I can understand the reasoning behind it. That doesn't mean I necessarily think it's a good idea.

John Strand:

I think it's a fantastic idea. And I know that seems weird, because I'm usually uber paranoid. Chad just tilts his head. Let me explain why I think it's a fantastic idea. Now, there's a specific utilization I'm talking about.

John Strand:

I'm not talking about let's have it make decisions to kill people. Once again, I'm with Anthropic on that. We should not have AI making those decisions.

Ralph May:

It would definitely not kill itself.

John Strand:

But I want you guys to think of a scenario. Let's say that there's an arms sale for X number of AK-47s, an African nation buying them from North Korea. Right? And I want you to look at all the different types of INTs that would basically let us know. Like, the intelligence sources that you would get.

John Strand:

Right? You would have HUMINT. Right? You would have human intelligence saying that this deal is going down. You would have SIGINT, where you'd be able to identify and track, you know, the shipping within North Korea going to this specific ship that's going to be moving across the ocean, and then track that ship.

John Strand:

You would have all kinds of different GEOINT, where you can actually see that. You would also have financial intelligence. And that financial intelligence can be part of the banking system, and it can also be cryptocurrency transfers, where you know roughly how much this many AK-47s cost. We've seen this transfer of this much bitcoin from this wallet to this wallet, correlating with all of this stuff. Now, take all of those different things, right?

John Strand:

In hindsight, when you see that, you can pull all of those data sources together and you can see it. And a really good example of this: go back and read the report on September 11, which kind of tracks this. The CIA had a lot of data, but it wasn't their job, and who would they give it to? How would they give it to them? What would all of that look like? Now, all those INTs that I talked about: when you're tracking SIGINT, you're tracking signals from literally billions of devices.

John Strand:

Right? When you're tracking financial information (whether or not you have warrants is a whole other series of conversations we can get into), you're tracking a lot of data. Right? So AI is really good at these types of problems.

John Strand:

Right? I want it to be able to identify patterns, like these following 15 patterns we can train it on, so it can start trying to identify these types of geopolitical, and these types of criminal-underground, and these types of military actions. And that's fantastic. And that's specifically the area that I was working in, and it was literally just thousands of people working their asses off.

John Strand:

And they were only seeing one little tiny piece of the puzzle. Like, they would say, we just saw a massive amount of Bitcoin go from this wallet to this wallet. What the hell is this associated with? And if you had the financial side of it, you could say, well, almost that exact amount was transferred from this bank account in Switzerland to this bank account in Switzerland. Now we've tied those bank accounts to a Bitcoin transfer.

John Strand:

There's a lot to this that is very powerful, and I can totally see how this would work for something like that. Now, where it gets really scary is, like Bruce was talking about, and, you know, Hayden had talked about it, right? How the hell are you going to secure AI data whenever it drops in and you have multiple classifications of data? You're gonna have to, specifically, what Bruce was talking about would be, like, SAP/SAR programs, special access programs, that you have to be read into to get the data associated with that.

John Strand:

And even whenever you're moving between the towers, moving between the CIA, the NSA, the NRO, they can all have data classified at top secret, and maybe SCI. That doesn't mean that anybody can read that SCI data. So that's where this gets really dicey. So there's applications where this makes absolute sense, and there's applications where I'm like, well, this is some of the most terrifying shit I've ever seen.

Hayden Covington:

Agents asking each other?

Ralph May:

I think it might You know, with all the,

Alex Minster "Belouve":

you know, the assembly of all that data, where it's looking at all the information, the question still is, or one of the questions is: how do you control for all, like, the inherent biases in all of this historical data? How do you make sure that it doesn't hallucinate things, or that it's not, like, a sycophant? Being like, well, I looked at all this and I'm not able to give you the answer that you want, but based on our history as a country and the things that we like to do when we can't find the answer... You know, like you referenced September 11, they would probably put a lot of bias into some outcomes there as well. Is it going to look at the same things going on and go, you know, in the absence of anything definitive, I'm just gonna kinda take a guess that's going to make my handlers happy, and here you go, this person is good for it.

Alex Minster "Belouve":

There's an arms deal going on. Yeah. I'm gonna kinda make some stuff up, because there's so much data, who's going

Ralph May:

to go through and double check your work.

John Strand:

I have no problem with that too, Alex. And the reason why I don't have a problem with that is that it already exists today. When you're working in these intelligence places, you constantly have humans that bring those biases to the table. Where I agree with you 110% is that you can hold somebody accountable. When it's AI, you don't have that level

Alex Minster "Belouve":

of... You can't check their work. Like, with a person, you can go, these are the things they looked at, these are the conclusions, and you go, oh, right here, this is where you made your error. Versus AI is gonna be like, trust me. Because who's going through all those decades' worth? It's gonna take a person ten years to double-check.

Hayden Covington:

Or you have another AI that fact-checks it. Or you

Ralph May:

have another AI.

John Strand:

Oh, there we go.

Wade Wells:

God. You're very good. You're doing good.

Alex Minster "Belouve":

What then happens if they disagree? Because AIs love to disagree. They will fight with each other. I'm like,

John Strand:

oh my god.

Alex Minster "Belouve":

If somebody has GPT, and they have Anthropic look at it, they're gonna be like, no. Then Grok makes

John Strand:

the final call.

Hayden Covington:

Yeah. Grok makes

Ralph May:

the call. Like the tiebreaker. Like, I'm definitely blowing something up. Right? Grok's like, launch

Hayden Covington:

the strike, launch it. Go.

Alex Minster "Belouve":

Yeah. Just jump to Skynet. Just launch the strike. Yep.

Ralph May:

Making up strikes. So CISA, I guess, is urging endpoint management system hardening after the cyberattack, obviously, against a US organization. Right? I think this is related to

John Strand:

Stryker. Yeah.

Ralph May:

Yeah. Stryker, and the old Microsoft Intune. I mean, you know, I definitely get the alert here. Right? I think it's just more of a general thing to be like, hey, we're kind of under cyberattack across organizations in America.

Ralph May:

Right? But, yeah, endpoint management systems are the fun way to gain big access.

John Strand:

I I did see

Hayden Covington:

It was a high-profile attack in a lot of ways, and I don't think they've really... I haven't been able to find anybody talking about how it happened, because we talked about that with some of our SOC customers last week. And so all we could really do is talk about how that group normally gains access to an organization, and we could give recommendations around that. But ultimately, like, the only thing we can talk about is how to make sure that they don't use your Intune to destroy your own environment.

Hayden Covington:

Right?

Wade Wells:

But looking at the steps that it would take in order to do this and get to that level of access, as well as what it looks like when one of these does fire, or when a large amount of them all of a sudden start getting wiped. Right? I've seen several conversations around that, and, like, pretty much, what are the triggers up to this final point? Which, for me, is a destruction-of-data technique, right? Which honestly means, by the time you're detecting that, you're kinda screwed.

Ralph May:

I mean, I've done this on multiple red teams, where as soon as you get access to a privileged account in Azure, you'll see if they have Intune access or if they're using Intune. That's a great way to spread quickly into the internal environment. So, I mean, it's been known for a while. Like you said, though, how did they get that level of access so quickly? I think that's kind of the shock.

John Strand:

I think the reason why we don't have that answer is I don't think they have the data. Yeah. It's one of those

Ralph May:

things. They didn't subscribe to the plus-plus tier where you get

John Strand:

the logs that you need from Microsoft. Yeah. And this is a theme. Right? Like, you can see how companies respond to breaches.

John Strand:

They're like, we were compromised. This is how it came through. Usually, they say, we've brought on Mandiant or some other high

Ralph May:

Yeah.

John Strand:

end IR firm.

Ralph May:

Brought on Google to secure Microsoft.

John Strand:

But they have, like, the flow of how it happened and, you know, what they're doing to deal with it. A lot of the companies that are just not saying how it happened, that's scarier to me, because that means they do not know.

Wade Wells:

Thinking about, like, the last couple of big-name breaches, I haven't seen one of those reports in a while.

Ralph May:

True.

Wade Wells:

Right? Like, a full breakdown of TTPs and what went through? Unless you're a customer of theirs and you're, like, screaming at them. That's the only time I've actually seen it, besides maybe some of the Salesforce stuff that happened.

Ralph May:

Yeah. Yeah. Like, I was gonna say, I just wanna, like, almost theorize, put on the

Hayden Covington:

tinfoil hat and say, like, oh, they're trying to keep things secret. Like, this is something that we also wanna use too. And I doubt that's the case,

Ralph May:

but that's a good point, Wade. It's becoming, like, a more recent trend with a couple of these big ones, where it's like, how did this happen? We don't know. But here's

Hayden Covington:

everything that happened afterwards. Like, you

Wade Wells:

take credit monitoring and go away.

Hayden Covington:

Pretty much. Yeah. Exactly. Right.

John Strand:

Take the credit monitoring and go away. I should have started a credit monitoring company, because they always win.

Ralph May:

They always win. They always get paid. Right? They always get paid. I was also gonna bring up the other thing, not to roll back to AI, but I think this does go to hardening, which is passwords.

Ralph May:

Right? So we had another article in here about AI-generated passwords. I guess people are using their assistants to ask for passwords.

John Strand:

Right? Oh, no.

Ralph May:

Yes. Yeah. And so, if you don't know, the magic of AI systems, or LLMs, is pattern recognition. Okay? They inherently create patterns.

Ralph May:

Right? And when you have a password, you want it to be random. As random as possible. Right? So, how can I say this?

Ralph May:

Don't don't do that?
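The fix the hosts are dancing around: get randomness from a CSPRNG, not a language model. A minimal sketch using Python's standard `secrets` module; the function names and the 24-character default are my choices, not from the episode:

```python
import secrets
import string


def random_password(length: int = 24) -> str:
    """Draw each character independently from a CSPRNG -- no model,
    no patterns, every candidate equally likely."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


def random_passphrase(wordlist: list[str], words: int = 5) -> str:
    """Diceware-style passphrase: uniformly chosen words, joined."""
    return "-".join(secrets.choice(wordlist) for _ in range(words))
```

`secrets` pulls from the operating system's entropy source, so every candidate is equally likely; an LLM, by construction, prefers likely-looking token sequences, which is exactly the property you do not want in a password.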

John Strand:

I think it's fine. I disagree. I'm being very contrarian today. I think it's fine if you call up Sears and you ask its chatbot what it recommends for a

Ralph May:

good password, and all of that, on the best way. Well, it's got

Hayden Covington:

plenty of data to pull from.

Ralph May:

Got so much data. If you asked

Wade Wells:

it for a 64-character password. Right? Like

John Strand:

Yeah.

Wade Wells:

In the end game, unless they get access to your chat logs, does that matter?

Hayden Covington:

Never happen. Never happen.

John Strand:

But we also convinced everybody that, like, passphrases are the vibe. So if we're doing shitty poetry the whole time, like

Wade Wells:

You got MFA installed. You're cool. Don't

John Strand:

worry. Are you ragging on emo password security?

John Strand:

Listen, I don't think generative models are good at emo lyrics yet.

John Strand:

Not yet. Emo's not dead. Emo's not dead. Correct.

Ralph May:

So I think the bigger thing is, don't try to use an LLM to, like, generate secure stuff for you, you know. I didn't know that had to be said, but I guess it does.

Wade Wells:

I mean, the bar... password managers

Hayden Covington:

are so easy now. Like, you go to log in and it's like, do you want us to log in and create your account and save everything for you? And it's like, as long as you pay us $30 a year, we'll, like, show up at your house and give you a hug at the end of it all. Like, the bar's there, versus asking ChatGPT

Hayden Covington:

Like 5.1 for your password or whatever.

John Strand:

You know what? I'm just gonna say this is another good idea, because when I work with my family and friends and they get breached, it's almost always the same thing. They're like, my PayPal got hacked. I'm like, well, did you have two-factor on? No.

John Strand:

No. I know, you told us about that at Thanksgiving. You went on this long thing. No two-factor. Didn't do that.

John Strand:

And I'm like, what was your password? It was... I named it after my dog. I can't help but think that ChatGPT would give you better password options.

Ralph May:

That's true.

Hayden Covington:

That's True.

Ralph May:

That's true.

John Strand:

Oh, man.

John Strand:

I haven't pitched a fit about that in a long time on this show. I haven't had any... I think my family members now know not to come to me. They're like, you've reached

Alex Minster "Belouve":

You know what

John Strand:

you need to you need to talk to John. They're like, hell no.

Ralph May:

Don't know. I'm done. I'm out. Cannot tell him. They can have all my money.

Ralph May:

I don't wanna go through that. I don't wanna talk to like, John. Last time I told him, he talked about

Hayden Covington:

it on his podcast.

John Strand:

The judgey asshole, yeah, he went through and explained everything about how I didn't listen to anything.

Wade Wells:

Didn't, like, the regulatory people change their password requirement? Who is the one you always hit on for always having a short password?

John Strand:

Oh, goddamn PCI.

Wade Wells:

Did they change it?

John Strand:

They did, but I think they went to, like, 12 characters instead. For those of you that don't know, this has been one of the things I've been ranting against for a long time. It was up until, like, last year that PCI finally upped their password requirements from seven characters to, I think, 12 characters, which is still bad. Right? And the reason why I hate this so much is, I think Ralph was on a pen test, when he was still at BHIS, where we cracked, like, 90% of their passwords.
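For a rough sense of what the seven-to-twelve jump buys, the entropy of a uniformly random password is just length times log2 of the alphabet size. A quick sketch; the 62-symbol letters-plus-digits alphabet is my assumption for illustration, not anything PCI specifies:

```python
import math


def entropy_bits(charset_size: int, length: int) -> float:
    """Bits of entropy for a uniformly random password of the given
    length over an alphabet of the given size."""
    return length * math.log2(charset_size)


# The old 7-character floor vs. the newer 12, over letters + digits (62):
old = entropy_bits(62, 7)    # about 41.7 bits
new = entropy_bits(62, 12)   # about 71.5 bits
```

Both figures assume uniformly random characters; human-chosen passwords like a pet's name carry far less, which is how 90% of a "compliant" organization's passwords end up cracked.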

John Strand:

And they were like, well, that can't be a finding. And we're like, why can't that be a critical finding? Like, well, because we're compliant with PCI. I'm like, you don't understand. It's just nerve-wracking.

John Strand:

And then I started calling it out, and then I literally had people contact me and be like, dude, you better not start beef with PCI. Those people will destroy your life. I'm like, what? There's a PCI mafia in computer security? Like, I'm not that worried about that.

John Strand:

But they finally updated. I don't even know. Ralph, you got me on this tangent, so screw you.

Ralph May:

Oh, no. I did I did that. I've got it. Yeah. It was Ralph.

Ralph May:

It was Ralph. It was

Hayden Covington:

it was not me.

John Strand:

Who It

Ralph May:

wasn't the person who works at a

Wade Wells:

password company. It wasn't it wasn't wasn't

Ralph May:

mister 1Password himself, or

John Strand:

Somebody else just mentioned the OSI model and completely set me off. Right? I just hate it. So the next story I wanna talk about, I just put this in. I hate this, and I wanna know: am I overthinking this?

John Strand:

I hate the Known Exploited Vulnerabilities catalog from CISA. It just makes me so mad, because you have a whole bunch of organizations out there that don't patch shit unless it shows up on something like the Known Exploited Vulnerabilities list. And I think it's a few hundred entries right now. Like, there's literally, maybe, a million vulnerabilities out there; I don't even know how many Tenable and Qualys and all these tools are scanning for. But it's gotta be hundreds of thousands.

John Strand:

Right? And the attackers are gonna exploit any of them that show up in your environment. They're not gonna be like, oh, well, it's not on the Known Exploited Vulnerabilities list, so we're not gonna exploit that, Dmitry. It just makes me mad, because there are so many organizations that are looking to meet the minimum, and this is creating yet another minimum for them to meet. And I wanna know, am I off base on this?

John Strand:

I I don't know. It it yep.

Wade Wells:

I think we've gone over this before, where we just need to have some company who just writes about vulnerabilities and then gets them to go viral, and then you just point at that article and be like, hey, look at this vulnerability. Like, submit a vulnerability here so we can write about it, so your company will see this article and then patch it. But I agree with you. This list, like, I've dealt with it too. And on, like, the intel side of it, right, I would then pivot and find other people writing articles or doing something if it's not on this list. Like, here's seven other things.

Wade Wells:

Just because it's not on CISA's list doesn't mean it's not getting exploited.

John Strand:

Screw it. Let's vibe code this app. Let's start a startup right now. We're gonna start this company where you pay us money. Right?

John Strand:

And then if you want something patched, you come to us and you say, hey, I need an article saying that this is actively being exploited by the Russians, and I need it to hit these following points to get management to agree. And then we will write that article and post it.

Ralph May:

I will write that article.

Wade Wells:

Oh, yeah. Hayden and I wrote the bot to make whatever we're

John Strand:

gonna be able to talk about. All of it.

Wade Wells:

But it was on Restream. That was the... We had Restream plugged in. Now Hayden and I have to rewrite the bot for Zoom, because we're doing it on Zoom today.

John Strand:

Dude, on that And and

Hayden Covington:

we're SOC 2 compliant. I promise.

John Strand:

And it's gonna post it

John Strand:

it's gonna post it to like Instagram, Facebook, LinkedIn. That's right.

Ralph May:

It's gonna

John Strand:

have like influencers and Reddit. So it looks like it's a big a big deal.

Bruce Potter:

So if I could interject, I think the one thing to keep in mind with CISA, not to defend them necessarily, is that their mission is protection of federal government assets and critical infrastructure and that kind of thing. And there's been a historic gap in the federal government around who's there to help the private sector, who's tracking it, and whatever. And there are ISACs for, you know, kind of designated critical infrastructure verticals and that kind of thing. But in general, we look at CISA and go, come on, be better, and poke it with a stick. But the reality is, they're first and foremost trying to protect federal agencies and just telling them, like, these KEVs: you gotta do this thing by this date, and that's why they issue it.

Bruce Potter:

And then private industry has been like, oh, we'll use that too. But it's a terrible barometer, to your point, for private industry, because attackers are gonna hack, right? Criminals are gonna crime, and they're gonna figure out the best way to do it. So there is no federal agency that has the edict to protect the citizenry and the businesses at large.

Wade Wells:

Alright. I got it. I got it. So do we pull a card out of the Better Business Bureau playbook and become the Private Sector Cybersecurity Bureau, where people think we're a government agency, but we're really not? You know what I mean?

Wade Wells:

Like So

Alex Minster "Belouve":

And for, like, the CISA KEV, it depends on what the capacity of the organization is. So I know we look at it from a lot of the organizations that have the capacity to do a lot of patching, and then they go, we're only gonna patch, like, the CISA KEV stuff. But there are a lot of organizations, and I do this as, you know, like, a volunteer for the Wisconsin Cyber Response Team, we help out a lot of, like, school districts, local libraries, stuff like that, that do not have a huge amount of staff. They have a phishing incident, and you have to explain a lot of the basics to them.

Alex Minster "Belouve":

The CISA KEV is gonna be the same type of thing, where you go, okay, where do you start with the patching? You gotta start somewhere, because you don't have the capacity to do a lot of patching. But, yeah, we look at

Ralph May:

it from the viewpoint that, you know, you're a billion-dollar company that's just looking at

Alex Minster "Belouve":

the CISA KEV entries and going, just patch those. That's bad. Something

John Strand:

But also government agencies, when we're testing. Like, once again, some of the stupid conversations we've had, and once again, I don't know why anyone does business with BHIS sometimes, where we have these conversations. Chad's gonna take that, he's gonna cut it, and that's gonna be advertising on a T-shirt.

Hayden Covington:

Red snake. Right?

John Strand:

It's gonna be

John Strand:

a T-shirt. But, you know, with these federal agencies, once again, they're like, yeah, you exploited that, but it wasn't in the KEV, so we don't have to patch it. And I'm like, that's not the point of it.

John Strand:

Right?

Hayden Covington:

Like and

John Strand:

and these these are not conversations that happen a lot, but they do happen. And I don't know how we push past that. Right? How do we get people to stop constantly looking for the absolute minimum? Or do we just let nature take its course?

John Strand:

Is it is it like, you know what? Whatever. Hackers will show you the error of your ways in a matter of time. It's just fine to let that happen naturally.

John Strand:

I think that's true. And what I've been thinking this whole time is, there isn't a ground level for security for anyone. So if they pick the KEV, that's, like, pretty decent, because I'm used to talking to, like, middle and smaller large-sized orgs that are like, it's fine. It works. Yeah.

John Strand:

That's, like, what I've talked about with aerospace people too. Like, everybody's kinda mad that nobody thought about security for thirty years or something, and now we're like, whoopsies. Now we can destroy the universe by accident.

John Strand:

Then you get into that cycle. It's like, well, legacy technology, all over OT. We haven't done security in thirty years, and it'd break everything if we started now.

John Strand:

So we're just gonna continue not doing security because everything, right, is on fire. So no, these are tough problems. It's like, I had that one lady, she had just gotten a CISO position at a very large company. And I went down and visited her in Tennessee. And we did a security assessment of all of their apps, and they had a ton of apps that were Access databases.

John Strand:

You remember in Access, you could publish a web page that had the Access app, quote unquote, that you created? They had a bunch of mission critical web apps that were generated from Access databases. And we found thousands of critical vulnerabilities, and she cleared the room and she's like, what do I do? And I'm like, here's what you do. You stay here for a year,

Ralph May:

Don't ask me how I know.

John Strand:

Man, you think security geeks are horrible to be around when they're arguing with each other? Those forums are awful.

Ralph May:

Oh, yeah. I

John Strand:

bet. You know, what is the range of this particular, like, vehicle? It's like and they all fight to the death on that stuff.

Hayden Covington:

You gotta cite your sources.

John Strand:

You do.

Ralph May:

You definitely have to cite your sources. Yeah. I just thought it was interesting. Not the actual breach itself, just the amount of data supposedly taken, much of it classified, from the National Supercomputing Center in China. I'm like, what? This

John Strand:

is I think a

Hayden Covington:

pretty cool truck.

Wade Wells:

Not seeing, like, Chinese breaches is just news bias. I'm sure they happen all the time and they get

Ralph May:

reported on in China and we just don't see them. That's also probably correct. Right? Yeah. We're not looking for them.

Wade Wells:

I I think the more important breach here that we didn't talk about was the Crunchyroll breach.

Ralph May:

Oh, yeah? What happened

John Strand:

in Crunchyroll? A lot of people.

Wade Wells:

ShinyHunters are taking on the weebs. That's all that's going on, pretty

Hayden Covington:

I've seen so many posts that they're like, I was right to pirate all my anime.

Wade Wells:

So pretty much, it was a third party actor who had access to Crunchyroll's data. Crunchyroll is owned by Sony. I didn't know Sony owned Crunchyroll

Hayden Covington:

until Yeah.

Wade Wells:

Yeah. No clue. Right? It sounds like an infostealer, to tell you the truth. A third party got breached, that third party had access to Crunchyroll, then they extracted, I think it was only around a 100 gigabytes worth of data.

Wade Wells:

IP addresses, email addresses, credit cards, and PII. Right? Nothing too crazy.

Ralph May:

Nothing too sensitive. But if you had your credit monitoring company, you'd be getting paid right now.

Wade Wells:

They definitely know I watched all of Attack on Titan, as well as that I still need to catch up on Demon Slayer. All of your viewing history, out there for the world.

John Strand:

I got a question. Like, Attack on Titan and One Piece, like, do you ever, like, say, okay, I'm gonna sit down and watch this, and then you realize they're into the thousands of episodes and it's like

Wade Wells:

For One Piece, I will put this out there: there's actually a fan project called One Pace that gets rid of all of the filler stories, and it cuts down on the episode count quite a lot. I will put that

John Strand:

You need to send me a link for that.

Wade Wells:

Yeah. I'll read the manga. It's so much better.

John Strand:

But Wheel of Time or anything from

Wade Wells:

Oh my gosh. You just literally the only other.

John Strand:

I don't even know how people approach Warhammer 40,000, or is it 40K, or whatever.

Wade Wells:

You're Yeah.

John Strand:

It's like this seems like a lot of

John Strand:

geeky people are into this. How do I get started? And God forbid, do you talk to a fan? Once again, I'm sure the computer security people are like that. You know, like, how should I get started in computer security?

John Strand:

Well, the first thing that you gotta understand is that absolutely nothing's secure. Your lamp, your TV, your fridge, it's all spying on you, man. Just like, oh god, Thanksgiving is ruined at John's house.

Alex Minster "Belouve":

There have been so many poor bartenders at security conferences where they're like, my god, please. There's this whole book on encryption and, like, all the mathematics, and it's all ruined, and, like, these algorithms and everything. And the bartender is like, I

Ralph May:

Oh. Hate this guy. No worse.

John Strand:

This is horrible. So years ago at SANS, we were doing a conference at the Wardman Park. And we were doing faculty shot Fridays, where we'd all go down during lunch and we'd take a shot, which was a horrible idea. We did that like once and it was over. And we were talking, and the bartender was like, so what are you guys here for?

John Strand:

What's the conference? And I'm like, oh, it's a computer security conference. And Bruce was like, because she's like, I was stuck here during a snowstorm with a bunch of security professionals. And she went off on the Snowmageddon conference. And she's like, you people are awful.

John Strand:

And I'm like, that's that's great. That's us. We drink a lot. So apparently, we ran them out of liquor.

Bruce Potter:

Yeah. They had to sleep in the hotel that year because nobody could get in or out. So some people were on shift basically for, like, a day and a half during Snowmageddon at ShmooCon.

Ralph May:

That was a

John Strand:

That was actually one of my favorite memories ever in computer security. Like, walking across that bridge when there's no cars, and it's like a foot and a half of snow, and nothing. That was so cool. Yeah.

John Strand:

So I put one more in: the DarkSword iOS exploit chain. When Google does a write up, they do a write up. Holy crap. I just put this in chat, and they're going through this entire toolkit that you can get specifically targeting iOS, and it has exploits in it for, like, you know, what is it? Snapchat, I think, was one of them, in here.

John Strand:

There's all kinds of stuff in it. This makes me cry with joy whenever I see this level of writing, kind of breaking down these particular kits and all the different aspects of them. But this particular exploit, they think it's associated with Russia. I find it interesting that they can't directly attribute it to any threat actor. But it's been used in Saudi Arabia, Turkey, Malaysia, and Ukraine. I had seen some people talking about it saying, yeah, it looks like it's Russia.

John Strand:

Other people think it might be an Israeli kit that's being used. But, you know, basically it goes after iOS versions 18.4 through 18.7 and has six different vulnerabilities to deploy a number of different malware families. The malware families are Ghostblade, Ghostknife, and Ghostsaber. It mirrors the Cortana iOS exploit kit as well, which, I believe, was confirmed as Russian, but I'm not a 100% certain on that. But just a really nice write up on this.

John Strand:

And the delivery mechanisms, like I always kind of talk about the NSO Group, and how the delivery mechanisms that we see in a lot of these nation state level attacks boil down to utilizing ad delivery networks, and the complications of actually doing that are pretty substantial. But a really great article of, you know, how these work. And the thing that I keep coming back to, they have all these IOCs and stuff. Is anybody monitoring their mobile devices? Like, I mean, do we have any logs that we're getting off them?

John Strand:

Are we getting any endpoint protection on these devices? So we

Wade Wells:

get all these things fixed. We installed iTunes on everyone's iPhones. Yeah. Intune. It's Intune.

Wade Wells:

My bad.
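John's question about IOCs and mobile logs is actionable if you do centralize DNS or proxy logs from managed devices: matching queried domains against a published IOC list takes only a few lines. A minimal sketch in Python, assuming a hypothetical flat-text IOC list and a `timestamp,device,domain` CSV log format; neither reflects any specific vendor's export:

```python
# Minimal sketch: flag DNS log entries whose queried domain (or any
# parent domain) appears in a published IOC list. The IOC file format
# and log line shape here are hypothetical assumptions.

def load_iocs(lines):
    """Build a set of lowercase IOC domains, skipping blanks and comments."""
    return {
        l.strip().lower()
        for l in lines
        if l.strip() and not l.lstrip().startswith("#")
    }

def matches_ioc(domain, iocs):
    """True if the domain or any parent domain is in the IOC set."""
    parts = domain.lower().rstrip(".").split(".")
    return any(".".join(parts[i:]) in iocs for i in range(len(parts)))

def scan_dns_log(log_lines, iocs):
    """Yield (timestamp, device, domain) for each hit.

    Expects hypothetical 'timestamp,device,domain' CSV lines.
    """
    for line in log_lines:
        ts, device, domain = line.strip().split(",")
        if matches_ioc(domain, iocs):
            yield ts, device, domain
```

The parent-domain check matters because published IOCs are often apex domains while logs record full hostnames like `cdn.evil.example`.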

Ralph May:

You put iTunes on there? Me too.

Wade Wells:

Yeah. What's the like, one of the first things I think I've always done when you get to a network is figure out what people's guest WiFi is and immediately take whatever that network is and do not alert on it because the amount of garbage when someone connects to that

Bruce Potter:

bad things.
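Wade's guest-WiFi tuning trick, dropping alerts sourced from the guest network before they page anyone, can be sketched with the standard library's `ipaddress` module. The guest CIDR and the alert dict shape here are assumptions for illustration, not details from the show:

```python
import ipaddress

# Hypothetical guest WiFi range; the real value comes from your network team.
GUEST_NET = ipaddress.ip_network("10.99.0.0/16")

def suppress_guest_alerts(alerts, guest_net=GUEST_NET):
    """Drop alerts whose source IP falls inside the guest network.

    `alerts` is an iterable of dicts with a 'src_ip' key (assumed shape).
    """
    for alert in alerts:
        if ipaddress.ip_address(alert["src_ip"]) not in guest_net:
            yield alert
```

Filtering by source network this way keeps unmanaged personal devices from flooding the queue while alerts from the corporate ranges still fire normally.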

Wade Wells:

There's yeah. There's no way, like I put

Ralph May:

three EDRs on my iPhone. That's how you

John Strand:

that's how you keep up to date with it. And you got those from the store.

Ralph May:

Out of money.

John Strand:

This one looks good. It's got five stars and four reviews. I'm sure it's

Ralph May:

Well, I downloaded it off the App Store. It said it was good to go. Like, it said, like, secure your phone. I keep getting ads. It's still weird.

Ralph May:

Yeah.

Bruce Potter:

This is

Wade Wells:

a really dark sword and it's like icon. Yeah. Yeah.

John Strand:

Was it? You remember when AdBlock Plus got pulled? AdBlock Plus was proxying all the traffic through itself so it could filter out ads globally, not just in your browser on your device, but all ads in any app. And Microsoft and Apple were like, no. Not today, Satan.

John Strand:

So I think it's very difficult for any application to get the level of visibility into a phone to do that level of security just because of the way their security models are built.

Ralph May:

Yeah. Yeah. The phones overall are pretty locked down, even though there's definitely ways to abuse them, you know. So are they, though?

John Strand:

I mean, that's one of the things that bothers me. It's like, yeah, the phones are pretty locked down, but we hear about these exploit kits, we hear about companies that are doing it. And I think it's locked down so that security researchers have a hard time, like, doing any security assessments on it. Yeah.

John Strand:

But is it just like, no, no, it's locked down. Trust us, kids. It's fine. It's not like a general purpose computer where any kid with a dream who lives in Norway can start taking apart Apache. Right?

John Strand:

The barrier to entry is a little bit higher.

Wade Wells:

Like, Hayden, have you ever seen mobile device logs? Forensic.

John Strand:

That's not Forensic.

Wade Wells:

That's not the same. That's not the same. I mean, like, that is. Yeah. Right?

Wade Wells:

Like, I would say there's, like, no security on cell phones whatsoever, and that is a ripe attack vector, right? Because everything that's on your computer is gonna be stored on there, as well as access to MFA. So

Bruce Potter:

You know what, I think the threat model is really, you know, onesie-twosie, right? Like, you gotta be a target. I mean, to go all the way back to the beginning of the show, talking about compromise via Intune, the only reason that's possible is because Intune is installed on every endpoint, and you pop the MDM and you just go own everything, you know. At least with cell phones, it's trench warfare, and, you know, they have to be interested in you as a person and what you have access to, to go get it, and then they're gonna spend that, and then they might get caught. Right?

Bruce Potter:

It's really expensive, and it's more intel than cybercrime.

Wade Wells:

I don't think so, because we've seen enough infostealer malware where everyone gets it, right, with different browsers. What if an infostealer targeted cell phones? Then Cool. Customs saw that. Right?

John Strand:

But they do. And, you know, when we're looking at the cost, I actually somewhat disagree. Right? If you're a nation state level adversary and you're going after a particular organization, it becomes a lot easier, like, to target one of their systems administrators to gain access to their phone. That initial cost would be high as a per-device calculation.

John Strand:

But you can actually specifically target individuals that are high value targets right out of the gate. And that's some of the things that we've seen. Like I said, I keep talking about, like, NSO Group's capability of doing highly targeted malvertisements, where you can identify a specific ad profile for a specific individual. You don't have to provide their name. So, yeah. It's more expensive per device to exploit, but I think with some of the targeting tools that are out there, you do get that return on investment a lot faster because you can Yeah.

John Strand:

Target those specific individuals.

Alex Minster "Belouve":

Yeah. I was gonna say, like, being able to, you know, move faster,

Ralph May:

you know, chain things better and, yeah, you're able to make better use of what you get.

Alex Minster "Belouve":

You know, with AI and everything just making things move quicker, you're seeing

Ralph May:

a lot of the, you know, low vulnerabilities getting chained together and getting, you know, exploited. But I think it'll Yeah.

John Strand:

But either way, kind of getting back to the point about the logs, though, Alex, we're still just sitting in a cave, you know, making shadow puppets on the wall. We don't have logs off of these devices. How can we make a determination one way or the other if we don't have visibility? And if we don't have visibility into a particular part, do we have security? And, you know, at BHIS, we've gotten a huge amount of pushback.

John Strand:

It's gonna happen one way or the other, folks. Where we are now, either you're putting the security tools on your device, or you're gonna run a dedicated BHIS device. And then we have the visibility that we need. And that is such a culture shock, even for people in security, to say, wait a minute. I don't have full privacy on this.

John Strand:

It's like, no. If you have corporate data on it, I need to have visibility into that. Right? Either, a, you're running one that I give you explicitly for this task, or I'm gonna put some software on it that gives me the visibility that I need to be able to see what's going on. But we're so caught up in, like, not letting go of this binky, this shiny metal box that we have, that we think is our universe, that I don't even think the vast majority of people wanna have visibility into it.

Alex Minster "Belouve":

And it needs that shift away from the I'm-not-a-target mindset. Like, you know, understand there's that threat modeling, don't be all sky-is-falling and everything, but if we don't have those logs, we don't have that visibility, if we don't know, how are we saying, like, we're not targets?

Wade Wells:

They're not

Alex Minster "Belouve":

interested in me. It's like,

John Strand:

okay. And we haven't gotten into ads this time, Alex, which is weird. Yeah. Because when you're on the show, usually we go down that path. But the reason why they don't want us to have that visibility is because then we would have visibility into the amount of privacy violations Yeah.

John Strand:

That are happening on those devices. Because if I can hook at the kernel level, if I can hook and do a full packet capture on what's going on, and decrypt what's going on on this device, then you're gonna know just exactly how much data Alexa and Siri and all this shit is getting up on you. Right? Nobody wants that.

Alex Minster "Belouve":

A lot about you, and then when you're able to see those logs, everything that people have about you, then you go, hey, wait a minute, maybe I am part of this threat model, and maybe we do have to start worrying about it, maybe we do have to loosen our grip on that, you know, security blanket.

Ralph May:

ChatGPT is about to drop their ads. Everyone's about to start getting them.

Wade Wells:

No. Really? I didn't see that. Why why didn't we talk about that?

John Strand:

Like mean Next

Ralph May:

week. Next week. Yeah. They talk about the Super Bowl. They've been, like, hitting it hard.

Ralph May:

But yeah. There's gonna be Yeah. Yeah.

John Strand:

Alright. We'll we'll save that for next week. Hey. Thank you everybody for coming. Let's bring out the crooked finger.

John Strand:

Thank you so much and we'll see you next week.

Alex Minster "Belouve":

Alright.