A weekly podcast with BHIS and friends. We discuss notable infosec and infosec-adjacent news stories gathered by our community news team.
Join us live on YouTube, Mondays at 4:30 PM ET
Like, it's what you expect Tomorrowland to actually be.
Corey Ham:Like, it's just a bunch of drugged-up people asking where the toilets are.
Wade Wells:They got, like, real rockets everywhere. Alright? How dare you? They have a moon rock you can touch? Like
Corey Ham:No. I know. I'm saying that's what is it the I don't know what Tomorrowland is, but in my head, it's like, an electronic music festival. Is that how
Bronwen Aker:it is?
John Strand:Yeah. That's what
Corey Ham:I was thinking.
Wade Wells:Partially what it is. I man, you guys have never been to Disneyland. Tomorrowland is, like, the
Corey Ham:future land. In Disneyland like you do, you Californian. I
Wade Wells:just assume you all know what Disneyland is. Alright?
Corey Ham:Yeah. That's the California perspective. Just like Bronwen's like, I traveled to another state, and the weather wasn't as good. Okay. Welcome to Californian first world problems.
Bronwen Aker:No, that wasn't a shock. Hey. And you know what? Truthfully, walking around in an open air sauna day in and day out is probably wonderful for the skin. The heat.
Corey Ham:The sun wrinkles your skin, so maybe not.
Hayden Covington:Yeah. And then the downside is that you're in Florida. That's the downside to that thing.
Wade Wells:I feel like our our typical Floridian isn't here to defend himself.
Corey Ham:Yeah. No. No.
Bronwen Aker:I'll show up. You never know.
Hayden Covington:It's like talking bad about someone when they're not here. Basically, everybody knows
Corey Ham:what's going on. I saw a little "join as John Strand." Maybe Ralph's just confused.
Hayden Covington:I just saw a silhouette that said seven.
Wade Wells:In the This is
Corey Ham:John Strand. Like the Bill Gates mugshot. The Bill Gates mugshot. Does anyone remember that little
Bronwen Aker:text? Yeah.
Ched Wiggins:The visual model had to warm up first. Yeah.
Corey Ham:Yeah. John's off screen going, how now brown cow?
John Strand:And it's like I've been teaching all day long, and now Zoom is gonna poop the bed. Thank Greg
Hayden Covington:Better now than when you're teaching, I guess.
Wade Wells:Oh, right? Definitely.
Corey Ham:Yeah. Alright. What do you think, Ryan? We ready to launch the show?
Wade Wells:John, I have one question. I have a question for John. What? John, have you seen the new Star Fox?
John Strand:The game? Is it out, or is it just a video trailer?
Wade Wells:It's the trailer, but it's supposed to be out at the end of the month.
John Strand:Because I've seen the trailer, and it looks awesome.
Corey Ham:Okay. Dude, top play three is gonna be out this month. No.
John Strand:I'm just kidding.
Bronwen Aker:Stop.
Corey Ham:Alright. Roll the finger. Let's do it. Like how Hayden's just ventriloquisting me. I don't know what's happening.
Corey Ham:Hello, and welcome to Black Hills Information Security's talking about news. It's 05/11/2026. Should we just call it Canvas day? Everyone will bring
John Strand:It it also might be Waza day as well because they got a nine. Waza. Nine from Waza. So
Corey Ham:Alright. I'll start with introductions. I'm Corey. I'm here to podcast with us all today. We've got Wade Wells.
Corey Ham:He's here to wade through some logs. Yeah. And I caught your webcast last week, Wade. It was great. I I enjoyed it.
Wade Wells:I was about to say you were there before I started. You better have caught it.
Bronwen Aker:Like
Corey Ham:I was there. I listened to it, and you, like, plugged me at one point during it. I mean, honestly, like, if you listen to this show, if you're here listening to this podcast right now, you should check out the recording because it's very much what we do. Like, genuinely pretty much what we do on the show. So nice work on that.
Corey Ham:Hayden
Bronwen Aker:Thank you.
Corey Ham:How's it going, man? Hayden runs can I say you run the SOC, or are you just Claude?
Hayden Covington:I'm one of the people. I'm one of the people running
John Strand:the Yeah. One of the people that runs the SOC.
Corey Ham:He's one of the people riding the claw that runs the SOC.
John Strand:Riding the claw.
Bronwen Aker:Yeah. I
Hayden Covington:spend more time talking to AI than humans. Yeah. That's fair.
Corey Ham:Oh. And I have okay. I have legitimately noticed that since I've been using Claude more and more and more that, like, my carpal tunnel's coming back because I just spend so much time being like, no. You need to refocus. You're you're focusing on like,
Hayden Covington:I I hit, like, a five f bomb Claude day on Friday, and I was like, it's time to end the week, I think. This is not going well.
John Strand:Like carpal tunnel? Excellent. It's working. It's working. Were
Corey Ham:Hayden, were you the one whose AI just unprompted decided to call you meatbag? It just No. No. That's fine. That's Derek Banks.
Corey Ham:Yeah.
John Strand:Derek called his AI Skippy, from the book series Expeditionary Force, and Skippy constantly calls the lead character Meatbag. And yeah. So if you're ever seeing Derek's screen and it's like, well done, meatbag, or it calls him, like, you know, an unintelligent bacteria or a bacteria with promise.
Corey Ham:Yeah. That's because he called me a moron. Makes sense. I've seen some screenshots of his chats, and I've always been like, are you next in the AI uprising? Are you okay?
John Strand:Why is your AI so abusive? Yeah.
Corey Ham:Yeah. It's funny though. We also have John Strand, the owner of Black Hills Information Security and like 17 other companies, including a coffee shop, I think, maybe.
Bronwen Aker:Yeah. Definitely needs an intervention. And
Corey Ham:then we have Bronwen, who we already heard from. She's back from HackSpaceCon.
Bronwen Aker:Amazing conference. Definitely gonna be keeping an eye out for the CFP call because I wanna go back.
Corey Ham:You went as an attendee?
Bronwen Aker:I was invited to participate in a panel talking about GRC in business. And it was it was really nice because we had a mixture of people on the panel. So I was able to speak from my experience of what I see on the pen testing side. And then we had, a couple of CSOs, and I'm not sure what the other guy was doing, but it was it it seemed to be very well received. The the audience had great questions, and it was genuinely, an honor to be there.
Bronwen Aker:And so many so many people had so much love for this web stream, the the weekly newscast, and all the other stuff that we do. And, I mean, it was it was great. It had a wonderful b sides kind of vibe. It was small enough that you got that cool intimacy, but they had four or five CTFs. And everything from OSINT to packet hacking.
Bronwen Aker:This is the real deal. So, yeah, I will be going back to HackSpaceCon if at all possible.
Corey Ham:Nice. I was gonna give you, you know, kudos for going as just an attendee and having a good time, but instead you decided to contribute. So kudos redacted, but, you know, I'll allow it. Redacted or Retracted?
Corey Ham:Retracted. Let the record show no kudos were issued. So redact the yeah. Strike that.
Bronwen Aker:Strike that. Reverse it.
Corey Ham:Yeah. Strike that. Reverse it. Also have our community members and webcaster friends. We have Ched, mustache champion of the Discord.
Ched Wiggins:Another happy LMS customer.
Corey Ham:Who is here? Yeah. I was gonna say who's here to talk about Canvas. I'm sure he just has like a freaking whole like, it's just gonna be therapy for Ched. And then we also have Alethe with us who is here to hang out and plug her upcoming webcast and workshop.
Corey Ham:So if you are interested in doing some social engineering or pretexting or other fake evil things.
John Strand:I've gotta ask Alethe a question, though, because I it just kinda kicked things off. Here we go. If you were a tree no.
Alethe:I was actually once in a play. Go on.
John Strand:So here's here's the question. Do you think social engineering is gonna become more important with in in, like, this next evolution of AI? Like, AI defenses, AI offenses, is social engineering being able to implement that? Because social engineering has always been in the ether. Right?
John Strand:It's absolutely been in the ether. But I'm wondering if it's going to increase or decrease the necessity for social engineering, just with the explosion of AI everything everywhere. Like, people stay the same.
Alethe:I would agree that it is going to become more important and probably more prevalent. I think if you look at the more recent attacks, most of them start with social engineering, whether they highlight that in the news story about it or not. It's usually buried somewhere towards the bottom, but there's a phishing email or a voice call that happened. And being able to not only put together social engineering campaigns, but also be dynamic and able to think critically and pivot, is something that AI isn't really capable of doing quite yet. So a lot of the things that we're automating are gonna be easily circumvented, in my opinion, using more advanced social engineering tactics.
Corey Ham:I mean, also to pivot into the article of the day. Well, first off, go to Alethe's webcast. It's two days from now. Yep. And that one is about building a pretext, which you will need if you're gonna be doing social engineering.
Corey Ham:You don't wanna just wing it and be like, I'm sure this will work. I don't have any reason to call or any recon done, but I'm sure it'll work. And then Alethe also has a workshop next week I'm assuming it's just expanding on the ideas from your webcast, or how is it different?
Alethe:Yes. So we're gonna be starting with sort of a precursor to how to make a pretext, and that's in our webcast on Wednesday. And then at the end of the month, May 29, we'll be digging much more in-depth into pretext fabrication and how to put that into social engineering. And that syllabus is pretty robust for four hours, so we're gonna pack quite a bit into that time frame.
Corey Ham:Yeah. I mean, that's an insane value. Four hours for $25. I mean, do the math.
Hayden Covington:Yeah. That doesn't sound right. It can't be only $25. Right?
Bronwen Aker:That's Somebody made it. That's crazy.
Bronwen Aker:Just kidding.
Corey Ham:So yeah. I mean, honestly, let's just get straight into it, because we're talking about Canvas this week, which was executed by Shiny Hunters, who I would personally say is probably the most prolific social engineering threat actor of this modern era, by far. They have done tons of social engineering attacks. I will say with this one specifically, it's kind of unclear whether it's social engineering or not, honestly.
John Strand:Not a lot of tell.
Corey Ham:Yeah. It seems like the answer is it's not, but also it's Shiny Hunters, so I just assume it is. Right? Like Alethe said, they might not say it right away, but, like, if you scroll down far enough, eventually it'll be like, by the way, social engineering.
Hayden Covington:Yeah. They normally gain access through some sort of social engineering. Don't Yeah.
Corey Ham:It's like their whole MO. Yeah. 100% phishing.
Wade Wells:Yeah. Didn't they claim that they had a breach and then they had another breach? Yes. Right? Okay.
Corey Ham:And then Ched, since you've been living in this and just for audience context, Ched works in the higher ed industry, and so he's been just living this day to day. And I apologize for that. Ched, do you wanna run us through, like, high level, kinda how it's been playing out from your perspective?
Ched Wiggins:Yeah. Now's a good time to mention that I speak for myself and not my employer. The Krebs article is interesting because he goes over the timeline. He also has some insider sources, Krebs does. So it looks like there was an initial access which Instructure detected and then took down, but they called it a maintenance window.
Ched Wiggins:So already, it appears that there were transparency
John Strand:And an an Instructure detected it? Instructure.
Corey Ham:That's Oh, Instructure is the company that owns Canvas.
John Strand:I misheard I misheard. Sorry. Continue.
Corey Ham:Yeah. There's a lot of different moving pieces here, and Instructure and Canvas are two of the hardest terms to just say and remember on a in a conversation. So yeah. Anyway.
Ched Wiggins:Yeah. Good call out. But they called it a maintenance window, because the initial access vector wasn't known. When they came out of maintenance, boom, they were hit again. But instead of Shiny Hunters just negotiating with Instructure, it looks like Shiny Hunters put that pwned placard in front of 8,800 customers. So then the customers are like, what?
Ched Wiggins:So then
Corey Ham:It's exactly like the PowerSchool thing, where it's like, oh, well, your parent company didn't go for the ransom demand. So, okay, let's try again.
Ched Wiggins:Exactly. And so before, you had Instructure possibly negotiating with Shiny Hunters, but then some universities started reaching out to Shiny Hunters to do the negotiations. Again, it's all in the Krebs article, which is excellent. Toward the end of the Krebs article, there's something interesting where they mention the UPenn breach, which maybe is associated Harvard was breached at about that same time. So who knows if old credentials were exfiltrated during that breach and then just weren't rolled on the Instructure side.
Ched Wiggins:Again, when these big breaches happen, it takes time for the digital forensics and the incident response to happen. So we don't really get the full picture until weeks or months later, when we get, like, a fully written out technical blog on this. So
Corey Ham:Yeah. So real quick, Ryan, can you scroll back up a teeny bit? One of the things here that I wanna call out is this specific paragraph right here, where it says, we also identified a vulnerability regarding support tickets and our Free-for-Teacher environment. So I wanna kind of bring everyone up to speed on, like, conceptually how this works. So SaaS products, right?
Corey Ham:They have different tenants and different subscription levels, right? This is how almost every SaaS product works. Like, I can probably go sign up for a Salesforce account with my personal email and do some stuff. But then also there's some company that's paying Salesforce a billion dollars a year that has, you know, an enterprise tenant with all kinds of capabilities that I can't access. When they talk about their Free-for-Teacher environment, that is, like, the lower tier environment, where any teacher can sign up and use Canvas without it being an enterprise level deployment. So basically, reading between the lines on this, a lot of people have been speculating that this is a SaaS industry problem, where there's not appropriate separation between different tenants and different environments. So reading between the lines on the Free-for-Teacher environment, they created their own accounts and somehow exploited a vulnerability targeting that environment.
So it could be, like, social engineering via contact or comments, or something like XSS. They basically compromised one tenant, and then were able to bleed data from that into the enterprise, like, actual real paying customer environment. That's, like, speculative and hasn't really been confirmed, but that seems to be what a lot of people are kind of
John Strand:But that's assuming. It kind of feels random, though. Right? Like, the way that they drop it in. And it's like, oh, the free accounts, we're getting rid of.
John Strand:It's like,
Corey Ham:Why? Convenient. Any reason why?
Bronwen Aker:What an
John Strand:odd thing to say.
Wade Wells:This is this is like one of the, like, blue teamers worst nightmares. Right? When you say everything is all clear.
Corey Ham:Oh, and then it's not. Calendar, September. Yeah. So there's a bunch of write ups, and different people have reported different things. And like Ched said, it's gonna be months
Corey Ham:with these kinds of incident response things until we really get a post mortem that actually fits together. I guess, Wade, do you have any, like, lessons learned? Like, you as the incident responder or Ched, feel free to hop in as well. What do you think?
Wade Wells:There's some really like, how you said that it was a maintenance window. Right? That is a really interesting thing. Like, you're not gonna say incident right off the bat, especially if nothing is
John Strand:Nor nor should they. Nor should they.
Wade Wells:Exactly. Nor should they. So that's yeah. Perfect. But what I'm thinking is they thought it was something smaller, and then it evolved into something bigger.
Wade Wells:So it's not like they got caught with their pants down type of deal. Right? Like, we don't know the initial breach or how they got in. I think they timed this well. I think this is on purpose during Right.
Corey Ham:Yeah. The timing of it is so brutal. Yes.
Wade Wells:Which makes me think for other platforms. Right? Like, I know Intuit has a very strict policy of no changes during tax season. Right?
Corey Ham:Or most industries have some kind of a peak season where they're, you know, extra careful.
Wade Wells:Target during Thanksgiving type of deal. Right? Like but how many times do people build that into their threat models? Like, this is the
Corey Ham:time The worst time a breach could happen?
Wade Wells:Yeah. Yeah. Like, when when's the possible worst time a breach could happen into your threat model? Because most of the time, it's very much like we see during the holiday season. Right?
Wade Wells:Like, the Russian Orthodox holidays are much different holidays, on different days than our holidays. So you see a little bit of threat actors dropping off the radar.
John Strand:What was the ransom? Did I read it right? It was about a million dollars?
Hayden Covington:So they they were saying that when they they attacked I think it was the school in Pennsylvania, University of Pennsylvania in September. That was a million dollar ransom. I don't know No.
John Strand:But I thought for this one, I thought that they had I thought I saw it in the Krebs article, $1,000,000. That
Corey Ham:might be the per school price, but I guarantee the price for Instructure was much higher than that.
Wade Wells:They're saying negotiate your settlement, so they don't actually have it in the photo that we see. But
John Strand:because the reason why I ask is, man, if it was a million dollars, I'm surprised they just didn't pay that. But if it's per school or it's negotiating, then
Corey Ham:Chad, do you think most, higher ed places have a million dollars laying around to deploy for something like this?
Ched Wiggins:No. But I'm glad you
Bronwen Aker:asked that.
John Strand:I'm talking Canvas. Like, if Canvas was hit and they're like, yeah. Like, oh, we switch it from the university. Even though universities are just hedge funds with classrooms that's the quote that I stole from someone
Ched Wiggins:Should negotiate a raise.
John Strand:Yeah. You should actually.
Ched Wiggins:No. But I wanted to make a comment here. If you were not using Canvas and your leadership is of the mentality, we dodged that bullet. We're not a Canvas customer. Use that as a lever to move on.
Ched Wiggins:You know, we've needed an audit person for a while, and we haven't had the political capital or the budget to get whatever you need. A tool, a person, an audit, a pen test, a
John Strand:tabletop exercise. Use the fact
Ched Wiggins:that you dodged this bullet as political capital.
Bronwen Aker:Never let a breach or an incident go to waste even if it's somebody else's.
Wade Wells:It's like you just watched a really good webcast about watching the news or
Corey Ham:something like that.
Bronwen Aker:I don't know. Right? Like
John Strand:I always like looking at that, because you get into security conferences and they look at companies and they're like, well, they were stupid. Like, I remember going back to the Sony and the Target breaches, and even with this one, there's already some chatter. It's like, well, their security team must suck. And I think that's kind of an emotional, like, human way of trying to push what happened further away from you, so you don't have to acknowledge or work on things at your own organization. When the Sony and Target ones happened, you know, I got to know those teams fairly well.
John Strand:And, yeah, they got crapped on a lot, but the thing that I kept saying, you know, talking about those breaches, was: as a pen testing company, what hit them, we see all over the place. It is not unique. And we still have yet to see how they got breached here. It may be something really stupid, but even something like default creds happens a lot, even to organizations that are trying to do everything right. So I think we have to be careful in the industry about, like, crapping on people that have been breached, because, you know, the vulnerabilities and stuff that nailed these organizations are not exclusive to these organizations.
John Strand:They happen everywhere.
Corey Ham:This is totally speculation, and I want Hayden to react to it. Here's my, like, theory. Alright? So I am Shiny Hunters. I create a free account, a Free-for-Teacher account.
Corey Ham:I craft some kind of pretext, some kind of reason why I need to get support from Instructure. I submit whatever payload this is, you know, with whatever pretext, to support, and a support workstation is compromised. So, like, the support person who's looking at this analyzes the thing that I've sent in for support. I don't know what it is. Probably just a zip file with an exe in it. LOL.
Corey Ham:That payload detonates on a workstation. And then basically, my theory is that support person has super over provisioned access to everything, and so the blast radius is absolutely, insanely huge. And Instructure has the management problem of, okay, the support person had access to basically everything in this company. We're gonna take things down for maintenance, do a quick threat hunt, see what we can see, engage incident response.
Corey Ham:They just failed to identify some kind of access that the threat actors gained, whether it was a token or a credential, something like that, that they were able to harvest from maybe a session cookie. I don't know. And then basically they said, okay, all clear. And then, you know, Shiny Hunters was like, we have 17 API tokens for everything. We're just gonna make it rain.
Corey Ham:That's, like okay. That's completely made up. For context for the audience: I made all that up. None of this is confirmed by any sources. Hayden, poke holes in it.
Corey Ham:What do you think? Is that how realistic do you think that is?
Hayden Covington:I think it's very realistic, because eradication is very, very difficult. Like, there's a reason that a lot of companies, if a machine is compromised, they're just like, well, we're wiping it no matter what, because we can't really guarantee that it's safe. But in, like, the world of cloud environments and all these other, like, API integrations and everything, there's so many different ways you could have a backdoor. We have playbooks for different cloud compromises where there are lists of things that we make sure that we check, because there's so many of them in so many different ways that we're like, we will never remember this just off the top of our head.
Hayden Covington:We have to go check for x, y, and z, and maybe they did, you know, h over here somewhere. And so I think that is very plausible. And especially if someone, like, has an account, they have some sort of, like, established trust, and they're submitting a support ticket. So there's not as much, you know, scrutiny there as if it's coming truly from the outside. So I feel like your proposed scenario I would not be at all surprised if that's exactly how it happened.
Hayden Covington:And maybe the
John Strand:Well, it
Hayden Covington:also solution is, like, scan attachments.
Wade Wells:Like
Corey Ham:Yeah. Well, it also comes down to social engineering. Right?
Bronwen Aker:I'm like,
Corey Ham:I mean, this is what they're known for. They're known for being good at this and tricking people into doing things they shouldn't do.
John Strand:I'm gonna do a completely different one. I it it keeps sticking in my head. They're disabling the free for teacher accounts that are out there.
Corey Ham:Yeah. Temporarily, they say.
John Strand:I think that was the vector.
Corey Ham:And and Well, that's what I'm saying.
John Strand:No. But I I don't think that they got an endpoint. I don't think that they got an endpoint.
Corey Ham:Okay.
John Strand:I I think what happened is whenever you're working in an LMS, especially as a manager, there's management tools where you can see your users, you can see their progress, you can see all of those different things. And my belief is that they got access to a free teacher account. They gained access to a management dashboard. They found a way to trick the system into dumping the records for everything. Through the app type of thing.
John Strand:Yep. Like a web app type of thing. Probably a web app, or a lot of these things have APIs. So you may not be able to access certain things within the web UI, but you can access them through the API. But once you get granted that token for a teacher level account that you have full control of, I'm guessing that with one of those tokens, the restrictions were not limited specifically to the students in their unit, and they were able to pull down everything.
John Strand:And the only reason why I'm saying that, I don't think the desktop is involved in this. I don't. I think They had CrowdStrike,
Corey Ham:so it's fine.
John Strand:Yeah. They had CrowdStrike, so it's fine. Yeah. No. CrowdStrike.
John Strand:It's like hacker kryptonite. Oh, crap. They're gonna do that for commercials now. So the fact that they just, apropos of nothing, were like, oh, we're temporarily disabling, like, free teacher accounts. That screams to me.
Corey Ham:Yeah. Right. It's like, we know these can continue to be exploited. We can't do anything about it.
Bronwen Aker:Mhmm. Right?
Hayden Covington:Or or, like, if I
John Strand:was if I was CTO and they got breached through this particular mechanism, you know okay. Let's back up. If you are running a university I can't remember how many thousands of universities are actually using this. But if you're running a
Corey Ham:university They're super dominant.
John Strand:A lot. Right? If you're running a university as an administrator, you're probably doing two factor. You're probably doing all these protections. You're probably doing these things.
John Strand:You're probably a low risk for trying to hack the platform. Right? But if they provide the same level of tools that exist for managing your students, your student groups, and all of these different things, and they provide that to free accounts, I think that, like, they're trying to wipe that out right away to stop the attackers from gaining access to the free back end and free access to the management portal on these things, until they can have it looked at a little bit better. So I don't think it went to the endpoint. That's my theory.
John Strand:I think it was exclusively within the SaaS app. They found a way to move laterally and pull data, either directly through the web UI, or I'm going to bet that they gained access to a token for API access and abused the API access, because we're talking about a huge volume of data that they got. So that's my theory. Everybody take your bets. Let's go.
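John's theory that the dump went through an under-scoped API token rather than a compromised endpoint can be sketched in the abstract. Everything here is hypothetical invented endpoint, field names, and unit sizes; it is not Canvas's API. The point is only that if the server honors a token's validity but never applies its unit/scope claim as a filter, a plain pagination loop walks off with the entire dataset:

```python
# Hypothetical dataset: 1,000 students spread across 50 "units".
# A teacher-level token should only ever see its own unit's 20 students.
STUDENTS = [{"id": i, "unit": "unit-%d" % (i % 50)} for i in range(1000)]

def list_students(token_unit, page, per_page=100, enforce_scope=False):
    rows = STUDENTS
    if enforce_scope:
        # What the API *should* do: filter every query by the
        # scope claim baked into the token.
        rows = [s for s in rows if s["unit"] == token_unit]
    start = page * per_page
    return rows[start:start + per_page]

def dump_everything(token_unit):
    # Attacker loop: a valid (but unenforced) token plus ordinary
    # pagination is enough to exfiltrate the whole table.
    page, out = 0, []
    while True:
        batch = list_students(token_unit, page)
        if not batch:
            return out
        out.extend(batch)
        page += 1
```

Without scope enforcement the loop returns all 1,000 records to a token that should see 20, which is consistent with the "huge volume of data" point: from the attacker's side it's just boring, high-volume API traffic that no EDR agent would ever see.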
Corey Ham:That sounds a lot more plausible, I think.
Hayden Covington:Good. Because in, like, the the the scenario you described, Corey, I mean, they're not necessarily gonna go, alright. Well, let's just take down help desk. So it'd probably be a different kind of, like, you know, remediation avenue. Like, how do we fix this?
Hayden Covington:But for them to say, hey, we're shutting down this specific program. Like, I think that might actually be
John Strand:Also let's also let's also go with shiny hunters. And if you had the help desk scenario, that went through, if you use that, then I think that the ransom would have been fundamentally different than what we saw. I I think that you I think the attackers may have pivoted, tried to gain additional internal access, and tried to shut them down, maybe do both. But the fact that it's just data, it it like I said, I'm leaning towards it's it's the SaaS app.
Corey Ham:I think that's a reasonable theory. I mean, obviously, we don't have any real answers, but it's interesting. Part of the reason I brought up the scenario that I did is because I think we talked about that exact scenario, like, last week, and essentially the explanation of, oh, well, why did this happen, was that they were missing EDR on that system, or it was misconfigured. Hypothetically possible, but for a more mature company like Instructure, you would hope that isn't possible. And yeah, like, abusing an API is probably not gonna get picked up by most endpoint security products.
Corey Ham:Like, you're kind of under the radar at that point. You're just kind of flying under, you know you're dumping data, but not payloads. Sorry about
Bronwen Aker:There's the human factor too. I mean, think about how many teachers would be using this system. And realistically, how savvy are they going to be about cybersecurity practices and why they matter?
John Strand:They're not gonna be hacking the APIs. Well, most of them, you would think. Right?
Bronwen Aker:Yeah. But getting in through reused creds or something like that through the free teacher accounts, yeah, there's a lot of plausibility there, I think.
John Strand:The other thing I wanna ask, as we get to logs. Right? And we have Wade, and we have Hayden and Ched. Like, here's the problem I have, and this is one of the things. I've submitted a talk that I'm hoping gets accepted somewhere, but it's like, zero days, zero trust, and zero visibility.
John Strand:When you're looking at the log analysis that we have, we haven't even gotten, like, good logs and good analysis out of Azure or Entra yet. Right? And we still have a lot of these SaaS apps where the logging just sucks. Right? So when you're trying to run a SOC, how in the hell like, right now, most SOCs that we deal with, on offense and on the defensive side, are barely keeping up with the crap that they have, not even starting to look at the custom SaaS apps or cloud SaaS apps that they're using.
John Strand:So what the hell are we gonna do moving forward, as far as logging and analysis for these things? Because I'm willing to bet that there should have been, at some point and I don't wanna crap on them too much a very solid purple team against this SaaS app: an attack stimulus, a response, detects that needed to be engineered.
Wade Wells:Where do we start? Right? The logging. Here's one great, like, yeah, SaaS logging usually sucks, but I have found that if you're a customer of them and you're paying, sometimes if you say you're not gonna pay anymore, or you build it into your contracts, they will build out logging for you. I've seen that strong-arm work enough.
Wade Wells:The crazy part, I think, is a lot of the new AI tooling does not have good logging, and you have to do something very, very similar. Like, I don't think I've seen one good AI log. Any of the big AI
John Strand:Wait. Wait. Why does AI coding not have good logging? Is it Dude. Because it's built on human coding?
John Strand:That is
Wade Wells:You're about to make me flip a table. Why can't they even have, like, an export to S3 for me? Why do I have to build a custom Python script to pull from seven of their APIs just to get an okay log?
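Editor's note: Wade's pain point is a common pattern. When a vendor offers no native export, you end up writing exactly this kind of glue. A minimal sketch of what such a poller might look like; every endpoint path and field name here is hypothetical, since each SaaS/AI vendor shapes its audit API differently:

```python
import json

def normalize(raw: dict, source: str) -> dict:
    """Flatten one vendor event into a common JSON-lines schema for the SIEM.
    Field names are hypothetical; map them to whatever the vendor returns."""
    return {
        "timestamp": raw.get("created_at", "unknown"),
        "source": source,
        "actor": raw.get("user", {}).get("email", "unknown"),
        "action": raw.get("event_type", "unknown"),
        "raw": raw,  # keep the original payload for forensics
    }

def pull_all(endpoints, fetch):
    """fetch(url) -> list[dict] is injected (e.g. an HTTP client wrapper)
    so the poller can be tested offline; returns JSON-lines records."""
    lines = []
    for url in endpoints:
        feed = url.rsplit("/", 1)[-1]  # e.g. "logins", "prompts"
        for raw in fetch(url):
            lines.append(json.dumps(normalize(raw, feed)))
    return lines
```

In practice you would point `fetch` at the vendor's real audit endpoints and ship the resulting lines to S3 or your SIEM on a schedule.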
Ched Wiggins:It was my understanding that we were supposed to move fast and break things.
Wade Wells:So I one of the things
Corey Ham:with glare.
Wade Wells:Yeah. One of the things with the SaaS stuff is that a lot of the APIs usually have some sort of management of the API as well, where theoretically you can build out your own custom logging by querying that and storing the data. But then you're getting way too heavy. Right? And then your detection engineers aren't just detection engineers.
Wade Wells:They're engineering detections. Right? Like, going Yeah. Above and beyond, and this just makes me sad.
Hayden Covington:Go ahead. And say that even
John Strand:I think that's the whole game. Right? Like Yeah. I keep talking about the SaaS apocalypse that's upon us, and people can literally build their own LMS in a week if they know what they're doing.
Bronwen Aker:And It depends on how many clients you than that.
John Strand:I know. I know, Bronwen. I know you have. And and and think about how painful it was. Right?
John Strand:But now you can get something that's pretty damn functional and looks good, but it's absolutely, you know, held together with duct tape, baling wire, and toothpicks. That's where we're moving to. And I just yeah. I don't know. The whole AI thing's going great.
Corey Ham:So okay. I have two comments before we pivot to another article because we have burned thirty minutes on this.
John Strand:We should talk about Wazuh too.
Corey Ham:So Oh, yeah. So okay. I wanna give a couple of final thoughts before we close. One is I wanna be clear: the burden, the responsibility for having the ability to log this kind of essentially what I would say is a mass download event.
Corey Ham:The burden to log that properly is on the SaaS provider. And if you're a SaaS provider listening to this, do it before you get dumped by all your customers. Because every other SaaS product has had to do this when they got breached. Salesforce didn't have great logging for this. Guess what?
Corey Ham:Now they do. They got breached like 17,000 times last year. So if you want to be proactive and you're a SaaS provider, have logging for the equivalent of all your data is belong to us. You need logs that indicate that's happening, right? Like mass downloads, enumeration, even stupid things like logging the user agents that are hitting your API.
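Editor's note: for what that kind of detection could look like in practice, here is a minimal sliding-window sketch. The threshold and window values are hypothetical numbers you would tune per tenant, and the telemetry also records the user agents Corey mentions:

```python
from collections import defaultdict, deque

class MassDownloadDetector:
    """Flag any API token whose download rate looks like bulk exfiltration."""

    def __init__(self, window_seconds=300, max_downloads=500):
        self.window = window_seconds      # look-back window (hypothetical)
        self.limit = max_downloads        # per-token threshold (hypothetical)
        self.events = defaultdict(deque)  # token -> recent request timestamps
        self.agents = defaultdict(set)    # token -> user agents seen

    def record(self, token: str, ts: float, user_agent: str) -> bool:
        """Return True when this request tips the token over the threshold."""
        self.agents[token].add(user_agent)  # cheap, high-value forensic log
        q = self.events[token]
        q.append(ts)
        while q and ts - q[0] > self.window:  # expire events outside window
            q.popleft()
        return len(q) > self.limit
```

A scripted client hammering an export endpoint hundreds of times in a few minutes, with a non-browser user agent, is exactly the signal this kind of counter surfaces.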
Corey Ham:Implement that stuff now before you become the next headline. The other thing I wanted to say is I wanted to talk about the impact of the data, right? Because everyone's claiming, like I think in higher ed or in education in general, a lot of people would say like, so what? Who cares if everyone knows I got a D in physics or whatever? Like I suck at physics.
Corey Ham:Like what is the impact? Obviously they're claiming a huge, huge amount of data. Like, I forget the exact numbers, but it was something just stupid, the number of private messages.
Ched Wiggins:Billions of private messages is what they claim.
Corey Ham:Okay. So to people that work in this industry, what are those private messages? Is it a student saying, I'm drunk. I need an extension on my Shakespeare paper? Like, what is
John Strand:It's gonna be that. There's gonna be conversations between students. There's gonna be private conversations. They're gonna be highly personal conversations.
Corey Ham:Of minors? A lot of minor
John Strand:Minors? Oh my god. I didn't even think of that. Oh my god. But, like, the level of violations for PII, PHI, GDPR, like yeah.
John Strand:There are
Corey Ham:Do any of those apply? Are there any regulatory frameworks for the education industry at all?
Bronwen Aker:Yes. There are.
John Strand:Well, here, let me put it let me give you an example. I'm a student. I'm going to school. I think I have a venereal disease, and I'm talking with my friends through direct chat on that. That's PHI.
Corey Ham:Right? Yeah.
John Strand:So you have the platform, and all of a sudden now there's a bunch of data that could be classified as PII, definitely PII, but also PHI: health, mental health issues, people talking about self harm and having these types of conversations. This is, yeah, this is bad.
Corey Ham:So Yeah. So, I mean, I guess, you know, does anyone else wanna chime in on the impact of this kind of data? Obviously, social engineering, that's what everyone always says. Like, we're talking about building a pretext.
John Strand:I'm thinking extortion.
Bronwen Aker:If you have Yeah.
Corey Ham:Like, okay. Extortion is yeah. It's basically extortion and social engineering. They tie hand in hand. I mean Well Is there anything else?
Bronwen Aker:Just in terms of regulations, you've got PHI, you've got PII, you've got COPPA, the Children's Online Privacy Protection Rule. You've got that stuff going on. So many things. What I really, really hope, though, is that we get a positive outcome downstream: that more people in education, meaning educators, teachers, administrators, will realize, no, you really, really do have to take cybersecurity seriously, and this is why.
Bronwen Aker:Because we're not just protecting ourselves. We're also protecting the children who are using these systems. And, you know, there's gonna be fallout from this for years.
John Strand:I'm gonna bring in Ched on this one for no reason whatsoever. But here's here's this gets back to the dichotomy of universities. Right? And it's the same dichotomy that exists in financial institutions, the same dichotomy that exists in health care institutions, and across the board. If you look at financial institutions, health care institutions, their profits are off the charts.
John Strand:If you go and talk to the people that are working, their budgets are shoestring. Right? If you go to a lot of higher ed universities, they're like, we have an endowment that's $46,000,000,000. And it's like, well, could we update our operating systems over here? No.
John Strand:There's no money for that. Right? You constantly have that dichotomy. And, you know, I wanna get Ched once again for no reason whatsoever. Just apropos of nothing, because I think I've used that twice because that's a fun phrase.
John Strand:It's a good phrase. People need to use it more. But this is not something that's unique to educational institutions. The only ins and okay. Ched, you go, and I wanna come back in the end because I can tell you the only organizations that actually take this crap truly seriously and put money into it.
John Strand:So go ahead.
Ched Wiggins:I mean, even if I had a $46,000,000,000 budget, if I don't have the political will at the organization to recruit talent, add to the technology stack, build detections, and get SaaS platform and third party buy-in, we're not moving the ball forward. So it's a complex problem. There's not a silver bullet, unless you count education as the silver bullet, but this is gonna take some, you know, thinking on this complex problem to push the ball forward.
John Strand:Perhaps we need some internal social engineering.
Bronwen Aker:That would be awesome.
Ched Wiggins:Well, I have a lead actually, have a lead slide.
Wade Wells:So that's
John Strand:that's a class.
Bronwen Aker:I think there's more want
John Strand:you to write. I want you to write social engineering to get what you need in a security team.
Bronwen Aker:Like But I no. I think that there's more to it than just, this is gonna be used for social engineering. I think that the data that they gain from these personal conversations not only leads directly into pretext for extortion. Like, we see a lot of kids targeted with extortion scams that start as romance scams, and I could go on for hours about how messed up that is, but I will not. But 275,000,000 affiliates across 9,000 schools worth of personal conversation data, that is something I would guess people are going to be putting into LLMs to analyze, and then come up with ideas for pretexts and different scams.
Bronwen Aker:I'm not gonna call it social engineering because it'll be leading directly into scamming either the student directly, their parents, and then using this data to make those requests or those campaigns seem more legitimate coming into the person that they're targeting.
Corey Ham:So okay. Final final say, I guess. If you're in Instructure's seat, do you I mean, I know there's negotiations ongoing. I know there's negotiations between both Instructure and the threat actors and also between each individual school and the threat actors. So like, I mean, I know we always say, don't pay the ransom.
Corey Ham:Some people say, maybe you should. What are the chances that paying the ransom actually has any impact? Or is this data bound to be released anyway? Kind of like the PowerSchool thing, where they were like, oh, it's fine. We paid the ransom.
Corey Ham:And then it was, oh, you're being extorted again. Like, I'm assuming that's the calculus here, but I'm kinda worried, because it seems like schools would jump to try to contain this. But I don't think that's legit. I mean, I don't know. What does everyone think? Are these threat actors to be trusted?
Corey Ham:I don't think so. I feel like you pay the ransom, you just get hit again in a month. Right? Like, I don't know.
Bronwen Aker:Maybe even by then. There are two sides to it. Like, you can't expect the school to have the capital to pay the ransom, number one. Number two, you can't anticipate what the bad actors will do with the data following a successful ransom payment. I think double, triple extortion and then releasing the data anyway is probably a likely outcome.
Bronwen Aker:I don't think we'd win paying a ransom in any case here.
Bronwen Aker:If these malicious actors had anything resembling a moral compass Right. You have to have a moral compass to make good on what you promise to do. Well, they've already demonstrated they don't.
Corey Ham:Yes. Also well, it's beyond that. Even if we assume they were acting in good faith, the chances that they can actually properly cordon off this data, not allow unauthorized access to it, and not let anyone else in the threat group get to it first. I'm pretty sure the cat's out of the bag to some degree. I mean, who knows?
Corey Ham:Are they going to host it?
John Strand:It's There have well,
Corey Ham:it's got to be an S3. We know it. We all know it's an S3.
Hayden Covington:Like, what does Instructure gain by paying the ransom? Like, schools are not gonna stop using Canvas.
John Strand:I did look it up. The ransom is a million dollars per university.
Corey Ham:And it was, like, $19,000 billion $8,000,000. Oh,
Hayden Covington:yeah. They'll definitely pay that for sure.
Corey Ham:Yeah. Well, okay. Yes. Hayden brings up a but I know we're still just circling the drain on this. Hayden brings up a really good point, which is they're super dominant, an oligopoly or whatever you want to call it.
Corey Ham:Like, there's not really legitimate alternatives for this. Schools aren't going to switch. I mean, honestly, people in the Discord were saying, like, oh, apparently, we're just going to go back to pen and paper. Like, that's basically where we're at with this.
John Strand:Maybe it's better. I do disagree, though. I do disagree. I think that you are going to see a lot of universities that are gonna switch off. I think they're gonna look at other alternatives.
John Strand:I Yeah. I've been working at universities for a number of years, working with universities. And this is not a good thing. I just wanna make that clear. A lot of universities will look at this, and two things happen. The first thing that happens is the CTO is going to come in, or somebody, you know, the president of a university, is gonna be like, screw it.
John Strand:We're moving off, and we're gonna start a project right now to do this. And the CTO and the president or whoever are gonna start that process, sink a crap ton of money into it, and then they're gonna leave and go to another university, because universities have this constant churn in the upper echelons. And then the university is gonna be stuck with this really big transitional project that's super painful and super expensive. That's my theory, but I don't know. Other people may have a different approach.
John Strand:Hey, bud. Come on.
Corey Ham:Basically, don't pay. If you're one of these schools negotiating, don't do it. I know. I get it. Like, I understand. They're manipulating you.
Corey Ham:These threat actors are literally social engineering you into paying, and do not do it. Just don't.
Bronwen Aker:And their customer service is fantastic right up until the point where they receive the money, and then they become completely unresponsive.
Corey Ham:Yeah. Yeah. Yeah. Don't trust them. They're manipulating you just like they manipulated the poor people who were part of the breach.
Hayden Covington:It it is sadly too late.
John Strand:Yeah. I did wanna point out the two industries that actually take security, like, crazy, crazy serious: attorneys and investment firms. And the reason why attorneys and investment firms take security so seriously is their whole entire, like, value is their reputation. And it's also very easy for many people to pick up and go with a different attorney or pick up and move to a completely different investment firm. Those are the only two groups that we see that consistently have good security.
Corey Ham:So Alright. Moving on. John, you said you wanna say, Wazuh. Wazuh.
John Strand:I love this story. So I just shared the link in chat so Ryan can bring that up. And I just wanna go over why I love this one so much. One, Wazuh's write-up is amazing. It's on their own GitHub repository.
John Strand:I don't know where Ryan is. I think we lost him.
Corey Ham:I just pasted it in Discord.
Bronwen Aker:We're good.
John Strand:Oh, there we go. So I love this, and I want you to go through this with me, because this is just beautiful. First, it's a cluster sync path traversal in decompressed files that enables arbitrary file write and code execution from an authenticated cluster peer. If you go down, they have here's the problem in the code. This is the vulnerable code, and these are the lines of the vulnerability.
John Strand:They describe the vulnerability. Then they talk about the different Python scripts that actually call that particular function that has the vulnerability. So they break that down. Then the cluster sync archive format, which embeds the file path and the payload, and then you get down to the proof of concept code. If you keep going down oh, what?
John Strand:Stop right there. Stop right there. You can see the two exploitation variants: the relative traversal, which is dot dot backslash dot dot backslash dot dot backslash, and the absolute path injection to run the backdoor. So that is just fantastic. Keep going down.
John Strand:And then Wazuh, in their GitHub repository, has also released the proof of concept code for the vulnerability as well. This is, like, amazing to me. I know it's kind of a dumb vulnerability when we're talking about directory path traversal vulnerabilities. They've existed since the early days of IIS and Rainforest Puppy. It's been around.
John Strand:It should have been tested for. I understand that, but it doesn't sound like it's directly accessible through the web. But whenever you're talking about disclosure from a vendor about a vulnerability, to give this much data and this much explanation and POC code, like, just well done.
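Editor's note: the usual defense is the same for both exploitation variants John walks through: resolve each archive member's destination against the extraction root and reject anything that lands outside it. A minimal sketch; the directory names are illustrative, not Wazuh's actual layout:

```python
import os

def is_safe_member(base_dir: str, member_path: str) -> bool:
    """Reject archive entries whose resolved destination escapes base_dir.
    Catches both relative traversal (../../..) and absolute path injection,
    because os.path.join discards base_dir when member_path is absolute."""
    base = os.path.realpath(base_dir)
    target = os.path.realpath(os.path.join(base_dir, member_path))
    return target == base or target.startswith(base + os.sep)
```

The check has to run on every entry before anything is written to disk; trusting "authenticated" peers to send clean paths is roughly the gap the advisory describes.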
Bronwen Aker:This is impressive. Yes.
Corey Ham:I don't I
Bronwen Aker:don't think I've seen anything like this before. I mean, it's conceivable. Wow.
Corey Ham:Hey, John. What is it, just Elastic but not? What is this?
John Strand:Yeah. It's an open source EDR. So, yes, they actually do use Elastic. I think they still use Elastic, using the full ELK stack with it. But they are an EDR, and how they make their money is they release an awesome tool to the public, and they charge for services for implementation and maintenance and things like that.
Corey Ham:So it's like the TrueNAS of EDR?
John Strand:Yep. Yeah. I would kinda go with that. That sounds good. So nothing they're awesome.
John Strand:They're you know, we're big supporters of them in our classes, and a lot of small MSPs and small businesses use them because they're good enough and they're cheap enough and gosh darn it, people like them.
Corey Ham:Are you telling me that there's an antidote to all this insane SaaS extortion monopoly stuff and it's open source? Is that It could be. Is that what you're saying?
John Strand:Yeah. That's what I'm saying. It's open source. The future is open source.
Hayden Covington:If you're on Canvas, switch to Wazuh right now.
Bronwen Aker:My gosh.
John Strand:The quality of our interns just they pointed out that they moved away from Elastic a few years ago. They moved to OpenSearch, which is the Amazon FU version of Elastic. So just thanks for the clarification.
Corey Ham:Well, yeah, it makes sense, because Elastic is doing their own VC monetization strategy that would conflict with Wazuh. Yeah. Alright. What's the next story? What else is going on? I mean, Canvas is the big thing.
Corey Ham:Apparently, Google Chrome is now rolling out a four gig AI model with every install.
John Strand:I don't need that in
Corey Ham:my list. Are they shoving
Bronwen Aker:it into the It's your
Ched Wiggins:silent install.
John Strand:I mean, me, four
Bronwen Aker:gigs is
John Strand:quickly go into incognito mode. I don't understand. I don't what? Sorry?
Corey Ham:Four gig listen. Four gigs nowadays. That's one tab. That's one tab. So it's all good.
Corey Ham:I mean, I don't know. It is what it is. On device honestly, I would say don't don't be upset. Don't be okay. As long as you can opt out of this, I don't think you should be upset because on device AI is actually more private and secure than doing everything through Google.
Hayden Covington:Right? They're already using all of your space anyway. Like, Chrome uses all of your RAM. We might as well just add even more.
John Strand:So I've been thinking a lot, Corey. A couple of weeks ago, we were talking about the AI wars, and you were talking about Anthropic and OpenAI. And you're like, Google is going to win this. And, you know, just the chipping away, they have money to burn. They can literally stand to lose the money that Anthropic and OpenAI are losing.
John Strand:And I just see this as, like, you know, another brick in the wall. Like, they just keep adding AI in their overall portfolio. Anybody that's using Google is now using Gemini. It's bringing the AI results right to you. Now we're putting it inside of Google Chrome.
John Strand:I just think that this is just a slow march of Google winning the AI wars.
Bronwen Aker:Yeah.
Corey Ham:Yeah. I mean, me personally, in this moment, I'm like, I would much rather have my browser doing on device model stuff instead of sending everything to Google. Now I know that it's probably still sending everything to
John Strand:Google. Yeah. That's
Bronwen Aker:that's the key. Right?
John Strand:That's the key.
Corey Ham:But but the concept of on device is great. That's how Apple does all their AI stuff. Granted, it's absolutely terrible in every way. But like that, I think, is the secure and private way to do AI stuff. And like some of the use cases they give, I think are kind of interesting, like, you know, on device detection for fraud and safe browsing and stuff like that.
Corey Ham:I don't know. We'll see. But either way, if you run out of disk space, first of all, how did you only have four gigs? Also, Wade, you're muted. And I wanna hear this spicy comment you just made.
Bronwen Aker:So on mute right now.
Wade Wells:Dude, I've been muted for so long. I've been making spicy comments. I'm over here reading things.
Corey Ham:When you go
Wade Wells:to the back to
Corey Ham:me, I'm like
Wade Wells:out of the frontier models, like, I will admit, I haven't seen anyone using, at least, like, my connections using Google
Hayden Covington:at a
Wade Wells:at a production level. Right? Like, and it What about Apple though? It's I'm not talking to them. It's it's missing so much functionality the lot because I I've seen it.
Wade Wells:I've used it. But, like, it didn't even have ways to share things. It didn't have ways to work together. They only have, like, a couple different functionality. Like, I I agree with you
Bronwen Aker:couple weeks.
Wade Wells:With your slow march. It's been a year, like, since I've had both. Yeah. And, like, think
Corey Ham:about Copilot still doesn't have agentic coding either. Copilot still just like, what's my SharePoint files? Here's all your SharePoint. Like, it's still completely useless, and companies are paying $50 a month for that.
Hayden Covington:Well, I feel like you're talking about, like, two different markets. Like, ChatGPT is in, like, the AI chat market, and Anthropic is in the market for development. Google is in, like, we will build a pretty decent model, and you will be using it in all of these tools that we have that you already use anyway, and you're gonna not even realize you're
Wade Wells:So so there's how they're getting
Bronwen Aker:it. I've heard some pretty amazing things about NotebookLM, which is a Google product. Yeah. And I've seen some pretty impressive demos.
Wade Wells:I wasn't I wasn't impressed with it.
John Strand:I wanna throw this out there, though. Like, I think what Google is doing is: let OpenAI, let Anthropic be bleeding edge. Let them step out. Let them find the features that their customers want and the ones they wanna ignore, and then they're in a goddamn thing that is like, oh, well, Anthropic beat
Wade Wells:Did John go? There's a just John, you muted yourself.
Bronwen Aker:Okay. It wasn't just me.
John Strand:Every time I did my microphone.
Corey Ham:In his rage, he muted himself.
John Strand:All they have to do is just sit back, watch what people are using, and then start implementing those features. I just think it's a good strategy for them in the long run. Now it's Google. They're like the Microsoft of today. I don't know.
John Strand:Microsoft's the Microsoft. But but it's totally possible that they can screw this up because they have the unlimited money glitch in their favor. But I just think that they're
Corey Ham:gonna sit back.
Bronwen Aker:All the AI companies are already doing that. I mean, they're like babies in a nursery. One starts to do one thing and then, oh, gee. You took it for a year or two. So I think, right.
John Strand:I That's
Corey Ham:that's a great analogy.
Wade Wells:Instead of talking about AI all day, let's talk about free, newly open source security programs. I'd like to talk to you
Corey Ham:about Trellix. Oh, Trellix? Yeah.
Wade Wells:So Trellix is the big security company that, like, well, anytime I hear Trellix, I just think, you ate more security companies, and now you're Trellix, type of deal.
Corey Ham:That and Did you get wait. Did you
John Strand:get Trellix? You look rough, man.
Wade Wells:Bro, sometimes I feel Trellix. Did I just I love the
John Strand:Have you thought about getting Trellix? Oh my. Something?
Wade Wells:Or I'm gonna make that a t-shirt. Did you get Trellix? But it seems like their entire software base, or at least a big part of their software base, got completely leaked this week, which is never good for a security team, let alone a closed source security team, unlike open source where you're always leaked. But it went to RansomHouse. Right?
Wade Wells:Hit them up, and fun times. Right?
Hayden Covington:I so I think Ritech in there, doesn't it?
Wade Wells:Yeah, dude. That's their email. That don't get me started about it.
John Strand:No. No.
Corey Ham:Go go away. McAfee, and they own FireEye.
Hayden Covington:Oh, yeah. Don't wanna talk.
Corey Ham:Is it one of those companies that you don't think of as an industry leader, but then somehow they have, like, you know, freaking $2,000,000,000 in revenue? It's like an IBM. It's like, no one uses them, but then they just still have $2,000,000,000 coming into their pocket every month or whatever.
Wade Wells:I that is exactly them. It's it's just so wild.
John Strand:I've I
Corey Ham:mean, how did it happen? I mean, I'm just gonna assume social engineering. Like There we go. They that's social engineering.
Wade Wells:They have not responded to comments. The company confirmed the breach: Trellix recently identified unauthorized access to a portion of our source code repositories. Upon learning of this matter, we immediately began working with a leading forensic expert to resolve it. You're telling me they don't have the forensic experts in house to resolve it? They gotta go
Corey Ham:I do. No. I'm
Bronwen Aker:not. I am not gonna take
Corey Ham:Well, that's the Mandiant part.
John Strand:I'm not gonna dig them up. Okay. No.
Corey Ham:No. Hold on. New new question. How much of this company's revenue is from people unintentionally subscribing to McAfee when they buy a new computer and they don't know what they're doing?
Wade Wells:Dude, you know how many people tell me that? I don't have McAfee. I don't have computer security even to this day. And I'm like,
Hayden Covington:They said 53,000 customers. That's gotta be at least three quarters of that.
Wade Wells:It's over it's over 9,000.
Bronwen Aker:I don't
Corey Ham:know about it. I
Bronwen Aker:think none of us would be shocked at how many of these services are being outsourced to other vendors. I get asked to test vendors all the time. Yep.
Corey Ham:True. That's a good point. That's a really So
Bronwen Aker:you got, you know, on demand, like, during incident vendors, and those are probably the folks that they're bringing in. At this stage, just with my experience with tabletops, we usually have to loop in external legal, external support for incident response. And a lot of times, you've got two individuals that are responsible for all the security in the entire organization, which is a little scary that teens are so under resourced even today.
Corey Ham:Nice. I agree. We've had a few companies reach out to us to have their vendors tested, and it's always like, you guys trust this vendor, who you barely even know, to do all this? Are you sure that's a good idea?
Bronwen Aker:It actually hurts my soul when I pwn the living daylights out of, like, tier one basic, you know, tech support for business. And it's because it's a completely outsourced team that's managing, like, many multiple clients' help desks, and they just kinda, like, boop into the right one when they get the call.
Corey Ham:God. Yeah. It's painful. That's the model for help desk.
John Strand:When that happens, it's always a pain in the ass because I I think I think you'd agree. Like, a lot of times, you just gained access to a lot more than what was in scope.
Bronwen Aker:Yeah. And then I realized how, like, tricky this is going to be to explain, because ultimately, that individual has access to so much data that is not even my client's. And then I think there's also some scoping conversations that have to happen before we agree to test third party entities that don't know they're part of the test, in many cases.
John Strand:Well, we had one where we were going after their help desk, and they never told us it was a third party help desk.
Bronwen Aker:That's actually how I find out. Yeah. They're like,
John Strand:once you're
Bronwen Aker:the behavior of this person seems like somebody who doesn't work there. And then I go back and ask the POC, hey, what's going on here? They're like, oh, yeah. Totally different company.
Bronwen Aker:I'm like, when were you going to tell us?
John Strand:Would have been nice.
Corey Ham:Yeah. No. The other thing we've seen a lot is, like, companies have multiple help desks. So, like, their on-hours help desk is in house, and then their after-hours help desk is Yeah.
Bronwen Aker:Different numbers.
Corey Ham:And, you know, the outsourced people are basically just sitting around with Okta god admin keys. It's fine.
Wade Wells:I mean, flashlights.
Corey Ham:I
Bronwen Aker:I cannot imagine why social engineering is so successful. Because in a lot of cases, the majority of these employees are trained purely for excellent customer service and customer experience, and not really trained toward proper validation that this person actually works for that company, because they don't work there themselves. They don't know. But there's this level of assuming that the person must work for the company if they have the internal help desk number, which I can usually just call a branch and ask for, and then I'm in.
Corey Ham:Yeah. No. It's bad out there. Honestly, the incentives are just incredibly misaligned. Like, in the it's it's similar to the MSSP space, you know, like any of these outsourcing type scenarios, the incentives don't align with security almost ever.
Corey Ham:It's always cost reduction, you know, capabilities, security is an afterthought.
Bronwen Aker:I'm absolutely shocked.
Corey Ham:What a thought. What else we got? You got any chicken articles? Let's see.
John Strand:Do. Apparently chickens are
Corey Ham:We technically do have a chicken article, and that chicken article is that Rose Acre Farms, America's second largest egg producer, was hit by Lynx ransomware.
Wade Wells:You ready for egg prices? Friday, we
Hayden Covington:If you try to crack your egg
John Strand:and get something expensive eggs, because this is how you get expensive eggs.
Ched Wiggins:This is intolerable.
John Strand:This is ridiculous.
Corey Ham:Seriously, though. What like, what do these if you're a chicken company, what infrastructure do you actually have and what do you need?
John Strand:Actually, there's a lot there's a lot of There's
Corey Ham:There's There's a lot Distribution and ordering all out. What about egg based local language or large language models?
Hayden Covington:And your chickens are all IoT, you gotta make sure.
Bronwen Aker:I wish I had. I hope
Corey Ham:it's gonna be a lot cheaper.
John Strand:Like, owning chickens, I could totally see them having video, like, monitoring by AI, and saying, hey, this group of chickens is starting to exhibit, like, weird behaviors.
Bronwen Aker:So she
Corey Ham:Do you not remember on this show where we talked about the AI translation app where you can talk to chickens?
Bronwen Aker:Oh my god.
Bronwen Aker:It translated their squawks to let you know how happy they were.
Corey Ham:Correct. It's like AI sentiment analysis, but for chickens.
John Strand:Oh my god. I still like the Far Side cartoon where the guy builds the device that translates what dogs say when they bark, and the only thing they say is, hey. Hey.
Corey Ham:Alright. So before we close out, we should let Wade plug his upcoming workshop this week. Right, Wade?
Bronwen Aker:Oh, yeah.
Corey Ham:On the fifteenth. What is that? Friday or something? Yeah.
Wade Wells:Friday, I am doing a four-hour workshop on threat actor profiling. It should be really fun. I've added in some pretty cool labs. This ties directly into a one-oh-one two-day course that's gonna be in June. This definitely has a lot more about threat actor profiling than that course actually does.
Wade Wells:So that'll be fun. I actually was gonna throw something in here about reading blog posts and looking for, like, opsec failures, and I feel like they wanted it. Like, I can't fault them. That's too good. Like, do you give
John Strand:them vulnerabilities? You can use that as a shining example of how to do it right.
Wade Wells:That's exactly what I'm probably going to do, to tell you the truth. But it should be a fun course. I think it's only four hours. I don't remember the times or dates. I just wake up, and Ryan sends me invites to Zoom and I
Corey Ham:open it.
John Strand:I got into an argument with my wife.
Corey Ham:For four hours, and hope you did a course.
Bronwen Aker:Yes. But
John Strand:I haven't slept much in the past, like, three days, but I got into an argument with my wife about what day of the week it was. She was like, it's Tuesday. And I'm like, no, hon. It's Monday.
John Strand:And she's like, no. No. No. It's Tuesday. It's Tuesday.
John Strand:We flew home on Monday. I'm like, no. We moved it to Sunday. And when she found out it was Monday, I think she started, like, choking up. Like, oh god.
John Strand:It's only Monday. No. There's been a lot of time zones, folks. It's been a lot of time zones. So
Corey Ham:Alright. I guess, anything else to plug? There's a threat hunting summit coming up June 17. That's, like, next year as far as I'm concerned. I don't wanna
John Strand:Yeah. That's that's infinitely
Corey Ham:Alethe has a webcast in two days, how to build a bulletproof pretext. And then later this month, a similar workshop. Yep. Well,
Bronwen Aker:I'm doing a webcast on the twenty-first too, and I haven't figured out what I'm gonna talk about. So if there are any requests, put them in Discord and I'll take them in.
Corey Ham:we should talk about using Notebook LM. Sounds like that was the, you know
Bronwen Aker:Well, I've dabbled with it, but nowhere near enough. The problem that I'm running into is that you can't go deep with every frontier model or tool out there. There just aren't enough hours.
Corey Ham:Can spend ten days deep diving
Bronwen Aker:and then give us less spoons.
John Strand:You know what? Here's a request for you, and I think this is something that you already have bits and pieces of: I would like to see a webcast that's, like, straight up, top five or six things about prompt engineering for infosec professionals. Done. And how to do it properly. Not general, because all the presentations are like, here's how you use a push.
John Strand:Here's how you use a pull. Here's how you do this. And it's very general. I would like it security-specific. Like, what are the six rules for infosec prompt engineering that everybody needs to know right out of the gate?
John Strand:So that, I think, would rock.
Bronwen Aker:Alright. I reserve the right to possibly take it up to as many as 10 or drop it down to five.
John Strand:I trust you. You do
Corey Ham:that. Sweet. Sounds fun. Alright. Alright, everyone.
Corey Ham:Thanks for coming. We'll see you next week. Bye bye.
John Strand:Do I have to go?