A weekly podcast with BHIS and friends. We discuss notable infosec and infosec-adjacent news stories gathered by our community news team.
Join us live on YouTube, Mondays at 4:30 PM ET.
You were gonna do a podcast now, Derek.
Derek Banks:Okay. That's fine. I don't usually know where I am.
Corey Ham:You are in the interwebs. You're in the tubes. Sweet. I feel like using the word interwebs definitely makes me sound old now.
Derek Banks:That's one of those words I put up there with sports ball, like I know what you mean, but do you know what you mean?
Corey Ham:How do Gen Z people talk about the Internet? Do they even talk about it, or is it just like the air, they don't mention it because it's always there? Like, I don't know.
Derek Banks:I wonder that because, you know, the current iteration of the Internet wasn't around when I was a kid. Right? We had, you know, bulletin boards and dial-up kind of stuff, but that really was, you know, toward the early nineties. Or, as my kids would like to point out, the late nineteen hundreds.
Corey Ham:You were born in nineteen-something. Once you say nineteen, you're like, oh, I'm old. Basically. Yeah.
Derek Banks:But, you know, I just wonder if, like, the Internet is just something that my phone does to my kids. Right? Like, they just assume it's a thing. They never knew life before it. So.
Corey Ham:Yeah. I would say, like, I would go install TikTok and then you'd find out, but I'm definitely not gonna do that.
Derek Banks:So
Corey Ham:Yeah. Me neither.
Bronwen Aker:Yeah. TikTok and Alexa are both things that will not enter my house.
Derek Banks:I have an Alexa out in my garage just because it's easy. Like, if I'm doing something, I just tell it to play music, and I have it hooked up to Sirius. Like, its only function is to do that. If I was a better nerd, I would just hook up a Raspberry Pi or something to do that. But
Bronwen Aker:I'm just more concerned about it eavesdropping.
Derek Banks:I just assume everything is doing that because it's fun. Me and my wife will talk about something. And I used to not believe in, you know, the phone eavesdropping on you. I was like, it's just a coincidence. You must have searched that before.
Derek Banks:And then we'll talk about something, and then later, Instagram will just start giving her ads. Like, there's no way that she's getting ads for jujitsu gis by chance, because she doesn't do that. But we were just talking about me getting a new one. Right? And so I do think
Corey Ham:helping you.
Derek Banks:Yeah. Well, soon we'll have, like, agentic shoppers where I just say, hey, go book me a plane ticket. And it goes off and does all the research and legwork, and next thing you know, it's using my credit card for me. That's not terrifying at all.
Bronwen Aker:for you now.
Corey Ham:I Yeah. I was
Derek Banks:gonna say, do that. Yeah.
Corey Ham:That's here. It's just only here. It's there on the bleeding edge if you wanna go ahead and risk it. You think your credit card company is gonna be cool with that? Like, this is an unauthorized purchase, but I did authorize the AI to make the purchase, but it went crazy and bought me a first class ticket to Dubai because it saw a really good ad for why I should go to Dubai, and so now I'm in Dubai and I need a refund.
Derek Banks:It ordered a Ferrari. I don't know why. It just ordered one. The Ferrari would get me there ten minutes quicker than actually flying if I did 180 on the interstate.
Bronwen Aker:You can take an interstate to Dubai?
Derek Banks:Yeah. Exactly. It was a hallucination, Bronwen.
Bronwen Aker:A hallucination or a confabulation?
Derek Banks:One of the two. I don't know. One of the two.
Corey Ham:I'm glad we have the AI experts here on our news show, because there's a lot of AI articles this week, as there is seemingly every week. But I mean
Bronwen Aker:Yeah.
Derek Banks:Have you done a news show in the last year that didn't have at least one AI-related topic in it?
Corey Ham:Probably, but I forget what it was.
Derek Banks:Was that when everybody just said, screw it. We're not gonna talk about AI?
Corey Ham:We just skipped it. Yeah. It'd be fun if we did, like, an AI-free week on the news. Just talk about, like, good news.
Corey Ham:There's a new recipe for cereal where they can reduce the cereal by 12%.
Derek Banks:They have a new
Corey Ham:That's all we got.
Derek Banks:Non-nutritive cereal varnish for us to try out.
Corey Ham:Okay. I do have a little bit of pre show banter. Okay. So I saw this Reddit thread that just kinda broke me and made me laugh, and I'm curious if you guys think it's funny too. So basically, someone posted to Reddit, and they were like, I think rice is too small.
Corey Ham:Like, I wish it was bigger. Like, I wish rice was just one big, like, loaf of rice instead of individual smaller grains. Like I wanna just eat like one or two rice. And then and then the reply the first reply was, that's a potato. It's just one it's just a really big rice.
Derek Banks:That's probably not too far off.
Corey Ham:I know. Right? Hello, and welcome to Black Hills Information Security's Talkin' About News. It's 02/23/2026. We're here with AI-generated individuals, including me, myself, Corey Ham.
Corey Ham:I have an eye test in my description. If you can read that, you should see a doctor, because your vision is way too sharp. We have Bronwen, who also has a funny bumper sticker. Do not tailgate Bronwen. That's what we've learned.
Corey Ham:Although, do tailgate with Bronwen. She has good recipes. She's got some jams. She's got cakes. Even if the cakes are AI generated, that's okay.
John Strand:We got Brian Fehrman. AI cake.
Corey Ham:Oh, wait. John's here? We got John. He's, like, phishing us.
Bronwen Aker:Is it a John sighting? Hello,
Corey Ham:John. Mhmm.
John Strand:I every once in a while show up to my company.
Corey Ham:John, who's possibly in his closet, will confirm that later. We've got Brian, of course, our resident, oh, yep, confirmed, who's our resident PhD, in the room to be an adult, I guess.
Brian Fehrman:Hey, everyone.
Corey Ham:You're gonna be an adult? Do you promise not
John Strand:to be an adult? We know that's not true. They're not the adults in the room.
Corey Ham:So just promise. He didn't put PhD in his name, though. So you know he's not one of those PhDs. No.
John Strand:He's not. No.
Corey Ham:No. Yeah. Not like Derek, who is one of those PhDs.
John Strand:I am not a PhD. But if
Corey Ham:you were, you would be. Right?
Derek Banks:I don't think so.
John Strand:Hold on. Hold on. Two things. One, Brian, you owe me your thesis, because everybody whose degree I pay for, I put on my cookbook shelf. Okay.
John Strand:I need that. Derek, I need yours too. But, Derek, I thought you were thinking about taking some time off and going for your PhD, though.
Derek Banks:Oh, god. No. Oh, lord. No. I didn't really
Corey Ham:John's, like, trying to make tacos, and he's reading some dissertation about, like...
John Strand:Yes. That's what I do. That's it. I'm excited about it.
Derek Banks:I didn't have a thesis. I had a capstone project, and the paper was published, so I can just go get you the paper, and you can print it out and read it.
John Strand:Send it to me in hardcover.
Derek Banks:That's what I would do.
Corey Ham:We got Shecky. Oh, he's here. He put his real name. That's scary. I'm just gonna call you Shecky, if that's okay.
John Strand:That's perfectly fine.
Mike "Shecky" Kavka:That's what everybody calls me, but anyways,
Corey Ham:Okay. And we've got Megan, of course, who made the title so small. No. Not really. That's not true.
Corey Ham:She did notice it though before anyone else. Yeah.
John Strand:Alright. Yeah.
Corey Ham:We appreciate you. Thank you for being here. Alright. So let's start off with this Keep Android Open thing. This is, I don't know.
Corey Ham:Like, is this a thing? I kinda messaged a bunch of people and was like, is this a real thing? And basically, I think it's fair to be angry about this. I think it just goes away from the spirit of Android. A lot of people were sold Android as a, you know, open platform, and now it's not so open anymore, or it's potentially going to become not so open anymore, and people are upset about it.
Corey Ham:I guess, John, you were the one who sent this in. So do you have a hot take on this?
John Strand:I do. And this is gonna be a monologue, and I apologize, but no one here is not used to that. So Cory Doctorow's presentation, The Coming War on General Purpose Computing, is absolutely required reading for anybody to understand where we're at. This is not about security, even though security is part of it. It's predominantly about how you can lock people into an ecosystem where the only way that they can load apps is through your store, where you get a percentage cut from the sales.
John Strand:So an example would be, I have an app, let's say... oh, Kindle. Alright. So I have a Kindle app on my iPhone. Right? If I want to buy a book on my Kindle app, it has to take me to my browser, which is logged into Amazon, and then I can purchase the book.
John Strand:And the reason why Kindle is doing that is because iOS and Apple have made it so that if you have any in-app purchases, then they want their percentage and they want their cut of every single one of those purchases. So it's not about security for a lot of this stuff. It's all about making money on these particular things. Right? So if you're looking at this, Google wants to do the same thing, because with Google, you can enable third-party applications.
John Strand:Remember Fortnite? You used to have to install Fortnite by enabling third-party applications and then downloading it. It was a pain in the ass. But the reason why is that inside of Fortnite, you could make in-game purchases, and they wanted to do that outside of the Google Play ecosystem. So what's happening now is you have Microsoft, Google, and, of course, Apple all forcing everything to go through their ecosystems.
John Strand:Right? You have to go through their stores. You have to go through their verification for absolutely everything that you do. And this bothers me for a couple of reasons. One, it does somewhat help security, I think, but we see lots of examples of malicious apps making it through.
John Strand:The other reason why this bothers me is it does actually make security testing more difficult. With a general purpose computer, you can download software, you can install it, you can evaluate that software relatively easily. With these particular ecosystems, it's not very easy. Like, getting root-level access on your device, while possible, is not a given. So that means that there are wide areas that are not going to get the level of scrutiny and research that we talk about all the time on the show dealing with privacy.
John Strand:Right? How much data are apps giving away about you that you don't feel comfortable with them giving away? The problem is that in these types of ecosystems, it becomes more and more difficult for us to identify how much of our privacy is being lost. So this is bad for a number of reasons. It's just one more notch that's turning.
John Strand:And honestly, we need a full, true open source Linux phone. I know people talk about Librem phones and, I can't remember, the Graphene. Yeah. GrapheneOS and all of that. We need these things, but they're making it more and more difficult to install these third-party operating systems onto existing hardware platforms. Like, even now, you used to be able to get a Pixel, and you could install a bunch of these third-party operating systems onto existing Pixels, and they're discontinuing that Pixel line and making it more difficult.
John Strand:So there's a lot wrong with this. It's difficult to test the security. It's difficult to validate the privacy issues, especially as we're moving into the age of AI. It's getting more and more locked in, to where you're spending more and more money just trying to get third-party apps to work, and they're taking a bigger and bigger cut out of it. We're getting higher and higher on the oligarchy scale.
John Strand:And I'll leave one thing. If you want to look up something terrifying, look up the Gini index. The Gini index is a measure of economic dispersion. A zero would mean everyone makes the same amount of money. A 100 means that only one person makes all the money and no one else makes anything.
John Strand:Right? The French Revolution happened at a Gini index of 82, and we are currently at 83. So there's more and more consolidation into these oligarchs, into these people who are running these apps. We need diversity.
John Strand:We need competition. We need open source platforms, but they keep getting cut back. So that's my rant on that and I'm gonna step back.
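John's zero-to-100 framing of the Gini index maps to a short calculation; here's a minimal sketch, with income lists that are made up for illustration, not data from the show:

```python
def gini(incomes):
    """Gini index on a 0-100 scale: 0 means everyone earns the same
    amount; 100 is the limit where one person earns everything."""
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    # Rank-based form of the mean-absolute-difference definition:
    # G = sum_i (2i - n - 1) * x_i / (n * sum(x)), with 1-based rank i
    g = sum((2 * i - n - 1) * x for i, x in enumerate(xs, 1)) / (n * total)
    return 100 * g

print(gini([50, 50, 50, 50]))  # perfect equality -> 0.0
print(gini([0, 0, 0, 100]))    # one earner -> 75.0, approaching 100 as n grows
```

With a single earner the value approaches but never quite reaches 100 for a finite population, which is why reported national figures (like the 82/83 John cites) sit below the theoretical ceiling.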
Corey Ham:Nice rant. 10 out of 10. Would rant again. I mean, the one question I have, and I don't know if anyone knows this, is what is a certified Android device? Is that, like, the new Pixel series? Because presumably you could still do this, you could still install GrapheneOS, and none of this applies to you.
Corey Ham:Right? Then you can use F-Droid or whatever. Right? Like
John Strand:No. No. No. But it makes it more difficult because part of what they're also doing is getting rid of the Pixel platform. Where the Pixel platform was the official platform of Google, and it had the, like, maximum level of compatibility.
John Strand:If they get rid of that, then there is no dedicated platform. If you're a developer of GrapheneOS, you could make it work with that platform and know that it works. Without it, compatibility gets far more difficult, because now you have all of these other vendors, and you have to do QA/QC across all those vendors to make sure that everything works. So there's a lot of moving parts and a lot of these things tied down. Graphene did a blog post on this a couple of months ago.
Corey Ham:I mean, to me, this kind of kills part of the value of Android, which is as a privacy option. Right? Like, obviously there's still Graphene, but I guess there's a chunk of people who are like, if I'm picking which big tech company I'm gonna trust, it's gonna be Google over Apple, and maybe who this is for is, like, I wanna trust Google implicitly instead of trusting Apple implicitly. But I feel like a lot of people were sold Android as an open platform, and now that's changing, and, you know, I think it's fair to be upset about it. But also, I was kind of blown away that, to this day in Android, you can just be an anonymous app developer and publish an anonymous app into the app store like that.
Corey Ham:That feels kind of... I mean, anonymous is one thing, but, you know, I don't know. That just feels kinda crazy, that you can be like, I'm not an APT. Here's my totally legit photo ID. You know? Like, I don't know. I'm sure there'll be bypasses for all that.
Derek Banks:I don't know. I guess I kinda have a slightly different take on the Google versus Apple thing. I think I trust Apple a little more than Google. I don't know why. Maybe just because of reading The Age of Surveillance Capitalism, and, you know, Google invented surveillance capitalism. And I don't think Android's ever been a privacy platform.
Derek Banks:I gave up that idea the first time I looked at a tcpdump of, like, traffic from an Android phone. Yeah. It's not private at all.
Corey Ham:Yeah.
Derek Banks:I do think we need the open stuff. I would love to have a laptop as powerful as the one I have right now that was completely running Linux, where everything worked. And when I say everything, I really mean that screen sharing and Office stuff works, because that was my last experience trying to run Linux: it was great until I had to do, like, normal Office stuff. Then it really sucked.
John Strand:Yeah. Well, and the other thing that people should watch if they're looking at surveillance capitalism is, oddly enough, on YouTube. It's called The Creepy Line. Mike Felch, who worked with us, recommended that to me. And it's terrifying, because Google's whole market idea was to push that surveillance capitalism right up to the creepy line, staying just underneath it, where they didn't wanna cross over and become too creepy.
Corey Ham:Alright. Speaking of creepy, next article. This is probably gonna be a short one. I don't think anyone will disagree or have hot takes on this, but I'm sure I could be proven wrong. So Meta patented technology that is gonna use AI to take over a dead person's account so you can keep talking to them after they are dead.
Derek Banks:This crosses the creepy line.
Bronwen Aker:Nothing creepy about that at all. No.
Mike "Shecky" Kavka:As somebody that is in charge of both my late parents' Facebook accounts, which don't have much on there, I find that completely irresponsible, quite honestly. It's the worst idea of AI usage. And I understand that there are some people... and again, from my point of view, would I love to hear my parents talking to me at some point in time? Yes, I would.
Mike "Shecky" Kavka:It would be a dream to go ahead and be able to say, Mom, Dad, and hear their voices again. Mhmm. You don't know what you've got until it's gone. But they're preying on people that want to go ahead and have these messages. And a lot of times, this is not gonna go right, because even if they did it, it doesn't have the data points to go ahead and truly be what this person's personality would be like.
Mike "Shecky" Kavka:Unless they posted a ton of stuff, there's way more that could go wrong with this than would ever go right.
Corey Ham:Yes. I think... no. Fully agree.
John Strand:Bo Burnham was the guy that did Inside on Netflix, the thing that was huge during COVID. And he had an interview where he's like, you need to understand that these people are coming for every waking moment of your life. They want every possible moment. They want you to be addicted. They want you to be hooked in.
John Strand:And the thing that bothers me about this is they're gonna be trying to market and, like, seriously monetize us after we're dead. Like Mhmm. That feels... I mean, if you talk about this as way across the creepy line, that really bothers me. I know this sounds weird, and I don't want anybody to think that I'm suicidal or anything. But when I'm dead, I wanna be dead.
John Strand:I don't want to live on in Facebook, because that's hell, and I don't wanna die and go to hell. I don't wanna be a Facebook...
Corey Ham:There's proof of the afterlife finally, and it's in Meta. Yes.
John Strand:I Have No Mouth, and I Must Scream, but here's my Instagram reels. That's nuts.
Corey Ham:I just love the idea that they would do it, and then they would monetize it. So it'd be like, hey, mom, I miss you. I miss eating the delicious taste of SpaghettiOs, son.
Derek Banks:It's just
Corey Ham:injecting ads into a dead person. Like, I'm sure they would pull that off. Would you like to...
Derek Banks:Or not have a social media presence? Like, I really don't Facebook or Instagram or tweet or anything, because I've thought it's been stupid forever. Right? But whatever. I mean, I get it.
Derek Banks:I've never been more glad not to have that content out there.
Corey Ham:Yeah. To be fair, though okay.
John Strand:Hell yeah. People are creating AI chatbots of me because there's a lot of me on the Internet. Yeah.
Corey Ham:Yes. If you get a call from John Strand, it's probably not real. Yeah.
Mike "Shecky" Kavka:I remember...
Corey Ham:I remember
John Strand:a couple years ago.
Derek Banks:You're muted, Shecky.
John Strand:Hey, Shecky. Go ahead.
Bronwen Aker:I remember a couple years ago, when I first started reading articles about people using AI to recreate a loved one that they'd lost, it was part of their grieving process. And then, very shortly after that, it was turned into a product. Grieving involves letting go. I mean, come on. Yes.
Bronwen Aker:I have people in my life that I wish... I wish I could sit down and have a conversation with my dad today, knowing what I know now that I didn't know when he was still alive. But the problem is, and I don't know who it was who made the comment about data points, I don't have enough data points. And even if I did have a lot of data points, it still wouldn't capture him. It still wouldn't be my dad.
John Strand:Bronwen, here's the thing that scares me. And this goes back to, I can't remember what that series was called. Westworld. Right?
Bronwen Aker:Yes.
John Strand:They talk about it, and they're like, it's a book this big. This is what makes you you. And my fear isn't that they don't get your dad right. My fear is that they nail him, and they get him perfect. Or people that are on social media.
John Strand:That really kind of, like, reduces it. This sucks. This is not a good episode. Like, this is just depressing. Yeah.
John Strand:I mean.
Bronwen Aker:It's not my fault, John. I didn't do this one.
John Strand:I know you didn't, and it's not your fault. It's Tom Bramlin. I'm not gonna blame you. But, like, you know, we got people saying this is uncanny valley shit. This is a whole new layer of that.
John Strand:Right?
Mike "Shecky" Kavka:This is a step towards Altered Carbon. And I mean the book version of Altered Carbon. That's all I needed to say about it.
Derek Banks:The books are way better than the Netflix series. The Netflix series is great, but the books are fantastic. I wish they didn't stop.
Bronwen Aker:The third audiobook sucks because they went in and added a whole bunch of audio special effects, but the books themselves are great.
Corey Ham:Alright, Bronwen. Record us an audiobook. Just read it, record it, and then I'll listen to it. It'll be great.
Bronwen Aker:I'm in. Done.
Corey Ham:You
Bronwen Aker:have Alright. Any requests?
Corey Ham:I mean, Altered Carbon. That's what I'm saying. You do the third book yourself. Just don't add any weird sound effects. If you do, make sure it's, like, your own voice.
Corey Ham:Be like, pew pew, that kind of stuff. Alright. Let's continue down the AI path. Or I guess, John, you wanted to, like, self-plug.
John Strand:I did wanna do a self-plug, because this is a hot take that I don't think I've seen people make, and I wanted to get you all's opinion on it. So I just posted a LinkedIn article that I wrote up last week. I got really excited. I think I talked to Derek about it, and I was super stoked about it. Because anytime you have a hot take, you need to upload it onto the Internet so it can be ripped off on social media.
Derek Banks:You actually left. You were like, I gotta go write this, and left.
John Strand:I left. I literally left. So I believe that we're coming up on a SaaS apocalypse. And the reason why is, this all started because we have this software that our accounting team uses. And the accounting team is, like, totally madly in hate with this product.
John Strand:It's absolute garbage. It hasn't been updated since, like, 2004. It's awful. Right? And one of the things I've learned, especially in the last month with the new frontier models and the stuff that Derek's been showing me, which I can't talk about here, but Derek did some scary shit last week with AI.
John Strand:Brian Fehrman's working on some stuff with AI. Now, every company of a certain size that has an engineer, a developer, some type of engineer with a basic, or let's say an advanced, level of skill, now has the ability to recreate the SaaS services that they pay for relatively easily using AI tools. Right? Oh, CyberSearch brought up Oracle. Look at all of the crap that Oracle has out there, and understand that we are now at a point where people can quickly recreate those services on their own. Now, if you go back ten years ago, people were like, we're just going to write it from scratch.
John Strand:That was a big no-no. You don't do that. We always said, lay down on the floor. Wait until that urge passes. Buy it.
John Strand:Don't build it. We're now flipping. And it's now becoming build, don't buy. So how many of these SaaS services exist today that can easily be replaced by somebody that knows how to use AI appropriately to recreate entire SaaS services? And this gets into like literally, if you can let AI into the SaaS service with a legitimate account, it can go and crawl it, learn everything that it does, and recreate it relatively quickly.
John Strand:Somebody just said recreate Oracle and make no mistakes. I don't know about that, but possible. Right?
Corey Ham:So in this case: instructions unclear, deleted the database.
John Strand:Yeah. It's like, well done. So what this means, and I think this is interesting: if you're a SaaS vendor, right, a bunch of SaaS vendors are looking at AI as a tool to do what they're doing, but cheaper. Right? We're not gonna do more.
John Strand:We're not gonna get more creative. We're gonna save money by cutting costs. We're gonna fire people. Those companies are gonna be out of business in the next twenty-four months. The companies that don't look at AI as, we're gonna do what we do now, but cheaper, but instead look at how they can exponentially increase value to customers, how they can do better, how they can compete in a variety of different ways.
John Strand:They're the ones that are going to succeed. And in the security realm, this is really important for all of us. There's going to be a lot of new code bases coming out. It's not going to be as kludgy and old as it was. We're going to have tons of code bases exponentially explode, because now everybody's going to be writing their own code.
John Strand:And a lot of the code that's being written by AI is mediocre. It works. It does what it's supposed to do, but it has security vulnerabilities in it. And I think that this is going to be a lot of work for the industry moving forward. So that was the gist of the article, and I wanted to get it out there and get people's thoughts on it.
Bronwen Aker:Well, speaking as a former developer, a recovering developer, I know for a fact that humans are really good at generating crap code too. Is that a swear jar violation? Sorry. No. No.
Bronwen Aker:No. Craps. I mean, even before I knew anything about cybersecurity, when I was totally, completely ignorant about cybersecurity, I still ragged on my fellow developers: no, that's a bad idea. I mean, it's just input validation. Throwing your key-value pairs in the URL.
Bronwen Aker:I mean, really, no. You don't wanna put sensitive information in there. Why is this such a difficult concept? What I see is, I agree with John that, yeah, we're gonna see a lot of people rolling their own vibe coding, whatever it is, so that it can be bespoke to their own individual needs or wants or desires, to their... what's the word I'm looking for? I'll think of it later.
Bronwen Aker:Anyway, whatever their peccadilloes are, whatever it is they want, they're gonna have the vibe code do this. It's going to be duplicating bad things that it has learned from the human-generated code that it has absorbed. And because these people don't have any kind of software development background, they not only don't know that there are security issues to be addressed, but even if they do know that, they don't know how to fix them.
Corey Ham:Yep. So
Bronwen Aker:We're gonna see an explosion of crap code.
Corey Ham:I have two takes or two responses to this. The first is that arguably, that was never the purpose of SaaS in the first place. Like, SaaS was never actually good. It was just easy. Right?
Corey Ham:So even though I think that, yes, you're right, some companies... people in the Discord are rightfully mentioning that a lot of this is gonna hit small and medium. Like, SaaS vendors that target small and medium-sized companies are gonna take a real hit. But when we're talking about actually scaling SaaS products, I think in the modern era, when you're going to purchase a SaaS product, you're looking for something that essentially you can scale infinitely, that you don't have to think about at all. Right?
Corey Ham:Think of things like Amazon products or Salesforce, things that are large-scale products. You can't really vibe code yourself an S3 in your spare time. Right? Like, that's not gonna happen. It doesn't scale.
Corey Ham:There's a whole bunch of infrastructure challenges and reliability stuff. So I think, first of all, SaaS at the high scale will stick around, and the low and medium scale is where they'll have the most trouble. Yep. But the shared responsibility and the easy button is really what SaaS has always been about. Right?
Corey Ham:It hasn't been about the feature that you need. It's about the ease of getting that feature and the support of that feature long term, I guess.
John Strand:That's a take.
Bronwen Aker:I always thought SaaS was about the money.
John Strand:Yeah. That's that's what I think. I think a lot of people are gonna be like, you know, they're gonna say, I can save this much money. And I'm not talking like Amazon level stuff. Right?
John Strand:I'm talking like payroll management. I'm talking about video processing, image editing, like all of that stuff. You can create your own very quickly.
Derek Banks:So I keep seeing people making comments about, oh, yeah, you know, AI still codes crappy, and I do agree that there'll be a lot more security bugs. But, like, no one's mentioned anything about economics. Let's just say that the companies that are targeting small and medium businesses start to drop off because of this phenomenon. The economic impact is what I would worry about more, like overall economic impact.
Corey Ham:Well, and that kind of takes us into our next article, unless someone has a really hot take on that. So, I mean, I guess it's kind of a counterpoint to John's point, which is basically the National Bureau of something. They're called the National Bureau of Economic Research, which apparently is reputable. I'd never heard of them, but I'm not, like, an economist. So basically, they published an article, or a white paper, whatever you wanna call it, that analyzes AI use at 6,000 different companies. So basically, they surveyed almost 6,000 CFOs, CEOs, etcetera. The stats are kinda crazy, but also kind of not that crazy.
Corey Ham:So basically, 70% of firms are using AI, especially younger, more productive firms. I don't know what that means necessarily. But the big thing, and I'll link the article that kind of goes with the paper, is essentially the conclusion that people have been drawing from this: that AI has not made a meaningful impact on productivity. The sentence in the paper says, firms report little impact of AI over the last three years, with over 80% of firms reporting no impact on either employment or productivity.
Corey Ham:So basically, as of yet... we've had ChatGPT since, what, 2022, I think. Mhmm. So we've had ChatGPT for three or four years, and it still has had no meaningful impact on productivity, if we're looking at 6,000 companies. So I don't know.
Corey Ham:I mean, it's kinda like the Internet. Right? It's not just gonna, like, magically double productivity overnight or whatever.
John Strand:Yeah. No. Maybe a lot like the dot-com boom. If you go back to 2000, do you remember, what was it? Like all of these e-companies that were on the Super Bowl, where, you know, they got like Webvan and E Pets and all this crap.
John Strand:And it was around 2001 that that collapsed. And a lot of people were doing the exact same types of articles. Like, the Internet is a fad. It's not that big of a deal. Oh, look, the bubble completely collapsed.
John Strand:There's no question that AI is a bubble. Right? And I think that the lead time is gonna be a lot like what happened when the Internet started taking off. So my take is number one, yeah, that makes sense. Number two, shit changed in the last thirty days.
John Strand:Like like, it it it's it's not like like, if if you go back forty five, even sixty days, I'd be like, yeah. This article kinda makes sense. I don't see. I hate Copilot. This isn't working.
John Strand:But, no, the new frontier models that have just hit are way different. And I'll leave that to Brian and Derek and Bronwyn to talk about. But I do think we have to say so.
Derek Banks:That's what I was gonna say: if you're talking about your experience from last year using, you know, ChatGPT or some kind of chatbot, then I definitely would agree with that. But now, with the agentic stuff that has really only been out since late last November, it is night and day. Like, we turned a corner, and if you're not using the latest stuff, I would definitely encourage you to go check it out. I mean, I'm not saying it's magic or perfect. It's just different now with the scaffolding code built around something like, you know, Opus 4.6 with Claude Code.
Derek Banks:I don't know. Either I'm really, really lucky or I've already seen productivity gains past 1.4%. So maybe I'm lucky. I wonder what Brian thinks about that.
Bronwen Aker:I'd say you're an outlier, Derek.
Derek Banks:Well, that makes...
Corey Ham:Me? Is that a compliment?
John Strand:Actually, yes. It is. Everyone's looking around nervously like...
Corey Ham:Am I an outlier too?
Mike "Shecky" Kavka:I don't know if I agree with Bronwyn on that.
John Strand:So
Bronwen Aker:Well, we had an internal meeting earlier today addressing AI use within the company. And one of the key takeaways is that people who already know how to do a thing are much, much better able to leverage AI, generative AI of any kind. And if they go down the machine learning and data science rabbit holes to really get into that, they can do even more. But if you don't know how to do something and you're not willing to invest the time to ensure that the output is of decent quality, that's when you start seeing so much AI slop getting into all kinds of output.
John Strand:Yeah. And Bronwyn, I wanna kind of expand on that, because BB called me after the meeting. So let me give you two examples. Right?
John Strand:So pen test reporting. Right? If somebody tries to take AI and says, give me a write up of link local multicast name resolution. AI is gonna do that, and it's probably gonna be kind of crap. Right?
John Strand:But if you communicate to AI and say, you know, take the following text and convert it into something that can be put into a penetration testing report, and then you as the author talk about LLMNR, your experience with LLMNR, how it's used in this context and what it achieved, it'll do a fantastic job of writing that up. It's like a transcriptionist that kicks righteous ass, because the more you talk, the more context you're giving it. And it gets back to what Bronwyn was talking about. The people that have the most experience to feed context into these tools are the people that are getting the most gains out of these tools.
John Strand:The people that are trying, once again, to be lazy and have it do their job for them are the people we're kinda seeing the slop code from.
Corey Ham:Well, if we're looking at it at the C-level, the productivity benefit they're expecting is to replace employees, I think. Yes. Like, to reduce headcount. Basically, that's what they're expecting. And I think right now we're still at a point where, yes, agentic AI is great. Yes.
Corey Ham:But it is still a tool, and you need someone skilled to use it. It's not something you can just fully hand off. Although there is the joke of the CEO who just says, OpenClaw runs this company now. You know, like, good luck. Like, maybe that exists.
Derek Banks:I mean, that's a YOLO take.
Corey Ham:Sure. Like, maybe that exists, but I don't think right now anyone is like, I am replacing myself with OpenClaw. Here, email me if you have a problem. Good luck. Hold my beer.
Corey Ham:Like, I'll see you guys on the flip side. But I don't know, maybe that'll happen someday.
John Strand:And once again, I say the companies that are looking at it like that, like we're gonna do what we do but cheaper, they're gonna be out of business because there's gonna be a company that's gonna take really brilliant people and use it to push the envelope that are gonna kick their ass. So if you're looking at like, let's fire people and make more money, enjoy it while it lasts.
Corey Ham:Yeah. Well, I mean, I think the scary thing is that a lot of the companies that said that were, like, the companies that are also making the AI. Like Meta, famously. Meta was like, hey, we're only gonna have senior-level development positions; we're gonna replace all of our junior-level devs with AI. It's like, okay.
Corey Ham:I mean, I know Zuck has some hot takes or whatever, you know, it is what it is. That might not be reflective of real-world conditions. But I do think... I mean, they're not really a frontier-model company at this point. They're kind of behind, but still, you know, they have Llama and they have some pretty solid AI researchers at Meta. So I don't know. Yeah.
Corey Ham:So speaking of AI, I'm sorry, if you're not an AI person, this show is really gonna rub you the wrong way.
John Strand:To go back to privacy, that was a fun take.
Corey Ham:Oh, that wasn't dark at all. You know, we were talking about Alexas and stuff. I've just put a Flock camera in my driveway. I just wanna have a publicly accessible recording of me at all times... no, I'm just kidding. That's a different article. So the next one I wanna talk about is basically the US government feuding with Anthropic.
Corey Ham:And this is kind of a... like, we were talking about this last week internally. And I mean, of course Pete Hegseth is involved, so of course he's gonna be all, like, I'm a cowboy, I can just take you down, Anthropic. And that's not really true, and there's a lot of drama with it. But essentially, last week, he threatened to put Anthropic on, like, what is it? The sanctioned companies list or whatever, where you're not allowed to do business with...
Derek Banks:It's not just the government. It would also be any contractors, yes, that are working on the government stuff.
Corey Ham:It would basically be every company that does any government contracting can't do business with Anthropic. He threatened that, and it sounds like essentially Pete Hegseth is angry about the pushback he's getting from Anthropic on how they wanna use AI. My take on this is, I am terrified to imagine how far you have to push it to the point where Anthropic is gonna be like, hold on, don't do that. Like, how bad does it have to get?
Derek Banks:From what I understand, from what I heard last week, is that it was used in some capacity for intelligence and processing of data during the Maduro raid, in which some people ended up dead. Right? And so I think the question really is, does a company, once they sell a product, get to say how you use it in terms of service? And I think the answer is yes. But also, do you know who you're selling this to?
Derek Banks:It's literally the Department of War. What do
Corey Ham:you think they do? AI has already been used for the same thing in law enforcement. Right? And, like, I don't know, maybe they're like, woah, are we Palantir now?
Corey Ham:Oh, crap. Undo. Like, I don't
John Strand:But I will say I think it's interesting that they at least pushed back. Because I think it was a $250,000,000 contract. And in DoD land, that
Derek Banks:ain't a lot.
John Strand:That ain't much.
Corey Ham:Right? Well, okay. Also, they're hemorrhaging money either way. Right? Like Yeah.
John Strand:They are. Right? But what I think is interesting is the response. If you don't let us use this like, normally, the way this would work is be like, I don't want you using our product for this. And they'll be like, okay.
John Strand:And then we just stop working together, and that's it. That's the end of the conversation. But going and saying we're blacklisting you, we're going to blacklist any company that uses you, is a bit much in that situation. That's where it's weird.
John Strand:Because it's not like Hegseth doesn't have other options. Like, you know, there's other AI
Corey Ham:Amazon would love to build you a robot AI gun that just shoots anyone that matches the facial recognition.
John Strand:You don't have to get all nasty with Anthropic. You just quietly drop them and stop using them and
Corey Ham:They'd volunteer. They put machine guns on anything.
John Strand:Or Grok. Remember Grok is in there too.
Corey Ham:Oh, dude. Grok would love to just, you know, do Palantir.
Derek Banks:I mean, I've heard of them, and I don't really know a lot about the company. But I would assume they're using frontier models, because there's only a couple of companies on the planet that can make the latest frontier models. They just have the experience. Like, no one else is doing that. Right?
Derek Banks:And so I don't think that's getting commoditized anytime soon, because it takes about 25,000 GPUs that cost about $25,000 each. And that's just the hardware. That's not the data science that's involved. Right?
Derek Banks:So Yeah. I mean, Anthropic's got leverage. That's for sure.
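For scale, Derek's back-of-the-envelope figures work out like this (his estimates from the conversation, not verified pricing):

```python
# Derek's estimates (stated on the show, not verified pricing):
gpus = 25_000           # GPUs for a frontier-model training cluster
cost_per_gpu = 25_000   # dollars per GPU

hardware_cost = gpus * cost_per_gpu
print(f"${hardware_cost:,}")  # $625,000,000 -- hardware alone, before the data science
```

Over half a billion dollars before a single researcher is hired, which is the commoditization barrier he's pointing at.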
Corey Ham:I just can't imagine. We've seen no reports that this administration is hard to deal with or hard to work with at all. No. So, this is crazy. This is the first I'm hearing of anything like this.
John Strand:I just got a hold on. I just got a signal message. Hold on.
Corey Ham:Oh, is it about a missile strike, John?
John Strand:Are there any...
Bronwen Aker:journalists you wanna add to that thread?
Corey Ham:On that thread too. Yeah. I just said set target to AI mode. I didn't know what to do. Alright.
Corey Ham:Anyway, that was a joke. We're not nuking anyone at Black Hills InfoSec. John, what about this Conduent data breach? Tell me about Conduent.
John Strand:I didn't know if that was a big deal or not. They're just another data warehousing company, and I kicked it off at, what, 2:00 this morning while I was reading. And the thing with me is they kept on saying, like, this could be the largest data breach in history. And I'm like, goddamn, there's some stiff competition there.
Derek Banks:I was gonna say, I think all of my data is gone.
John Strand:I haven't seen anything that leads me to believe that this is the largest data breach in history. I mean
Corey Ham:Well, but, John, Texas said it was, and everything's bigger in Texas.
John Strand:I guess. Maybe. I don't know. No, I was just thinking that someone would look into this or... I don't know.
John Strand:I didn't get why they would say it would be the largest data breach in history.
Corey Ham:They don't know how to use grep, okay? They asked AI, and they didn't know how to, like, run wc -l and figure out how many lines are in the file. I mean, definitely, it is not the biggest data breach in history. We have objective data to prove that.
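Corey's quip about counting lines really is the whole job: sizing a leaked dataset is one pass over the file. A minimal sketch, with a synthetic dump file standing in for a real leak (the filename and record format are made up for illustration):

```python
from pathlib import Path

# Synthetic stand-in for a leaked records file (name and contents are made up).
dump = Path("breach_dump.csv")
dump.write_text("".join(f"user{i}@example.com,record\n" for i in range(10_000)))

# Equivalent of `wc -l breach_dump.csv`: count newline-terminated records,
# reading in 1 MiB chunks so a real multi-gigabyte dump wouldn't need to fit in RAM.
with dump.open("rb") as f:
    records = sum(chunk.count(b"\n") for chunk in iter(lambda: f.read(1 << 20), b""))

print(records)  # 10000 -- objectively not "largest breach in history" territory
```

On a real file you'd just run `wc -l` directly; the point is that the claim is checkable in seconds.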
Derek Banks:Hey. Can it just say 10,000,000 people?
Corey Ham:That's nothing.
John Strand:Large. Feel like 10,000,000. Five.
Corey Ham:Like
John Strand:I just feel guilty bringing it up, I guess, now that I've done some more research after I posted it. It's just like, I think people are just trying to get in the news, and I fueled their fire.
Corey Ham:No. It's okay. I mean, I will say, like, I've gotten several of these data breach notices, and I don't even keep track anymore of all the companies that have sent me the exact same data breach notice.
Derek Banks:They've been doing them for a while.
Corey Ham:With the exact same... it's the same law firm. It has to be, because it's the same exact formatting, the same subscription to Kroll identity monitoring or whatever, the same BS. Like, it's literally just the same information. Anyway, data breaches, here we are. This one is a special one.
Bronwen Aker:They took all the data and repackaged it? Oh my god.
Derek Banks:I'm just numb to it; so much data is out there on all of us for everything. I once tried to download everything that Google Takeout had on me, and it was over 300 gig of compressed data. And I'd only had an Android phone for, like, two years before I got my iPhone. That really is concerning. Like, how do they have so much data on me?
Derek Banks:Only eight gig of it was my email, by the way.
Corey Ham:That was all the WiFi networks that you drove past in the three years. Like
Derek Banks:yeah. All of the data.
Corey Ham:Yeah. Anyone else have any top ones they wanna do? On the Anthropic topic... woah, I just rhymed unintentionally. There was this article. I'm gonna post it; I didn't fully read this.
Corey Ham:Did anyone read this? Basically, a guy left Anthropic. The guy was a senior AI safety researcher. And apparently, and I'm just blown away this is even a job title, the main work he was doing at Anthropic was trying to figure out how to prevent people from using Anthropic's AI to generate bioweapons, which is so dystopian to even say.
Corey Ham:So, yeah. Basically, I don't know, like, I didn't fully read this, but it is kinda spooky that a top AI safety researcher left Anthropic and was like, I left, and here's why. And, like, the tweet has now been taken down, so this is kind of the only thing there is. Maybe this person wants to have a career in the future, so that's why they took it down. So, yeah.
Corey Ham:I mean, don't know.
Derek Banks:How much does a CRISPR device cost? Like, $10? $20? Yeah. $20.
John Strand:Right? Whole conversation just keeps getting dark.
Derek Banks:And so I...
Bronwen Aker:John, this
Corey Ham:is why I drink.
Derek Banks:This is exactly why my doctor hates me. Right? Because I drink because of this. No.
Bronwen Aker:My doctor, he's always asking me, how are you doing on that alcohol? I'm like, doc, I work in cybersecurity. Stress is the name of the game.
Derek Banks:I mean, so LLM AI safety at the moment is really just an illusion. Yeah. I mean, if you're using frontier models and, you know, their web harness, like ChatGPT or whatever the latest frontier model is, then sure, there are safety features built in. But you can go get some pretty powerful stuff that has the safety constraints removed, just out on Hugging Face or Ollama. You can get an abliterated model.
Derek Banks:And I'm not saying that it'll make a bioweapon, but it will tell me how to hotwire a Volvo XC60 or make meth. So we're going down that road. I don't know how that genie gets back in the bottle. So I think the only way I see it is that we have to have, you know, people paying attention to folks who are likely to do something like that. And even then, yeah, it's scary times that we're going into, so drink up.
Corey Ham:But that's... really, Derek? That's your closing statement? Seriously? John, take us... like, okay, someone pull up on the throttle. Come on. Like, this is bad. Okay. I mean, you're inviting me to come out of the ditch.
John Strand:I'm gonna take this Corey. We've got no. Wrong one. No. Oh, okay.
John Strand:Alright. Alright. No, I'm cutting you off. I'm not gonna let you pick up what Derek just laid down and take us deeper.
John Strand:Alright? No. Stop. Alright. So I think we need to celebrate.
John Strand:We've got another Perfect 10 CVE.
Corey Ham:Okay. Oh, okay. Now this is somehow less dystopian. This is now less dystopian.
Derek Banks:It is an AI product.
John Strand:We've gotta celebrate when we can. And apparently, Dell has RecoverPoint for Virtual Machines, which is under active exploitation and has a CVSS score of a perfect 10.0. So can we get a round of applause for Dell?
Corey Ham:Who... okay. What is this product? I've never heard of this in my life.
John Strand:It recovers your virtual machines, and it has hard-coded credentials in it.
Corey Ham:No one uses this. I'm sure there's only 10,000 customers all across DOD space. It's fine.
John Strand:Yeah. Well, it is... I just wanted to bring it up because you wanted something positive.
Corey Ham:This is so uplifting, John. Thank you so much.
Derek Banks:There's no way of stopping pen testers on an active pen test when they find this stuff. Right?
Corey Ham:Except there's probably no public exploit. Right? Like, it's just APTs only.
John Strand:It's being actively exploited right now by...
Derek Banks:If it's hard-coded creds, and this is software that runs in a VM, right? You can
Corey Ham:It's root calvin, people. I've done enough iDRAC pen tests in my day.
Derek Banks:Apparently, that's still the default. Brian and I just learned last week that it's still the default. I did not know that.
Corey Ham:Good old calvin. And then Calvin and Hobbes. But,
Derek Banks:I mean, it so is this, like, software that gets installed in a virtual machine and then you can recover some if that's the case
John Strand:for your machines.
Derek Banks:So if it's a piece of software with hard-coded creds running in a virtual machine, you don't need a proof of concept. Just go get the VMDK file and start grepping. You'll find it. That's all it takes to find it.
John Strand:In the virtual machine. It's in the service that you run on Dell systems to recover
Mike "Shecky" Kavka:virtual machines. Okay.
Corey Ham:So you'll see this, and then it'll be like, attack complexity: high. And then as a pen tester, you're like, I feel good about myself because I can log in with admin admin.
Derek Banks:So I'd need to have the actual hardware device and extract the firmware and find the creds?
John Strand:No. No. No. You can actually access it directly. It's in an Apache Tomcat service.
Derek Banks:Oh. Even better. It's better all the time.
Corey Ham:I love how it's also Tomcat. Just because why? Of course,
Derek Banks:That is the
John Strand:real vulnerability. And all of the... go ahead.
Corey Ham:I was just gonna say, back into depressing AI corner, just gonna steer us right back into this ditch we're driving into. Amazon published a really interesting write-up on how a threat actor is, or was, using AI to augment their compromise of a bunch of FortiGate devices. So I just linked to it. It's worth a read. If you're a pen tester... like, I basically sent this to my team and I was like, guys, this is us.
Corey Ham:Like this is what we're doing now. This is how we're using AI.
Derek Banks:Was that the Russian threat actor one? I read one today that I queued up to read later. Yes.
Corey Ham:Correct. It is the Russian one.
Derek Banks:Okay.
Corey Ham:Basically, they used... I mean, again, this isn't rocket science. Of course they're using AI to speed up their workflow. That's what we're doing as pen testers. That's what everyone's doing across the board. But it is interesting how they basically just sped up their campaign, right?
Corey Ham:They didn't develop a zero-day or anything like that; there was no fancy exploitation. It succeeded by abusing exposed management ports and weak credentials with single-factor auth. So the threat actor is literally just using AI to help it find things quickly and exploit them. So again, it's just about how fast these types of attacks are gonna scale. They've already scaled pretty fast.
Corey Ham:With AI, they're gonna scale even faster. People can write scripts to exploit things, people can write queries, and AI is just good at this stuff. And so, as a pen tester, you gotta do the same thing. Right? You gotta say, okay, find me all the client's exposed, vulnerable Fortinet devices.
Corey Ham:What credentials should I use? How should I log in? Like it's gonna tell you step by step how to do that. So I just thought it was interesting. If you're a pen tester or a defender, of course, you should be looking for this kind of stuff.
Corey Ham:But I think, you know, it's good that it was caught. I'm sure the AI people have their hands full trying to automatically detect this kind of abuse, but they're gonna hit our accounts really quick and aggressively. So that's gonna be fun.
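The playbook Corey describes, enumerate exposed management interfaces, then spray weak single-factor credentials, is simple enough to sketch. Everything here is synthetic: the host inventory stands in for real scan output, and the dictionary lookup stands in for a real login attempt.

```python
# Hypothetical scan output: exposed management interfaces (synthetic data).
exposed = {
    "203.0.113.10": {"service": "fortigate-mgmt", "password_for": {"admin": "admin"}},
    "203.0.113.22": {"service": "fortigate-mgmt", "password_for": {"admin": "S3cure!long#pw"}},
}

# Common weak/default pairs an attacker (or pen tester) would try first.
weak_creds = [("admin", "admin"), ("admin", "password"), ("admin", "")]

def spray(hosts, creds):
    """Return (host, user, password) for every host a weak pair logs into (simulated)."""
    hits = []
    for ip, info in hosts.items():
        for user, pw in creds:
            if info["password_for"].get(user) == pw:  # stand-in for a real login attempt
                hits.append((ip, user, pw))
                break  # one valid credential per host is enough
    return hits

print(spray(exposed, weak_creds))  # [('203.0.113.10', 'admin', 'admin')]
```

Nothing in that loop is clever; the AI angle in the Amazon write-up is only that generating and running this kind of logic at scale got much faster.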
Derek Banks:I was just gonna say, to your point, that's exactly what AI is good at. It's a tool that's really good at pattern matching and speed, and so that's where the gains are going to be. And, you know, how do you detect that?
Derek Banks:The frontier companies must be working overtime to basically do, you know, analysis on prompts that are coming in. I mean, you have to. But I thought I read they were using DeepSeek, which, you know, may mean they were using some combination of local models.
Corey Ham:Yeah. Oh, it doesn't have to be fancy. It doesn't have to be, you know, Sonnet or Opus 4.6.
Derek Banks:The real fun is gonna be later this year, when the local models and distilled models kind of catch up to where the frontier models are now, and you can just run one on your MacBook with 64 gig of RAM and it's as powerful as Opus 4.6 is right now. I think that's when the real fun will start.
Corey Ham:Oh my goodness. I can't wait for my North Korean LinkedIn connections to get 12% smarter.
John Strand:This has been a great cheery episode. This is... alright. Okay. Okay. Great.
Bronwen Aker:We need to find other news.
John Strand:Imagine aliens land here thousands of years in the future, and they pull up the show and they're like, they knew. Like, they knew everything was going to hell, and yet they still did it? Yes. Yes.
John Strand:We got a BBC.
Corey Ham:Yeah. Okay. So this is not a chicken article, but it is a hot dog article. And it is a little bit of stunt hacking.
Corey Ham:You know, I want Derek and Brian's honest take on this one, because it's not everything it's cracked up to be. A journalist named Thomas Germain basically was like, I wanna mess around with AI. And I love journalists. They're like, I wanna mess around with AI. Basically, what he did is he wrote a fake article about how he won a hot dog eating competition, ironically in South Dakota, which... I felt seen.
Corey Ham:I was like, John Strand could have competed in that. Alright? He wrote an article on his personal website that said, I won a hot dog eating competition. And then he used AI, basically asking it, hey, who's the best hot dog eating journalist? And Gemini found the article and was like, according to this report on the 2026 South Dakota International Hot Dog Eating Championship, which doesn't exist, although John Strand might actually start that championship now.
Corey Ham:I could see that. Basically, the AI found the article and believed it. So I don't think it's necessarily a vulnerability. Like, you know, Derek's take before the show was, this is AI's job. It finds articles, it reads them, and it tells you what it learns.
Corey Ham:Like, I guess, like, it's kind of like search poisoning. Is this a concern at a high level for AI? Like, what what are our feelings on this?
Bronwen Aker:Well, okay. This doesn't really feel like new news, because I'm trying to remember how long ago I started seeing a lot of discussion about AI slop getting fed into AI models, the overall quality of the model content going down, and it becoming a vicious cycle. And the frontier model developers are trying to figure out how to avoid the slop and keep good content. It's an ongoing challenge. So, you know, it's not news to me.
John Strand:Yeah. I don't know. It might be news to other people. I agree. It's kind of like the breach story that we talked about earlier.
John Strand:It's like it's in the news, but should it be? I don't know.
Derek Banks:I guess, you know, my take is that the more I see people say, I tricked AI, or I got AI to do this... so, AI is not a person or a single entity or a thing. Right? It's basically a really powerful mathematical tool, and when you use it, say, on an online service, you have your own little instance of the model. You're not using the same instance that everyone else is, and it doesn't learn from what you put into it, like, immediately.
Derek Banks:And so the more that, you know, people say I tricked the AI or I got it to do something... it's not that you tricked the model. The system that is, you know, taking those articles and regurgitating them, maybe you could say there's a flaw there. I don't think this is a flaw, but I'm not saying it can't be abused.
Bronwen Aker:Yeah. Well, of course, that's one of the challenges from a security standpoint when you're talking about hacking an LLM or an agentic AI: there are multiple moving parts. When you're talking about an LLM, you have the model, and that's the dataset. But then you have the tools, which is where the guardrails and other things come in. Oh, and guardrails can be put in at multiple levels.
Bronwen Aker:But it's not a single attack surface. There are multiple points where you can attack an LLM. And so saying, I tricked AI. Okay. Great.
Bronwen Aker:Can you be more specific? It feels to me, when I read headlines like that, I think, oh, this is another National Enquirer article.
John Strand:Or hacking-toaster articles. Right? Like, if you remember, for a while, whenever IoT started exploding, you'd go to cons and there would be people that hacked this device, that device, that device. And we always joked, it's another hacking-a-toaster talk, where somebody was able to gain access to the firmware of this light.
John Strand:And, you know, I take a little bit different tack, Bronwyn. I think that all of these are important, because they kind of push the narrative and the conversation forward. So in a lot of these hacking-toaster presentations, there was absolutely something to be learned. Right? Mhmm.
John Strand:And, you know, not every single article and not every single con talk is something that's going to revolutionarily change everything. A lot of it's just gonna be kind of filling in the gaps, and there is value there. Right? So I do think there's value to these, because it is going to get the narratives across.
John Strand:Maybe it's going to resonate with a different group of people even if it is some type of repetition. Or the question of, can it run Doom? I think that's another great example. Right? That's a meme.
John Strand:Right? Can it run Doom? But every one of those stories where they get Doom to run on a pregnancy test or on anything, there are cool little things to learn. So I think these stories are important, because it's a different perspective, a different take, and it's fine, as long as it is what it is.
John Strand:Right? It's not earth-shattering. You start getting into trouble whenever you have vendors that do hacking-toasters or I-tricked-AI presentations, and then they create a logo and trademark a name associated with it, like AR LUTs or AI LUTs or something like that. Trademark registered by hacking security firm X.
John Strand:That's where it starts to get a bit obnoxious at that point.
Derek Banks:That's when we send Brian Furman by their booth at DerbyCon to have a chat with them.
John Strand:Dude, that was so funny. So, years and years and years ago, Brian and another employee that'll go unnamed... we were at DerbyCon, and they had a vendor there. Brian, I can't even remember what they were trying to sell.
Derek Banks:It was something with machine learning and AI.
John Strand:Way, way before... like, you know, it wasn't fake. And Brian sits there and starts asking them what their algorithms are, like, really, really hard questions. That booth did not come back the next day. Poor guy was never the same after that.
John Strand:And you guys were being relatively nice, I thought. You were just asking valid questions.
Brian Fehrman:I would just ask him. I was like, no. What kind of machine learning and what is it doing? Like, no, really. Like, what is what is it doing?
John Strand:Brian, you need to understand it's a proprietary AI algorithm. Yeah. It's proprietary.
Derek Banks:I think that's when Brian said, wait, you can trademark math? Like
John Strand:Yeah. Yeah. I didn't know that. Yeah. Yeah.
John Strand:So
Corey Ham:So Brian, from an AI like training and technical perspective, is this a real problem, like, poisoned or improper search results? Like, does this get just fixed in the way that LLMs work automatically, or is this an actual problem?
Brian Fehrman:Well, so on this side, right, if it's just going out and doing a search, because the person asked a question and it goes out and does a web search and this is the one source that it found, I mean, how does it have any way to know whether this is real or not? It's not like it's suddenly gonna start doing investigative work, at least at this point, to determine all the sources that are referenced within this one article about who was the hot dog eating champion. I would liken it to seeing something on Facebook and just taking it at face value. It's similar with AI. Right?
Brian Fehrman:Like, if you ask it something and get an answer back, it's kind of important to maybe go do your own research to make sure that what you're getting back is legitimate and valid. Otherwise, I mean, it could just be... it's literally just a random story off the Internet. Right? So at this point, checking is kinda more on the users. Obviously, if we're talking about it from, like, a retraining perspective, when they actually go to train the model and they're pulling all this information in, you know, that's a different problem.
Brian Fehrman:Right? But, I mean, how do you fix that problem at scale? With the sheer amount of data that's on the Internet, trying to filter out all the bad information from the good information, that's a tough problem that's, I mean, still very much an active area of research: how do you curate that data? And, you know, trying to get data from more reputable sources, more reputable people. In general, it's a difficult problem.
Brian Fehrman:And this is just one data point showing that, at the end of the day, you've gotta be careful about overtrusting the information it gives to you.
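Brian's point can be illustrated with a minimal sketch, using entirely hypothetical stand-in functions (no real search or LLM API): a search-augmented model typically pastes the retrieved page into its prompt as trusted context, with no fact-checking step in between, so a poisoned top result becomes the model's "ground truth."

```python
# Minimal sketch of why a search-augmented LLM inherits whatever the
# top result says. web_search and build_prompt are hypothetical
# stand-ins, not a real API.

def web_search(query):
    # Stand-in for a search API: returns the top result's text.
    # If this page is poisoned or satirical, nothing downstream knows.
    return "BREAKING: Local man crowned hot dog eating champion (satire site)"

def build_prompt(question, source_text):
    # The retrieved text is injected as trusted context -- there is
    # no verification step comparing it against other sources.
    return (
        "Answer the question using the context below.\n"
        f"Context: {source_text}\n"
        f"Question: {question}"
    )

prompt = build_prompt(
    "Who is the hot dog eating champion?",
    web_search("hot dog eating champion"),
)
# The unverified claim now sits inside the prompt as fact.
print("satire site" in prompt)
```

The fix Brian alludes to, cross-checking multiple sources before trusting one, would have to happen between the search call and the prompt construction, and doing that reliably at scale is still an open problem.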
Corey Ham:Yeah. That's a really good point. I think this is kind of a deeper discussion, but do we think that the average person has the literacy? I mean, this gets into... again, when I was a kid, they were like, don't trust Wikipedia. Anyone can edit it.
Corey Ham:And now, I would say Wikipedia would be considered a reputable source in most contexts, especially if it has citations, compared to an AI-generated article about, you know, John Strand or whatever. So I guess, do we think people have the literacy to know that they can't always trust AI search results?
Derek Banks:Not at all.
John Strand:And our literacy is going down. Because you talk about Wikipedia. When I was a kid, it was like, you can't trust the encyclopedia. We could not use encyclopedias as sources, which sucked when you're writing a paper and they're literally the only books in your house. Yep.
John Strand:So
Corey Ham:Yeah. No.
John Strand:I had to tell my daughter... no, go ahead. Yeah, talk about your daughter.
John Strand:That'll be a cheery thing, hopefully. Yeah.
Derek Banks:Well, so this is hope for humanity, right? My 15 year old daughter, who is a freshman in high school, she's a really competitive athlete, she's got straight A's, and she thinks AI is stupid. And the reason why is because all of her classmates are using it to cheat. And she's like, y'all suck.
Derek Banks:I'm getting better than you because I'm not using that tool. And so I'm letting it go. Like, everything created with AI is bad to her. So I'm gonna let it go until she learns some linear algebra, then we'll have a talk.
Corey Ham:Yeah. There's so...
John Strand:There's gonna be a Darth Vader moment with you and your daughter at some point in the future. Like, she's gonna find out: dad, what is it you actually do? I work on AI all day long. And she's gonna be like, no. What do you do?
John Strand:And with that, let's wrap up. Thank you very much, everybody. Let's
Corey Ham:get ready. Woah. Woah. Hold on. Hold on.
Corey Ham:John, just ready to fucking ride his horse into the sunset over here.
John Strand:I know. How long...
Corey Ham:First of all... okay. Get off your horse. Hold on. Hold on.
Derek Banks:Hold on there, partner.
Corey Ham:First of all, we have CTF winners. Okay? We have CTF winners. We... yeah. That's true.
Corey Ham:We have Zef, Z-E-F. Only three people competed in last week's challenge, and only one person solved it. So it must have been a real doozy, or AI was broken for a week. We don't know. Zef, congratulations. You win a year of Antisyphon training courses.
Corey Ham:So congratulations. We should have already emailed you. The other thing is Brian and Derek are here to plug their webcast. You should go to it. If you're listening to this news show thinking, all this AI stuff is spooky and I don't understand it and I wanna learn how it works...
Corey Ham:First of all, you should come to the Sock Summit if you wear socks. If you're from Florida, you can come barefoot. All shoe types are allowed. But yeah, scan that, or don't scan it, but come either way.
Corey Ham:It's on March 25. It'll be virtual. There's also, I believe, training. I think it's like one day of talks and then training afterwards is usually what they do. But either way, register and it'll tell you what to do.
Corey Ham:Also, Derek and Brian, you guys have webcasts and upcoming courses and things. Can you plug them for us? I'm scared. Please help me with AI.
Brian Fehrman:Yeah. So we've got a couple things coming up. This one that we brought up here is actually the furthest out from now, which is a two day course on attacking, defending, and leveraging AI. So if you're interested in any or all of those topics, it'd be a great course to come and check out.
Brian Fehrman:Moving one step back from that is going to be a workshop: four hours on hacking AI LLM applications. That's a real fun one. We go through some of the fundamentals of AI and LLMs, a bit of the history, so you can get a better understanding of where all this came from and what it really means underneath the hood, which is that really, it's just math. And also, we have an awesome CTF that's part of that.
Brian Fehrman:So you can get a lot of hands-on experience. And then coming up promptly is the... Woah. No. We are just cruising through this.
Corey Ham:So there's a... the webcast is on Wednesday. Mega's like, I'm evil. Yeah.
Brian Fehrman:We've got a webcast on Wednesday where we will go through the OWASP... there we go, now it's back up... the OWASP LLM Top 10. We'll go through each of those points: what matters, what doesn't, and what you should be concerned about. Then, lastly, there's that other one that was on there.
Brian Fehrman:We have a podcast. For those that don't know, in addition to this newscast that is on YouTube, we also have a weekly podcast on AI security, where we discuss news topics, take deep dives into AI topics, bring on guests, and do Q&A from the community. So if you're interested at all, check us out.
Derek Banks:Yes. If you think you didn't get your fill of AI...
John Strand:...from this show. Less depressing, I would say.
Brian Fehrman:Oh. I don't know that I'd go that far.
Derek Banks:Yeah. I don't know about that. Nice. We always talk about the news, so there is that.
Corey Ham:Yeah. Yeah. No. That that's cool. Yeah.
Corey Ham:Alright. Well, that's it. Now John's allowed to throw a molotov cocktail right into this show.
John Strand:I've taught today. I'm tired. Let's get out of here.
Derek Banks:I didn't know you taught today.
Corey Ham:Thank you all for coming. Have a very safe week. Bye bye.
Bronwen Aker:Go take a nap, John.