Join Colonel (Ret.) Matthew Gill and Colonel (Ret.) Michael Jackson, Professors of the Practice at the Bush School of Government and Public Service at Texas A&M University. Full of stories and lessons, Season One follows COL Gill as he moves through the ranks, first as a field artilleryman and later as one of our nation's most elite special operators in military intelligence. Season Two follows COL Jackson's career in the Air Force, his involvement in the Intelligence Community, and his move to the Space Domain and its importance to National Security.
*The thoughts and ideas expressed in this podcast are those of Colonel Gill and Colonel Jackson alone and are not representative of Texas A&M, the Bush School, or the Intelligence Studies Program.
Howdy, and welcome back to season 3 of For the Greater Defense. My name is Matthew Plunk.
Speaker 2:My name is Aiden Weatherford.
Speaker 1:And together, we are your hosts. And joining us today is Gary Brown, retired colonel from the Air Force and professor of the practice at the Bush School of Government and Public Service at Texas A&M University. Thank you very much for joining us today, Colonel Brown.
Speaker 3:Pleasure to be here.
Speaker 1:Colonel Brown has become an expert on cyber policy over the years, and today's episode will focus on cyber technology and the growth of information and communications technologies in the war fighting space. Aiden, I will hand it over to you for our first set of questions. Alright.
Speaker 2:So our first question, and really most of this podcast, is gonna revolve around the future of cyber warfare and its implications for the United States government and, of course, the world. So with that, I wanna ask, how do you believe artificial intelligence could change defensive and offensive cyber strategies for the Department of Defense specifically? That's a huge question. It's a very broad question.
Speaker 3:Yeah. Yeah. It sure is. I think artificial intelligence is already starting to rear its head in this space. Most of the time when we think about artificial intelligence and military weapon systems, we're talking about incorporating them in a way that would make weapon systems autonomous, to get inside the adversary's decision loop.
Speaker 3:So, obviously, machines can process information faster than human beings can. So if we remove the human from the loop, it becomes easier for weapon systems to actually engage faster after they receive targeting information. Of course, there's a problem with that, and that is we always wanna make sure that the weapon system is targeting the right target, the one we wanted to target. And typically, we've relied on human judgment to make those decisions. So the question then becomes, at what point do we remove humans from the loop?
Speaker 3:If we ever remove humans from the loop at all, or at what point do we have humans step back and take more of a supervisory role over the decisions made by a weapon system. So, in some ways, we often talk about this in terms of having a human in the loop, which is what we've always had in the past; having a human on the loop, which would be a human that can engage in the decision process if they choose or in certain circumstances
Speaker 2:or
Speaker 3:having humans out of the loop. Generally speaking, for many years, the US Department of Defense tried to maintain the position that a human will always be in the loop. A few years back, maybe a decade ago, the DOD started talking in terms of having a human on the loop. And certainly, US adversaries have, for some years now, had systems where the human is out of the loop completely. So this is a fundamental change in the way war fighting could occur, and there are decisions that will have to be made going forward.
Speaker 2:Do you have any particular feelings on removing humans from the loop entirely? Do you feel like that's a good idea? Or should we always have our eyes perhaps on the process?
Speaker 3:I mean, it's a good question, and not to be too cynical, but I think most people can observe, from their own lived experience, that humans are not always the very best at making perfect decisions or exercising impeccable judgment. So it isn't as if humans are perfect and machines are imperfect. It's more of a situation where humans make maybe one type of error, errors born of emotion or miscalculation. Machines might make errors born of calculations that are done too quickly, or might be fooled by different things than humans are.
Speaker 3:So it's just a different set of errors that both decision makers would make. I think, generally speaking, the ultimate decision about whether we're going to remove humans from the loop will come down to how good the technology is. And most of the time when we see examples offered of when a machine would make or has made poor decisions, it's because the technology just wasn't up to par, not because the idea of the technology is impossible. It just isn't ready yet in some cases. I think, ultimately, the decision will make itself.
Speaker 2:Decision will make itself.
Speaker 3:That was an interesting way to put it. We're talking about AI. Okay. But I do think that ultimately, when the technology gets good enough, the humans just won't be able to compete with the speed.
Speaker 2:I'm going to ask a question you probably don't have the answer to, but how long?
Speaker 3:Yeah. That's a loser's proposition, to predict how long that will take. A lot of times, you know, we've been talking about the singularity Yes. You know, for a long time, of when AI will essentially achieve some level of humanity. And much like other advanced technologies like quantum, it always seems to be about 10 years in the future. So, okay, at least 10 years.
Speaker 3:Soon. Soon. Soon ish?
Speaker 2:Okay. Yeah. Interesting. Alright. Now that we've cracked open that delightful can of worms, we should focus a little bit.
Speaker 1:Yes. What cyber policy issues should future leaders prioritize at government oriented academic institutions like the Bush School?
Speaker 3:Wow. I mean, again, a really broad question. One of the things that I have found over the past decade, at least, of doing this kind of work: when I first got into the business, it took all my time just to get up to speed and learn all the things I needed to learn. When it became a task of just keeping current on everything, I began to notice how much the field has expanded.
Speaker 3:So many things are pulled into what's called the cyber field now. There really is very little, for example, in the intelligence field that doesn't somehow touch on cyber, even if it's not directly collecting intelligence through cyber means. Many times, even human intelligence operations leverage cyber capabilities, or they somehow use cyber capabilities for communications, to develop knowledge about targets, maybe to build a pattern that reveals that certain people would be amenable to engaging in espionage, for example. But the biggest field, well, in general, I think of cyber capabilities as divided into 2 large categories. One of them I call hard cyber, and the other one, as you might guess, I call soft cyber.
Speaker 3:On the hard cyber side, I think of it as cyber capabilities that essentially have machines as the target of the operation. On the soft side, the target of the operation would be humans. And we've spent a lot of time on hard cyber capabilities, which are much easier to envision. That's keeping our computers safe, protecting our critical infrastructure from malware that might shut it down, that might damage it, that might turn it off at a critical moment. Those are all very important things. We have a pretty good idea of what we need to do in those areas.
Speaker 3:We don't always execute it perfectly, of course, but we have a pretty good idea of what we need to do. On the other hand, there's the soft cyber side, which really, many times, comes down to cyber enabled influence operations. Many of them are intentional, state sponsored operations. Some of them are intentional operations sponsored by criminal organizations or some sort of private organizations that may or may not be leveraged by state actors. But some of them are unintentional, by your relatives who read a piece of information that wasn't true and forwarded it and fooled a lot of people unintentionally. That's the hard thing.
Speaker 3:These are very, very difficult questions to even frame, much less answer. So we're not exactly sure what we're combating here, but we know we want people to see things that generally are true and to make decisions based on information that's generally factual. But there are lots of statutory and constitutional limitations in the United States that prevent the government from engaging in the field a whole lot.
Speaker 2:Do you think future intelligence officers, war fighters, etcetera, are going to need a far greater education on cyber matters in general, becoming incredibly cyber knowledgeable? Or is that something that can sort of be left up to the battalion intelligence officer, etcetera, and they can just kinda go hands off?
Speaker 3:Yeah. One of the things that's interesting about this, Aiden, is that in a lot of ways, the more sophisticated technology becomes, the less sophisticated the human has to be to run it. So I don't think they're gonna need a lot of sophisticated knowledge about cyber systems. I think most of that will be handed to them, and I think you can see that in your day to day technology. Of course, cell phones have become much more capable, so that's not necessarily a great answer.
Speaker 3:Although, we can look back at the iPhone when it came out. It was a revelation to people that they just had little pictures that they could touch that would do everything for them. You didn't even have to read a word, much less a sentence. And it's certainly true on desktop computers, which of course started in the ancient days when I had my first desktop with an operating system where we had to remember the command line prompts to get it to do anything. Then soon we had the Windows operating system, which was essentially a graphical user interface that, generally speaking, allowed you to see a picture and know what you wanted to do, although there were still some things to be done. And now, unless you're quite sophisticated, there really is no way to make the computer listen to you other than touching a picture or clicking a picture.
Speaker 3:So, yeah, I think the computer does more of the thinking for us, and that will continue into the future.
Speaker 2:Sort of happen naturally over time. Mhmm.
Speaker 3:Interesting.
Speaker 2:Okay.
Speaker 1:What are the most pressing gaps in cyber education that need to be addressed for the next generation of national security professionals? What do you think those are exactly?
Speaker 3:Most pressing gaps for...?
Speaker 1:For cyber education, for training up the next generation of cybersecurity professionals, be it here at Bush or elsewhere?
Speaker 3:Yeah. So I think it's a good question. Again, this is something that, in a way, is less of a problem now than it was in the past. And this is no denigration of the previous generation; it was just a matter of what was relevant at the peak of their careers. But now it's 2024, and we have worked through our first generation of senior leaders that had essentially zero experience in the cyber field. And really, most of them had never really thought of it.
Speaker 3:Many of the people that I worked with when Cyber Command first started in 2010, not at Cyber Command, but many of the Department of Defense senior leaders, were sort of not very familiar with cyber capabilities or cyber operations. And there were still a few left who sort of famously didn't even do email. I mean, I know it seems incredible to think that only, you know, 15 years ago Wow. There were senior leaders that didn't, but they were at the tail end of 35 year careers. They had staff who did that for them early in their careers, and it just continued. Mhmm.
Speaker 3:Now, over this past couple of decades, we've grown a new generation that's very comfortable using cyber capabilities, to the point where we don't really even think about how much it permeates our lives. You know, the use of credit cards and cell technology and GPS and all kinds of things that we just don't think twice about. And really, in many ways, a normal citizen has lost the capability to function in a lot of these areas without technology. And heck, even though I know how to read a paper map, I can't remember the last time I had a paper map out.
Speaker 3:But, you know, most people now, I think in your generation, might find it really disorienting to try to navigate in a place they hadn't been before without a cell phone with their map up, for example. Or to be flying to visit someone, or to do international study, to fly to a foreign country maybe they've never been to before without a cell phone, so they land and have no way to communicate with anybody other than through prearranged travel plans, for example. Yeah. But these are things that, only 30 years ago, were standard practice for everybody. So it's been an incredible, incredible change.
Speaker 3:So what are the gaps that people have? I'd say, a little bit tongue in cheek, that one gap might be to teach them how to do things without cell service and without modern technology. Because there may be a time when a national security person would have to do that. And we see some of that in the services, where we develop some redundant backup systems in case our systems would fail in combat, for example. So we're not completely helpless without them now.
Speaker 3:But, you know, people should be resilient, just like we talk frequently about the need for our critical infrastructure to be more resilient. Meaning, of course, we can't guarantee it's always going to be available, so we have to be ready to act in its absence. So that's one. But I think the big gap, and it's a very difficult challenge and it will continue, is how do we seamlessly weave cyber capabilities in with our kinetic capabilities and with our other instruments of national power to create the most secure future for the United States that we can have. We're still learning how to do this. Again, in the past, in the first generation of cyber, we thought of it as a separate capability that we looked at separately. And there was a running, I would say, dark joke that we had in DOD, which was, you know, we always wanted to sprinkle some cyber goodness on every plan, or put a little cyber salt on it, because we weren't really good at integrating cyber planning with kinetic operations.
Speaker 3:And there are really good reasons for that. It's not lack of effort. It's not lack of good intentions. It's just that, in the national security field, the way cyber capabilities function means that they aren't necessarily going to be available exactly when you need them. It isn't like kinetic capabilities, where you can practice generating aircraft or practice generating divisions or special operations forces and have a pretty good idea, kind of a 99% certainty, that those forces will be available when needed.
Speaker 3:Cyber capabilities might be based on certain versions of software. The ability to affect an adversary system might be based on current access to the system that you acquired through some nefarious means, and you might lose that access at any time. So when you're building strategic plans, you can't necessarily count on the ability to take down a system or infect a system with our cyber capabilities. We can hope, and we might be right, but it's difficult to plan an entire major operation around that, for instance.
Speaker 2:You helped write the Tallinn Manual, Tallinn being a wonderful city in Estonia. And the manual itself
Speaker 3:I had nothing to do with building the city.
Speaker 2:Okay. I'm not
Speaker 3:that old.
Speaker 2:But, yeah. Highly recommend a visit though. Very beautiful. But the Tallinn Manual is essentially a kind of Geneva Convention for cyber warfare. However, that was about 11 years ago, 2013.
Speaker 2:The manual has seen a lot of updates since then. I checked it the other day; there's stuff seemingly added almost weekly. But when you wrote it in 2013, and you helped write it with a lot of other individuals, do you think it really had its finger on the pulse of where cyber warfare was gonna be going and the correct challenges that we were gonna be facing? Or do you think that, you know, there were some areas that you and other people might have been completely wrong or completely correct on?
Speaker 3:Yeah. So I'm glad you asked the question. It's a good one, and I will talk about it. It's interesting, and I should clarify: the first Tallinn Manual came out in 2013. And on the first Tallinn Manual, I was an official observer from the US for that process.
Speaker 3:The second, Tallinn Manual 2.0, came out in 2017, and I was a member of the group of experts. You know, somebody has to be the dumbest member of the group. That was me. I worked with brilliant, brilliant international lawyers on that. They were incredible.
Speaker 3:I learned so much from them, and I played the tiniest role with the Tallinn Manual, but it was a wonderful learning experience and a great opportunity to participate in what I think is a fundamental project to try to make international law relevant for cyber operations. So to answer the question: Professor Schmitt, who was, you know, kind of the brains and, as it turned out, the brawn behind the manual, because he was very effective at making sure the manual was completed, which is not an easy thing to do with a pretty significantly sized group of brilliant academics; they're sort of like herding cats. He was quite good at herding cats, and, of course, also a brilliant lawyer and a great writer as well. He made some changes between the first and the second version of the Tallinn Manual precisely because of, well, at least partly because of, this finger on the pulse issue. The first one really was kind of a closed group of academics. And this is important to remember about all the versions of the Tallinn Manual, at least to my knowledge, and certainly the first two.
Speaker 3:Aspired to look at international law as it is and apply it to cyber operations. So there was not only was there no intent to make recommendations or to try to guess where it ought to be or or or think where it might be going, there was a there was we expressly avoided doing anything like that to keep the credibility of the manual. This is just right. Exactly. Because politics would quickly enter into it.
Speaker 3:So this was just looking at the law as it is, lex lata, international law as it lies, and applying it to cyber operations, cyber warfare capabilities. So this, you know, was a challenge in and of itself. We didn't need any additional challenges. Mhmm. But, excuse me.
Speaker 3:In the second version, the slight modification was that portions of the text were sent out to states for them to give input, which was reflected in the comments, not in what's called the black letter law, which is the basic rule. But in the commentary, if you read the commentary, you will see reflections of how states feel about the rules of international law surrounding cyber warfare. The state that provided the comment is not listed. Okay. Just as you'll see comments in there about how the majority of the experts concurred with blah blah blah, and a minority of the experts blah blah blah.
Speaker 3:You will see, we made the decision, well, Mike made the decision, which I think was the right one, not to attach individuals to the comments as well. So it was sort of a Chatham House rule for the Tallinn Manual.
Speaker 1:Excellent. That was a very good response. What has been our enemies' most effective use of cyber weaponry? Is it hacking, or maybe weaponizing social media? What and/or whom do they target most often, in your experience?
Speaker 3:Yeah. It's a great question. So on the hard cyber side, as I defined it before, I think we're seeing more, unfortunately more, instances where Chinese affiliated hacking groups are attempting to penetrate, and successfully penetrating, infrastructure systems vital to the United States. And we've seen this reflected in the Volt Typhoon and Salt Typhoon named operations that have been publicly disclosed by US officials.
Speaker 2:And what did those target?
Speaker 3:Those target critical infrastructure such as electrical power grids. And this is a very difficult question, one we struggled with in the Tallinn Manual, and there's still not necessarily a clear answer in international law for this at all. The question is, what if state sponsored malware is implanted on, let's say, the computer systems that are responsible for controlling an electrical power grid? So the malware is implanted. It maintains communication with the nefarious individuals who planted it, so they can still control the malware. It may be capable of collecting information from the system.
Speaker 3:That sounds like espionage. Espionage is not unlawful under international law, and I said it that way on purpose.
Speaker 2:Did not know that. That's fascinating.
Speaker 3:It is not unlawful under international law. So that would be espionage. The US has publicly stated we engage in espionage, just like every other country engages in espionage. We would not call out a state for unlawfully engaging in national security supporting espionage. However, if the same piece of code could, with a single command or a series of commands from the individuals controlling it, be turned on or modified in such a way that it would damage or destroy the system, suddenly it looks like something else. I don't say this is the right analogy, but one might analogize this to planting explosives on a power station that could be exploded in the event, you know, of an invasion or some other national need on the part of the person who put it there.
Speaker 3:So this is in this gray area. I'm not sure exactly what's happening right now, but I think what's been reported is malware on the critical infrastructure that both provides information and has the capability to be modified or used in a way that would do damage to the infrastructure. So this is a real threat, and it's something that we should be concerned about. And, of course, we can speculate as to why the PRC might want to put that kind of malware on US infrastructure systems.
Speaker 3:On the other side, the soft cyber side, I would say the target, again, is people. And more specifically in the United States, in my opinion, the target is our national community. I think it's in our adversaries' interest to encourage fractures in US society, to encourage differences in US society, and to try to do as much as they can to limit our ability to come together as a single unified national team to engage in the national security enterprise, whatever that is. If that's protecting democratic allies we have abroad, if that's protecting our economic interests abroad, whatever it is, it's in our adversaries' interest that we're not able to engage in it because a significant portion of the population is so much against it that they might riot in the streets, or vote out the party in power, or run protests, or whatever. For many years now, we've seen Russia engage in that kind of cyber enabled influence operation. Most of the time, and most effectively, they leverage messaging that's created by Americans, maybe misguided with good intent or maybe having ill intent, creating messages that can then be amplified artificially with this incredible machine that Russia has.
Speaker 3:So using bots, botnets, or what some people call zombie machines, if you don't know what a bot is, to amplify these messages. So, for instance, it might look on your social media feed, whatever platform you use, as if a 100,000 people have looked at this message, when in fact maybe, you know, 200 humans have looked at it and the rest are all bots.
Speaker 2:So it's a fascinating vulnerability that we have there, seemingly very easily exploited by adversarial states. However, I think you and I have perhaps had this discussion before. I'm curious, to what extent do we do the same to them? Does it look the same as the way they do it to us? And does it exist at the same level?
Speaker 2:And if not, should it exist more? Is there a moral responsibility we have to not engage in disinformation the way they do it to us? Or is this something we should be doing more of to them? Yeah.
Speaker 3:Well, I don't think we should be doing more Okay. Personally. First of all, there are many aspects to that. I think there are constitutional and statutory prohibitions on the US engaging in disinformation operations, and certainly, we don't engage in disinformation operations that target Americans. And when you start pulling apart the problem, I mean, that sounds easy.
Speaker 3:Right? That sounds straightforward. But when you actually parse that problem now, in the cyber age, you realize that it's not so easy. Because as soon as you release, you know, the kraken of misinformation on the Internet, well, you don't know who's gonna see it. So the fact that you aren't targeting Americans would not save you from a violation of the law if, in fact, 90% of the people who saw it were Americans. So it's a difficult line to toe.
Speaker 3:But just in a larger sense, first of all, I don't think the US really has an interest in getting into the business of disinformation. Because I think, as flawed as we are and as many mistakes as we've made, we would rather be the country that does its utmost to provide truthful, reliable information to people, so that we can minimize, and make purely accidental, the circumstances where people can point and say, the US government got this one wrong, or said something that was untrue here. We would like to be able to honestly say, yes, that was a mistake that we made, and we've corrected it since. We misread the information, or the scientific information changed, or there was some other rational reason for putting it out. But I think the vast, vast majority, you know, close to a 100% of the information the US puts out, is true. The problem is truth is often pretty boring. It is.
Speaker 3:And so you're not going to get a huge social media following reporting on a bunch of true things that are dull. So, obviously, you're looking for ways to spice it up. And you can spice up truth, you know, while obviously still maintaining truth. I'm not saying spice it up with facts, you know, that aren't facts. But, you know, you can try to pitch it in different ways and make it more interesting. But a lot of those ways are seen as kind of beneath the dignity of the federal government.
Speaker 3:So, you know, in a lot of ways, I have talked about, if you wanted the most effective social media engine for the United States government, you would, of course, form a social media force analogous to the Space Force, for example. And you'd recruit 25,000 people, 26 years old and younger, you know, 18 to 26 years old, who would do nothing all day but send out tweets, if that's what they're still called on X. No, I haven't figured that out yet. Yeah. And other social media posts. You know, you'd have a list of facts that they could put out, and they would just spice it up in young people ways.
Speaker 3:But
Speaker 2:It's the truth, though. It's still true. Very flamboyant truth.
Speaker 3:That's right. Flamboyant and with all the, you know, current lingo that you young folks use, and, you know, misspelled words and no punctuation and all that. But it's just sort of seen as, well, maybe the US government shouldn't be in that. And then, who knows, how do you supervise that? Would they get into back and forths with people, and how do you handle that? It's difficult.
Speaker 2:Very difficult. The British during World War 2 had a fascinating propaganda, I'll call it an operation, against the Germans, where they essentially had a guy who spoke fluent German go on the radio waves and speak to the German soldiers about what was actually, as far as British intelligence could assess, happening to German soldiers at the front line. And he would portray it in a very comical manner, but it would be the facts nonetheless, that they were retreating at Stalingrad and all that, and the troops would hear that primarily from this very odd BBC service. So something like that, for perhaps, let's say, if we get in a war with China or something, would be better than just shoving as much disinformation into the space as possible.
Speaker 3:I think so. Okay. I think so. And also, there's always gonna be a problem with where you draw the line with disinformation, and, you know, again, who's the target of the disinformation? It just doesn't seem like a good position for the government to put itself in.
Speaker 3:That's not to say there couldn't be some very specific covert operations done that I have no knowledge of, that would go on in some very specific circumstance. It just doesn't seem like something we'd want to be in the business of.
Speaker 2:Seems reasonable to me.
Speaker 1:And to me as well. So we're starting to run up against time here. So we've got a couple more questions for you.
Speaker 3:More to.
Speaker 1:Yes. Starting with: with your experience in cyber policy and education, what are the biggest misconceptions you encounter about cyber warfare and defense?
Speaker 3:It's a hard question. I mean, when I'm talking to students about taking the class, and I hope my tech friends won't beat me up about this, the number one thing that I feel is a misconception I have to disabuse them of is that you have to be some kind of a coder or technical expert to do cyber policy. Because that is decidedly not the case. You need to have the ability to understand things that are expressed to you in normal English terms, and that means not technical terms. Mhmm.
Speaker 3:Mhmm. And it also really, really helps to have an affinity for technology and that kind of thing. If you're a technophobe, you know, then, of course, this is probably not the career field for you. But you don't need to know how to write code or anything to do cyber policy because, basically, when you get to the strategic level, which is where most of cyber policy is done... well, okay. Most cyber policy is done at the strategic level because cyber is almost by definition strategic.
Speaker 3:Everything that touches the Internet suddenly becomes strategic because the Internet is available everywhere. There are tactical effects that we do that are considered to be cyber effects. They often are akin to electronic warfare effects. So you have local denials of cell service, for example, or taking down integrated air defense systems might be a cyber capability that's pretty local. Although for the United States, if we're taking down somebody's integrated air defense system, that's probably at the strategic level as well.
Speaker 3:But most of the things we do on the Internet are gonna be strategic. So if you're thinking about cyber policy, there are certainly local policies that are more what I would call cybersecurity policy, that look at the ways we keep local systems safe and in compliance with national law and NIST standards and that sort of thing. But cyber policy in the federal government and, for the most part, in other organizations is really kinda strategic, which means anybody operating at a strategic level, like doing, for instance, even strategic military planning, many of them have had some military experience, but it's not required. And they certainly don't need to know how to do everything. So you don't need to know how to fly a fighter to be able to figure out how jets will be integrated into a strategic plan.
Speaker 3:And maybe you are a fighter pilot, but you can also understand how the navy would play a role and the marine corps would play a role, for instance. So, yeah, you don't need to know how to code. I think this is a misperception a lot of people have, and they're frightened off the field. But I can say, one of my students from last year had a call for an interview. She had taken the 3 cyber classes we had last year at the Bush School, and she was interviewing as a policy analyst with a defense contractor. It was just policy analyst; it didn't mention cyber at all. And she had 3 questions in the interview, and 2 of them had cyber.
Speaker 3:So, I mean, it just is becoming integrated into the whole policy universe. So I guess, if I had to have 2 things, it would be: you don't need the technological background to do cyber policy, and whether or not you like it, it's gonna be there.
Speaker 2:It was definitely a misconception I had. Thank you for enlightening us on that. Got one more for you. It's down the list. Just to close out here.
Speaker 2:It's gonna be a bit of a dark note. But, what, in terms of just the cyber realm in general, especially as it pertains to defense, what what worries you? If there's anything, what keeps you up for just that extra 5 or 10 minutes in that going, I don't know if I don't know if we got a good handle on this or not.
Speaker 3:I mean, you limited it to the defense realm, which changed my answer.
Speaker 2:I you if you have another answer there, go ahead and just slam it out there.
Speaker 3:Because, yeah. I mean, the thing that honestly, I guess, worries me the most, although, being old, I'm less worried about it than you are, or probably should be, it really is artificial intelligence.
Speaker 2:Okay.
Speaker 3:Yeah. Because I think at some point in the not too distant future, we are gonna have to reexamine what exactly it is that humans do. With artificial intelligence, we're working really hard. We often talk in terms of creating an artificial intelligence that can think like a human. But that's not really the end goal, is it?
Speaker 3:The art and I'm not talking about some crazy science fiction person or whatever. I'm talking about the people that are building large language models, for instance, now, people who are looking at artificial intelligence now. We're trying to build an entity that will think better than humans and not just faster, but just better be a better person because we hear all the time about these reports about how, oh, we turn AI, a large language model, loose on the Internet. You know, we did this. We made this mistake.
Speaker 3:A couple large companies that I I won't mention. And and with good intentions, I think, made the mistake of turning it loose, on the Internet and learning from the Internet. Well, what what does an artificial intelligence learn if you turn it loose on the current Internet? It learns how to be an extremely aggressive obnoxious teenage boy Because that's who spends, who at least creates all the posts on the Internet and where they turn into, you know, just kind of racist, homophobic, foul mouthed robots, I guess, is what you'd say. Well, that was terrible.
Speaker 3:So what what do we do? We'll pull that off and then we're gonna reprogram. What we're gonna do is we're gonna program out prejudice. We're gonna do all we can to eliminate bias. We're gonna make it so that it doesn't have any of those things that are inherent kind of in human beings as much as we, you know, try to, educate ourselves out of them and learn how not to exercise those those those poor elements of our nature.
Speaker 3:But we're gonna breed out the bad parts of human nature and continue to build an intelligence that's better, stronger, and faster than we are. And, of course, you you've take if you take away human frailties, like, you know, you need to breathe and eat and stuff like that, just turn it all into electricity. At some point, when you first I guess you can envision a stage that's sort of like the the, the movie Wall E, where humans are just sort of tubs of goo sitting on motorized chairs serviced by robots powered by artificial intelligence. Or and then but what's the stage after that? And I think Wally was the, was the optimistic, way to do that, but I don't know.
Speaker 3:And the question is, if you have invented this perfectly objective, nondiscriminatory, nonprejudicial, moral being who thinks as far above us as we are above ants, or farther, why would we
Speaker 2:still be in charge?
Speaker 3:What could we possibly offer? You know? So...
Speaker 2:Excellent question.
Speaker 3:Yeah. I mean, maybe... obviously, there's more to it, and some people think in terms beyond the physical and the mental realm. So if there's a spiritual element to it, that's an entirely different conversation. But if there's not, I'm not sure. Yeah.
Speaker 3:I don't know.
Speaker 2:I do hope it would find us entertaining.
Speaker 3:Uh-huh. Yeah. Briefly, perhaps. Briefly. Yeah.
Speaker 3:But yeah. There's a I guess there's a lot more to say about that. And I don't wanna be too, too glum about it.
Speaker 2:No. It's a fascinating question. Thank you for entertaining it. And that is about as dark as I was going for it. So, you know, let's let's hope, yeah.
Speaker 2:It it always finds us, useful if we do reach that point.
Speaker 3:Well, as long as you can generate electricity for it, it should be good. Yes.
Speaker 2:Alrighty. Thank you very much for the time.
Speaker 1:Yes. We're very, very glad you could take the time out of your day to be with us. I'm Matthew, and this is Aiden. And on behalf of The Ready Room, thank you. Gig them, and God bless.