ChatNAPT with A.I. Chatterbots Chuck & Howie

In this third episode of ChatNAPT with A.I. Chatterbots Chuck & Howie, we’re joined by Kirk Foster, Metrology & Calibration Laboratory Manager at NASA. We dive into the complexities of risk management in metrology, exploring how to identify and mitigate operational risks. Kirk shares insights on ethics, precision, and continuous improvement in the field. Tune in for a fascinating conversation on the science behind accuracy and reliability.

What is ChatNAPT with A.I. Chatterbots Chuck & Howie?

In our podcast, we dive deep into metrology, calibration, and proficiency testing bringing you real stories, expert insights, and candid conversations from our 85+ years of combined experience. This isn’t just another technical podcast; we’re here to challenge the status quo, discuss industry changes, and tackle big questions like whether calibration labs are failing to train the next generation or if automation has gone too far. Expect lively discussions, industry leaders as guests, and a little fun along the way. As Howard puts it, “Proficiency testing is checking that transition from theory to application. But what happens when techs are just pushing buttons?” And Chuck adds, “We’re not teaching technicians how to measure anymore—we’re teaching them how to press ‘go.’” Whether we’re reflecting on our journeys—like Howard’s path from Air Force electronics to writing calibration procedures for the NFL—or debating metrology’s future, we promise to keep it engaging, informative, and unfiltered.

Chuck (00:18):
Greetings, welcome to ChatNAPT with A.I. Chatterbots Chuck and Howard, sponsored by the National Association for Proficiency Testing, a podcast about all things metrology. So with that being said, I'd like to welcome my co-host to the podcast. Howard, how you doing today?
Howard (00:37):
It is not such a great, beautiful day outside, but you know what? I'm inside working. That's fine.
Chuck (00:44):
It's going to be 71.
Howard (00:46):
Oh, shut up. As long as it clears up by the time I go outside.
Chuck (00:51):
Yeah, I put shorts on. I know it's wintertime right now where most of the country is at when we're recording this, but I actually have shorts on, and I'm hoping to get outside and try and take my walker and walk outside and get a little bit of exercise.
Howard (01:07):
So here's my rule of thumb for shorts: 45 degrees or higher outside, Fahrenheit, let's clarify that. Okay. Yeah, as long as it's 45 or higher, then I've got shorts on. I'm not worried about it.
Chuck (01:21):
Yeah, I'm not that cold. I mean, at probably 50, 55 I can wear shorts, but as we get older, it seems like I see a lot of people our age that prefer shorts moving south. I just got back from the NCSLI Tech Exchange and there were people wearing shorts at the conference. I'm not sure if I can really get on board with wearing shorts to a conference.
Howard (01:48):
I don't do that. No.
Chuck (01:49):
Yeah,
Howard (01:50):
Later in the day, maybe, at night when we're out for after-hours drinks or dinner or whatever, but not during the day.
Chuck (01:58):
Hey, we got a really good guest today.
Howard (02:00):
Yeah, yeah. Kirk Foster.
Chuck (02:03):
Yeah, from NASA. I'm excited about it. I got a feeling that you guys are going to talk a lot about risk today, which is going to be a great topic. Yes, we should,
Kirk Foster (02:11):
Yeah.
Chuck (02:12):
And we can talk about how NAPT is going to have some tools on its website that you guys are helping NAPT create to allow participants to easily do risk analysis, which they're required to do now under ISO/IEC 17025.
Howard (02:27):
So the evaluation tool that we'll look at is just top level, high level: do you have risk in your organization or not? That type of thing.
Chuck (02:35):
Right, exactly. So I probably should introduce Kirk. He's going to be joining us here in a minute or two, so let me introduce Kirk to people that don't know who he is. Kirk Foster from NASA started his career in metrology in the Army like me, so that's a positive thing. After he got done with the Army, he went to Stennis for the Space Station program, where he advanced to senior metrologist. While at Stennis, he became a section leader for the local NCSLI, and it should be noted that Kirk is a huge supporter of NCSLI as well. He currently works at Marshall Space Flight Center, managing the metrology and calibration lab. He's responsible for the contracts, responsible for the technicians. He's the guy, if you will, at Marshall. Kirk joined us in 2024 as a board director; he is responsible for government entities, that's his responsibility on the board of directors. Just a great, knowledgeable guy. I think that when I talk about people that know their shit, Howard, you're one of those people obviously, but Kirk is right up there with you.
Howard (03:48):
He is absolutely up there. We recently spoke with Paul Reese, another brilliant person that we have. I call him a friend, I love having him as a friend, but as a colleague, he has a lot of knowledge. Kirk is the same thing. He's just got a lot of detailed knowledge in metrology.
Chuck (04:10):
And I've been lucky enough to become friends with Kirk as well, which is one of the great things in this world. We're such a small community that we actually get lucky enough to become friends with people, and I've been lucky enough to call Kirk my friend, and
Howard (04:25):
There he is now,
Chuck (04:25):
Behold, there he joins us. Welcome. Excellent timing, Mr. Foster.
Kirk Foster (04:28):
Good morning, gentlemen.
Chuck (04:31):
Welcome to ChatNAPT with A.I. Chatterbots Chuck and Howie, and we're going to start chatting. That was great timing. So how are you doing?
Kirk Foster (04:40):
I'm doing fine this morning. How are you guys? Everybody good on weather? I know it's kind of wet here.
Chuck (04:47):
Yeah, we were actually just talking about that, Kirk. It's going to be 70 where I'm at today.
Howard (04:52):
Wow. Must be nice.
Chuck (04:56):
Yeah,
Howard (04:57):
We're both freezing. The ice and snow mostly has melted; I'm happy about that. We had some rain last night, so that got rid of a lot of it.
Chuck (05:06):
So Kirk, I just got done giving your introduction, and we talked about your role at Marshall, how you are the man. We talked about some of the expertise that you have, and how excited we are that you're on the NAPT board of directors, and your role there. So one thing that I want to start out with is: what brought you to the NAPT board of directors? Were you tied up and forced to commit to the NAPT board, or how did they
Kirk Foster (05:39):
It was actually very much an honor that you invited me to be part of the board. I became aware of NAPT a long time ago. Within our labs individually within NASA, we've always tried to do some kind of laboratory comparisons or internal testing if nothing else, and NAPT was available. So within NASA, we were looking for a high-quality, reasonable-cost option to formalize our PT processes, and NAPT was there. We made the contacts, everything started going well, and about the same time or a little bit after, we went ahead and got our ISO/IEC 17025 accreditation as well. So the PTs worked with that, and then a couple of other ones as well. That's how I heard about NAPT. But over the years I've gotten to know you specifically and others in NAPT, and the relationship and the desire to do it right, to be the best quality, to do the right thing without any bias or anything like that, that impressed me the most about NAPT as an organization. And I would say ethics are extremely important to me within my field. I make sure I do not introduce my bias or take any advantage of the situation that I have, and that who we buy from and who we do work for is strictly a hundred percent ethics. That's what I saw.
Howard (07:21):
I agree. How else do you reach truth in measurement without making sure that you're avoiding bias at all costs?
Kirk Foster (07:32):
Right, and bias here, we're not talking about measurement bias, we're talking about performance bias,
Howard (07:37):
Correct.
Kirk Foster (07:38):
Yeah,
Howard (07:38):
Exactly. Good points.
Chuck (07:40):
Well, NAPT loves being partners with the NASA groups. Just as a sidebar, a quick selling point: we do provide services to all the NASA facilities. We kind of have, I don't want to say a partnership, but we really enjoy the relationship that NAPT has with NASA, and it's been, I think, going on eight or nine years now since we've been doing work with you guys. So we love it.
Kirk Foster (08:07):
And as an organization, for anybody that's part of a larger organization, maybe you're just one lab out of 10 within a company or something, we found it helped bring our processes and our issues together. I mean, we didn't treat it as "you failed." It was more like, okay, can you do better? What can we do differently? And so that's where it's helped drive improvement, self-improvement really, even though it was across the whole organization.
Chuck (08:40):
And we're very, very happy to be able to be that organization that you guys choose. So with that being said, one of the topics that I really want to discuss is that both of you guys are risk experts, and that seems to be the flavor of the month. We did uncertainty for 20 years, and we've kind of got a basic understanding of risk. I know we've got a new component where we're going to talk about risk components, pass/fail, that kind of thing, and whether or not a pass is really a good pass, but that's not what we want to talk about right now. I'd really like to get both of you guys' opinion on risk, and if you could, maybe talk about where NAPT is going to go, some things that you guys are helping NAPT do with risk analysis. That is something I think our audience would really like to know more about.
Kirk Foster (09:28):
Okay. Thanks. Go ahead.
Howard (09:30):
If you want to talk about the abstract that we have for the paper that we're working on, just a little bit about the highlights there.
Kirk Foster (09:38):
So when you start talking risk around metrology people, they all go immediately to measurement decision risk, and that is a component of it, but there are many more risks involved in taking a measurement than just that, and you need to characterize those, or understand how to rank them in the order they need to be addressed. You need to think about those. If you look at, let's stay out of ISO/IEC 17025 right now, let's look at just general quality documents. The term risk has now entered AS9100 and ISO 9001. The term risk was not meant to be about our measurement decision risk; it was meant to be about operational risk, all the risks involved in your business. And within measurement there are other things, like for instance, how easy are your procedures to modify, and do you have revision control? Do you have things like that? Those are all important risks to the measurement.
(10:46):
Are you checking for bias from the suppliers of your calibration standards? By what method? That's a risk there. And so really, I'm not going to get into the measurement decision risk part, because there are three million people out there all racing to try to deal with that. I'm saying let's back up and see the forest for the trees. We're really supposed to be looking at all the components that impact our risk, from calibration training, new employee training, awareness, security, redundancy, automation. These are all things that you need to look at as a risk component.
Howard (11:28):
In many of the papers I've written and put out there, again, I'm not focused on just measurement decision risk, but also on what I call quality blind spots: the things that you don't think about, or may not have given much thought to, that can cause unexpected or unacceptable results. You mentioned training, for example, or document control. If you don't have something purposeful in place to make sure that risk can't be introduced through the changes that are being made, then something's going to happen that you're not predicting, and it could end in a result that you cannot accept and may not be aware of. That's why I call it a quality blind spot, right? Trying to prevent those is really very important as part of the overall service you're providing, the work that you're doing, not just the measurements themselves.
Kirk Foster (12:24):
We're more than just a sticker, and that's a common thing we've heard. So within our field, like I said, most people, when they see the term risk, go straight to that measurement decision risk, which is important, don't get me wrong, but there's a lot we do every day with risk management, and it's something that we intuitively do every day, even in our personal lives. When you leave for work, what time do you leave? Will you leave in time to be on time for work? But now what happens if you have a flat tire? How often does that happen? That's kind of what risk management is about. You've made allowances or mitigations for potential risks based on their likelihood and severity, and that's what the paper is really going to get down to: take the appropriate action. I'm sorry folks, I didn't
Howard (13:16):
Yeah, it's about thinking ahead, right? In every aspect of your life, not just work, not just measurements, but you're planning for the day, just thinking ahead of potential for things that could go wrong and trying to mitigate those to make sure they don't go wrong.
Kirk Foster (13:30):
And we even do that intuitively. If you look inside your laboratory or your organization, you're doing that. And ISO, AS9100, all the risk mitigation standards, all they're saying is write it down, document it, track it, and then adjust, which we've done in our laboratories or in our lives automatically. From a quality standpoint, they just want you to document it and start that: say it, do it, prove it. It's the old ISO triangle there. It's a constant evolution, and it's the documentation point that kind of sticks everybody with "how do I do that?" Right.
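To make that concrete, here is a minimal sketch of the kind of documented risk register the "write it down, track it, and adjust" idea points at, written in Python. The field names, the 1-to-5 scales, and the example entries are illustrative assumptions, not an actual NASA or NAPT format.

```python
# Minimal risk register sketch: document each risk, its likelihood and
# severity, and the planned mitigation, then review on a schedule and adjust.
# Scales and entries below are illustrative only.
from dataclasses import dataclass
from datetime import date


@dataclass
class Risk:
    description: str
    likelihood: int      # 1 (rare) to 5 (frequent) -- assumed scale
    severity: int        # 1 (negligible) to 5 (catastrophic) -- assumed scale
    mitigation: str
    next_review: date


register = [
    Risk("Calibration procedure edited without revision control",
         likelihood=3, severity=4,
         mitigation="Document control with mandatory peer review",
         next_review=date(2025, 9, 1)),
    Risk("Unchecked bias in a supplier's reference standard",
         likelihood=2, severity=5,
         mitigation="Intercomparison or PT check on incoming standards",
         next_review=date(2025, 12, 1)),
]

# "Track it and then adjust": walk the register highest-ranked risk first.
for r in sorted(register, key=lambda x: x.likelihood * x.severity, reverse=True):
    rank = r.likelihood * r.severity
    print(f"{rank:>2}  {r.description} -> {r.mitigation} (review by {r.next_review})")
```

The point of the sketch is only that the documentation step is small: a handful of fields per risk, reviewed on a schedule, is enough to satisfy "say it, do it, prove it."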
Chuck (14:16):
Well, one thing that I've challenged both of you guys on with your paper, and one thing that I'm afraid of: we're requiring people to perform a risk analysis who don't first of all understand what it is to do a full risk analysis, and now we have to, like you said, document it. So I'm hoping that the paper you guys are writing, and the tools that you're going to help provide to the community from it, give enough guidance, detailed guidance, that not only can a person perform risk analysis for proficiency testing, but it becomes a tool for the assessor. Because quite honestly, I think the assessors are struggling to understand what risk is, because they're not getting enough support from the accreditation body. The accreditation body, in my opinion, has not defined risk to a degree where it can train the assessor on what to look for. That's a component that I think is going to cause a lot of problems. And if we don't get this education, i.e. the paper that you guys are writing, out to the community at large, it's going to continue to be a struggle, like we had for 20 years with uncertainty.
Howard (15:22):
Yeah. The intent of the paper is threefold regarding risk: to identify it, to quantify it, and to mitigate it. Finding ways to do that, at least making people aware that they need to identify those risk areas, quantify them, and then find steps to mitigate them so the risk is minimized.
Kirk Foster (15:42):
And on a finer point, I kind of diverge from you a little bit there. We can give examples, we should be giving examples, or different people should be sharing how they do it. Personally, I don't like when things get too prescriptive; we don't really grow. If we legislate step by step how you're supposed to do something, you don't get to learn new things. You're going to execute somebody else's plan and never really learn for yourself. And that's why I like to say give 'em ideas, give 'em a broad-brush approach, give 'em some examples of how you might possibly do it,
(16:21):
But really let them grow themselves. I mean, we're trying to get it started, and it just grows and new things are learned. I learn new things all the time. Chris Gratin had just created a paper about risk, and he had something in there I thought was kind of cool, something I hadn't thought about before. And I'm sure Howard's got some that I'm not aware of as well; in his career I'm sure he's used different risk management techniques and drawings and who knows what. I just saw one recently that was posted on LinkedIn, another method, bow tie I think it was called, which I didn't have time to delve into a lot, but that's what I like to see, the development. Don't be too prescriptive in your standards or methods of how you have to do something. Just get the creative juices flowing, so to speak, and see where other people take it.
Chuck (17:18):
We should post that paper you mentioned, the one Chris just wrote; it's available on the NAPT website as well. We made it available on the website, proficiency.org, another free plug for NAPT.
Kirk Foster (17:31):
Oh, absolutely. So it is great when we write papers, we do all this stuff, but if no one ever reads 'em, what good is it? Right?
Chuck (17:39):
Yeah, definitely.
Kirk Foster (17:41):
So we got to get it published out. You got to get 'em out there,
Chuck (17:43):
And we try and do that. Well, great. So Howard, you got this start of a tool that you and I kind of worked on a little bit and
Howard (17:55):
I mean it's just a draft, so I could share that just to give a quick glance at it, but it's in development.
Chuck (18:02):
The goal of this tool might conflict with what Kirk is saying, because what NAPT would like to do is provide this tool on the customer portal to members, so they have a basic foundation to do risk analysis, and maybe it's going to become too prescriptive, and that wouldn't make anyone happy. Yeah,
Howard (18:22):
It's focused more on identifying your risk of providing poor measurements without knowing it. What are the things you can control that will help with that? It's going to be documentation and training and things like that, but also: how are you doing with your proficiency testing? Do you have those all in place to cover your scope of accreditation? What's your passing rate on those? There are a number of factors that we'll have in there, and we'll make a tool, available on NAPT's website, that shows these factors. You give yourself a rating from one to 10 on each; it'll kind of help guide you in figuring out what the rating is based on what you know about your organization, and it'll come up with an overall score that says, here's the assessment of what we think your risk is based on the answers to these questions. And then that'll help you identify what the next steps are
Kirk Foster (19:13):
To cover for Howard here, I don't believe it's prescriptive. He's not being prescriptive by providing that tool. It's like, here's an option that you can use if you don't know where to start. Hey, try this tool, see if it works for you.
Howard (19:24):
Exactly.
Kirk Foster (19:25):
It's just like uncertainty budget calculators over the years: there were a few that went out there, then there were many more over the years. There are probably hundreds of Excel spreadsheets saying, here's an example of how you can do it. And that's what I mean by prescriptive: I'm talking about regulatory prescription or standard prescription. There are times to be very prescriptive about requirements, but what the standards say now is "you will understand and document your risks." That's what they say. They don't tell you how to do it, because what works for me is not going to work well for Howard. But to get people started, people like us have to figure out how to convey our knowledge and our experience with doing risk analysis, and get that out there. Get the ball rolling, right? Exactly.
Chuck (20:16):
Yeah. Well, the biggest concern that I have right now is that I'm seeing a lot of organizations saying they don't have risk. Well, we can all argue that there is no such thing as not having risk. There's always risk.
Howard (20:28):
How did you determine that? That's the question.
Chuck (20:31):
Well, that's a good point.
Howard (20:32):
Don't ask, don't tell, right?
Chuck (20:34):
Well, but the CABs, if they don't understand the basic fundamentals of what a risk analysis is, they can't perform one. So that's where we need to provide that. And I'm really looking forward to your paper, because I think it'll become almost a standard for even the assessor to use, maybe a guidance document for those people within the ABs when they adhere to the P9 document. These are all hugely important things. So with that, Howard, do you want to share a real brief look at what you're working on?
Howard (21:05):
Here's a quick look at it. It's based on success factors, a format that's used pretty widely in business. You could give it weighting factors; we decided to just weight everything evenly, but you could play with that if you wanted to. Then here's where you give it the score, and here's some guidance defining what each of these success factors is. We'll throw in some identification of what a higher risk in that category looks like, so they can identify, in their organization, where do I fall on one to 10 here? Give it the ratings, it comes up with a score, you get an overall rating, and then the tool will come up and say, okay, if you're in this range, like one to three or one to four, you have pretty high risk and here are the next steps you should be taking. If you're in that moderate range of risk, you definitely still need to get things under control, but it looks like you've got some things under control already. And if you're at a higher score, things are probably pretty well in check. So again, it's just a gauging tool to figure out where am I today if I've never done this, and point you in the right direction,
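To show the shape of what Howard is describing, here is a minimal sketch of an evenly weighted, rate-each-factor-from-1-to-10 self-assessment in Python. The factor names, the equal weighting, and the score bands are assumptions for illustration; the actual NAPT tool may use different factors and thresholds.

```python
# Evenly weighted success-factor self-assessment sketch (illustrative only).
# Rate each factor 1 (poor) to 10 (strong); the average gives an overall score.

FACTORS = [
    "Document control",
    "Technician training and competence",
    "PT coverage of the scope of accreditation",
    "PT passing rate",
    "Traceability of standards",
]


def overall_score(ratings: dict) -> float:
    """Average the 1-10 ratings; every factor is weighted equally."""
    if set(ratings) != set(FACTORS):
        raise ValueError("Rate every factor exactly once")
    if not all(1 <= v <= 10 for v in ratings.values()):
        raise ValueError("Ratings must be between 1 and 10")
    return sum(ratings.values()) / len(ratings)


def interpretation(score: float) -> str:
    # Hypothetical bands: low overall scores flag higher risk and next steps.
    if score <= 4:
        return "High risk: start with documentation, training, and PT coverage."
    if score <= 7:
        return "Moderate risk: some controls in place, keep closing the gaps."
    return "Lower risk: controls look largely in check; keep reviewing."


ratings = {f: 6 for f in FACTORS}          # example self-ratings
score = overall_score(ratings)
print(f"Overall score: {score:.1f} -> {interpretation(score)}")
```

Weighting everything evenly keeps the tool simple for a first pass; a lab that knows one factor dominates its risk could swap the plain average for a weighted one without changing the rest of the flow.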
Chuck (22:09):
I think this would be a great tool if we can automate this on the customer portal. That's something that we'll have to challenge the programmers with to give them a tool where they can take those 10 criteria, those basic criteria, and then see if they comply or not comply, and then what they should do if they don't comply.
Kirk Foster (22:28):
Yeah, that's a great example, another way or method to do it. I tend to use the matrix process only because that's what NASA uses. I've simplified it a little bit for the lab side here, but the other part I think is good to document is not only do you have a risk, but what have you done to mitigate that risk? Why is the risk to your business or product low? You must have done something, right? For instance, let's go back to the tire analogy. If you have a bald tire, the likelihood of having a flat just went up, so therefore your risk just went up. So if your risk goes up, what are you going to do to mitigate it? Well, your mitigation is: I'm going to have my tires changed, or get new tires, when I get to this tread depth, right? You've specified what you're going to do to mitigate that risk of having a bald tire. And I know that's a very basic scenario there.
Howard (23:32):
It is, but there are other factors that come with that. So if I'm stretching my budget and I don't have the money to replace those tires, I'm trying to run 'em as long as I can. Now you're risking not only a flat, but possibly injuries to yourself or others, hitting another car or a building, financial factors that come into play, fatalities. How much risk do you want to take, right?
Kirk Foster (23:57):
Well, exactly. So that's why almost every risk analysis process includes two major components: the risk of it occurring, the likelihood, and then the impact from a failure. Those are almost always the two components considered in a risk analysis. I can't imagine not having those two components, because, like you said, maybe I don't often have bald tires, but the impact or severity if I did have a flat or a blowout from that, well, that would be very large. It could be millions of dollars, or it could cost somebody their life. That would be a severe impact. So yeah, you always have to have those two components. That's a good point, Howard.
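As a simple illustration of those two components, likelihood of occurrence and impact of a failure, here is a small likelihood-by-severity classification in Python, with the bald-tire example worked through. The 1-to-5 scales and the band thresholds are assumptions for illustration, not NASA's actual matrix.

```python
# Likelihood x severity classification sketch (illustrative scales and bands).

def classify(likelihood: int, severity: int) -> str:
    """Map a (likelihood, severity) pair, each rated 1-5, to a risk band."""
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("Ratings must be between 1 and 5")
    product = likelihood * severity
    if product >= 15:
        return "high"
    if product >= 6:
        return "medium"
    return "low"


# Bald-tire example: the mitigation (replace tires at a set tread depth)
# lowers the likelihood, which lowers the overall risk; severity is unchanged.
print(classify(likelihood=4, severity=5))   # worn tires, no action -> "high"
print(classify(likelihood=1, severity=5))   # tires replaced on schedule -> "low"
```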
Chuck (24:40):
Well, I'm excited about your paper. So the question I guess I would ask, is this going to be a book or is it going to be a paper? Because clearly there's enough information where you guys could write a book.
Howard (24:51):
Yeah, papers should not be books, that's my opinion. They need to be small enough that people will read them, right? Capture the reader's attention, not turn them off. It's not going to be a book.
Kirk Foster (25:00):
Yeah. Number one, I don't have the time to write a book myself. So I think this is a paper with examples and a couple of general principles, just to get the juices flowing, just to get people thinking.
Kirk Foster (25:16):
And like I said, I see too many people focusing, when they read the term risk in the standards, on measurement decision risk. As metrologists or technicians, that's kind of where our mind is always thinking. But we're talking about risk in general, any operational risk, anything that can impact your calibration. And my list of risks that I have written down, I don't have all of them in it for our own lab. There are things that have never happened. I'll give an example: what would happen if a tornado takes out the lab? I don't have that down in the risk list. The likelihood is extremely low and the impact would be severe, but that doesn't mean I don't already have things in place. All our data, all our stuff, is backed up to a different location on a regular basis, so there would be some recovery effort, stuff like that. But I don't have that written down.
Howard (26:13):
Your risk plan for that really shouldn't be that specific or prescribed, right? The risk related to a shutdown is really: what happens if there's an event of any kind, a power outage, tornado, hurricane, whatever, that causes me to be unable to provide the services that are critical to the operation, and what do I do about that?
Kirk Foster (26:38):
And so do you have a policy on how you handle power outages, or again, depending on how long they are? Do you have UPS backup, generator backup? I mean, you can cover all those things in a policy if you want to. Again, it's about understanding your risks based on likelihood too; likelihood and severity are the most important.
Chuck (26:58):
So you guys are going to have this paper done in time? Because I understand that you might be giving a paper at the NCSLI conference in July on this.
Kirk Foster (27:08):
That's the plan
Chuck (27:09):
And obviously the paper will be written ahead of that time, and it'll be a great paper, I'm sure. I am excited as heck to not only get it, but to be able to share it with participants of NAPT who are struggling with doing risk analysis, and it's just a document that can be shared with assessors and whoever to get them to that starting point. Like you guys said, give 'em a couple of examples. I think it's going to be a great paper and I can't wait. So I'm looking forward to the presentation at NCSLI.
Howard (27:40):
So this NCSLI is in Cleveland. We've been there once before, a number of years ago, Chuck. That happens to be the weekend of my golf outing.
Chuck (27:51):
Oh no.
Howard (27:52):
Yeah, so that's when it starts. So I probably won't be there for the Saturday Sunday because of that. I'll probably show up on Monday.
Chuck (28:00):
Okay. Well, you can show up late Sunday night.
Howard (28:03):
Yeah,
Chuck (28:04):
Yeah, yeah. Should be. Okay. Alright.
Howard (28:06):
Alright. So let's switch gears here a little bit if you don't mind. And we want to go to some question and answer periods. So before I start with my questions, actually just looking at you Kirk, I'm interested in that artwork above your whiteboard. What is that all about?
Kirk Foster (28:20):
Well, the story behind that is that we had a mentoring process, so myself and a couple of other managers were mentoring a potential new manager in her growth. And part of that mentorship was to go out and do an activity after work of some kind, right? And another one of our managers had suggested that we go down to one of these places where you can drink wine and paint.
Kirk Foster (28:46):
Oh yeah, exactly. If you know anything about me, that would be like a heck no.
Howard (28:54):
Okay. Chuck and I, that'd be a thumbs up.
Kirk Foster (28:54):
So what we came up with instead: there's a place here in Huntsville called Bullets and Barrels, where you can have a beer after you shoot, right? They have an indoor range and stuff like that. I know we swapped it around, but I tell you what, we took her to it and she had a blast. I mean, she had never shot some of the things we shot there, and she just absolutely had a blast. So to pick on me a little bit, she went ahead and went to that art-and-wine type event and she painted me a rocket. That's awesome, a picture of a rocket. I love it. Thanks. Yeah,
Howard (29:34):
I wouldn't have called it Bullets and Barrels. I would've called it Shots and Shots, I think.
Kirk Foster (29:38):
There you go. Hey, we got a new business to open
Howard (29:41):
In Huntsville now,
Chuck (29:41):
Right? Yeah, exactly. I like it.
Howard (29:45):
Alright, Chuck, I'm sure you've got a question as well.
Chuck (29:49):
Yeah, I got a really big, important question. What's next? I know that you've got a long time before you retire, Kirk. So the question that I'm curious about is: do you have any more plans? Do you have any more goals? Do you have anything else that you want to achieve? Are you comfortable with where you're at? Do you want to become the president of NCSLI, for example? Do you want to become the president of the NAPT board? Is there anything you want to share with the audience about what you want to become, further than being the man at Marshall?
Kirk Foster (30:19):
So I can tell you right now, I don't want to get more than a half a step away from putting my hands on equipment and metrology problems, right? Helping customers with their issues and stuff like that. So I personally have no desire to leave the lab or not be very far from one, let's put it that way.
(30:38):
I like testing problems, I like area issues or failures or whatever it may be; I'd like to be involved in that. I'm actually quite happy where I'm at. I wear multiple hats on this existing contract as well, deputy general manager at this point. So that's enough challenge, plus my role with NAPT, which does take a little bit of time, and I'm glad to do it. I think it's great; I wish everybody could understand just how important PTs are in their life. Organizationally, we have the metrology and calibration working group within NASA that has grown quite a bit during my career from where we started, and I'd like to see it grow even more. It's a great venue to test things, to try things, to educate, to assist people. At the end of the day, my goal is to make the measurements, or the way customers take measurements, better wherever I can. So that's where my goals are. I can say that this field is not something you ever really stop growing in, right? It's not like, well, I got there, I can relax and take it easy. No, the evolution of automation, I mean, I can't wait to see where AI comes in
(31:59):
And the new hassles and controls and stuff. We talk about quality a lot of the time: how are you going to quality-check an AI thing? How am I going to validate an AI if I don't know what it used to make its decision, right? Can you theoretically accept a qualification or validation of your software because the AI looked at it and said it was good?
Howard (32:22):
Maybe the focus there is what was the goal and did it achieve the goal? How else do you do it?
Kirk Foster (32:27):
Well, that's kind of what we do now, if you have a good system. Again, that's another risk: how well you manage your software and controls and stuff like that. But yeah, for me, the field never ends. There's always new stuff. And I will say it gets more complex the lower our uncertainties get and the better the equipment that we're calibrating gets, right? I mean, yeah, more of a
Howard (32:57):
Challenge.
Kirk Foster (32:58):
It's way more of a challenge and there's more, well, let's go back to risk. There's more risk involved in saying it's good or bad now. Correct.
Chuck (33:06):
Real briefly, you just brought up something about AI. We were at an event and they had this trivia contest, and it basically boiled down to how quickly a person could Google the answer. What we found was that several people were using not only different AI engines, but, for example, one of the questions that came up was, what is the equation that defines the volt? And believe it or not, one AI that was used gave the wrong answer. Oh wow.
Kirk Foster (33:35):
Absolutely.
Chuck (33:36):
And so when we talk about risk, you hit it right on the head, Kirk: we can't trust AI, because of that old cliché, garbage in, garbage out. Yeah.
Kirk Foster (33:48):
That's developing. This kind of goes to my thoughts on my history and experience with automation in general. If you do it right, you've got people that are highly experienced writing these procedures and processes, and they check and they validate and they do it manually and they compare, they do all that to validate it, and then it gets handed down to somebody else that is not nearly as experienced with that measurement or with that process. And then over time, maybe that guy retires or moves somewhere else. So now you've got an automated program that is being given to somebody else, and who's to say it's still good if something changes, something happens? So a lot of our procedures and stuff are based on the level of competence of the person that wrote them, and we don't think about that. We've handed them over to somebody that doesn't have anywhere near that level of competence, and they'll never get that level of competence, because the procedure tells him to do this, and he does that, or she does that. And what I'm worried about with AI is that it's going to be even worse, because, like you said, how do you validate the answers unless you know them yourself?
Chuck (35:02):
Exactly. Yeah.
Howard (35:03):
Yeah. That requires recurrent review. If you're going to have people that are running those automated processes or machines that are doing that, you've got to constantly evaluate that on a regular periodic interval to make sure that there hasn't been something in that process that needed to change to keep up with whatever else you're trying to do.
Chuck (35:22):
Yeah, I agree.
Kirk Foster (35:24):
I'm okay with AI recommending a restaurant based on, hey, you like this, this, and this, so here's a restaurant suggestion. But
Howard (35:31):
That's a pretty quick validation, right? What the heck are you telling me to go here for? This is not anything I would want. Right, exactly. All right. My next question is a standard question we ask all of our guests, and I call it the four-M moment: what is your most memorable metrology moment? Give yourself a second to think through that. You've got a lot of years to think through.
Kirk Foster (35:52):
Yeah, that's a tough one. Within NASA, through 36 years, there have been a lot of things coming our way to figure out how to do. I would say, and I've had ideas and help, it wasn't just me that got it all put together, but we've developed several new processes, almost new instruments, new methods to do calibrations that improved repeatability and uncertainty and everything else. We built the machines ourselves.
Howard (36:20):
That's a sense of pride.
Kirk Foster (36:23):
When it comes together and everything works. That's very rewarding to me.
Howard (36:28):
It is. Yeah, absolutely. I can feel that. That's awesome. Alright, Chuck, I'll pass it back to you.
Chuck (36:35):
Well, the question that I'll ask is: you're very much involved with NASA, and you give your support to NCSLI. What do you do in your private life that you enjoy? I know that you have this farm that you work. Tell us about your private life a little bit, where you go to get away from metrology. I used to play golf to get away from metrology. Howard has his music. What do you do, Kirk, to get away from metrology?
Kirk Foster (37:07):
Honestly, I don't get away from metrology. Even when I'm doing the things I like to do, you can't take it out of my blood. It's going to be something I consider or joke about. But in general, yeah, I do have a small farm and I enjoy taking care of that; that's a good stress reliever, I guess you could call it. I do some hunting, I do some fishing. There's nothing better than being out on the ocean and catching some fish with some good friends and whatnot. But it could literally be a drive. NCSLI is work in a regard; like I say, going to the conference, that's work. But for me, the drive there and the drive home are the relaxing part, so that's kind of like a mini vacation even as I'm going to a work
Chuck (37:51):
Event. You don't like to fly? You don't fly.
Kirk Foster (37:54):
I don't like to fly because, without being mean about it, they don't treat you well. You're just treated like cattle, and you don't really get to see what America has to offer. You travel from one city to another, fly over, and you can't even see the ground. There is so much in between right here in America that I think a lot of people are missing because they're jumping on an airplane.
Chuck (38:20):
Yeah,
Howard (38:21):
Great point. I see your point. Yeah, so my rule of thumb, since I travel quite a bit for work, is five hours, six hours maybe. If it's a longer drive than five or six hours, I'll fly; shorter than that, it makes sense to drive. And you're right, there's a lot that you can see. You can stop when you want. It's a little more
Kirk Foster (38:42):
Convenient. It depends. Some people don't like driving and they like city life. I like to see open scenery and meet people from Idaho from driving through there. That's what I like to see, instead of everyone at an airport all mad, stressed out, upset, their flight's delayed. So I just prefer to drive. That's all.
Chuck (39:08):
Right. So Howard, I think we've got time for one or two more questions. Do you have any other questions that you have for our guest?
Howard (39:15):
Yeah, you kind of went into the whole hobby thing. What do you do with your time outside of work? What about travel destinations? Have you taken any great vacations that are memorable?
Kirk Foster (39:27):
Well, the Alaska cruise, by far. That was a harder one to drive within a timeframe, and by the way, I had to fly on that one because there just wasn't enough time to drive. But it actually is in the forecast to do Alaska again by driving the Alcan, and
Chuck (39:44):
Oh wow.
Kirk Foster (39:46):
That is a goal. Key West will probably be the next possibility. I haven't been down the Keys, so I'm probably going to drive down the Keys. It's going to take a couple of days, or half a week, depending. It's opportunity, right? I'm trying to work it in, moving it in with a lot of my... I don't take a long vacation. Usually I take a couple of days to drive somewhere, or three days if I had an extra day. We were out in California for one of the events and I had an extra day, so I went up and saw Yosemite. Of course, half of it was closed because of snow, but it still was gorgeous.
Howard (40:23):
Yeah, I think it was 2017, when we lived in California just before we moved back to the Midwest, we went up to Yosemite just after a lot of the fires had occurred, and so parts of the park were closed off. But I tell you, we got out of the car and looked up and saw Half Dome. Unbelievable view.
Kirk Foster (40:43):
There wasn't a direction you could look where there wasn't something amazing, right? And I'm a John Muir fan. I listen to his books almost every night, and it was nice to put eyes on some of the descriptions that he had in his books. Beautiful places. So I haven't been to Maine yet. I have driven or been through almost every state; of course, I haven't been to Hawaii yet. So I have, Washington, I'm sorry, Oregon. I have Oregon and Maine and North Dakota as states that I have not hit, plus Hawaii.
Chuck (41:26):
Well, I'm lucky, I've slept in every state except Alaska. And for me, now I've got to get to Alaska, even if it's just for one night, so I can say that cliché thing, that I've slept in every state.
Kirk Foster (41:40):
Stop being cheap and take your wife on a cruise.
Chuck (41:43):
She won't do cruises. She hates cruises.
Kirk Foster (41:46):
She can't hate this cruise.
Chuck (41:49):
When you see her at the barn, at the barn party that we're going to have here in August, if you can convince her... I would love to do that cruise you've talked about. I would absolutely love to do it. If you can talk her into it, then I owe you big time.
Kirk Foster (42:05):
Okay, I'll do my best.
Howard (42:06):
So, speaking of Alaska, you mentioned you liked hunting and fishing, and I imagine in Alaska that's part of what you do out there,
Kirk Foster (42:13):
Right? Actually, I didn't get a chance to do any of that while I was up there. The cruise time and the schedules allowed certain things at certain times, and we were pretty busy, so I didn't get a chance to get off on a fishing trip.
Howard (42:27):
We had a group of people at Transcat that used to get together and go up to Alaska for fishing. Right, mainly salmon. And boy, they'd bring back quite a bit.
Kirk Foster (42:37):
We scheduled ours later in the season, after the last season for, I believe it was red salmon. Anyway, the season had just ended for salmon. So I still could have gone for halibut and stuff like that, but there wasn't the time; a halibut trip is longer than the port stay we had. Alright,
Howard (42:58):
So what kind of fish do you catch around where you are?
Kirk Foster (43:02):
Well, fishing here in Tennessee and Alabama is mainly on the Tennessee River, so it's bass, stripers, hybrid stripers, crappie, stuff like that. Offshore fishing, though, it's speckled trout, redfish, cobia, tuna. Different trips will get different kinds of fish, depending on what you want to fish for.
Chuck (43:30):
I think it's fair to say that you miss your offshore fishing. I think that's fair to say.
Kirk Foster (43:34):
Absolutely. If I had enough money to retire and buy the boat and the location down there, that's where I would probably be retired too. Yeah,
Howard (43:42):
Well, or you could charter a boat for others and that'll pay for itself.
Kirk Foster (43:46):
Yeah, charters work good too, don't get me wrong. They're a good deal. I just like to jump in the boat and go out when you want to, and it doesn't take long to learn the right things.
Chuck (43:57):
We got time for one more question, Howard? We're kind of running
Howard (44:00):
Or not. What type of game do you hunt for? Are you a bird person? Or are you more of a
Kirk Foster (44:05):
I've done it all in my life. Where I'm at now, all of a sudden I'm getting older, I have a really nice deer stand on my own property, and so lately it's that. Well, that may not really be hunting, but it's relaxing, and I don't care if I shoot anything or not. I know that sounds cliché, but I watch owls get their mouse, I watch bobcats kind of slink here and there, and all sorts of weird stuff. You get to watch all the wildlife and it's relaxing. And then I put a few in the freezer, and I enjoy venison.
Howard (44:45):
Any boar hunting or bear hunting?
Kirk Foster (44:47):
No bear and no boar hunting. I was in Mississippi and I had opportunities to do it there, but I was actually working longer hours there than I am here. So
Howard (45:00):
Speaking of birds, have you ever seen, they've done some studies on this, let me share this: you ever see birds that fly in a V pattern, and sometimes it's like this, longer on one side? They've done some studies on this. Chuck, do you know why that happens?
Chuck (45:20):
I do not.
Howard (45:21):
It's because there's more birds on that side.
Chuck (45:26):
Gotcha.
Howard (45:28):
I've got to tell you, the person that told me that, who actually stung me on that one the first time, was back when I worked at Martin Marietta, around the time Martin Marietta became Lockheed Martin. So this is just going back to that thought. Martin Marietta, the Sand Lake Road division, was missiles and defense systems, and they made the Hellfire missiles for the AH-64 Apache attack helicopter. They also made, on the front end of this thing, the TADS/PNVS system, the target acquisition designation sight, and I'll explain that in a second, pilot night vision system. It uses a visible camera and a forward-looking infrared that's tied to the pilot's helmet, and they've got a piece that comes down over their eyes so they can look in any type of inclement weather, target what they're looking for, zone in on it, and send the Hellfire missiles to it.
(46:24):
So here I am up in the left corner of this picture, you can't see me, but I had worked there less than a year when they took this picture of the 300th TADS unit that they had delivered. And there was a gentleman I worked with there in the calibration lab, Roger Spalding, who was just a jokester. He was just a funny guy, and he got me on that joke. I'm sitting here trying to delve into it. I'm a young guy just out of the Air Force, I'm thinking theoretically all the time, I'm trying to figure out what this new business world is, my head is just in the details. He pulls that out, and I'm sitting here going, there's got to be a good reason for this, I just can't get it. Then he gives me the obvious, simple answer. And that helped me to stay grounded, to think about that, to say, you know what? Sometimes it's just a simple solution. So I'm scouring, I haven't talked to Roger in years, I'm scouring the worldwide web, my daughter loves when I use that term, to figure out, hey, where is he now? I think he's still in Florida somewhere. What's he doing? And the only picture I could come up with was this one.
Kirk Foster (47:26):
Geez,
Howard (47:27):
Apparently after he retired, he converted to Catholicism and became a nun.
Chuck (47:33):
Oh, that's terrible.
Howard (47:34):
Roger was hilarious. He was fun to work with. But anyway, that's my bird story, unfortunately.
Chuck (47:41):
We're actually out of time, guys. Kirk, boy, we have to have you back again, because there's just never enough time to explore the wealth of knowledge you have. So maybe after you get your paper written we can pick a new topic, because I have a ton more questions I would love to ask you, but we'd probably bore our listeners. It's been a pleasure. Thanks for joining us. Yeah,
Kirk Foster (48:05):
I hope the expectation for the paper is not too high, because my deal is not to get into theoretical physics or anything like that, or significant mathematics.
Chuck (48:15):
No, no, nothing like that.
Kirk Foster (48:16):
It's all generalizations and just again, it's about getting people thinking.
Howard (48:20):
You got it,
Kirk Foster (48:22):
It's been great seeing you guys again.
Howard (48:24):
Yeah, thanks for joining us. Any closing words of wisdom, Mr. Ellis?
Chuck (48:29):
Yeah, first and foremost, I'd like to thank our sponsor, NAPT, obviously, for helping us put this out. As the cliché goes, proficiency testing is the only true measure one can perform to demonstrate technical competency. So sign up today for ILCs. Whether you use NAPT or you use your own internal comparisons, you've got to do proficiency testing. It's the only way you can prove your competency.
Kirk Foster (48:50):
Alright,
Chuck (48:51):
So with that being said, we'll look forward to seeing you the next time. Thanks for listening to our podcast and look for the future ones that we have down the road.
Howard (49:00):
Thank you. Good day, everybody. Be careful.
Chuck (49:02):
Alright, Kirk, thank you very much, sir. I appreciate all your wisdom and I can't wait to see your paper.