The development world is cluttered with buzzwords and distractions. Speed, focus, and freedom? Gone.
I’m Nicky Pike. And it’s time for a reset.
[Dev]olution is here to help you get back to what matters: creating, solving, and making an impact. No trend chasing, just asking better questions.
What do devs really want?
How can platform teams drive flow, not friction?
How does AI actually help?
Join me every two weeks for straight talk with the people shaping the future of dev.
This is the [Dev]olution.
Jennifer Spykerman (00:00:00):
AI is terrible. It sucks at user interface and I'm not really that good at it either, but I know that what it's given me is not right. You want an intuitive application and it doesn't create an intuitive application out of the box.
Nicky Pike (00:00:15):
This is Devolution, bringing development back to speed, back to focus, back to freedom. I'm Nicky Pike. You know what's missing from every tech conversation? Actual live demos. Welcome to Devolution [To]Code where the BS stops and the code starts. Here's the deal. My guest has one hour to put their code on the table. No slides, no promises, no building towards that, just fingers on a keyboard, code on the screen and real results. The rules are simple. You built something? Show us. It works? Prove it. It's production-ready? Ship it. Otherwise we'll skip it.
(00:00:52):
With me today is Jennifer Spykerman, owner and founder of DefenseLogic AI. She's not just another thought leader talking about AI. She's actually built something. This is someone who traded coding for the C-suite about 20 years ago. She's directed 300-plus professionals, she was involved in NASA's Orion program, and she has saved the DoD a hundred million dollars through process improvements. Now she's back to building software, and she claims that she's vibe coded a federal RFP analyzer that is nearing production-ready. No coding bootcamp, no Stack Overflow rabbit holes, just her aerospace expertise and AI doing the heavy lifting.
(00:01:25):
Before Jennifer shows us what she's built, here's the challenge on the table. The internet right now is obsessed with vibe coding. Cursor has hit a $9.9 billion valuation, Base44 sold for $80 million, and everyone's claiming that they can build entire apps from prompts. So what happens when someone who's spent decades understanding complex federal requirements decides to build a tool that government contractors desperately need? This thing supposedly analyzes RFPs, runs air-gapped LLMs and handles compliance requirements. It's time to see if it's real. Jen, welcome to the Devolution.
Jennifer Spykerman (00:01:58):
Thanks Nicky. It's nice to be here.
Nicky Pike (00:01:59):
So before we get started, anything you want to kick us off with? Anything you want to let us know before we jump into this?
Jennifer Spykerman (00:02:04):
Not so much I have to say. As someone who is coming back into the coding world after a long hiatus, it can be really overwhelming. There's a lot of technology out there. There's a lot that's changed since the last time I did anything in production. Full disclosure, my background in software development was actually in embedded systems, and there's a big difference between embedded systems and what we're seeing now on the web: cloud development and all of those front-end, back-end sorts of environments. So there's a lot to catch up on. This has been a lot of fun and it's been a huge learning experience for me.
Nicky Pike (00:02:44):
So were you doing this back before the .com boom and everything became an app? Is that what got you in embedded or was it just simply who you were working for and the projects you were working on?
Jennifer Spykerman (00:02:52):
It was actually the projects I was working on. I grew up watching Star Trek: The Next Generation, and I loved space. And so right out of college, I actually was lucky enough to get hired by a very large space defense contractor here in Colorado. And right off the bat, I was able to actually start developing software for satellite systems, and the software I developed is still in orbit, so that's exciting. I achieved all of my goals by the time I was 25 or 26, which is amazing. So then I wanted to keep doing more, and I wanted to go higher and higher into the organization. I got my Master's in Computer Science Systems Engineering with an emphasis on aerospace. And then I started doing capture management, and I managed a $3 billion capture, a space ground system, for this contractor. It was a huge learning environment and I enjoyed all of it. It was actually kind of like a startup. That's what I most loved about it: it was like a startup culture inside this behemoth of a corporation. And that's what I enjoyed the most about it, but I learned a lot about the engineering that actually goes into constructing a proposal that will speak the customer's language. So that's kind of how I ended up here.
Nicky Pike (00:04:14):
That's awesome. In our pre-show interview, I had a comment: instead of the 20 years, I was going to say back before Stack Overflow existed, and Jen told me not to do that, but I'm going to poke the bear anyway and bring it up. I agree with you about the startup feel. I got to experience the same thing at Xbox: within the support of a huge company, you get a lot of flexibility that you don't see with other types of startups having to watch their money, having to pinch pennies. Having some of that support allows you to innovate and go outside the box on a lot of things, and from what I've seen of your career, that's exactly what you've done. So jumping into that, let's get into what we're here to talk about, which is this product that you built. You've managed $3 billion strategic captures, you helped direct NASA programs like the Orion crew module. What the hell made you think, you know what? I'm going to start building software again after 20 years?
Jennifer Spykerman (00:05:03):
I've always wanted to get my hands dirty. And when I worked my way up the ranks to do leadership sorts of positions, I was actually encouraged to stop slinging code. And I loved working with people and I love leading teams, but at the same time, I felt like I was one step outside of the box on really what was going on and understanding the rapidly changing environment that was happening. The defense industrial base world, it's a slow moving world. And so...
Nicky Pike (00:05:34):
It's federal government. I mean, they've never been known to be rabbits in anything that they do.
Jennifer Spykerman (00:05:37):
Actually, I mean, if you think about it, back in the forties and fifties, they were at the forefront of innovation for this country. I actually wrote a white paper for Gene Kim's IT Revolution. It's coming out this fall, and it's all about how the government is adopting new processes for doing software acquisition, because it is still kind of stuck in the place where you're buying tanks and satellites and hardware and not necessarily innovating for the future and supporting the warfighter. So there are a lot of things that are changing. And I wanted to find a way to understand not only how those changes were impacting large corporations, but also small businesses. I've talked to a lot of small business owners, and one of the biggest issues is just figuring out how to manage the costs that it takes to actually develop a compliant proposal that can be sent to the government.
(00:06:40):
And since it is changing, you can hire these outside firms that probably have no idea what your company does, what your actual value is. So I wanted to create a tool that would allow a small business to say, these are the things we're really good at, these are our innovative products, this is our past performance, this is how we know that we would benefit the warfighter, and then pull in all of the different RFPs and say, these are the pieces that you should really highlight, and here are some gaps in what it is that you're doing. I'm a little ways off from my entire vision, but I have a small local MVP that I'm ready to share after dealing with some issues this morning.
Nicky Pike (00:07:26):
Yep. All right. Well, you talked about how back in the forties and fifties the government was able to move very fast, but as things get more and more concrete, that's when we start putting processes on them, and I think that's what led you to this program. Walk me through the moment when you realized that federal RFP analysis was broken. You mentioned in our pre-show interview that you used to spend two weeks or more per RFP on security plans, NIST and FedRAMP requirements. Was it those things, or was it a specific contract where you thought, man, this is insane, there has to be a better way to do this?
Jennifer Spykerman (00:07:58):
Yeah, I mean that $3 billion one, and I worked on some other smaller ones as well. I worked on a $10 million proposal. My first one was, I forget how many, maybe a hundred million dollars, I can't remember. But each one of those had its own pain and suffering. Large defense contractors have entire teams of people who manage their capture process. These are people that will walk through an RFP, the different sections, the L and the M for traditional RFPs, which provide the evaluation criteria and the instructions for each part of your proposal. And they'll create these elaborate matrices to help figure out how to manage this proposal. And I know what each one can cost: they'll easily spend tens of millions of their own overhead dollars just to submit a proposal that may or may not win.
(00:08:57):
And then there are other emotional aspects. I'm not going to get into those; I'd probably make some people angry. I mean, you work really hard, and when you lose a proposal, you can actually argue against why you lost. You can kind of sue the government and say, this was unfair, you didn't follow your acquisition rules, blah, blah, blah. And I was part of the team doing that postmortem analysis after we had lost a large proposal. And I said, gosh, we had all this work; I think $140 million was spent on this proposal. And if you really got down into what the technology, what the architecture was doing against the evaluation criteria, there was no way it was supposed to win. To this day, when I talk to my leader at the time and I say, we were never going to win, and I knew it as a capture manager, we weren't going to win.
(00:09:52):
It didn't make sense. They get really cranky with me. But sometimes you kind of get lost in your own reasoning: we should do this because it's the right thing to do, because we've had this program before, whatever the reason is. You need to have kind of an outside voice. And I thought, well, if you work with AI well enough, it can say, no, this isn't a good fit for you unless maybe you team with another company. So those are the types of things I'm trying to capture, but it's a lot more complicated than just uploading a bunch of documents to ChatGPT and hoping for the best.
Nicky Pike (00:10:31):
Well, you mentioned teaming with another company. Most executives would hire a dev team for this. What made you decide to dust off your coding skills and try vibe coding?
Jennifer Spykerman (00:10:40):
I think it's one thing to bring in a dev team that may or may not have that domain knowledge and that history. Even if I were to hire someone myself and say, hey, I want to create this tool, and I hire some folks, a backend developer, a front-end developer, an AI specialist, data analytics, all the things that you would need, they don't necessarily understand what the vision is. And sometimes even I don't, because it's still an iterative process. You have this thing in your mind, but as you're going down the path, you're like, wait, I need to do something a little bit different to make it either more intuitive or to have the outcomes that I'm really looking for. And when you've got teams of people that are chugging along, it's a little bit more difficult to communicate what that is, especially early on. So later, as this gets more mature, there's a good chance I'll hire a handful of people to say, make this better, make this market ready, but at least I'll have the vision, and it'll be something that's easier to communicate than just a bunch of words that come out of my mouth.
Nicky Pike (00:11:47):
You brought up something that I think is a very important point. There is a lot of talk about vibe coding, and if you look on LinkedIn or on social, there's a lot of negativity about it. I think what you just brought up is one of the main advantages of vibe coding. You're building an MVP, you're proving out that this is going to work. When you prove that out, then you can bring in a dev team, spend the money that's necessary to polish it up, make sure it's secure, get rid of the bugs that are always in the code. The important part is that this allows people that may not have the technical skills, and I think you're an exception here because you did have a very extensive coding background, but it allows the citizen developer to come in with a great idea that would otherwise probably get lost because they didn't have the technical skills or the funding to go hire a team to do this.
Jennifer Spykerman (00:12:32):
I did just upgrade my vibe coding application to spend a little bit more, but it's still a lot less than I would be spending on the developers I would need to fill in all the gaps that I have. And I don't know if anyone is actually such a good developer that they can do end-to-end, full-stack, back end to front end. A lot of people are really good at cloud development, really good at Kubernetes, really good at React or whatever, but they're not necessarily really good at end-to-end application development.
Nicky Pike (00:13:04):
Absolutely, and this is where I think vibe coding has a place in the enterprise. We've got other teams, other departments besides the software development team. You've got HR and marketing and all these other departments. And generally when they request an internal tool, it goes onto the backlog, because the software development team is focused on doing things that are revenue generating, even though the tool is still important to HR or the sales teams. This now gives people that have the domain knowledge, to your point, a way in. It would be very hard for you to explain this process to a software developer that never goes through it. It allows someone like you that has the knowledge to come in and say, this is the way it needs to be done, this is the way we want to see it, and create that prototype. Now for internal tools, maybe that's perfect: an MVP is enough to get you off the ground, prove that it works, and maybe go get the funding to polish it up. But that's something I think is getting missed in a lot of the conversations we're having around vibe coding today.
Jennifer Spykerman (00:13:53):
Yeah, absolutely. You start off with a problem. I remember at my former employer, we had this massive lessons-learned database for proposals, and I had to submit lessons learned to it, and I thought, no one's ever going to look at this. I remember scrolling through thousands and thousands of lines trying to find something similar to what I had just experienced, to see who had had the same lessons. Who knows, right? I mean, this was 15 years ago. But these days I could easily imagine hooking up an AI to walk through a very structured database, which is perfect, and summarize and identify the same customer, the same size of proposal, and say, this is what happened in the past, and this is what they should have done instead. And maybe it was, don't even go after this thing, or maybe it was, change your design, or whatever it might be.
Nicky Pike (00:14:47):
Yeah. Well, let's go from that and let's get into the learning curve of what you actually built. So you mentioned in our pre-show that you were using tools like Cursor and Claude Code through the CLI. Why did you choose that combo specifically? Walk us through your tooling choice here.
Jennifer Spykerman (00:15:01):
It was very simple. I went to a small conference that was actually hosted by Gene Kim, and there was a lot of discussion about vibe coding, and they were using Cursor and Claude Code. So first I started with Cursor because it was simple, and it actually threw something together and I didn't have to do much. I know a lot of people say it's good for completions, and I like the IDE. It's a simple IDE that I can understand, and I can tweak things, especially my YAML and my JSON, because those are the places that make the most sense to me and where I like to have the most control. Claude Code gets a little bit more into the details, and I can talk to it in a little bit more plain English. At the same time, it's kind of like my elderly mother-in-law, who has a love affair with watching the Phoenix Suns. We got her an Amazon Fire Stick, and every year when the NBA season starts, she's logged out and we have to coach her through over the phone how to log into that thing.
(00:16:04):
And it's the same thing, because yesterday I did the worst thing you can do the day before you're going to demo something: I decided it wasn't going the right direction, changed my design completely, and ripped everything out. And it took me most of the day just to get it to start working again. It was a lot of, it's not working, my screen is blank, why is it blank? And then it's like, well, it should be there. So you're doing all this troubleshooting, but really it's the blind leading the blind: this is what I'm seeing. And Claude Code tries to be ever so helpful. I almost did throw my laptop, and I was like, you know what? Maybe it's just not meant to be. But I did get it working again. So yeah.
Nicky Pike (00:16:47):
So you mentioned having two engineering degrees, and you had an early career in embedded coding before being told to stop slinging code 20 years ago. What was it like trying to come back into modern frameworks and languages?
Jennifer Spykerman (00:16:59):
Oh my gosh. When I left my defense contracting company and went to Pivotal, I was suddenly thrust into new modern frameworks, microservices with Cloud Foundry; all of that was really new to me. And I came in as more of a program manager who understood how to connect value to technology, which is an important piece that a lot of folks miss. I love talking value, and I love talking about how the technology is supporting your business. It was a lot of fun, but I was playing catch-up the whole time. I'm the type of person who tries to figure things out: I have a whole bunch of books that I bought on architecture for the cloud, and I started studying because I wanted to make sure that people took me seriously and didn't think I was just a business person who didn't know what they were doing. People still assume that, and I only get a little ruffled when people assume that I have no idea what technology is. But there is a lot to catch up on.
(00:18:02):
I actually had to ask Claude Code to create an architecture document for me because, oh my gosh, I had lost the bubble on what was in there. I started out focusing on the RAG, the retrieval-augmented generation piece; that was the part I was most interested in at the time, so I understand what's in there, but the rest of it, I had no idea. I did have Claude create one for me, and I summarized it in a little document, but who knew half of this stuff? I thought, oh my gosh, there are so many libraries and technologies out there. And after we're done with this, and I did not have the guts to do this yesterday, I downloaded Claude Swarm and I'm going to actually create a bunch of agents: one to be a QA tester, another one to be a security person, another one to be the front end and the back end, all the different pieces, and see what happens.
Nicky Pike (00:18:55):
That's awesome. That's something I've been talking about a lot, the potential of multi-agent programming, and having somebody come in and do that is huge, and Claude Swarm's going to do that. We're seeing enterprises starting to adopt small language models instead of large language models and having those be specialized. So you really do have a dev team, if what I'm hearing you say is right? It's just that the dev team is virtual.
Jennifer Spykerman (00:19:17):
Right? And they're a lot cheaper than having to hire someone who has the right experience, and I don't feel so bad when I yell at them. Right? I do. There you go. I yell at Claude. I have not sworn at it yet, but there were some close calls yesterday and this morning.
Nicky Pike (00:19:32):
Well, I have. I've gotten into fights with Claude myself, and it's part of the fun and part of the journey. So everyone's talking about the magic of AI coding, but what was your reality? You talked about times you wanted to throw your laptop out the window; I get this picture in my mind of you pulling an Office Space out in a field with a baseball bat. And you talked about having to learn new stuff. You did a lot of study on your own, but was Claude not only able to help you with the code, did it actually teach you new technologies?
Jennifer Spykerman (00:20:01):
Absolutely. I mean, I'm constantly asking, okay, so what is this CORS thing? What does it do? Help me understand how the FastAPI architecture works. What are the benefits to me? I think it's important to understand what is going on, at least at some level, behind the scenes, because you have to engineer, you have to optimize for what it is you're doing, right? I'm trying to create an application that could be operated in an air-gapped environment, so it runs locally right now. I want to make sure that it meets those types of requirements: it can run on my little laptop, I can still have the power and the knowledge that I need as far as the benefits of the large language models, but at the same time, if I need to cut it off from the rest of the interwebs, I'm able to do that.
(00:20:56):
And I have to understand a little bit, okay, what's really going on here? I know that Claude will add some things, and I know that it's added a few features that I'll probably have to go in and change in order to meet those air-gapped requirements that I'm going for. Because in the long run, I want this to be able to be installed in a different-level environment for government contractors or maybe even for the government. There's actually an Army solicitation out that's due this week that is really for a tool that will understand all the different pieces of Army contract acquisition, only it's from their perspective, so they don't have to do as much manual work to review all of the different proposals they get for everything they put up on sam.gov or wherever. So I mean, there's a huge need for this, but of course it would have to be able to run within their environment with limited access to the internet.
Nicky Pike (00:21:52):
All right. Well, I think that's a great cutover there. The interview's over, Jen. Hype is easy, reality's hard. It's time for you to prove that you belong on [To]Code and show us what you've got.
Jennifer Spykerman (00:22:01):
Oh, shoot. Okay, well, here we go. Lemme see if I can share.
Nicky Pike (00:22:04):
Before we get into the actual tool. You talked about the architecture. You told me it keeps evolving. Do you have a diagram? Walk us through how this thing is actually built.
Jennifer Spykerman (00:22:12):
Yeah, I do have a very simple diagram. So I threw this together. I'm using a React front end. I started off working with Streamlit; that was the basic thing Cursor started me with, and it was a basic user interface. I wanted to bring it up a little bit, so I got the React front end with Vite, which helps because I'm making constant changes; of course, this is in development, so it helps me with that. Zustand for keeping it stateful. And then FastAPI, which I mentioned before, with Python 3.12. And then I've got my document processing and AI services, and this is where I spend the majority of my time tweaking, because to me that's where my power is, with my experience: in reading through those documents and understanding how to pull out the themes that are really powerful for your organization.
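[Editor's note: the document-processing layer Jennifer describes typically starts with splitting extracted RFP text into overlapping chunks before embedding. A minimal stand-alone sketch of that step; the function name and parameters are illustrative, not taken from her codebase.]

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split extracted document text into overlapping character windows,
    so each chunk carries a little context from its neighbor when embedded."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # advance less than a full chunk to overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks
```

Real pipelines usually chunk on sentence or section boundaries rather than raw characters, but the overlap idea is the same.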
Nicky Pike (00:23:06):
Now, did you pick this stack or was this something that AI helped lead you down a path, or did you come into this knowing, Hey, here are some of the things I want to use?
Jennifer Spykerman (00:23:13):
So most of this, AI figured out for me. I knew I wanted React when I upgraded. At first, with Streamlit, I was like, sure, whatever, I don't care. But then I wanted to upgrade it a little bit, knowing that eventually I'm going to host it on a small web-based platform. So React was my decision. But the rest of it, I mean, OpenAI integration, that's pretty straightforward. But no, thanks, Claude. I appreciate it. Thanks for doing it. On my AI architecture, I did a lot of research, because cost was my number one driver, as well as, as I mentioned, performance. So I've got my Chroma vector database, my embedding model, and then Groq for my LLM, with two models that I'm using: mostly Llama, and Mixtral I think is kind of a backup, but I'm not sure how Mixtral got there. I don't always make those decisions.
Nicky Pike (00:24:14):
Well, it's nice to see Chroma getting some love here. Most people, when they think vector databases, they think Postgres and things of that nature. Was this a decision by AI, or did you go into this knowing that Chroma was something you wanted to use? And what was your reasoning?
Jennifer Spykerman (00:24:26):
So I was looking, as I mentioned, for something that was lightweight and cost efficient; these are free pieces of the stack, and this is what I came up with. I did use AI as I was doing the research. I saw Postgres, but Postgres looked like it was a little bit heavier, and I think there was some sort of cost; I don't remember why I didn't use it. Chroma seemed to be kind of the easy starter one for a beginner, which I am. And so far it's worked really well. I haven't had any issues with it.
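[Editor's note: for readers new to vector stores, what Chroma does for a tool like this is store embedded chunks and return the ones nearest to a query. A toy sketch of that nearest-neighbor step, with hand-made two-dimensional vectors standing in for a real embedding model; names are illustrative.]

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, store, k=2):
    """store: list of (chunk_text, vector). Return the k most similar chunks."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

A real embedding model produces vectors with hundreds of dimensions, and Chroma adds indexing and persistence on top, but the retrieval idea is exactly this ranking.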
Nicky Pike (00:24:57):
Well, you say it's always changing.
Jennifer Spykerman (00:24:59):
It is, yeah. It's always changing. These things are still here; I double-checked last night just to make sure something didn't happen behind my back. The YAML piece was actually my idea. I wanted to have a very straightforward way to do some customizations, and I've read a lot on Substack and that sort of thing about how YAML and JSON are really handy at communicating effectively with large language models. So I wanted to have a way for companies to say, here are our strengths as an organization, these are the types of technologies we work with, these are our offerings and our discriminators, this is what sets us apart from other organizations. There are also things in here about preferred teammates, preferred primes, preferred organizations, that sort of thing. So there's a lot you can do here. And then prompt engineering with JSON for the different types of RFPs you'll find: so PWS for a services contract sort of thing, and SWP at the bottom is the Software Acquisition Pathway, part of that new method the government is using to acquire software a little bit better. So that's just the high-level architecture there.
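[Editor's note: the company-profile idea can be sketched as loading a structured profile and folding it into the analysis prompt so the LLM answers in the company's context. JSON stands in for YAML here so the sketch needs only the standard library; the field names and helper are hypothetical, not from the actual tool.]

```python
import json

# A hypothetical company profile, analogous to the YAML file she describes.
profile_json = """{
  "strengths": ["satellite ground systems", "DevSecOps"],
  "discriminators": ["20 years of DoD past performance"],
  "preferred_teammates": ["SmallSat Co"]
}"""

def build_prompt(profile: dict, rfp_excerpt: str) -> str:
    """Fold the company's strengths and discriminators into the RFP prompt."""
    return (
        "You are analyzing a federal RFP for a company with these strengths: "
        + ", ".join(profile["strengths"])
        + ". Key discriminators: " + ", ".join(profile["discriminators"])
        + ".\n\nRFP excerpt:\n" + rfp_excerpt
        + "\n\nHighlight fit and gaps."
    )

prompt = build_prompt(json.loads(profile_json), "Section L: submit a technical volume...")
```

With PyYAML, `yaml.safe_load` would replace `json.loads` and the file could stay in the friendlier YAML syntax she prefers to edit by hand.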
Nicky Pike (00:26:14):
Now you talked about prompt engineering. So how did you kick this thing off? Did you basically jump into Claude and say, write me an RFP analyzer tool, or what was your first prompt like to get started on building this out?
Jennifer Spykerman (00:26:25):
It's funny, I saw you were going to ask me that, and I went back and looked and I thought, gosh, where was it? I know I started in Cursor, and I said, I want to do this thing. And it created the simple starting place for me, and I hated it, but I think the prompt was: create something that will allow a small company to upload an RFP.
(00:26:49):
And I think I gave it some pieces. I wanted to have a RAG to understand the context and the information about that company, so that it really knows how to respond based on a company's past performance and their history. And like I said, there was that huge lessons-learned database, so my vision is that eventually it'll access the documents piece. It has this whole documents folder where you can add all sorts of things: PDFs, databases, whatever. That's coming, but that part isn't here yet. So far I've been tweaking the prompt engineering a lot, and just how it's analyzing things, because the most frustrating thing about AI is the fact that you can do the same thing twice and get two different answers. And as an engineer, that really hurts my brain.
Nicky Pike (00:27:44):
I can imagine.
Jennifer Spykerman (00:27:46):
So I spend a lot of time trying to make sure that there is some consistency, and so you can trust what the results are. And they're usually pretty consistent at a high level. There's just some small pieces that continuously annoy me, but that's all part of the process.
Nicky Pike (00:28:02):
And talking about the prompting, I think when most people get frustrated with AI, it's that prompting piece, because you get out of it what you put into it. Did you find that your prompting evolved and grew as you got further into this project? That, okay, if I provide this amount of detail, then I'm getting closer to what I want, but if I'm very generic, of course I get very generic output.
Jennifer Spykerman (00:28:22):
Yeah, absolutely. So I'm getting a lot better at providing context; you have to provide the context. Claude and most tools kind of reset. They forget everything, especially in the coding world; the next day, it has no idea who I am when I start it up every morning. So you really do have to provide the context of what you're trying to do. The beautiful thing about government contracting in general is there's a lot that's out there, and that's all not copyrighted, obviously. So the LLMs really have a lot of knowledge about what it means to do government contracting, to respond to an RFP, that sort of thing. It's really straightforward from that perspective. But it doesn't necessarily understand what the current strategy of the Army is, for example, so I have to make sure that I provide some of that context on a daily basis. I'm always getting a little bit better every day.
Nicky Pike (00:29:18):
And that's the goal. I think that's what we should be doing. You were talking in your lead-up about your theme engines, your document parsing, compliance checking. These are not trivial features, Jen. You said that AI figured out a lot of this, but what did you have to figure out yourself in order to provide the right prompts and the right context? Because context is everything for AI.
Jennifer Spykerman (00:29:38):
Yeah, absolutely. I mean, if you just take something at face value, whether you're using plain old ChatGPT or Claude or anything else, it'll give you something. And the thing about most of these LLMs is that they'll be very confident, kind of like my daughter when I found some alcohol in her room the other day: so confident that it wasn't hers. It just fills in some gaps, and it's not going to say, I don't know. ChatGPT will never say it doesn't know; it'll fill in some details on its own. So you have to catch those. I actually spent a lot of time reading through every single RFP and saying, wait a minute, you didn't catch this piece. Why is it still saying X, Y, and Z when the document says this specifically? Oh, you're right. So most of it is still manual. I'm still the user, and I know roughly what I want the output to be, and if it's way off, then I know the LLM is just making stuff up, and it is still making things up. Whether that's hallucinating, some people call it hallucinating, I just think of a teenager lying about the history test they forgot to study for.
Nicky Pike (00:30:55):
That may be an even better analogy than the one I use. I always like to tell people AI is like an over-caffeinated intern: good skills, good theory, but you can't really trust their judgment. You've got to watch, and this is the partnership part of working with AI. You absolutely have to understand what it's doing and be able to catch its mistakes. It's not blind acceptance. This is a lot like working on a dev team. You've got junior developers, you have conversations with them, tell 'em where you want things to go, they go off and do some stuff, and then you have to come back and check: did you do what I asked you to do, or did you kind of go off on your own?
Jennifer Spykerman (00:31:31):
Right? Yeah, absolutely. I remember when I was a new developer, I had to learn the hard way why I couldn't use recursive calls in embedded systems, because the stack is very limited. You just don't know some of those little details. So yeah, you have to learn, and this AI tool is like a new hire. They know a lot, but not all the things.
Nicky Pike (00:31:54):
And before we jump in and actually take a look at the program, you brought up something else: hallucination. In vibe coding, that's a huge concern; it's what a lot of people are talking about. It's that aspect of my LLM or my agent doesn't want to admit it doesn't know what it's doing, so it's going to make some stuff up. What is your best example of the AI going off the rails there, and how do you debug something like that when you aren't the one who wrote it?
Jennifer Spykerman (00:32:16):
So I think some of it is using common sense. Take chunking, for example. In the demo I'm going to show you, I'll upload two documents. They're related, but they're different documents. And for a while it was only chunking one. I figured out it was only really reviewing one of the documents, and who knows what happened to the other one. So I'll ask it questions: why is it only saying there's one chunk when there are two separate documents? Are you actually merging those together? And it'll say, oh, you're right, you're right. And I was like, of course I'm right. So some of it is just paying attention; you can't just accept everything. And I am a little bit of a control freak, too, because when it's making changes, you can say, accept all changes, don't ask for confirmation. No, I want to see exactly what it's doing, because I want to make sure it's not going off the rails. And when it does go off the rails, it's usually something tiny, so you do have to pay attention. You can't just go off and make lunch and come back to, oh, you've got a new tool. No, I spend a lot of time debugging. That's actually what I spent an hour on this morning, just getting ready for this.
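The chunking problem Jennifer describes, where two uploaded documents silently collapse into one chunk stream, is the kind of thing you can guard against in code rather than by eyeballing the output. A minimal sketch, not from her actual tool (the function names, chunk size, and dict shape are all illustrative):

```python
def chunk_document(doc_id, text, chunk_size=500):
    """Split one document into chunks, tagging each with its source doc_id."""
    words = text.split()
    chunks = []
    for i in range(0, len(words), chunk_size):
        chunks.append({"doc_id": doc_id, "text": " ".join(words[i:i + chunk_size])})
    return chunks

def chunk_rfp_upload(documents):
    """Chunk each uploaded document separately so two RFPs are never merged."""
    all_chunks = []
    for doc_id, text in documents.items():
        all_chunks.extend(chunk_document(doc_id, text))
    # The sanity check Jennifer ran by hand: every uploaded document
    # must contribute at least one chunk, or we fail loudly.
    missing = set(documents) - {c["doc_id"] for c in all_chunks}
    if missing:
        raise ValueError(f"Documents produced no chunks: {missing}")
    return all_chunks
```

The explicit check at the end mirrors the question she put to Claude: if one RFP contributes zero chunks, the pipeline raises instead of silently reviewing only the other document.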
Nicky Pike (00:33:36):
Well, putting you through extra effort is always good. All right, let's fire this baby up. Show us what you've got. Let's see this thing in action.
Jennifer Spykerman (00:33:43):
My user interface is still very basic, and I have my debugging code up here because this morning I was having trouble with my API connections, as you can see. But those seem to be working for now, so we'll all cross our fingers that the demo gods will smile on me today and everything will be fine. The example I'm going to share is an open solicitation by the Army on sam.gov at the moment, and there are two documents. The open solicitation came out in early August, and then this one I'm actually interested in is the AI contracting use case I was discussing earlier. So there are two RFPs in here, and I have it go through, and, oh my gosh, it classified it as Software Acquisition Pathway.
(00:34:37):
Sorry, I'm excited. It was doing something different earlier. But see, once again, it's chunking it into one piece, so I'll have a conversation with Claude here in a minute. As I click through this, I know this isn't really part of the user interface; it's more about walking through what's going on behind the scenes so I can figure out where I need to do some tweaks. So it creates an executive summary, and this part I'm not a big fan of at the moment. Technical approach, management, a high-level view of what you should have as your outline, staffing plan, risk management, and then a cost summary. So I asked it, and this is the big change I made this morning, I said, okay, I hate the outline. I'm going to come back to that, because I want it to pull in all of those details from my YAML company file to create a more personalized outline.
(00:35:34):
Right now it feels very generic, so I'm not happy with that. But further down, I spent some time this morning saying, okay, I want to make sure I understand, as an organization, what are the key objectives the Army is looking for, and then what are the technical requirements? There are two documents here. Yes, you have to have a unique entity ID and be registered in SAM and all that, but what are they really looking for? I noticed, and I don't know if it's the LLM or the way I'm summarizing, that it seems to really concatenate a lot of critical information. So that's where a lot of the prompt engineering comes in as well, to say, this is really what I want you to look for in this document. I fine-tune it for this RFP and then try to figure out how I can apply that to a more global set of documents. Then functional requirements, then performance requirements, then what are your deliverables. One of the next pieces, after I get through the outline, will be the big warning signs for FAR and DFARS. FAR is the Federal Acquisition Regulation, and DFARS is the defense version of that, and those are huge. So that'll probably be one of the next pieces: identifying the key FAR requirements you have to look out for, especially as a small company.
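Pulling company details into the outline step, as she describes, can be as simple as interpolating the parsed profile into the instruction. This is a hedged sketch, not her code: assume the YAML company file has already been loaded into a dict (for example with PyYAML), and the field names here are invented for illustration:

```python
def build_outline_prompt(rfp_summary, company):
    """Fold company-profile details into the outline request so the
    generated proposal outline is personalized rather than generic."""
    capabilities = ", ".join(company.get("capabilities", []))
    return (
        f"Draft a proposal outline for this RFP:\n{rfp_summary}\n\n"
        f"Tailor it to {company['name']}, a {company['size']} business "
        f"with past performance in: {capabilities}.\n"
        "Map each outline section to a specific company strength, and "
        "flag any requirement the company cannot yet cover."
    )
```

The point of the last two instruction lines is the personalization she's after: the model is told to anchor every section to something from the profile instead of emitting a boilerplate outline.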
Nicky Pike (00:37:02):
Well, here's a stat for you. There was a report that came out showing that AI-generated code had somewhere around 48% bugs and 30% security vulnerabilities. This is something you're actually building for government contractors. What are you doing to handle the potential security nightmare, knowing that this could be going into that part of our government?
Jennifer Spykerman (00:37:23):
The first piece I'm looking at is making it air-gapped, so it doesn't have to access the internet. That's probably one of the biggest ways you can have leaks. For the next piece, I'm examining the different parts of the stack to see where there are security vulnerabilities, especially from a federal perspective. And as you mentioned earlier, that's not an easy thing to do. It's another reason I'm creating a kind of Claude agent that'll act as a CISO and just go through and look for security vulnerabilities. Part of me wants to say I'm not thinking too much about that yet, but I am. I question it all the time. And I am not a CISO. I am not a security person. All I know is what I've learned at a high level.
(00:38:20):
So I need help with that. But I'm hoping that understanding what is in my stack, and understanding what it means to have a secure application, will really make sure that the data in there is secure. I'm adding some authentication pieces; that's part of my next MVP roadmap, to make sure you've got the right authentication, encryption, all those types of things. So I'm doing what I know best, and then I'll have to ask someone to review it for me or provide additional feedback. That'll probably be one of my first hires.
Nicky Pike (00:39:02):
That's a lot of stuff to keep in your head.
Jennifer Spykerman (00:39:04):
Right?
Nicky Pike (00:39:05):
All of that. And you talked about being air-gapped; I know that's very important in federal circles. You also talked about the LLMs and wanting to use those. Are you including the LLM as part of this? So when they install it, the LLM will be part of the package that gets installed?
Jennifer Spykerman (00:39:21):
Yes, yes. So I'm using Ollama with Llama models, and those are lightweight enough that they can be hosted locally, and then it just becomes a matter of keeping them up to date. But yes, that was one of the design requirements I had when I was putting this all together: what can be hosted locally and not have to live on a huge server out on the web. At the same time, the government has massive servers, and so do the defense contractors. So depending on where you're hosting it, there will be access to GPUs on GovCloud and that sort of thing, and it's about understanding what those environments look like so you can take advantage of what you're designing for efficiently. But for now, it is an air-gapped, locally contained model that I'm working with.
Nicky Pike (00:40:14):
Okay. So two questions come to mind there. First, air-gapped LLMs for government work: this doesn't feel like something you just prompt into existence. What was the approach? Was the main consideration that it's going to sit on federal infrastructure, which may be secure or insecure, and what it'll have access to? Is that really one of your main design points here?
Jennifer Spykerman (00:40:37):
Yes. Yeah, absolutely. When we talked earlier about one of my first prompts, it was to design for something that would be portable and could live in an air-gapped environment. I'm sure it's lost context by now, so I'll have to go back and re-architect some of that back into it. But I just asked it this morning, is this going to be ready for an air gap? And it said yes, except for a few areas that are really meant to be hosted on a web server. But yes, that's how I began this whole journey, saying, I want it to be in an air-gapped environment.
Nicky Pike (00:41:10):
Are you seeing any pushback on including AI and LLMs in a product that you want to push up? Is the government pushing back on that, or does it seem like it's being accepted more and more going forward?
Jennifer Spykerman (00:41:21):
Yeah, I mean, the government definitely has a lot of LLMs that they host themselves. There was just a deal signed recently with the Department of War, as it is now. I think it was xAI, OpenAI, and Anthropic; all of those guys have a contract with the United States government's Department of War so their models are going to be available. And one project I worked on at my previous company was with the US Army, and they have their own LLM called CamoGPT that they're using in their protected environment. So it's there, and I don't foresee it being as much of a problem. I think it'll be really useful, because it means someone else is keeping all of those LLM models up to date, and I won't have to do that. That'll be someone else's job in the government: to bring in all those models and keep them current.
Nicky Pike (00:42:18):
And above that, you were talking about hosting this: there were FedRAMP requirements, there was NIST. For those listening who may not understand, can you walk us through what FedRAMP is, what NIST is, and what you had to think about when you were building this?
Jennifer Spykerman (00:42:30):
So FedRAMP is just the set of requirements you have to meet to be hosted in any United States government environment, no matter what level it is, and there are different levels. To become FedRAMP compliant, you actually need a sponsor with the DoD, or DoW now, I guess. It's a security piece, and it's separate from NIST. For NIST, there's a whole document out there. I just went through the whole thing to become compliant myself, and there are 110 controls, I think, and they have to do with encryption, with access and RBAC and all those types of things. So those are two different things: making sure that what you're developing can be hosted in a government secure environment, and making sure you're following all of the correct pathways. I think they're trying to change some of that, because it's very expensive right now, which is a big hurdle for small businesses to overcome.
(00:43:27):
They don't necessarily have the funding to become FedRAMP compliant. It takes a lot of money just to have someone focused on that piece. That's what it is at a high level. And then, of course, each level, depending on whether it's unclassified or secret or whatever, has its own requirements, and I haven't looked into those yet. It's one step at a time. And I remember, when I would design for different levels of government architectures, you design for the high level but you maintain at the lower one, because it's easier to go up than it is to come down. So you maintain everything at the low level, and then you can advance upwards. So, one thing at a time. FedRAMP at IL4, I think, is what it is, and don't ask me what IL means. I don't remember.
Nicky Pike (00:44:20):
Well, I've had to look through this before, and it is not light bedtime reading. It's not something you snuggle up with on your Kindle. Were you able to use AI to help you walk through those requirements and make sure you were compliant? It's a lot for a human brain to get around. I can see why this would be hard for a small business, because you pretty much have to have a legal team and a security team to walk through all of that and understand what they're saying.
Jennifer Spykerman (00:44:43):
Right? Yeah, I spent at least a week walking through each of those requirements. And yes, I used AI; it was my best friend. But at the same time, it wasn't a matter of uploading the whole NIST 800-whatever-it-is, 171, I can't remember, to AI and saying, help me write up a security plan. I went through section by section and had a conversation with it: well, I'm a company of one and I'm using Google Workspace. And then we could have a conversation: oh, this is where you have a gap, and this is what you can do to fix that gap, and that sort of thing. So it's not an easy button. You still have to do the work, and you still have to use your brain and keep yourself out of trouble by making sure it doesn't overpromise, because it will summarize the heck out of something and miss a lot. You really do need to be very specific, especially with something as important as a contract or anything else you're working on with the government.
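Her section-by-section walkthrough is essentially a manual gap analysis against the NIST control list. Tracking it as data keeps the conversation with the AI honest, because the gaps are enumerated rather than summarized away. A minimal sketch; the control IDs, statuses, and remediation note below are made up for illustration, not taken from her assessment:

```python
def gap_report(controls):
    """Summarize a control-by-control self-assessment.

    `controls` maps a control ID to a dict with a 'status' of
    'implemented', 'gap', or 'n/a', plus an optional 'remediation' note.
    """
    gaps = {cid: c for cid, c in controls.items() if c["status"] == "gap"}
    done = sum(1 for c in controls.values() if c["status"] == "implemented")
    return {"total": len(controls), "implemented": done, "gaps": gaps}
```

With a structure like this, the question to the AI becomes "here is gap 3.5.3, how do I close it?" instead of "am I compliant?", which is exactly the specificity she says the work requires.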
Nicky Pike (00:45:45):
And that's what I love about this story. When people talk about AI in development circles today, it's all about code generation. That's the only thing people are really focused on. But you've highlighted the fact that AI has a lot of other uses. It helped you with NIST; it helped you decide which tools to use. Now, as you were going through this, going back to hallucinations and losing context, did you find you had issues doing this on a local laptop, where you were having to reverse-engineer things the AI did? How were you keeping the AI BS on a leash here?
Jennifer Spykerman (00:46:22):
I haven't had too many problems, but at the same time, maybe we should have another conversation after I implement all of my coding agents.
Nicky Pike (00:46:30):
Fair enough.
Jennifer Spykerman (00:46:30):
Because I still keep a very tight leash on my little coding agent right now, my little single-person AI, so it hasn't gone off the rails. If I don't like something, or if I'm worried, I definitely use Git. I back up to Git very religiously, so if I need to back something out, I can say, okay, that was a mistake. I almost had to do that yesterday, but I had it all backed up and was ready to rip it all apart and go back to what I had. So I definitely use the tools that are available to me, like Git, and I tell the agent: before I do this, I want you to commit everything to Git, label it as this, because I'm going to want to be able to go back to it. So far it hasn't been too bad. But after I implement Claude swarms, if you hear a subtle scream coming from north of you, in south Denver, then you'll know what happened.
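The commit-before-you-let-it-loose habit she describes is easy to automate around whatever agent you run. A sketch of that checkpoint step, assuming only that `git` is on the PATH; the function name and commit-message format are mine, not hers:

```python
import subprocess

def checkpoint(repo_dir, label):
    """Commit the whole working tree before an AI agent touches it,
    so a bad change can be undone with `git reset --hard <sha>`."""
    subprocess.run(["git", "add", "-A"], cwd=repo_dir, check=True)
    # --allow-empty keeps a checkpoint even when nothing has changed yet.
    subprocess.run(
        ["git", "commit", "--allow-empty", "-m", f"checkpoint: {label}"],
        cwd=repo_dir, check=True,
    )
    sha = subprocess.run(
        ["git", "rev-parse", "HEAD"],
        cwd=repo_dir, check=True, capture_output=True, text=True,
    )
    return sha.stdout.strip()
```

Returning the commit SHA gives the operator the exact rollback target, which is the "I want to be able to go back to it" guarantee she asks the agent for.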
Nicky Pike (00:47:23):
Absolutely. I look forward to hearing the fans kick off. And that's another thing: when we're talking about fans and GPUs, you said the government does have a lot of GPUs, but if you're running a local LLM, and from what I understand this is mostly document intake, aggregation, and output, is a GPU going to be necessary for this tool, or is this something that can run on CPU?
Jennifer Spykerman (00:47:44):
This tool can run on CPU. It doesn't need any GPUs. Like I said, it's lightweight enough and simple enough that it doesn't need them. Now, if we start getting into other types of processing, that's when it gets a little more complex. If you need more multimodal AI processing, then maybe. There are a lot of things called industry days, where you go and see someone present and they talk about the strategy of the agency or whatever. You may want to upload all of that information and have it summarized so you don't miss any of that context. That might get a little more complicated. But so far, this is pretty straightforward.
Nicky Pike (00:48:27):
So I know you're not done yet. You've still got iterations, and I believe we'll probably invite you back to show the finished product. But right now, for the government officials watching the show, what's going to make them say, holy shit, I need to have this?
Jennifer Spykerman (00:48:41):
Oh my goodness. I don't think it's the tool. It's the fact that it's an individual who has all of the knowledge of what large defense contractors do when they deliver for the government. From the government's perspective, they're looking for how to get rid of manual processes. On the business side, they're saying, how do we avoid standing up a multimillion-dollar capture organization just to go after proposals and RFPs? So on the business side, there are millions of dollars that could be saved by using a tool like this, even for some of the simplest things. On the government side, they can actually make sure they're implementing their acquisition rules as they constructed them, so they're getting the best response and awarding to the best business case. They usually talk about best value, but depending on how much experience some of these reviewers have, it still comes down to, well, we don't know, this all looks complicated, but this one is the cheapest, which a lot of times doesn't work out in the government's best interest. So it's about making sure they actually get best value. I think it helps on both sides, the government's and the company's.
Nicky Pike (00:50:08):
All right, I want to repeat that back to you to make sure I heard it right. It's not necessarily about the tool; it's more about the democratization of the knowledge. You're not going to have to have specialized people. This is going to let smaller teams and less specialized people jump in and take care of RFPs, and on the government side, it's going to help the evaluation.
Jennifer Spykerman (00:50:25):
Right? Yeah, absolutely. And make sure they are getting the best value.
Nicky Pike (00:50:28):
I loved your statement about lowest cost. There's an old joke in the military: you're riding on an airplane, and you've got to think to yourself, I'm riding on something built by the lowest bidder. I'm using a weapons platform that was chosen on cost, not on ability. So that fits in well.
Jennifer Spykerman (00:50:43):
Yeah. Yeah. It's kind of scary, right?
Nicky Pike (00:50:45):
Yep. Let's pull out some takeaways as we near the end of the show. You've been on both sides now: you've managed huge technical teams, and you've built this tool with AI. What's your opinion on enterprises thinking about AI replacing dev teams, and on how AI could augment what they're doing in their workflows today?
Jennifer Spykerman (00:51:03):
I'm going to take this from two different positions. First, if you're a developer at a company and you're really good at cloud development or whatever and you're getting nervous, I think you need to make sure you have a good systems view. I keep hearing this, and I agree, because I'm a systems engineer, so I'm biased. Be curious about what's going on around you, what's interacting with what you're good at, and start to get to know the broader landscape, because that's what's going to be required to make these tools actually successful. On the corporate side, if you're talking about getting rid of vast swaths of engineers, you have to remember that some of these people have very specific knowledge about your infrastructure, your applications, and your business logic, and AI is not going to have any idea what that looks like. So you need those folks. And you also need people coming in straight out of college, so you don't find yourself one day with all these folks getting ready to retire and no one coming up behind them to keep the company alive. I have a funny story: when I talk about my first job out of college, I always like to say I was hired on the 98.6 rule, meaning the only requirement was a body temperature of 98.6 degrees.
(00:52:32):
And the reason was that the company had laid off a whole swath of folks, or had just stopped hiring for a long time. So they had this thing called the bathtub chart. I was on one side of the bathtub along with all of my colleagues, who are all still really good friends now, and then there were all these folks we were working for on the other side of the bathtub, with no one in between. So you have to make sure you keep your pipeline moving, so you carry forward the corporation, the values, and the heritage knowledge of what you have built. Otherwise, you'll have an AI tool and a CEO getting ready to retire, and that'll be about it.
Nicky Pike (00:53:11):
And that's an amazing point. This is my personal opinion: we do see a lot of companies out there talking about replacing all their junior developers with AI, and that may work for them in the short term. But at some point your senior developers are going to retire, move on, go other places. What do you plan on doing to create the next batch of senior developers? It seems like they're making a pretty big bet that AI will take over that role as well, and I think that may be a false bet. It'll be interesting to see how that comes out.
Jennifer Spykerman (00:53:43):
We will be in the matrix then.
Nicky Pike (00:53:45):
Absolutely. All right, last thing. Based on your experience with this tool, where's the line between "AI can do this for me" and "I'm going to need a developer; I have to be very conscious of what's going on"? Where's that line?
Jennifer Spykerman (00:53:58):
I think there are a couple of lines, but the biggest one I'm seeing right now: AI is terrible, it sucks at user interface. It's just terrible. And I'm not really that good at it either, but I know that what it's given me is not right. So I think user experience is still a big deal. You want an intuitive application, and it doesn't create an intuitive application out of the box. That's one area where, unless you really want to talk through with the AI agent what might look right and iterate forever, I'd probably talk to someone about creating a better user interface. And then there's some of the deeper knowledge of what to look for in the RFPs I'm uploading. Even now, I can tell there are some pieces I'm going to go back and mess with after we get off, because I want to make sure I'm pulling out the correct pieces and it's still focusing on the wrong areas. So: how to help the tool actually know what to look for. But still, I've used another app to figure out how much time I've spent on this, and it's about 30 hours total, and I have a working application. That's not too bad for someone who hasn't done any coding in a long time.
Nicky Pike (00:55:19):
This would take months with a dev team. And it goes back to what you're stating: I have the knowledge of what I want to do. If I were trying to work with a developer, I'd have to impart that knowledge, and we'd go back and forth: no, this is not what I meant. You're able to do that very quickly with an AI agent. I think that's a huge benefit, that time savings, rapid prototyping. That's one of the superpowers of AI, along with the ability to jump into other things. And I loved your UI comment. As someone who has used government tools, that's a place where the government lacks. Their tools are functional, but my goodness, look at some of the UIs they put in there. If you bring something with a good UI and a good user experience, you're going to blow a lot of minds in the DoD.
Jennifer Spykerman (00:55:59):
Correct.
Nicky Pike (00:55:59):
That's just not what they're used to.
Jennifer Spykerman (00:56:01):
Just go to fpds.gov and good luck.
Nicky Pike (00:56:04):
Exactly. All right, we ask every guest this question; I hope you're ready for it. You've directed 300 engineers on NASA projects. You saved a hundred million dollars with the DoD. You traded coding for the C-suite about two decades ago, and now you're back to building AI tools. After this journey, what does it mean to you to be a coder?
Jennifer Spykerman (00:56:24):
To me, it means being an artist. I think being a coder is a very creative process, and that's my favorite part of building something. I've wanted to create something for a long time instead of directing, instead of building slides, I wanted to create something and put it out into the world. And so to me, that's what it means. It means being able to create something that takes all of my experience and what I feel I'm really good at and summarizes it into a beautiful little tool.
Nicky Pike (00:56:56):
We're in a time where apps are eating the world, and everybody out there listening will know this: you try an app, and if it's not creative, if it's not fun, if it's not useful, you immediately look for the next one, because there are a hundred different apps for everything you'd want to do. So there is a large amount of creativity there, and that's one of the things we haven't gotten from AI yet. It doesn't have that creativity. That's probably a topic for another show, what creativity is and how we can say AI doesn't have it. But look at it from a software perspective: creativity is what grabs people's minds. It's what makes them want to download that app and use that app. If you can't create that user experience, if you don't have that creativity, it's a failed project before it starts.
Jennifer Spykerman (00:57:36):
Right? Yeah, absolutely. I agree a hundred percent.
Nicky Pike (00:57:39):
All right. Time for the hot seat. Here's the hot seat cushion, ready? Based on what you've built and your experience on both sides, what percentage of a current developer's job do you think could be done by domain experts with AI? Don't sugarcoat it. What do you think?
Jennifer Spykerman (00:57:55):
At least 30%. I'm going to give it 30; I think it's less than half. If you're a domain expert who doesn't want an AI coding experience, it's not that easy, and you may not be thinking like a developer: oh, I need to make sure I have all my tests done, I need to check for security leaks, I need to make sure I've got... you're not thinking about that. You're like, give me a tool, and it's probably very specific. I don't think coders are going to go away, and maybe that's my wishful thinking, but I do believe that developers will spend more time developing and not as much time coding. That's where the real value is. Companies that are just firing huge swaths of their developers, I think they're going to regret it, because AI is only as good as what's in there. You need people who can at least be brave enough to open the IDE and say, wait a minute, what's going on over here, and do a little bit of engineering to see if it's going off the rails. Otherwise, good luck. You're putting your company at risk.
Nicky Pike (00:59:10):
I agree with you; I think that's a great take. I think we'll see software engineering jobs evolve into something more like directors of AI, but we're still going to need somebody who can tell the AI what good looks like, where things are going, and who brings that creativity in. Personally, I see software engineering changing into more of a classic engineering discipline. I give a talk about how automotive engineers and aerospace engineers aren't the ones turning the wrenches; they're building the blueprints and doing the designs. I think software engineering is going to become a lot like that, and time will tell. That's what I hope. I don't think there's any real fear of it taking all software development jobs. It's just not possible. As you said, there's that creativity that needs to come in, and that's something we possess. AI can't solve business problems. It can only tell you what other problems have been solved in the past.
Jennifer Spykerman (01:00:02):
Exactly, exactly. So if you want to do something a little bit newer, something that hasn't been solved in the past, then you're going to need to find a new solution. But the beauty of using these tools is that they let you focus on the outcomes instead of being stuck in the weeds of the development and the coding, trying to figure out why you're getting an error, instead of figuring out why it's not parsing a document the way you want and giving you the results you know you really need. So that's the flip: it's not about the coding, it's really about the outcomes now, with these tools. That's powerful.
Nicky Pike (01:00:43):
Yep, we'll see it. It's going to become more about fixing the real issues instead of, did I put a comma in the wrong place? We've all known those days of debugging because of a syntax error. So hopefully AI will take the typing away from us but leave the problem-solving to us. I think that's 100% right. As we close the show, I want to say thank you for being on the Devolution; I hope you become a proud member of the Devolution. Any parting words before we end?
Jennifer Spykerman (01:01:07):
It's easy to be pessimistic about some of these new technologies, but I think if you embrace them, then you're really future-proofing your own career. So play around with them. And just because you haven't done some coding in a long time or ever, that doesn't mean that you can't be a developer yourself, at least at some level. So that's what I would say.
Nicky Pike (01:01:28):
I love that. That's a great way to end the show. It's all about education and embracing the new technologies. Don't be fearful of them. Thank you very much, Jen. I look forward to having you back when you get this product out and it's actually working for people.
Jennifer Spykerman (01:01:40):
Oh, thanks Nicky. It was so much fun talking to you.
Nicky Pike (01:01:43):
Alright, thank you everyone, and for everyone that's out there listening, thanks for being a part of this. If you have any questions, please let me or Jen know. You can find Jen out on LinkedIn. Jen, do you mind if we put your LinkedIn profile out on the show notes?
Jennifer Spykerman (01:01:54):
Yeah, that's great. Please do.
Nicky Pike (01:01:56):
Excellent. So reach out if you got questions. Until next time, stay loose. Thank you for listening to Devolution. If you've got something for us to decode, let me know. You can message me, Nicky Pike on LinkedIn or join our Discord community and drop it there. And seriously, don't forget to subscribe. You do not want to miss what's next.