Software Delivery in Small Batches

Jason and I discuss automated testing from a beginner's perspective and its impact on engineering success. Strong opinions included!

Show Notes

★ Support this podcast on Patreon ★

Creators & Guests

Host
Adam Hawkins
Software Delivery Coach
Guest
Jason Swett
Author, speaker, conference organizer, podcaster, consultant.

What is Software Delivery in Small Batches?

Adam Hawkins presents the theory and practices behind software delivery excellence. Topics include DevOps, lean, software architecture, continuous delivery, and interviews with industry leaders.

[00:00:00] Hello and welcome. I'm your host, Adam Hawkins. This episode is a small batch of software delivery education. If you enjoy this episode, please share it with your friends and colleagues.

[00:00:15] Hello friends, Adam here for the next episode of Small Batches. This episode features my conversation with Jason Swett. I invited Jason back on the show to discuss his experience teaching testing to beginners.

[00:00:27] So in a sense, this is a back-to-basics episode. Automated testing is so important to software delivery that it's good to occasionally revisit it from the beginner's perspective. We discussed the initial challenges in learning testing, trying to achieve TDD, and the organizational challenges when adopting tests.

[00:00:44] We also covered the benefits of the red, green, refactor loop and why automated testing is just good software development hygiene. Also, stick around to the end of the conversation, because I've included some important closing thoughts on points brought up in the conversation. Now, here's my conversation with Jason Swett.

[00:01:01] Jason, welcome to the show. Thanks for having me. So I've already introduced you in my own words. Why don't you introduce yourself in your own words? Sure. I'm a developer, and I have been for many years, maybe about 20 years or so, give or take, at this point. I also do a certain amount of writing and speaking and podcasting, stuff like that, mostly on the topic of Ruby on Rails testing.

[00:01:25] So the reason that I wanted to have you on the show was just the last thing you mentioned, which was testing. One of the conversations that's been coming up a lot for me in these interviews is the importance of testing as a prerequisite for any kind of higher-level deployment automation.

[00:01:40] Because if you can't say that a particular change has been verified to be correct, then there's no way you can have automated deployment. So I want to take it back to basics in the sense of: what is TDD? What is testing? How do we do it? I thought you'd be a good person to talk to, because I know you have a course on testing Rails applications, but you also have experience teaching testing in general.

[00:02:01] Let's start there. Can you tell us a little bit about your course? What do you expect that the students know already, and how do you teach them? Well, maybe we can start with the questions that beginners tend to have, because my product offerings are in limbo right now. They're always evolving, and there's not really something out there right now where I can say, this is the course. But I can speak to some of the questions and some of the sticking points that I see beginners have.

[00:02:28] So the first is a conflation of TDD and testing. People will say, hey, I want to start doing TDD, what are the TDD tools, and stuff like that. Well, wait a second. Are you talking about TDD or just testing? Because those are not exactly the same thing. All TDD is testing, but not all testing is TDD. And I actually recommend to people that they don't learn TDD first, that they just learn testing first and then add TDD.

[00:02:57] So let me talk a little bit more about what I mean by that. Testing itself is already really hard. You have to figure out what to test and how to test it. You have to get the tooling in place. You have to have some kind of testing infrastructure. If you're working on an existing application, that's extra work,

[00:03:15] because now you have to somehow retrofit the existing application. If it doesn't have any testing infrastructure or any existing tests, you have to retrofit it with all that stuff, and that can be very non-trivial. Even for somebody like me, who's very experienced and comfortable with testing, going to an application that has been developed for a year or more without any testing

[00:03:39] and adding that stuff retroactively is going to be a huge challenge. So for somebody saying, hey, I want to start doing TDD on this app that I've been building for the last year with no tests, it's like, whoa, hold on a second. Let's do something a little easier first. So here's what I recommend,

[00:03:56] and I'll speak in the context of Ruby on Rails, because that's my background. I recommend to people: if you want to start learning testing, create a fresh Rails application and add some tests to it. And maybe even before that, just do some plain Ruby and do some testing with Ruby. But in any case, don't try to add tests

[00:04:13] to an existing application, and don't try to do TDD, because it's kind of like, hey, I want to learn how to juggle, but I also want to ride a unicycle on a tightrope at the same time. It's like, okay, you can juggle on a unicycle on a tightrope, but you're probably not going to learn those three things in parallel.

[00:04:31] How about let's learn to juggle first. Then you can add the unicycle, then you can add the tightrope. I think that's good advice, because that's also how I learned testing myself. First I have some code that's already written, I've done manual testing, I've played with the app, and I can say that, in fact, it is working.

[00:04:47] Now let's try writing a test for this functionality. I think part of the challenge in testing is just learning all the different tools involved, because it comes down to what level you want to test at. Say, for example, you might be using some unit testing library like JUnit, or Minitest in Ruby, or RSpec. There are any number of these test frameworks, right?
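To make that concrete, a first test of that kind in plain Ruby might look something like the sketch below, using Minitest from Ruby's standard library. The Calculator class and file name are invented for illustration.

  # calculator_test.rb (run with: ruby calculator_test.rb)
  require "minitest/autorun"

  # The code under test: a tiny class you have already verified by hand.
  class Calculator
    def add(a, b)
      a + b
    end
  end

  # The test simply pins down the behavior you observed manually.
  class CalculatorTest < Minitest::Test
    def test_add_returns_the_sum
      assert_equal 5, Calculator.new.add(2, 3)
    end
  end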

[00:05:07] But depending on what you're trying to test, maybe you need something like Selenium, or you're using Cucumber, all these little layers of tools on top of that. And then the other part is that it can be really hard to bootstrap a project that has been developed without any testing done at all.

[00:05:23] And I think that speaks to the virtuous feedback loop of testing on design. I think we agree there. So I'm curious on your view: when you encounter these systems that don't have any tests at all, and then you want to start doing testing, why is that so challenging? Why is this a hard thing for beginners to understand?

[00:05:42] There are lots of reasons why it's challenging to add tests to an application that's been built without tests. Let me actually start with the challenges that are not technical, the challenges that are more organizational. Let's say you have a team of eight developers and this application with no tests. You all agree

[00:06:00] that it's bad that there are no tests and that it would be good if you had them. Well, you can't just say, from now on we're going to start writing tests, because you have to have a plan for how to do that. You have to have an agreement; you have to be on the same page as far as how you're going to do that. If it's a Ruby on Rails application, for example, there are questions like: are we going to use the RSpec testing framework, or the Minitest testing framework, or something else?

[00:06:25] Are we going to have a separate test suite for our JavaScript, or are we going to test our JavaScript using RSpec? You have to have the tooling conversation. I had a situation at a past job where our test coverage wasn't great, and one developer decided to add some tests. That developer used certain tools; another developer separately added some tests using a different set of tools.

[00:06:47] And so we went from having no tests to having two different test suites. I think one of them was connected to CI and the other one wasn't, and it was unclear whether both of these test suites really counted, like, what's going on here, because we didn't get together at the beginning and have that conversation saying, here's our plan,

[00:07:06] here's what we're going to do. So that's one challenge. Another challenge is that the skills might not exist on the team. I've heard it said before: okay, from now on, everyone, we're going to write tests for everything we do. Every PR has to have a test included, or we're not going to accept the PR. But you can't exactly just do that, because not everybody knows how.

[00:07:27] And this brings us to another challenge, which is more of a technical challenge, which is that you can't just slap a test onto any change that you make, because the underlying functionality where you're making that change might be really complex. And so in order to have a test, you have to have certain setup data, and the data you have to have in place in order to

[00:07:52] simulate this world that exists when this area of code is running might be extremely non-trivial. And so to expect somebody to include a test with every PR from some point in time onward is just not realistic, because for some features, your change might be a 10-minute change, but it would take literally weeks of work to get the test setup in place in order to be able to take the 20 minutes to write a test.
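To give a feel for that setup burden, here is a hedged sketch using the factory_bot gem, which many Rails teams use to build test data. The clinic, doctor, and appointment models are invented, and real legacy features often need far more than this.

  # spec/factories.rb: the world a single test may need to simulate.
  FactoryBot.define do
    factory :clinic do
      name { "Example Clinic" }
    end

    factory :doctor do
      clinic # implicitly builds the associated clinic
      name { "Dr. Example" }
    end

    factory :appointment do
      doctor # which in turn builds a clinic
      scheduled_at { Time.now + 86_400 } # one day out
    end
  end

  # Even a ten-minute change to appointments may need all of the above:
  #   appointment = FactoryBot.create(:appointment)
  # which creates the doctor and clinic behind the scenes as well.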

[00:08:18] So that's not a realistic way to do it. You have to gradually build up. I always recommend starting with the easiest, most trivial stuff: add some tests in those areas, and then work your way up until you have the ability to test those meatier features. It seems intuitive to test the most important stuff first, but that's actually maybe not a very realistic way to go, because the most important stuff is often the most non-trivial stuff,

[00:08:43] so it's going to be the hardest to test. I recommend working up slowly to that stuff, rather than trying to start with it. Yeah. So I think what you mentioned also speaks to one of the other conflations when it comes to testing and TDD. You mentioned that in your story one of the developers hooked it up to CI, and we haven't really spoken about CI yet, but there's this assumption that when you do testing, you're going to have CI.

[00:09:06] So in this case, CI is a stand-in for some external system that's automatically executing our tests whenever we push, and then marking that, recording it as part of some commit status or some other indication of the relation between source code and passing tests. So is this something that you also have to explain, a concept you have to introduce to people, or do they grok this automated execution intuitively?

[00:09:33] Well, the real answer is we generally never even get there. We focus on writing tests for your application and just running them locally on your own machine, and CI doesn't even come into the picture, because testing is such a huge, huge area, you know? The end that I focus on is the very beginning end, and CI doesn't even come into the picture at that stage.

[00:09:57] I see, makes sense. Especially when you're just trying to ladder somebody up to the idea of writing tests: what does writing a test look like? What does writing a test feel like? What are you trying to achieve by making this test? Yeah. And I've made the mistake that every teacher makes, which is, okay,

[00:10:13] we're going to learn Ruby on Rails testing. And then I discovered that's way too broad, way too much stuff. And the next time it's, okay, class, we're going to learn Ruby on Rails end-to-end testing. Oh wait, that's way too much too. Okay, we're just going to learn RSpec syntax. Oh wait, that's also too much. And I keep narrowing the scope, because there really is just so much to learn.

[00:10:34] I agree with you. Especially if you're trying to introduce somebody to the idea of just writing a test, what's the amount of scope that we can remove from this whole exercise, so that you can just focus on the two things you need to pay attention to, which are the code in question and the test? So is that kind of where you start when you're teaching people?

[00:10:51] The answer to that question, again, has evolved and changed over time. Things have been weird, because I used to do in-person corporate training: fly to a client's office and teach a class. And for certain reasons I don't do that anymore, specifically right now because of coronavirus. Nobody's doing that kind of stuff anymore.

[00:11:09] But I did do one class where I had people sign up online, and then we met on a call four times. The pitch for that one was that we would learn Rails testing. I honestly wasn't really sure what people were going to want, or how things were going to go, or anything like that.

[00:11:28] And so I purposely kept it vague. I said, hey, we're going to do some Rails testing stuff. I'm not exactly sure what it's going to be. This is an experiment. I was just really frank about that part of it. And so there were 20-some students who signed up. I did two cohorts of 12 or so people, if I'm remembering right.

[00:11:46] I'm trying to remember what we did first, but I had them take a fresh Rails app again. And the format that ended up working out nicely was that we would get together on a call. I would give them some assignments beforehand, but then I would do the initial assignments in front of them. If there were eight assignments for that week, I would maybe do the first two or three

[00:12:07] in front of them, so that they could see: oh, okay, this is how I get started on this. And then they would go off on their own and do the rest. It was basically just a bunch of practice. It wasn't a building up of, we're going to do this fundamental thing and then this thing on top of it. It was more like, we're going to add this tiny feature, and

[00:12:25] here's how we write a test for that. Then this other tiny feature, and here's a test for that. Because, to use Rails as an example yet again, if all you do is build a CRUD interface and write a test for it, and then build another CRUD interface and write a test for it, and build another CRUD interface and write a test for it,

[00:12:44] even though you're not doing anything that's all that different, just that sheer repetition helps a lot, because you can see: oh, okay, every time I make a new CRUD interface, I'm going to write a test for saving something valid. I'm going to write a test for trying to save something invalid and asserting that there are some error messages present on the page.

[00:13:04] I'm going to write a test for updating an existing record, and so on. They're going to see those commonalities, and they're going to get a feel for the general approach you take, so that you can write a test for any feature you build. Yeah, the audience can't see me, but I've been nodding along the whole time while you've been talking about discovering what things you need to test for.
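As a sketch of that repeated pattern, a system spec for a hypothetical Rails CRUD interface might look like this with RSpec and Capybara. The Article model, paths, and messages are all invented.

  # spec/system/articles_spec.rb
  require "rails_helper"

  RSpec.describe "Articles", type: :system do
    it "saves a valid record" do
      visit new_article_path
      fill_in "Title", with: "Hello"
      click_button "Create Article"
      expect(page).to have_content("Article was successfully created")
    end

    it "shows errors when the record is invalid" do
      visit new_article_path
      click_button "Create Article" # submit with a blank title
      expect(page).to have_content("Title can't be blank")
    end

    it "updates an existing record" do
      article = Article.create!(title: "Old title")
      visit edit_article_path(article)
      fill_in "Title", with: "New title"
      click_button "Update Article"
      expect(page).to have_content("New title")
    end
  end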

[00:13:21] And that is a big part of the initial learning, especially when it comes to repetition: understanding that when I hit scenario X, I need to write tests A, B, and C. It also helps you, as the programmer, build your own model of what your code is going to be doing: what a happy-path execution looks like, what a failure-mode execution looks like, and what kinds of things can go wrong.

[00:13:46] What things should I not expect? What things can I expect? And it really does take just a lot of repetition. Yeah. In the beginning it can feel lame to just write tests for the same thing over and over again, but that's how you learn to do anything, right? It just comes down to practice. Yeah. And I don't know, it might feel lame for me and for you, but maybe for a beginner it doesn't feel lame, because it's all so very difficult. Like, for me:

[00:14:11] I do all the cooking in my household, and a few months ago I made this involved recipe. It was this beef stew, and it's an all-day thing. The first time I made it, it actually turned out really good, and I was happy about that. But when I'm cooking any recipe for the first time, so much attention and energy is spent just looking up, okay, it's a tablespoon of this,

[00:14:34] it's one cup of this, blah, blah, blah, and I'm not actually paying attention to what I'm doing and internalizing it. So I made that recipe again. And even though I was making the exact same recipe, I learned something new the second time I did it, because I wasn't having to pay quite as much attention to going back and rereading the recipe and stuff like that.

[00:14:54] I could free myself up and say, okay, now I have all this stuff down. Oh, okay, we're searing this beef before we braise it, because that searing step adds a certain flavor, and we're doing this, and yeah, okay, I understand now. It was only on the second iteration that I could begin to actually understand the recipe, because only then was my mind freed up enough to be able to undertake that.

[00:15:18] Yeah, I think this also speaks to the feedback loop that kicks in once a person is sufficiently skilled in testing: that understanding of testing goes on to influence how they actually write the software itself. There's that saying that testing influences design. If you start testing, you'll make designs that are easier to test.

[00:15:39] And then that creates more confidence in your code, because now you're able to write more tests for it and understand that it's working, and it just goes in this sort of virtuous cycle. So have you, in your classes and with your students, seen people hit that point where, like you said, they have this level of understanding, and they don't have to pay so much attention to the low-level details of everything?

[00:15:56] And they can take a step back and see, aha, this is why we're doing this, and this is maybe how I can do it differently, or maybe even change the recipe? No, because the most in-depth thing that I've done so far is that class where we had four sessions. It was four weeks, with one meeting and then a week's worth of homework each week, and four weeks is really not a lot of time.

[00:16:17] It takes so long to learn testing. It might take a year to get really comfortable with it. And so I haven't been with people for long enough to be able to watch their progression and see them get to that point. I've certainly gone through that progression myself, but unfortunately I'm not really in a position where I can observe the students doing that.

[00:16:37] Yeah, I think it really does take years. And to me, testing is skill zero of software engineering. If you can't do testing, then you're not going to be able to ladder up to any higher level of success in the field, things like continuous delivery and continuous deployment. If you're stuck doing manual testing, you'll never be able to work as fast as other people.

[00:16:57] Your teams will never be as performant. Your software will never be as bug-free as it could be if there were automated tests. Yeah. And something that I've come to discover is that having really high-quality code is pretty much, and I hope this isn't too bold of a statement, having really high-quality code is almost impossible without having good test coverage.

[00:17:20] And the reason for that is that test coverage enables refactoring. And if you can't refactor your code, then it's really hard to keep it in good shape, because you can never, ever just sit down and code everything beautifully on the first try. You're going to do a bad job, and then you're going to go back and clean it up.

[00:17:37] And then six months later, your little area of code that you started off with is going to be part of this much bigger picture. And because that whole other part of the picture has changed, your original part of the picture no longer makes complete sense in the larger context of everything else. And so you're going to have to revisit your whole entire code base and shift things around a little bit.

[00:17:57] And you can't go back and shift your whole code base if you don't have tests, because it's just too risky. Yeah, that's true. And I'm glad that you brought up refactoring, because that's part of the whole TDD loop, right? Red, green, refactor. And a whole part of professional software development is just changing code that's already written to meet whatever the new requirements are.
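For anyone new to the term, here is a minimal sketch of one pass through that red, green, refactor loop in plain Ruby with Minitest. The Price example is invented.

  require "minitest/autorun"

  # RED: write the test first and watch it fail (Price doesn't exist yet).
  class PriceTest < Minitest::Test
    def test_total_includes_tax
      assert_equal 108, Price.new(100).total(tax_rate: 0.08)
    end
  end

  # GREEN: write the simplest code that makes the test pass.
  # REFACTOR: with the test green, rename and restructure freely,
  # rerunning the test after each small change to stay green.
  class Price
    def initialize(amount)
      @amount = amount
    end

    def total(tax_rate:)
      (@amount * (1 + tax_rate)).round
    end
  end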

[00:18:13] So if you can't effectively change existing code with confidence, you're always shooting yourself in the foot. And I don't think that your statement about high-quality code being predicated on having tests is too much. I think that's just a fact at this point. I don't see that there's any way you could really dispute it.

[00:18:30] And I think that if, say, as an engineer, I have sufficient experience writing software guided by tests, then even if for some reason you started writing software that wasn't guided by tests, you would still write software in the same way that you learned guided by tests. Even if you weren't writing the tests themselves, you would still produce an underlying software architecture that mirrored the way you thought about software architecture as if there were tests, because it has become second nature to think about and design software in this way.

[00:18:56] Right. Yeah. And I'll qualify my statement in two ways. It's certainly possible to write a very small amount of high-quality code without tests. You can do that, and I've done that plenty of times. I write one single class that has a few methods, and it all looks great. There are no tests, but it works fine. That's fine. You can do that.

[00:19:16] What is more difficult is keeping the code good when you have a big, non-trivial system. You can, but it will be prohibitively expensive to do so, because without automated tests, the only way to test it is to have somebody go back and manually test every single piece of functionality that exists in the application. That would be totally impractical, because you'd have to do your refactoring, and then either you yourself or your QA person would have to spend, say,

[00:19:43] two weeks just going through and testing everything. And then what if they find that something doesn't work? They're going to say, oh hey, four of the tests that I performed manually failed, so fix these four things, and then I'll go back and spend two weeks again going through all my manual tests. That's just crazy.

[00:20:01] Obviously it's much more practical to be able to run a test suite and have it take some number of minutes and then be done. So I think, like you said, it isn't too bold a statement to say you can't have high-quality code without having tests, because otherwise it's just not practical. Yeah. I think you can also qualify it a little bit more, in the sense that you can have, say, a high-quality small unit of code.

[00:20:22] You could write three lines of code that don't have a test and still be fine, but I don't think there's any way you can produce a high-quality system as a whole that does not have tests. And I'll say this again, and I don't have any qualms about saying it: if you consider yourself a professional software engineer and you're not doing tests, then you're not a professional engineer. I don't think there's any way you can get around that, because I don't feel comfortable putting a system into production that does not have tests, because there's no way you can say that it is verifiably correct.

[00:20:48] Why deploy something that hasn't been verified to be correct? You wouldn't make the test flight of an airplane the first time you put customers on it. You do that beforehand. There's a reason for that, you know? So why do software developers think differently? I don't know if I would put it quite that strongly, to say you're not a professional software engineer if you're not doing tests, because there is one valid excuse, which is if you're early enough in your career that you just haven't built that skill yet.

[00:21:16] If I hired somebody straight out of college or something, I wouldn't necessarily expect them to be able to write tests. So that's excusable, but I would certainly hope that somebody gets started. And, you know, a lot of people might get their first job and find themselves in an organization that doesn't write tests

[00:21:35] at all, and so they're not going to get that production experience writing tests. And so there are a lot of people out there who don't know how to write tests. I don't want to say it's no fault of their own, because you can always teach yourself anything, but it is a lot harder when your coworkers aren't writing tests. It's an uphill learning experience.

[00:21:53] There's only so much that you can learn in a vacuum. Sure, you can teach yourself through courses and things how to apply the principles in an isolated environment, but it's always different when you're actually working on a customer-facing product or building something as part of a team. Those are totally different skills and practices.

[00:22:10] So I do agree with your caveat that if you have zero experience, of course you don't have any experience with these things, and there's no reason to expect it. However, if you're calling yourself some sort of senior software engineer, you've been in the field for 10 years, and you're not doing tests,

[00:22:24] then that kind of blows my mind. How could you say that? Yeah, I would say that the title of senior software engineer and testing have to go together. You can't really be a senior developer and not write tests. That just doesn't make sense. Yeah. There's some inflection point where somebody is saying, hey, look, I'm going to hire you,

[00:22:41] you're going to build me this thing, and you, as the person who is building it, want to know that what you built works. If you don't care about that as the builder, then something is off there, in my mind. And I think the only way to do that is through automated testing. Earlier in the conversation we talked about just testing, but now we've laddered up into automated testing, because, in your example of the QA person, it's totally impractical to have them go back and do it manually. It just takes too long.

[00:23:07] It's not scalable, right? So once you have these tests, the second benefit is that you can automate the execution and run them whenever you want, on any change. This is where CI comes in. Yeah. And plus, at least for me, I'm much too lazy to do the testing manually on my own. It's just painful. Whatever genetic thing makes somebody want to be a programmer also makes them not want to manually test things. But there's also the wrong incentive.

[00:23:31] Imagine you testing your own stuff versus a QA person testing your stuff. The QA person is rewarded for finding bugs. In a way, that's their job: to try to hit it and hammer on it and scrutinize it and see if they can get it to fail. But you have an incentive to find that everything works, because if everything works, then you can move on and do your next task.

[00:23:51] I'm completely guilty of this. I'll do some cursory testing and be like, oh yeah, it works, because I want it to work. And so, probably even if only subconsciously, I'm not going to test it as thoroughly as somebody else might. But better even than having somebody else test it is to have automated tests, because that takes out the bias I have toward not testing my own stuff thoroughly.

[00:24:13] Yeah. When I was working on teams that actually had QA people, I was disappointed in myself if I had written code that somebody could find a bug in. I think it's actually relatively easy to produce software that's free of obvious bugs, and if you're not doing that, then you can improve to hit that level. So one of the first aha moments I had around testing was when I had written some code

[00:24:35] and written some tests for it, and then something went wrong. There was a bug. So instead of just doing some manual testing and writing something to fix it, I wrote a test that demonstrated that there was a bug, and then wrote the code to fix the bug. That was the first time that had ever happened,

[00:24:54] and it was a magical moment for me, because now I could say that this bug will never, ever happen again. So if you think of this as an asymptote, the number of defects you find should eventually decrease down to some point over time with a sufficient test suite. There's no reason to reintroduce old bugs, only to uncover new ones.
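A hedged sketch of that bug-first flow, with an invented nil-handling bug:

  require "minitest/autorun"

  # The original version crashed when full_name was nil:
  #   def first_name(full_name)
  #     full_name.split(" ").first
  #   end
  # Step 1: write a test that reproduces the bug (it fails against the
  # code above). Step 2: fix the code until the test passes. The bug is
  # now pinned down and can never silently return.
  def first_name(full_name)
    full_name.to_s.split(" ").first.to_s
  end

  class FirstNameTest < Minitest::Test
    def test_returns_empty_string_for_nil
      assert_equal "", first_name(nil)
    end

    def test_returns_the_first_word
      assert_equal "Ada", first_name("Ada Lovelace")
    end
  end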

[00:25:11] Yeah. You can quibble: if you're adding more and more code all the time, you're probably going to be adding more and more bugs. And I'm experiencing this right now: if you're working for a business that's growing, there are going to be more people hammering on the features, and you're going to encounter more edge cases and stuff like that.

[00:25:31] So there may be more bugs. But I agree with what you're saying in principle: if everything is fixed, the usage of the application is fixed, you're not adding features, and all you're doing is adding tests as you discover bugs, then yeah, your bugs should go down to zero, because for every bug you have, if you write a test for it, you'll never encounter it again.

[00:25:50] It's a great feeling. Yeah. And in that model, the complexity of the issues you find goes up over time, because they're not really related to the interplay of something known. They're usually related to more and more complicated interactions: not just one unit of code interacting, but five or 10 or 200, or at least larger components, and permutations that you never considered in the first place, because, like you said, some customer hit it for the first time.

[00:26:15] And you're like, wow, I thought that could never happen, or, I didn't even know this was possible. So in your experience teaching, have you been able to demonstrate the red, green, refactor loop? Yeah, definitely. In fact, in one of the classes I taught, we spent a whole week just on refactoring, to the point where the students asked, can we not talk about refactoring anymore?

[00:26:36] This is getting boring. But there's a lot to talk about when it comes to refactoring, and it took me a long time to really appreciate what it means. There's an entire book, you probably know it, Adam: Refactoring by Martin Fowler, which covers all these different refactoring techniques.

[00:26:52] And that's how things really get nice. Because, as I said earlier, the first time you write any piece of code, it's probably not going to be nearly as neat and tidy and understandable as it could be, and testing and going back and refactoring enable you to fix that. And I guess the thing that I didn't understand at first is that I was thinking of the red, green, refactor cycle in a sense that was too localized.

[00:27:16] I was thinking about it in the context of a method, or maybe at most a class. But now I think of it on the level of a module or an entire application, and that's where it really gets powerful. Because if you have sufficient test coverage, you can go back and refactor your entire application. And I'm not talking about making huge changes all at once.

[00:27:38] Oh, here's an example. I had an instance of this in my application. It's a medical application, and we talk about left eye, right eye, or both. Left eye is OS, right eye is OD, and both is OU. That's the doctors' jargon for it. I was referring to that distinction as eye designation, so I had all these various method names

[00:28:01] and database columns called eye_designation. Later, I learned that the doctors don't call it eye designation; they call it laterality. So I had this horrible period of time where it was called eye designation in some places and laterality in other places, and I knew that was really bad. So I went through everywhere I had eye designation and changed it to laterality, ran my tests, and sure enough, I had broken something in the course of doing that.

[00:28:26] Because there was some stuff where you can't just do a search-and-replace for eye_designation and replace it with laterality. There was some stuff where I was referring to it without the word eye right next to it, like has_designation or something like that, and so that stuff didn't quite work the same.

[00:28:43] And so my tests saved me. I bring that up as an example of something that might touch your whole entire code base. It's not a huge change, so I'm not radically restructuring something. I'm just changing the name of one thing to another. But even those kinds of things aren't possible unless you have decent test coverage.
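As a hedged sketch of that kind of rename in Rails, the database side might be a one-line migration. The table name here is invented, while the column names come from Jason's story.

  # db/migrate/20200101000000_rename_eye_designation_to_laterality.rb
  class RenameEyeDesignationToLaterality < ActiveRecord::Migration[6.0]
    def change
      rename_column :prescriptions, :eye_designation, :laterality
    end
  end

  # A plain search-and-replace of "eye_designation" misses derived names
  # like has_designation, which is exactly the kind of straggler a test
  # suite catches after a rename.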

[00:28:59] Oh yeah. And I think this speaks to how the level at which you can refactor is directly related to the level of your test suite. If your test suite only covers, say, methods, well, then you can only refactor those methods with confidence. And as you ladder up the coverage with the different types of tests you write,

[00:29:15] I mean, there's a big level of subjectivity here, like what counts as unit versus integration versus end-to-end versus acceptance, blah, blah, blah, all this type of stuff. But the fact of the matter is that if you write tests that integrate more and more bits of code, then you can change those bits of code with more confidence.

[00:29:30] So if you have some sort of end-to-end tests that cover a whole flow or journey in your product or application, then you can refactor that whole thing, or make a change to it, with confidence, and not have to worry about breaking it. And I think that's where some of the real fun comes in in software engineering: when you can actually just do your work with the confidence that it is working as expected.

[00:29:51] And you can think, hey, there's this thing that's bothering me that I want to change, but maybe without tests I'm not confident that I can, because it's just too risky, or it will take too much time, or whatever. But with tests, you can say, hey, I will do this, or at least try it. And without them, you're just stuck in this rut where, sure,

[00:30:09] Maybe you can do it, but it's going to really suck. Yeah. And would be really. Yeah, exactly. And I think this is one of the other things that speaks to the designation of senior or some kind of more experienced engineers. Really? It all comes down to managing risk, like a risk of the software in question being correct.

[00:30:27] But one of the other risks that like, this is something that I, when I first learned testing, wasn't something that I was aware of, but it was. It only came out like long, long tail of this, which was, there's a risk of producing incorrect software, but there's also a risk to quote the business and that the engineers will not be able to keep pace with the business.

[00:30:46] If they have to spend so much extra time just verifying the software or writing it. But there's a mistake in thinking that like writing tests takes more time. It takes away from developing X. But the reality of it is that if you have automated test. Then you will be able to work much faster than if you didn't.

[00:31:04] Yeah. Even developers believe that writing tests is this thing. That's extra writing a feature with tests takes more time than writing the feature without tests it's as if the belief is okay, I can write this in five hours with no tests or I can write the feature in five hours and then spend an extra two hours adding.

[00:31:22] When really it's like, no, you can write it in five hours with no tests, or you can write it in two hours by using tests. It doesn't always work out like that. Sometimes it is in fact much more time-consuming. Sometimes it takes 30 seconds to write a feature and then an hour to write the test, or something crazy like that.

[00:31:39] That's definitely true. But on average, writing tests is a net time saver. Yeah, I think it only takes more time in the beginning, when you're honing the skills of learning what to test, because learning how to write tests is a legitimate skill. It doesn't just come to you; you have to practice it. But yeah, I think that's a super important thing to realize. Even myself, I didn't consciously realize until relatively recently that tests can save you time.

[00:32:05] It's not a long-term investment that takes a long time to pay off, except in some cases. In the normal case, if I personally start a new Rails application and I write tests from day one, that's not a long-term investment that takes a long time to pay off. It's going to start paying back immediately.

[00:32:23] There are cases, to go back again to that example of the application that's been developed for two years with no tests, where there's this big team and they don't have the skills, and yeah, that's going to be a long-term investment. It's going to be a big upfront investment, and it's going to take maybe a year or two before it starts paying off.

[00:32:40] When it does start to pay off, the payoff will be great, but that's not the case in all cases. If you have a skilled developer who's just starting from a fresh application, testing is going to start paying off right away. Oh yeah. I think one of the very first projects I had was effectively, not necessarily a rescue project from the beginning, but, as you say, somebody had developed this application, and it had some tests, but at some point they stopped passing and then were never updated, which is even worse.

[00:33:08] So it's like, okay, do I trust these or not? Because if they're not passing, then what good are they? And it's like, do I try to fix these, or do I just blow them away and start over? So I just blew them away and started over, because it was easier, and I spent two to three months just writing tests for the functionality in the existing application, such that I could build new features and change existing ones according to the client's requests.

[00:33:31] And even now, the company I work at has low test coverage, and it has been a multi-year effort. This is actually one of the real problems when it comes to testing: if for some reason the people who start the project don't do testing, and then the project is successful enough to go on for X number of years without it,

[00:33:50] then the effort required to add testing after the fact is, just for the sake of argument, an order of magnitude more difficult than it would have been in the beginning. And then the challenge is that if you start adding tests later, when new features come in or bugs are found or whatever, you may not even have the kind of test infrastructure in place to exercise that part of the code, or it's completely uncovered.

[00:34:11] And then, maybe as the status quo, there's a bug fix without a test for it, because it's just not possible, and the can just gets kicked further down the road. Do you try to impart the importance of testing as early as possible to newcomers? Unfortunately, a lot of times I'm not there at that early stage of the project to try to influence that. I do have mixed feelings, but I'm often frustrated by the attitude of, let's not spend time making this

[00:34:38] high quality, because what if the startup fails, and we spent all that time making it good for no reason? But what if you don't spend the time making it good and it succeeds? That's a little bit of a tricky position to be in. And yes, it's true that you can later have the revenue to spend money to fix it,

[00:34:55] and it may also be true that if you cut corners early on, it was only by cutting corners that you got to the place where you have the revenue, because you got to market faster. But I'm not super convinced by that. I think that is maybe a self-delusion, because, again, there's this quote from Bob Martin: the only way to go fast is to go well.

[00:35:17] And it's like, why is good code called good code? Precisely because it's fast. It's not called good code because it gives us good feelings or something like that. Although maybe that too. I prefer not to use the words good code or quality code or anything like that, because those terms don't carry a lot of meaning.

[00:35:36] I prefer to say that the code is understandable, because that makes it much more tangible. Think about what a non-technical person thinks when they hear good code versus bad code. They just think, oh, it's programmers wanting to practice their craft and make themselves feel nice, that kind of thing.

[00:35:54] But if you say the code is understandable, or it's not understandable, that's more tangible. It's like, oh, okay, the code is not understandable. That means it's time-consuming and expensive to work with. Got it. So we need to make an effort to make the code understandable, because code that's understandable is faster and less expensive to work with.

[00:36:13] Okay, I can see how that makes sense. Yeah, let's make the code understandable. Yeah. I like your point about considering: what if the startup succeeds? I mean, that's the happy path, the one you're trying to get to, right? You're starting a business; you want it to succeed and go on. So then there's the scenario where, for some reason, you don't put the testing in place early, and then X number of years down the line there's almost a day of reckoning.

[00:36:36] And I've been in situations where maybe there are no tests, or maybe you don't have the best coverage or whatever. But in any business, there can come a point when there's a change in requirements so great, or a business need so important, that the system may have to change drastically. And if you don't have tests, then you don't have the capacity to change

[00:36:54] the code to meet those requirements, or you just can't do it at all. And I've been in this situation, where there was a new business requirement that came in, and the company needed it to be successful in the market. Without it, they just weren't going to succeed. But the overall architecture of the system was not there to support it,

[00:37:11] and the tests weren't there to support that level of change. And if you're going to succeed, you need these things. There is a really big risk in just deferring it and deferring it and deferring it, you know? And I think part of it comes from the assumption that it will take more time. But I can tell you from my own personal experience now: if I were to write code without tests, it would take me longer than if I did add tests.

[00:37:35] For me, thinking in this sort of red, green, refactor loop, thinking in terms of test-first and writing testable code, is just my native way of thinking. These things aren't discrete in my mind. I can't just write code without a test. If I have tests, I'm going to refactor.

[00:37:49] I'm going to make this, at the very end, the best possible outcome given what I have at the time. And if I don't work like that, then all kinds of stuff just gets really weird. Things don't make sense, things break; it just doesn't come out the right way. And I'm wondering if you have had a similar experience in your own work life. Oh, absolutely.

[00:38:08] Testing has become an inextricable part of my workflow, just like version control and other things like that. If you asked me to write something without tests and without version control, it would completely mess with my habits, because it's like, okay, I build a feature, and then I see, oh wait, I can't commit it.

[00:38:26] You're making me not use version control? Okay, this is weird. All right, well, okay, now I'm going to write the test. Oh wait, I'm not even doing the same thing at all. It's not programming anymore, because these things aren't extra things that are added on. They're part of the process of programming, and they can't be separated.

[00:38:46] Yeah. I think that's part of something that needs to be taught to the people who are just coming into it. Sure, they want to learn these skills, but this is the on-ramp into a much different way of thinking about the act of software development. Once you really get committed to this, then, and I don't like to use the term, it becomes like a religion, in the way that you can't really change the way you think about it.

[00:39:07] These are rituals and ways of working that are just built into how you think and what you do, such that, as you say, they're inextricable from your world. I would steer away myself from calling it a religion, because with a religion, and I don't mean to offend anybody who is religious, it's not grounded in reasons for believing what you believe.

[00:39:30] I believe in such-and-such because the holy book says so, or whatever. I don't believe in my testing practices because the testing holy book says that I should do such and such. Although there certainly are people who say, do TDD a hundred percent of the time or else you are a sinner, or whatever. I think of it more like science. In pre-scientific times,

[00:39:50] if somebody had some kind of medical ailment, we would cut them and let them bleed into a bucket for a few hours, because we thought that would make them get better. That's not grounded in any kind of reason. That's just superstition, or whatever you want to call it. But now, because we have science, we've found that these certain things work and these certain things don't work.

[00:40:10] And so we do things a certain way. It's the same exact thing with testing. We've discovered that everything tends to go better when we use these certain practices, and so the habits that I've developed are in accordance with those justified beliefs. And so things actually do go better when we follow those practices.

[00:40:29] So it's not like a religion. It's more like a doctor washing his hands before he performs surgery. It's just the proper way to do the work. And I like what you said about science too, because now I think about the benefits of testing and the things it enables, specifically continuous delivery.

[00:40:46] These higher abstractions of DevOps that we talked about before are all predicated on the idea that automation allows you to go faster with sufficient safety. There's no trade-off between the two, which is the common misconception of people outside of software. They just see it as, it will take more time to do these things.

[00:41:04] But empirically, we can say that writing tests and using automation is the best way to develop software. I think this is proven by empirical data, and if people want to dispute that, that's a whole different conversation. You know what I mean? Yeah, exactly. Yeah. I don't really know anybody now who actually believes that not writing tests is better than writing tests.

[00:41:26] I've been there before; I've encountered those people. But I suppose naturally, because of who I am and what I believe about testing and stuff like that, I haven't stuck around at those places. Yeah, this is true. So I think this is something that I said in one of our interviews: if you are working for a company that doesn't think testing is important, then you should find a different company to work at.

[00:41:46] You will be unhappy; your work there will just be worse than if you were working at a place that actually cared about testing and these other aspects of software quality. It's like you said: there's a big difference between being the doctor at the hospital where you don't have to wash your hands before you go into surgery and being at the one that expects you to do so, right?

[00:42:03] Yeah. And you see forum posts sometimes, like, hey, my boss thinks testing is a waste of time. How do I convince my boss that we should write tests? And it's like, no, first you just need to learn some things about human nature. You're not going to persuade your boss to have different beliefs. The sad truth is that you just need to go work somewhere else.

[00:42:22] Yeah. Well, I think that's probably a good jumping-off point for our conversation. So thanks for coming on the show, Jason. It was a lot of fun to talk to you about testing and to revisit the perspective of a beginner. Is there anything you'd like to leave listeners with before we go? Yeah, thanks for having me.

[00:42:36] You can find all my stuff at codewithjason.com. All right, Jason, thanks. And for the listeners, you can go to smallbatches.fm for links to all the stuff that we discussed in the show and links to Jason's courses. Thanks again, Jason. Thank you. So that was my conversation with Jason. I've had some time to digest it, plus I've listened to it over and over again while editing.

[00:43:00] Now I'd like to follow up on two points from the conversation. The first is my earlier statement on testing as a mandatory practice for software development, or, as I put it in other words: you're not a professional software developer if you're not doing automated testing. The second is Jason's point about doctors washing their hands before going into surgery.

[00:43:18] I think Jason's point about hygiene better describes what I'm getting at with my comment on professional software development. At one point in time, the medical field was unaware of dangerous pathogens. Then at some point they learned about them and became aware of the dangers, and it became common practice to sterilize instruments and wash up before surgical operations.

[00:43:39] If a hospital did not follow this practice today, then it would be sued out of existence for negligence or malpractice. This mirrors my position on automated testing. Forgoing automated testing at this point in time is just professional negligence, because it exposes a business to many well-known and potentially fatal risks.

[00:43:58] I say fatal in the sense that systems may collapse under technical debt without the safety of automated tests, in the worst case leading to a total ground-up rewrite as the only way to clean up the mess. Businesses may also collapse because they cannot adapt quickly enough when they rely on manual processes, or they may simply be outmaneuvered by a more nimble competitor.

[00:44:20] This doesn't even account for the negative impacts on the developers themselves. I assure you that working on a team or on a product that does not have automated tests will create lower job satisfaction and lead to higher rates of burnout. And that doesn't even account for the quality benefits of automated testing.

[00:44:36] Jason brought up the point of science and reason as justification for automated testing, and there is certainly a strong case here. Allow me to share a few stats from the book Accelerate comparing high-performing IT companies to low performers. High-performing IT companies have 46 times more frequent code deployments than low performers.

[00:44:54] They have 440 times faster lead times from commit to deploy. They are 170 times faster in mean time to recovery, and they have a five times lower change failure rate, or, in other words, a change is one fifth as likely to fail. And these numbers are from a few years ago. The gap is even wider now, because the fast will get faster and the slow will stay slow.
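For reference, here is a small sketch of how those four metrics fall out of raw deploy records. The data and field names are invented.

  # Each deploy records when the commit landed, when it shipped, whether
  # the change failed in production, and how long recovery took.
  deploys = [
    { committed_at: 0,   deployed_at: 3_600, failed: false, recovery_seconds: nil },
    { committed_at: 100, deployed_at: 7_300, failed: true,  recovery_seconds: 1_800 },
  ]

  lead_times = deploys.map { |d| d[:deployed_at] - d[:committed_at] }
  failures   = deploys.select { |d| d[:failed] }

  average_lead_time    = lead_times.sum / lead_times.size       # commit to deploy
  change_failure_rate  = failures.size.to_f / deploys.size      # 1 in 5 for high performers
  mean_time_to_recover = failures.sum { |d| d[:recovery_seconds] } / failures.size
  deploy_frequency     = deploys.size                           # per reporting window

  puts average_lead_time, change_failure_rate, mean_time_to_recover, deploy_frequency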

[00:45:17] Again, these statistics are largely predicated on practices like continuous integration and continuous delivery, and both can only happen with automated testing. So if you believe, like I do, that continuous delivery is the most effective way to develop software, then choosing to work in any other way

[00:45:35] is in the best case uninformed and in the worst case negligent. I don't think that is a bold statement, nor do I have any qualms about saying it. If you'd like to learn more about why I feel so strongly about this, then I have two resources for you. The first is Accelerate: The Science of Lean Software and DevOps, published in 2018.

[00:45:53] This book contains a wealth of information on the technical practices that create successful software organizations, all backed by empirical data. I'll put a link in the show notes to my review and analysis, along with some of my favorite takeaways. The second is the DORA State of DevOps report. This free annual report surveys companies on their practices and outcomes.

[00:46:14] Multiple years of that reporting served as the basis for Accelerate, and now it's a great way to track high-performing companies. Anyway, I hope you enjoyed the conversation and the closing thoughts. Stay tuned for more interviews. That completes this batch. Visit smallbatches.fm to subscribe to the show for free. Want a topic covered on the show? Then call +1 833-933-1912

[00:46:36] and leave your request in a voicemail. I hope to have you back again for the next episode. Until then, happy shipping. Want to learn more about DevOps without wasting your time? Then sign up for my free email course at freedevopscourse.com. My course combines the best of the DevOps Handbook, Accelerate, and years of software delivery experience. You'll learn the three ways of DevOps and the four KPIs of software delivery performance.

[00:47:00] More importantly, I'll show you how to put that theory into practice. That means shipping better software faster. Sign up today at freedevopscourse.com.