How I Tested That

In this conversation, Janna Bastow, founder of ProdPad, shares her insights on testing pricing and packaging strategies. She discusses the importance of understanding the different roles and decision-makers in B2B purchasing, as well as the challenges of navigating the procurement process. 

Janna also talks about the evolution of pricing at ProdPad and the lessons learned from testing different packages and pricing structures. She highlights the value of customer feedback and the use of AI in product management. 

Overall, Janna emphasizes the need for product managers to focus on the core aspects of their role and leverage technology to automate repetitive tasks.


Is your innovation pipeline clogged?
  •  Uncover the risks, bottlenecks, and gaps holding your best ideas back.
  •  With the EMT Diagnostic, you'll get a clear, actionable plan to fix them.
👉 Book a free discovery call at https://www.precoil.com/innovation-diagnostic

What is How I Tested That?

Testing your ideas against reality can be challenging. Not everything will go as planned. It’s about keeping an open mind, having a clear hypothesis and running multiple tests to see if you have enough directional evidence to keep going.

This is the How I Tested That Podcast, where David J Bland connects with entrepreneurs and innovators who had the courage to test their ideas with real people, in the market, with sometimes surprising results.

Join us as we explore the ups and downs of experimentation… together.

David J Bland (0:00.891)

Welcome to the podcast, Janna.

Janna Bastow (0:02.787)

Hi, thank you so much for having me. I'm excited to be here.

David J Bland (0:06.075)

So I was thinking about the first time we met, and I think it was at St. Luke's in London, maybe 2014. I was back at NEO and we were doing like a lean day event and we invited all these folks in. And I remember, I don't even think it was a specific session. We were just like catching up either in between sessions or outside the conference. But I vividly remember meeting you for the first time and you were talking about some of the stuff you were testing. And I was like, wow, I really need to follow

her stuff over the years and just like I became a fan almost right away. And so I'd like people just to get to know you a little bit before we jump into some testing stories. So maybe you can give the folks a little bit of background about yourself.

Janna Bastow (0:46.083)

That's fantastic and great memory. That's wonderful. And if you're thinking back to 2014, what was I working on then? So that would have been my early days of ProdPad. So some of you might know me as the founder of ProdPad, which is the product management software. You might also know me as one of the co-founders of Mind the Product, which is the big community and series of events for product people.

So I'm a product geek through and through. And so not surprising that I was geeking out with you in the hallways of this event about what we've been testing and whatnot. And back then I was probably geeking out about how we were trying to break into the product management and product roadmapping software space. And I think back then it was a lot of stuff around onboarding and getting people through that initial flow and that sort of thing, which actually is not far off from what we're doing today. You know, I've got to...

a saying that I echo throughout the halls of our Slack, which is always be onboarding. So we're still doing onboarding tests and stuff like that. So, and some of that's gonna be talked about today because we're gonna be talking about, you know, what we've been doing lately.

David J Bland (1:55.960)

I was thinking back to that conversation, I was refreshing and I was looking through photos of that event. This is one of my all-time favorite events that I've ever been a part of. Jeff Gothelf and I, you know, co-presented and everything and did workshops together in that space. And I was thinking about, I think the conversation we were having was early stage and we were talking about, so the different sort of roles inside bigger companies. So we're talking about the user, customer, decision makers and kind of economic buyers and how in B2C, they seem to usually all be the same person,

for the most part. And in B2B though, you could have a really strong pull with product folks, but then if they don't have budget authority or decision-making authority, especially over a certain amount, they have to go get approval. And sometimes security people chime in and they go, oh, we don't want that data, we don't want our data in that system. And you could have really positive feedback and then all of a sudden they get an abrupt no. And so I think navigating that is super challenging for anybody that's targeting B2B.

Janna Bastow (2:57.315)

Yeah, absolutely. It's something that we live every day. You know, you get an excited advocate who loves it, who wants to bring it on board, who is right there. And if it was their own credit card, if it was their decision, they'd be absolutely all over it. But it's not their decision. They've got to bring it to somebody in procurement. They've got to bring it through the people, get it past the people in legal, the people in, sometimes it's product ops, sometimes it's security, IT security, other people who have got to make the call on this. And they are also your buyers. They are also stakeholders who need to be tested with and convinced.

David J Bland (3:35.029)

Yeah, maybe we can start there then. So you have this saying, I love that saying, always be onboarding. I have to remember that. So when you're going through that process, how are you kind of testing your way or what big assumptions are you having when you're seeing people give you either like positive feedback and you're trying to figure out, is this something that's going to be bigger or how do you navigate that as a founder?

Janna Bastow (3:40.579)

Yeah.

Janna Bastow (3:58.467)

Yeah, I mean, you know, that whole split between the ultimate buyer and your advocate, one of the things that we realized that makes a really big difference is just getting your foot in the door. Right. And not all procurement in a company is the same. So we actually discovered this by doing a bunch of rounds of discovery work. And by that, I mean, what we were doing was just getting on the call, getting on calls with senior product people and asking them how they bought software.

How did you buy software the last time? What did you buy? You bought Miro. Okay, how did you buy that? Why did you buy that? What did that look like? You bought Loom. What did that look like? Okay, well how does that compare to how you'd buy something like ProdPad? And we learned from that how some of these companies are working. And...

Very much it's that land and expand type of thing that we learned. And we sort of knew that name, we were sort of doing it, but it wasn't ingrained with how we were working and how it was baked into our own platform. And where we were going wrong was that our pricing was set at a level that required them to go talk to these different stakeholders. So if you wanted to buy yourself some ProdPad, you needed to go to...

legal and procurement or whatever, because it hit a certain level that meant you couldn't put it on your, you couldn't pass it through expenses. And there were a couple of conversations in particular that really stood out, where the keyword used was that product people wanted to experiment. And that made sense. That actually resonated with me, because product people do want to experiment, but they want to experiment with their tools as well. And they want to be able to prove

that something works, they want to play with it first and they need more than what's in their free trial. So of course you get a free trial, that's natural, everyone does a free trial. But they want to be able to experiment a little bit wider than that and they're totally okay to put something on either a personal credit card and just charge it to their own credit card, right? They have a salary, they don't care. Or slip it under

Janna Bastow (6:10.179)

some sort of radar so that it goes under like an expense account. And we found some magic numbers, you know, some companies, some people in companies had like $50, some people could get away with a hundred. But we're like, okay, if we can keep it under that and just allow them to get that under the radar, then actually we can allow product people to experiment and we can sort of get around having to create something that gets in front of the

legal and the procurement and the operations and the everybody else, until such point that it was ready to go when you had true advocates who were sitting there going, okay, I've been using this and now here's a really good case, a use case as to why we're gonna roll this out to the wider company. Not just I have a strong conviction that we should go because I've been trialing it for two weeks.

David J Bland (7:01.645)

That's interesting, the threshold, I've experienced that too, and some of my work, if you hit a certain threshold and it requires multiple approvals, I'm actually dealing with that in some of my clients right now. And I love that you're talking about the past, having them tell stories about the past. It's not a future hypothetical situation where they would buy your software and how might that work? It's more of a, hey, tell me how you purchased X and get them telling stories because so often it can be this maze of how to...

Janna Bastow (7:10.915)

Yeah.

David J Bland (7:30.732)

how purchases are made and everything. And then if you have to go to other players, you have to, you know, you have a value proposition to them, which is slightly different to the people using it and everything. So I love, I love how you describe that. So testing your way through with conversations, it sounds like it was a very kind of organic or natural way for you to learn.

Janna Bastow (7:50.787)

Yeah, absolutely. And, you know, asking about the past allows you to get real examples that you can compare to. So I was able to take that and then go to the team and say, okay, so here's some real examples. And I talked about Miro and Loom there. They were a couple that came up quite a few times as tools that product teams had bought recently that had worked in this way. And I went, well, let's take a really close look at how they're pricing and what we can learn from them. So we dissected their pricing and picked up a few tips.

David J Bland (8:22.123)

Yeah, almost using them as a proxy in a way to learn. I love that too. So early days, right? So we're talking about like early 2014. Going through the years, I mean, all the different risks that you've kind of uncovered and everything.

What about how you price things, how that's evolved? Because it sounds like you anchored your pricing maybe on some principles in the early days. How have you been testing that over the years with like packages and different things? Maybe you can walk us through some things that have worked for you and maybe some things that didn't go so well.

Janna Bastow (8:56.835)

Yeah, absolutely. So I mean, I'll be the first to admit that early on in our history, we did it really poorly. We came up with pricing kind of back of a napkin, finger in the air. Sure, this sounds good. I think I recall a conversation between me and my co-founder Simon in the early days and we said, well, we're building product management software. There's no one else building this stuff. So we can't really compare like to like, but there's this other tool called Basecamp, which does project management software.

So that's kind of close enough, right? An online tool looks kind of similar vibe-ish to ours. And they charge 25 bucks a month. So we'll charge 25 bucks a month. Let's see how that does. And we just slapped a price on it just to see if anybody would buy it. And they did. So we thought that's good enough. And that was our starting price point. It wasn't long afterwards that we started splitting out into what everyone else does, which is good, better, best pricing. Right? You had the low price, the medium price, and the higher price.

And I can't remember exactly what we put in those packages. We just sort of, we siphoned off different bits and pieces and said that looks premium. We'll call it premium or enterprise or whatever we called it. And we slapped on different price points. We probably looked at a few different comparable websites. Maybe it was Basecamp, maybe it was, you know, the likes of Zendesk or other SaaS tools that were popular back then. And came up with pricing that sort of fit. And

people continued buying it, so we continued shipping it. And that suited our purpose for a little while. And over the years, those early, you know, first five years or so, we adjusted prices a couple times. We changed the names. We would bring the price up, and no one really ever kicked off. But we didn't do it as fast as we could, and we didn't really design experiments around it. But what we would do is we would change the price and see if anybody yelled. And no one ever really did.

And that was probably a really good sign that we were underpriced. And we realized that we were leaving money on the table. One of the big signs that we were really leaving money on the table was that we had a, we were charging these good, better, best packages based on a set of features, but it would also include a certain number of users. So that middle package would include,

Janna Bastow (11:18.723)

enough users to cover 10 of your product people and the rest of your team too. It was a pretty good price. And I think at one point in time, it was something like $500. And I think at one point in time, it had gone up to $900 or something like that. I can't remember the exact prices. And I remember being in meetings, and this happened more than once, where I'd do the pitch, I'd show them the whole thing and they'd go, great, so yeah, we're gonna be getting that middle package.

Cool, and so that's the price per user, right? And I'd say, no, that's the price for everybody. And then I'd think, God, I wish it didn't say price for everybody. If that were price per user, I would have been able to sell it for 10x the price. Oh God, we're underpriced. Okay, we really need to think about what our price should be because again, we are leaving money on the table.

David J Bland (12:10.435)

That is fascinating. It's so hard to get pricing dialed in. And I struggle with this with my clients sometimes as well. And, you know, I'd like to think that we were being very methodical about it. We're saying, OK, well, what are you doing today? What are all the tasks that you've done in the past? How much have you paid for these to solve these? And then you try to kind of anchor against that in some way. And in reality, it's not always that clear. And.

I have some teams I coach where they say, well, we just ask them what they would pay, which is a terrible question, by the way, to ask, because people do not have a great frame of reference for what they're going to pay for things. And sometimes they say, well, what's the most you would pay? And even then, it's still not a great question, but it's a little better in the sense of they start to understand, oh, well, I wouldn't pay more than X per user. And so I'm just trying to put myself in your shoes. So you're going through these pitches, and you are...

Janna Bastow (12:45.827)

you

David J Bland (13:08.130)

getting this feedback. And so what do you and your team do with that feedback? How do you start to internalize that and say, well, can we test our way to a different pricing structure and explain how that works?

Janna Bastow (13:18.979)

Yeah, so keeping in mind that we were a pretty slim micro team at this point in time, right? So we were having to prioritize pretty ruthlessly on which things got done and which ones didn't. And we were prioritizing product work over packaging and pricing work when actually in hindsight, we should have been prioritizing pricing and packaging work. I think we could have leapfrogged some of the efforts that we made. But hindsight's 20/20. So.

We, as you're saying, we were internalizing the stuff, we were capturing it as feedback, but we were never really losing deals because of price. Every once in a while you'd have somebody walk away, but they were probably not the right fit for us anyways. But for the most part, we weren't losing deals on price. And so we realized that there was some opportunity to look at new pricing. But we realized that we were actually pricing on the wrong value.

metric, right? So the price went up at the wrong sort of pace. The bottom level price was actually too expensive. It was too much of a lift from zero, as in not buying ProdPad to buying that first package. Now that first package included enough stuff to get your team started and cover three product managers and, you know, have enough stuff in there. And then it went up to a package for 10 product managers. And then it went up to a package for 50 product managers.

And then there was the inevitable "call us" tier. But the pricing was sort of on a shallow slope, right? It sort of stepped up to a high level and then sort of sloped upwards. And as the price went upwards, it didn't really capture those really proper enterprise companies. It didn't price in for these companies that had sometimes hundreds of product people who were coming to us.

But it penalized those who just had one or two product people or just a few product people, particularly those companies that just had one product manager who wanted to use it but didn't want to be charged for three product people. And so we were looking at it going, okay, so do we have the packaging right here? And then there were, of course, the conversations of people who were saying, hey, you know, I see this functionality in that middle package that I want, but I don't want to pay for 10 product people. I only need, you know, the small package, but I do need that

Janna Bastow (15:41.283)

higher level functionality. So our packages weren't quite right. And this is one of the fundamental problems with good, better, best with a complex product. Right? ProdPad is a rich product with quite a lot of functionality. It's not a one click, one purpose product. And so when you try to package it up, it's good for three theoretical companies, but actually in reality, not good for many real companies at all.

So we realized we had to bust up these packages and that was going to be a lot of work.

David J Bland (16:16.669)

Yeah, I can imagine. I was thinking back to your story. Were there any telltale signs or signals, other than just getting verbal feedback, of patterns of behavior in how people were using it? You know, I see this sometimes where we end up with sub-segments or maybe classes of user or customer where

You have your best laid plans and then when you see people use it, you start to see patterns emerge that maybe you didn't anticipate. So I'm wondering, did you see patterns or were you looking at patterns of behavior and saying, oh, maybe that's a package or maybe that helps inform a package.

Janna Bastow (16:55.971)

Yeah, so our packaging started to get a little bit messy and we allowed it to do so because we were trying to make these three packages work for wider sets. We allowed people to do what we called add-ons. So we would allow people to customize these packages. Now you had three packages, but you could also add on different pieces and these add-ons were great. They were revenue drivers and we could see people willingly paying a

premium to add something onto their package, even though it wasn't the full package, it was a smaller package, but it had this additional functionality. And so we could charge a premium for new modules that we'd added. So for example, we launched an OKRs module, something to take your objectives and your key results and break them down in more detail. If you wanted to be able to add that, you could buy that for an extra charge per product person that you had in your account. Cool. People were buying that. And that proved to us that people were willing to pay for this stuff.

and pay an uplift, but they were willing to pay for more complex packages and that we could charge for extra features that we were building. We didn't have to just lump it in with one particular feature set. We also had additional user access stuff. So the ability to assign product managers to specific products,

which helped with organization and making sure that people knew who they were talking to when they were working with their products. Bits and pieces of functionality like that were accessible as add-ons and people started buying these things. And so it had a, you know, tangible, very measurable uplift. And we could see people who were, you know, not only just buying the stuff, but opening up the pop-up that looked at these options,

reviewing the options and then choosing to add the options as well. We could see that sort of mini checkout flow happening.

David J Bland (18:51.704)

I like that, so you integrated some sort of analytics to see who's clicking, who's seeing what, what they're selecting, and then seeing what they use. So did that help you shape your packages in a way where you see, oh, there's collections of add-ons that could be a package, or maybe we should think about how we repackage things? It sounds like you went from purposely messy, maybe, to something that was a little more streamlined, but it's rooted in customer behavior.

Janna Bastow (19:18.979)

Yeah, so that gave us courage to move towards this modular pricing that we wanted to test. And the modular pricing, we knew, was going to be a bit of a monster project, because what it involved was blowing up our current versions of pricing and putting something else in front of people and, you know, seeing how it works. Because ultimately with testing pricing, you can ask people all you like. You can say to them, you know, would you buy this?

Yeah, what would you pay for this? How would you pay for this? But you don't know until they actually pull out their credit card and pay. You can't see the usage until it's actually in front of them. So what we did was we came up with the theoretical packages. And we did do the surveys and asked the questions to get a sense as to what they

might cost, and we used a pricing analysis tool called the Van Westendorp analysis. Are you familiar with this one, David? Yeah, so I really like this one because it triangulates the pricing. It asks smart questions that at first seem really simple. It asks four questions. At what price is this so expensive that you would...

David J Bland (20:28.277)

Maybe you can explain it to us.

Janna Bastow (20:45.283)

not consider buying this, it's too expensive? At what price is this expensive, but you would still consider buying it, you would still buy this thing, all right? At what price is this so cheap that you would think that there's gotta be something wrong with it, so you wouldn't buy it? And at what price is this cheap, but actually not so cheap that you wouldn't buy it, so it's a good deal? So people are gonna give you four prices.

And you can ask this for the existing packages that you have, you can ask it for theoretical packages, you can ask existing customers, potential customers, you can segment it out, you can do all sorts of stuff with this, but then you can also triangulate this. And that magic point where these numbers cross over basically gives you a bounding box saying this is where your price is, right? This is your pricing sweet spot.
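That crossover point can be sketched in a few lines of Python. All the survey numbers below are invented for illustration and have nothing to do with ProdPad's actual research: for each respondent we keep the "too cheap" and "too expensive" answers, then scan candidate prices for the point where the two cumulative shares cross.

```python
# Hypothetical Van Westendorp answers: per respondent, the price at or below
# which the product feels suspiciously cheap, and the price at or above which
# it is too expensive to consider. All figures are made up for illustration.
responses = [
    {"too_cheap": 20, "too_expensive": 45},
    {"too_cheap": 35, "too_expensive": 70},
    {"too_cheap": 25, "too_expensive": 50},
    {"too_cheap": 40, "too_expensive": 80},
    {"too_cheap": 30, "too_expensive": 55},
]

def share_too_cheap(p):
    # A price p reads as "too cheap" when it sits at or below a respondent's threshold.
    return sum(r["too_cheap"] >= p for r in responses) / len(responses)

def share_too_expensive(p):
    # A price p reads as "too expensive" when it sits at or above the threshold.
    return sum(r["too_expensive"] <= p for r in responses) / len(responses)

# Scan candidate prices for the point where the two curves cross over:
# the approximate pricing sweet spot.
sweet_spot = min(
    range(1, 101),
    key=lambda p: abs(share_too_cheap(p) - share_too_expensive(p)),
)
print(f"Approximate sweet spot: ${sweet_spot}")
```

With only five made-up answers the crossing lands on a flat plateau, so the scan just picks its first point; real Van Westendorp runs use far more responses and also plot the two intermediate "cheap" and "expensive" curves to bound an acceptable range rather than a single number.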

And there might be different pricing sweet spots for different packages or for different segments as well. You might find that, you know, product leaders at enterprise companies have a different price point versus product people at startups, for example. But we were able to, from there, get a sense as to what the general pricing might be. And we also cross-compared it to what our competitors were pricing, as well as compared it to that

learning that we had with making sure that we could get stuff underneath the...

the line for expensive amounts, right? Something under that sweet 50-ish quid mark that people could just slip under without having to ask procurement and all the rest of the folks for permission to buy this thing. And we did find some magic numbers to begin playing with. And what it actually allowed us to do was break ProdPad up into three different products.

Janna Bastow (22:44.579)

And what's actually interesting about this is that when we first came up with ProdPad, ProdPad was actually two separate products. It was a roadmapping tool and an idea management tool. We later added in the feedback management tool. That wasn't in the original sketches that I drew up and brought to my co-founder Simon when I was dreaming up this idea, but it was two products. And he actually said to me, he's like, this is really cool, but it'd be more powerful if you just naturally combined them together as opposed to have...

two separate products and here's what it might look like and here's how we could build it. So it became one product that we tried to group together, which then became unwieldy because there's so many features and we'd added so much to it. But once it had so much to it, it actually made sense to break it back out into three products, because each one adds value in its own way. Some people come to ProdPad because they need a roadmapping solution. They've got a roadmap that they need to articulate to different stakeholders and so that's a solution right in itself.

Some people come to ProdPad because they've got a whole bunch of ideas and a backlog. They've got a big scary JIRA that they want to get their junk out of there and put it somewhere that they can better sort through it and understand which of these ideas they should work on and spec out and send to development next. Some people have a big pile of customer feedback. That's where the customer feedback module comes in and they want a way to collect that feedback and understand it. And some people need...

A lot of people need all three or some people need just feedback and ideas or ideas and roadmaps or whatever sort of combination. So it allows you to get the combination of modules that you need and only pay for the value that the product is bringing you. So you're not paying for a whole package and a whole bunch of features that you're not actually making use of. And the other thing that we did is that we set it to per user pricing.

So you're only paying for the users who are using it. None of this, this one comes with three users, this one comes with 10 users. Doesn't matter if you have seven product managers, you're paying for 10. No, if you have seven product managers, you pay for seven. And with all of these packages, we always set that all your reviewers are free. So this was the modularization that we did. We actually, the first version of it came with this idea that we had these three modules.

Janna Bastow (25:03.299)

And we adopted the previous add-on strategy that additional premium features like OKR management or being able to publish your roadmap publicly or doing things like this were additional paid-for features that you could add on. So you didn't have to pay for them, but you could pay for them. So the first version was something like three core modules. And then I think it was six or eight potential add-ons. And that's what we put out there.
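The modular, per-user model described here (pick your modules, bolt on add-ons such as OKR management, pay only per product person, reviewers free) boils down to a simple price calculation. Every module name, add-on name, and price below is invented for illustration and is not ProdPad's actual price list.

```python
# Invented per-user monthly prices -- illustrative only, not ProdPad's real rates.
MODULE_PRICES = {"roadmaps": 24, "ideas": 24, "feedback": 24}
ADD_ON_PRICES = {"okr_management": 6, "public_roadmap": 4}

def monthly_price(modules, add_ons, product_people, reviewers=0):
    """Charge per product person for the chosen modules and add-ons.

    Reviewers are always free, so the reviewer count never enters
    the calculation."""
    per_user = sum(MODULE_PRICES[m] for m in modules)
    per_user += sum(ADD_ON_PRICES[a] for a in add_ons)
    return per_user * product_people

# Seven product managers pay for exactly seven seats (no rounding up to a
# ten-seat tier) across two modules plus the OKR add-on:
print(monthly_price(["roadmaps", "ideas"], ["okr_management"], product_people=7))
```

The point of the structure is that price scales with the value metric (modules used times product people using them) instead of jumping between three fixed bundles.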

David J Bland (25:33.741)

Wow, so that is amazing. So going through that whole story and being able to repackage things, I think we go through this phase of bundling and unbundling with software and services. And I think the idea of pricing...

and testing your pricing, it sounds great in theory. And even in my book, we have viability tests and everything. But I feel like it comes with such anxiety. And I know you're looking back on this and reflecting and going, well, in hindsight, we should have done this and this. But in the moment, you don't know. And so why do you think price testing causes so much anxiety, even with product people?

Janna Bastow (26:13.955)

So a couple reasons. First of all, product people, and I don't think anybody really likes asking for money, right? It's easy to ask about, you know, does this feature bring delight? Does this feature bring joy? Would you use this? It's another thing to say, would you pay for this? That becomes a more awkward conversation. So I think there's just a natural bias, a natural social awkwardness that comes with it.

The other thing is that I think a lot of product people focus their time on the features, right? So it's about, you know, changing the product to be better so that more people buy it, as opposed to thinking about changing the price or the packaging, the way that you put it in front of people so that more people can actually get on board with it. And actually, a lot of times, there's more that you can do with the pricing and the packaging that can have a bigger impact.

rather than just adding more and more features to it. But again, a lot of product people are very focused on those new features or adding new code to it. It's also, I think, probably a symptom of being too customer driven, right? I think it's important to be customer informed, but not customer driven. Because frankly, are your customers gonna tell you what your packaging should be like? If you ask your customers, they're just gonna say, make it cheaper.

They're not going to actively tell you how to package it up. You've got to really dig deep to get these insights. But what they will tell you is which features you're missing. And so you're constantly chasing those next features because you're always missing a feature. You could build all the features in the world. We did that for years, building new feature after new feature and throwing them into this good, better, best package thing. And it still wasn't good enough. And it was only when we busted it up into the right packages that we realized,

Oh, now it's sellable.

David J Bland (28:15.113)

Yeah, I can imagine that's a winding path. I think I see this too with some of my clients where we focus on desirability and I love focusing on desirability, jobs to be done, pains and gains to the customer, all that value proposition. But it's not their job to tell you the pricing. It's not their job to help you figure out the business model. And so when you think about that overlap, Alex and I often call it high-value jobs. So there are jobs that the customers are trying to do, but that also overlap with what we're trying to do as a company. And then viability-wise, it's how do we structure that? And I think

of some of the more popular experiments or just exercises I see, things like buy a feature, you're kind of focused on features and ranking and prioritizing features with customers, which usually goes really well, but when you come to the pricing of everything and putting in prices on things, I think that can get challenging. So I feel as if what you're touching on here is maybe you...

you need to have the overlap or the integration between the business model and the product. And there are diminishing returns to adding more features to the product. You do have to have the right business model with it. Otherwise, you can have amazing product and still fail because you have the wrong packaging, the wrong pricing.

Janna Bastow (29:21.507)

Exactly that. I think one of the other problems is that pricing isn't necessarily always owned in businesses. Ask any business who owns pricing and you'll have people sort of looking at each other going, hmm, like is it sales? Is it marketing? Is it product? Is it finance? Is it leadership? Who is it who owns pricing? And it's different in every company, and oftentimes it's literally not owned by anybody. It's

sort of who was set some years ago and it's been trucking along, kind of doing its job, but probably not being cared for the same way that the core product is, if you've got a product team who's really focused on it.

David J Bland (30:1.829)

Yeah, that's really insightful. I think that's something that will give our listeners something to think about: who owns pricing in their organization? So we've talked a lot about pricing: your journey since we met back in 2014, moving forward into packages, using customer insights and your product sense, and testing your way through all of that. Where is all this headed? What are you testing now, and what are you testing in the future? Beyond pricing, what really excites you about the things you're trying to test now?

Janna Bastow (30:30.083)

Yeah, exactly that. So there's still more to do on the pricing front. We didn't get it right the first time, so we rejigged it into newer packages, which are still settling in. But what's exciting me is that we're testing out AI features. And AI is notably hard to test, right? Because not only can you not ask people what they want out of AI, because they don't know, people aren't sure what it's capable of, but you also get inconsistent results from the product itself. This is the first time I've ever worked with a product where every time you put the input in, the output is different. That's kind of the cool part, but it's also terrifying.

So we're testing things like: how do you create a system that can see your entire backlog, can see what you're trying to do with your product strategy and your vision and your objectives, and can see all your feedback? So you can converse with it and ask questions like, hey, can you show me all of David's feedback? Is any of it on the roadmap right now? Is any of it in development right now? And it can give you those answers.

But that takes a whole lot of working with customers to understand what kinds of questions they're going to ask. And you're also just trying to dream up where things are going, because you're not basing this on a previous version of it. You're saying, well, if I could have really interesting tech, what would I do with it? And the way I'm thinking about it is: how do you

Janna Bastow (32:30.275)

replace the grunt work of the product manager using this sort of tech? And therefore, how do you start asking product people what sort of grunt work, what kind of drudge work, is keeping them busy, and how do you start pulling that out of them so you can test ways of taking it off their plates?

David J Bland (32:53.155)

Yeah, the AI stuff is interesting. We've been dabbling with it for assumptions and experiment plans. Not necessarily replacing people's work, but having them check against it: oh, this is what the AI came up with. Obviously we're working with ChatGPT a lot. And it's this idea of, well, here are the assumptions I'm worried about, but what else should I be worried about? Or here are the experiments I want to run. What other experiments could help? Or here's my interview script. Is there something I'm missing in it? And you're right, it often does give something very different back every time. So then you're trying to say, okay, how do I use that in my flow? I'm a big believer in not replacing everything but augmenting it. So I love those stories you're sharing about what product people would ask and what they'd be looking for.

So I'm thinking through this: how are you planning on testing that, or whatever you can share with us about the small tests you might run? Because it seems like it could be a really large effort to integrate everything. So what are you thinking about for small tests there?

Janna Bastow (33:55.139)

Yeah. So, taking it back to when we shipped our first OpenAI integration: we got access to the API, and the first thing we did was set it up so you could generate the idea description. So if you've got a stub of an idea, something in your head, you just write the bare bones of it in ProdPad, click a button, and it'll write out the rest of it. We followed up a week later with a button that would write the user stories for it.

And it keeps the human in the loop. It doesn't just write it and then publish it. It writes it and then says, hey, here's what we've come up with. Now you can accept it, edit it and change it, remove it, redo it, whatever, right? So you can accept your user stories or change them or do whatever.

But what it's basically doing is giving you one-click idea generation. And we were able to set expectations going in: how many people are going to use this thing, and how many of them are going to accept, reject, or redo their ideas? We could see from the stats how many people were using it, and the percentage increase in usage was amazing. I think user story generation went up by 500% that week. Product people hate writing user stories. They love when their user stories are written for them. And what was great, this was my favorite part, was we're sitting here going, well, are these user stories actually good quality? Or is this thing writing pure junk? Now, we keep the human in the loop, so the product manager is checking them before they're accepted and put on the idea page.
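The human-in-the-loop pattern Janna describes is easy to instrument: every AI draft gets an explicit human decision, and the decision mix becomes your uptake metric. A minimal sketch, with invented numbers (not ProdPad's actual stats):

```python
from collections import Counter

# Each AI-generated draft gets a human decision before anything is published:
# "accept" (used as-is), "edit" (tweaked then used), or "reject" (discarded).
decisions = ["accept", "edit", "accept", "reject", "edit", "accept"]

def decision_rates(decisions: list[str]) -> dict[str, float]:
    """Share of drafts in each bucket -- the stat used to judge uptake."""
    counts = Counter(decisions)
    total = len(decisions)
    return {outcome: counts[outcome] / total for outcome in ("accept", "edit", "reject")}

rates = decision_rates(decisions)
# "kept" = anything the human chose to use, with or without edits
kept = rates["accept"] + rates["edit"]
```

The split between "accept" and "edit" is worth tracking separately: a high edit rate suggests the drafts are useful starting points but not yet good enough to ship untouched.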

David J Bland (35:34.531)

That's fascinating.

Janna Bastow (35:50.883)

But then we could also check whether the product manager went on to take these ideas and user stories and send them to development to get built. Our assumption was that if you've pushed something to development, you think it's good enough to get done. And comparing the ones users wrote themselves against the ones the AI generated, the AI-generated ones were more likely to get pushed to development.
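That downstream signal — did the story actually get pushed to development? — can be computed as a simple rate per source. A sketch with made-up records (the actual percentages from ProdPad aren't given in the conversation):

```python
# Each record: (who wrote the story, whether it was pushed to development)
stories = [
    ("ai", True), ("ai", True), ("ai", False), ("ai", True),
    ("user", True), ("user", False), ("user", False), ("user", True),
]

def push_rate(source: str, records: list[tuple[str, bool]]) -> float:
    """Fraction of stories from one source that were pushed to development."""
    outcomes = [pushed for src, pushed in records if src == source]
    return sum(outcomes) / len(outcomes)

ai_rate = push_rate("ai", stories)
user_rate = push_rate("user", stories)
```

The design choice here is using an existing behavior (pushing to development) as a proxy for quality, rather than asking users to rate the AI output directly.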

David J Bland (36:20.291)

Wow, that's amazing. It's such a fun time. Such a fun time to be in tech.

Janna Bastow (36:22.787)

Yeah, so I loved that. And we've seen that continue. We've now added it so you can generate key results, and we've specifically primed it so that it writes leading, outcome-focused key results, because key results are a pain to come up with. So it brainstorms them for you, and again, we're seeing really good results: people accepting them and then making use of them.

David J Bland (36:50.435)

I love that you're following it through. It's not just that you have the humans in the loop; you're also seeing the outcome. How is it being implemented? Then you can follow that thread downstream: okay, how did this improve what they're trying to do in their processes? So yeah, I think AI is a really fun slash scary time in tech at the moment, where this wave is coming and you're trying to integrate it and test your way through it in a really thoughtful way, not just replace everything people are doing. So that's awesome.

Janna Bastow (37:11.427)

Yeah.

David J Bland (37:23.461)

So I love this. We've covered everything from the early days of onboarding, packaging, and pricing, fast forwarding to today and how you're testing even AI features. What are you most excited about in the future with ProdPad and where you're taking it?

Janna Bastow (37:39.555)

Yeah, absolutely. I mean, I think this AI stuff is really going to accelerate what product managers are capable of, and we're only just seeing the beginning. Some product managers out there are a bit nervous: is it going to take their jobs? What's it going to do? And no, I don't think you should be nervous, right? What it's going to do is remove the boring work. It's going to remove the grunt work so you can get back to doing the stuff that makes product management exciting and fun, which is getting out there and talking to customers, getting out there and solving real problems.

So you're not spending your time writing user stories and specs, sorting through your backlog, trying to figure out whether you've seen this idea before, whether it's a duplicate of something over there. That stuff should all just get done. You won't have to read every line of your customer feedback, because you just put it in: you arrange time with your customer, you talk to them, and it figures out the gist and helps you make those connections. So you can make the bigger, bolder points: this is our strategy, this is where we need to be going, and we know it because we've heard this from customers and we need to solve for these types of problems. So I think it's going to let product managers get back to the core parts of product management and step away from a lot of the busy work that's been drowning us over the years. And ProdPad is very much staying at the forefront of that, enabling that pattern.

David J Bland (39:10.371)

And there you have it. Thank you so much. If people want to reach out to you, what's the best way? So let's say they're listening to this and they had some questions about how you were thinking about your testing of pricing and packages and all that, or maybe they just want to reach out to you for help. What's the best way to get in touch with you?

Janna Bastow (39:27.267)

Yeah, absolutely, I'm easy to find. I'm the only Janna Bastow out there as far as I know, so come find me on LinkedIn, connect with me, and let me know you heard me on this podcast so I can connect those dots. Or just reach out by email: I'm janna at prodpad.com, and I'd be happy to chat there.

David J Bland (39:45.411)

Thanks so much for joining us. There are amazing stories and details about pricing in here that I know our listeners are going to absorb. So thank you so much for coming on the podcast.

Janna Bastow (39:53.731)

Fantastic, thanks so much for having me.