Testing your ideas against reality can be challenging. Not everything will go as planned. It’s about keeping an open mind, having a clear hypothesis and running multiple tests to see if you have enough directional evidence to keep going.
This is the How I Tested That Podcast, where David J Bland connects with entrepreneurs and innovators who had the courage to test their ideas with real people, in the market, with sometimes surprising results.
Join us as we explore the ups and downs of experimentation… together.
David J Bland (0:1.464)
Welcome to the podcast, Jeff and Josh.
Josh Seiden (0:4.029)
Hey, it's great to be here.
Jeff Gothelf (0:5.577)
David, it's a blast to be here.
David J Bland (0:8.078)
I'm so happy that you two are on here. I first became a huge fan of your work probably, I'm thinking, maybe 2010-ish or before. And I finally got a chance to work with you all at NEO. I was super excited about that. It's one of the reasons I joined NEO, because I could hang out with you all. We were on different coasts, but we tried to hang out when we could. And I'm just excited to hear you talk about how you really deeply think about testing and how you put it into practice. Because I think there's so many folks out there
that just don't have the experience of going through this for as long as you all have. And so I'm just so excited to have you here. Thanks so much for joining us.
Jeff Gothelf (0:48.565)
Super excited to tell you our story and how we put this stuff into action.
David J Bland (0:53.539)
Yeah, so Jeff, maybe starting with you. What are some of the assumptions you're working on right now? Like, where's your work headed at the moment?
Jeff Gothelf (1:1.823)
Yeah, so Josh and I do a lot of work together. There's a ton of overlap in our work. We've got separate businesses, but we actually sell work and deliver work together. And what's interesting about that is that we are at a point in our careers and the life of our business where we'd like to scale that business. And the reality is that you're looking at the business. The two of us here
Josh Seiden (1:30.630)
Hahaha.
Jeff Gothelf (1:31.323)
are it, right? There is no one else. And so if we're going to scale this business, there are a variety of different ways to do that. And there's kind of an infinite number of, well, not infinite, but a large number of hypotheses and assumptions that we could put together for how to scale the business. And so what we've decided to go with first, what we believe to be a promising set of assumptions, is this:
what we're trying to do right now is to scale our training business. So we have a training business based on our books, Lean UX, Sense and Respond, Who Does What By How Much, Outcomes Over Output, et cetera. And we want, basically, what we're calling certified training partners in other parts of the world who would like to teach our material, the material that we've developed, in the style that we've developed it,
to their markets, in their language, at their various price points. That is sort of the big hypothesis right now: can we find and make successful a set of external trainers that we trust to deliver our content at high quality, similar quality to what we do, in their market, in their language, ultimately at scale, right? So not just
at global scale, where we've got partners all over the world, but where these folks are actually making a living, making a business out of this as well. So that's the big hypothesis that we're working on.
David J Bland (3:10.006)
That sounds amazing. I think it's a challenge a lot of our listeners have, especially people who are trying to do work on their own and they're sort of the face of the work, right? And the challenge always is, well, you scale services businesses on people and billable hours. And that's always... you know, I think there are other ways to do that now; we can test our way through doing that and productizing it in different ways. Josh, when you hear, you know, Jeff speak about
that, and I know you are in close contact and collaborating quite a bit, what do you think? How do you start breaking that down into smaller pieces to test? Because it seems so big and so overwhelming at face value.
Josh Seiden (3:44.958)
Yeah.
Josh Seiden (3:50.312)
Right. Well, you know, I mean, I think the way that we work is we say, OK, well, what are the assumptions here? Right. And so there are kind of three big assumptions here. We just talk about it like, where are the risks here? What do we have to prove? What has to be true for this to work? And so the first one was we had to be able to attract a group of high quality
people who were interested in training our material. So could we find partners who wanted to work with us? What's the demand there? And I can talk about sort of how we're testing all of this, but I'll start just by talking about the assumptions. So the first is, are there trainers out there who wanna do this? Then the second was, can they build a business? Can they fill the seats in their training?
Josh Seiden (4:48.348)
That was kind of the next level of demand. And then the third, the third kind of risky assumption, is can they do it well? Like, can they deliver a high quality service that people are happy about, that they recommend, and that they come back to with excitement as repeat customers? So it's really kind of those three big ones to start with. And I think this is kind of true
for most businesses that are getting started: you're looking at the different pieces of the market and you're saying, well, we assume this about each of these pieces. Is that true? Because you have to have all of your different constituents in the market. You have to kind of prove out the proposition from all of their points of view. And so those were the kind of three big initial hypotheses that we're kind of in
the middle of testing now.
David J Bland (5:50.858)
So Jeff, when you hear Josh mention those, it sounds to me as if it's a multi-sided market in a way, right? You have the trainers, or do you have trainers, can you find trainers you trust that are good enough, and also demand. And I'm wondering, what do you feel is the riskiest part of that? Is it on the demand side? Is it on getting the trainers? Do you feel as if there are people that will come to other trainers who aren't you
and want this content? I'm just wondering where you're drawn to, because I feel as if multi-sided markets are super hard to do well and you almost have to do discovery on both sides. And just based on what you're seeing as a signal in the market, I'm just curious where you feel like... what's the risky side of each of those?
Jeff Gothelf (6:34.921)
Yeah, there's a lot. Look, a couple of risks. Number one, our name is on the box here, right? Like, you know, we're branding the business Sense and Respond Learning, but Sense and Respond Learning doesn't have a huge brand presence. Sense and Respond Press does a little bit, but Sense and Respond Learning doesn't really have a huge brand presence. So all this stuff is coming out as, you know, Jeff and Josh certified training partners, Jeff and Josh created content, that type of thing. That's a huge risk. Like my name,
Josh's name, is on the box. And so whatever ends up happening with this, when you kind of hand this over to somebody else, there's a huge reputational risk. Before we even get into business models and that type of thing, there's a huge reputational risk here. I think that's one. On a similar path, there's a quality risk here. One of the reasons that Josh and I are successful at this is that we're good at this.
And I don't mean that as a humble brag. I believe that to be true. We deliver a high quality experience with good content. It's the feedback that we get from our clients. It's the reason we've been successful in building this training business. And so the second question is, can we get these folks to deliver at a relatively similar level, right? Maybe they won't deliver at the same level as us, but they'll deliver at a similar level.
And so that's super interesting to me. Now what's fascinating is that, initially, those were the two things that I was really worried about. And since we've gone ahead, we've been working on this for a few months now, the risk of can we find people to do this has been invalidated. We can totally find people to do this. In fact, just today, literally today, hours before
this recording, I had four inbound requests from four different countries, people interested in partnering with us on this. So that has been... I put out a few cold outreach emails and LinkedIn messages to some folks that I trust and know in a couple of different countries and asked if they were interested, and they were. And then as soon as we started talking about this publicly,
Jeff Gothelf (9:0.203)
inbound requests to join have been plentiful. I mean, at the moment, we've got seven countries in flight and a backlog of three confirmed, I think, and then like half a dozen that we're filtering right now. So that's been proven. And the experiment literally was cold outreach, right? That's how we tested, with cold outreach. And then as we started talking about it,
it's just people like, I heard you were doing this, I'd like in on that. Um, so that's been incredible. And so then, you know, then I thought, okay, quality and reputation are really important, but what we're finding is an interesting challenge right now. We're trying to figure out why... as Josh said, it's can these folks put people in the room? Cause we're pushing for in-person classes, live in-person classes,
partially because that's one of our assumptions, partially because of language. So we're in a bunch of different countries. So for example, the training partner in Austria, he's going to teach in German. And so most German speaking markets are going to be up in that part of the world. So that's not an issue. But if we think about, for example, our training partner in India, they're likely going to teach in English.
And if they're offering an English language course online in India, what's to stop somebody from London from taking that class, because it's going to cost one sixth of what the class in London costs? Right? So we've got those kinds of interplays. So to simplify things, for our first round of experiments with these folks, we've left it at in-person classes, in your market, in your language.
And now the challenge has been, can we put folks in the room? And that's been really interesting and slow to move forward. And we're trying to figure out why right now.
Josh Seiden (11:6.718)
And just to kind of build on what Jeff said, because David, your question was which side of the market do you test first? And we didn't really have to struggle with that question too much, because we got such rapid validation on the trainer side of the market. It was just like the signal was super clear. It was overwhelming. We were ready to move on to the next set of assumptions.
David J Bland (11:35.793)
Now, I like cold outreach. I know people just say, oh, it doesn't work anymore. But I mean, you're proof that it still does. People still do it. How would you go about cold... do you feel uncomfortable doing cold outreach? I'm just wondering, does that make you feel very uncomfortable doing so?
Jeff Gothelf (11:46.054)
And you know what I think?
Jeff Gothelf (11:51.299)
So in this case, no, simply because, look, there's a couple of things that we were leaning on. Number one was our network and number two was our personal brand. Right. And so we have a decent network of folks that we know around the world. We've worked all over the world over the years. And so we know people in different countries. So it wasn't... maybe cold was an exaggeration, at least for the first few; it was lukewarm.
Josh Seiden (12:16.436)
Yeah. Yeah. I mean, I think maybe cold outreach was not the only tactic, because the other thing that we did was we posted about it on LinkedIn. And we said, hey, we're doing this. Right. It was more like talking about it publicly. It wasn't just... I mean, there were targets that we did cold outreach to,
but that kind of socializing it on social media to sort of start getting the message out there was, I think, an important part of it.
David J Bland (12:54.000)
Okay, so you're doing warm slash cold outreach, getting folks to come and line up, sounds like, essentially to do the training, which doesn't surprise me. Yeah, it doesn't surprise me. I mean, that's something you do have to kind of almost throttle a bit, because I can see an influx of people going, oh, wow, I can make money teaching your stuff and, you know, raise their brand as well.
Josh Seiden (13:4.733)
Yeah, we've had to slow play it.
David J Bland (13:21.191)
And then you mentioned the second part of that, which I think is interesting, like almost like your risk moves around, you know. It's like, OK, we have a bunch of people lining up to do this. However, we're not sure they can actually fill up these workshops, because maybe they're coming forward because they think, oh, it'll be easier for me to fill up the workshops, or maybe you fill them up for me. You know, so I'm thinking, like, how do you work through that, Josh? Like, how are you helping test your way? I know this is probably just beginning. It's still ongoing and in flight at the time of recording this.
But how are you approaching that problem of, well, how do they fill it up or how do we start testing our way to see if they can fill these up?
Josh Seiden (13:57.524)
Right, well, you know, I think...
Josh Seiden (14:2.002)
So our starting process for that was to say, look, and we had some debate about this. We had some debate about whether it made sense to open one market as a test market or open a handful of markets as a test market. And we decided that there was some value in maybe creating a trainer community.
that might help us with kind of our third assumption, which was about quality, right? And so, you know, the activities sort of go to both assumptions. So we decided to open a small handful of markets and we created a cohort of trainers, and we meet on a regular basis, both to kind of help them learn the curriculum,
right? Kind of a train-the-trainer situation. These are experienced people. They're bringing their smarts to the table, but they don't necessarily know our curriculum. So part of it is bringing them up to speed on our curriculum. But then part of it is supporting the sales process and figuring out how we do that. And so, you know, what's interesting here is that
I think for a lot of people, they assume experiments are fast, right? But experiments take as long as experiments take, right? And in this case, it's taking a full sales cycle for a workshop. We started it in early summer, which is like the worst time in the world to do sales, right? When everyone in the world is on vacation. And so we've had to adjust some deadlines, but now that we're back, it's the fall,
we're kind of leaning into the sales and we're just meeting to kind of work through that workshop sales process. And, you know, as Jeff was talking about, one of the challenges right now, one of the risky assumptions, is that we can build this business on in-person events. And, you know, whether or not the world is ready for a comeback of in-person events, we may be a little early for that.
David J Bland (16:26.477)
So Jeff, when you're thinking through how to test, well, one, how to test through building community, I'm curious about your thoughts on that, because this isn't the same world, right? Like when you and I were working, when we all were working together, you know, back in the early 2010s. And now you think about community building and it seems like it's a completely different world after the pandemic, where now,
and I was even having a conversation with our peers earlier this week, communities even start being built online and then go in person, whereas when we think back to when we were getting started in our careers, it was all in person and then, oh, it'd be nice to have some kind of online component. So how are you testing your way through that? I mean, how are you factoring in how much has changed since we all were working together?
Jeff Gothelf (17:17.609)
Yeah, so far the community that we've built is for the trainers only. And then, you know, so for example, we've got two Spanish speaking countries. And so that... well, let's talk about how we're testing it. So we're testing it with Slack, right? Let's be super specific. It's a Slack group. We've got every trainer who is participating in there, including if they've got a partner who they're training with. And we're using that as a central communication channel, a distribution channel for content, for marketing materials, for
sharing ideas. Hey, we just recorded a 30 second video with the Colombian team today. Here's what it looks like. If you'd like to do something similar with us, let us know. And it's interesting who's taking us up on what. And so we keep tossing out marketing ideas in there to see what they're doing. It's interesting to see how and when they collaborate with each other, and for us to learn from that and then prompt for that in the future. So as I was saying, we've got a team in Colombia. We've got a team in Spain.
Um, they collaborated on the translation work and the, um, the localization work, more or less. It's similar languages. And remember, these are all experiments, right? So at this point, while these trainers certainly do have to go find a physical venue for this and potentially book it and maybe put down a deposit for it, the experiment here is a landing page test. It's an Eventbrite page that talks about when this thing is happening,
what it's going to cost, who's teaching it, what it's about. And then it offers the opportunity to sell tickets. And if enough tickets sell, then the event gets held. Now the interesting thing is this variable of enough. And we've left that more or less in the hands of the trainers. So when they feel like they've got enough... we've given them a max, kind of. We said, look, don't go more than 20 people for your first time out, but
you decide. If you've got five people and that's good for you, 10 people and that's good for you, whatever it is, you decide. But the test there is a basic landing page test.
David J Bland (19:30.018)
I love that. I love that you're using the techniques. It feels as if there's this sort of thread in the community sometimes where folks... it's easy for us to come in and say, because I do similar work to you all, to say, oh yeah, here's how you run your tests, you see this, this, and this. And you're always giving that advice. But when you take it and use it on your own work, it does feel different.
And I mean, all of us, we all wrote books, right? And testing my way through writing a book was quite a humbling experience for me, applying it to my own stuff. And I think it's just refreshing to hear people who are sort of thought leaders in the community about these topics share how you are applying them internally to what you're trying to do. And you don't have all the answers. It's not that you just...
Josh Seiden (19:58.931)
Right.
David J Bland (20:16.101)
have this single brilliant idea and you will it into being and then it's successful, right? Like you're actually using some of the techniques. So Josh, you know, when you think back on how you arrived at this point in trying to build this training, the scaling of this training, can you think back to any experiments you've run in the past or maybe coached on, where you feel like, oh yeah, you know what, I think we might be able to use that here, and that's influenced how you're thinking now?
Josh Seiden (20:46.227)
Well, I mean, I do think that, you know, one of the...
Josh Seiden (20:54.408)
One of the dynamics for me is sort of, I think when we use this idea of testing or this word testing or this word experiment, we're kind of implying that everything is going to be science and data. And like, yeah, it's all data, I guess, but it's not necessarily quantitative data.
And so, to Jeff's point earlier, sort of, you know, we've got these landing pages and we haven't specified where the minimum number is to hold an event. And I think that, like, you know, that's it. There's a question there for me, which is: what makes it worthwhile for our trainers?
Right. And so if you leave that open... in some ways, the more things you leave open, the more you have the ability to... they're like open-ended interview questions, right? You get a kind of rich response from people, where if you say to people, is 10 people enough? They'll say yes or no. But if you say to people, you know, well, how many people is enough? They probably will say, well, we don't exactly know.
And that's interesting, right? I think to me, that's a more interesting answer because it leaves open a whole bunch of related questions. So I think that's one of the things that, for me, is a guideline for these tests. I mean, you asked me, you know, if I think back... I think before the pandemic, Jeff and I were working on kind of a different way to scale the business. Again, it was an events business, and
the pandemic just shut that one down. But what was interesting to me about that business, or the experiments we were running there, was that they were low risk and relatively open-ended. Let's try this with the smallest possible commitment we can make to it, but also leave it small scale and open-ended so that we're learning from the feedback.
David J Bland (23:20.083)
Yeah, keeping things open-ended, almost like exploratory, like you're probing, right? So when you talk about Sense and Respond, you're kind of probing to see, and then inspecting and adapting on that. I do think we get maybe a little too overzealous sometimes about acceptance criteria. And I'm often conflicted by that too, because I do a lot of this and people say, well, how do we know if we're right?
And in some of that early stage discovery, honestly, it's really hard to answer that question. Or sometimes we say, we'll be thrilled if... I've had some people say, well, can we just change the language and say, instead of being right, we're really excited about this, or thrilled, and it takes some of the pressure off. Jeff, similar question to you. I'm just... go ahead, Josh. Yeah, go ahead.
Josh Seiden (24:4.576)
I was just going to say, when you're right, you know it. It's a slam dunk. Are there trainers interested in working with us? There was no ambiguity in that data. Bam, we have more demand than we can handle. Ticket sales... I'm going to just say it's hard to know. Is it because... we know enough about the events business to know that you sell tickets when you release them,
and then everything is slow and silent, and then just before the event, ticket sales ramp up again. So where on this curve are we, what does this mean, are we now in this part of the curve, right? Uh, and so when you're wrong, you don't want to admit it. And so you try and round the data up, I think. Right. Or when you get a negative result. So how do I know? I mean, when you're right, you know it. And when you're not exactly right,
you kind of have to admit it too.
Jeff Gothelf (25:4.415)
You know, what's interesting, though, about acceptance criteria is our training partners know us, and they know who we are, and they know that we believe in this approach to testing and learning and adjusting. And they asked us, they said, what does success look like? Because this initial cohort of experiments is literally a one-off. We've made no commitments to these folks beyond this one time.
And they said, how will we know if this was successful? And we can't answer that, right? Because there are so many variables. There are so many variables with each of these experiments. For example, if it was just a simple thing, if you put 20 people in the room, that's a success and we keep going, okay, right? But it's not that simple. What if you put 20 people in the room, but you're a pain in the ass to work with, right?
Or you did a crappy job, or you're sort of not living up to the brand, if you will. Right? Cause again, it's our names on the box. Is that a win? I don't know. What if you put six people in the room, but you knocked it out of the park, did a great job, they loved you, they told all their friends. Right? Is that a win? It's probably more of a win than the person who put 20 people in the room.
And so there are qualitative aspects to the acceptance criteria here that we can't articulate upfront. Because they're qualitative, I can't even put a bar in front of these folks and say, if you achieve at least this, then that's a passing grade, if you will. Because it's an I'll-know-it-when-I-see-it kind of thing. There's that aspect to it.
David J Bland (26:53.452)
Yeah, I come back to art and science, you know, or pulling from social sciences a bit. And I do think our ability to quantify everything nowadays does work against us, especially in open-ended discovery, where we don't know what success looks like yet, but we know if we have directional evidence that we're on the right track or not. So I do get conflicted by it sometimes.
Jeff Gothelf (26:56.224)
Yeah.
David J Bland (27:15.312)
Not so much later down the road, when we know we have something where there's a real value exchange and we could say, look, if you get 20 people in here and you have an NPS score or whatever, and this, this, and this happened... it's all more knowable once you've had repeated events. But really early on, it's really tough. I often put myself in the shoes of people I'm coaching. I was like, wow, what would I say here if I said, well, this is how we're going to draw the line in the sand?
And there's no external data or research we could look to to make that conclusion. So we have to kind of sometimes ask ourselves, well, if it's our money and we're investing in this, what would we need to see? What direction would we need to be going in? And it's not always cut and dried. I think that's where, like you said, with the scientific method, it's not all quantitative all the time, especially in the world we work in.
Josh Seiden (28:6.516)
You know, David, one of the things that you just said, if it's our money and we're investing in this, what does good look like for us, right? Like to take that investor's point of view. And I think when we started this, we actually had that conversation with each other. And we said, okay, well, you know, what would our business look like if it was appealing to us as investors? Right? And we came up with a bunch of different versions of what that business might look like.
And one of them is this version that we're testing now, you know, with an eye towards a certain kind of business, a certain shape of business, that has these trainers in local markets. And so there are some other alternatives that we thought about, which are kind of waiting in line, I would say.
But yeah, I mean, that's sort of part of it: it starts with, oh, we've got a couple of possible ways we could go, right? And we need to focus our work. So this one seems like the most credible. Let's start with this test. And then if this doesn't work, we pivot to the next idea that's in line.
Jeff Gothelf (29:26.435)
And I want to add one thing too. Sorry... one of the things that we also have been doing as part of our learning journey has been to interview people who have built businesses like this, as well as sort of franchise-based businesses that aren't necessarily training businesses. And that's been fascinating.
David J Bland (29:26.885)
Jeff, I'm wondering, in your work, go ahead. Go ahead.
Jeff Gothelf (29:53.033)
Right. So we've talked to folks who've built and sold businesses like this. That's one, and that's great insight. We've talked to folks who have built, you know, retail franchise businesses, and talked to them about how they set up their relationships with their franchisees. You know, we're not franchising to these folks per se, but it's a very similar type of relationship. And it was fascinating to learn from those folks, because they challenged our initial assumptions about the business model
here for us. Our default assumption here was revenue share with the training partners. But these folks who have built franchise businesses said, look, there are two or three other ways to do this, and if that doesn't work, here are some other ideas for you to move forward on. And that's been fascinating for me.
David J Bland (30:45.541)
I think getting that kind of input, where you have your assumptions, but when you interview people that have also done something like this, they can bring out assumptions that maybe you didn't think of, right? And all that. I'm wondering, in your work, Jeff, we'll start with you: does any of this influence how you coach teams? When you run through experiments like this and you see what works and what doesn't and you learn unexpected things, does that change your approach of how maybe you would go into a corporation and talk to a sponsor who
Jeff Gothelf (30:55.531)
100%.
David J Bland (31:15.471)
maybe doesn't know what success looks like, and help you have more empathy there?
Jeff Gothelf (31:19.807)
I think, look, a hundred percent, right? I think what this does for us in our work is it gives us, um, good stories and credibility to drive inspiration for new ways of thinking about the work that we're doing with the client, or that the client is doing on their own. And so if the client says, well, how would you handle a situation like this? Well, I can tell you about how we've been running this experiment with our business, and we share that with them and what we've learned. And it's, okay, good. What can we take away from that? How might that motivate you? Right. Because...
Because inevitably, one of the ways, and we've been doing this for years, and I suspect you're familiar with this, David, but one of the ways that we teach experimentation is with time horizons. And so the question is, hey, if you had a month to test this, what would you do? If you had a week to test this, what would you do? If you had a day to test it and had to learn something about this, what would you do? And we use those time horizons. And I think that, you know,
sharing our stories and saying, look, here's our one month experiment, in this case, it's multiple months. Here's our one week experiment, right? Setting up a bunch of calls with franchise business owners, et cetera. And here's our one day experiment: a LinkedIn post talking about this, that type of thing. And so there's a lot of inspiration and credibility that comes into the work. And I hope it
validates the work in the eyes of the client and inspires them to give this a shot, regardless of what business they're in.
David J Bland (32:53.845)
Josh, similarly, do you feel as if there are certain situations where maybe something didn't go as well as planned in your experimentation, and you went, oh, that made me change how I might approach this with a client?
Josh Seiden (33:8.243)
You know, I think for me, a lot of this... Jeff mentioned this kind of idea of time horizons, right? And I think that's an idea that we learned from our mutual friend, Janice Fraser. That time horizon is a great thinking tool: if you had one day, if you had one week, if you had one month. But that doesn't necessarily fit into the business cycle, right?
And so even for us, you know, we're two and a half people, you know? So like, we're not even three guys in a garage, we're two and a half. And so, um, you know, you think like, oh, well, three guys in a garage can go fast, right? But business cycles are what business cycles are.
And you can't go faster than a business cycle, right? And there's a sales cycle for a public workshop. It takes a certain amount of time. Right. And so, you know, I think it's easy when you're not inside a big corporation, or when you're working with enterprise clients, because a lot of our work is with enterprise clients... it's easy to get impatient with how slowly things will move in the enterprise. Right. And, um,
I think it's useful for us as outsiders to kind of continue to push our enterprise clients: go faster, go faster, go faster, because there's nobody pushing them to go slower. They got that covered, you know? But, you know, business cycles take as long as business cycles take. And so being able to have patience and confidence and perspective through an experiment that takes, you know, a number of months
is tough. I mean, it's just tough emotionally. You think like, why am I spending all this time on this idea that might not work? You know, so for me, that's it. That's one of the big reasons to do it myself, you know, is just to... you know, I believe in the method with all my heart, but
Josh Seiden (35:31.133)
That doesn't mean that it's magic and it solves all the problems.
David J Bland (35:37.509)
There's something in there about working within constraints.
You know, like, yeah, if we had all the resources in the world, we could test this a certain way, but we don't. So, you know, what could we do within these constraints? And I think time horizons are a great way to sort of apply constraints to that, right? And see, what can we do in a day or a week or a month? So I love that framing. Huge fan of Janice's work too, so shout out to her. So, Jeff, as we begin to wrap up here, what gets you most excited about this? You know, what excites you about all this kind of uncertainty? Like, we're not all great at... I feel like a lot of us are risk averse.
And this is, like, the least risky thing we could do, running our own businesses. So what gets you excited about this, you know, dealing with this big push that you're trying to do here to change the business?
Jeff Gothelf (36:14.784)
Yeah.
Jeff Gothelf (36:22.303)
Look, as Josh has mentioned before, we've been thinking about a variety of ways to scale the business for a long time. This is the most promising one. And I have to say, I'm super excited by the inbound interest from folks, and surprised, by the way, by the inbound interest from folks that want to collaborate with us. I had no idea this demand existed. None. Until we actually said something out loud about it,
I had no idea this existed. And to me, it was like, you know, again, our name is on the box. And if we put this out there and no one wanted to do this with us, that would suck. My ego would take a beating there, you know? I'd be like, why does no one want to do this with us? So what gets me excited is that there is a large community of folks that are interested in doing this with us all over the world, literally all over the world, and in a bunch of languages.
Josh Seiden (37:3.966)
Yeah.
Jeff Gothelf (37:22.095)
And that gives me the motivation to really push through and see this through. As Josh was just saying, like, why am I doing this? It might fail. I'm psyched by the energy coming from the folks that are partnering with us, these training partners. And it gives me kind of the motivation to push this and do whatever we need to do to try to make them as successful as possible. That's super exciting for me.
David J Bland (37:48.448)
Josh, similarly, what makes you excited about this? What gets you excited about all this uncertainty?
Josh Seiden (37:54.515)
Well, so I mean, I will say, you know, I've spent my career as a designer and product person in technology. So I like ambiguity and abstraction. Like, I just like it. It's interesting to me purely intellectually, you know? And I think I'm pretty good at ambiguity. So it's a pretty comfortable space for me to live in. So that's the first thing. What gets me excited?
All of the possibilities. And so the flip side of what Jeff is talking about is there's signal there. There is signal from the market that this is interesting. And if this conception, if this response to that signal, isn't the right response, I know there's another response to it. I know there's another way. And so I just, like...
I just think, I like the idea of just continuing to take shots. There's something there, so how are we gonna figure it out? And to me, that's part of the creative process. And for me, that's what makes this fun.
David J Bland (39:14.749)
Thank you, thank you. Jeff, if someone's listening to this and they say, you know what, I wanna be a trainer or someone's going, hey, where are these workshops? I wanna go to one of these workshops. What's the best way for them to reach out and get in touch with you all?
Jeff Gothelf (39:28.309)
So they're all branded under Sense and Respond Learning on Eventbrite. So if you search for Sense and Respond Learning Eventbrite, you'll find the page and all the listings. We've got a website, it's senseandrespond.co, where you can see all the listings as well. And obviously, if you need to reach out to either one of us individually, you can find us on LinkedIn or on our individual...
David J Bland (39:51.835)
Awesome. Thanks so much, Jeff and Josh, for hanging out. We went everywhere, from reminiscing back to the days when we first were working together at NEO, to how you scale a training business beyond just people and billable hours and how you start productizing that, to how you're applying all this stuff that you teach day to day and coach real companies through. How do you apply it to your own work, and what do you learn from that? So I just really appreciate you being open and transparent, hanging out with us, and sharing what you're learning in your journey.
Jeff Gothelf (40:20.523)
Thanks for having us on the show, David.
Josh Seiden (40:20.537)
Thank you for having us. It's nice to reconnect and fun to talk about, so thanks.
David J Bland (40:27.443)
Thanks again, guys.