How I Tested That

In this conversation, David J Bland and Lex Roman explore the evolution of experimentation in tech, discussing how perspectives on experimentation practices have shifted over the years. 

They dive into the role of social proof, ethical considerations, and the balance between experimentation and actionable insights. 

Lex shares her experiences working with non-tech industries, particularly journalism, and the challenges of measuring marketing effectiveness.
 
The discussion wraps up with Lex's innovative approach to helping journalists grow their reader subscriptions and the creation of a playbook for success.

Is your innovation pipeline clogged?
  •  Uncover the risks, bottlenecks, and gaps holding your best ideas back.
  •  With the EMT Diagnostic, you'll get a clear, actionable plan to fix them.
👉 Book a free discovery call at https://www.precoil.com/innovation-diagnostic

What is How I Tested That?

Testing your ideas against reality can be challenging. Not everything will go as planned. It’s about keeping an open mind, having a clear hypothesis and running multiple tests to see if you have enough directional evidence to keep going.

This is the How I Tested That Podcast, where David J Bland connects with entrepreneurs and innovators who had the courage to test their ideas with real people, in the market, with sometimes surprising results.

Join us as we explore the ups and downs of experimentation… together.

David J Bland (0:1.277)

Welcome to the podcast Lex.

Lex Roman (0:3.121)

Thank you for letting me air my grievances about experimentation on your show.

David J Bland (0:9.731)

I'm so excited to have you. I remember our first project together way back at Toyota. And I vividly remember like the first day. So we're doing all this discovery stuff and we had a couple of people on the team already and myself. And I remember you walked away, I think to grab a drink or something. And the team members looked at me and they were like,

Why didn't we have her on this team months ago? And I was like, I don't know. We just couldn't find anyone, someone like Lex. And, you know, she's really talented, and we have her here now. And so I was just reminiscing about that the other day as I was preparing for this episode. It was such a fun time. So many different areas we were able to explore, but a lot has changed since then too. So maybe you can give, you know, our listeners a little bit of background about you and kind of

Lex Roman (0:35.089)

Ha

David J Bland (1:1.299)

catch us up to where we are today.

Lex Roman (1:3.419)

Yeah, so I started my experimentation career when I started my tech career. I was really lucky to take Jaime Levy's UX class at UCLA. If you know Jaime Levy, she wrote a book called UX Strategy, and she taught me about the lean startup and about Ash Maurya and Eric Ries and sort of this whole lean MVP experimentation method that we continued working on at Neo.

And so I had done a little bit of that before Neo with some startups and small businesses in LA. I was super excited to work with you on Toyota. But I think that we were just, like, really still pretty early to that culture, which is why it was so hard to find people who knew what experimentation was and knew how to do it. Like, that was being invented at that time in our realm, right? Like, obviously Toyota hilariously invented it years ago in manufacturing. And there's, like, other

fields, you know, science has been experimenting for centuries. But in tech, since tech was so new, we were coming up with that. We were like, what does an experiment plan look like? What should we be experimenting on? Right? Like, there's some interesting tests we ran on that project that I don't think I would do today. And a lot of my perspective has shifted about how you run experiments. So since then, you know, I've done other large-scale experimentation projects with companies like Nissan,

And I've also worked with a lot of startups and like tiny business owners. Now I'm working with newsrooms and the conditions change so much based on who's running the experiments and who the audience is.

David J Bland (2:44.628)

I agree. Ten or so years ago, when we think about it, well, we were talking about lean, there was a lot of lean language, right? And that was pretty common in manufacturing, but it's not trivial to take that type of thinking and apply it to tech, or apply it to software and product development. It's like, yeah, there's waste, but it doesn't translate easily. And you mentioned something that kind of stuck out to me, which I agree with,

which is, there are some things I did early days that I don't recommend anymore. And there are actually some experiments I didn't include in the book. So maybe we could start there a little bit. What are some things that you've changed your perspective on over the years with regards to experimentation?

Lex Roman (3:29.637)

Yeah, okay, so one thing is that with Toyota, what we did on that project, and I know you've talked extensively about this project, so there's a lot that folks can look up if they're not familiar with how that project went down, but we made like a straw brand, right? We like invented a fake brand. We didn't necessarily tell people we were with Toyota and we were running those tests. We made a landing page and a whole fake brand for that product. But the reality is, it's extremely different.

for Toyota to launch a product than for a random startup that doesn't exist to launch a product. Those conditions are substantially different. And there's absolutely no reason for Toyota to test a non-Toyota entity launching a product. Like there's just not, I can't see a reason to do that. And it puts us at a huge disadvantage to do that. And I think the question becomes, would Toyota...

have gotten a bad signal about product adoption? Because they are Toyota, they have such strong market saturation, they could have sort of forced the market to use their product, right? Which we see Facebook and Google do, and ChatGPT doing. And is that, like, a bad business decision, because maybe they would let this thing limp along for longer than it should, right? And actually would be costing them money and not generating revenue. I think that's a different way to look at it, but...

The reality is like, you don't need to invent a brand if you have a brand. And I don't think there's an advantage to doing that.

David J Bland (4:56.695)

Yeah, I've come around a little bit on my thinking with, how do we use labs brands, or brands that aren't completely disconnected from the corporate brand, but where we're also being upfront with, look, this isn't a fully built-out solution yet. This is something we're considering, we're doing discovery on it. And I do think with large companies, there's still a little bit of this fear of, no, we can't use our actual brand.

And you see them doing things like on Indiegogo, where they'll throw up a labs brand or some sort of, you know, innovation brand or something like that, a sub-brand. And it's still connected to them. Another thing I think is really interesting, going back to how we've evolved our thinking, and maybe it's just 'cause I'm older and grumpier now, is the idea of social proof on some of those. So if you're going off-brand,

and then you're coming up with, oh, we're available in all these cities, when you're not, or you're putting up testimonials that are not quite fully transparent, then you're also causing people to sign up for something where the social proof doesn't necessarily exist. And so you're biasing people to take an action that maybe otherwise they wouldn't. What are your thoughts on that?

Lex Roman (6:12.517)

Yeah, yeah. The social proof point I think is really interesting, because I definitely have made up testimonials in the past. Like, I am guilty as charged. I've seen you talk about this, and I was like, that's me, I have done that. I don't do that anymore. I want you to know. But.

What I've seen founders doing recently, there's an interesting thing with social proof because people will leave a testimonial on a product or a service, but they'll also leave testimonials on people and brands, right? So the social proof thing translates. If Google launches a new product, the social proof that they have from Gmail and search and whatever, it transfers to that new product or some of it does.

The same is true of a startup. If you're a startup founder, even if you've had a failed startup, if you're trustworthy, if you had, like, a couple of superfans, those people still care about you, and that's still valid social proof, even though they didn't use the new product. And so I'm seeing founders take,

like sort of build a snowball of social proof in that way. Like they take social proof about them as founders, which is really interesting because like we know VCs bet on the team, right? And customers in many ways do too. So that's an interesting switch on that. It's like not necessarily about this new venture. It's about me, the founder or us, the team. I think that's a different way to spin it. And it sort of goes back to the like, do you create a brand out of the sky or do you use Toyota or Google or Indiegogo or whatever?

But it's like that social proof aspect is actually really huge when you're pulling something new off.

David J Bland (7:44.253)

Yeah, I like that. The way you're framing it around a snowball effect, and the brand social proof comes forward. I think, I would like to think, that people are not still using fake testimonials on their pages and that they've also evolved. But I also look at the surge of AI, and I'm also wondering how many people are just generating

testimonials through AI, and entire life stories about people that don't exist. So I'm not giving advice saying, put fake testimonials on your page to get social proof. I've also phased out some of the things, or I see them far less popular, like mock sales and simulated sales. It seems like a lot of effort to go through, and then realize, okay, there's nothing there to purchase. I've also...

Some of the experiments I didn't include in the Testing Business Ideas book were things like, well, we're gonna copy our competitor's site, and then we're just gonna put our logo on it and test that. And I felt like, that's a great way to get sued. And I don't know if I agree with that, especially for corporate brands, right? It's like, oh, let's just copy our competitor and throw a logo on it.

Lex Roman (8:49.925)

Yeah. Yeah. It's like those Y Combinator guys that just forked the code. Did you see that? Those guys quit their startup, they forked the code and then got caught.

David J Bland (8:56.850)

Yeah.

David J Bland (9:0.144)

So while I understand, maybe from a pure, devoid-of-all-emotion, scientific-method approach, I could see why you could justify that. But when you think about social sciences, and you start thinking about the impacts, and do no harm, and everything, I don't necessarily agree with just copying somebody's stuff and putting your logo on it and testing it. Another one was cancelable purchase orders. I thought that was a very odd type of validation. Now, yes, you would get some sort of signal where people would sign a

purchase order, right? That they actually want to pay for your thing. But then you cancel it. And I'm thinking, wow, as a B2B company, you're just burning bridges one after another. And why would you go through all that work, you know, just to destroy the trust? And so I've backed off over the years. I know there's still a lot of controversy around lean startup, but I just feel like with some of the lean startup stuff and the growth hacking stuff, you know, we had lots of meetups we ran for years and years and years, and some of the stories were like, oh...

I feel like I need to go take a shower after I heard that story. And I was like, I don't really want to evangelize some of that. I do think people should test before they build something that nobody cares about and they waste time and effort and energy and money. I do believe in that. But there's a balance you have to walk between the late 90s of just vaporware and also building the whole thing and realizing nobody cared.

Lex Roman (10:19.589)

Yes, I think so too. And it's like, there's two things there. One is, like, the ethical considerations of, like, you're dealing with human beings. Like, you need to be aware that they're not experiment subjects, right? We want you to get signal on the thing that you're building, but you can't just be, like, playing with people's lives and livelihoods. So definitely that. We've come a long way in the last 10 years, I think, on some of those things.

You know, we were coming up against these headwinds in experimentation and lean at the time, when that was really our framing, where we were just trying to get people to adopt this mindset, I think, right? And we weren't as concerned about the tactics, because they were already pushing back on us about running the experiment at all. We're now on the other side of that, where I think a lot of people are running experiments. And so the ethical considerations become much more important,

because now we actually really have to talk about this. You know, Facebook ran this, and yeah, it worked for them, but actually it wasn't that great that they did it that way. And maybe we're not gonna do it that way anymore, right? Like, there's been a lot of discussion about Substack's recommendation engine as an example of a dark-pattern growth hack. I don't know if I agree with that, but it's an example of something that's starting to get pushback from the experimentation culture that we came out of and popularized.

The other thing is that there are actually a lot of things that we just have enough evidence work, so they don't need to be tested. I think some people are over-experimenting on things that don't require additional validation. Like, there's always an experimentation angle to be pulled out, but some things we just know work. So, you building a relationship with a B2B company that's interested in what you're building, and them legitimately showing interest, that doesn't require an experiment. That's just...

just a phone call, right? And like, I told you, David, that I had a member in one of my programs who was like, email marketing doesn't work for me. And I was just like...

Lex Roman (12:18.691)

Email marketing works for everybody, right? Like that's not the takeaway. The takeaway is like your message was bad or you didn't build your list correctly or the subject line wasn't working. But I think people are like overly precious about, every single thing needs to show this like objective data point. And it's just not gonna do that. Like it's not gonna do that. You can also look at other people's validation and say,

this email marketing is working for everyone else, ever. That's probably sufficient evidence that it could also work for me.

David J Bland (12:51.703)

Yeah, I remember you and I being in sessions together, even at Toyota, right? Where we were doing, like, assumptions mapping. We would do this little two-by-two, and we'd try to narrow in on the stuff where it's like, look, these are the leap-of-faith assumptions. This is stuff we have to test, or we're just gonna, you know, kill the whole thing. But I always run into teams where they look at the top left, the stuff that, yeah, is important, but is relatively known, or they already have a lot of evidence on it. And they're like, yeah, but we can move that just a little bit further left. And I'm just wondering, what do you think drives that behavior of,

Wait, I don't want to kind of deal with all this scary stuff that's super important where we don't have evidence. And I just want to play over here where we have evidence, but we can always get a little bit more evidence and be a little more confident. Like, what do you think's behind that?
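The two-by-two David describes sorts assumptions along two axes, how important they are and how much evidence already exists, and the leap-of-faith quadrant is the important-but-unevidenced one. A minimal sketch of that prioritization, where the example assumptions are hypothetical placeholders, not from the episode:

```python
# Assumptions-mapping 2x2: test what is important AND lacks evidence first.
# The assumptions below are made-up examples for illustration only.
assumptions = [
    {"name": "customers will pay monthly", "important": True,  "has_evidence": False},
    {"name": "email channel reaches them", "important": True,  "has_evidence": True},
    {"name": "logo color preference",      "important": False, "has_evidence": False},
]

# Leap-of-faith assumptions: important, but no evidence yet.
leap_of_faith = [a["name"] for a in assumptions
                 if a["important"] and not a["has_evidence"]]
print(leap_of_faith)  # ['customers will pay monthly']
```

The "move it a little further left" behavior David describes amounts to re-testing items that already have evidence, which this filter deliberately excludes.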

Lex Roman (13:32.859)

I mean, in my mind, it comes back to the obsession with data-informed decision-making that has really been popular the last, I don't know, five years. Tools like Amplitude and Mixpanel and Segment have really made it so that data's pretty democratized now. You don't have to have a data scientist on your team. And I think that, in my experience with teams,

they feel like every decision needs data. So even for the decisions that feel relatively known, they're like, but where's the data? Where's the evidence? I've been told I need data, and I need to go get it to validate that assumption. And I agree, like, it can be a total waste of time. I had two clients this year so far, right? And one of them

was like, everything needs to be an experiment. And I was like, it really doesn't. Everything does not need to be an experiment. Some things we just know are gonna work. And one of them is you tweeting about this sale. Like, I don't need to test whether or not tweeting about a sale works. It works. And the more you do of it, the more sales you're gonna drive. We just have enough information about that. You don't need to test that. You can test the specific messaging, you could test the timing, but I'd

argue that that's a waste of your time and you should just give it the best shot you can. You should just take a guess that it's probably gonna land in the morning Eastern time for a New York audience and just do your best at the messaging and call it a day and actually put your experimentation brain on stuff that's legitimately hard and unique for your brand.

David J Bland (15:10.286)

It's like doing the hard things, you know. It's like, we can apply this experimentation mindset, but we do it on things that feel relatively safe, or we kind of just keep going through this loop. And I was even in a training this week where I was just really trying to stress that we have to drive to action, you know. Like, we need to test and then learn, and learn from that, but that still has to drive some kind of behavior, right? It should shape our strategy. It should shape our product, right? And I do think closing that loop sometimes feels maybe a bit scary,

and folks, they just kind of want to analyze and reanalyze and reanalyze, and that feels safe. You know, I'm just playing with data over and over again, but it's not really driving them to action. And one of the sayings that always kind of bothered me in Silicon Valley was, you can win if you learn faster than everyone else. I was like, you know, the market's not going to reward you for just learning. You have to take that learning and put it into action and create something that delights people, that they're willing to pay for, that you can also make money with.

And I think the learning mindset, while I'm definitely all for it, there are diminishing returns or running experiments on things that are relatively known.

Lex Roman (16:16.422)

Yes.

David, that's such a good take. We need, like, a set of memes about that, because I do think people get into this, like, analysis state where they're just so precious about these experiment structures and the data infrastructure that they need to evangelize to the team, that they lose sight of, like, actually, we just need to be winning as much as possible. And the thing is, I started coaching, 'cause I shifted out of tech, like, a couple years ago. So I've been working with small business owners that, honestly, can't afford to be spending so much time failing. So the thing is,

though, some of them are just right about stuff. So my line for them has been: if you're winning, if you just happen to be correct about your moves, the channels you're picking, the messages you're putting out, the audience you're targeting, great. It's when you're wrong over and over again that you need to pull back and say, okay, I'm wrong about something here. It's time to figure out what that is and run a test just there.

David J Bland (17:14.045)

Yeah, I think there's something about if you're wrong over and over and over again, then maybe you don't understand the problem enough or the customer enough and you're iterating on the solution. But we really need to start doing more discovery on the problem because you're a smart individual. We're convinced you could probably get that right. But if you don't understand the problem, you're almost like you kind of skip that step and then you're really frustrated that you're iterating, iterating on the solution that's not getting any traction.

Lex Roman (17:40.879)

Yeah, totally. And some of those things are not.

necessarily experiment-driven, right? We have this, like, mindset where it's like, okay, what's the experiment? And it's like, honestly, the fastest thing you can do there is just get some of those people on the phone and figure out why it's not landing with them, and listen deeply for that disconnect. And if you're not capable of doing that, hire someone who can, or assign that out on your team, to actually just figure out, okay, why aren't these people taking me up on this? Like, that's just gonna be faster for a lot of our teams.

David J Bland (18:11.294)

Yeah, and I'm probably somewhat to blame for some of this, because I use the term experiment so broadly, and that's where I get pushback, because they're like, well, that's not an experiment. That's just doing research, or that's an assumption test, you know, like how Teresa Torres frames it, and I'm a huge fan of her work as well. And I think why I went so broad with it is to really try to drive this point home of, look, you need to have an assumption about something that is really important where you don't have evidence, and let's anchor our experimentation there, because I'm usually working on new stuff. But I do think

Lex Roman (18:14.341)

Hahaha!

David J Bland (18:41.098)

there are side effects of broadly painting everything as an experiment, because now it's kind of permeated the language in a way. It's kind of like MVP, right? It means everything, it means nothing. It's like pivot. Experiment, I think, is teetering. Maybe we're already there. It's teetering on that moment of everything's an experiment, and really it's not.

Lex Roman (19:0.213)

Ha ha ha!

Lex Roman (19:4.913)

We're living in a post-experiment world. Yeah, I mean, I think I'm with you. 'Cause I really loved Ash Maurya's framing in Running Lean of, like, problem validation, solution validation, and sort of seeing that as a continuum, which I think is what you're speaking to, right? It's like, we're on a continuum of understanding and execution, and that's going to involve a variety of experiments, and conversations are part of that experiment cadence. But I think that,

when that's combined with the obsession with data that Silicon Valley now has, and has access to, it's gotten a little bit toxic, because now people think they can have an objective answer to every decision in their company. And the reality is that you can't, and all data comes down to is the data story that everyone on your team buys. And that's the hardest in customer interviews and in experimentation, because rarely is that data going to actually have

a clear, consensus, objective answer. And that's why people are so latched onto A-B testing, because they're like, the computer said it won. But actually, there are so many reasons why you could have screwed that up, and the computer could be wrong about that. You could have put in the wrong metric, you could have tested the wrong thing, right? Like, there are a lot of fallacies in there. And I think it's that convergence of, like, what once worked for us as a framing has now come up against this idea of data and objectivity that is false.
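The "wrong metric" fallacy Lex mentions can be sketched with hypothetical numbers; none of these figures come from the episode. The point is that which variant "wins" depends entirely on the metric you fed the test:

```python
# Hypothetical A/B test counts (illustrative placeholders only).
# "The computer said it won" is only as good as the metric you chose:
# the winner flips depending on which rate you optimize.
variant_a = {"visitors": 1000, "clicks": 300, "purchases": 20}
variant_b = {"visitors": 1000, "clicks": 240, "purchases": 30}

def rate(variant: dict, event: str) -> float:
    """Per-visitor rate for the given event count."""
    return variant[event] / variant["visitors"]

print(rate(variant_a, "clicks"), rate(variant_b, "clicks"))        # A "wins" on click rate
print(rate(variant_a, "purchases"), rate(variant_b, "purchases"))  # B "wins" on purchase rate
```

Optimizing clicks would ship variant A; optimizing purchases would ship variant B, which is why checking the why behind the what matters.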

David J Bland (20:33.793)

Yeah, well, it's like the A-B test tells you the what, but not the why. And you can make big bets on an A-B test that might be designed in a way that is flawed, or where you didn't check to see why people behaved a certain way. And so I do think balancing the what and the why, the quant and the qual, really makes a lot of sense. You know, over the years, it's just really interesting catching up with you, because, like, we were early-stage pioneers in this, you know, trying to get people to even listen to us. And now it's like...

Lex Roman (20:44.496)

Yes.

Yeah.

Lex Roman (21:1.051)

Just shouting into the abyss, please experiment!

David J Bland (21:2.581)

No, come back. It's like, come back. No, no, you listened too much. You listened to the wrong things. You know, you've been, yeah, you've been working on so many cool things over the years. And, you know, I've seen some of those kind of come across, you know, my LinkedIn feed, and I'm just, like, just a fan of you. Where are you with...

Lex Roman (21:6.897)

Bye!

Do as I say, not as I do, yeah.

David J Bland (21:22.626)

You know, you're working with journalists and you're running experiments. Maybe you can educate our listeners on what you've been up to, and then how this is, you know, continuing to shape your thinking on experimentation.

Lex Roman (21:33.265)

Yeah, absolutely. So I started my own company in 2019 doing growth consulting. So I worked with tech companies when I launched, and I did that for a couple of years. But in 2021, I pivoted into the creative entrepreneurship space, and I launched, actually, a growth membership where I taught experimentation to entrepreneurs in an extremely lightweight, pragmatic way. Right? Because, as I mentioned, if you're, like, a solo or duo shop, you can't be learning for learning's sake, as we know

some enterprise companies like to do, right? Like, you and I have experienced, inside these large conglomerates, that sometimes experimentation is for a product outcome and sometimes it's for a team-retention outcome, right? Innovation labs can often be a play for employee retention, which is great, but that doesn't matter to a two-person web design shop in Portugal, right?

I taught experimentation, actually, I have my merch here, Test, Track, Tune was our motto, because we want to test, we need to track it, but then we need to fine-tune it, we need to loop back, right? But even in that program, I experienced some really interesting lessons in, okay, translating this tech scenario to creatives who may not be experimenting with software at all.

There's a lot to unpack there. And then this past year, I've been looking at the journalism space, because a lot of journalists are launching their own subscription businesses. So it's actually really similar to the work I used to do in tech with SaaS companies, but it's non-tech, right? So the interesting thing there that I came up against right away: this year, I started a project in March with a newsroom.

I have no developers, I have no designers, I have no access to a code base, I barely have a WYSIWYG that works, I'm on, like, a really lame WordPress instance, I have no data scientists, and I have, like, barely any analytics. Now I'm running a test on their upgrade page, and I'm having to analyze data line by line. It was excruciating, and I was like, oh, this is a really good lesson in all the infrastructure that I'm missing from tech, the stuff that makes it legitimately difficult for a non-tech business to experiment.

David J Bland (23:45.446)

It's almost like, if you don't have a baseline, then how do you know if you made things better or worse? And I work with a lot of non-tech companies as well, and it's a different world than, oh, I'm just going to integrate Mixpanel, and then I'll have an answer.

Lex Roman (23:50.587)

Absolutely.

Lex Roman (23:59.589)

Yes, it's incredibly challenging. And I think the thing that's really tough with journalists, one of the things that really draws me into this space, is that journalists have really excellent content-market fit. So I'm looking at these, they're called worker-led newsrooms. So they're journalists forming worker-owned co-ops, where all of the journalists own the company.

They're not owned by, like, some random business person, because that's been failing the news industry. They're owned by the journalists, which is exciting. The journalists are having to learn all these business skills, but then they have such excellent fit with their audience. Their mission is so exciting. A lot of these journalists have built their own social profiles into the 100K-to-a-million range, right? Like, they're pretty popular.

If you have a large audience and you're launching something, there's a momentum you get from that. So when these outlets launch, you know, like there was just an outlet that launched this week in DC, they raised $275,000 from their audience pre-launch just because these journalists are awesome. That's exciting. And honestly, they didn't have to be good at marketing to do that. But what's gonna happen is that their growth is gonna slow down. That's a big launch momentum that they get, and then their growth is gonna slow

down and the thing that I'm trying to figure out is...

How much is their growth gonna slow down? How much does analytics matter? It's like, I know better analytics would help me as someone who's helping them, but could I actually offset the cost of ChartMogul, at a hundred bucks a month, by bringing in X amount of subscribers? I don't know that I can prove that yet. And that's the challenge. It's like, okay, Lex, well, what do you think we need? And I'm like, well, I'm not really sure, because you seem to be doing pretty well. I think maybe we could do better, but I need a little bit of tooling to do that. And that puts us in this little space where it's like,

Lex Roman (25:47.793)

can we just take a leap of faith that we need Plausible or ChartMogul, or I need this automation tool or Zapier or whatever, and see if I can pay it back within the quarter? But that line is thin. Like, I need to prove it pretty fast.
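The payback question Lex is wrestling with is straightforward break-even arithmetic. A sketch, where the hundred-dollar monthly tool cost is from the conversation and the subscription price is a hypothetical placeholder:

```python
import math

# How many extra paying subscribers per month must a paid tool bring in
# to cover its own cost?
tool_cost_per_month = 100.0   # e.g. an analytics tool at ~$100/month (per the episode)
subscription_price = 8.0      # hypothetical monthly price per reader subscription

extra_subs_to_break_even = math.ceil(tool_cost_per_month / subscription_price)
print(extra_subs_to_break_even)  # 13
```

If an experiment enabled by the tool can't plausibly add that many subscribers within the payback window, the "leap of faith" on tooling is hard to justify.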

David J Bland (26:5.160)

That's very interesting. It's almost as if you're taking the principles of everything you've learned and practiced over the years, and trying to cultivate almost, like, different practices for the client you have, which is not tech-savvy. Their world doesn't revolve around tech.

And so how do I apply the same principles that you've been using over and over and over again in this context and still have an outcome that helps them, you know, find their path forward and realize, hey, do we need to invest in this or more? That is deceptively hard to do, in my opinion.

Lex Roman (26:41.519)

Yeah, it's super hard to do, because, as you know, there's a lot of invisible work that goes into experimentation. And one of those things is the data analysis part, both the, like, vetting-the-data part, to your point about the baseline, like, where are we even to begin with? That's a lot of labor, especially if your data is in a bad state. And then the post-experiment data analysis. Like, I ran a lot of tests about making the conversion from a free newsletter subscriber to a paying subscriber faster,

and I wanted to track messaging there, right? Was it this email or that email? How many messages did this subscriber get before they converted and stuff like that? And that's really labor intensive, which was fully invisible to my client on this project because they're out in the field doing journalism. And that was a good lesson to me. It's very different from the Toyota project where the team is in the work with you, right?

David J Bland (27:34.602)

Yeah, we were more like part of the team, co-creating in a way with them. And I'm thinking, so if you think of some of the standards we use in tech, right? So think about pirate metrics. And for those folks listening who don't know, it's A-A-R-R-R. So "arrr", that's why you're gonna remember it, if you ever remember it. So acquisition, activation, retention, and then, depending on the business, it could be referral, revenue, or revenue, referral. Do you find that that would translate over to some of your non...

tech clients in some way or does it help at all?

Lex Roman (28:7.471)

Yes, I think it does, but I've shifted. So I do think that you can assign those metrics to nearly any business that's doing transactions. But I think that it's more helpful to frame it as a growth loop, which is Balfour's model. Have you heard of that model, David?

David J Bland (28:23.167)

Yes.

Lex Roman (28:23.833)

Yeah. So with Growth Trackers, my membership around experimentation for creatives, we use the loop model. I developed my own version, which was visibility, trust building, sales, and amplification, right? Which is just a flip on those same metrics: visibility is acquisition, trust building is activation, sales is revenue, and amplification is referral. So I think you can take that framing, and I do think the names actually really help people understand. Even with the framework, if you can rename those stages for the industry, it can make more sense to them. But I do think it can be applied.
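The renaming Lex describes is essentially a relabeling of the same funnel. A minimal sketch, using the stage names from the conversation (the metric values are hypothetical, and this is just one way to express the mapping):

```python
# Pirate-metrics stages mapped to Lex's creative-industry loop, as described above.
AARRR_TO_LOOP = {
    "acquisition": "visibility",
    "activation": "trust building",
    "revenue": "sales",
    "referral": "amplification",
}

def relabel_funnel(metrics: dict) -> dict:
    """Report the same funnel numbers under industry-friendly names."""
    return {AARRR_TO_LOOP.get(stage, stage): value for stage, value in metrics.items()}

report = relabel_funnel({"acquisition": 1200, "activation": 300, "revenue": 45, "referral": 12})
print(report)  # {'visibility': 1200, 'trust building': 300, 'sales': 45, 'amplification': 12}
```

Nothing about the measurement changes; only the vocabulary does, which is exactly why the translation helps non-tech clients without breaking comparability.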

David J Bland (29:0.810)

Yeah, I was thinking of the flip side. So ours is very quantitative, right? And the qualitative side of that was picked up from Brant Cooper and Patrick Vlaskovits: aware, hopeful, satisfied, passionate. That doesn't match up one to one, and I've had some conversations about them in the past. But the idea of that similar kind of loop: can you customize that for the industry you're working with and trying to help? But then it comes back to, even if you can, what's the tooling you need? And again, that isn't trivial to figure out.

Lex Roman (29:38.073)

It's not. I think the thing that we're hitting up against is, you know, I've now shifted into marketing. When you and I worked together I was a designer, and I was a designer for many years, but I'm now a marketer, because I work in scenarios where design is not really part of the business. In marketing, we're in this sort of challenge with attribution, where Amanda Natividad from SparkToro keeps calling it zero-click marketing.

And so with referrals in particular, word of mouth is incredibly challenging to measure. And it's even more challenging if you're not willing to pay for that much software or intelligence around it, right?

Netflix can afford consumer-level marketing intelligence about word of mouth, and they can afford to track and surveil people on a global scale, but a journalism outlet of seven people cannot. So that makes it really hard to track, okay, well, where are these people coming from? And when I worked with creatives, the answer they would get on that little form, how did you hear about us, was very often referrals, word of mouth, whatever. And a lot of people don't dig into that, slash it's difficult to dig into. We found ourselves coming up with these models of, okay, well, you heard about me from Justin, but it was actually my podcast that made you buy.

So rather than thinking of it as one flat referral channel, maybe there's visibility through a referral, but the sale is actually happening in a different channel. You've been following me on LinkedIn and that's the conversion point. Or maybe it's the other way around: you found me randomly on the internet, but then Amanda told you about me and that's what made you buy. And I think that's just not that well understood.

Lex Roman (31:18.991)

That's where experiments get hard, because you can run a word of mouth experiment, but then it can actually be pretty hard to measure.
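The two-channel view Lex is describing, one channel that made you visible and a different one that closed the sale, is essentially first-touch versus last-touch attribution. A minimal sketch, with an entirely hypothetical customer journey:

```python
# Sketch of the split described above: credit one channel for visibility
# (first touch) and another for the sale (last touch). Journey data is invented.
journey = [
    ("podcast", "2024-03-01"),   # first heard of you via a podcast referral
    ("linkedin", "2024-03-10"),  # started following on LinkedIn
    ("linkedin", "2024-04-02"),  # converted after a LinkedIn post
]

def attribute(journey):
    """Return (visibility_channel, conversion_channel) for a dated journey."""
    touches = sorted(journey, key=lambda t: t[1])  # ISO dates sort correctly as strings
    return touches[0][0], touches[-1][0]

first, last = attribute(journey)
print(first, last)  # podcast linkedin
```

The hard part Lex points to is not this arithmetic; it's that the middle touchpoints (a DM, a dinner-table mention) never show up in the log at all, so the journey you can record is always incomplete.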

David J Bland (31:26.988)

Yeah, it almost gives me hope on my LinkedIn memes where I'll get like two or three million views and then I'll get one new profile visit out of that.

Lex Roman (31:33.145)

Yes.

Yeah, okay, but that's a perfect, perfect example, right? Because I can't tell you how many growth trackers have reported that their LinkedIn posts got like almost no engagement, but then someone DM them about it who didn't engage with it. And that's a perfect example of like untrackable marketing. Like that DM had to happen and that person had to reference that post in order for you to even know that that worked. And you as an entrepreneur or as an innovator on any level, you have to investigate that because people won't always offer that to you.

David J Bland (32:7.298)

Yeah, it's just given me hope for a lot of the brand awareness work I'm doing. It's not clear, because, you know, even in my work, right, it's the book, it's the podcast, it's my LinkedIn memes, it's what I write, it's the newsletter, and it's all driving different word of mouth. It's really hard to know sometimes what's working and what isn't. And I imagine for small business owners, well, I've had a few cohorts of small business owners I've worked with this year, and I have to say it's been

Lex Roman (32:21.574)

Right.

David J Bland (32:36.784)

It's been really challenging to work with them, because, you're right, they don't have all the resources that a tech company has. I'm trying to have them go through assumptions mapping, and then when it comes to experiments, they're like, oh, but why would I run that? I feel like I already know the answer. So I've had that experience a bit with small business owners as well, and I can't say I've really cracked it yet as far as my facilitation and coaching. It's been surprisingly difficult for me switching gears. Like, oh yeah, of course I can host small business owners, and then I get in front of them and it's like, wow, this is actually quite challenging.

Lex Roman (33:7.725)

I think it's one of the hardest markets to crack, honestly. They're just so tough, and they come from such a variety of backgrounds that it's hard to solve for them, for sure. I'm really trying to move away from that space because it's hard. You're also never gonna have a big win, because, to your point about the newsletter, LinkedIn, podcasts: for small businesses, a lot of it is a cumulative effect of many things.

And it's hard for like one experiment to be like, ah, we've cracked it. Like that just doesn't happen. It's like some stuff pays off months down the road and that's great. But you're never going to put that back on me and my programs or my teachings, right? You're never going to credit me with that.

David J Bland (33:53.564)

Correct. And I've had this situation recently where we come up with an amazing experiment for a small business owner, but they're part of a franchise, and they're like, yeah, but I'm not allowed to run that experiment. And it's very much a conversation stopper, right? Where do I go forward as a coach when they just say, hey, I'm not allowed to do that? Yes, of course I'd learn, but I can't. So there are constraints on small business owners, not just resources, but legal and regulatory ones too.

Lex Roman (33:54.431)

Ha ha!

Lex Roman (34:13.179)

Dude, that's a good...

Do you remember Nathan Fielder's Starbucks test, where he changed the name Starbucks? It was Dumb Starbucks, or something like that: an adjacent coffee shop that looked like Starbucks but was different from Starbucks. They got sued, but that's what you need to do. This is where the straw brand comes back into play. Maybe as a franchise owner, you need a second brand.

David J Bland (34:43.004)

Yeah, that's so tough for small business owners, especially coming out of the pandemic, right? They were hit extra hard, and now we're telling them, you can go off and test all these assumptions. I can see where it'd be very overwhelming for them. Coming back to journalists: so you've been working with journalists, you have your program. Where do you see this going? What do you think the big assumptions you have to test going forward are?

Lex Roman (35:7.311)

Yeah, so specifically with journalists, I'm trying to focus on reader subscriptions. A lot of these worker-led newsrooms and indie journalists who are launching Substacks or Beehiivs or their own publication want to be reader funded. They want to be majority reader funded, because that offers them a lot of editorial control versus taking nonprofit-style donations or investment, or dealing with sponsorships and ads. They're doing a mix of revenue, but they generally want reader subscriptions to be the biggest piece. And so there's quite a game to crack there, in terms of, you know, you're selling a reader subscription usually under $10 a month.

That's a really small transaction amount, so consumer targeting and consumer marketing are incredibly difficult. I'm working on cracking the plays, because the upside of this space is that it's pretty repeatable once we figure out what it is, just like it is in SaaS, right? In SaaS, we know this is kind of the set of pricing pages that people use and that work, right? This is the way that checkout should work. And we know that if you remove as many fields as possible from checkout, it's better for conversion. There are just things that we know in SaaS that we don't yet know in the journalism space.

Also, I'm finding the news software space is really antiquated, so some of these publications are landing on these problematic platforms that are hindering their growth. I have a new theory, actually, about Substack hindering the growth of their publications' monetization. We know that Substack does an amazing job at free list growth, but I actually think they might be hindering the conversion to paid by the way their tooling is set up. I'm gonna write about that. Actually, it'll already be out if you're listening to this. And so I want to crack that playbook and make it so much easier for journalists to apply. I don't quite know what the shape of that will be. One possible outcome is that I advise some of these news software companies, so we just build it in and it's done for them. Another outcome is that I,

Lex Roman (37:16.851)

I create more of a swipe file set, where it's like, okay, these are your sequences, this is your pricing, right? And I'm researching and testing this across newsrooms, so that I know definitively that nine bucks a month is the starting place, or seven bucks a month is the starting place. I'm starting to form some opinions on that. I'm not really consulting with newsrooms, but I'm researching them. I have maybe 10 that I'm talking with actively right now.

David J Bland (37:44.150)

I like that you used the word playbook, because it feels like that would be a very natural outcome, or maybe an output, of what you're working on. Hey, once I crack this, I've tested it a bit, it seems like a repeatable thing that I could put out there and people can get benefit from. So I was drawn to the idea of a playbook there.

Lex Roman (38:1.999)

Yeah, yeah, because in the case studies I've been researching and writing about in my newsletter, we're starting to see things that are working, where we can just say, this worked for this newsroom, why don't you try it? Once it works for two newsrooms, now I have the beginning of a pattern that you can then apply. Those are starting to accumulate. I've been doing this for like six months on the journalism space, and I've already got a pretty good set of patterns. So I think it'll happen pretty quickly, because, you know, journalists are feeling really disheartened right now. A lot of them have been laid off. Entire publications are being sold off or shut down. And so there's this big question of, do our jobs still exist? Can I make this news outlet fly? And I want them to know that not only can they, but

their peers are actually doing exceptionally well in terms of organic growth and free-to-pay conversion, without being good at it, and while running on really lame software. So if that's happening, how much better could it be if we actually put some growth mechanics in play? Start using things like automation, sequences, auto-plugs on Twitter, things like that. There's so much more we can do together, and I wanna give that hope back to them.

David J Bland (39:12.888)

I like that. I like that in your playbook, potentially, you have the: here's what's worked for everyone else, this will probably work for you with some fine tuning. And then maybe there's a section where: hey, you're going to need to test your way through this part to customize it for you. That way, maybe circling back to our earlier conversation about people testing everything when they shouldn't, you can give some guidance: you should test here, but not over here. We see this across the industry; here are some patterns. It's more about a pattern you apply. So I like that framing.

Lex Roman (39:42.083)

Exactly, yeah, because it's like, is it two subscription tiers or three? I feel like we could just prove an answer to that. But the exact messaging, what you're gonna call those subscription tiers, that's specific to your audience in your newsroom, right? That's where the personality comes forward and those are the things that you can test.
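The "two tiers or three" question Lex says could be proven is a classic A/B comparison. A minimal sketch of how such an answer might be checked, using a standard two-proportion z-test; all the visitor and subscriber counts here are invented for illustration:

```python
from math import sqrt

# Hypothetical results: readers shown a two-tier vs a three-tier pricing page.
two_tier = {"visitors": 1000, "subscribed": 38}
three_tier = {"visitors": 1000, "subscribed": 52}

def two_proportion_z(a, b):
    """Two-proportion z-test: how likely is the conversion difference to be real?"""
    p1 = a["subscribed"] / a["visitors"]
    p2 = b["subscribed"] / b["visitors"]
    pooled = (a["subscribed"] + b["subscribed"]) / (a["visitors"] + b["visitors"])
    se = sqrt(pooled * (1 - pooled) * (1 / a["visitors"] + 1 / b["visitors"]))
    return (p2 - p1) / se

z = two_proportion_z(two_tier, three_tier)
print(round(z, 2))  # ≈ 1.51, below the 1.96 threshold, so not yet conclusive
```

With these made-up numbers the difference isn't significant yet, which is the practical catch for small newsrooms: at low traffic and sub-$10 price points, reaching a definitive answer takes a lot of visitors, and pooling evidence across newsrooms, as Lex is doing, gets there faster than any single outlet could.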

David J Bland (39:59.706)

Yeah, I love that. I love how you're thinking about it. It doesn't surprise me that you're applying all the stuff that we've done, and you've evolved it, and now you're applying it in your work today. It was really amazing catching up with you. Hey, if folks want to reach out to you, if they're listening to this and they're like, hey, I want to talk to Lex, either I'm a journalist or I'm trying to work my way through this, how would they best get in contact with you?

Lex Roman (40:23.449)

Yeah, I'm most active on LinkedIn. You can find me at LexRoman on LinkedIn. And then you can also check out my website, LexRoman.com. I have all my newsletters there. Whether you're a journalist or not, I have a couple other ones that you can join too.

David J Bland (40:35.982)

Thanks so much, Lex, for sharing all your insights and your spicy takes on experimentation. It actually got me thinking about some stuff that I need to write about as well. I'm just a huge fan of yours, so I'm just so happy that you made some time to chat with us. And just thanks so much for joining us today.

Lex Roman (40:51.333)

Back at you, David. Thanks for having me.