The Grow and Convert Marketing Show

In this episode, we debate why most companies shouldn't AB test. AB testing has become super popular over the years... Executives often want you to AB test every little thing to try to maximize the impact of all of your marketing efforts. 

Devesh explains why for most companies, it doesn't make sense to AB test. 

We share that reasoning and then we talk about what you should do instead of AB testing to increase conversions. Devesh shares some funny stories from his time running an AB testing agency. Then Devesh shares how to use a conversion rate calculator.

We hope you enjoy this episode!

What is The Grow and Convert Marketing Show?

We share our thoughts and ideas on how to grow a business.

Does it really make sense for SaaS companies to AB test?

Whatever the test idea you have, look in your Google Analytics, or whatever analytics, or in your back-end system that's showing how many free trial signups you get.

How many do you get per month?

If the conversion metric you are
using, that you are testing, the thing

you're trying to increase, free trial
signups in your example, if that is

less than 1,000 a month, I think your time is better spent elsewhere.

And if it's less than
500 a month, forget it.

If it's less than 200 a month, forget it.

Do not waste your time with it.

The answer is, stop worrying about conversion rate. It doesn't matter that that thing is not a high-converting channel.

This is a more realistic example.

And this is true for so many businesses.

It's like, each channel you should be evaluating, in the end, in dollars, in sales, in revenue. And then, what were the dollars in to create that channel?

That's it.

Your overall site conversion rate
is some blend of all of those

channels, but does it matter?

We wanted to talk about how most
companies shouldn't do A B testing.

And I think that's a completely different
narrative than what most people say.

If you want to improve conversion
rate, oftentimes someone's like,

Oh, just AB test this thing.

And through working with Devesh, who
runs a CRO agency, I've kind of learned

that more often than not, AB testing
doesn't make sense for your business.

And that is not the way that you should
think about improving conversions.

And I'll let you kind of go into the
argumentation of why, but, um, Yeah,

we kind of wanted to talk through
why A B testing just doesn't make

sense for most businesses and maybe
some ideas on what to do instead.

Yeah.

Um, I'm going to answer this very quickly.

The answer is statistical significance: the majority of businesses, the majority of pages on your site, do not get enough conversions to get a statistically significant result.

And I'm going to add another twist.

Actual statistical significance.

For those of you that have done a little bit of AB testing, you'll know that the industry standard is 95 percent plus on this little metric that the AB test calculator or tool spits out, labeled statistical significance. And when it reaches 95, it bolds it in green and it's like, your test is done.

That's actually not the only criterion for stopping a test.

And I'll show an example from an old blog post I did, showing how I got 95% plus statistical significance in like two days. I got it out of a tool, but it turns out I actually gave the tool the same URL twice, so it was actually an AA test and it was just noise in the data. So you actually have to wait long enough also.

Well, haven't there been scenarios
where then you've run the same test

again and got a different result?

AB testing is really hard.

And like the biggest thing to
make it more reliable is traffic.

Our best clients — so, the agency Benji is referring to is Growth Rock. We now only do AB testing for e-commerce companies, because we used to do it for SaaS, and I was starting to get frustrated with this exact topic: this thing where the majority of SaaS companies don't get enough conversions, demos, signups, whatever, to make a difference.

So there's someone, someone
in the organization.

See, now you got me all worked up,
which is like every other episode.

So someone in the organization is
like, some CMO is hired or whatever.

And it's like, someone's like, we need to increase our conversion rate from, you know, like, traffic to demo or whatever. It's at 2 percent and we need to get it up.

So they're like, hire a CRO firm. And in my younger days, I was like, okay. And you can hire them now. Like, big firms are ready to take $10,000 a month from you.

And they'll be like, sure.

And what are they going to test?

They're going to go to your like
demo signup page and they're going

to test all the cliche stuff you've
read about in marketing blogs.

If you've read about this, right,
they're going to put a testimonial under

the form to be like, see social proof.

They're going to reduce the number
of form fields, reduce friction.

Don't ask about the size of the company — that's why people aren't signing up or whatever; it, like, makes the form cumbersome or something like that.

What else are they going to test?

They're going to test a
few different headlines.

Like, benefit copy: they'll put, like, three benefits. So it'd be like, sign up for a demo. And then the AB test variation, the B that, you know, the account manager or whatever comes up with, will be like three check marks.
And I make fun of this.

I have done this.

I know, I know all this and I'm
passionate about it because I've done it

many times and was frustrated with it.

And it'd be like three benefits, like
saves time, like blah, blah, blah, or

like cheaper or something, you know?

And, uh, and, and they'll run it.

And the vast majority of time, if they
are doing the statistics correctly and

responsibly, it will show no difference.

And then people will get frustrated.

What about changing the button
color from red to green?

Yeah, oh god.

That, I don't do.

Because it's just so frustrating.

Yeah, you change the button color.

Um, I will later, I want to ask you a
few more questions to get the context of,

cause you were the one that approached
me and said, we should do this episode.

And I was like, hell
yes, I'm ready for it.

But, um, I will later go over
some examples of statistical

significance with a calculator.

I'll screen share, I'll read it aloud
for people listening on the podcast to

give you like exact quantitative examples
of like, at what traffic and conversion

numbers do you get more reliable data?

But I want to first explain a few more things qualitatively. So, one, just straight up: for a majority of these SaaS-type businesses, or even service businesses, which have roughly the same kind of signup numbers, we're talking about, let's say, hundreds of signups. Even 1,000 free trial signups a month can be hard to get statistically significant data from.

Let me repeat, in case you think I'm saying that incorrectly or I made a mistake — I did not. Even at 1,000 free trial signups a month, it is very dubious, whatever traffic number you get for it.

By the way, slight misconception.

People think statistical significance
is a function of traffic.

It is not.

It is a function of
conversions per variation.
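
As a rough illustration of that point (a sketch with made-up visitor and conversion counts, not numbers from the episode), here's a plain two-proportion z-test. Holding the conversion counts fixed and multiplying the traffic by ten barely changes the p-value at these low conversion rates, which is the sense in which significance is driven by conversions per variation rather than by raw traffic.

```python
# Editor's sketch (not from the episode): a two-sided two-proportion z-test,
# hand-rolled with the standard library so the numbers are easy to check.
# The visitor/conversion counts below are made up for illustration.
from math import sqrt, erfc

def two_proportion_p_value(conv_a, visitors_a, conv_b, visitors_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = abs(p_a - p_b) / se
    return erfc(z / sqrt(2))  # = 2 * (1 - Phi(z))

# Same 50 vs 70 conversions per variation, at 5,000 visitors each...
print(two_proportion_p_value(50, 5_000, 70, 5_000))    # ~0.066
# ...and at 50,000 visitors each. Ten times the traffic, almost the same p-value:
print(two_proportion_p_value(50, 50_000, 70, 50_000))  # ~0.068
```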

Oh, that's interesting.

I didn't even know that.

Yeah.

I, it took me a while to learn that.

Um, and I can show some examples with,
um, the calculator later for the,

for the nerdy part of this episode.

Um, even at 1,000 total free trial signups — which is, like, you know, for our clients, that's a lot. Most SaaS companies aren't getting anywhere near that.

And by the way, if you're a demo-based SaaS, like, good luck — are you getting a thousand demo requests a month? If so, I would like to talk to you and see what your channels are.

Right.

But even at that, it's hard, because when you do an AB test, it'll be 500 conversions per variation, and you could see a difference — we'll walk through some examples of how big of a difference you need to see — but in many cases you're not going to see a statistically significant difference if it's, you know, a 5 percent lift. It's really hard to get that.
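
To put a rough number on that (a sketch using the textbook two-proportion sample-size formula and an assumed 2% baseline conversion rate, which the episode doesn't specify), detecting a 5 percent relative lift at 95 percent confidence and 80 percent power takes on the order of hundreds of thousands of visitors, thousands of conversions, per variation:

```python
# Editor's sketch (not from the episode): the textbook sample-size formula for
# comparing two proportions, to show why a 5% relative lift is so hard to detect.
# The 2% baseline conversion rate is an assumed number for illustration.
from math import ceil

def visitors_per_variation(baseline_rate, relative_lift,
                           z_alpha=1.96, z_power=0.8416):
    """Approximate visitors needed per arm for 95% confidence, 80% power."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

n = visitors_per_variation(0.02, 0.05)   # 2% baseline, 5% relative lift
print(n)                                  # ~315,000 visitors per variation
print(round(n * 0.02))                    # ~6,300 conversions per variation
```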

And then here's the qualitative
reason behind all of this.

These things that I'm making fun of, these UX things — put three benefits, or, Benji said, change the button color — which people talk about in the marketing ecosystem or whatever: if you're amazon.com and you have millions of purchases a day, a lot of them are flippant purchases.

How many times have you gone to Amazon for something random — you're like, a razor, I need this thing, and it's like $14 — and you're just like, should I just, uh, just buy it now? You know, that's a flippant purchase.

Those little UX friction things
can make a difference, right?

Because you're just like — you just quickly do it. And you're like, ah, forget it. Like, you know, this rattan stool is arriving for our porch or something like that, and it's like $30 or something. Anyways, if you're deciding what company to reach out to for enterprise human resources software or something —

Do you think people care
what color the button is?

Like, do you know how much
work goes into that decision?

Hence our entire agency. Or, you're, like, Googling — and I have seen people online talk about our bottom-of-the-funnel technique, saying that it is too flippant, that the actual purchase of enterprise sales, or software, whatever, B2B software, is not just Googling, you know, "best IT security" or whatever. People have said, like, no, it's years of cultivating them.

That stuff doesn't matter.

It, it can look, don't get me wrong.

I have done these tests for
these companies many years ago.

There are times where you can
get a statistically significant

result, but it's hard, it's hard
to do that over and over again.

There often isn't like amazing low
hanging fruit there, especially the

tests I see people talking about online
where you're changing a headline.

You're adding three bullets.

You're reducing the form fill.

I'm sorry.

Like.

Whether or not you have an additional field asking for the number of employees or, like, industry or company size, and removing it — that's not going to make or break whether your large SaaS product grows or not. Like, you know what I mean? Um, it's not a flippant decision, which is why these little UX details don't matter as much.

So that's my qualitative explanation of why — versus in DTC e-commerce, where there are a lot more, um, numbers. Amazon is an extreme example, you know, the biggest e-commerce site in the world, but even some of the clients I have, which are, like, branded sites or whatever, you can get the kind of numbers where they're doing thousands of transactions a month. And when each variation has 5,000 transactions in it, now we're talking; now it's easier to get statistically significant data.

So that's my rant, but I'm curious
when you're saying you see people talk

about this, are there some examples or?

Yeah, well, I think — so, I come from venture-backed startups, and I think the culture in those companies is to test everything. And so oftentimes you'll hear CEOs, who maybe are not as educated on the marketing side as, let's say, you, doing AB testing as your career, that will just say, we should test this, we should change the headline and test it. Or, in regards to our own agency, we hear this a lot.

Well, in a blog post: I should test which, uh, conversion action I should use. Should I use, like, a button that says buy, or, I don't know, sign up for a free trial? Should it be in the post? Should it be a pop-up? All these different things — let's test it.

And what you're saying here is
like, largely that doesn't make

sense because the site won't
get enough traffic to do that.

And so I just kind of want to
pose the question back to you.

I guess, does it really make sense
for SaaS companies to AB test?

So you're saying... probably not, unless you're in, like, the top 50 SaaS companies where they get —

No, no.

No, no.

I can give you an even more quantitative, specific criterion. Whatever the test idea you have, look in your Google Analytics, or whatever analytics, or your back-end system that's showing how many free trial signups you get, right? How many do you get per month? If the conversion metric you are using, the thing you are testing, the thing you're trying to increase — free trial signups, in your example — if that is less than 1,000 a month, I think your time is better spent elsewhere.

Yeah.

And if it's less than 500 a month, forget it. If it's less than 200 a month, forget it. Do not waste your time with it, because of statistical significance, which I can show later.

Yes.

Uh, then I'm just wondering, like, what are all these agencies doing, charging these massive retainers to these companies?

I — they — it's the blind leading the blind.

They, they show the test data.

I've, I've seen this.

Oh my God.

Now you're getting me really worked up.

For one of my active, no, it was a
client we had like, you know, last

couple of years, not active anymore.

They hired, I don't want to name
drop this cause I don't want the

company to like sue me, but they
hired a like name brand top consulting

company, like a consulting company.

And he said, Devesh, here, here are
some old like email threads that

are active right now about tests
that are live on the system today.

And I like just got
hired and I looked at it.

And I saw an email thread where they said to my client, this test is a winner. Do you want me to create a ticket for your IT team to roll it out permanently?

And I immediately replied to that email to
just my client and said, hold the phone.

That is absolutely not a winner.

I'm going to say this part as honestly as I can, but, you know me, I have the memory of a small rodent. I don't remember the exact number, but emotionally I remember it was on the order of — something like, one of the variations had like 14 conversions and the other variation had like six. It was 14 and six. That has —

I mean, historically I've run tests like that, because, again, being in the marketer's shoes, you're told to go AB test something.

You sign up for one of those AB testing tools, and in that tool you set up the test and everything. And yeah, you get some answer like this — like, yeah, this one's a winner, it beat it out 20 to 18 — and then you just go with that option.
You're, you're saying that's not correct.

And it, and basically, if you
were to run that test again, it

could flip the other direction.

And so you're, you're
running the test again.

You, you should not have stopped.

It's not over.

Like you, the test should have kept going.

It's just not enough data.

So here — so, um, side note, I had to find this post. I Googled it. It's nice that the Google AI overview just says: according to Growth Rock, you should rarely stop an AB test before it's run for two calendar weeks. That's a good summary of my post. That's amazing — I wrote "growth rock when to stop an AB test" and there's an AI overview for it.
This is the best example I have.

Here is a test we ran
for an e commerce client.

You ready for this?

So I will read it out to people
who are listening on the podcast.

I'm showing a screenshot of AB test data from the popular testing program VWO, Visual Website Optimizer. And there are two variations: the control and the variation. The control has — we were talking about, what did I say, 14 versus six? The control has 89 conversions on 3,000 visitors. Both the control and the variation have about 3,000 visitors; at the time I took the screenshot, the variation had 2,900, basically. The control has 89 conversions, the variation 65.

If you just divide these numbers, 89 divided by 3,000 and 65 divided by 2,900, it's 2.96% for the control and 2.27% for the variation. So the control is up by 30% — that's the difference between a 2.96 percent conversion rate and 2.27 percent. That's a 30 percent higher conversion rate for one of your two variations.

And everyone who's watching the
screen share can see 95 percent

statistical significance in
green from the A B test software.

What is the catch?

This is an A A test.

There is no difference in the variation.

I just clicked run without actually
changing anything in the variation.

And that's 89 to 65.
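
For anyone who wants to sanity-check those screenshot numbers (a sketch; VWO's own significance metric may be computed differently), a plain two-proportion z-test on 89/3,000 versus 65/2,900 lands right in that flashing-green zone even though the two pages were identical:

```python
# Editor's sketch (not from the episode): re-running the numbers from that
# screenshot (89/3,000 vs 65/2,900) through a plain two-proportion z-test.
# VWO's own metric may be computed differently; this just shows that the
# same counts land right around the "looks significant" zone on pure noise.
from math import sqrt, erfc

conv_a, n_a = 89, 3_000   # control
conv_b, n_b = 65, 2_900   # "variation" (identical page, AA test)

p_a, p_b = conv_a / n_a, conv_b / n_b
pooled = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = abs(p_a - p_b) / se

p_two_sided = erfc(z / sqrt(2))
print(f"control {p_a:.2%}, variation {p_b:.2%}")           # ~2.97% vs ~2.24%
print(f"two-sided significance ~{1 - p_two_sided:.0%}")    # ~92%
print(f"one-sided significance ~{1 - p_two_sided/2:.0%}")  # ~96%
```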

And people are — I literally saw a brand-name consulting company tell my client, a multi-million-dollar-a-year revenue e-commerce brand: this test is done, it's a winner, run it. Like, implement it at 100%. And the conversions per variation were something on the order of the teens or less, I remember.

This is what people do.

So that, that, so to answer your question,
that's what people are charging them for.

They're, they're just calling tests
early and the client has no idea.

The client doesn't care
about these statistics.

They're just like, Devesh,
is this a winner or not?

They're just like, is this going to
increase my conversion rate or not?

And so they feel like job well done
and everyone's patting themselves on

the back and it's, and then you run
test after test, you implement them.

And then someone high up says, huh, we've been AB testing as a company for a long time. Why hasn't our actual conversion rate increased that much?

For those businesses then, what
would you recommend doing instead?

So, obviously these companies want
to increase their conversion rate.

A B testing doesn't
sound like a good option.

I know something historically that I've done in the past is use surveys or open-ended form questions, like throughout the checkout process or something like that, just to get feedback in terms of what people think about the page, how they're perceiving things.

And you can use patterns in those
responses to then go change website

copy or go change pricing or
different, different aspects of

whatever you're trying to improve.

Yeah, so this is a great question.

There's a lot of ways to answer this.

I want to do something kind of fun
while you're asking the question.

I had this idea.

Um, we just recorded — and I don't know in what order the videos are going to come out, but we recorded our monthly update, our last monthly update for May, earlier — and we just showed that... we felt like, basically, that part's not important. We showed that the last few months, the number of non-junk leads we were getting at Grow and Convert, reaching out to us for content marketing help, was on the order of eight, nine, and eight.

So you're a co-founder of Grow and Convert.

You're a marketer.

Let me ask you: at those numbers, eight, nine, and eight, if we said we want to increase the number of leads or the conversion rate, what would you do? 'Cause you should — we sure as hell shouldn't be AB testing anything like that. Eight, nine, and eight total — so per variation would be four, somewhere in four to five, right?

You're not going to see anything.

Well, one of them gets
four, the other gets five.

That's 20 percent increase.

Like it's nonsense.

Yeah, it's, it's too small a number.

I don't know.

I wouldn't know the difference in what we did each of those months that would have led to the —

No, but then your question comes back, and you can say, you answer it first.

I can answer it first.

I have plenty of answers.

But the question becomes if you're the
CMO of a company like this and the CEO,

let's say I'm the CEO of like Benji, like
increase the conversion rate or increase

whatever, like what, what would you do?

I mean I would look at what channels
are driving those leads and then

probably invest more in those areas.

Yes.

Okay.

So, stop. What you said is actually a dramatic change in plan, at the strategy level. Forget tactics. What you said is, I would increase traffic, and I a hundred percent agree with you.

But like, so that's number one.

The number one thing I would do
is I would realize that conversion

rate isn't what pays the bills.

I try to say this to clients — there's someone who... someone, I apologize, I should be crediting someone, but again, everybody who listens to this knows my memory. Someone said it should be called revenue optimization, because conversion rate doesn't matter.

If I told you I could get you free television ads at the Super Bowl — this is kind of a silly example — I'd get you a free Super Bowl ad.

Um, who would say no to that?

We're not even — we're not a consumer-facing business. It doesn't make sense to market our business at the Super Bowl. But if someone told me that, I'd be like, hell yeah, I'll record a Grow and Convert ad for the Super Bowl.

I would do that a hundred percent.

Oh, who wouldn't, right?

By definition, that is going to crater our conversion rates, because you're going to get 10 million people who are like, who's Grow and Convert? And they'll Google it, they'll go to Grow and Convert, and the vast majority of them don't know what content marketing is; they're not going to hire a content marketing agency.

That's an absurd example.

But even if you sold something that
a lot of people know you sell, I

don't know, accounting software, like
QuickBooks, most people know what that is.

And if you've got a free Super Bowl
ad, no one's going to say no to that.

Your conversion rate will drop to
the floor, but it's still good.

Why?

Because even if the conversion rate drops because a flood of people watching the Super Bowl came here — conversion rate does not pay anyone's bills.

It's conversions.

It's customers.

It's the total.

It's the total that matters.

So where Benji's instinct went is the
correct place, in my opinion, is that

if someone's like, we have a very low,
the total raw number of conversions

is low, and someone in the company
is talking to you about conversion

rate optimization, your first step
should be to reframe the conversation

to, we just want to get more leads.

We just want to get
more free trials total.

Who cares about the rate?

What channel is getting it?

Have we exhausted that channel?

That's the number one thing to do.

Yeah.

So, so that's like the first thing. And you could say, hey, that answer dodges the question.

Sure.

I think that's actually very important, to drive that point home, because I actually don't think many people think like that. Like, maybe marketers do, but people outside of marketing, they do think about the conversion rate, and they think that's the number that you need to move the needle on in order to drive more conversions. And it's not wrong, because your total conversions is that multiplied by traffic. Those are the two numbers: traffic times some percentage equals the total.

You know, but as, as we've done it, as
we've seen on our work, like in the SEO

conversion rate post that we have on
our site, there are certain posts that

convert at 14%, 25%, but the volume
for that keyword is, is fairly low.

So even if we were to increase that conversion rate another 10 percent somehow, it's really not going to affect the overall numbers, and so that's where your point comes in.

Yes.

And then that leads me to like
my second part of this answer.

So then you can say, okay, fine, fine.

We'll increase traffic, but still,
how do I, you know, conversion

rate is half of that formula.

Traffic times conversion rate
equals total conversions.

So that's fine.

You can say like, that's fine, Benji.

Like we'll increase the traffic part.

But.

Do you have any other
ideas to increase the rate?

The number one place I would go is channels, because — and you made a great point — what we've noticed in SEO, and the whole thesis of the Grow and Convert strategy to a large extent, is that the way to increase the conversion rate of blog content, or SEO content, written content, is not to do on-page stuff, like putting a little banner in the middle of your blog post saying request a demo or sign up for a free trial or whatever. It is, we noticed, to write more content for keywords where there is intent. So the biggest lever on conversion rate is actually to just have a larger fraction of your traffic mix come from high-intent traffic.

It's kind of like the first answer to some extent, because you could be like, well, you know, again, that's all fine and good, but you're not going to say no to the Super Bowl ad. It's going to tank your conversion rate, but you'll just say yes to it.

Those are kind of two ways to come at
the same answer, which is saying like.

Conversion rate really doesn't
matter like by itself as a metric.

It's kind of reminds me of
people who are like, guys, our

bounce rate is really high.

Like, what do we do about that?

And you're just like, what,
like, why do you care?

Are you, are you getting leads?

So that's one.

And then if you still are like, no, no, I really want to optimize my on-page stuff — my answer is going to be, and I'll get into the nuance of this, but the blanket statement is: go for big swings. Go for meaningful changes. Forget all the stereotypical, cliche stuff that you read on the internet. Forget the button colors; forget anything that you would bucket into, um, UX.

And in fact, here — this is a little bit of a shameless plug, but we're not using this podcast to sell Growth Rock, so don't — this is not about selling Growth Rock — but I will say my honest opinion, which is: at Growth Rock, the AB testing agency we created, I came up with this idea called a purpose framework, where I said every AB test has some idea behind it, its purpose, of how it's supposed to increase conversion rate. That is all geared towards e-commerce — which, you know, we can do e-commerce, but that's not the majority of our client base here; it's SaaS stuff, or B2B, let's say. Um, go for the big things that you think matter.

Benji talked about like, like
go for the headline tests.

Go for, we can talk about even ourselves.

We we've completely revamped our homepage.

We've revamped positioning and
those were the big swings for

us at times where we felt like.

Maybe our message wasn't resonating
as much or leads had fallen and like

everything else had stayed constant.

Then we're like, okay, is
there something else wrong?

Did we maybe get our positioning wrong?

And we, we changed the
messaging on our homepage.

We changed a lot of the design elements
and then we put that new version out

there and we just kind of tracked,
are we getting more leads or less?

Is traffic up? We didn't AB test any of that. We didn't do a proper AB test. We just did the old-fashioned thing, which is: make the change and look at the metrics.

Yeah.

And get feedback from people.

And we had an example of switching our whole positioning and messaging over, skewing way more towards the SEO side of things, where historically — and currently — we're more positioned as a content marketing agency that's known for really high-quality content. And we went way more onto the SEO side of things, and it really commoditized our service and made it less differentiated, making that change.

How did we realize that?

Well, we started talking to
leads and noticing that the

conversations that we were having
with those leads were not as good.

We were getting lumped into conversations with agencies that we normally don't compete with.

And a lot of that feedback from
those calls led us to think,

okay, maybe our homepage changes
actually did us a disservice.

Like, the quality of leads has gone down.

Even — even if the number may have increased, or, uh, just stayed constant, we weren't closing as many deals. And those conversations, and getting the feedback from the people that were coming in, is what led us to know that there was a problem in all these changes that we had made on our site, and so then we changed it back.

Well, the test-everything crowd would then be like, well, but you guys don't really know if your overall lead numbers per month were due to that change, because you did not isolate that.

So just to back up for people listening
to this, the whole point of AB testing.

When I say we didn't AB test it.

We just changed it.

When we say that, what do we mean?

An AB test — we probably should have said this at the beginning, but that's fine — is simultaneous. In parallel, like in time, there are two variations of a page live.

What's happening is, every time someone goes to that page, the code on the page instantly, as it's loading, picks whether they're in A or B — randomly assigns them — and then loads the HTML and CSS for that variation. And then the test software in the back end is recording how many people that were bucketed in A converted and how many people that were bucketed in B converted. But the key to an AB test — like, why is it a thing?

Why do people use it?

And why is it genuinely useful for
companies that do have the traffic for it?

By the way, like, I literally have an AB testing firm. I'm not scamming people out of money. Like, there is a use for it. I'm just saying the majority of companies, especially non-DTC-facing companies, just don't have the traffic for it to be statistically useful. But, like, it's a thing.

And why is it so useful?

Because it eliminates time flux — natural time fluctuations in conversion metrics.

So you take an e commerce example.

If you have a flower shop and you don't do AB testing, and right in February, 14 days before Valentine's Day, you make a site change, and then you're like, this month's sales are crazy compared to last month — you're obviously an idiot, right? Because, like, it's just Valentine's Day.

The whole point of AB testing is — regardless — that's an extreme example, but there are very much more common examples. There is natural, like, um, seasonality. We worked with a swimsuit shop. You can imagine March through June is massive, and actually, like, Black Friday, Cyber Monday, November and all, there was like no traffic, because nobody buys swimsuits in the winter and fall.

Right?

We, we have a paid ad client right
now that sells boat parts and

this is like peak season or it
was a few months ago or whatever.

There are also things like your ads going up and down, coupons, you do an influencer campaign — all this stuff causes sales to go up and down wildly. And while that's happening, you can run these two variations simultaneously, knowing that it is happening to both variations, because every day people are bucketed into both.

That's the whole point.
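
Here's a toy simulation of that bucketing mechanic (a sketch with a made-up 1% conversion rate, not client data): both buckets share the identical underlying rate, yet single runs routinely produce the kind of lopsided splits discussed above.

```python
# Editor's sketch (not from the episode): a toy simulation of the bucketing
# mechanic. Every visitor is randomly assigned to A or B and conversions are
# tallied per bucket. Both buckets share the same true 1% conversion rate
# (a made-up number), so any gap you see is pure noise.
import random

def run_aa_test(visitors=800, true_rate=0.01, seed=None):
    rng = random.Random(seed)
    conversions = {"A": 0, "B": 0}
    for _ in range(visitors):
        bucket = rng.choice(["A", "B"])        # the random assignment step
        if rng.random() < true_rate:           # same underlying rate for both
            conversions[bucket] += 1
    return conversions

for seed in range(5):
    print(run_aa_test(seed=seed))
# Example output (will vary): {'A': 3, 'B': 7}, {'A': 5, 'B': 4}, ...
# identical pages, yet single runs can easily look like one variation
# "doubled" the other.
```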

So the AB test everything crowd would
fight back against what Benji said

that we did on our side and say, Okay.

Sure.

You guys took these big swings and changed
your headline and positioning of the

agency as a content like high quality
content agency versus an SEO agency.

And you made these conclusions,
but you don't really know

because you didn't AB test.

So these natural, like, you know,
economy VC funding, drying up like

inflation, overall investment, those
things could have affected stuff.

Yeah.

My answer is you're right.

Like it could have, but what else
are we going to do when we're

getting eight to 10 leads a month?

There was no option to AB test.

This is the only thing you can do.

If you try to AB test, then you're just
risking reading into nonsense noise.

Whereas at least in the
method Benji described.

You're using qualitative feedback
to at least give you something

reasonable, like a reasonable data
point instead of seven versus three.

It's just nonsense.

And you risk saying this variation won because, that random month, four more people who already were coming in because they wanted to convert happened to be bucketed into this variation. And then that variation looks like it won — it has seven versus three. It's like, seven versus three: that's double, you know, that's, like, a hundred percent increase.

It's just nonsense.

So what Benji is saying
is use qualitative stuff.

You say, step back.

What are the big things that
affect people requesting a demo

for our enterprise HR software?

Why do people choose us?

Why should they choose us versus
workday versus Oracle, right?

You need to have that conversation, so that the B that you're testing — because, you know, you don't have raw statistics; you're not Amazon, sorry, you can't get that statistical significance — but at least then you have something else, where you're like, I'm testing something: I'm testing a new headline and a new homepage angle where we emphasize the savings for your employees instead of the cost benefit. Just pick a new benefit to position around, you know — that's a substantive thing.

And even better is what Benji
said, you got that information from

listening to client calls or whatever.

And like, you've got it
from the customer's mouth.

So you need to replace what the raw statistics could do, if you just had more numbers, with meaty, substantive, qualitative things that you know are likely to make a difference.

Then you measure and do multiple months.

Don't risk the Valentine's day example.

Be like, what, what, what was it
last month versus when we released?

What was it last quarter
versus when we released?

What about the last year?

Like try to give yourself a bunch of
numbers to be like, can I convince myself?

That this is higher.

And if you can't, and if it all just
kind of looks the same, don't feel bad.

That means like, it didn't
really make a difference.

Then go bark up another tree, right?

Then try to increase traffic and all.

'Cause some people can come back and be like, well, see, it's just a wash, because in the quarterly and yearly data, you know, you didn't see a difference, whereas an AB test could have. No — we concluded from the beginning you didn't have the numbers to AB test anyway with statistical significance. So just leave it; just stop thinking about that.

So then if you make a big change, what
you think is a big change, what do

you think is a meaningful change with
qualitative data backing you up and you

still cannot convince yourself in the
numbers that it's made a difference, then

it hasn't made a difference or it hasn't
made enough of a difference to matter.

Then just pick the one that you
guys feel like makes the most

sense from a brand perspective.

And then try to bark up another
tree to make a difference.

And that goes back to the first point
that you and I both agreed on and

made, which is in the end, it doesn't,
the conversion rate doesn't matter.

It's the total conversions.

No, I just think that's the biggest thing
that stood out to me from everything

that you said that I resonated with is
people are asking the wrong question.

They're, they're asking the question,
how to increase conversion rate , to

increase revenue when they should
just be asking the broader question.

How should I increase revenue?

And I think you'll come to different conclusions asking "how do I increase revenue" than you will on the conversion rate side.

But so many people are so obsessed
with conversion rate as the way,

as the number one lever to pull to
increase the revenue side of things.

And I think it's largely just a function of AB testing maybe coming from the e-commerce world, where there are instant transactions like that, and that's a lever you can pull in that space. And now it's gotten automatically applied to all these different industries: oh, SaaS — yeah, you're checking out online, so you should use the same kind of thinking to affect change there as in e-commerce. And it's just not the same.

Yeah.

It reminds me of the story.

We had an AB testing client last year.

It was a consumer electronics brand — like audio speakers, that kind of thing. And, um, it was a brand that a lot of people in that space would recognize; if you're into audio, you'd recognize some of those brands. They got a bunch of traffic to their site because people have some of that equipment from like the seventies — you know, people who are really into music, they've got some old-school thing from the eighties and seventies. So they got a bunch of traffic to their product pages, looking for manuals of decades-old products.

There's — there's nothing wrong with that, like, um, but the leadership was like, our conversion... They literally — this is not a joke — the leadership reached out to some large e-commerce platforms and said, what are your average conversion rates in consumer electronics? And that e-commerce platform gave them a number, and they were like, we are under — below — we're below industry averages, so you guys need to do a better job.

We need to increase our conversion
rate to the industry averages.

And I was like, sorry.

You guys are one of the
biggest brands in the industry.

That's been around for decades.

People come to look up the manuals of their old products from 1980 that still work, because they're like classic collectibles.

There is nothing wrong with that traffic,
but that traffic is not going to buy.

They're just looking for the
manual of their old product.

That's good.

You have this amazing brand where
people keep products from like the 80s.

They like sell it on like
Craigslist and whatever.

And like, it's like imagine like Rolex
probably, I don't know, but I'm guessing

Rolex has some archive pages of just
like their old models or whatever.

And they get a bunch of people going in
there because they're like trading watches

on these like watch forums and all.

I, I got into watches for like
a month during COVID cause I had

nothing else to do or whatever.

And then I was like, I'm way
too cheap to buy this stuff.

But anyway, so like I know them,
like there's nothing wrong with that.

That traffic will not buy, or at least is not going to buy at a high rate, because they're not looking to buy a new Rolex on rolex.com.

But then some leadership person
is like, we're under, under the

industry standard for jewelry.

And you're like, homie, like,
you're like one of the best brands.

It's okay.

Well, it kind of goes back to our point, even on pain point SEO, and why we have the strategy that we do: not all traffic is created equal.

If someone's coming to your site to do research, they don't have buying intent.
And so traffic from those
queries or various sources might

have a lower conversion rate.

And that doesn't necessarily
mean those sources are bad or

those pages on your site are bad.

It's just a function of what that person
is trying to achieve on your site is very

different from someone coming in with
the mindset that I'm going to purchase.

Yeah.

Here's another example
that actually is relevant.

Um, ad — uh, ad — oh man, you found quite the topic to get me riled up.

I was having a really peaceful
day and then we recorded this.

Now I'm getting so mad.

That's, like, um, ad channels — and I'm going to say it for e-commerce, but it applies to SaaS too. Like, um, we had someone ask this — another client; this came up where they're mad that the raw conversion rate number is this percent and they want it to be at that percent. And I was so mad, because they were basically, in a way, kind of ripping on my work from the testing agency, and I was like, okay, then stop all display ads, because display ads for this brand are so cheap.

I don't know if anyone's looked at this. If you look at display — meaning you're buying ads that are, like, literally, you know, the banner ads on the side of news articles; the New York Times, at the very bottom, you're seeing images show up. And yeah, it's like fractions of a cent on, like, per-thousand-impression CPMs, and it's just — it's so cheap compared to, you know, a Google ad for blah, blah, blah software that'll be like $12 a click or some absurd amount. Search ads, right.

Or even, like, LinkedIn — you know, these brands will do LinkedIn ads; it'll be $13 a click. So the display ads bring a ton of traffic, because for the same amount — you put $2,000 into display versus $2,000 into Google — Google will give you a hundred visitors, and display, for that, could give you 20,000 visitors.

And I said, shut off the display ads. There's 20,000 visitors; it converts at, like, some tiny fraction, because who's reading random news articles and is like, yeah, I'm going to buy this right now?

And my point of contact said, Devesh,
I'm not going to shut those off.

He was on my side, by the way.

He was also trying to
push back on leadership.

And he said, I'm not going to shut those
off, because even though the conversion

rate is tiny, they're so cheap that I'm
getting positive return on ad spend.

And I was like, this is our case in point.

Like this is an extremely low converting
ad channel when you look at conversion

rate, but he literally is like dollars in
and dollars out still makes sense for me.

I'm not going to turn it off.

The answer is: stop worrying about conversion rate. It doesn't matter that that thing is not a high-converting channel. This is true — this is a more realistic example, and this is true for so many businesses. It's like, each channel you should be evaluating, in the end, in dollars, in sales, in revenue. And then, what were the dollars in to create that channel? That's it. Your overall site conversion rate is some blend of all of those channels, but does it matter?
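
As a back-of-the-envelope version of that argument (all numbers made up, loosely in the spirit of the display-ads story), here's the dollars-in, dollars-out comparison: the channel with by far the worst conversion rate can still have the best return on ad spend.

```python
# Editor's sketch with made-up numbers, in the spirit of the display-ads story:
# judge each channel on dollars out per dollar in, not on its conversion rate.
channels = {
    #                 spend, visitors, conversion rate, revenue per conversion
    "search ads":   dict(spend=2_000, visitors=160,    cr=0.05,   rev_per_conv=400),
    "display ads":  dict(spend=2_000, visitors=20_000, cr=0.0015, rev_per_conv=400),
}

for name, c in channels.items():
    conversions = c["visitors"] * c["cr"]
    revenue = conversions * c["rev_per_conv"]
    roas = revenue / c["spend"]
    print(f"{name:12s} cr={c['cr']:.2%}  conversions={conversions:.0f}  "
          f"revenue=${revenue:,.0f}  ROAS={roas:.1f}x")

# search ads   cr=5.00%  conversions=8   revenue=$3,200   ROAS=1.6x
# display ads  cr=0.15%  conversions=30  revenue=$12,000  ROAS=6.0x
```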

And that goes back to that
Super Bowl ad example, which

was an extreme version of this.

But that's an example. You know, no one is going to say no to the Super Bowl ad, but it's going to tank your conversion rate. It doesn't matter that it tanks your conversion rate.

Right.

And so, yeah, so anyway, so that,
that, that's a classic example.

I don't know if we want to get into — me, I think, just do three to five minutes of this screen share to show this tool, because now we've given all the argumentation.

I would love for you to share the data, because I don't even know this stuff.

Yeah, so on conversion rate — go to any conversion rate calculator. I happen to use this one, which I probably shouldn't have shared, but it's just the one I use, like, every week. Anyways, it's a little bit more obtuse, but if you just type "AB test calculator," there are others that'll do it in a little bit cleaner way. There are really just a couple things you want to do. And I'm saying, in this scenario, before you run the test, go to these calculators and put in realistic numbers, just using your GA.
A.

So let's go back to a concrete example.

We were talking at the beginning of
like some SaaS company that's using

free trial signups as a as a thing.

OK, so let's actually start blank, and I'll walk through what all of these numbers mean and which ones you have to care about. OK, let's start blank. The first thing you look at is the metric you are trying to increase — in that example scenario, free trial signups. So look in your GA for the past few months: what is your average per month?

Why do I do it per month, by the way? Because — I don't want to get into this level of detail, but just estimate that you're probably going to want to run a test for, like, four weeks. It could be two; it could be longer, whatever. Just assume a month. It's easy.

So look at the last six months of data. Let's say it's 500 conversions per variation — that's a lot, 500 free trials. Maybe it's a hundred. If it's a hundred, put 50 and 50 in the conversions for A and conversions for B — just split it evenly for now.

Then think about the page on which you want to, um, make a change. So if it's, like, the page on which they sign up for a free trial — a very common thing leadership will tell you is, oh, we want to increase free trial conversion rate — then it's that page where they literally, like, slash-signup or whatever. You want to test that. Look at how many unique visitors — not page views, unique visitors — you get per month on average. Users, I think, in the new GA terminology, right? 10,000 — let's say you're getting 10,000 total. So you would actually put in each one 5,000 each.
Okay.

Now, if we run this, obviously there is no difference in conversion rate, but let me just use that to illustrate, because we literally put 5,000 visitors in A with 50 conversions and 5,000 visitors in B with 50 conversions. So that is a 1 percent conversion rate, 50 divided by 5,000, on each side, and the relative uplift is zero. We put it in that way. Out of all these other numbers you see at the bottom — sorry to the podcast folks — the P value is the only number you really need to care about for now. In general, that's one minus the statistical significance.

So a P value of 0.5 means 50 percent stat sig. Use the industry standard: aim for 95 — hell, just 90. Here is what this number means, um, in English, qualitatively. So — this is a bad example 'cause it's 0.5, so one minus it is also 0.5 — but anyway, it means 50 percent of the time you can get a result like this from an AA test. That's what that means.

So if we change this from 50 each to some drastic difference — 80 and 20, still adds up to a hundred — okay? Now one of them, the one that has 80 conversions, converts at 1.6%. The other one, at 20 conversions, converts at 0.4%. That's a 75% decrease, and it's showing a P value of essentially zero — it shows four digits here, so 0.0001 — which means, like, 99.99-plus percent significant. That means basically a tiny fraction of the time, when you run an AA test, due to natural variation — if it has just a totally natural bell curve — less than a fraction of a percent of the time would you get this just by chance.

Which means that's a statistically
significant result, right?

And I did it with A having more, so you can get this thing to light up. If you do it with B having more, it says, like, yeah, green, your B variation increased stuff by 300 percent or whatever.

Let's make it a little bit more of a difference. Let's do 40 and 60 — a difference of 20 raw conversions between the two. That's a 50 percent increase, 0.8% to 1.2%. And that P value now says 0.02. You take one minus that: it means 98 percent statistical significance. What that means is, this 0.02 means that 2 percent of the time, if you run an AA test, you could get a result like this at 5,000 visitors per variation. The industry standard is 95, meaning 5 percent of the time, a P value of 0.05.

So this is also really good.

This would be a valid test result.
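
For readers following along without the screen share, here's a rough reconstruction of those three scenarios (a sketch using a plain two-proportion z-test; the exact values from whatever online calculator is being shown will differ slightly):

```python
# Editor's sketch (not the calculator from the screen share): the same three
# scenarios, 5,000 visitors per variation, run through a plain two-proportion
# z-test. The exact values will differ a little from any given online tool.
from math import sqrt, erfc

def p_value(conv_a, conv_b, visitors_per_arm=5_000):
    p_a, p_b = conv_a / visitors_per_arm, conv_b / visitors_per_arm
    pooled = (conv_a + conv_b) / (2 * visitors_per_arm)
    se = sqrt(pooled * (1 - pooled) * 2 / visitors_per_arm)
    z = abs(p_a - p_b) / se
    return erfc(z / sqrt(2))  # two-sided

print(p_value(50, 50))  # 1.0     -> no difference, 0% "significance"
print(p_value(80, 20))  # ~2e-9   -> 99.99...% "significance"
print(p_value(40, 60))  # ~0.044  -> roughly 95-98% depending on one- vs two-sided
```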

Again, keeping in mind the Growth
Rock article of you can get 95

percent plus even on an AA test if
you just don't run it for long enough.

And this is the frustration — it's a frustrating thing about, um, AB testing. You're probably going to ask, well, then why is 95 percent even relevant?

No, I was going to ask — you said at the very beginning, I wouldn't even do AB testing if I got less than a thousand conversions a month. So here you're saying you can get a proper P value in this test even with a hundred conversions.

So yeah, you can get a proper P value, but the reason I say a thousand is because of results like this. This shows 95%, but I just know that a difference of — what's 89 minus 65? — something on the order of 24, it's just 24 raw conversions. This one is 20 raw conversions. Random chance can affect that. But at a thousand, if we said 400 versus 600 and you're like, I ran this for three weeks and there are 200 extra conversions on B — the chances of that being random are low.

Got it.

Okay.

So it's just the pure number
is so low that there's a lot of

probability that random chance
is playing into that difference.

Yeah, and that's why this article
is called the three or four

safeguards to stopping an A B test.

And the thesis of the article
is people think once you hit

95 percent just hit stop.

Many people — not just me, by the way — many people in the AB test, CRO world have written similar articles. This is my version of it; many people have said this.

You can't just stop based on 95 percent, because of this exact example that I showed: you can get this at tiny numbers, because all the 95 percent is showing, or doing, is saying, at these exact numbers, what percentage of the time an AA test would produce the same thing.

Like real life.

It doesn't work like that.

You need to run, like plan to run it
for three weeks, plan to run it for at

least two weeks, three weeks, four weeks.

And then try to get hundreds of
conversions per variation because then

the likelihood that, you know, you're
seeing just nonsense when there's a

difference of 200 raw conversions is low.

Like, I hate to say it, but use your gut instinct. If the difference is 15 — even if the calculator says 95%, if it's 15 raw conversions — you just know by instinct that could change next month. Like, I'm just looking at my stats: I know that 15 just comes and goes. And so, like, forget it. Like our own thing, right?

Like, you know, let's say we make this a thousand each on our, like, you know, lead form — or let's say overall site traffic, 10,000 each; we get a decent amount of site traffic — and what was it, eight? So one of them gets two and the other one gets six or whatever. That's a 200 percent increase, and it's on the order of, like, 93, 92 percent stat sig. But you and I just know, just in our gut, that it's just four leads — next month, that could change.

So just use a general rule of: expect hundreds each — try to get like 200 each minimum, you know. And don't worry about stats — like, yes, worry about stat sig, but don't let that be the only thing. If you're less than, like, a couple hundred conversions per variation, be very, very dubious, or you need to see huge changes to be convinced.

And at that point you could have just,
you could have just launched the variation

and probably seen it in your GA data.

That's it.

Yeah.

So then, so then to summarize here,
the issue with this is just, you're

leaving a lot up to random chance.

And so the agencies that are doing
this, yeah, you might come up with

the statistical significance that says
this is the winning test, but when you

actually implement it from a revenue
standpoint, it's not really going to

move the needle because there's a lot of
just variability in the data essentially.

Yeah.

Yeah.

All that that AB test number, the 95 percent or whatever, means is: if there is just this natural Gaussian curve — I'm trying not to be too analytical — but if there's just a natural bell curve distribution between two sets of data, what's the percent overlap that they'll have, or whatever. But real life doesn't work like that.

Like, random stuff could happen in your ads; your competitor could be running a sale, or got a bunch of good press, and you get barely any data; and weird things could happen where the numbers just fluctuate.

Use common sense and gut instinct, and then use a general rule of, like, hundreds of conversions per variation to feel more confident — in addition to, yes, looking for 95 percent statistical significance. But if the actual raw numbers are low, be very, very skeptical of it being reliable.

If you like this video, don't forget to subscribe. You can also get the audio-only versions of these shows wherever you get your podcasts, and you can follow us at growandconvert.com/newsletter for articles and updates on when these videos come out.