CRO.CAFE English

For this episode we have Shiva Manjunath from Speero dropping some (painful?) CRO truthbombs for the new year...

Creators & Guests

Host
Guido X Jansen
As a cognitive psychologist and award-winning CRO specialist, Guido has worked with global e-commerce companies such as eBay, Heineken, Sara Lee, ING, Randstad, Sanoma and Jacobs Douwe Egberts to turn their data and insights into continuous business growth.
Guest
Shiva Manjunath
Senior Strategist at Speero

What is CRO.CAFE English?

Tune in as cognitive psychologist and award-winning CRO specialist Guido X Jansen grills his expert guests on experimentation, user research, digital marketing and everything in between.

Funny. Real. Raw. And never boring. Every episode of the CRO.CAFE is filled with knowledge bombs that will inspire and motivate you, as well as give you practical tools to help you up your CRO game.

Join 14K+ subscribers for the world's most exciting, raw discussions on experimentation, CRO, user research and digital marketing from the industry leaders who live and breathe it.

Start listening right now with 100+ episodes on tap!

[00:00:00] Guido: So, first question I have for you: a few quick CRO dilemmas. Let's see if you can give a quick answer, and then we can discuss afterwards. Okay, so the first one: do you think we should optimize every page for every customer, or pick specific areas and improve the shit out of those?

[00:00:25] Shiva: Every page or specific pages? I'd probably go specific. There's more ROI you can get from specific. There's a lot more you can do with limited resources if you focus your efforts; the ROI tends to be a lot higher. I mean, CRO people, we literally never get any support from anyone, ever. So when you have limited resources, instead of trying to fix the whole website, I'd say focus on the things you can actually optimize — the ones getting higher traffic, the things that really make a difference. Focus on those rather than trying to move the whole mountain.

[00:00:59] Guido: Yeah. Second one: should we call ourselves CRO people, the CRO industry, or should we use another term?

[00:01:08] Shiva: Well, we're Chief Revenue Officers, right? Yep. I hate "CRO". I think experimentation is a good word for it. I know everyone debates this — with CRO, we know what we do — but I just like "experimentation" better.

[00:01:24] Guido: Bayesian or frequentist?

[00:01:26] Shiva: I feel like this is a common one. I tend to prefer Bayesian, but each one has its pros and cons. Neither of them is wrong, but I tend to go Bayesian as much as I can. Yeah. Nice.
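[Editor's note: for readers who want the distinction made concrete, here's a minimal sketch of the two readouts on the same made-up A/B numbers — nothing here is from the episode or from Speero's tooling. The frequentist test asks how surprising the data is if there's no real difference; the Bayesian readout asks how likely it is that the variant beats control, given the data.]

```python
import numpy as np
from scipy import stats

# Made-up example data: visitors and conversions per variant
a_n, a_conv = 10_000, 520   # control
b_n, b_conv = 10_000, 575   # variant

# Frequentist: two-sided two-proportion z-test (pooled standard error)
p_pool = (a_conv + b_conv) / (a_n + b_n)
se = np.sqrt(p_pool * (1 - p_pool) * (1 / a_n + 1 / b_n))
z = (b_conv / b_n - a_conv / a_n) / se
p_value = 2 * (1 - stats.norm.cdf(abs(z)))

# Bayesian: Beta(1, 1) priors, sample both posteriors, and estimate
# the probability that the variant's true rate beats control's
rng = np.random.default_rng(42)
post_a = rng.beta(1 + a_conv, 1 + a_n - a_conv, 100_000)
post_b = rng.beta(1 + b_conv, 1 + b_n - b_conv, 100_000)

print(f"frequentist p-value: {p_value:.3f}")
print(f"Bayesian P(B > A):   {(post_b > post_a).mean():.3f}")
```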

[00:01:38] Guido: So let's continue, now that we know the most important parts of you: what do you actually do on a day-to-day basis?

Let's start with that.

[00:01:49] Shiva: Yeah. So I'm a senior strategist at Speero, and effectively what I do is run experimentation programs for clients. I coordinate experiments, hypotheses, research, tests, stuff like that. Everything experimentation kind of runs through me.

But one of the things I've been geeking out on a little more is the experimentation maturity audit work I've been doing. That's effectively about not just up-leveling experiments, but building better experimentation programs. That's something I'm pretty passionate about. I've built experimentation programs at a couple of companies; it's something I really enjoy doing, and it's something I think a lot of people tend to get wrong. They fixate on the experiments, but not necessarily on the process and the program — building the experimentation program and getting better at supporting the individual experiments and the research behind them.

[00:02:44] Guido: Yeah, I think I've seen Ben posting about those things a bit too. So what do you guys use as determining factors for the maturity?

[00:02:53] Shiva: Determining factors? I mean, there are a lot of inputs that go into it. We start with a general survey, asking the people involved in experimentation how they run things. And the audit piece is basically trying to understand what their current process is and how we can utilize their current resources to get better. But it's also taking a look at the landscape — you know, maybe they need more resources. We have some pillars: strategy and culture; people and process — sorry, people and skills; data; and then process and governance.

And there are different things you can do to up-level each of those. But take one of the clients I'm working with: they maybe didn't have enough staffing, but staffing wasn't even their biggest problem. It was process. The only experiments they were running were feature validation — literally a checkbox to make sure a feature wasn't going to break the site. And that's a good use case for experimentation; it's not a bad check, and you should do it for sure. But that was the only thing they were using experimentation for, rather than proactive feature exploration and identifying new features to build.

Right.

[00:04:11] Guido: Yeah, exactly. Now, I forgot to ask: how did you get started with this? I mean, a lot of people don't start out studying CRO or something specific like that. So how did you get started?

[00:04:30] Shiva: I tried to become a doctor. I just didn't like medicine. I feel like I've said this a lot. No offense to doctors — I will keep on saying this: doctors, nurses, everyone in the medical field is doing God's work. It's just so boring to me; I'm not passionate about it. So I went to school, undergrad, for four years trying to become a doctor. Didn't like it, wasn't passionate about it, didn't pursue it any further. I started working with —

[00:04:58] Guido: But there are a lot of experiments you can do with medicine now.

[00:05:01] Shiva: Oh, sure. But there's blood, and there are just disgusting things you have to deal with. There's code that you can look at, and sometimes code can look disgusting, but blood — you physically hold it. Yeah, not for me. So there's a different filter to it. Yeah.

So instead of dealing with that, I felt digital marketing was something interesting to me. Everything techy — I love computers and learning about tech stuff; I always have the latest tech stuff. So that started me on: maybe I should do something there.

And then I had an opportunity to work at a digital marketing agency that focused a bit on UX research and design, and that's where I was like, this is kind of interesting. One of the things that stood out to me: I read the book Don't Make Me Think by Steve Krug, and God, it ironically made me think about how a lot of websites are so poorly designed that you have to think. There's so much friction just getting from point A to point B, and I was like, why is everything so complicated? That got the gears turning: okay, there are a lot of UX things that are really interesting to me, things I think we could do better on the web.

And then, lo and behold, experimentation started coming up and becoming more and more prominent, especially with tools allowing people to run experiments. So I was like, oh, I could run tests to validate or invalidate that. That's awesome. And then there's the tech side of it, working on websites.

I think the one thing that stands out to me is the psychology. I don't know if you're as interested in it as I am, but God damn, the psychology in the way people use websites is just so interesting to me. You see people who — go ahead.

[00:06:54] Guido: So I studied psychology, but during my studies there was nothing about online, unfortunately. So everything I learned about psychology online came from reading books and from working in the field myself, getting that experience — nothing from university. I'm not sure if that's been fixed by now, whether they use online data. But yeah, go on.

[00:07:25] Shiva: The funniest thing about the psychology of user experience is that it's almost like a cat and — maybe not cat and mouse. You get to just, like — yeah. What other tool allows you to literally watch someone use your website? It's creepy as fuck, but it's also awesome data.

[00:07:46] Guido: Yeah. Well, and the amount of data, right? I mean, you can get user data with whatever tool — if you design a phone or a ticketing machine, you can get user data on people using your system — but it's not all users. That's the amazing thing, of course, about going online. If you look at the sample sizes used in studies at university, they don't even start to compare with what we have online, the goldmine we have here. And think about studies in the more neurology-oriented areas: you had like eight participants.

[00:08:23] Shiva: I mean, even think about offline mailers. You get something in the mail — you can't A/B test that. And what kind of data are you actually getting? What's the open rate, the person opening the letter? You don't even know if it got delivered, right? Versus, to your point, the data you have online is so freaking robust. There's so much you can geek out on. Sometimes it's too much. If you go into a session recording or heat map tool and you just watch session recordings for four hours without any reason to be looking, you've probably wasted your time, because you're just watching people use your site without any reason. Sometimes there's too much data. But too much data, with the ability to filter down to what you need, is far better than... eight people.
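[Editor's note: a rough sense of why those sample sizes matter. The sketch below is the standard two-proportion power calculation, not anything discussed in the episode; the numbers are illustrative.]

```python
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_per_variant(base_rate, rel_lift, alpha=0.05, power=0.8):
    """Approximate visitors per variant for a two-sided two-proportion test."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# Detecting a 10% relative lift on a 5% baseline conversion rate needs
# roughly 31,000 visitors per variant -- orders of magnitude beyond an
# eight-participant lab study.
print(sample_size_per_variant(0.05, 0.10))
```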

[00:09:10] Guido: Yeah. But you do get into the product side with online — that's definitely a real thing. Okay, so we have so many tools we can use. Is tool use something that's part of that maturity model? What people are using, or what they're able to utilize for their program?

[00:09:36] Shiva: Yep. I mean, I think tools and research are two lockstep things: you need tools to help you research. Everything's so interconnected, but you definitely need tools to capture data, and you need tools to actually conduct research. So sometimes it can be binary: do you even have the tools to capture it? But to that data point: if you're simply tracking macro-level conversions and AOV, but you're not tracking actual interaction with things on the website — micro conversions — you're not understanding the behavior. And a lot of people who run experiments do exactly that: fixate on one metric and go, we ran a test, did it improve conversion rate, yes or no — without understanding why, without using another tool to corroborate what happened. You're running into those limitations. And to your point, if you don't have all these tools, you're limiting yourself. That can be an easy recommendation to improve your maturity: get better tools, or get a tool that does something you're missing. But having tools doesn't make you good. Just because you have an experimentation tool doesn't mean you're going to be at a top-tier maturity level, because if you have a tool but you're not running tests, or you have a tool but you're running crappy tests, that's almost as bad as not having the tool at all — you could be hurting your users and your user experience.
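[Editor's note: a toy illustration of the macro-vs-micro point — one hypothetical readout where the primary metric looks flat but micro conversions show where behavior actually changed. All event names and counts are invented.]

```python
# Hypothetical experiment readout. The macro metric alone says "flat";
# the micro conversions are what explain what users actually did.
visitors = {"control": 20_000, "variant": 20_000}
events = {
    "purchase (macro)":        {"control": 1_020, "variant": 1_034},
    "add_to_cart (micro)":     {"control": 2_400, "variant": 2_950},
    "size_guide_open (micro)": {"control":   310, "variant":   820},
}

for name, counts in events.items():
    cr_c = counts["control"] / visitors["control"]
    cr_v = counts["variant"] / visitors["variant"]
    lift = (cr_v - cr_c) / cr_c
    print(f"{name:24s} {cr_c:6.2%} -> {cr_v:6.2%}  ({lift:+6.1%})")
```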

[00:11:09] Guido: Yeah. And I have a question that's somewhat related. There's a lot of BS going on with how people are using the tools — like you said, focusing on experimentation but not having a decent program behind it. And you can imagine, especially working at an agency, you've seen many of those programs running, or not running. The question is from Tracy Lao: how do you maintain a no-BS style? She suggested I ask you this, and apparently you're known, at least to her, for having this no-BS style. And that's definitely something I recognize in the things I see you posting on LinkedIn.

[00:12:14] Shiva: I'll take it. I'll take it, because there are so many snake-oil salesmen in experimentation — a lot of people who will say, we'll guarantee you X amount of dollars, or we'll guarantee you X amount of lift in conversion rate. And I think that's just such a terrible way to go: you're fixating on the wrong KPIs, and you're going to box yourself into doing stuff that isn't necessarily right for the user experience or for creating something worthwhile for customer lifetime value. But specifically, your question was — what was it, how do you maintain a no-BS lifestyle? Or, yeah —

[00:12:57] Guido: Style. Well, not necessarily lifestyle — though if you wish to go there, that's fine too — but how do you maintain a no-BS style in those posts, and when communicating with clients?

[00:13:09] Shiva: So I think for me it's just about transparency. A lot of times CRO people tend to treat it like: this is my baby, this is my thing, no one else gets to influence it because it's mine. And that's such a crappy way to build trust. Because if you're just saying, these are the tests, this is my algorithm for prioritization, and people ask, can I learn about it? — and you go, no, leave me alone — that's not going to build any kind of trust that you're doing anything good, especially if you're running tests that are losing.

But even beyond that, I'd be skeptical. Say I had no access to your CRO program, but you're telling me, I'm pumping out 50, 70, 80% conversion rate lifts, or whatever it is. Okay, tell me more. And you'd be like, no. I'd be like, all right, well, I don't trust that. Even those of us who think CRO works are always skeptical about it.

So I think that transparency helps foster authenticity, which is important to me. I'm going to speak my mind; I don't really care what other people think, because I'm going to tell you what I feel, and that's the truth. You're not getting bullshit from me. I'm just going to speak the truth, and you'll get a sense of authenticity. You may disagree, but you know where you stand and where I stand. And I think that leads to trust. If you're open, if you're authentic — if a test loses big and you say, yeah, this was our hypothesis, this was the research that went into it, going into this test we felt really good about it, and it didn't win, and that sucks, but it's okay, because we have these other tests, and guess what, we learned some things and we're going to move on — that breeds trust. When you win, people can trust that you actually won. And when you lose, they know you at least put the thought and effort and research into it. You're not just picking out random things like, I saw what Amazon's doing, let's test that. It just builds that trust.
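[Editor's note: Shiva mentions prioritization "algorithms" that people keep opaque. One common, easy-to-publish example is ICE scoring (Impact, Confidence, Ease); the scheme and numbers below are purely illustrative, not Speero's actual model.]

```python
# Hypothetical test backlog scored with ICE, each factor on a 1-10
# scale. Publishing inputs like these is the "transparency" move:
# anyone can see why a test ranks where it does, and argue with it.
backlog = [
    {"test": "content gating removal", "impact": 9, "confidence": 7, "ease": 6},
    {"test": "pricing page wizard",    "impact": 8, "confidence": 5, "ease": 3},
    {"test": "button color swap",      "impact": 2, "confidence": 4, "ease": 10},
]

for item in backlog:
    # Simple average of the three factors; teams also use products or
    # weighted sums -- the point is that the formula itself is shared.
    item["ice"] = (item["impact"] + item["confidence"] + item["ease"]) / 3

for item in sorted(backlog, key=lambda i: i["ice"], reverse=True):
    print(f'{item["test"]:24s} ICE = {item["ice"]:.1f}')
```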

[00:15:01] Guido: So I have this feeling — a hypothesis, maybe — and hopefully you can shine some light on it. If we compare how the CRO or experimentation market works in Europe versus the US, my hypothesis is that in the US the focus is more on running more experiments to see what works — just shooting and seeing if something sticks. I'm very much generalizing, I realize. And my hypothesis is that in Europe, because the markets are way smaller, you cannot get away with that. It's easier to do that on an English website, a global English website: you can just run a lot of experiments and something will eventually stick — you have no idea which — but because you have that volume, it still works somewhat; you can still be somewhat successful with it. While in Europe — and I guess this goes for all other languages, basically — the market is so much smaller that you have to be smarter about it. You have to do your research upfront to come up with a good hypothesis and then try to validate it. Does that make sense to you, or is it a totally wrong view of the market?

[00:16:22] Shiva: That's interesting. I never considered that. I don't know — that's a good thought. In the interest of authenticity: that's a sound hypothesis. I'm trying to think of — so, your base —

[00:16:40] Guido: I also saw some supporting data for it. I did see some English merchants, but in B2B, where the markets are much smaller by nature of being B2B. And again, they need to be smarter about it; the mass-running of experiments doesn't work anymore, because their target audience is much smaller.

[00:17:07] Shiva: That is interesting. I feel like I've worked with some clients that are smaller, and as a consequence they have to be research-driven. They have to focus on the research and be smart about it. Yeah. The other thing I will say: I do wonder if there's a difference between US culture and European culture that potentially layers into it — I don't know if you were alluding to that — but I do think it plays in. Maybe in the US it's a little more "try it and see if it works," versus in Europe a little more skepticism, like, hey, I'm not just going to blindly trust you.

Versus in Europe it's like a little bit more skepticism like, Hey, like I'm not just gonna blindly trust you. It's almost like New York City versus like Austin, Texas. So like in New York City. Mm-hmm. , no one trusts you. Like you, you're not gonna walk up to someone on the street and be like, Hey, how's it going?

They're gonna. Please don't rob me. Like leave me alone. Like generalizing. I'm not trying to talk shit about New York, but like New York is largely like, you can't really walk on the street and just talk to someone. They're just, it's not that culture versus like, holy crap, Austin, Texas. You could walk up to someone in the street and just like, Hey, and they'd be like, oh, you're friendly.

Hi, nice to meet you. And like just have a con. I've had conversations with people at the grocery store. I've met so many friends from just like talking to people at concerts and stuff like the there. There's that difference in culture of just. Trust versus untrust kind of upfront. And I do wonder if maybe there's a little bit more of that in Europe that there's not skepticism, but more like why, and maybe a little bit more Inclusi versus us is like, well, let's try it and see what happens.

[00:18:47] Guido: Yeah — and that's interesting, let's come back to that one. By the way, when I say that in Europe, or for the smaller websites, you need to be smart about it: I still think that if you can just create a lot of experiments, do that, and it works, it can still be smart to do that. It saves you a lot of time and resources on doing the research. So if it works for you, good for you.

[00:19:16] Shiva: Well, truthfully, I view experimentation as a spoke of the hub of research — and I don't know, people agree or disagree on that — but research is doing things to understand behavior, trying to understand what's going on. Research includes analytics data, it includes heat maps. But an experiment is data too: you've tried A versus B, and it's data to inform something. People always feel weird about that.

[00:19:47] Guido: When I try to explain this process, I often tell the people who run the website: okay, you're already doing the experiments. Your content team and your design team and your developers are already doing the experiments — you're just not validating them. So you have no idea what the results are. Maybe that's a bit of fear-mongering, but —

[00:20:11] Shiva: Well, that's kind of back to the point: you can run an experiment, but if you don't have the proper resources to analyze it, validate the data, sanity-check it, do the research, then running the experiment doesn't provide value. If you don't have the proper supporting process to analyze it and do the right things behind it, it's just running an experiment. It's the same thing with personalization: oh, let's flip the switch and personalize. Well, have you tested it? Are you going to constantly evolve the personalization? Or are you just going to flip a switch and go, let's just personalize, because I read an article that said so?

[00:20:51] Guido: Yeah. So at Speero, how do you then position, or maybe even sell, CRO or experimentation? You just spoke about how, if you're doing it for conversion rates, those might not be the right KPIs. I've seen a lot of people arguing we shouldn't frame it as optimization at all — it should be about risk management. How do you position it with clients?

[00:21:29] Shiva: I personally always talk about test-to-learn. The way I talk about it is: every test that's run will generate an insight. Like I said, experimentation is a hub of research. Every test you run will teach you something, and I'm not going to run a test that's not going to tell me something to guide the next test, and the next, and the next. I have a plan; if a test loses, maybe that plan gets changed or altered based on the insights.

But every test will generate an insight that alters the strategic plan. Versus thinking at a very root level, like a button color test: if I run green versus orange on the homepage, what's that going to tell us strategically? Nothing strategic. It might give you a test win, it might give you a test loss, but there's nothing beyond, oh, what's next — blue? That's not really helpful. Maybe there's a little bit of insight about contrast, but it's nothing like a content gating strategy, a pricing strategy, a questionnaire or wizard for B2B. That stuff is super interesting: one, because it's interesting research; two, because win or lose, you're learning something from it. So we tend to focus a lot of our tests around insights and talk about what we're learning about your audience with every test. And then, if it wins — by the way, it also generated X amount of revenue, Y amount of conversions, whatever it is.

But the focal point I always like to use is to talk about the insights first. Because what I learned very early in my career is that if I only fixate on conversion rate, I'm training everyone else to also focus on conversion rate rather than on the insights. So instead, flip it around and talk about the insights generated, win or lose. Then you're running a program that generates insights even if you lose ten tests: you've generated insights from those ten tests, like, here's stuff we shouldn't do — and it's good that we tested it, because we didn't roll it out. That's the risk mitigation you were talking about.

[00:23:37] Guido: Yeah, exactly. You saved a lot of resources by not implementing all those things. Yep, I totally agree. I mean, the UI part can be very interesting — if that's a big issue, it can be a huge blocker on many levels, of course. But the strategic things are way more interesting. It validates your whole business idea: are people actually interested in the features you're trying to sell them? Maybe they're interested in something completely different. And what's the price elasticity of your product? That can make a huge difference for a company — way bigger than changing a color.

[00:24:14] Shiva: And to take that a step further, think about experimentation maturity. If you're a one-man team and you're running button color tests, the people looking at your program are seeing button color tests and thinking: what? What's the ROI of this program? Why do we even have it, if all we're testing is button colors and root-level tests? Versus talking about strategic things that affect the whole business. Obviously with the qualifier that you are in fact in lockstep with the business — that you're not optimizing for AOV while the business is optimizing for CLTV and your metrics are totally mismatched. You do want to make sure you're aligned on that.

But if you're providing insights so interesting that they maybe shift the way the business thinks about things — or marketing, or anyone else, like brand — you're generating insights like: holy crap, this content gating strategy we use to generate leads didn't work; we removed the gate and got the same amount of leads; we need to change the business. And then your boss's boss's boss hears about it: how did we find this out? Oh, the CRO program. Holy crap, let's throw some money at this. If you're generating those insights, that's a way for you to scale your program up. Yeah.

[00:25:34] Guido: Yeah. I think a challenge there is that a lot of CRO specialists or CRO teams are positioned in the marketing team, and marketing often has really different goals — they're not necessarily into test-and-learn.

[00:25:51] Shiva: It is painful. It is a struggle. And that's another thing we look at in the maturity work: where does experimentation live? Is it a decentralized model, a centralized model, hybrid — whatever model it may be. But holy crap, if you're not on the product team, good luck trying to get prioritized resourcing to help you drive the program. And the solution to that is maybe to hire some folks onto your experimentation team. When I was at Gartner, I hired a developer to sit on my team to help bypass some of those limitations.

But again, it is a struggle, especially if experimentation sits on a marketing team, to get the resources you need to scale the program, to grow and get bigger. Each model has its pros and cons, but in a perfect world — I guess this is a hybrid model, not decentralized — you have individual specialists sitting on the marketing team, brand team, dev team, UX, whatever, and they all report into a hub of experimentation folks. And those are the ones who set the process and the governance. Ben and I geek out about this, and he's very passionate about it: the people in the hub shouldn't look at a single test. Their only job is process — making sure everyone's following the right processes. They don't look at a single test; they just make sure everything's flowing properly. And then you have the specialists on every team flowing through everything. Yeah.

[00:27:29] Guido: Yeah. I think one easy way, if you're stuck in a marketing team, might be to get budget to hire an external agency. That's often the fastest way to ensure you can actually do stuff.

[00:27:55] Shiva: Yeah. And I think that's why there's a bit of scrappiness that a lot of CRO people tend to have — you'll have a CRO person who's a coder and a data analyst and can kind of design on the side. I think every experimentation person should have a good understanding of all of these skills. You'll often find, like — I can code; not top-tier coding, but I can code pretty well, I can design pretty well, I'm a pretty decent UX/UI person. You have to have these skills, because an organization that only hires one specialist and expects them to drive the whole program — that's oftentimes what happens. In a perfect world it doesn't, but it often does: it's one person's task to do everything, and that's always hard. So sometimes that person has to put on a lot of hats. And in a perfect world, if the organization has dedicated UX functions and dev functions, build relationships with those folks and try to get some dedicated resourcing, even if it means trimming down the size of your tests, altering your hypotheses, or pushing some stuff out. Just get a little scrappy, because if you can get a little scrappy and prove yourself, then heads start to turn and maybe a little more money and resourcing starts flowing.

[00:29:07] Guido: Yeah. I mean, I've seen CRO people working with really expensive A/B testing tools which they could easily switch out — just start with Google Optimize, or Convert, or whatever, not the top-tier ones — and instead spend that money on getting yourself a decent developer or agency to work with.

That's a way better, faster way to prove yourself.

[00:29:38] Shiva: Yeah. It's almost like those people who have Cadillacs but live with their parents.

[00:29:43] Guido: Yeah. Yes, exactly. And have no money, no gas money. You're not getting any gas money, huh? Hey, something else I wanted to ask you. I saw this interesting post this morning on LinkedIn by Petra Leland — I've already asked her to come on for an episode.

I'm curious what you think of this. She posted about some frustration, and she basically quoted some responses she got, I think from people inside the company, talking about CRO. Quote: "Why can't we just put this on the website? Product experimentation will just slow us down," and, "But we know our customers need this and our competition already has this, so why can't we just implement it?" So not necessarily for Petra, I guess, but many CRO people will come across comments like this in their career. How would you advise them to respond?

[00:30:53] Shiva: God, that's so triggering. I can't tell you how many times that's happened. It's one of those things we call JDI — just do it. A lot of people tend to be like, well, let's just do it, because we need to just do it. One thing you should probably do is think about what the risk actually is.

Sometimes you have to be honest. If it's, I don't know, swapping out a video on the homepage for something new — what is the data supporting it? Is it truly, truly something impactful or not? Maybe the brand team says, we want to change to this new video, and your instinct is: test it, test it. But how many people click on the video? Is it 0.2% of people, and they only watch three seconds of it? In a perfect world you want to test it, because maybe the new video really is that much better. But if the interaction rate is that low, sometimes it's okay to say: fine, we'll concede. And maybe you take that coin and put it in your back pocket — we conceded on this, but the next thing you want to just-do-it, we cash in that chip and test it. But yeah — risk mitigation. There are so many things I've tested in my career where I thought, this is a guaranteed winner.

And you run the test, and it loses dramatically, and you go, oh, okay, well, it's good we tested that — and it was never a question whether you were going to test it or not. We're wrong a lot. A lot of people are wrong, and it's okay to be wrong. Risk mitigation is so important, because you just don't know until you know. And sometimes it's a bit of a personality thing; sometimes it's worth digging into the why.

Why do we need to be so quick? Why do we need data quickly? Is it your boss's boss's boss really putting pressure on this? It's worth digging into why they want to push it so quickly. That's something that's helped me, specifically with HiPPOs — highest paid person's opinion. When someone like your boss's boss's boss says, do something, you have to do something. And it helps to understand why they want to do it, because that context can help you talk them off the ledge, toward maybe testing something, or restructuring a test. Or maybe it's a lesson learned that you have a process issue — like we talked about with maturity. Maybe they spent five months working on something and you didn't know about it. If you had known — if it weren't for the process issue — you could have started on day one and said, by the way, we're testers, we can help test it for you. And they'd go, oh, great. And there's your lesson learned: you have a process issue, and you just need to work experimentation in earlier to help them out. But yeah — I guess I'm trying to be glass-half-full, because if I were glass-half-empty I might cry a little, given how many times I've heard that.

[00:34:06] Guido: Yeah. Or just fire the client — or, if you work there yourself, resign.

[00:34:13] Shiva: Yeah, I think it's more about trying to get an understanding: why are they trying to do it so quickly? What's the reason behind it? Is there a process thing that maybe needs a re-look, like, experimentation needs to be involved earlier?

A lot of times, with what we see — I was working with another client, and we were talking about something, and they said, we're going to roll out this feature. I asked, why don't you guys test it? And they said, we can't test it. And we said, oh, sure you can, you just have to do this, this, and this. And they went, oh. Okay. It clicked in their brain; they didn't understand they could test something in that particular way. So sometimes it's just education: they didn't know they could test it, so they just didn't, and they moved on. I mean, CRO and experimentation are so often misunderstood, and that leads to a lot of problems — much more than actual malice from HiPPOs who are like, I just want to do it, I don't care what the results say, I want to do it because it looks pretty.

[00:35:14] Guido: Yeah. And often, I guess, when the understanding isn't there — for a lot of people, the managers who manage the CRO team, or the people coming up with the ideas for the CRO team to validate — it feels very personal. And it can get personal very quickly: if you shoot down nine out of ten ideas the designer has, he might not like you very much and stop coming to you with things. So you need to start working with them, understanding what their goals are — not necessarily their personal goals, but inside the business. What are their KPIs, their OKRs, or whatever system you use? Understand that, and help them achieve those business goals, instead of framing it — or at least letting them perceive it — as an attack on something very personal to them. Well, maybe, yeah: their life's work.

[00:36:24] Shiva: Put that in context. If you're a UX person, and a CRO person comes to you and says, hey, we should test this, and you go, okay, cool, let's test it — and then they run the test and it loses by, let's say, minus 10%. They have literally given you evidence that your baby is ugly, in a data-driven way. Right? And if you express it that way — that hurts. So there's a glass-half-full reframe. Instead of calling their baby ugly and literally using data to prove it, which will put a lot of people on the defensive, reframe it: look, you have this design; let's test it, because maybe this first iteration isn't the best. Let's make it so that instead of minus 10% or plus 2%, what if your design was so kick-ass that it was plus 17%? Then you've provided data to the UX person that their baby is beautiful. And that UX person goes and tells their boss: my thing is 17% better than the original. So it should be looked at as —

[00:37:35] Guido: And you don't have to tell your boss it took ten iterations to get there. But I think that's the issue: it's painful in the beginning, and if you're not used to that process, it really does feel like someone is killing your baby. And I guess that's part of the education you need to go through with your organization.

[00:37:58] Shiva: Well, and back to the education piece: it's about the CRO person coming from a place of, I'm here to help. I'm here to make you look good; I'm here to test your thing so it looks good. And think beyond UX — look at product. If a product owner said, the decision we need to make is to develop this feature, and no experimentation was done to validate anything — they spent six months building it and getting it live — and then the CRO person comes in, tests it, and it's minus 6%, that's going to make them look really bad. So there might be some hesitation from these people — I don't want to test it, because it could make me look bad — in the same way as with the UX people. And there are a lot of these avenues. Even dev people: I don't want to test it, because if it's broken, it doesn't look good. It's understandable that people have a defense mechanism about it. The reality is, we are here as experimentation folks to make you look good — to take what you're doing and make it better, or prove that maybe it's not exactly the right thing. With that product person: if we'd been involved very early, we could maybe have done research to validate that the feature wasn't worth building, or used the research to say, it's worth doing, but the design isn't great. We're here to help support you, to make the right decisions, to make you look good, and you look good, and you look good — and the business metrics too. We're here to support the business. We're not your enemy; we're your best friends.

[00:39:38] Guido: Exactly. And what would you say to the first part of Petra's comment — or her colleagues' comment — that experimentation will just slow us down? How would you counter that argument?

[00:39:52] Shiva: Fair enough — would you rather be slow or bad? Yeah, maybe that's the perfect response, I guess. I mean, you could be slow, but would you rather make the right decisions and be slow, or probably make the wrong decisions and be fast? And there is a legitimate argument for: I'd rather be fast, then test, then continue to iterate. I'm not going to disagree with that. I'm not saying always be slow and always be right. But it's usually not the case that the people saying "it's going to slow us down" are the ones saying "we want to roll fast so we can test it and keep iterating and make it better." In my experience, it's more often: let's roll it out very quickly, because we have something else we need to do. Maybe we'll have some health metrics and check on it every now and then, but it's not a concerted process of testing, researching, and keeping the funnel flowing to optimize. It's just: let's go quick and move on.

[00:41:06] Guido: Yeah, and all those requests like: it's Black Friday next week, so we need to do it right now. Or: we have this TV campaign going live. But then you just got involved in the process way too late — that's something in the process that needs fixing. Yeah, exactly. Nice. Hey, I don't want to take up too much more of your time. Thank you so much for sitting down with me and talking us through your process and your no-BS style. Let's keep that up.

There wasn't a lot of swearing today, so I guess we cut the swearing-like-sailors out of the middle.

Well, one question I do have — a final question. What are the things in your own process, in working with clients, that you want to improve this year, in the next 12 months? What are you excited about — the new things you're going to try and implement?

[00:42:08] Shiva: That's a good question. I probably need to do some deeper reflection on this, but I think for me it's something I'm already good at — I'm not going to toot my own horn and say I'm great at it — but something I can be better at, to support the business better: keep being annoying about understanding what the business is trying to do. Like I talked about before: what is the business trying to optimize for, what are their priorities, and making sure experimentation matches that. From an internal perspective, it's a lot easier to understand those pieces, because you're literally talking to everyone. It's easier to understand the business context because you talk to everyone on a daily basis; you're having those conversations and you've built those relationships. As an agency, it can be a little more difficult to truly understand the business direction and adapt and adjust in advance. Sometimes they'll have plans coming down the pipeline and we want to make sure we support them, but we only learn about it last week when they've known about it for two months. So it's about being a little more annoying, truly understanding the business goals and objectives, to support the business in the absolute best way possible, rather than confining our experiments to stuff we can see entirely on our own.

And again, that's not to say we're bad at it — we're really good at it — but I do think there's another level: really pestering executives to understand what they're trying to optimize for. On one hand, that'll help us be better experimenters; on the other, that annoyingness will help the executives. I'm not trying to say Pavlov's dog, but it almost trains them to keep experimentation top of mind, so that whoever we talk to internally on those teams gets the support — because we're being annoying about experimentation in a good way. We keep it top of mind, so they get the support, so we get the support to help them. And it just keeps compounding: keep experimentation top of mind, continue to support their business so they see the value, they love it, they feed more money into it, and then we're more helpful. It's a positive feedback cycle. Yeah.

[00:44:42] Guido: Nice. Yeah, I think that's really powerful advice. I always advise people that one of the first teams you should talk to is probably your finance team, to figure out: what actually drives the business? Where is the money being made? That's really interesting for you to know as a CRO person. Maybe you work at a company and you think, let's increase conversion rate or revenue or profit, but maybe they're not even interested in those. Maybe they just want to grow market share. Maybe there's been a big investment and the goal is to grow as fast as possible — you can even lose money doing it. That's a completely different ballgame than when you're established and maybe the only thing that matters is profit.

[00:45:40] Shiva: Yeah, that's a totally different game. And I would add to that: as any CRO or research person, you should always be talking to your sales team, your marketing team, your brand team, your dev team, your live chat team, your support team. Talk to these people, because they have insights. They deal with stuff you don't deal with.

So if you're optimizing for sales, just trying to raw-increase the number of sales, and then you go talk to the sales team and they say, we've been getting a lot of shit leads recently — then you're not doing anyone any favors, because you're wasting their time and artificially making your program look good.

Yeah. So it's about having those conversations. And those folks are on the front lines — they see all of it. They have different data, different ideas, different things you can use to help guide your program. Maybe the sales guy says, man, all these people keep complaining about this one question because they never know how to answer it, and you go: okay, we should test that. You get ideas from these other people that maybe your research never uncovered, or that your marketing team never surfaced. Have those connections. And again, it pushes experimentation top of mind for everyone. The more you grow your influence and reach, the more people are talking about it, the more people are interested in it, the more buy-in you're going to get, the more people are going to say, yeah, let's test it. And again, it's a positive feedback loop that gets more attention and gets you more money in the bank to grow your program.

[00:47:10] Guido: Nice. Yeah — have that as one of your OKRs: talk to as many departments within your organization as you can. Well, thank you so much, and I hope to read more of your posts on LinkedIn.

Shiva: Thanks for having me.

Guido: Thanks so much. Take care. Bye-bye.