Revenue Brothers

How often do you argue about your data? How many times have your numbers been questioned? And is the data you're gathering even helpful in the first place? We all want to make data-driven decisions. But what happens when you lack that data trust?

In this episode, Raul and Toni talk about the reason why so many companies struggle with data trust, and introduce a framework to get it back.

  • (00:00) - Introduction
  • (08:06) - The data diet
  • (15:29) - Rolling out metric frameworks
  • (19:03) - ATAA: Accuracy, Transparency, Analyzability & Actionability
  • (26:17) - ATAA: Transparency
  • (35:29) - ATAA: Analyzability and Actionability

Creators & Guests

Host
Raul Porojan
Director of Sales & Customer Success at Project A Ventures
Host
Toni Hohlbein
CEO of Growblocks

What is Revenue Brothers?

What happens when a VC and a CEO come together?

– They nerd out about all things revenue. And they don’t always agree.

Raul Porojan of Project A Ventures and Toni Hohlbein of Growblocks are the Super Revenue Brothers. In every episode they dissect and debate current issues in B2B SaaS and offer solutions for how to solve them.

No matter if you’re an early-stage startup or a scaling unicorn – you’ll always learn something new.

Introduction
---

[00:00:00]

Toni: Data doesn't need to be 100 percent accurate for you to get deep insights from it. If it's directionally correct, that might already help you. Should you fire a sales rep on directionally correct data? No, you shouldn't do that. You might actually get sued for it if you do that, right?

Toni: But can you derive overall systemic insights from directional data? Yes, you can.

Toni: So everyone, uh, Raul and I, we are back. And today we're going to talk about something that I think most of you would probably roll your eyes over, or would be thinking, ah, you know, this is not my problem, some RevOps or SalesOps dude should be taking care of that.

Toni: It is going to be data trust, or data quality, however you want to refer to it. I think everyone now knows exactly what we are going to talk about. We're going to dive into where the problem sometimes comes from, how to think about solving it, and what you can actually do with the data that you already have, right?

Toni: Maybe to start out, uh, Raul, what's the [00:01:00] problem here? What's the problem with not trusting your data? This is not just about CRM hygiene. What's the problem with not having data trust?

Raul: Where to begin? To make it really simple, the point of data is that it makes your execution better. It should inform what you should do and how to do it, and all that stuff. And the worse your data is, the less you can do that. Very, very simple. Um, and there are many aspects to it, which we will get into.

Raul: But you will never even get to the place where you might even consider acting on data if you just don't have trust in it. And a lot of companies are there. So for them, something like data-driven execution seems like just a buzzword. But there are companies out there who do it, and they do it quite well.

Toni: So I think the buzzword piece is, whenever you hear data trust and data-driven decision making — and you know what, yes, Growblocks is totally in that space — I [00:02:00] always then think about digital transformation and all of that jazz. That basically is a big buzzword, but doesn't really happen.

Toni: And one thing that I see a lot is whenever I talk to any kind of company around data, especially on the commercial side. Because when you think about it, your product team is probably fairly data driven — you know, unless you have like ten customers at 1 million each.

Toni: You're probably going to be fairly data driven then think about High volume marketing teams so that this is kind of super common e commerce But you might have maybe a super strong performance performance marketing team that you know is going to be extremely data driven but what about the rest?

Toni: What about the rest of all the commercial functions? Why — and this is my claim here — why is everyone only talking about being data driven, but in the end actually isn't, right? And one of the issues around it that I've seen many times — and I think it's a [00:03:00] smokescreen to a degree.

Toni: People are putting up this data trust, data quality thing. And they're basically saying — and that logic is true and makes sense, by the way — since I cannot trust this piece of data here, I cannot trust anything that's built on top of that, right? Which basically wipes out the whole thing.

Toni: And whenever I hear this, the first question I have is: okay, data trust, the data is not believable — what does that actually mean for you? Tell me a little bit more about what you're talking about here. And what I hear very, very often is, um, topics around the sales pipeline.

Toni: Basically topics around: hey, the reps — and this is where data hygiene or CRM hygiene usually comes in — reps are not filling in the fields. And, you know, this is a big mess, and they're moving deals forward and backward and doing all kinds of things. And that [00:04:00] ends up being really difficult to manage.

Toni: Hence our forecasting meetings are terrible. Hence, I don't have data trust. Right. And my thinking here usually is: okay, I can totally see that, I've been living in that reality myself. But the realization also has to be: okay, how big of a deal is the sales pipeline, you know, across your full funnel?

Toni: And how many other pieces could you actually trust? What other things in your sales pipeline could you just skip — for example, do you need to know all of these different movements in order to have full-funnel visibility? And this is where I usually see people

Toni: kind of have this data trust issue that then spreads across all the different other data areas, and they therefore declare, you know, data bankruptcy. I'm not sure how you've been running into the data trust issue, Raul, but we'd love to hear about that.

Toni: Yeah,

Raul: One way that some of what you were talking about even comes into being is, um, even early-stage companies nowadays, just very early on, go too fast into too many go-to-market motions, or too many sales motions or too many marketing motions. That's not the topic for today, but it is one of the reasons you might have a hard time keeping up with your data needs — just being way too complex early on already.

Raul: And this might already feed a little bit into this topic of a data diet. But in practicality — and I've seen this evolution dozens of times by now — this problem typically goes through an evolution. And I would say it's not that the stages always happen sequentially, but it's sort of like: okay, where are we at in terms of sophistication as a company, or maybe as a CRO or as a VP of Sales, whatever.

Raul: And so the first level, level zero, is almost always a sort of non-ambition towards data, maybe a lack of imagination towards what you could do with it. Those are maybe people who think that those are just [00:06:00] buzzwords, or it's sort of resignation, right? So there's no point in even trying, because, I mean, where do I even begin?

Raul: I don't even know how to do it, I don't even know how we could execute this kind of thing. And the good CROs and the good leaders maybe sometimes know that they were not able to do it, or to get to a data-driven sales funnel, but they know how to get help, or they know to employ revenue ops people or something like that.

Raul: Right. But that's kind of level zero. And that's obviously the worst place to be at. But maybe what feels even worse as a leader is sort of level one, when you're like: oh, I actually listened to this podcast, I went to this event, and now I'm really ambitious about what we could do with data.

Raul: And now I saw that all the big guys are doing it like that. I want to be like that too, because we also want to be a unicorn and I want to make big bucks. And then you look at the state of affairs and you realize: okay, there is no way that we will ever get there. Like, we are absolutely screwed. We have six SDRs.

Raul: They all have different interpretations of what a funnel is. [00:07:00] Two of them don't even know what it is. Three of them, I don't even know if they show up to work sometimes, and they just have like random mouse-clicking software on their phone, so I don't even know what their data even means — their data entries into Salesforce.

Raul: Three of them have never logged into Salesforce in the last week. And then once you unfold the whole thing, you're like: okay, there is no way we could ever get there. Right. And that's level one: you have ambition, but you're not there at all yet. And then you go through the progressions, and there's actually light at the end of the tunnel. The first realization that you have to make — and I'll stop with that level — is that, again, there are levels to this, and you don't have to be perfect yet.

Raul: You can expand how sophisticated you are in data and reap benefits as you go. So it doesn't have to be an all-or-nothing, black-or-white thing. Just because you buy that great revenue architecture book, or you go to that event and you're really enlightened, doesn't mean that you have to get there immediately.

Raul: Right. It's like my eternal struggle of trying to get fit and lose weight. [00:08:00] Just because I had a cookie, I don't have to give up the whole week and throw out the baby with the bathwater. Right. There's a lot of benefit to be had from a little bit of data.

The data diet
---

Toni: No, that's true. And I think also, data is such an all-encompassing term. At least this is what I'm recommending, and what we are sometimes recommending: consider something like a data diet. Just slim down your ambition level, scope it down, right?

Toni: Just because you have collected a data point doesn't mean you need to visualize it and act on it, right? Depending on the sophistication and complexity of your business, there might at some point only be five overall metrics that you need to monitor on a weekly level, then seven on a monthly, and maybe ten on a quarterly level. And maybe that's it for now, right?

Toni: That can be your data ambition right there — not trying to get everything under control, right? That's what I mean with a data diet: scope down, figure out what the key drivers are for you. And I can tell [00:09:00] you, it's going to be opportunity generation or pipeline generation.

Toni: It's going to be some conversion rates before and after, it's going to be ACVs — those kinds of funnel metrics, and maybe some efficiency pieces as well. Those are the things you should probably be focusing on in the beginning, right? And then, once you have that under control, you actually start using it in a sensible way.

Toni: Because really, one rule of thumb that I have is: you shouldn't be calculating, storing, or visualizing any metric where you don't have a sequence of events tied to it if something happens to that metric. So if the metric goes up, what do you do? If the metric goes down, what do you do?

Toni: If you do not have anything actionable that you can tie to a specific metric, screw the metric. Don't look at it. Don't use it, actually. And especially teams that are starting out on that journey will not have that many good ideas about what they should be doing about each and every single metric — so just start [00:10:00] smaller.

Toni: Right? And once you start smaller, you're starting to get into the habit. You're also starting to see a good reason for people to have more CRM hygiene, to do these things in the right way. You'll start investing in your RevOps, SalesOps folks to set up some additional fields so you can get even smarter.

Toni: But the first step is usually just getting started and having an overview, right? And that by itself is the data diet: slim down what you want to look at.
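[Editor's note: a minimal sketch of the "data diet" idea Toni describes — a handful of metrics, each with a review cadence and a pre-agreed response if it moves, and anything without an action attached simply doesn't make the list. The metric names, cadences, and actions below are illustrative assumptions, not a prescription from the episode.]

```python
# A deliberately small "data diet": every metric carries its cadence and
# the agreed response if it moves. No action defined? It doesn't get tracked.
data_diet = {
    "pipeline_created":   {"cadence": "weekly",    "if_down": "review outbound activity",  "if_up": "check capacity"},
    "win_rate":           {"cadence": "monthly",   "if_down": "listen to lost-deal calls", "if_up": "document what changed"},
    "acv":                {"cadence": "monthly",   "if_down": "review discounting",        "if_up": "revisit pricing tiers"},
    "cac_payback_months": {"cadence": "quarterly", "if_up":   "cut weakest channel",       "if_down": "consider spending more"},
}

def weekly_review(metrics: dict) -> None:
    """Print the agreed action for each weekly metric that moved (toy example)."""
    observed = {"pipeline_created": "down"}  # assumed observation for the demo
    for name, spec in metrics.items():
        if spec["cadence"] == "weekly" and name in observed:
            print(name, "is", observed[name], "->", spec[f"if_{observed[name]}"])

weekly_review(data_diet)
```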

Raul: Yeah, and I like that you said look at, not save or store or produce data. Because, as dumb as it is — and I hope that for 90 percent of listeners this will be like, yeah, obviously — sadly, I still see too many companies early on produce data, not have a way to store it, and just lose it into the abyss, where later on it might have become really useful.

Raul: So just because at this point in time, which very often correlates with being very early as a [00:11:00] company, you don't know what to do with it yet, or you don't know how to act on it, or how to report on it or make it beautiful, does not mean that later on it might not be useful.

Raul: And a very simple case for that is: well, maybe at some point you want to start to build a revenue engine — or in your words, a factory — and you maybe want to look at conversion rates, and you maybe want to produce a prediction of how much money you have to put into this and this lead channel to produce this many leads, right?

Raul: And if you had the data for the first two, three years of your company — yeah, maybe it wasn't too much data, but it would still be helpful rather than starting at zero and from gut feel. And so, as easy as it is — and I hope, again, most companies don't do this — a lot of companies do, man. And a lot of times it's related to the way you sort of progress through CRMs, if you will.

Raul: So many companies start out with a Pipe— no, not even, before that, with an Excel. Right. Or maybe even with a notebook — I've seen that nowadays still with founders, they write [00:12:00] stuff down — and then maybe they progress to Pipedrive. Or maybe they progress to some perversion of a tool that is not a CRM but they use it as a CRM, like Monday or some project management tool.

Raul: And through all those iterations, they just lose so much data. And then in year three, four, they're like: oh, now we're really picking up, and now we want to do some projections and some comparisons — how have we developed? Oh, look at that. We don't have anything.

Toni: Yeah.

Raul: So just store everything, even if you don't know what to do with it yet. Store it well.

Toni: Yeah. I'm not sure if I agree with that, actually. So I've stored stuff in the past, and there are exactly zero instances where I then actually went, oh, let's use that old data for something. Right? I think what I would recommend is — not at the notebook or Monday.com level —

Toni: but if you do invest in a Pipedrive or HubSpot, or, you know, later on a Salesforce, build a mini framework for yourself. And I think, yes, that [00:13:00] could be in connection to something like what Winning by Design does with the bowtie: collect some of those funnel metrics — how many leads did you create, how many MQLs did you create, how many opportunities did you create. Those are also going to be nice slides for investors, to see the demand curve going up.

Toni: Right. And as you store those volume metrics — we refer to them as volume metrics — it's very easy to then generate conversion metrics between them: how many moved from one to the other. And the reason why you want to start doing this a little bit earlier, by the way, is that you want to start seeing those conversion metrics flatten out. Meaning, in the beginning they will be super spiky. They will jump up, drop down, jump up, drop down, and you're going to feel it's all over the place. And at some point, with not only volume picking up but also the process starting to standardize, you will see it starting to flatten out.

Toni: And that's the moment where you can use it for really [00:14:00] interesting insights going forward, right? And the other trick is: once you have built a mini framework of things, where things kind of snap into place for you, it's also very easy to then add additional pieces to that agreed framework, right?

Toni: And what do I mean with that? Well, once you have MQL to opportunity in there, and you trust those two numbers — imagine it as the trusted check mark on your Twitter account — once you have that with a couple of metrics, it's very easy to put additional metrics into that framework, right?

Toni: For example, an SQL, between MQL and opportunity. Suddenly this new metric is surrounded by two trusted metrics, and very quickly you will get to a point where you also trust the SQL piece, right? Another one could be pipeline. Many people are calculating pipeline in all kinds of different ways.

Toni: Once you have that trusted opportunity volume metric that you can rely on, pipeline itself is really just a derivative of the opportunity [00:15:00] object — opportunity count times the value that you have in there, et cetera. Right. So you're slowly building this thing out over time.

Toni: And then, with it, you have a thing to hold onto when you have a difficult discussion. But you also have a way forward to go from one sophistication level to the next, right? And eventually you become as sophisticated as, I don't know, the ones you met at SaaStr Europe or SaaStr in San Francisco, where you were like: wow, we need to become the same thing.
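[Editor's note: a rough sketch of the mini framework Toni describes — store funnel volume metrics, derive conversion metrics between adjacent stages, and treat pipeline as a derivative of the opportunity object. Stage names, field names, and numbers are illustrative assumptions, not pulled from any particular CRM.]

```python
from dataclasses import dataclass

# Illustrative monthly volume metrics for a simple funnel (hypothetical numbers).
volumes = {"lead": 400, "mql": 120, "sql": 60, "opportunity": 30, "closed_won": 8}

# Conversion metrics are just derivatives of adjacent volume metrics.
stages = list(volumes)
conversions = {
    f"{a}->{b}": round(volumes[b] / volumes[a], 3)
    for a, b in zip(stages, stages[1:])
}

# Pipeline is likewise a derivative of the opportunity object:
# the value sitting on the opportunities that are still open.
@dataclass
class Opportunity:
    amount: float
    is_open: bool

opps = [Opportunity(12_000, True), Opportunity(30_000, True), Opportunity(8_000, False)]
pipeline = sum(o.amount for o in opps if o.is_open)

print(conversions)  # e.g. {'lead->mql': 0.3, 'mql->sql': 0.5, ...}
print(pipeline)     # 42000
```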

Rolling out metric frameworks
---

Toni: In the companies that you've helped build at Project A, and the revenue engines there — have you actually rolled out metric frameworks for them to track what's going on in those companies?

Raul: Yes, many, many actually. I would say there are a couple of common denominators here. But I do think it's quite important to always go into where that company really is. And before I go into the framework, what I mean with that is: every company is in a different place when it comes to the readiness to really work with [00:16:00] this.

Raul: Some really understand what it means to be a data-driven company and some don't. And really, to boil it down, you can't just wake up one morning, be inspired, and be like: oh, now we're going to do this. Right? Because there's a lot that goes into it. And we haven't talked about this yet — what are the factors that make up a data-driven company?

Raul: And one of the factors that makes this up — not just for example, but mainly — is process. Process coherence, obviously, as in you have the same process behind the different numbers, but then also process adherence. And so you have to look at these things, because if you have six SDRs, and you're trying to look at conversion rate across those six SDRs, and they all follow different processes, then obviously that combined number doesn't mean anything.

Raul: If you're trying to compare different SDRs to each other and they use different processes, those numbers also don't mean anything. So you have to be very, very clear that process coherence, definition, and adherence yield those numbers. [00:17:00] That is what you're actually looking at.

Toni: Let me disagree with this — I will challenge this a little bit. So I think you're right, right? If you don't have process coherence — also a super nerdy, boring word — it's really difficult to do this internal benchmarking, like one sales rep versus another, if you will.

Toni: What you can do, though, to get started, is to just go one level up and ask: well, how many of those do we have, and how many meetings do they generate per week, per month, per whatever? And then what you will find is that some are better than others. And that might be due to process — which, if you find that out, is great, because now you can start teaching what that person does to all the other folks — or it might actually be other things, right?

Toni: But I would say, even though you don't have process adherence, you still will have those checkpoints in the funnel where, you know, maybe there's a commission payment that happens, or maybe there's something else that happens, that — whatever they do — forces them to [00:18:00] align at least at that spot. And after that spot the AE picks it up, they do their own stuff again, and then again it comes back to: how many customers did you sign?

Toni: And again, you can have an evaluation there of who's maybe doing the right thing. And I would even say early on it's even a thing that you want — and it's a bit of a backwards argument, sorry for that — but in the early beginning, you might want to have several different people trying out different things, seeing what actually works best.

Toni: And then you start standardizing: now we know what works, and here's what you should be using. Right. And the way you would figure out whether what any one single person is doing works out is by having those markers, those milestones. But obviously, once you grow further — and maybe this is what you're going to go into next, Raul, in terms of the revenue engines you've set up previously — once you grow further, you obviously need to mature and [00:19:00] get this more rigid, get this clearer setup going for companies.
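[Editor's note: a rough illustration of the "one level up" benchmarking Toni mentions — counting meetings generated per rep per month from raw activity records, which works even without process coherence. The record fields and numbers are hypothetical.]

```python
from collections import defaultdict

# Hypothetical activity log: (rep, month, meetings_booked)
activities = [
    ("Anna", "2024-05", 3), ("Anna", "2024-05", 2), ("Ben", "2024-05", 1),
    ("Anna", "2024-06", 4), ("Ben", "2024-06", 5), ("Ben", "2024-06", 2),
]

meetings_per_rep_month = defaultdict(int)
for rep, month, meetings in activities:
    meetings_per_rep_month[(rep, month)] += meetings

# Simple internal benchmark: who generates more meetings, month by month?
for (rep, month), total in sorted(meetings_per_rep_month.items()):
    print(month, rep, total)
```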

ATAA: Accuracy, Transparency, Analyzability & Actionability
---

Raul: Regarding what you just said — obviously it's useful to have different processes and test things out, if you know what those salespeople are actually doing. If you still have no idea what they're doing, and you don't know how to interpret those numbers, then you also wouldn't know what to do with the learnings, or you wouldn't really have learned anything, obviously.

Raul: But going into the framework. What has really helped a lot of companies — and I don't have a sexy name for it yet, so forgive me for that, very unsexy name, but a really useful framework for actually executing this — what do you need to start looking at a revenue engine or a factory?

Raul: What you need to start being data driven is ATAA. Not very sexy, as I said, but ATAA stands for Accuracy, Transparency, Analyzability, and Actionability. And what that basically means is: you're looking at a funnel or at a bowtie, doesn't matter, and you're looking at each number within that funnel. There are basically, as you said, checkpoints, there are conversion rates between the different [00:20:00] stages, and then there are volumes.

Raul: And that's basically what you're looking at within that bowtie. And you want to look at those numbers and ask — and that's the first A — can we actually trust these numbers? At least in the sense that they replicate reality. And I understand your sentiment, but I do think one of the problems with trying things out and not being too strict is that at least the data should

Raul: represent reality, even if people have different processes. That's the first basic checkpoint, and that's already where a lot of companies fail, right? So one of the dimensions of accuracy is representing reality. If you only enter 20 percent of your leads or your opportunities into the funnel, or your activities, or you don't update them, then it doesn't even matter whether you and I have a different process.

Raul: Your process is just not replicated within your funnel, within Salesforce, right? [00:21:00] So, just very practically, what happens a lot of the time — most of the time, to be honest — is that there is a time delay between when things happen and when salespeople enter something into Salesforce or HubSpot or whatever, right?

Raul: And typically that's not a problem if it's, I don't know, half a day or a day, depending on how long your sales cycles are. But sometimes that delay can be arbitrarily long — weeks or months — or it can even be toyed with by the salespeople. So one of the very simple things that would happen a lot of the time is: you would have salespeople working for a week or two, not entering anything into the CRM — especially in early-stage companies — and getting away with it, and then entering everything on a random Friday when they were in the office and just happened to be in the mood. And then, wow, all of a sudden you have like 55 opportunities coming in that day, or 55 leads coming in that day, two of which are closing that day.

Raul: 53 of which are not, and then another 55 which he didn't bother [00:22:00] to enter because there were Friday beers at 4 p.m. and he didn't want to work anymore, right? So if you look at that, obviously sales cycles are fucked, conversion rates are fucked, because half the leads are not even entering the system. And that's where this whole thing just goes out of the window, right?

Raul: Now, it doesn't mean you can't still try to measure the same person compared to the last month or whatever. But this is where, you know, that's important, right? The numbers have to actually mean something. They have to be accurate, they have to represent reality.
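[Editor's note: a rough sketch of how the Friday-batch-entry pattern Raul describes could be surfaced — flagging days where one rep creates an unusually large number of records, or where the gap between when something happened and when it was logged gets long. Field names and thresholds are assumptions for illustration.]

```python
from collections import Counter
from datetime import date

# Hypothetical CRM export rows: (rep, happened_on, entered_on)
rows = [
    ("Ben", date(2024, 6, 3), date(2024, 6, 3)),
    ("Ben", date(2024, 5, 21), date(2024, 6, 7)),  # logged 17 days late
    *[("Ben", date(2024, 6, d), date(2024, 6, 7)) for d in range(1, 7)],  # Friday dump
]

ENTRY_SPIKE = 5    # assumed threshold: records entered by one rep on one day
MAX_LAG_DAYS = 3   # assumed acceptable delay between event and CRM entry

entries_per_rep_day = Counter((rep, entered) for rep, _, entered in rows)
spikes = [key for key, n in entries_per_rep_day.items() if n >= ENTRY_SPIKE]
late = [(rep, happened, entered) for rep, happened, entered in rows
        if (entered - happened).days > MAX_LAG_DAYS]

print("Possible batch-entry days:", spikes)
print("Records logged late:", len(late))
```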

Toni: Absolutely. And I think the whole data trust thing is a spectrum, right? Let's just be honest about that. You can have really extreme situations and scenarios in both directions, right? You know, as an operator, what I found useful — because I entered a new organization.

Toni: They were already at 15 million. And I looked at the data and basically decided I can't trust any of that stuff, because of some of the issues that you just mentioned — people were just doing their own thing, right? But, number one, what I started to gravitate towards is: which metrics [00:23:00] can I actually trust, right?

Toni: And what then happens is you end up trusting revenue, right? Because, hey, this is a deal signed, this is money coming through the door. I don't care who entered what data — this is the real thing. And then from there, you — at least I did — start to implement, step by step, what could be a thing that I trust further in the funnel, right?

Toni: And one other thing that I trusted was the demo request, because that was systemic, right? That was in HubSpot, in the CRM — that couldn't be screwed with. What was screwed with a lot was, for example, opportunity generation, right? But now you have lead generation you can trust.

Toni: Closed-won business you can trust — and mind you, this was like a 15-day sales cycle. So there's a lot of stuff already here that you could trust, but it wasn't quite done yet, right? And then, number two, what I learned is to really see it as a directionally correct data set. So not going in with "everything needs to be 100 percent accurate for me to have an intelligent conversation [00:24:00] about it," but basically getting to the point of saying: you know what, I don't know whether it's 100 or 120 MQLs, but I know for a fact that it's probably a 20 percent increase from one day to the next, or from one month to the next.

Toni: Right. And let's have a conversation about that. Because what we actually need is a much larger increase — that's what the plan says — and we only got an increase of 20 percent, regardless of whether this is a hundred, ninety-nine, or a hundred and two MQLs. Directionally, we didn't get the increase we needed, right?

Toni: This is something where, in your framework — which I don't disagree with — even if you don't score a hundred percent on accuracy, you could still work with that data to get somewhere and have an intelligent conversation, and then still acknowledge that you're not fully accurate and start working to fix that too.

Toni: You know what I mean? I think it's important for [00:25:00] folks not to declare data bankruptcy because they're not a hundred percent on all of these items. I think you can do stuff with the data despite not ticking all the boxes.
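[Editor's note: a toy illustration of the "directionally correct" point — even if the absolute MQL counts are off by a handful of records, the month-over-month change is still a usable signal against plan. The counts and the plan target below are made up.]

```python
# Hypothetical MQL counts; assume each could be off by a few records.
mql_last_month, mql_this_month = 100, 120
planned_growth = 0.40  # assumed plan: 40% month-over-month increase

actual_growth = (mql_this_month - mql_last_month) / mql_last_month
print(f"Actual growth: {actual_growth:.0%}, plan: {planned_growth:.0%}")

# Even with a ~5% error margin on the counts, the conclusion holds:
# we grew roughly 20%, the plan asked for far more, so the gap is real.
if actual_growth < planned_growth:
    print("Directionally behind plan - investigate pipeline generation.")
```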

Raul: Yes, a hundred percent. Absolutely. So that's why I say there are levels to this, right? And the first level above level zero — which is no aspiration at all — is: okay, this is not perfect. Let's try to understand what's not perfect, and work with what we can at least work with a hundred percent.

Raul: And another example that is very operationally and executionally realistic for most companies is at least comparing apples to apples. So look at the apple pairs that you can compare. What that means is: well, I can probably compare Toni's performance to Toni's performance from last month, or to Toni's performance from last year in the same month. Probably, right? Because at least this one guy probably hasn't gotten much better or much worse at data entry, unless something really happened, right?

Raul: So that's one thing you could do. [00:26:00] Or you could compare, as you said, overall unfuckwithable data that's produced, such as revenue, or such as the leads that come out. A hundred percent agree, right? Now, this is where the second part of that comes in. So the framework that I use is ATAA.

Raul: Accuracy? Okay, we talked about that.

ATAA: Transparency
---

Raul: Transparency. And this is really important: okay, where is that data being looked at? How is it being made visible, and how understandable is that data point? Right. And this is where I think a lot of companies don't really think about it.

Raul: Like, who should be doing what and looking at this at what point? There's a whole spectrum, from "oh, we go free-for-all Salesforce, where every salesperson can look at everything and build every report, and if they spend two hours a day building reports, better for them," up to "oh no, no, no."

Raul: "We have a highly designed revenue ops team that builds every single report you could ever look at and makes sure everything is a hundred percent accurate." There's a [00:27:00] continuum there, but at some point you should be designing this to some extent — thinking about who should be looking at what, and where it should be looked at.

Raul: Right. And then from there: what conclusions should be drawn from that? This is where you went before. And I think one of the mistakes that I see — it might seem semantic, but I think it makes a big difference in actually solving this problem — is that a lot of problems come from trying to solve this by looking at the data first and then thinking: what problems can we solve with that?

Raul: I think it's a much easier and more intuitive approach to look at the problems we're trying to solve, the questions we're actually asking — which is sort of what you did with your "well, what can I look at right now?" — and then go in search of that data. So what I mean with that is: rather than just producing random reports, throwing them at people, and seeing what conclusions they come up with, think about the questions that you actually ask yourself and your leadership team, right?

Raul: So: hey, how can we make the salespeople better? Oh, what would be interesting [00:28:00] for that is how individual salespeople have developed. Okay, how can we find that out? That is a much better way to go into it.

Toni: So meaning, right — the opposite is basically called data mining. You look at a bunch of data and you try to get insights from that. Versus: hey, we have a use case that we could consult data for — and it's going to be a very tiny fraction only — and that might help us figure out that use case, right?

Toni: So if you have the example of "how can we make our sales reps better," it's about figuring out: well, how do you even measure that? What is it that you need for this? And yes, one data point might be opportunities created with an accurate timestamp, conversion rates, sales cycles, all of that jazz.

Toni: But it might also just be average deal size per rep, right? That might already give you some insights right there that you can start executing on. And then the other piece — people shouldn't forget — yes, sure, you have funnel data and CAC and all those metrics, but ultimately you also have a lot [00:29:00] of quote-unquote data in your call recordings.

Toni: Just listen to what these folks are saying. You don't need to be a data scientist to listen to a call and go: yeah, it's pretty obvious why this rep is better than the other, right? So I think all of these things also count towards being data driven, to a degree, right?

Toni: The last one is a little bit anecdotal, but you can use some of the data points you have access to, to help guide you towards, you know, whom to listen to, etc.

Toni: But let's go to the other two A's that you have.

Raul: Yeah, you were just talking about listening in on sales calls, right? So one of the things that I have also done quite a lot — and I think you have too — is what we call sales audits, or commercial revenue engine checks, or benchmarks, same thing. Just look at stuff and do a diagnostic on things.

Raul: Right. And I think what I've seen work really well is doing an analysis on two aspects. One is the quantitative analysis, which is the revenue architecture, bowtie [00:30:00] approach: trying to look at every number and the conversion rates, and trying to formulate the whole revenue formula.

Raul: But another one that I always employed is qualitative analysis. And basically this means just going through deals, talking to salespeople about what has worked and what has not worked, right? That is what you would look at as soft data — which, again, only works if they did save those deals in the past, right?

Raul: So you would go through the system — and that takes a little bit of time — and try to identify deals: hey, this deal is quite large, how did that happen? And these deals fell through even though they were really promising. And this one has been stuck in the pipeline for very long. And when you do that with a bunch of salespeople over a period of time, you find out trends within that company.

Raul: And those alone are not enough, but if you match them with the actual data, you get quite a holistic picture of what's going on in that company, right? So I know we're talking about data a lot here, but data to me is much more than just numbers. It means what has happened in the past, and what you can understand and yield from that.

Toni: And I think those are [00:31:00] almost the two different ways to look at this, right? I use it a lot to help me narrow down my fact finding, actually. I use this a lot when running companies on the commercial side, to spot in which area something is not going as it should.

Toni: Which then helped me figure out whom to ping, how to investigate, what the project could be to look into this thing. Right. But also, when we sometimes do a little bit of a consulting gig on top of a rollout here at Growblocks, just looking at the numbers sometimes helps us, in like half an hour,

Toni: to figure out what's fucked in that organization, right? And then, you know, it's sometimes funny — we set up the call, go through this, and show our findings, and people are floored: how did you figure all of that stuff out just by looking at those numbers?

Toni: And it's pretty straightforward, actually. It's a very efficient way, I feel — both in consulting, but also when you run your own organization — to narrow down where the [00:32:00] problem areas are. And then, once you double-click on that problem set, you will have personal circumstances.

Toni: Someone is getting a divorce. You will have: oh, there's a bunch of recordings to listen to. You will have: oh, this market is maybe not mature enough. There might be so many other data sources of interest that you want to scan through now in order to narrow down your response to that.

Toni: So it's a little bit of a mix between the data mining, fact-finding kind of approach — look at the data first — and then going really anecdotal and qualitative in your next approach, listening to the stuff.

Raul: And this is what you should aspire to. I think this is a sign that you obviously have quite a good understanding of this. It's sort of like a doctor: if I walk in with a broken leg, it will take them three seconds to figure out I have a broken leg.

Raul: Right. Because, you know, the average person probably would figure that out too, but they know what to look for. If you know what to look for and how to look at it, you will figure it out in three seconds, right? And this is the point here: this [00:33:00] stuff is not that complicated if you're quite intuitive about it.

Raul: So it merits educating yourself on it.

Toni: What is kind of funny is, you know, we have data validation steps to make sure that what we see and what the client and customer sees is the same thing. But we sometimes do this analysis even before the data validation is done. Because we can, right?

Toni: The data is there, we're looking at it, we're perusing the data, so to speak, to prepare. And we know, hey, there's probably an error margin of 5 percent in the dataset, something might not be right. But the crazy thing is, you can make those adjustments — that insight you can get even from a slightly off dataset, because most of the pieces are systemic.

Toni: And that kind of goes back to my point.

Toni: So when your sales ops guy or girl comes up with a QBR and presents it to you, and you find one number that's off, you don't need to storm out of the meeting and say: well, one number is off,

Toni: everything is off, we don't need this thing anymore. You can keep sitting there and [00:34:00] go: okay, there are some pretty crazy insights here that we probably should follow, right? Rant over on that one specifically. But I also sometimes worry — I know it for a fact, actually.

Toni: Sometimes people just use this data quality issue as a defense mechanism to get out of a sticky situation as an executive. It's like: oh, someone is coming with bad news, and they're underpinning the argument with data that makes their case extremely strong. Let me just question the data.

Toni: It's like you have a witness on the witness stand and they say something really terrible about you. And then what do you do? You can't take away what the witness has said, because they said it already. But you can say: well, he also lied before, and he's a convicted felon, and you shouldn't be trusting what that person is saying.

Toni: That's essentially what you as an executive are doing at that point, right? And I get that this is the strategy, but I just encourage both junior and senior people to try and [00:35:00] see through this and call it out. And whoever is a little bit defensive about what quote-unquote the witness is saying — don't be. Just embrace it, lean into it, instead of trying to lean out of it.

Raul: And so to round that up — because I think that also goes into what you were saying — what do you now do with that data? Right? Really short and simple: your data is accurate, or as accurate as can be; there are levels to that. Your data is transparent: it's being shown to the right people, and they have an idea of what to do with it.

ATAA: Analyzability and Actionability
---

Raul: Okay, there are levels to that. The third one is analyzability — a very unsexy term, as I promised — for: okay, where does that come from, why does this happen? So, we've observed that the conversion rate is going down for Toni. Analyzability: do we have an idea why? Oh, actually, there are some numbers correlated with that.

Raul: His calls went down. Or maybe we even go deeper within that: we have Gong, and we figured out that in the calls he was just talking about himself and his, whatever, personal problems a lot right now, and that seems to be rubbing people [00:36:00] the wrong way. And so you have an idea of why things could be — maybe it could even be more complex than that.

Raul: That was a very simplistic example. That's analyzability.

Raul: And then there's actionability, which is: is there a way for us to act, after we have an idea of why the Toni thing happened? Now we can say: hey, Toni, let's try to get your spirits up again, let's do a little bit of training, let's give you some vacation, whatever.

Raul: Then revisit that number and figure out whether that caused an improvement or not, right? And so there you have the loop: if the data is accurate, if it's transparent — that's the T — if you can analyze it, meaning you understand why numbers happen the way they happen, then you can act on them. And after you've acted on them — that's the actionability part — you figure out whether it actually helped or not, right?

Raul: And that's the framework that I've employed with a lot of companies. And then — we're not going to go into that at all — there are levels to that. I've figured out five to six concrete levels that you can go through with each of these, and you should just try to get to the next level when you can.

Raul: There's merit to [00:37:00] each of them. So when you're at level zero on accuracy, transparency, actionability, analyzability — still try to get to level one, you're going to be better off. When you're at level one, try to get to level two. And that's really how I would go about it.
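[Editor's note: one way to picture the ATAA loop and its maturity levels — score each metric on the four dimensions and work on the weakest one first. This is only an illustrative sketch of the idea; the metric names and level numbers are made-up examples, not Raul's actual level definitions.]

```python
# Hypothetical ATAA scorecard: level 0-5 per dimension, per metric.
scorecard = {
    "mql_volume":     {"accuracy": 3, "transparency": 2, "analyzability": 1, "actionability": 1},
    "win_rate":       {"accuracy": 1, "transparency": 1, "analyzability": 0, "actionability": 0},
    "pipeline_value": {"accuracy": 2, "transparency": 3, "analyzability": 2, "actionability": 1},
}

for metric, levels in scorecard.items():
    weakest = min(levels, key=levels.get)
    print(f"{metric}: work on {weakest} next (currently level {levels[weakest]})")
```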

Toni: Interesting. So if anyone needs help with that, I think Raul is up for taking some calls on that topic. And I'm sure once Raul has fixed it, that might become a great Growblocks customer right there. But for everyone else struggling with data: I think it's really important — don't give up, don't declare data bankruptcy. Rather, slim down, start small, start early, as early as you can, but maybe even smaller.

Toni: And then really realize that data trust and data quality is a spectrum. And if you were to slice it into different dimensions, it's the ATAA framework: it's accuracy, it's transparency, it's analyzability — that's the why — and then the actionability: can you do something about it?

Toni: Right. And thank you everyone for [00:38:00] listening, hearing us rant about some of our data issues — a super nerdy topic, but I hope we made it a little bit more palatable for everyone. And thank you, Raul, for doing this, and thanks everyone else for listening. Have a good one. Bye.

Raul: Love it. Thanks, Toni.