The Revenue Formula

That's not the number I have. How did you calculate it? What field did you use?! Data promises amazing insights and great decision-making.

But more often than not, we don't trust the data. That's why we talk about data trust with Lindsay.

  • (00:00) - Introduction
  • (02:24) - Data Trust Issues
  • (03:56) - Common Data Quality Problems
  • (11:27) - Impact of Data Issues on Business
  • (14:57) - Manual Data Correction Challenges
  • (19:46) - Analyzing Opportunity Age in Salesforce
  • (21:10) - Handling Imperfections in Data
  • (22:32) - Outlier Management with Advanced Technology
  • (25:10) - Building Trust in Forecast Numbers
  • (27:45) - Creating a Common Language Across Departments
  • (38:51) - Eliminating Friction in Go-to-Market Teams
  • (41:44) - The Future of Data Trust and Technology

***
Connect with us

🔔 LinkedIn: Toni / Mikkel
✉️ Newsletter: https://www.revletter.io/
📺 Watch: https://www.youtube.com/@growblocks
💬 Contact: podcast@growblocks.com


Creators & Guests

Host
Mikkel Plaehn
Head of Demand at Growblocks
Host
Toni Hohlbein
CEO & Co-founder at Growblocks
Guest
Lindsay Cordell
Helps Make Go-to-Market Simple | Experienced GTM-er | Industry Analyst | Founding Partner

What is The Revenue Formula?

This podcast is about scaling tech startups.

Hosted by Toni Hohlbein & Mikkel Plaehn, together they look at the full funnel.

With a combined 20 years of experience in B2B SaaS and 3 exits, they discuss growing pains, challenges and opportunities they’ve faced. Whether you're working in RevOps, sales, operations, finance or marketing - if you care about revenue, you'll care about this podcast.

If there’s one thing they hate, it’s talk. We know, it’s a bit of an oxymoron. But execution and focus are the key - that’s why each episode is designed to give 1-2 very concrete takeaways.

[00:00:00] Toni: Does your data lead to frequent fistfights? Is everyone complaining about bad data? Do you maybe have a data trust problem? Well, you are not alone.
[00:00:11] Lindsay: I've never met in my entire career, a RevOps person that felt really confident and happy about their current state of their data.
[00:00:19] Toni: that's Lindsay Cordell from GTM Partners and she's heard all of the excuses.
[00:00:24] Lindsay: When people say that the data is not good, what they're talking about is the reflection inside of your CRM. What they actually mean when they say, I don't trust the data is, I don't trust the lens of Salesforce to deal with my shenanigany behavior that's happening behind the scenes.
[00:00:42] Toni: In today's episode of the Revenue Formula, we talk with Lindsay about what's behind the problem, what's at stake here, and what you should do to finally fix it. Enjoy.
[00:00:57] Mikkel: Now, literally just before heading in here, I was texting my wife, because someone has to pick up the kids today. Usually Sundays is when we coordinate calendars, in the evening when the kids are put to bed, and then it goes into the calendar.
[00:01:12] But we decided to do it over breakfast and the last time we did it. We both completely messed it up. So if, you know, I dropped off the kids, she would assume, well, then you're also picking them up. And one day she just had a heart attack at work, realizing she was supposed to pick them up like now. So I just wrote her saying, hey, you know, heading into the studio and then I'm going to head home.
[00:01:33] Don't forget, kids. And I was like, oh no, what if she writes, can you go pick them up? But fortunately. She's on top of it. Okay. Very good. She's very good. To be fair, she, she messes up way less than I do. Wait, and it's probably no surprise to you, I guess.
[00:01:50] Toni: No, but good that we have overtalked on the show how your wife is messing up and not you.
[00:01:56] Yeah,
[00:01:57] Mikkel: let's not get into
[00:01:57] Toni: that now. It's like a full
[00:01:58] Mikkel: on 40 minute. This is going to come back. And then there was the time when I hope, God, I hope she doesn't listen to this episode. Also, the thing is, she works with data governance. This is like, if she was to listen to one episode, honey, if you're listening, I love you.
[00:02:12] Toni: Well, so how are you going to Shanghai this now into what we're going to talk about?
[00:02:15] Mikkel: Well, I told you, she works with data governance. And that's really dealing with ensuring you have great data that's accessible. That's correct. That you can work with.
[00:02:24] We have Lindsay with us here. You also probably have heard this quite a bit, but folks, they have a lot of trust issues when it comes to data.
[00:02:33] Lindsay: Oh gosh, yes. 100%. Every, I've never met in my entire career, a RevOps person that felt really confident and happy about their current state of their data, especially like that first 6 to 9 months in an organization. It's just, there's no, there's no trust anywhere for any of them. And it's always, it's always a little heartbreaking to hear that.
[00:02:55] And yet dashboards start to get created and you work your way through it. But yes, it's very, it's, I'd say it's the most common thing I've ever heard from a RevOps person is like, it's the, the data's not good.
[00:03:06] Mikkel: Yeah. And it made me think, Toni once wrote a post calling out some of those RevOpsers. It was like, don't be such a data Dobby. Don't, don't be so apologetic over this stuff. I mean, what does it even mean that they have data quality issues? What's, like, the most common thing here?
[00:03:23] What is, what is it that triggers this issue actually? How do you, how do you see it in practice?
[00:03:27] Toni: Absolutely. Let's, let's try and break this apart a little bit, because from my perspective, you know, data trust and data issues is like a massive topic with very specific subcategories to it, actually.
[00:03:41] So, so I think today we probably will have the chance to kind of dive into some of those areas and try and figure out how we can tick them off. But, but Lindsay, I mean, when, when someone is talking about data trust, what's your, you know, how do you, how do you try and solve that problem? How do you kind of break it apart?
[00:03:54] Lindsay: Yeah, so I find that
[00:03:56] the data issues come from two primary sources. Now, typically, RevOps is hyper-focused on the sales cycle metrics to begin with. That's where almost every RevOps person starts their story. There's of course marketing, and lead metrics, and customer metrics, and all kinds of things that we will eventually tackle, but usually we're starting at the revenue
[00:04:17] conversion cycle, so that tends to be where the AEs are. And typically, it's related to the fact that there's no trust that AEs are following the data input processes properly. And so, there's concerns about everything from, like: Do I have all the deals? Do I have AEs that don't put deals into the system until they're sure they're gonna happen?
[00:04:37] And therefore their win rates look wrong. Or maybe every AE doesn't have the same definition of their stages. So you can't trust the stages, you can't trust the age, you can't trust whether or not the number of opportunities is accurate, you can't trust the lead source conversion.
[00:04:56] So it becomes this spinning-out-of-control feeling of: if nobody was entering the data in a consistent way, then all data must inherently be bad. So that's part one. Part two is usually that whoever was running the CRM was maybe doing it to support internal processes. So for example, I've had one client where the finance team ran the Salesforce instance.
[00:05:23] And one of the things that's pretty common in Salesforce backends is it needs to talk to a biller at some point to produce the invoice, so it can be sent to your clients in an automated way. However, that can be really complicated. I've had situations where I've had three pieces of technology that went from the opportunity, to an intermediary, to a second intermediary, and all the way into NetSuite to get the billing to work properly.
[00:05:45] So the finance team had configured the CRM to work for the biller, not necessarily to properly state what's happening from a revenue engine perspective. So the other reason is because there will be internal paper processes that are extremely complicated, and sometimes they'll use the opportunity object to reflect, like: we went to contract, and then we went into legal, and then they went into arbitration, and then finally the contract.
[00:06:13] So, they're basically tracking post-deal activities, because maybe the revenue doesn't happen until after a really long period of negotiation. So, it becomes intermingled with paper process, but because it's officially pre-revenue, it sits in the opportunity object. So, I've seen opportunity objects with, like, 18 stages, and only three of them had anything to do with opportunities; the rest was paper process and negotiation management that was being tracked.
[00:06:42] Toni: What about the typical fun one? We're joking about this internally, also when we talk to customers: you either have no MQL definition, or five. There's usually not one single one. Do you see this also happening across the board? So if you go away from just the sales process, so to speak, and broaden out, people struggle to define what is an active customer, by the way, and that's an interesting conversation. But those kind of other metrics that seem self-explanatory, but are still actually not fully defined, and therefore you're raising trust issues when the CMO is talking to the VP of sales and they just don't have the same numbers.
[00:07:24] Lindsay: Oh yeah, absolutely. SQL, MQL definitions, lead source. And usually what I find is that that tends to happen for two key reasons. The number one reason, in my opinion, that things go wrong in a backend, apart from the operational things, is that the compensation process is not being properly defined, and so people are basically using the CRM to game the system.
[00:07:53] I had this happen in my very first job in RevOps, actually. I was going through all of the different stages and trying to understand why marketing was doing so poorly when, you know, I was watching marketing. I mean, this was a marketing company. They were very good marketers, very clever, very smart. But the reality was, marketing didn't get paid for their opportunities, but the SDRs did. And so marketing didn't care if they got credit or didn't get credit, but they made sure that the SDR got the credit they deserved, and that person would get their full commission check, and then they'd all go out for drinks on Friday.
[00:08:25] So, like, the compensation management can really mess with it, and if you change it around a lot, it can really mess with the consistency of the metrics being tracked. Because people are kind to their co-workers, and they wanted these SDRs to get their 10 meetings that they had to book. And, you know, it didn't matter, because the marketing team wasn't being measured on MQLs.
[00:08:45] So it didn't matter to them either way. It was really interesting to watch that whole dynamic happen. So that's when people were honestly dishonest in the system and messing with it on purpose. And then oftentimes, right, it's just, you know, people don't know how to use the campaign object. Or maybe they had a MarkOps person that did care about the campaign object, or maybe they switched from Marketo to Pardot, and then everything didn't change over properly, or everything was restated in another way.
[00:09:12] So process changes in the tracking and marketing can also cause those issues.
[00:09:20] Toni: In a previous company, he was very specific about figuring out where his opportunities were. So there was always, kind of on the month end, this tracking exercise. But it led actually to a second one, which was, you know, then companies come up with rules of engagement for the compensation, right?
[00:09:39] Kind of, oh, you know, this one was touched 60 days ago by the SDR, so therefore it's an outbound one. And I think this is helpful for the compensation, but the internal way of mapping, you know, who gets paid and who doesn't, doesn't always match how this piece of revenue was actually acquired.
[00:09:58] Right. So people mix those two realities up. In many cases it's overlapping, it's the same thing, but in some cases it just isn't, right? So the typical issue is: maybe this was a nurtured lead that has been with you for a while. They got one really shitty email from an SDR, and suddenly, three weeks later, the inbound rules of engagement say this was an outbound deal. But we all know it was probably not that email from that SDR that turned it, right?
[00:10:28] So I think you know, basically
[00:10:29] Mikkel: you admit it finally,
[00:10:30] Toni: Hey, I just want to make this show interesting. Okay, Mikkel. No, but those are kind of the things where, further to your point, those incentives and process changes are messing up the data that you actually have in the system.
[00:10:44] Lindsay: completely. And then like, I have an ROE template that I start all my clients with if they don't have one. It's four and a half pages long. So it's all of the different things that could possibly happen. I spent half of my time, every time we build comp plans, like the first half is building it. The second half is trying to figure out all of the different loopholes that I have created with my bad comp plan.
[00:11:08] And trying to pressure test it for all the things that could have negative impacts on the revenue engine that I wasn't anticipating, because of the way I tried to pay people and bonus people for good behavior.
[00:11:20] Mikkel: So, I mean, we listed out quite a few problems and I'm sure we're going to double click on, on some of them in a second, but.
[00:11:27] What's the problem, actually? What's the big issue with having some of these problems? Couldn't you just say, Oh, well, Greg, our AE forgot to create a couple of opportunities. So it's not 100%.
[00:11:38] But he did put in the majority. Like what's the problem here that you've seen happen for companies when the data is not, you know, when you don't trust it, there's quality issues?
[00:11:48] Lindsay: Yeah. So, sadly, which is kind of a bummer in this moment, but the problem is people. It's people that are making decisions about what they're going to enter, or what stage they're going to use, or, you know, how they're going to categorize the opportunity or revenue being built. So when I think of it: if we leave too much space between what the tech is intended to provide us and what the people are autonomously and individually allowed to do, that's when we're starting to create more risk for inaccurate data.
[00:12:23] And so, like, to start at the top of it, whenever I'm walking in, one of the first things I do at any organization is get to know the people first. First of all, I do nothing in averages. I do everything on a person-by-person basis, and I look at opportunity cycle behavior for each individual on the team, and then I work my way backwards to: okay, so this one seems to have much shorter sales cycles, and they somehow hop from discovery to proposal in under three days.
[00:12:54] Like, that's a good indication that they're not entering those opportunities that they're working. And then there's usually enough of a data trail to go back and be like, okay, well, you know, what's done is done. I'm not going to worry about that. We're not going to fix anything. But what I do know is I'm going to treat that AE's data set as an outlier, and not use it in my average opportunity age, or length, or velocity, or whatever you want to call it. Because I can already tell, based on a couple of conversations with them and by watching the opportunity stage progression, that something here is inconsistent with the rest of the crew and inconsistent, quite frankly, with what you would expect.
[00:13:33] And then I work my way into kind of figuring out which pieces of the data I trust, and which pieces of the data are really going to mess up my averages. I pull out my outliers, and then start from good-ish data versus the known damaged data, if at all possible.
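The per-rep triage Lindsay describes, flagging reps whose stage hops are implausibly fast and excluding them before computing averages, can be sketched in a few lines. This is a minimal illustration, not her actual workflow: the records, owner names, and the three-day plausibility threshold are all assumptions.

```python
from statistics import mean, median

# Hypothetical opportunity records; owners, thresholds, and day counts
# are illustrative, not from the episode.
opps = [
    {"owner": "Ana",  "disc_to_prop": 14, "cycle_days": 62},
    {"owner": "Ana",  "disc_to_prop": 9,  "cycle_days": 55},
    {"owner": "Greg", "disc_to_prop": 2,  "cycle_days": 21},  # entered late
    {"owner": "Greg", "disc_to_prop": 1,  "cycle_days": 18},
    {"owner": "Ben",  "disc_to_prop": 12, "cycle_days": 70},
]

MIN_PLAUSIBLE_HOP = 3  # assumed floor for discovery-to-proposal, in days

def suspect_owners(opps):
    """Owners whose typical discovery-to-proposal hop is implausibly fast,
    suggesting they only enter deals once they look likely to close."""
    by_owner = {}
    for o in opps:
        by_owner.setdefault(o["owner"], []).append(o["disc_to_prop"])
    return {owner for owner, hops in by_owner.items()
            if median(hops) < MIN_PLAUSIBLE_HOP}

# Compute the average cycle length from the remaining, trusted records only.
outliers = suspect_owners(opps)
clean = [o for o in opps if o["owner"] not in outliers]
avg_cycle_days = mean(o["cycle_days"] for o in clean)
```

With these made-up numbers, the fast-hopping rep is excluded and the average cycle is computed from the remaining three records.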
[00:13:51] Toni: Yeah. And I think a lot of people listening, well, we have a bunch of RevOps folks listening, so I think their mind will be different. But you could just say, like: sure, then, you know, we can't really monitor Greg's funnel necessarily. So what's the big deal, right?
[00:14:09] The big deal is that, you know, one level up, you then need to defend specific numbers, and at some point there will always be pressure on the numbers, right? And then someone just comes in: well, but Greg doesn't do it like this and like that, so really this number is not correct.
[00:14:25] And suddenly everything just falls flat behind it, right? At least this is what I ran into a lot. And suddenly, because you have one bad actor, or multiple, the whole data conversation just stops. You cannot use it anymore as an argument, right? And I'm not sure if you're seeing the same thing when you walk into those organizations, but isn't this what those organizations have been robbing themselves of? Like: hey, actually, we can't use any of the data, because probably one piece is wrong.
[00:14:55] How do you see that Lindsay?
[00:14:57] Lindsay: So I think when people say to you that the data is not good, usually what they're talking about is that the reflection inside of your CRM can't be trusted. So when your CRM gives you an average days to close for a particular business segment, I think what they actually mean when they say, I don't trust the data, is: I don't trust the lens of Salesforce to deal with my shenanigany behavior that's happening behind the scenes that I'm aware of.
[00:15:25] And so usually what ends up happening is a big report is built, and all the data is extracted. People go in and delete Greg, or any of the bad actors, or any of the, you know, deal cycles that sat in an opportunity stage for 500 days, for goodness knows what reason, three years ago. They will remove that manually, and then they start building manual analyses by themselves, in their silo, about what they think is true.
[00:15:54] But they're still using the data. Like, they started from somewhere. I have the client where the finance person kind of messed up Salesforce because of the way that they wanted to invoice. Basically, they would sell multi-unit software to all kinds of different restaurants.
[00:16:11] It's very cool software. I like them very much. But what they would do in finance is they would break that one opportunity, even if it had 35 units and it was one sales cycle, they would break it into 35 opportunities so that it would generate 35 independent bills on the back end. And so now they couldn't even tell how many units they had sold.
[00:16:30] That's not true, because we knew they had 35 opportunities. And so what we were able to do is, kind of like an archaeologist, you retro out all of the stuff. But what's tough is Salesforce can't speak the truth when that happens. And I think that's what people truly mean when they say it, because they're still looking at the ACV.
[00:16:51] They're still looking at the length of time. They're still looking at that information set. They're just having to extract it and massage it, pre-massage it, before they can put it anywhere.
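The billing-split cleanup Lindsay describes amounts to grouping the fragments back into logical deals. A rough sketch, with made-up accounts, dates, and amounts standing in for the real Salesforce export:

```python
from collections import defaultdict

# Hypothetical billing-split records: one real deal was split into one
# opportunity per unit so the biller could invoice each location
# separately. All names and numbers here are invented for illustration.
split_opps = [
    {"account": "Acme Diners", "close_date": "2024-03-31", "acv": 1200},
    {"account": "Acme Diners", "close_date": "2024-03-31", "acv": 1200},
    {"account": "Acme Diners", "close_date": "2024-03-31", "acv": 1200},
    {"account": "Bistro Co",   "close_date": "2024-02-15", "acv": 900},
]

def regroup_deals(opps):
    """Collapse billing-split opportunities back into one logical deal per
    (account, close_date), recovering unit counts and true deal ACV."""
    deals = defaultdict(lambda: {"units": 0, "acv": 0})
    for o in opps:
        key = (o["account"], o["close_date"])
        deals[key]["units"] += 1
        deals[key]["acv"] += o["acv"]
    return dict(deals)

deals = regroup_deals(split_opps)
```

The grouping key is a guess at what identifies "one sales cycle"; in a real export you would key on whatever field survived the split, such as a parent opportunity or quote ID.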
[00:16:59] Mikkel: Yeah. Yeah. I think, I think I've at least heard and seen the cases where the, the friction is really on the data itself, not what the data says, which is kind of what we're talking about as the problem here. And I know one of the words I at least picked up from you was also this, it also triggers a cover your ass kind of exercise.
[00:17:19] So this spreadsheet you were mentioning, you know, is that going to be a quick task just to get it delivered, or will it actually take quite a bit of time? And I think, especially if you're dealing with a challenge in a quarter, you need speed in order to move fast. A week is actually a lot of time if you're going to make an important decision to shuffle things around.
[00:17:39] And I think sometimes people might, maybe they know it inherently, but it's just a major issue, at least on the data side.
[00:17:46] Lindsay: Oh, that's 100 percent. Like we will get to the right answer, but it will take sometimes days and very expensive resources. So for this poor company that had the now 35 opportunities that mimicked each other now inside of Salesforce for the biller, we had to pull all that data out and it took me, a RevOps go to market consultant, the CRO of this company, and then another RevOps leader.
[00:18:12] It took us eight hours to reverse engineer it: get all the data out, VLOOKUP, clean stuff up to the best of our ability. And then you have to double-check yourself, right? Because what if you've summed on the wrong data set during this massage process? So one pass is not enough. I had a great boss, she worked in ad operations, and she was like: I won't trust a single number the first time it comes out of my VLOOKUPs or my analysis.
[00:18:38] I have to be able to get to the same number through three separate equations before I'm willing to trust it. And so it comes into, like: okay, I think this is my ACV, I think this is my time to sale, but I'm not sure. How can I get at that data from a different angle, with a different analysis, to validate all of the things I believe to be true?
[00:18:59] And that's the part that takes a lot of time. It's making sure you're right. And it's very, you know, they want, people want their answers right. They're usually asking you because they're about to make a call. They're about to hire salespeople. They're about to fire salespeople. They're trying to make a call and they don't want to hear, Look, I need a week to reverse engineer this in three different directions to be sure that we've got the right information in front of the business.
[00:19:23] Toni: Crazy. Did I understand this correctly? You basically need to create proof that whatever number you came out with is in fact the number. So, like, one is a VLOOKUP, the other one is you went through and summed everything one by one by one. I mean, can I imagine it like this? Or, you know, how did it work?
[00:19:42] I'm just, I'm just interested about this specific side story.
[00:19:46] Lindsay: Yeah, so essentially what I would end up doing is, so let's say we want to look at like opportunity age. So the first thing I would do, right, I want to see what Salesforce just says all on its own without any help and support. So I'm looking at opportunity ages, but if I know there's a failed process somewhere in the system, the first thing I'm doing is pulling the data out and then trying to correct for the failed process.
[00:20:07] And so what I would do is I would do the VLOOKUPs across each of the account IDs, but then I would probably dive a little bit deeper into, like, how that AE performs, and I would run the analysis with an extra slicer in it, and I would make sure that we got to the same place both ways.
[00:20:28] And then the last way that I would look at it, honestly, I would go back and I would try and figure out from a first touch, like, I would just look at their email. And say, okay, I think this is what the age is, but I've got, I've got sales loft data that tells me when these communiques really got started.
[00:20:45] Is this anywhere in the ballpark for my average age? So I would primarily use the Salesforce system, add slicers, make sure I didn't make a math boo-boo. But then also, I would try and find a secondary system that could show me an analogous measurement that would help me validate what I'm trying to prove.
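The cross-check Lindsay describes, computing the same metric from the CRM and from an engagement tool and trusting it only when the two roughly agree, might look like this. All IDs, dates, and the seven-day tolerance are illustrative assumptions:

```python
from datetime import date

# Hypothetical records for the same two opportunities from two systems.
crm = {  # opp_id -> (created, closed) per the CRM
    "006A": (date(2024, 1, 10), date(2024, 3, 10)),
    "006B": (date(2024, 2, 1),  date(2024, 3, 15)),
}
first_touch = {  # opp_id -> first-touch date per the engagement tool
    "006A": date(2024, 1, 8),
    "006B": date(2024, 1, 28),
}

def avg_age_from_crm(crm):
    """Average opportunity age using only the CRM's own dates."""
    ages = [(closed - created).days for created, closed in crm.values()]
    return sum(ages) / len(ages)

def avg_age_from_first_touch(crm, first_touch):
    """Same metric, anchored on the engagement tool's first touch instead."""
    ages = [(crm[oid][1] - first_touch[oid]).days for oid in crm]
    return sum(ages) / len(ages)

def in_ballpark(a, b, tolerance_days=7):
    """Only trust the metric once independent computations roughly agree."""
    return abs(a - b) <= tolerance_days
```

The point is not either number on its own, but that two independently sourced computations land close enough to each other before anyone acts on them.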
[00:21:03] Toni: So sometimes, for the data entry, people are like: AI is going to fix it, right? Someone's going to listen to your call, put it in all the right places.
[00:21:10] But but pretty much, and this is kind of the status quo and probably going to be the status quo for a long time is basically saying like, Hey, we need to work with the imperfections that are in the data period.
[00:21:20] We will have to work with this, right? And we will still need to get something out. And in order to achieve this, all of those different workarounds, basically, that you just mentioned there, right? And some of that sounds pretty crazy. Let's see if we keep all of these nerd pieces in there, because I think it's going to be super interesting.
[00:21:36] But do you think this is an uncommon way? So maybe not, you know, proving those three things, but do you think that this is how people are trying to massage all the imperfect data into something that someone could be looking at and trusting?
[00:21:52] Lindsay: Yeah, I mean, that's definitely what I mean. We had to decide, this company and I, with our now 35 opportunities. We were given a funding round, we had to hire more salespeople, and I, like, I am very passionate, especially growing up through the sales organization. Like, I cannot hire salespeople if I'm not sure that I can feed them.
[00:22:15] Like, it hurts my heart, it makes me sick to my stomach, I can't do it. So I've gotta get this stuff right. So I'll do everything within my power in the massaging process. However, I don't know that I would trust that to an AI thing straight off the bat.
[00:22:32] However, what I'm loving about some of the technology, like technology that you guys have created, is that there's several solutions out there really handling outliers in an effective way, for me as a RevOps person to say: I know that that 600-day sales cycle opp from two years ago is, like, messing up everything inside of my world.
[00:22:56] Like, all I have to do is tell the system once, and then the system knows, and so does any data that I pull on top of that system. So outlier management is one of the favorite things that, to me, has come from some of this improved technology that we're starting to see out there. And, you know, I could probably hack around some outlier management inside of the Salesforce object if I got really fancy with it.
[00:23:22] Not in the way that these models can, though. And so that's the other side: once we've got an understanding of our outliers, if I train a model, or not me, because that's not my job, but if you all built a solution that allowed us to train models on my business, on what an outlier looks like, and then it started sourcing the outliers for me instead of me having to do it by hand, like, that to me is the nirvana state. Because it's not like we're going to stop them.
[00:23:52] We're not going to stop Greg. Like, I can yell at Greg every day. I can call him every morning: are you working a deal? Please put the opportunity in Salesforce. I could call him every morning, and Greg's still not going to change his behavior, not enough to properly define what's happening in those sales cycles. So I love that I could tell the machine: hey, Greg's weird about his opportunities.
[00:24:14] Tack an extra, you know, average 14 days on there if you need to. And it can start to handle those, and model those, and look for those instances. I mean, that to me is where the future of this type of problem lies.
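A crude stand-in for the automated outlier sourcing Lindsay is describing is a simple statistical fence over cycle lengths; a trained model would replace this hand-written rule. The numbers are illustrative:

```python
from statistics import quantiles

def flag_cycle_outliers(cycle_days, k=1.5):
    """Flag cycle lengths outside Tukey's fences: below Q1 - k*IQR or
    above Q3 + k*IQR. A rule-based sketch, not the model-driven
    detection discussed in the episode."""
    q1, _, q3 = quantiles(cycle_days, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [d for d in cycle_days if d < low or d > high]

# Illustrative cycle lengths; the 600-day deal is the kind of ancient
# opportunity that skews every average it touches.
cycles = [30, 45, 38, 52, 41, 600]
outliers = flag_cycle_outliers(cycles)
```

Once flagged, those records can be excluded from averages or annotated once, which is exactly the "tell the system once" behavior described above.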
[00:24:26] Mikkel: But you know, it's funny, because I've also experienced cases where the data was accurate, but you did have an enterprise deal in there that messed everything up. And you're not just gonna go high-five, we closed an enterprise deal this quarter, now we need to close fewer deals next quarter because our ACV is up. Like, it doesn't work like that. I also just wonder now, like: how do we then go about creating trust in the data?
[00:24:51] Beyond just necessarily going out and procuring a tool, what are some of the steps, or the basics? I'm sure you've worked with so many clients, and you've seen a difference where, okay, they got complete control over this, they trust the data, they know it. What is different about those types of companies?
[00:25:06] How do they go about it to, to create that trust?
[00:25:10] Lindsay: So what I find is the first thing you want to establish is trust in the forecast numbers. So I would be fighting for forecast accuracy, number one. Once you start to approach some really solid forecasting numbers, the trust inherently starts to build, right? Because the idea here is that's the last thing that you're tracking.
[00:25:34] And so if you can get that number right, they're going to believe more of the upstream information. And, you know, 2022 messed with a lot of RevOps people's worlds, but my goal was to be within 2 percent of forecast. And one of my best friends, who's just a genius in revenue operations, if she's less than 1 percent off, she will do full-on bets.
[00:25:58] She will bet her CRO 500 bucks that she is going to be more accurate than they are going to be on their analysis. And it's really about fine-tuning your forecast percentages by stage. It's about finding outliers and removing them during the quarter, not after the quarter has already happened.
[00:26:19] So my big advice to anybody who's trying to build that data trust again for the first time is: fight for that forecast number. And a lot of that comes from stabilizing what sales is inputting into the opportunity cycle information and trying to normalize stage definitions. So if you can get those two things humming, the rest of the trust will start to follow. So that's forecasting revenue; forecasting pipeline is the other side of this story.
[00:26:49] So having a great methodology that's unique to your business for starting to be able to predict where your pipeline is going to come from, again, huge, huge way of building a lot of trust with marketing, and then people will start to trust more of this information coming from you.
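The stage-percentage fine-tuning Lindsay mentions is essentially a weighted-pipeline forecast. A minimal sketch, with hypothetical stage names, weights, and amounts; the weights would be tuned against your own historical win rates per stage:

```python
# Hypothetical stage weights; all names and numbers are illustrative.
STAGE_WEIGHTS = {"discovery": 0.10, "proposal": 0.35, "negotiation": 0.65}

pipeline = [
    {"stage": "discovery",   "amount": 50_000},
    {"stage": "proposal",    "amount": 20_000},
    {"stage": "negotiation", "amount": 40_000},
]

def weighted_forecast(pipeline, weights):
    """Stage-weighted forecast: each open deal contributes its amount
    times the historical conversion rate for its current stage."""
    return sum(o["amount"] * weights[o["stage"]] for o in pipeline)

def forecast_error(forecast, actual):
    """Relative miss versus actuals, the number to drive under 2 percent."""
    return abs(forecast - actual) / actual
```

Tuning then means adjusting the per-stage weights until `forecast_error` against past quarters stays inside the target band.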
[00:27:08] Mikkel: I think that makes sense. And it's also tricky, because I was just reflecting: even when you go department by department, the approach you're going to take is going to vary a lot, right? Because with marketing, maybe there's a challenge of, hey, again, it's back to a definition. And not just on what is an MQL, but what types of MQLs are we committing to bring in to create pipeline?
[00:27:28] Because a low-intent lead is not going to behave like a high-intent lead, right? So there is something there. How do you see establishing a common language and, like, truly understanding how some of the metrics are calculated and run? How important is that dimension of it?
[00:27:43] Lindsay: Oh, that's 100%.
[00:27:45] So the way that you should think about it is: within each department, you need to build trust. So I'm going to be really good at forecasting revenue for my sales team, and I'm going to be really good at predicting pipeline creation for my marketing team. But, like, the best part is those handoffs.
[00:28:04] So if we can create a really strong common language between marketing and sales and help them trust each other, like then, then the data trust is really strong. Because there's, there's this natural distrust between all of the go to market teams. Part of it is because of the way we built our companies.
[00:28:25] And this is a hangover from like, oh, there's a marketing department and they're on the 11th floor. And they have sparkly things. And then there's a ninth floor and that's where the salespeople live. And the CS people, by the way, you're in the basement and the product team maybe is offsite in a completely different building for whatever reason.
[00:28:40] So we have done this to ourselves in corporate America, which is problematic on its own. But if we can create a common language that's based in something that's not opinion but fact, which numbers are, or at least our goal is to make them fact, then they can stop stressing about: oh, is this pipeline that they delivered to me going to allow me as a salesperson to earn my entire paycheck, which is really all I care about.
[00:29:07] I mean, we've created an environment where salespeople fear that they're not going to get paid, and they act accordingly. They get nervous. They ignore what, in their minds, from an opinion perspective, is bad pipeline. They reject the MQLs, and they go out and source their own deals, and then they get mad at marketing for wasting their time and wasting their resources.
[00:29:28] It creates a ton of tension between those teams and so if we can create a common language and really create those common metrics that everyone can believe and trust in, then that friction can start to ease a little bit between those teams.
[00:29:42] Toni: How would you actually go about doing that? Because I think there are a couple of companies out there that have mastered this or achieved this, maybe because they were just built in a different kind of way. But what's your advice to folks?
[00:29:58] How can you get to a place where those numbers can be trusted across different teams?
[00:30:02] Lindsay: Well, I think standardizing the definition of what things are is probably step number one. So, we've got a company right now, they're very interesting. Basically, they have salespeople on the ground in a bunch of different countries, one salesperson per country, and they're responsible for pipeline development and pipeline execution.
[00:30:21] Almost like a contractor, quite frankly, or like a reseller, a white labeler. So what we're doing with them is I need them all to come together, because we can't predict the revenue; the opportunities, the pipeline we're building for them, they're rejecting it.
[00:30:39] And so that's a good indication: if sales is rejecting pipeline, something has gone amiss. So let's start there. Why are they rejecting the pipeline? What is the data they're using? What is the, you know, internal bias they're using? So it's getting it all down on paper: these are all of the things that are causing me to not have trust in what the upstream team is delivering to me.
[00:31:01] It can happen between sales and CS just as easily, right? Sales sells things, CS gets the deal, and they're like, oh my gosh, these people don't know what they're talking about. These salespeople have sold the wrong things. This customer is not even close to our ICP. You start to get those arguments. So getting those definitions right is really, really important.
[00:31:20] Showing everybody how you're going to measure it, and that this is just going to be it. And then the last thing I look at, 100 percent, is: are we incentivizing these teams properly? Am I incentivizing website traffic, even if it's not with money, just by, like, holding it up in our quarterly meetings: look at all the website traffic!
[00:31:42] If our organic website conversions tend to result in ACVs that are half of what any other source type is bringing in, like, that's a big misalignment. So the sourcing of the organic growth, we'll say, in this situation, is already telling us that my salespeople are going to make half as much money on every deal, which inherently means they must work twice as hard.
[00:32:09] I don't know about you guys, but I'd be pretty ticked if I was just told, like, yay, marketing, you've made it twice as hard for sales to do their job. Congratulations. Let's have a round of applause at our next all hands. Anything in the system that is incongruent with efficiency and effectiveness, anything that puts more work on one team than on another:
[00:32:33] find those and eliminate them as quickly as possible. Then, A, the fact that you found them will create a lot of enlightenment for everybody, because chances are they knew they were annoyed, but they didn't totally know why. And then, once we've figured out where the breakpoints are, start to fix them, start to improve them, and show data related to that, and then you'll start to build that trust.
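The source-type ACV gap Lindsay describes is easy to check with a quick grouping. A minimal sketch; the source names, deal sizes, and field names are invented for illustration:

```python
# Sketch: compare average deal size (ACV) by source to spot the kind
# of incentive misalignment Lindsay describes. All data is made up.
from collections import defaultdict

closed_deals = [
    {"source": "organic_web", "acv": 5_000},
    {"source": "organic_web", "acv": 7_000},
    {"source": "outbound",    "acv": 12_000},
    {"source": "outbound",    "acv": 12_000},
]

def avg_acv_by_source(deals):
    totals = defaultdict(lambda: [0, 0])   # source -> [sum, count]
    for d in deals:
        totals[d["source"]][0] += d["acv"]
        totals[d["source"]][1] += 1
    return {src: s / n for src, (s, n) in totals.items()}

averages = avg_acv_by_source(closed_deals)
print(averages)  # {'organic_web': 6000.0, 'outbound': 12000.0}
# Organic ACV is half of outbound: reps must close twice as many
# organic deals to earn the same commission.
```

A one-off check like this is often enough to surface the "twice as hard" problem before it shows up as friction in a QBR.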
[00:32:57] Toni: So I totally agree with that. What's your view on, you know, you will always have some highly data literate folks in the team, right? Operations, revenue operations, FP&A, folks like that. What's your view on trying to enable other leaders to get closer to that?
[00:33:17] Right. And I see this from two directions, just to give you a little bit of my thinking around this. One is, I think we're seeing a new, modern VP of sales grow up, folks that are way more into the data than the other generation has been, right. And then I think there's also potentially a play
[00:33:39] to just, you know, make it easier for folks to consume data, right? It's those two things, actually, that go through my mind. But I feel, you know, in this trust environment that you want to build, you want to have your definitions, you want to be able to lean on the data, meaning people shouldn't be able to call out mistakes super easily.
[00:34:00] Right. Kind of, that kills your meeting. But also, part of that process, I feel, is that some of the people that need to make decisions based on that data need to get a bit more friendly with it. How do you do that? Do you think that's an important piece?
[00:34:14] What's your, what's your perspective on this, Lindsay?
[00:34:16] Lindsay: Yeah, absolutely. So one of the things about us data people: we can get really nerdy really fast, and we love detail. Like, my favorite part of any job is breaking apart the hundred things that are going wrong and then trying to put it back together, like I'm building an engine or something.
[00:34:36] But there are lots of people that aren't built that way at all. And honestly, I'll run into, and I hate to typecast people, but I'll certainly run into just what you're talking about. Whenever I had a sales VP that either worked for Salesforce or was close enough to Salesforce adjacent, I mean, they would pore over Salesforce dashboards.
[00:34:56] Sometimes they would have 140 different dashboards that they would demand my team create for them, so that they could see the information from a million different slices. So what happens, though, is those people have this, like, leg up, if you will, in any conversation that we want to put in data. So by saying, like, oh yeah, you thought you did well?
[00:35:19] Well, I actually ran the numbers last night in my little engine room that I created for myself, and it turns out the mid market segment MQLs were all crap and none of them converted. And so you didn't do well, you failed. And then the poor marketing person was like, I didn't pull the mid market MQL conversion rate last night at midnight.
[00:35:41] So they're already on their heels. So the way that I like to set it up, and this is where RevOps, like, we just have to be the Spocks of our organization, we just have to be the unbiased speaker of truth, and say: look, maybe that did happen to the MQLs last quarter in the mid market. Maybe, maybe not.
[00:36:01] But I can tell you, there are 25 to 40 metrics that any business should be able to watch. It's going to be different for different businesses, so you have to spend that time getting to those definitions and those metrics you should watch. And all the other metrics are ancillary; they're all feeding these top 25 to 40 metrics.
[00:36:24] And so, what I would do in that moment is, first of all, you can't have, you know, the fox guarding the hen house, pointing out what's going on in other people's divisions while also obfuscating the non-sales that happened in the mid market, the Midwest. So that's a problem, right? So we've got to get that completely neutralized.
[00:36:46] And then, as an executive team and as a, you know, second layer executive team, really know what your 40 metrics are going to be and become so well versed in how those 25 numbers come to be that the people with the data minds don't have this weird leg up that they can pull out of a hat because they're good at Salesforce reports, because they happen to be a VP of sales, or they were a seller at Salesforce at some point in their life. So that's the way I would look at that: get to your top 25 to 40, make sure everybody is crystal clear on the standard definition of how you calculate that. And then, if people try and bring other numbers into the conversation to prove why somebody is worse than they are, immediately throw that conversation out of the meeting and say: look, that is an offline topic.
[00:37:36] We can take a look at that slicer in another meeting, but that's not what we're here to talk about. We're talking about our top 25 today. On average, my MQL conversion is correct, so don't poke me; that's not my fault.
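Lindsay's "top 25 to 40 metrics with one standard definition each" can be made literal by keeping the definitions in code, in one place. A minimal sketch; the metric, field names, and data below are all hypothetical:

```python
# Sketch: define each core metric once, in one shared place, so every
# team computes it the same way. Field names are hypothetical.

def mql_to_opp_rate(mqls, opportunities):
    """Standard definition: opps sourced from an MQL / total MQLs."""
    sourced = [o for o in opportunities if o.get("source_mql_id") is not None]
    return len(sourced) / len(mqls) if mqls else 0.0

# The shared registry: the "top 25 to 40" live here, defined once.
METRIC_DEFINITIONS = {
    "mql_to_opp_rate": mql_to_opp_rate,
}

mqls = [{"id": 1}, {"id": 2}, {"id": 3}, {"id": 4}]
opps = [{"source_mql_id": 1}, {"source_mql_id": None}]
print(METRIC_DEFINITIONS["mql_to_opp_rate"](mqls, opps))  # 0.25
```

The point of a registry like this is that there is exactly one implementation per metric, so nobody can show up to the meeting with a privately computed variant of the same number.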
[00:37:51] Toni: So I like that, basically trying to create a level playing field, right? Because, and you're right, I mean, I've done this myself to the marketing guy, literally Mikkel. Whenever we had, like, a QBR, you know: I'm going to get Mikkel. Mikkel walked in, arms crossed, and then sat down.
[00:38:15] And the expression on his face was literally: how are they going to F me today? How's it going to happen?
[00:38:21] Mikkel: So, but we trust each other today. That's the good news.
[00:38:24] Toni: Do,
[00:38:25] Lindsay: Well, and that was really normal; like, we were all taught that. I sat in meetings as RevOps and would watch the CMO and CRO go at each other. And sadly, quite frankly, part of the job of sales is to be really convincing,
[00:38:44] kind of loud, and to really want to get your way.
[00:38:47] And so oftentimes the CRO would come out on top in those meetings.
[00:38:51] But the way I always describe this to my clients is: 100 percent, if there's friction between your go-to-market teams, you are leaving money on the table or you are spending money on things you don't need. Your CAC is being damaged by that friction.
[00:39:06] So if you guys can't sort out where that friction is coming from and resolve it, your CAC is higher than it needs to be, and you've got to be the grown-ups in the room and stop this. It took me years to figure out that that's what was happening, but 100 percent, the best advice I can give anybody is: eliminate friction in your go-to-market teams, because it is costing you money. It might not be 100 percent clear where, but if you follow the money path and do the forensic analysis, you will find where we spent money in marketing that we shouldn't have, or where we converted deals with discounting or added products that we shouldn't have added, or did something to the deal cycle to leave money on the table and cost us money down the road.
[00:39:47] Toni: What I don't understand, and honestly, yes, you know, Growblocks is helping solve some of those issues, but what I do not understand is: BI has been around for 20, 30 years, the spreadsheet has been around for what, a century and a half or something? Why do we... yes, it's a people issue.
[00:40:06] And this is where we started, Lindsay, right? So, sure. But I'm wondering, we're in 2024 and there's very little technology that's helping with some of that stuff, right? Why hasn't this been fixed yet? Maybe that's more my question.
[00:40:20] Why, you know, between BI and Excel and I don't know, analytics, whatever you have, why is this still an ongoing problem?
[00:40:30] Lindsay: Well, some of it is muscle memory of how we've all been taught to work. I mean, sales and marketing were put in competition with one another, the idea being that competition would raise everyone and we would all work harder. So we've been set on a mindset that's problematic, and we just have to work as a modern organization to stop that shenanigans.
[00:40:51] If all these teams aren't working well together, that's a leadership issue, and it needs to be resolved as quickly as possible. So that's number one. Number two, I mean, I always joke that I sat on my first forecasting call when I was six, and that's because my dad was VP of sales
[00:41:06] for an Oracle-like services company, an ISV for Oracle. And the way it worked was, as VP of sales, every Friday they would all get on a call for four hours, and he would write down on yellow ledger paper: Jim, deal number one, 20,000. And that, you know, that was 35 years ago; that's where we started.
[00:41:28] And now here we are, still behaving that way. I run pipeline deals for some of my clients; I am still doing deal-by-deal walk-downs. How do you feel about it? What's going on? And the reality is, the technology has outpaced our ability to change.
[00:41:44] And I dream of a world where it's a headless CRM.
[00:41:47] There's enough information in Gong now that we can do the NLP translation, now that we can see in our deal cycles what's happening and what's working and what's not, systematically, without having to ask the opinion of the AE. We are still asking the opinion of the AE.
[00:42:06] We get health scores from Gong, we get health scores from Salesforce, from Einstein, and, you know, there are a lot of different solutions out there. But when are we going to start to really lean into some of the technology and let it take some of the bias out of these sales cycles? I think that's going to be a really big deal for 2024.
[00:42:26] And, you know, now we've created these systems that can account for these outliers that our human minds instantly saw and knew made the average age wrong. I mean, that's what it was, right? We looked at the average age of these deals, and you're absolutely right, Mikkel, it was an enterprise deal in Q2 that we were chasing for two and a half years, and we finally sold it, and it was, like, 10x any ACV we had ever had before.
[00:42:52] Everyone was super excited. I threw a party. I did. This actually happened. We threw a party. There was confetti. We were pumped. But what it did was damage our understanding of forecasting for the next three quarters, until we figured out how to pull it. And so what we want to do there is let those machines that are seeing those outliers, that are seeing those issues, really do their job.
[00:43:12] Let them tell us what their opinion is. I mean, the humans can still have an opinion, but the fact that I'm still asking salespeople: when do you think this is going to close? What are you going to do? Oh no, your close date is in the past and it's still open. Oh no. When do you think it's going to close now?
[00:43:31] I mean, you were wrong the first three times you input the data there. That's where we're going to start to see, I think, some much stronger trust in our data: when we stop asking the humans what their thoughts are on the data and start letting well-trained models help us remove some of that bias.
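The outlier story above, one 10x enterprise deal wrecking an average-based forecast input, is easy to reproduce. A minimal sketch with invented numbers: the mean deal age breaks, while a median or a simple IQR filter catches the outlier:

```python
# Sketch of why one outlier deal (the 10x, 2.5-year enterprise win)
# skews an average-based metric, and a basic IQR filter to flag it.
# All numbers are invented for illustration.
import statistics

deal_ages_days = [45, 60, 52, 70, 48, 55, 900]   # 900 = the outlier

def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

print(statistics.mean(deal_ages_days))    # ~175.7 days: badly skewed
print(statistics.median(deal_ages_days))  # 55: robust to the outlier
print(iqr_outliers(deal_ages_days))       # [900]
```

This is the simplest version of what Lindsay means by letting the machines find outliers during the quarter: flag the 900-day deal the moment it closes, rather than letting it poison the averages for three quarters.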
[00:43:52] Mikkel: Yeah. I think that's a good place to end it.
[00:43:56] Toni: I had, like, an objection prepared and everything. I was about to go for it, but then I was like, no, actually, let's have it fade out here. I mean, this was the last question already, right? I mean, we don't,
[00:44:08] Mikkel: we don't have, that's it.
[00:44:10] No, not unless someone writes us and we can manage. No, no, it's done. And also it's good, because I probably now need to go and buy pre-apologetic flowers for my wife. Already? Yeah, just to make sure, like, as a hedge, you know, I need to make sure. Wow, okay. Lindsay, this was fantastic. I hope this was helpful for the listeners.
[00:44:29] You and, uh, Go-To-Market Partners definitely also cover this subject in your Substack newsletter. I know there are a few posts already about how to work with data, maybe also how to build some trust. At least in this episode, we dropped a few good tips on how to build trust, so you hopefully have some data to trust.
[00:44:46] Lindsay? Thanks so much for joining. Thanks, Lindsay.
[00:44:49] Lindsay: Thank you. I love Growblocks. I love this podcast. I love what y'all are doing. It was an absolute pleasure and honor to be a part of it, so thank you for having me.
[00:44:57] Toni: Thank you so much. And for everyone that's listening and liked it: hit like, subscribe, follow, love, star, whatever it is. Please click that, because it helps us on our mission. Thank you so much, Lindsay, and thanks, everyone, for listening. Bye bye.