The Proof Point

When we launched our original research report, The Evidence Gap, we thought we’d get 150 downloads in the first 30 days. The actual results? Blew that goal out of the water.

Original research is the secret weapon for cutting through the noise in B2B marketing. In this episode of The Proof Point, we dive into UserEvidence’s Evidence Gap Report with UserEvidence co-founder and CEO Evan Huck, exploring the growing need for credible proof in today’s high-stakes buying environment. The big takeaway? Marketers often think their content is crushing it, but sales teams and buyers see major gaps—especially in relevance and competitive differentiation. With budgets tighter than ever, buyers are looking for more than opinions—they want statistically backed proof that shows a solution works for their unique needs.

Listen to the full episode, and you'll get a behind-the-scenes look at the process of creating the report, including why testing your survey hypothesis is a lifesaver and how to frame questions for actionable insights. Mark and Evan make the case for going beyond generic case studies and testimonials to deliver content that's specific, credible, and trust-building.

Things to listen for:
(00:00) Introduction
(00:28) Why credible proof outshines opinions in B2B content
(01:12) What is the "evidence gap" and why does it matter?
(02:04) The role of original research in modern go-to-market strategies
(05:24) Case studies vs. customer evidence
(08:43) Why marketers overestimate the impact of their content
(13:20) Designing surveys to uncover actionable customer insights
(17:05) How statistical evidence builds trust with skeptical buyers
(23:29) Why specific, relevant evidence is critical for closing deals
(39:37) Rethinking customer marketing: Beyond case studies and testimonials

Subscribe to Mark’s newsletter, Evidently: https://evidently.beehiiv.com/subscribe
Learn more about the customer evidence platform that B2B teams at Gong, HackerOne, Sendoso, and more trust at userevidence.com.

What is The Proof Point?

Proof is what GTM leaders need to make fast and furious decisions that keep their businesses alive and thriving.

The Proof Point hosts conversations anchored in the reality of day-to-day life as a revenue leader. No algorithm-hacking, talk-track headlining buzz statements around here. We’re hosting conversations between GTM leaders so we can gather the facts and provide you with the tactics and tools you need to bulletproof your strategy.

Join host Mark Huber every other week as he invites the best GTM leaders into the conversation.

Evan Huck [00:00:00]:
That was ambitious. Like, most projects like this that you do are one audience, right? What do buyers think about blah? The fact that we tried to normalize a set of questions and then compare them across three distinct audiences definitely added some complexity, because we had to go get three separate panels. That was ambitious. I thought it was really cool, though, because we needed to triangulate this hypothesis, right? Coming at it from just one angle isn't enough.

Evan Huck [00:00:30]:
That could be your job. Like, that could impact the company.

Mark Huber [00:00:36]:
Here's what go-to-market teams are missing: proof. That's what I think of every morning when I fire up LinkedIn and scroll through boring manifestos and endless lukewarm takes. Opinions are cheap and proof is gold. I'm Mark Huber, and this is The Proof Point, a show from UserEvidence that helps go-to-market teams find ideas, get frameworks, and swap tactics. Each episode includes an unfiltered discussion with the biggest names in B2B SaaS to help find the proof points that I'm in search of. You'll learn from sales, marketing, and customer success leaders in the trenches, where I ask them, seriously, what actually works for you?

Mark Huber [00:01:12]:
On this week's episode of The Proof Point, I got to sit down with Evan, and we talked about everything that went into the Evidence Gap report, our original research from earlier this year. And the cool part about this conversation is that this was actually the first time Evan and I talked about a lot of the things we covered in this episode. So we talked through the original hypothesis, how the idea came about, what might have happened had the data not actually backed up what we were hoping it would prove out, and then we share a few funny stories along the way. This was a blast to record. Enjoy. All right, so, Evan, this is going to be a fun one, because we basically get to take a peek behind the curtain of what went on with our original research that we did earlier this year. And we'll do like a little history lesson, almost.

Evan Huck [00:01:59]:
Right. So we're making content about making content. Is that what you're saying? Right.

Mark Huber [00:02:04]:
It's pretty meta, but such is hosting a marketing podcast: you're going to talk about marketing things from time to time. So I'm excited for this, because we have never really had this conversation. So we're doing it live, Bill O'Reilly style, and we'll have this conversation here for the first time. Sound good?

Evan Huck [00:02:21]:
Yeah, I'm excited. This is one with a lot of learning. Should be interesting.

Mark Huber [00:02:25]:
So I'm going to put you on the spot to start this, but I want to hear your take on just original research in general. And here's why I'm putting you, my boss, on the spot to start the episode off. If we go back to my interview process, I know that one of the things that you had talked about was Gong and Gong Labs and their original research. So this has been something that has been on your mind for a while now, and it took us a little longer to get to where we are. But why are you such a big proponent of original research?

Evan Huck [00:02:58]:
Yeah, I think you have to look at the environment today. What I've seen is such a huge deterioration of engagement on shorter-format stuff, particularly cold outreach from BDRs; obviously that channel, email, is just destroyed. And so that's just not useful anymore. And especially with the advance of AI, it's so easy to put out medium-to-low-quality crap that's just a short blog post. And we did a decent amount of that, and it was helpful initially, at least to get a baseline level of brand out there. But there's a diminishing marginal utility to that kind of medium-quality, short-format stuff. I felt like we were just kind of adding to the noise.

Evan Huck [00:03:39]:
Right. There's still more stuff out there, which is not bad in and of itself. Right. But I think the prevalence of all this noise and content created this demand for something to stand out from that. And I think what stands out is really high-quality, meaty data, substantive insights that you can actually take something away from. And yeah, that's why I'm so bullish on original research. And yeah, as a sales rep, I loved the Gong data: all right, people that mention pricing on a call are 3x more or less likely to close, or whatever. So yeah, I think that was the goal, to bring some interesting insights and wrap them up in a meaty, longer-format container.

Evan Huck [00:04:21]:
Because I feel like that's the antidote to just a constant barrage of average short-format stuff.

Mark Huber [00:04:27]:
You saying that reminded me, and I may not even remember all of it, but one of my favorite Gong reports, the insights that they shared through Gong Labs, was something about people swearing on sales calls. I forget what the second part was, but I know they came out with the research that if you swear on a sales call as a sales rep, it has this impact.

Evan Huck [00:04:48]:
That the close rate's probably a lot higher, right? If you think about it, you've probably developed enough trust with a customer where you can swear. Now you get the causation-versus-correlation thing. So you get a bunch of reps like, all right, cool, I'll just drop fricking F-bombs and improve my close rate. That's probably not the right takeaway from it, but that's up to the people to know their statistics.

Mark Huber [00:05:07]:
So before we get into the meta part of this episode and really our own original research, the evidence gap, had you always thought about adding this sort of research content offering when you and Ray were first thinking about UserEvidence or how did that come about?

Evan Huck [00:05:24]:
I had thought about it. It didn't end up taking the form that it takes now, but one of my early ideas when we were first getting started was this kind of crowdsourced trends-and-insights thing, right? So imagine on LinkedIn you could just create a survey, and then all the insights would pool together, so everyone could collect insights and also draw from the same pooled group of insights. So I think that's kind of an interesting concept. But yeah, I had known that these kinds of trend-based predictions and here's-what-my-peers-are-thinking pieces are an effective form of content, particularly for that earlier-stage awareness, social, et cetera. And yeah, obviously we're good at turning survey data into content. So yeah, it is something that we thought about pretty early, for sure. It took a slightly different form, but it's something that we thought about.

Mark Huber [00:06:12]:
If you can think back this far, back to the TechValidate days and the SurveyMonkey days, was this something that you all were thinking about then?

Evan Huck [00:06:19]:
Yeah, we did come out with a version of this. It was called market research, and we would go out and get panels, but it was only to get the data into the app to then create just charts and stats and stuff. And that's kind of, you know, my initial inclination as a SaaS-oriented founder: all right, we gotta be able to productize it, which is just getting the data in the app. And Ray, at the other end of the spectrum, was like, let's take a services-agency-style approach to creating a bigger-rock, 25-page piece of content, which was brilliant and not the direction I would have gone in attacking this angle. So yeah, the TechValidate approach was very much just short-format charts and stats. And then we left it up to the client to do whatever they wanted with it.

Mark Huber [00:07:05]:
All right, so it's funny that I'm going to say this out loud on a podcast and admit this to you, but we're now going to get into the actual process around how we created our own original research report. For the first couple of times that you mentioned wanting to do some original research, did you have a hypothesis going in? Because I don't even remember how we landed on the hypothesis and the angle that we ended up taking.

Evan Huck [00:07:30]:
Yeah, it's interesting. I had several different ideas, but this one I felt every day. And the reason why is, in the early stages of the company, and even still today a little bit, I'm still selling quite a bit. So I'm talking to customer marketers, product marketers, and then sales leadership and demand gen as well. There are some deals where we'll literally get on the phone with the customer marketer, and they'll be like, yeah, our stuff's awesome. We generated four case studies this year from big brands; our CEO's super stoked. And then you'll go talk to a sales leader who will be like, yeah, our content sucks.

Evan Huck [00:08:05]:
Like, you know, yeah, we have four case studies. That is true, but we have four products. We're selling those things to 16 different industries. I need content around mid-market, around regions, around why we're better than competitors. And all I have is this beautiful case study around how we helped Sephora do whatever. And so it was frustrating for me, because I was like, there's this disconnect, right? On the one hand, when I'm talking to product marketing, sales, and sales enablement, I'm hearing, yeah, we have big gaps in our library, we need more competitive content, we need more stories in different industries. And then I'd go to the customer marketer, especially if they reported to more corporate marketing and brand, who'd be like, oh yeah, we're crushing it. This handful of case studies is awesome.

Evan Huck [00:08:43]:
And it's like, ah, what the heck? And so my hypothesis was that if you asked customer marketers and marketers how they're doing, you would get a higher response, like, pretty good. And then if you asked the people that they're serving, you know, sales, sales enablement, et cetera, you'd get a lower response: it could certainly be better. And so that was what I was at least curious to test: do marketers think they're doing a lot better than they are in terms of serving content out to the consumers that actually need it?

Mark Huber [00:09:15]:
I don't think you need to do research to prove that statement out. Marketers always feel like they're doing the best job in the world, but I feel like this research helped debunk that a little bit.

Evan Huck [00:09:24]:
Yeah, for sure.

Mark Huber [00:09:26]:
And then I remember when we were first really starting to put this together, I had to lean on you, mostly you, but also Ray to a degree, just in terms of survey design experience. Because I had sent one survey in my life before, and that was my first survey at UserEvidence, a true research survey, when we were trying to beef up our research library by way of the website redesign. But I know I leaned on you two heavily for survey design. Do you remember where we started from? Because this is where I really remember taking it from, all right, here's this thing that Evan wants to do, and I know it's a good idea, I'm just drowning in work right now, to...

Mark Huber [00:10:06]:
All right, let's make this thing and turn it into a reality.

Evan Huck [00:10:10]:
Yeah, I mean, it's a little bit different than scientific research, where the purpose of the hypothesis is to truly test the hypothesis. Whereas I obviously wanted to prove a hypothesis. Right. So that was the goal, to prove this, because it obviously supports what we do and the offering that we have, which creates a bunch of customer content. I'd done thousands of surveys, albeit of a very different type. Most of the surveys I've done are more customer-evidence-focused, where you're surveying customers to get them to talk about how awesome the product is and how much value they've seen.

Evan Huck [00:10:37]:
So this was a little bit different, but some of the core concepts are similar. I basically wrote a draft of what I wanted the report to say first, and then I thought about, all right, what would be good data to support this report, and then worked backwards from that: all right, what questions do I need to ask to get the data to support that? In other words, for the main hypothesis, that salespeople were frustrated with the quality or diversity of content they have, getting them to say that in a way where I can create a stat that says 89% of salespeople wish they had better content would be helpful in my report. But yeah, this is, again, not scientific research. This is marketing research. Right. And so starting with a hypothesis first, writing that first draft report, and then thinking about, all right, now how do I go get the data to support it? That was the methodology there.

Mark Huber [00:11:27]:
I remember talking about this with Kyle Lacy right as we were going through what you just said, and one of the things that he suggested was something that I hadn't done before. We did some, let's say, trend-like content when I was at my last company, but it was really just dependent on what the performance data showed. There wasn't research behind it. So it's definitely a different process. But what Kyle suggested was, hey, you've got this final report, and just imagine this in your head: what do you want the headline in the report to be? Start thinking of headlines at the end of the day. And then once you can think of the headlines, you can start to reverse engineer.

Mark Huber [00:12:05]:
Like, all right, if this is what I hope to be able to share, what are the types of questions we then need to ask people in the survey to either prove or disprove whatever that headline is? And I think that got me out of just, all right, what should the questions be? Because it was so easy for me to get lost in trying to figure out the perfect questions to ask when I didn't really know what I wanted to truly prove out at the end of the day.

Evan Huck [00:12:32]:
Yeah, I think that's exactly right. The less PC version of our customer report would have been entitled: your customer evidence library sucks. Buyers think so too. So does sales. Do something about it. Right.

Mark Huber [00:12:43]:
Maybe we do that as the next one. But now that we bring this up, I remember going from the Word doc that I think you originally started about the survey itself and the types of questions to ask. And then what made it a little more complex for me was just trying to stay organized with the three different groups that we were going to be surveying, because we had the buyers, sellers, and marketers. And I think getting out of the Google Doc, putting it into a sheet, and seeing, all right, we're asking this group these questions, that group these questions, and this final group these questions. That's when it started to make a little more sense for me.

Evan Huck [00:13:20]:
That was ambitious. Like, most projects like this that you do are one audience, right? What do buyers think about blah? The fact that we tried to normalize a set of questions and then compare them across three distinct audiences definitely added some complexity, because we had to go get three separate panels. The questions needed to be similar, such that you can compare, but slightly different, to be specific to each audience. So yeah, that was ambitious. I thought it was really cool, though, because we needed to triangulate this hypothesis. Right. Just one angle isn't enough: marketers saying, yeah, I think we've got good content. To have all three audiences' perspectives, even sellers', and to see how their perspectives differed on the quality of their library of evidence was super interesting.

Evan Huck [00:13:59]:
So I'm glad we did that. It definitely made it harder and just more complex, but it was really cool to see the different perspectives on the problem.

Mark Huber [00:14:08]:
Yeah, I think I'm going to make an excuse here for why it took as long as it did. I think it took the time that it needed to get it done right for our first piece of original research. But having to survey three different groups, then compare the responses, really what you were seeing from a data perspective across all three groups, and analyze it, that felt like the hardest part, or at least the longest part, if you think about when we first started to when we published and launched it.

Evan Huck [00:14:37]:
Yeah, it was essentially three separate surveys to start. And the benefit of that is it gives you a ton of options in terms of the narrative, just because you have so many different permutations of how you can combine the different perspectives. But the hard part was, because it gave you a lot more options, it was just less obvious how to pull out the salient, interesting themes and key findings. So it took a lot more looking at the data from different angles to start to hone in on what the individual key themes were, and then tying that into a broader narrative that flowed.

Mark Huber [00:15:10]:
All right, so this is a hilarious question to ask, and I can thank Jillian for putting it in the outline. It's something that I had never thought about before, which is probably good, because I would have worried about it too much. I was just trying to juggle everything that we were doing with this and with everything else in marketing at UserEvidence all at once, so this never once crossed my mind. But what would we have done if the data didn't come back and support, you know, what we were trying to prove with the hypothesis? That did happen, actually.

Evan Huck [00:15:39]:
So the first rev of the questions, and not to dilute the findings of the report, it doesn't at all, but it's worth talking about, because for people that are thinking about how to do these original research offerings, which is something that we offer now, it's worth knowing about. So there are opportunities to rejigger the questions if you don't necessarily like where the data is going, and that is something unique that we offer. But just to simplify it for the purposes of explanation, our core hypothesis, right: marketers think they're great on content. Salespeople don't. The first way I phrased that was, please rate the effectiveness of your content, basically rate the effectiveness of your marketing team when it comes to delivering content. And salespeople are...

Evan Huck [00:16:18]:
They're still nice people. Right. They're not going to totally roast their marketing teams. Yeah, it's acceptable, or it's fine, or something like that. And obviously that didn't have much punch to it. 80% of salespeople think their content's adequate doesn't sound as good. So I was like, all right, well, that didn't work at all.

Evan Huck [00:16:35]:
So, go back to the drawing board. All right, how can we frame that question to get the answer that we want there? Do you wish you had better content? To which everyone's like, sure, yeah. Do you wish you were more jacked? Do you wish you were richer? Yeah, of course. Everyone can raise their hand to that. So 96% of people wish they had better content supported our narrative a lot more than 70% of them think their content's fine. Sometimes the data is totally not gonna support your narrative, in which case you've got some deeper thinking to do. But there is a lot of method to the madness

Evan Huck [00:17:05]:
in how you frame the question. And doing a kind of quick sample test to see directionally if the data is supporting your hypothesis before you go get the full panel was, I think, a really nice, dexterous way to not just go full bore into a project not knowing what you're gonna get. Right. We were able to test along the way.

Mark Huber [00:17:22]:
So I remember going through that phase as part of this project. Do you remember how many responses we got first, before we went back and reworked the questions?

Evan Huck [00:17:34]:
Yeah, I think it was like 30, which is not enough to go make big claims from a PR perspective. So obviously you're going to want a bigger sample for the full report, but it is enough to directionally figure out if the data set is heading in the direction you want it to go. And you can look at it and be like, all right, is this question going to be useful? What would I do with this from a claims perspective? And if it's not, given the data and the responses, then yeah, either rejigger it or throw it out or something like that.
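
To see why ~30 responses is only directional while a full panel supports a headline stat, here's a minimal sketch of the math (not from the episode; the 0.7 example proportion and the ~200 panel size are illustrative assumptions), using the standard normal approximation for a proportion's margin of error:

```python
import math

def moe_95(p_hat: float, n: int) -> float:
    """95% margin of error for a sample proportion (normal approximation)."""
    return 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)

# A pilot of ~30 responses: wide interval, good for direction only.
print(f"n=30:  +/-{moe_95(0.7, 30):.1%}")   # roughly +/-16 points
# A full panel of ~200: tight enough to quote as a headline stat.
print(f"n=200: +/-{moe_95(0.7, 200):.1%}")  # roughly +/-6 points
```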

Mark Huber [00:18:00]:
And then let's talk about the questions themselves. We talked about survey design a little bit before, but I'm going to pull up the Google sheet that we had. If I've got this right, there were anywhere from 14 to 17 questions that we asked each group, and about five to six questions that were consistent across each group. So those were really the questions where we could figure out: marketers answered this way, this is how sellers answered, and this is how buyers answered. Having not gone through this before, I just remember being worried about how many questions to ask without making it a super long survey. So what advice would you give to people who are working on their first piece of original research and are curious about survey design?

Evan Huck [00:18:53]:
The purpose of asking a bunch of these questions is to get something from each question. You don't necessarily know ahead of time which questions are going to yield something. So think of it like you're going fishing and trolling: you throw out a bunch of different bait. You might have nine lines out there of different types, and you're not expecting all of them to hit. You just don't know what the fish are interested in. So you put a bunch of stuff out there, and hopefully you get a nibble on one.

Evan Huck [00:19:17]:
So, a diverse range of questions. Sometimes I had questions where all I was looking for was one quote, right? Which seems crazy, because why are we having 200 people respond to this question if all we need is one or two quotes? But that's all I was looking for. I knew I was going to have plenty of quantitative data; that's why I asked so many open-ended questions, which does make for a torturous survey, and I'll talk about that in a second. But it gave me a diverse range of options in terms of finding these perfect quotes. And we did find some amazing quotes, even if we only got four or five relevant responses out of the 200 on each one. All I'm using is one or two from each of those anyway. So that was the point.

Evan Huck [00:19:54]:
More questions allows you to put more bait out there and hopefully gather a diverse range of inputs where something's usable. On the torturous survey piece: a lot of people don't know the panel world very well. For this survey, we did a combination of putting it out to our networks on LinkedIn and email subscriber lists and stuff like that, saying, hey, we'd love to get your feedback. But a lot of times for these original research projects, it's not sufficient just to use your own network. You need to go get responses from somewhere else. And so there's this whole weird network of panel providers, which I've spent a lot of time in. I spent a lot of time in SurveyMonkey's audience business, which was a panel and market research business, so I know it relatively well. And each panel provider specializes in different types of audiences.

Evan Huck [00:20:38]:
Right. If you're Sephora and you want 16-year-old women's feedback on different packaging, that's a different panel provider than B2B cybersecurity professionals.

Mark Huber [00:20:47]:
Those are two extreme examples that perfectly illustrate the point.

Evan Huck [00:20:50]:
Yeah, exactly. With the panel providers, you basically pay per response, so they will have a price that you pay, but it doesn't really matter how many questions you put in there, because they're guaranteeing a certain number of responses. When you ask more questions, it does drive the price per response up a little bit, but not super meaningfully. So you're not worried too much about fatiguing the person responding like you are with a customer survey, where you don't want to drag your customer through a 47-question survey. If you're paying a third party for the response, it doesn't matter as much. That's why you're able to ask more questions.

Mark Huber [00:21:28]:
Yeah, I remember getting a little heartburn when we were looking at, I think, the V1 question set, because it was probably somewhere in the 25-ish question range. And I was just thinking to myself, well, hey, if I got this survey, am I gonna get to question 12, scroll ahead to see how many questions are left, and start mailing it in? So I was a little nervous about the number of questions we originally had. And then, a funny story: I remember some of the original questions that you wanted to tack on at the end were basically just people admitting that they need UserEvidence, using the survey as a lead gen tool.

Evan Huck [00:22:08]:
Yeah, I wish I would have put those back in there, but I was just trying to make you happy. But the kind of cool thing about the panel providers and the way we capture the data is we do actually throw in questions, around question 13 and question 19, just to figure out whether you're paying attention and not just clicking buttons. So it's, hey, if you're reading this, what's the square root of 20 divided by 4, or something like that. People actually have to be actively paying attention, so we can scrub out the bots and the bad data and stuff like that.
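
As a rough illustration of the scrubbing step Evan describes, here's a minimal sketch (not from the episode; the column names, respondent IDs, and instructed answers are all hypothetical) that drops respondents who fail either attention check:

```python
import pandas as pd

# Hypothetical panel export; columns and instructed answers are illustrative.
responses = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "q13_check": ["blue", "blue", "red", "blue"],  # instructed answer: "blue"
    "q19_check": ["7", "7", "7", "3"],             # instructed answer: "7"
})

# Keep only respondents who passed every attention check.
passed = (responses["q13_check"] == "blue") & (responses["q19_check"] == "7")
clean = responses[passed]
print(f"Kept {len(clean)} of {len(responses)} responses after scrubbing.")
```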

Mark Huber [00:22:34]:
Now, I'm going to save one of the prompts here for the end, because it's a funny story. But in terms of the report itself, I've got it up here, and I'm curious: which stats are you using on calls? What are your go-to proof points from this research? Because I feel like you're probably using it all the time.

Evan Huck [00:22:54]:
Yeah. I mean, at a high level, there's a big gap between, if you're a vendor, all these fantastic things that you claim you can do and what skeptical prospects in this very conservative buying environment actually believe you can do. I think it was interesting to see that buyer groups continue to grow. Obviously there's more pressure on budgets and stuff like that. So all that leads to more rigor on the buyer side for selecting a solution, and therefore more burden on the vendor to provide overwhelming evidence that they're going to deliver return on investment.

Evan Huck [00:23:29]:
I think the other interesting thing is there's a flight back to bigger, safer vendors that have better brand awareness, because it's just easier and safer. Right. No one's going to get fired for buying IBM or Salesforce or anything like that. So to me, that really highlights, especially for an emerging or mid-sized vendor going against some incumbents, the criticality of having good, believable, credible, specific, relevant evidence. It's obviously more important than ever in this environment, especially with so many vendors and so much noise. So that's one.

Evan Huck [00:24:00]:
That's probably the overarching finding. Of the smaller ones that stood out to me, the focus on ROI was super interesting, particularly helping buyers create a compelling financial case that the solution is going to deliver value and return. I think we're so jaded by the prevalence of vendor claims, oh, 38% ROI, you're gonna get all this, that we're a little bit immune to it as buyers. So buyers are searching for more credible, believable statistical evidence around time savings or cost savings or improvement in win rates, or whatever the metrics are that are ultimately gonna lead to the ROI narrative. That was a big one that stood out. I think most vendors have figured out that, yeah, you need some testimonials and case studies on your website.

Evan Huck [00:24:45]:
For sure, do that. But that's kind of table stakes now. Everyone has that. I mean, everyone's even, to a degree, kind of figured out the G2 game now, where it's like, all right, we gotta have at least four stars on there, so that's good. But now it's, all right, we all have testimonials and case studies. We're all somewhere between 4.2 and 4.6 on G2.

Mark Huber [00:25:04]:
Just pull up a comparison page on G2.

Evan Huck [00:25:07]:
They're exactly the same. Yeah. So it doesn't differentiate you versus the competition. Right. And I think vendors in this environment that can produce a believable, specific, relevant "here's how you're going to see return, and here's how we've done this for other people that look exactly like you, in your industry and at your size" stand out. That was one that stood out for sure. Competitive was another big one, which makes sense given the super noisy environment.

Evan Huck [00:25:31]:
Obviously, differentiating versus your competition is going to be more critical, and competition is more fierce, because there are more vendors out there competing for a smaller and smaller piece of budget. The salespeople said the biggest thing they were missing when it comes to competitive evidence was proof around why they're better or different, and why people are switching from an incumbent. So that stood out to me in a pretty dramatic way. The last one was that relevance and specificity matter. We asked buyers how they think about the quality, or basically the influence, of customer evidence, and relevance was a big piece of that.

Evan Huck [00:26:07]:
And so what does relevant mean? I think the things at the top of the list were same industry, same use case, same company size, same technological ecosystem, so that they can know this solution actually worked for people that look like me. Right. I think in the past we've seen customer marketers, and marketers generally, index towards these bigger brand stories because they look cool and they're impressive and everyone recognizes the logo. But for buyers to get confidence, they need to know that the solution is going to work uniquely for them, in their unique environment. And so that really stood out: relevance and specificity are a huge component of quality.

Evan Huck [00:26:49]:
I think you get this false choice sometimes: hey, are you focused on quality versus quantity? And it's easy to say, oh yeah, we're focused on quality. But the reality is you can't accomplish quality, ironically, without quantity. Right. If you sell to 17 different industries and you want to be able to have a high-quality story that says, hey, we've delivered success in your industry, then guess what? You need 17 stories. So you need quantity to accomplish quality.

Mark Huber [00:27:13]:
Yeah. I've got the report up here, and we'll make sure that we overlay it in the episode. On that relevancy question, we looked at marketers, sellers, and buyers answering the same question. So this is one of those questions that was consistent across all three groups that we surveyed. I think what stood out to me was that buyers, far and away, cared most about industry: 70% of buyers said how relevant the evidence was to their specific industry mattered, versus 59% for sellers and 53% for marketers. That 70% response was the highest by far of any of the relevancy factors. And I thought that was really interesting, because for me, and I get that I'm not everyone, I always look at the role, the use case, the company size.

Mark Huber [00:28:02]:
I at times don't really care so much about industry; I thought the other factors were more important. But it was interesting that industry stood out for buyers.
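
For what it's worth, a gap like 70% versus 53% holds up statistically at panel sizes in the range mentioned earlier in the episode. A minimal sketch, assuming roughly 200 respondents per group (the report's exact per-group sample sizes aren't stated in this exchange, so that figure is an assumption):

```python
import math

def two_prop_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Pooled two-proportion z-statistic for comparing group percentages."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)   # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Buyers (70%) vs. marketers (53%) on industry relevance.
z = two_prop_z(0.70, 200, 0.53, 200)
print(f"z = {z:.2f}")  # ~3.5, well beyond the ~1.96 cutoff for p < 0.05
```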

Evan Huck [00:28:10]:
Yeah, I think you and me are so used to being in tech and selling to tech that industry really isn't a concept we think about much, because we're all in the same industry. But if you look at buyers more broadly, which is what we surveyed, the types of solutions they're buying could be professional services or technology or whatever, across a whole spectrum. Industry matters, because if you're a bank, for instance, the set of technology and the deployment environment is so different than a tech company like ours, or even a healthcare organization, and the federal government's got its own set of stuff. A good example is the CRM space: you have all the horizontal providers, right? Salesforce, HubSpot, et cetera. And then you have a set of vertical software providers; in the VC and private equity world you have Affinity, for instance. The horizontals of course want to serve all of those verticals, but there are obviously these industry-specific solutions. And so that is a good indication that, especially in industries that have very unique requirements, like financial services and healthcare, buyers really need to know if you've delivered success in a regulated environment, for instance.

Evan Huck [00:29:14]:
I think that's why that one is so much higher on the buyer side.

Mark Huber [00:29:18]:
I think the other thing that stood out to me, and there's a funny story behind this too that I don't think I've shared before, is the three least important factors for buyers as they relate to customer evidence. Number one was customer logos, at 17%. That's something everybody, myself included, has always worried about for the longest time: the logos that you have on your site, whether or not they're actually still a customer or maybe they churned a while ago, and whether somebody is going to do their detective work and figure out that they're not actually a customer logo. Another one was awards and accolades, at 18%. That didn't really surprise me, because I understand how so many of these awards truly are pay-to-play. And the third was the number of reviews or average review score, at 28%. And that's something that I've felt for a while.

Mark Huber [00:30:05]:
I think when I look at a G2 page, I'm not really looking at the quantitative review scores. I'll look to see what people are actually saying, and specifically the negative side of things. But the reason why this is so funny is that Adam Schoenfeld had just asked me to write a review of Keyplay, right before, I believe, the Q3 reporting period ended. And I said that I was going to get it back to him and submit the review, but I needed to do that after I figured out how we were going to launch and finalize this report. So I sent him some prompts to comment on the findings from the report. And it was basically about the exact thing that he was asking me to do, which was add a review and rate it however I wanted. And he's like, are you trying to tell me that this stuff doesn't actually matter? And I was like, no, that's not what I'm trying to tell you.

Mark Huber [00:30:58]:
But it is funny that you were asking me for a review and then here's what the data shows.

Evan Huck [00:31:02]:
Yeah, I think it's important in the sense that it's table stakes. So if you don't have it, you're screwed, right? If you can't put up some decent logos on your website, if your reviews on G2 suck or you don't have them or something, that's a massive problem. So there's an initial focus on that, especially as a small company. I focused on it too, because I wanted to have a good logo wall and a good initial presence on G2. But there are diminishing marginal returns. Once you have a set of 30 good logos, if you replace one with a slightly bigger company, no one's going to notice. Right.

Evan Huck [00:31:35]:
It's just the blink test. You go on there, it's like, all right, cool. They work with big companies. Great, got it. Check. And then on the G2 side, it's yeah, all right, cool. They have a bunch of good customers. Customers seem to generally like them.

Evan Huck [00:31:45]:
It's funny, while you were talking, I was googling Monday versus ClickUp, and there's this hilarious comparison page on monday.com. They have all this crap, and then the customer review score is like 4.72 versus 4.75 or something like that. Is that a material difference? As if anyone would ever be a fraction more happy. So, yeah. Table stakes, for sure.

Evan Huck [00:32:07]:
Does it help you differentiate? From what we're seeing: not really, no. But yeah, by all means, get the basics in place for sure.

Mark Huber [00:32:14]:
And then, before we get into the last kind of funny prompt and story: I didn't know, or at least I didn't have a hunch around this, and it sounds like maybe you did, but I was blown away by the responses that we got to the questions about statistical evidence and the desire for more statistical evidence, because that's something that is firmly in the sweet spot of what our product can actually do at the end of the day. That's something I'm always looking for as a buyer, but I didn't realize that so many other people were clamoring for the same thing.

Evan Huck [00:32:46]:
Yeah, this is one I have just been frustrated with for years, which is why I did have the hypothesis that it would matter, and I was glad to see it did. Put it this way: when something actually matters, when we have a decision to make that is important, like lives depend on it, we bring in smart people, we bring in some scientific rigor. If Boeing needs to make a plane fly. And Boeing might be a bad example. Airbus.

Mark Huber [00:33:07]:
Yeah, I was going to say.

Evan Huck [00:33:08]:
If Airbus needs to make a plane fly, we're doing a lot of tests. We're doing thousands of tests. Right. And there's a rigorous statistical analysis that informs our decision-making. Because, you know, it matters. Right.

Evan Huck [00:33:20]:
And in B2B, it's so lackadaisical. It's like, all right, I'm going to go check out this analyst report, do some reviews, do a couple demos, talk to one person, and that's it. Which has been fine, because generally it doesn't matter that much. Right. If you get the decision on Salesforce versus nothing right or wrong, like, whatever.

Evan Huck [00:33:35]:
Probably fine either way, and the impact is low. Although I'd argue that what we're starting to see now is the impact getting a little higher. The stakes are getting a little higher. If you make a big infrastructure investment and you miss on it, that's not good, right? Budgets aren't infinite anymore. That could be your job. That could impact the company. And so...

Evan Huck [00:33:56]:
I think what you're seeing here and feeling is a desire for more rigor, analogous to what we would normally do in a situation where we absolutely need rigor, like making a plane fly. Statistical evidence is part of that, right? Especially in the software context, these vendors have thousands of users. There's no shortage of people; there's no shortage of evidence available. So it's crazy that it's like, oh, here are the results for one person, you should expect to be exactly like that person, when there are a thousand people available with a whole range of different outcomes. There's just such an opportunity, because vendors have such a large user base, to bring in a much more statistically significant, substantive body of evidence. And I think you're seeing buyers want that, because it's going to increase the confidence in their decision.
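
One way to picture the aggregation Evan is pointing at: pooling many individual customer outcomes into a single claim with a confidence interval, instead of quoting one customer's result. A minimal sketch with made-up numbers; real usage would draw on hundreds of survey responses:

```python
import math
import statistics

# Hypothetical self-reported time savings (hours/week) from customer surveys.
hours_saved = [2, 5, 3, 8, 4, 6, 3, 5, 7, 4]

n = len(hours_saved)
mean = statistics.mean(hours_saved)
sem = statistics.stdev(hours_saved) / math.sqrt(n)  # standard error of the mean
low, high = mean - 1.96 * sem, mean + 1.96 * sem

print(f"Customers report saving {mean:.1f} hrs/week "
      f"(95% CI {low:.1f}-{high:.1f}, n={n})")
```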

Mark Huber [00:34:39]:
I think, you know, back a couple of years ago, when everyone had crazy budgets for buying new tools, if you swung and missed and bought the wrong tool, there weren't really too many repercussions, if you will, at your company. It was just, all right, be careful next time. Whereas now there's just so much more rigor and process, and just the number of people that are involved. I forget what the stat is, but every single year it goes up by a considerable amount. And nowadays, if you swing and miss on a very expensive piece of technology, like you mentioned, it isn't just, okay, it's fine, forget it.

Evan Huck [00:35:14]:
I mean, we're... you're literally talking about this with us right now, right? Your budget's fixed. It could either be headcount or tools or program spend or whatever. But if you're going to go make a big purchase, for us, if you're going to go recommend, like, a hundred thousand dollars on some ABM platform or whatever, there's an opportunity cost.

Mark Huber [00:35:30]:
Whoa, whoa, whoa, whoa. I didn't do that, just so we're clear. But yes, for example, yeah.

Evan Huck [00:35:32]:
That's a headcount or something like that. So that matters, both for your role and your credibility and the trust to be able to take future swings, and it would severely impact our ability to grow as a company, because that hundred thousand dollars might have gone to something better. So yeah, that's why we're inspecting these things a little bit more, because we want to help you not make mistakes.

Mark Huber [00:35:55]:
All right, so one last little mention here, and this is kind of funny. Did we coin the evidence gap before or after the survey results were in? I can tell the story about how we changed the name of it, but why don't you tell me where "the customer evidence gap" came from? And if you don't remember, I can prompt you a little more, because I remember exactly where it came from.

Evan Huck [00:36:16]:
Actually, I was gonna say, I actually don't. I've said it a lot now, so now I just kind of... I don't.

Mark Huber [00:36:21]:
I'll get your mind moving on this. So when we first started working on our sales deck, maybe a year-ish ago...

Evan Huck [00:36:30]:
Okay, got it.

Mark Huber [00:36:30]:
There was this slide that we had that I personally loved. I knew that you loved it; it just required a really sophisticated seller, I think, to really explain it. And we were talking about a kind of scale-versus-quality axis. And then in the middle there was... yeah, yeah, okay: the customer evidence gap.

Evan Huck [00:36:50]:
Yeah.

Mark Huber [00:36:50]:
So we had been talking about it then as the customer evidence gap. At the time, we were calling ourselves, you know... we're not really trying to create a category, but we need a box to put our software in so people have a frame of reference, and we were calling ourselves a customer voice platform. Fast forward a year later, we had repositioned a little bit, so we're calling ourselves a customer evidence platform. And it was actually after we had published the report. We were at the Drive event in Vermont, and I think it was Brendan Hufford who had the physical copy in his hand when I was interviewing him. And he just called it the evidence gap.

Mark Huber [00:37:31]:
And we've always wanted to name the problem that we solve. Naming the problem is not the same as the category name, because buyers can see right through that. But it's something that people who specialize in category design really suggest: name the problem so you can start to market the problem more. So coming out of that interview with Brendan, he's like, dude, you should call it the evidence gap, not the customer evidence gap. It's just punchier, it's shorter, it rolls right off the tongue. And I think I went right into WordPress. Alex and I talked about it, and we were like, yes, this is a good idea.

Mark Huber [00:38:07]:
And I think we just changed it and then told everyone else internally that we had changed it. But it was after the report was launched.

Evan Huck [00:38:14]:
That makes sense. Yeah. Now it does come back to me. It's tough. We always have this kind of push and pull between the plain-English understandability of something versus trying to highlight the bigger, more important concept. Just for example, in our world, you could say we create case studies and testimonials. Right.

Evan Huck [00:38:34]:
Everyone knows what they are: case studies and testimonials. Right. What is customer evidence? A little bit more of an abstract thing, but I think that was the point, right? To get people to think about how else we can show evidence that we've delivered success for customers. Yes, testimonials and case studies are one way to do that, but there is a lot more. And I think what we discovered through this report is that just testimonials and case studies is not enough. Right.

Evan Huck [00:38:59]:
I think so often we compress customer marketing down to, hey, I just create case studies and testimonials, where there's such a bigger opportunity, especially in light of this research and this environment, to go get substantive, statistically significant, relevant, specific proof and evidence so that you can inspire confidence in your buyers and let them know that this is not your first rodeo and you've delivered success multiple times in that particular industry or whatever it is. So yeah, I think the category should make you think about what customer evidence is. That is the point. Right. You need to broaden it beyond just case studies and testimonials.

Mark Huber [00:39:37]:
Yeah. And to bring it full circle as we wrap this up, that was one of the things I was hoping this report and original research would help do: show marketers, and really non-customer marketers, what customer evidence means, and open their eyes to all these other forms of customer evidence that they might not be thinking of, just because they're not creating those assets or salespeople aren't requesting those assets. Just showing them that, yes, there's a whole lot more to evidence and proof points than case studies and testimonials. Because I think it's so easy to keep the blinders on and just think of those two things, knowing how case studies are written and how the majority of the time they're not actually written by the customer, because the customer doesn't have enough time to write them. I just like the more consumable, shorter, bite-size proof points that a lot of people aren't thinking of just yet.

Evan Huck [00:40:33]:
Yeah, I think that's right. And part of it is because salespeople and executives and whoever ask, hey, I need a case study, right? Just because that's the most familiar term to us. And I think we sometimes interpret that literally: okay, you need a case study, I'm going to go get case studies. What's behind that question is a deeper need: hey, I need evidence and proof to show this particular prospect that we've done this before in this type of environment. Right.

Evan Huck [00:40:58]:
Give me something that can show them, convince them we're the right choice. Right. And if you interpret it a little more loosely like that, it gives you a lot more flexibility in how to go solve that problem, rather than interpreting it so narrowly: okay, I will go get more case studies.

Mark Huber [00:41:10]:
Love it. Well, we are just about at time here. Thank you for teaching me a whole lot about survey design and creating original research from scratch. It's not a skill that I had prior to this project. Since you schooled me on that, maybe the next time we record, we'll have you teach a Canva kind of graphic design class too, since you're definitely qualified in that area.

Evan Huck [00:41:28]:
Yeah. You would like that, I assume.

Mark Huber [00:41:32]:
Thanks to everybody who's listening. This was fun. And we'll catch you again on The Proof Point.

Evan Huck [00:41:36]:
All right. Thanks Mark. See ya.

Mark Huber [00:41:40]:
Thanks for listening to The Proof Point. If you liked what you heard during this conversation, you'll probably like Evidently, my bi-weekly newsletter, where I share my biggest hits and get honest about my misses as a first-time VP of Marketing. You can subscribe using the link in the show notes below. The Proof Point is brought to you by UserEvidence. If you want to learn more about how our customer evidence platform can help you build trust and close more deals, check out userevidence.com.