A show to explore all data matters.
From small to big, every company on the market, irrespective of industry, is a data merchant. How they choose to keep, interrogate and understand their data is now mission-critical. Thirty years of spaghetti tech, data tech debt, or rapid-growth challenges are the reality in most companies.
Join Aaron Phethean, a veteran intrapreneur-turned-entrepreneur with hundreds of lived examples of wins and losses in the data space, as he embarks on a journey to discover what matters most in data today by speaking to other technologists and business leaders who tackle their own data challenges every day.
Learn from their mistakes and be inspired by their stories of how they've made their data make sense and work for them.
This podcast is brought to you by Matatika - "Unlock the Insights in your Data"
Aaron Phethean (00:27)
Today we are joined by Ollie Hughes, founder and CEO of Count. We talk about BI and AI and all of the things that data teams are facing today, pressures to deliver for the business. It's a fantastic episode. So without further ado, let's dive in.
Aaron Phethean (00:44)
Well hello Ollie and welcome to the show. I've been looking at your LinkedIn, and the thing that stood out to me was the little title that you put there: the most collaborative BI tool in the world. That's your goal. Tell us about that.
Ollie Hughes (00:57)
Ooh, probably. I should be careful with that one: how do I prove it? I mean, yeah, I think that is almost certainly true. I'm not sure whether it's because we've absolutely smashed collaboration or because the bar is so low that it's easy to stand out. Probably a bit of both, actually.
Aaron Phethean (01:15)
Yeah.
And that is actually what I was thinking about when I read it: how do you define it? What is collaboration to you? And what does good actually look like?
Ollie Hughes (01:30)
Yeah, it's easy to define for anyone who hasn't seen it in front of them. We basically built an analytics platform, a BI tool, around the idea of an infinite collaborative whiteboard as the primary interface. You can build reports, which you can send out like a normal BI tool to hundreds of people at once. But the core idea is that the data team can work within this collaborative space, and
the interface allows collaboration and idea generation. The entire analytical workflow of modeling, analysis, data cleaning and building reports can be done together, within the data team or with the wider business. What we see in our customers is that they're able to make the data workflow a business activity, and the analysis a business activity,
rather than the data team doing it over here and then sending out reports in a kind of waterfall-development way, as normal BI tools encourage. So really the metric we care about is: how many times do we see a data person and a business user, a stakeholder for want of a better word, working together, each bringing their half of the problem, working through issues, scoping a report together, or even discussing business logic together? That's really the spark of excitement which we get.
Aaron Phethean (02:44)
Yeah.
Ollie Hughes (02:56)
And I think it's why we're now rated the highest analytics platform on G2 at the moment, because it's just an amazing way to work.
Congratulations, that's awesome. I mean, I suppose that was one of the motivations for having you on the show. There's a way of thinking there that's not about a single form factor or a tool. And from memory, one of the big things for you is that engagement between the data team and the organization, and it's fantastic that you've got a tool that backs that up. Maybe wind everyone back a step. So we've got this tool, we've got this view of BI, but
Ollie Hughes (03:06)
Mm.
Aaron Phethean (03:33)
what's data like in a company? Why is this a problem? Why does there need to be a new way?
Ollie Hughes (03:42)
The more I've looked at this, the more I see opportunity and the more I get frustrated with the way we've all been working for decades, including me, by the way. I've been working with these tools for a long time myself, so I know the pain. I think at the highest level, if we're being really honest with ourselves as data practitioners,
there's a very big disconnect between the activities we do and the value we generate. There's no clear operation which says: if we do the following things, we guarantee a result that improves the business. And that disconnect is a problem, because we then just do loads of work, and how reliably it will deliver is not clear to the business or, if we're honest, to ourselves. We therefore end up just
building stuff: building reports, answering questions, trying to do enough of it that we can shift the needle and feel like we're part of a solution. And that's not a fun place to be. It leads data teams to retreat into being more code-focused, more technical, pulling away from the business and making that relationship more transactional. When actually the data team has so many superpowers
which the business could exploit and use to drive real value. One is the fact that the data team has complete oversight of the whole business in a unique way. That's so powerful, that we can see everything that's going on. It's a real strength we could use to help the business see the wood for the trees. We're also great problem solvers. We're naturally collaborative in general. We like doing the right thing. There are loads of ways the business could benefit if we were there, working with them in a better way.
Aaron Phethean (05:14)
Yeah. And I think a lot of people have that same frustration. You've probably seen in the news Larry Ellison, who was briefly the richest man in the world. That's a company, Oracle, founded on data and on the value of data. And that's what I see a lot in our customers, and clearly the pain that you've just talked about: they know they have this unique viewpoint on the world, they know that they could offer a lot more.
Aaron Phethean (06:02)
And I suppose that's where the BI tool, the failings of the tools and the people engagement, means that often it just falls way short of what it could be. And actually, looking back at my notes from what we discussed before, I think you said BI tools have the worst ROI in the stack. I can kind of see why you have that point of view. Elaborate a bit on that.
Ollie Hughes (06:27)
Sure, what do I mean by that? Well, I mean a few things. One is that as an industry, as a sector, there's been very little innovation in the BI layer, really. There's been lots of innovation in the back end of the stack, with cloud computing and better pipelines and data governance, et cetera. But the BI tools,
they're still the same BI tools we were using 15, 20 years ago, when we had different problems to solve. The paradigm of how we communicate and work with data is still very read-only. You read a dashboard, you look at it, and then you discuss it somewhere else. You try to work out what's going on behind the scenes. The interface, the interactions between the data team and the wider company, have changed very little. Whereas every other tool out there,
every marketing tool, the way we write documents, has all changed quite a lot; it's much more collaborative. BI tools are still as expensive as they were and haven't really changed. The new versions of these tools haven't really changed the paradigm. They may be slightly more integrated with the modern data stack, with the middleware in the middle of the stack, and they may work better with DBT, for example, but otherwise they're very similar. They're now having to adapt, because AI is such a disruption.
Aaron Phethean (07:43)
Yeah.
Ollie Hughes (07:50)
But in reality, they're now building the same interface that you can get from ChatGPT. It's a chatbot. So the innovation still isn't there; they're just building the same thing. And you've now got all these open tools. Yeah, exactly. And you've got all these other open source tools, which exist now because the proprietary market hasn't innovated enough, so it actually makes sense to adopt an open source tool instead, because
Aaron Phethean (07:56)
Mm-hmm.
Just more of it and faster perhaps.
Ollie Hughes (08:16)
what are you paying for? You're just paying to have your sales numbers sent to other people on a daily basis. That's not really insight from the numbers. You've had that for a long time.
Aaron Phethean (08:21)
Yeah, no insight, no innovative experience there. You briefly mentioned AI, and it's having an impact everywhere. Thinking about those factors that are driving, you know, companies and changes in the way companies work: how is AI going to change the BI landscape? How do you see it having an impact on data teams, and on your product and that space?
Ollie Hughes (08:52)
Yeah, there are a lot of changes going on. As I mentioned a second ago, for the classic BI tool it's a real threat, because the ability for an individual now to get to their operational numbers is very easy, and a chatbot interface kind of works well. And if the BI tools just build a chatbot themselves, they're not really differentiating or making their case well.
Aaron Phethean (09:21)
Yeah.
Ollie Hughes (09:21)
I think
AI is going to be a really positive thing in one way: I think it will allow a real step forward in self-service. The industry has been promising self-service analytics for ages, with "here's a spreadsheet UI" or "here's another drag-and-drop UI". But I think when AI matures as a technology, and I don't think we're there yet, but it will mature, it will make a leap forward in the accessibility of data, which is a good thing. There are lots of use cases it will unlock, which is great.
Where I see challenges is in the fact that even now, without AI, self-service produces other problems. We just have a wash of information, and what BI tools don't solve is the decision-making process and how to make that simpler. Getting numbers out of a computer is not really the bottleneck on making data more valuable anymore, and it hasn't been for a long time.
Aaron Phethean (10:19)
Mm-hmm.
Ollie Hughes (10:21)
So really the problem is that AI is going to make things a bit better, but it's not necessarily going to deal with the bottleneck, which is the human decision-making workflow: communicating ideas, discussing what the numbers mean and what we should do about it. And so...
Aaron Phethean (10:35)
Yeah.
Ollie Hughes (10:39)
Fortunately, you could argue that's what we've been focusing on all along, for the whole time we've been building Count: how do we make the decision-making process easier? How do we recognize that humans need to reach an agreement together with evidence, some of it qualitative, some of it quantitative? How do we bring all those people together into a tool where they can make a decision with confidence, and quickly? That is an extension of the BI domain. And we can now add AI into that workflow to make it better, which I think is really positive. If you're just throwing AI into a
Aaron Phethean (11:02)
Yeah.
Ollie Hughes (11:06)
BI tool which just gives you sales numbers, gives you numbers, and doesn't think about the wider consequences, the wider workflow, that's just going to cause more chaos. Yeah, not much difference.
Aaron Phethean (11:12)
Not much difference, yeah.
There's two bits there that I want to touch on. I think there's the form factor bit, the output side of things. And then there's the trust in the number, the data side of things, and the hallucinations. Touching on the form factor briefly: I've seen some really bad AI add-ons to products where effectively, instead of clicking on a
Ollie Hughes (11:28)
Yeah. Yeah.
Aaron Phethean (11:42)
box or two, it's asking me a question and then hopefully generating the same filled-out screen. There's just no innovation there; that's just a slightly different interface. Whereas I think what you're saying is the output could be many different form factors: reports, or diving into the data, or something far more collaborative in nature. Are you developing a vision of
loads of different kinds of outputs and inputs happening? Is that sort of where you think it's headed?
Ollie Hughes (12:17)
I think that's what we need. I'm not sure I'm seeing it necessarily in the wider market, but I think there's been...
So the wider market, we're wrestling with this as an industry, obviously, as every industry is: what's the right way to work with AI? I think there is a form factor question, what is the right interface to work with AI, which is a technology-wide computer science problem: what's the best interface to interact with a very flexible piece of technology?
Aaron Phethean (12:50)
Yeah.
Ollie Hughes (12:52)
I'm not convinced it's necessarily a chatbot in every circumstance, though I can see why it's where we've landed to start off with. An AI is very flexible, and the more flexible the interface you give it, the more powerful it has the potential to be. So that's one fundamental principle we can probably all align on, and that's definitely important. But I think the challenge I'm describing with BI,
Aaron Phethean (13:07)
Mm-hmm.
Ollie Hughes (13:19)
which was true pre-AI and still seems to be the problem with the way they're using AI now, is that they are still solving for how to get an individual to a chart as fast as possible. And that is only a subset of what actually drives value from data. Just getting to a chart isn't actually valuable unless it leads to a better decision at some point in time.
Aaron Phethean (13:31)
Yeah.
Yeah, there's no "what if" interface. There's no actual "and then" kind of step.
Ollie Hughes (13:46)
But even if you could ask a what-if question of an AI, you've still got to get the human decision-makers of that organization to agree: to understand the evidence, understand the methodology you used to get there, and how to weigh that piece of analysis against qualitative information and gut feel. And that's the process which actually leads to the decision, which leads to the value creation.
Aaron Phethean (13:57)
Mm-hmm.
Yeah.
I think I'm building a better mental model through that description of what that could look like. You're looking at a piece of information, and it's surrounded by, essentially, the items that made it up, references and sources almost. That's how I'm imagining it.
Ollie Hughes (14:24)
Totally. Yes, as with any good product, it's all about: what is the problem that you're solving for? And I guess one way to summarize my view on the industry and AI is that the BI industry for the last few decades has been focused on how do I reliably get a number sent out to hundreds of people at once. And back in 2005, when our cohort of BI tools, which are still in common use now, was founded, that was a very useful problem to solve, because data was scarce and wasn't easily accessible.
We're now in a context where we have more information than we can possibly cope with, and therefore the problem that a BI tool needs to solve is different. Throwing AI into that problem crudely, if you're still solving for getting charts built as fast as possible as the main problem statement of the product, is not necessarily going to make things much better. So it's about picking the right problem. And what I'm trying to say is that at Count, we've picked: how do we help
Aaron Phethean (15:03)
Mm-hmm.
Really?
Yeah, good.
Ollie Hughes (15:22)
how do we help companies make decisions: see their business, solve problems, and then make decisions as fast as possible? And that goal, I think, leads you down a different implementation of a BI tool, hence a different interface, but it also leads us to a different implementation of AI to support the decision-making cycle.
Aaron Phethean (15:38)
Yeah, it strikes me that it's also really interesting context for the individual and the organization trying to make a difference. If they sit back for a moment and think, how am I going to take data into the company in a meaningful way? Well, it's a little bit of form factor, it's a little bit of quality and supporting evidence, it's a little bit of how they're actually engaging the company with the data. I think that's quite a cool perspective actually.
Ollie Hughes (16:07)
For sure. I mean, that's the context: where are we now, in 2025 versus 2005, or wherever in history. If you generate a chart or some insight or analysis as a data person, you're now competing, as we all are in general society, with information overload. That analysis is competing with the analysis produced by the SaaS product the marketing team is using, and they're getting reports from that.
Aaron Phethean (16:29)
Yeah.
Ollie Hughes (16:37)
You've got way less attention than you had before. That's the context you're now dealing with. So just pumping out information, without making sure it's useful, valuable and clarifying, is actually just making things worse, not better. And that context is key.
Aaron Phethean (16:49)
Mm-hmm.
Yeah. Okay, got it. Let's come to the second bit, which I feel is kind of the elephant in the room and something a lot of people are really grappling with: trust in the data. And I know you have quite strong views on the trust problem for data teams. Before we get there, I just want to talk specifically about the hallucination problem. I was looking at some tests that we have that
Ollie Hughes (17:17)
Hmm.
Aaron Phethean (17:19)
produce some AI-generated output from documents, from some data. It has been fascinating over time. I've been watching these tests for six months, and the variety of generative output is huge. It's a very, very simple test in some of these cases. One of them today produced a number that was supposedly 30 million, but the significant digits
were missing one. It was actually only three million; the commas were all in the wrong places. It was just so obviously wrong. I wonder if you have techniques, or technology, that deals with that? Are those computer science problems getting solved? I sort of feel these are things that maybe are not fully exposed to everyone, and I wonder if you have some interesting insights on that.
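The failure Aaron describes, a figure out by one order of magnitude with misplaced thousands separators, is the kind of error a mechanical sanity check can flag before a human ever reads the report. A minimal sketch; the function names and the tenfold threshold are illustrative assumptions, not features of any product discussed in the episode:

```python
import math
import re

def parse_reported(text: str) -> float:
    """Extract the first number-like token from model output, ignoring commas."""
    match = re.search(r"[\d][\d,]*(?:\.\d+)?", text)
    if not match:
        raise ValueError("no number found in output")
    return float(match.group().replace(",", ""))

def magnitude_mismatch(reported: float, source: float) -> bool:
    """Flag when the reported figure differs from the source value by ~10x or more."""
    if reported == 0 or source == 0:
        return reported != source
    return abs(math.log10(abs(reported) / abs(source))) >= 1

# The "30 million vs 3 million" case from the conversation:
reported = parse_reported("Revenue reached 30,000,000 this quarter")
print(magnitude_mismatch(reported, 3_000_000))  # True: one order of magnitude out
```

A check like this cannot tell you the right number, but it can route an obviously implausible one back for review, which is cheaper than eroding trust after publication.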
Ollie Hughes (18:16)
So yeah, data trust is something we think about a lot, and we think about it a lot in the collaborative context. It's obviously the thing which every data person fears, no matter how senior you are or where you work: "this number doesn't look right." It makes you have that kind of chill down your back. So it's a real problem. It's often quoted as the top challenge people in data talk about, either that or "how do I drive more value from my team?"
Ollie Hughes (18:45)
We're talking about both these topics today. And obviously, the more accurate your data and the higher your data quality, the more that will have an impact on trust and having people trust the numbers, et cetera. Now, the mistake that almost everyone makes, and I have made, is thinking that accuracy alone drives trust. Accuracy definitely has to be there to a certain degree, but it's not all that needs to be considered. So for example, imagine
Aaron Phethean (18:58)
Mm.
Ollie Hughes (19:15)
pinging your CEO today and saying, you you've done an amazing regression analysis, just very accurate saying we should pivot all our marketing spend from this channel to this. And you just said that to them in a sentence. No matter how accurate those numbers are that you've produced, they will not trust that as a thing to make a decision. They will need more to trust that outcome.
Aaron Phethean (19:32)
Yeah.
Not unless they implicitly trust you and everything that you've said before, you know.
Ollie Hughes (19:40)
Totally. So that's obviously one variable which makes a difference in how they react: the level of intimacy or trust that you two have as people, and the track record of your outcomes. And then the other variables that matter here are: how well do you explain your methodology? Not just the number saying we should do this, but can I explain my methodology,
Aaron Phethean (19:52)
Mm-hmm.
Ollie Hughes (20:09)
how I've done this analysis, how I've got to this conclusion? That's hugely important in helping people trust that what you've done is correct. So when it comes to AI and hallucinating, one thing you can see happening really well in AI products is that they give more transparency to their methodology: the whole "I'm thinking, here's my methodology, here's how I'm going to get there". That has undoubtedly helped people trust the numbers more,
rather than it being a black box. So that sort of solves transparency. On accuracy, I've seen recently that OpenAI had a breakthrough in how to score answers so that a model is rewarded for not hallucinating, which is really powerful. So you can do more to improve the accuracy of the number, so it's not out by one significant figure or whatever. But the other bits of the equation are track record, and the relational bit of a human communicating in an empathetic way,
Aaron Phethean (20:56)
Yeah.
Ollie Hughes (21:06)
which an AI can't do as well. So basically, trust has various factors to it. All this theory, by the way, is very much built off the trust equation, which is used by management consultants to think through how they can build trust with clients and help them more; this is just a version of that. The data trust equation has exactly the same principles, just related to data. And there's more to it than data quality and accuracy as variables. Some of those factors AI can solve, but other bits it won't. So what does that mean?
Ultimately, human-to-human interaction is going to be very valuable when it comes to making decisions with trust. And it also means there's more than data quality that a data team can do to drive trust in what they do. Because what we're really talking about here, when people say the numbers aren't right, is that the data team worries they're not being valued enough, or that their value to the company has dropped. So there's more they can do to solve that problem.
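The trust equation Ollie refers to is usually attributed to Maister, Green and Galford's book The Trusted Advisor, and is commonly written as:

$$\mathrm{Trust} = \frac{\mathrm{Credibility} + \mathrm{Reliability} + \mathrm{Intimacy}}{\mathrm{Self\text{-}orientation}}$$

One reading of Ollie's argument, though he does not state the mapping explicitly: data accuracy feeds mainly into credibility, track record into reliability, and the human relational piece into intimacy, so improving accuracy alone moves only one term of the numerator.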
Aaron Phethean (21:41)
Yeah, yeah, exactly. And as I think you.
Mm-hmm.
I can definitely relate to that. There is this feeling amongst a lot of the people I speak to on a day-to-day basis that one thing wrong could erode months of trust building. That's the feeling they have; they worry about the numbers that much. And I think that insight, that explaining the methodology can build trust even if the answer is wrong, matters. A bit like on a maths test: you got the wrong answer, but you can see the workings were right. My son actually had a maths test where he got almost 100%. He was absolutely delighted. The one question he got wrong, he used six days in the week. He so obviously knows there are seven, but he still made this one error, and it was explainable; you can understand how he arrived at it. Not a black box. I think opening up more is a way to remove that fear.
Ollie Hughes (22:51)
Yeah.
Yeah, and ultimately, in that example, knowing the context of the question, you can say: using six days instead of seven is actually only about a 14% error. You can put the error in context, yeah, exactly.
Aaron Phethean (23:12)
Yeah, and you can actually see how he arrived at it, and then you can both arrive at the right answer together. I really
like that. So definitely a few things there feel really helpful. I wonder then, coming back to the challenges that you set out for Count and the kinds of companies that you work with,
Ollie Hughes (23:34)
Mm.
Aaron Phethean (23:35)
where do you see the overall focus for you in the next 12 months to two years? What are the big things on your horizon, and maybe they help other people's horizons and teams?
Ollie Hughes (23:53)
Yeah, for sure. I mean, there are obviously two ends of that: what am I trying to help achieve for my customers, and what are we trying to achieve as a business? I think the former is more important. One of the things that has really powerfully come out of building Count and working with data leaders is how this much more flexible, collaborative platform, which gives a lot of freedom, allows data teams,
and we've worked with our clients on this, to find a way of working which has the maximum value to the business. You've got fewer constraints. Rather than trying to work around your read-only dashboard and asking what you can do differently, you're now working, talking, thinking and presenting numbers in a different way, communicating in the most natural way to drive value. And it's been really amazing to see customers working differently and
Aaron Phethean (24:29)
Mm-hmm.
Ollie Hughes (24:48)
shifting out of the report hamster wheel in a way they never have before. And so that's been really powerful. We now have a really clear and measured view of the way data teams should work to maximize the value they bring to the organization, which anyone can apply, regardless of whether they use Count or not. So it's been wonderful to realize that and then codify it:
this is the way data teams should be working to drive value to the business. It actually makes them AI-proof: it gives them a purpose beyond just doing all the stuff that AI will hopefully do for them later. It quashes this idea, which I hear a lot: will there actually be a data team in the future, or will it all be AI? And the answer, I fully believe, is that there's a really important role for a data team. Maybe a different shape, but there's a real role for
Aaron Phethean (25:21)
Mm-hmm.
Mm-mm.
Ollie Hughes (25:46)
an analytical team in an organization. We can dive into what that looks like. But my goal for the next two years is to keep digging into that role, revealing it to the market, showing what it looks like, so everyone can benefit.
Aaron Phethean (26:01)
Yeah. Almost advocating for the data team and the creation of the data team. And I think there's almost a thought that self-service, done right, is the answer. But in a lot of companies, maybe not just self-service but being data-driven, which is another way of talking about it, or how people run their businesses, still doesn't, I think, have all the
Ollie Hughes (26:08)
For sure, for sure, yeah.
Mm.
Aaron Phethean (26:26)
dials and gauges and measures and things that help them make quantitative decisions in a lot of cases. I don't think that's job done. It feels like you're advocating for teams and how they can help and how they fit in and drive a good business.
Ollie Hughes (26:40)
Yeah, I think so. I mean, don't get me wrong, I hope a lot of the tedious stuff goes away, I hope self-service will be there, and I hope that allows data teams to have higher-value activities as their primary focus. What I'm saying is that some data teams are so stuck in that service trap of just doing stuff for the business,
Aaron Phethean (26:57)
Mm-hmm.
Ollie Hughes (27:06)
it does feel like an AI could replace a lot of that. And then the question becomes: what do I do now? Well, I'm saying there's definitely an answer. I know what the answer is. And that's what's really exciting. Yeah.
Aaron Phethean (27:13)
Mmm.
Yeah, I agree. That trap, that kind of service trap, is spoken about quite a lot. I wonder though, how do teams get in and out of it? As a kind of strategy, if you identify that you're in that service trap, where do you start? How do you start to fix it, as a leader seeing it happening to your team?
Ollie Hughes (27:44)
Yeah, sure. The essence of the service trap, in my opinion, the reason it's a trap, is that the data team is working in a way which, by trying to help the business, actually doesn't help the business. Does that make sense? That's why it feels like the right thing to do, but over time it leads to harm. So for example, that would be
Aaron Phethean (28:06)
Hmm.
Ollie Hughes (28:11)
just building reports as the business requests to see more information, just answering every question the business asks of you. It gives an immediate sense of "I've done something to help", but over time it produces this information overload problem. There are assets everywhere saying different things, the business is getting quite confused, and everyone's got their own version of the truth, no matter how many times you centralize metric definitions.
Aaron Phethean (28:31)
Yeah.
Ollie Hughes (28:37)
Everyone's got their own version of that metric in their different reports, filtered in different directions. So one of the biggest pains is that just doing what the business asks of you not only psychologically positions you as a service function to them, it also generates information which, over time, floods the business with chaos. And it caps how much value the data team can bring, right? Because if you're only doing what the business tells you, and
Aaron Phethean (28:40)
Yeah.
Yeah. Yeah.
Ollie Hughes (29:06)
the business is asking stupid questions, the data team is going to be producing stupid answers.
So you'd rather the data team is involved in deciding what the right questions are to ask, so that they have a seat at the table. What matters? That's really important. And then the other side of it, which is more in our control as data people, but which I see a lot, is that we often work on things which are entirely in our control but have absolutely no business value whatsoever. We can polish our pipelines to within an inch of their lives.
Aaron Phethean (29:10)
Yeah, yeah.
Yeah.
Ollie Hughes (29:33)
Back to this idea of accuracy: get from 99% accuracy to 99.5% accuracy, tick the box and feel we've made a difference, but absolutely no one notices that difference. Or if it does make a difference, it's in a very small, very niche situation. And that's also low value-add. So that's what it feels like to be in the service trap, and those are the behaviors that lead to it. In many ways, the symptoms of it are: do you have a ridiculous number of dashboards for the number of employees in your business?
Aaron Phethean (29:47)
Yeah.
Mm-hmm.
Ollie Hughes (30:01)
Are you spending a huge percentage of your team's time just supporting data quality, supporting status-quo reporting and maintaining reports, rather than working on problem solving? Those are the symptoms of a service-trap function. The way you should be working goes against that: you try to work in a way which minimizes these failure modes.
Aaron Phethean (30:05)
responding.
Yeah, if I had
to summarize some of that, my mind went to one of the pieces of advice I heard: essentially, you work on the questions, on what questions should be asked. Maybe that's the most important KPIs or the most important measures, and less so on the outputs, the answers, and the reports. You have more discussion about what questions we should be asking, and which are the important ones for us.
Ollie Hughes (30:49)
Totally, and
I completely agree. That's a great way to summarize the difference: you've got to work in a way where you're involved in deciding what questions, i.e. what work, you should be doing, and make sure that work is ruthlessly prioritized. You can do that in a few ways. One, obviously, is just to filter hard the requests you're being given, but a better way is to set the groundwork: help the business have a fundamentally better understanding of itself.
Aaron Phethean (31:03)
Mmm.
Yeah.
Ollie Hughes (31:20)
So we call
it operational clarity: this idea that, in a world where every SaaS product is pumping out information and metrics to every individual person, the role of the data team is no longer to just throw out even more reports. The role of the data team is to produce a sense of real clarity and alignment across the business, where the data team is using their superpower of knowing the whole business and saying,
Aaron Phethean (31:38)
Mm-mm.
Ollie Hughes (31:47)
here's how the whole business fits together. Let me show you our business as clearly as I possibly can, so we can all see what's really going on, we can all see where the biggest problems are, and we can all align on what really matters. So that alignment, that kind of making the business feel simple, showing the forest rather than the individual branches, as it were, giving the holistic view, how things relate to each other, what's really going up and down, and making that a
Aaron Phethean (31:58)
Yeah.
and how they relate, yeah.
Ollie Hughes (32:14)
really clear signal of what's happening versus all the noise of background information. That's a much higher-value activity. You're basically visualizing the business, visualizing the growth model, making the business feel as simple as possible to get that alignment. And that's a different goal than just pumping out dashboards. And from that, what we see again and again with our customers is that the questions you get asked are much better informed. So you don't have to go in to bat and say, that's a stupid question.
You're helping them just have a better understanding, so the questions they're asking you are better off the bat anyway. And then you have a much more informed conversation about what really matters.
Aaron Phethean (32:49)
Mm-mm. Yeah.
Yeah, actually there are tons and tons of great advice in there, and so many little nuggets that I hope data leaders and teams pick up on. Starting to wrap up and thinking about all the things we've discussed, I wonder if there are one or two bits of advice,
now thinking about the teams and the positions that companies are in: what are the things that they should start, and what are the things that they should stop?
Ollie Hughes (33:29)
I mean, yeah. So I think it's really important for data teams to recognize... there are a few things to say here, I could say so much. I will wrap up though. Yeah, no, sure. I think the core idea here, the core mindset shift that data teams need to get into, is you need to recognize that
Aaron Phethean (33:43)
We might have to press you for one.
Ollie Hughes (33:57)
There's a power law, a distribution, attached to the insight in the work you do: not all requests are equal. Some are really, really important, and you'll be remembered for helping on those really well. And the ones that come through that you're asked to do, which feel urgent, may not be. You need a better way of filtering that than just saying yes the moment it arrives: "this is urgent, please do X, Y, Z." People will forget over time that you didn't help them on that urgent thing.
If you've solved the most important problem the business has today and the CEO recognizes that, that's the fundamental truth, I think, which matters a lot and helps you prioritize your time really well. It doesn't mean don't answer those questions; it just means be ruthless about how much time you're spending on that kind of stuff, so you can spend the maximum of your team's payroll on the biggest problem in the business. So the one thing I would say, to summarize, is that the most important thing a data team can control
Aaron Phethean (34:43)
And there's only a finite capacity.
Yeah.
Ollie Hughes (34:56)
is getting the business to be clearer, and also working out the allocation of your payroll as a team, your time ultimately, which is the biggest cost you have to the business. If you can measure yourself, even crudely, and say how much of our time is spent on maintenance versus solving problems the business is really struggling with, then you can look the CEO in the eye and say, "I am spending at least half my time on the biggest problems the business is facing right now, and I'm contributing towards solving them."
Aaron Phethean (35:03)
Mm-hmm. Mm-hmm.
Ollie Hughes (35:25)
The best thing you can do is have confidence that you're not just spending your time on maintenance, keeping performance flat, basically. So measure yourself. Yeah.
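Ollie's "measure yourself, even crudely" exercise can be sketched in a few lines. This is a minimal illustration, not anything from Count: the task names, categories, and hours below are entirely made up, just to show the shape of tagging work and checking what share of the team's time goes to real problem solving versus maintenance.

```python
# Crude self-measurement of a data team's time allocation:
# tag each piece of work with a category, then see where the payroll goes.
# All tasks, categories, and hours here are hypothetical examples.
from collections import defaultdict

work_log = [
    ("maintain nightly pipeline",     "maintenance",      6),
    ("fix broken dashboard filters",  "maintenance",      4),
    ("ad-hoc report for sales",       "requests",         3),
    ("churn-driver analysis for CEO", "problem_solving", 12),
    ("pricing experiment deep-dive",  "problem_solving",  8),
]

# Sum hours per category.
hours = defaultdict(float)
for task, category, h in work_log:
    hours[category] += h

total = sum(hours.values())
for category, h in sorted(hours.items()):
    print(f"{category}: {h:.0f}h ({h / total:.0%})")

# The "look the CEO in the eye" check: is at least half the team's
# time going to the biggest problems rather than maintenance?
print("majority on real problems:", hours["problem_solving"] / total >= 0.5)
```

Even a rough tally like this, kept weekly, is enough to answer the question Ollie poses: maintenance versus problem solving, as a percentage of the team's time.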
Aaron Phethean (35:36)
Yeah, I
feel like that point actually resonates with me the most: where their effort is going and how they're measuring themselves. You would think a data team has a really great grasp on that, because that's what they're doing for the organization.
Frequently not: they're frequently really not sure where they bring the most value or how they spend their time. And yeah, I definitely see an awful lot of that. I think it's great advice as a place to start.
Ollie Hughes (36:07)
Exactly Yeah.
Aaron Phethean (36:08)
It's absolutely tremendous, Ollie. I appreciate the almost raw thoughts and the advice that you have for people. And I feel like we've had a really good conversation here about a lot of topics and hopefully helped a lot of data leaders and teams out there.
Ollie Hughes (36:26)
Hopefully. I'm sure some of this stuff will be relevant to some people; some of it may be easy to forget, but I hope at least some things stick for the audience. I have a lot of very strong opinions, which I think people are picking up on. I hope that's helpful.
Aaron Phethean (36:40)
Yeah, and
you're very active on LinkedIn, so people can find you and reach out. Actually, some of the things we discussed, like the trust post, you've written very detailed blogs about, and that kind of well-thought-out material is in short supply. So I think it's a great place for anyone to go and take a look. Thanks for coming on.
Ollie Hughes (36:45)
Sure
Thanks
Aaron, it's been great. I really enjoyed it.
Aaron Phethean (37:04)
Cheers.