Data Matas

This episode with Adam Dathi is a must-listen for anyone looking to turn data into a strategic powerhouse within their business. Adam shares practical insights into how data teams can work seamlessly with other departments for maximum business-wide impact, and gives his take on the future of AI-driven data analysis. Aaron and Adam discuss the critical role of reliable sources and governance, and observe that data quality is not an isolated issue for a specific team but a company-wide responsibility for anyone looking to harness the true power of their data.

Takeaways

  • Data quality is a company problem, not just a data problem.
  • The sophistication of data teams depends on company size.
  • Data governance is a hidden long-term investment.
  • AI can't generate ideas without human direction.
  • Data maturity impacts decision-making effectiveness.
  • Data teams need to interface better with the rest of the company.
  • The value of data is tied to decision-making outcomes.
  • Standardizing terms can improve data governance.
  • Data quality issues often stem from company culture.
  • Investing in data governance can yield hidden benefits.

Titles

  • AI and the Future of Data Analysis
  • Unlocking the Power of Data in Business

Sound Bites

  • "Data quality is a company problem, not just a data problem."
  • "The sophistication of data teams depends on company size."
  • "Data governance is a hidden long-term investment."

Chapters

  • 00:00 Introduction to Data in Marketing
  • 04:43 The Role of Data in Strategic Decision-Making
  • 07:34 MVF: An Integrated Media and Marketing Company
  • 10:45 Challenges in Data Technology and Team Dynamics
  • 13:52 Data Quality vs. Company Culture
  • 16:49 The Importance of Data Governance
  • 19:55 AI's Role in Data Analysis and Decision-Making
  • 22:39 The Future of Data and AI in Business
  • 25:51 Conclusion and Reflections on Data's Impact
  • 49:59 Wrapup and Final Thoughts




What is Data Matas?

A show to explore all data matters.

From small to big, every company on the market, irrespective of their industry, is a data merchant. How they choose to keep, interrogate and understand their data is now mission critical. 30 years of spaghetti-tech, data tech debt, or rapid growth challenges are the reality in most companies.

Join Aaron Phethean, veteran intrapreneur-turned-entrepreneur with hundreds of lived examples of wins and losses in the data space, as he embarks on a journey to discover what matters most in data today by speaking to technologists and business leaders who tackle their own data challenges every day.

Learn from their mistakes and be inspired by their stories of how they've made their data make sense and work for them.

This podcast is brought to you by Matatika - "Unlock the Insights in your Data"

Aaron:

Hello, and welcome to the show. Today we sit down with Adam Dathi and discuss all things data: teams, technologies, AI, you name it, we've discussed it. Let's dive in. Hi, Adam.

Aaron:

Welcome to the show. Thanks for coming on and telling us all about your experience and the world of data from your perspective. Why don't you start by telling us a little bit about your current role and how you found yourself there?

Adam:

Sure. Thanks, Aaron, for having me. So my name is Adam Dathi. I have a background in strategy consulting, and I've been in the world of data for 12 or 13 years now. I started out working for a startup, where I built up the data and BI team.

Adam:

I worked for a data consultancy called RedKite for about 2 years, and I was head of data at Farfetch for 4 years. At the moment, I'm head of data at a company called MVF. In terms of who MVF are, they're an integrated media and marketing company. MVF has a number of different brands, and they try to attract different customers to those brands.

Adam:

So there's a B2C department and a B2B department. We create engaging pieces of content, we attract different customers and clients, and then we connect them to suppliers, who are our clients. We have different brands like Eco Experts, Expert Reviews, and Startups.co.uk. The whole idea is to connect potentially willing customers and clients with people who can provide them with whatever it is they're after.

Aaron:

Well, that sounds like an absolute wealth of experience, so all that's left to do is dive in. I have a theory. I want to dive into your experience and your unique way into data, but I have a theory that data starts in different parts of the organization in different companies, and that affects the outcome.

Aaron:

It might start in finance, and you get a perhaps more spreadsheet-driven view of the world. Or it starts in ops, and you get a more reporting-oriented view of the world. I wonder about your experience; well, you've mostly been in marketing-led companies. I'm guessing the data teams you've worked in started in the marketing department.

Adam:

Funnily enough, no. I mean, I've crossed into marketing and media in a few different places, but I'm not sure I'd say it's even the majority of my experience working in business and data. Maybe it's about half of the time. My initial experience was in strategy consulting.

Adam:

And I suppose, to some degree, that's influenced a lot of what I think data should be doing. I think about data as an avenue to make strategic decisions. So in some ways it has actually influenced a lot of how I operate, but probably not specifically from a marketing perspective.

Aaron:

Interesting. Well, actually, that's maybe even better, because you have perspective from different approaches. I wonder, then: does my theory hold any weight at all? Does where data starts in a company affect how the team develops, how the technology is used, and how it's applied for decision-making? Does the genesis actually make a difference?

Adam:

I think it does. Going back to some history of data: Ralph Kimball, who created dimensional modeling, discovered most of what he knew about data from marketing and how marketing teams were using data. Marketing teams are generally very data-driven. I think it's probably a fair assumption that the background of the team, the background of the people, plays an influence on the strategic direction of the data team. Yeah.

Adam:

I mean, do you have any specific examples in mind that made you bring this up?

Aaron:

Let's take a little step back to you and your role, and perhaps we can circle back to why I started from there. Your company now: why don't you tell us a little bit about what MVF do and what you do for MVF? Then we'll get back to what I was thinking about in terms of how they use data.

Adam:

Sure. So MVF is an integrated media and marketing company. The way it operates is that it has a number of key brands, like Expert Market, Eco Experts, Tech.co, and Startups.co.uk. Those brands, in different ways, try to connect interested customers or clients with companies that want to sell them something. A lot of it is brand- and content-led marketing.

Adam:

The way it operates is that it tries to create a fairly end-to-end ecosystem: if a consumer has a demand, or a client wants to purchase a new SaaS product, it offers content to help them along that journey and connects them to the right people who can sell them the product. That's essentially what it does.

Aaron:

And I think we see there are a lot of sources. You tend to be taking from many different places to help with this decision-making. Is that fair to say?

Adam:

When you say sources, do you mean, like, conceptual sources, or do you mean, like, data sources?

Aaron:

I think both, actually. You could imagine a system that had just one connector but a lot of different inputs from different kinds of departments. What I see, and what people say quite a lot, is that marketing tends to demand many different types of products to be connected, because they're using a lot of different types of products.

Adam:

I think that's very true. And I know we've had a conversation before about the number of different connectors that MVF uses. So you've probably seen we've got a really long tail of different sources that we're interacting with. And you're right, there's a very long tail of different marketing sources.

Adam:

However, they also use some of the big, prominent ones as well.

Aaron:

Yeah. And if you compare that to the finance-led or consulting-type roles and companies you've been in, are they similar? Do they have a lot of sources, or do they tend to have one really high-volume data source that you have to analyze in a lot of detail? How would you compare? Are they different?

Adam:

I would say there's some level of sophistication with lots of the connectors. I never felt that MVF's stack was fundamentally different, but I think it depends on how mature the marketing departments are. And I think that plays a big factor in an almost industry-agnostic manner. If you have a really diverse marketing audience that you're trying to engage with, and you're doing it through lots of different channels, trying to do it via influencers, and you're using Google and Facebook and all the standard stuff as well as lots of internal tools, then it does become a lot more complicated. But in a sense, I'd say that's more industry-agnostic.

Adam:

You could probably judge it more by the size of the company, especially the amount of legacy products they have, and the different markets and geographies they're in. I think there's quite a good correlation between those aspects and how many connectors they end up using. Legacy stuff in particular.

Aaron:

That actually makes total sense, doesn't it? The most influential thing is how big the company is, how varied their trading is, what's actually going on for the company. And would you then call MVF quite sophisticated? The marketing team there is after a very data-led approach to their decisioning.

Adam:

Yeah, I'd say they are. And to a certain extent, I'd say they almost have to be, because that's the way in which they're going to differentiate themselves. Lots of companies may try to sell themselves on the product, but for MVF, the marketing is part of the product in a sense, because we're saying we can connect you with the right people.

Adam:

And how do we find those people, those customers and clients? We need to attract them via different marketing campaigns. So I think they need to be as sophisticated as they can be.

Aaron:

Yeah. That's a really interesting observation, actually, and it somewhat disproves my general idea that the sophistication arises because of where data started in the company. And, actually, it makes total sense.

Aaron:

The size is far more important. So, you're relatively recent at MVF, and you've been dropped into these kinds of leadership roles in other companies. What did you find? Did you have a preconceived idea about what to do?

Aaron:

What was the journey like in the first little while after you joined?

Adam:

I think, luckily, there was already a fairly clear idea about some of the things MVF wanted, which in some ways made my job fairly straightforward. For example, one of the big things they brought me in to do was to invest in data science. They wanted to find potential data scientists, work with them on the roadmap, and understand whether there was enough value to justify further investment.

Adam:

There were certain other things as well, in terms of looking at the scrum setup of the team. But I'd say they had a fairly clear idea of some of what they wanted me to do. There are some companies I've entered where, in some ways, it's easier because of my experience: if they have less of an idea of what they want to do, it's sometimes easier to prove value. Whereas, funnily enough, I felt MVF in many ways was quite mature, and they knew what my remit should have been. And a lot of it I agreed with, which makes for a less interesting conversation. Usually, I like it when I can say, "I disagree with this; I think we should be doing things that are fundamentally different." I think that's down to my consultancy background.

Adam:

I like it when I can say something that's different from what they're currently doing, as opposed to, "here are your ideas about what you should be doing; do you just want me to implement them, or do you want to do something different?" In this particular case, it was very similar to what they'd already had in mind.

Aaron:

Yeah, that's interesting. I can definitely see that. If you come into a mature team with mature processes, albeit as head of it, it's quite hard to then rip it all apart and tell them to do it differently.

Aaron:

You know, that's illogical; why would you do that? Whereas if you go in and it's a mess and things are falling to bits, then actually adding a bit of process can show some real value, because, hey, we're more organized now and there are fewer fires day to day. That helps in terms of adding value when you join.

Adam:

Yeah, exactly. If things are going wrong, the only way is up; you can't make things much worse.

Adam:

So it's a good position to be in. If you have a certain level of experience and you're going into a position where you think there are lots of things you could easily change to make things better, that's often a good position to be in. An easy way to prove value.

Adam:

I also believe in something I've taken on from the scrum coach approach, which is that when I enter teams, I usually don't try to change things straight away. I try to get a good feel for what's going on in the company and what I think makes sense. Often, one of the things I might end up changing initially is the ways of working of the team. And there were some things we ended up tweaking, changing, and improving. But, fundamentally, I don't think things are that different, and they're not that different in terms of the ways of working, because I don't think they needed to be.

Aaron:

Did you look for a role that had scrum ways of working? That obviously suits you; that would be how you would guide things. Or was it just a happy coincidence that you found a role that was scrum-based?

Adam:

If anything, lots of the analytics engineering teams I've worked with like to operate in sprints, but I've also done some scrumban approaches, a hybrid between scrum and kanban. I wouldn't say I specifically looked out for it. If it didn't exist, if there was nothing really there, then I probably would have evaluated whether I could have implemented it. The fact that it was already there means it's aligned with the industry, because I think that's how the majority of teams operate.

Adam:

So, no, I wouldn't say I looked out for it. If anything, I would almost have liked it if they were earlier on in that journey, and then I could have helped them a little bit more. But they were already fairly developed; I think they'd been doing it for a year before I joined.

Aaron:

Yeah. Then again, it can't all be smooth sailing. What does go wrong? Because it sounds like the team's working well.

Aaron:

Are there other things that don't quite go to plan?

Adam:

You mean at MVF, or in data, in my experience generally?

Aaron:

I suppose, yeah, it's always nice to look at the current role and what's happening day to day, but maybe it's worth sharing your experience of what does go wrong in data teams. If the team's working well, maybe the technology is a problem, or it might be working with the rest of the company, the interface. There could be a lot of different things that go wrong. With your experience, it's fascinating to understand what does go wrong.

Aaron:

What do you see?

Adam:

I generally don't see as many issues with the technology itself. MVF has had some issues with some technology vendors, but generally I think those are rarer. If you're using a fairly modern tech stack, which they are, then things generally seem to operate fairly well. Where I see issues in data arising is sometimes with migrations.

Adam:

I mean, that's often a problem. When you realize there's some kind of legacy architecture and you want to align it and rationalize it, and then you want to migrate, that's sometimes quite costly and difficult. I also often see issues with interfacing between different teams and competing priorities.

Aaron:

Yeah.

Adam:

Specifically, upstream teams that then feed into data. I think that's a very common problem. Something breaks, the data team investigates it, and they realize it's an upstream change they actually didn't have any control over. And in general, I think data's place within a company can be a bit of a topic of discussion. When should they be brought into projects?

Adam:

Are people aware that if they make certain changes in tech systems, then it might have a downstream effect?

Aaron:

I think I like that way of talking about it.

Adam:

Sorry, you like that way of talking about it?

Aaron:

Yeah, I like that way of talking about it, because a lot of people talk about it as a data quality issue, but actually it's more like an interface issue between the data team and the rest of the company. If people don't even know there's a data team downstream to worry about, of course quality problems are going to surface there, but that's not exactly where they start.

Adam:

Yeah. I think this has a lot to do with the company's internal ways of working rather than necessarily being a data quality problem. "Data quality" makes it sound like it's the data team's responsibility. And there is a responsibility to make sure the data is good quality, but it's not specifically a data problem. It's a company problem that surfaces via the data.

Adam:

And it's usually just a lack of alignment. I do think that in lots of places I've worked, data quality is an issue, and I've seen different levels of investment in trying to support it and correct it. But quality is something where most teams I've seen could be doing more to monitor what they're producing. I also think SQL itself doesn't lend itself naturally to things like unit tests. There are certain best practices that would be there if you came from more of a development or software engineering background, and it's taken a while for them to migrate into the analytics world.
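
To make the unit-test point concrete: one way analytics teams borrow the practice is to express each data quality rule as a small SQL assertion that counts offending rows, in the spirit of dbt-style schema tests. Below is a minimal, self-contained sketch in Python; the leads table and its columns are hypothetical examples, not MVF's actual schema.

    # Minimal sketch: data quality rules as SQL assertions that count bad rows.
    # Table and column names are hypothetical.
    import sqlite3

    checks = {
        # Each query returns the number of offending rows; 0 means the check passes.
        "lead_id is never null": "SELECT COUNT(*) FROM leads WHERE lead_id IS NULL",
        "lead_id is unique": ("SELECT COUNT(*) FROM (SELECT lead_id FROM leads "
                              "GROUP BY lead_id HAVING COUNT(*) > 1)"),
        "score is between 0 and 1": "SELECT COUNT(*) FROM leads WHERE score NOT BETWEEN 0 AND 1",
    }

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE leads (lead_id INTEGER, score REAL)")
    conn.executemany("INSERT INTO leads VALUES (?, ?)",
                     [(1, 0.42), (2, 0.91), (2, 1.7)])  # duplicate id, out-of-range score

    for name, query in checks.items():
        bad_rows = conn.execute(query).fetchone()[0]
        status = "PASS" if bad_rows == 0 else f"FAIL ({bad_rows} rows)"
        print(f"{name}: {status}")

Run on a schedule against the warehouse, a failing check alerts the team instead of a stakeholder discovering a broken metric.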

Aaron:

Yeah, exactly. To characterize it a little: the data team tends to be expected to follow fast and just put up with what they get. And that is obviously really hard.

Aaron:

That's a really hard place to be, because either you're pressed for time or you've got no ability to change your input. That definitely hampers what you can produce, and it seems to be the data team's slot in the world. And like you said, calling it a data quality issue almost lands it on the data team, and that's unhelpful in its own way.

Adam:

Yeah. Yeah.

Aaron:

Yeah. What would you do about it? In the ideal world, what would the perfect solution be?

Adam:

I mean, the perfect solution is that you co-develop solutions. If there's a change to an upstream system, the data team is informed at the right point. But I've always had discussions about what the right point is. Do you involve them too early? Because if you do, there's not much they can really contribute to the conversation.

Adam:

And if you involve them too late, they end up having to respond too quickly, or something just breaks, and then you end up with the end consumer telling you, "this metric doesn't work anymore, can you fix it, what's the problem?" A lot of the ways companies are now trying to solve this is via more sophisticated data quality tools, to understand where there's a problem before the customer does, but that's kind of the last resort. Beyond that, I think it's more about data governance and having close relationships with the engineering teams.
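
The kind of check those data quality tools run can be surprisingly simple: monitor each table's freshness and load volume, and alert when either drifts, so an unannounced upstream change surfaces before the end consumer sees a broken metric. A rough sketch, with hypothetical thresholds and dates:

    # Minimal sketch: catch upstream breakage via freshness and volume checks.
    # Thresholds, dates, and row counts are made-up examples.
    import datetime

    def check_freshness(last_loaded, now, max_age_hours=24):
        """Flag a table whose newest data is older than the allowed window."""
        return (now - last_loaded) <= datetime.timedelta(hours=max_age_hours)

    def check_volume(todays_rows, trailing_avg, tolerance=0.5):
        """Flag a load whose row count drops far below the trailing average."""
        return todays_rows >= trailing_avg * tolerance

    now = datetime.datetime(2024, 10, 1, 9, 0)
    last_loaded = datetime.datetime(2024, 9, 29, 23, 0)  # stale: upstream job broke
    print("freshness ok:", check_freshness(last_loaded, now))                # -> False
    print("volume ok:", check_volume(todays_rows=410, trailing_avg=1000))    # -> False

Commercial tools add anomaly detection and lineage on top, but the underlying idea is this: detect the symptom automatically, then use the relationships Adam describes to fix the cause upstream.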

Adam:

Often I've seen people trying to solve it via product managers. So product managers from software engineering teams start to coordinate with the product owner of a data team, be that a separate product manager or the manager of the data team, depending on who's doing that role.

Aaron:

Yeah, I see that quite a lot with product management: it's quite easy for them to understand the user interface and what they're making the product look like, and a lot harder to articulate what they want their own reporting or outputs to be. That tends to be the natural way of the world, that they wait until afterwards to think about the output, whereas it's obviously an awful lot better if they can think about what they want to get out at the beginning.

Adam:

Yeah. I feel like this is a data culture and data maturity type of problem. I think it's often almost unfair to expect a product manager to understand what the data implications are. I've seen this in a few different places; in fact, it's much the same.

Adam:

At MVF, and also at Farfetch, we had this as well: there would be certain product managers associated with specific teams that were non-data-related. And because data is a horizontal discipline, a centralized data team ends up operating across loads of different businesses, verticals, and departments. Whoever's the product owner for that data team cannot reasonably be expected to go and interface with every single team in the company, work out the requirements, and bring it all together. So one of the ways people try to solve this is: you already have product managers or product owners for those specific departments, those other non-data teams.

Adam:

Why don't we just get them to interface with the central data product owner? And then we've kind of bridged that gap. But whenever that's happened, I've found the non-data product owners struggle to articulate the data needs they might have, and to realize there might be a data implication to changing the source system. So in some ways it simplifies the problem, because you at least know who you're supposed to speak to.

Adam:

But on the other hand, it doesn't necessarily solve the underlying issue, which is a potential lack of data maturity, because data isn't the first thing a product owner who owns, I don't know, a payment system will be thinking of. They won't think about how we would report on these payment metrics. And that's understandable; it isn't their primary objective.

Adam:

So sometimes it can fall off their radar.

Aaron:

In the same way you wouldn't put security into every single team: you'd have a principle across the whole company, but you'd centralize the responsibility to some degree. It makes tons of sense to do the same thing with data. It's a cross-company concern, so centralizing it makes lots of sense.

Aaron:

Heading back to MVF and the way things work: I was quite fascinated when we were both at an event the other day and you were on a panel. One of the guys I was sat next to, talking about the marketing mix model, was in a completely different industry. That's a very specific way of using data, but he could immediately see how it was useful elsewhere, in other disciplines, not just marketing. I wonder, in your experience, is that something you've spotted as well? That there are multiple cross-functional, very similar ways of using data, and that perhaps we could do a better job, as an industry, of thinking about the patterns and about how we share and make more of that knowledge?

Adam:

Yeah, I think that's true. The underlying tools you might end up using to solve data problems are actually very universal; they're industry-agnostic. It almost doesn't matter which department you're speaking about.

Adam:

So if you have a good encyclopedia of the different ways you're solving problems for one team, you'll probably find you could apply them to lots of other departments. It's why, if I was hiring for an analyst or someone in a data role, sometimes it doesn't matter to me what their industry experience has been; it's more about the technical skills they've learned over that time, and their attitude and behaviors. Those things are more important to me than specific domain experience. But domain experience can definitely help you hit the ground running.

Adam:

I've also found that lots of companies have lots of idiosyncrasies in their own data, and it takes you a while to learn about some of them. So even a really experienced marketing analyst who goes to a different company, with a completely different data warehouse, different data models, and its own nuances, still has a lot of learning to do before they can add value.

Aaron:

Yeah, I definitely see that as well. The other thing I think domain knowledge really helps with is just the naming. Every single customer I go to, even in the same industry, has their own names for things. And it can be really frustrating, or quite hard, to understand what they're talking about, just because there's a whole culture built up over time where everyone knows what that thing is called.

Aaron:

And you come in from the outside and, of course, have no idea what that widget is in their lingo. The other thing I'm interested in is...

Adam:

Sorry, I was going to say, they have different names for the same thing as well.

Aaron:

Exactly. The same thing with different names.

Adam:

Yeah. I mean, you probably see a lot of that in your work with Matatika, right?

Aaron:

That is very much what we have to deal with: different types of customers, different types of industries. When you get into a company and a project, you're trying to rapidly understand what they're talking about and what they call things, so you can get the right point across. And like you said, sometimes in a big company they're calling the one thing multiple things; they haven't decided themselves what they're going to universally call it.

Aaron:

And part of what we might do is try to repeat the one most popular term, to almost influence that that's what we're going to call it now, especially if we're building a data model or writing documentation and trying to make sense of it for them. That's far from easy.

Adam:

Yeah. That's what a lot of data governance projects have been trying to achieve in the past. Let's standardize our terms; let's get a standardized business glossary.

Adam:

Let's build a proper data dictionary. It just takes a lot of time to do something like that effectively.
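
As a rough illustration of what "standardizing terms" can mean in practice, here is a minimal sketch of a glossary check: every synonym in use around the business maps to one canonical term, and any warehouse field that doesn't resolve gets flagged for a definition. All names here are made-up examples, not a real MVF glossary.

    # Minimal sketch: resolve field names against a canonical business glossary.
    # Glossary entries and field names are hypothetical.
    GLOSSARY = {
        "lead": "lead",
        "enquiry": "lead",      # competing name for the same concept
        "prospect": "lead",
        "client": "client",
        "supplier": "client",   # assumed synonym, for illustration only
    }

    warehouse_fields = ["lead_id", "enquiry_source", "prospect_score", "widget_type"]

    def canonical_term(field):
        """Return the glossary term a field name resolves to, or None."""
        for synonym, term in GLOSSARY.items():
            if synonym in field:
                return term
        return None

    for field in warehouse_fields:
        term = canonical_term(field)
        if term is None:
            print(f"{field}: NOT IN GLOSSARY - needs a definition")
        else:
            print(f"{field}: canonical term is '{term}'")

Even a toy check like this surfaces the two failure modes discussed above: several fields resolving to the same concept under different names, and fields (the "widget") that nobody has defined at all.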

Aaron:

In your experience, has there been a return on that investment?

Adam:

Generally, I think there has been a return on investment. The issue I see is that it's almost like writing good-quality, well-documented code: you do it with the hope that you won't actually need it, like writing good handover material so that if you were to leave the company, someone else would be able to take it over. I think some of the value of very good data governance as a best practice is hidden in the long term of the company.

Adam:

But occasionally there are points where something is just really confusing, and solving it has an immediate impact. Say there are lots of competing definitions and people don't know which field in the data warehouse to use, so you standardize it; then there is some immediate value. But I almost see this as a bit of a hygiene factor, because sometimes it's really difficult to quantify how much more effective the organization would be if we had fewer terms and fewer fields.

Adam:

Or if our Looker instance was less messy and we could find the right explores and the right fields more easily. What's the cost of someone running the wrong report and making decisions off the back of it? It's a difficult thing to quantify.

Aaron:

That's very consistent with my experience as well. You get one person who tends to maintain a quality level, what they consider good enough, and if it's then adopted by all and becomes part of what we do as a team, you reach that shared understanding of what good enough is, and you get these incremental benefits. You don't really know what the world would look like without them. When I see a big investment in trying to build out and improve data governance, the documentation, what we know about the data...

Aaron:

...it tends to be really hard work. People don't really appreciate what they're doing it for, because they can't see the immediate benefit. There's this massive lift, and often it fails, because they don't get the reward themselves, let alone see what it's doing for the company. It's very much what you said: it's hygiene, what you should just be doing, and you're probably better off incrementally changing and improving what's acceptable going forward.

Aaron:

It's very difficult to go back and see the value of making everything clean and tidy. It looks like technical busywork sometimes.

Adam:

Farfetch made a fairly large investment in data governance. They took it very seriously, and they had a dedicated data governance team and a head of data governance. I think if you're a large enough company, the data ecosystem has enough legacy and enough nuance to it, and you can articulate that it's enough of a problem, then sometimes you can get that kind of investment. But it's generally much better if you start things in a governance-consistent manner right from the start. Most of the time, though, that's just not how people want to operate, because you don't need it at the beginning.

Aaron:

Exactly. One of the funny things about data: we mentioned security earlier. Often, if there's a breach or a hack of some sort, there's a catastrophe and you take action to make it better. If your IT systems go down, if there's an outage, it's a catastrophe.

Aaron:

It's a great big smoking-hole problem. You've got to sort it out, and then you put processes in place to make it better. I'm not sure I can put my finger on what the catastrophe is in data. If I had to have a go, it would be that trust in the data is so lost that nobody uses it anymore. But that doesn't feel like a great big smoking hole.

Aaron:

It's not as obvious what the big problem is. What would be your smoking-hole problem in data?

Adam:

I think it depends on whether we're talking about analytics workflows or operationalized workflows that involve data. Say you have a machine learning model that's being used by the company and there's no fallback. The machine learning algorithm goes down, and all of a sudden you can't do something. For example, at MVF we're looking at lead scoring, and the lead scoring would be used for different marketing campaigns that happen on a daily basis.

Adam:

As a bit of context, by lead scoring I mean: we create a lead, or we get a lead, by paying for different marketing channels. We get a new client or a new customer onto a site who's interested in a product that our clients can provide them with. We want to score how good that lead is: what's the probability that it's actually going to convert for our clients?

Adam:

That lead score could then be operationalized, where we use it to work out which marketing channels we're investing in. We could give it to Google in a Performance Max campaign, so that Google optimizes which leads it's giving us, or which traffic it's driving to our site, based on this metric. If that breaks, or we start giving it erroneous figures, you can actually start to see an impact on revenue. And at least for MVF, there are some operationalized data use cases.
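
Adam's point about fallbacks is worth sketching. When a score feeds a live campaign, the failure mode to avoid is silently sending erroneous figures downstream; a common pattern is to catch scoring failures and degrade to a neutral or rule-based score. This is a hypothetical illustration, not MVF's implementation; the model call, feature names, and fallback value are all assumptions.

    # Minimal sketch: degrade gracefully when the scoring model is unavailable.
    import logging

    FALLBACK_SCORE = 0.5  # assumed neutral prior; a simple rule could also be used

    def model_score(lead):
        """Stand-in for a real ML scoring call; raises if the service is down."""
        raise RuntimeError("scoring service unavailable")

    def score_lead(lead):
        try:
            return model_score(lead)
        except Exception:
            # Fall back rather than letting a live campaign consume bad figures.
            logging.warning("Scoring failed for lead %s; using fallback", lead["id"])
            return FALLBACK_SCORE

    lead = {"id": "L-123", "channel": "ppc", "pages_viewed": 7}
    print(score_lead(lead))  # -> 0.5, with a warning logged

The design choice mirrors what Adam describes next: the more operationalized the workflow, the more a data failure looks like an immediate revenue problem rather than a trust problem.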

Adam:

Beyond that, I sometimes think some companies want to make a clearer distinction between operational workflows and data workflows, which is sensible; there are lots of reasons why you might want to do that. So operational ones aside, where you can see an immediate impact on revenue and costs, from a data perspective I think it depends on what decisions are going to be made with that data. If a strategic, marketing, or finance decision is going to be made based on the data, and the data is wrong and the wrong decision is made, or the data is late and they can't make the decision, then I think it's mostly about trust.

Adam:

But I'd like to believe that the more mature organizations become with data, the more they will care if something breaks.

Aaron:

Yeah, I would agree.

Adam:

I suppose there's a kind of competing goal, as a data leader: you want data to be so important that everyone needs it, because you believe that will make the company better. At the same time, you also want to cover yourself, so that if something breaks, it doesn't mean the company is going to lose money because of it.

Aaron:

Yeah. When you put it in that context, you're actually in a really important role. We almost don't want them to trust it too much, without questioning it, because that could lead to a very poor decision; you don't necessarily have full control of the supply or of the outcome yourself. As we said earlier, you might be dealing with poor-quality sources, or with less-than-ideal governance and hygiene around the data. There's a lot of debate at present about trying to improve the quality.

Aaron:

But actually, if we talked about it in those terms, improve the confidence, improve the real understanding of where the data comes from and of the decisions that are possible and the ones that are not, maybe that would be far more successful than saying "data quality" and putting it back on the team as their problem. That's really interesting.

Adam:

I suppose some of this comes back to how I view data, because of my background: I view it in terms of helping inform decision-making in the company, be that automated by machine learning, or actual strategic or operational decisions being made. So when we're thinking about data quality, the cost of data, and what we want data to achieve, it has to be tied back to the final outcome we'd like for the company. It needs to be very value-driven. Say data is used all over the company and people use it on a daily basis.

Adam:

But in reality they don't need to; they could make all the decisions off gut feeling and end up with exactly the same output. That's completely theoretical, but if that were the case, then the data shouldn't really have much value, right?

Adam:

Because the value of data is contingent on how much better those decisions would be.

Aaron:

Yeah. Yeah.

Adam:

So if your gut-feel benchmark is already just as good as what you'd achieve with the data, then the data doesn't add that much additional value.

Aaron:

Yeah. I really like that.

Aaron:

In fact, what I'm then thinking is: how would you measure that? How would you measure the importance and significance of the decisions you make possible? That's perhaps a topic for a future conversation.

Adam:

I'm not sure I'd have an answer even if we were to...

Aaron:

I know. That's what I'm thinking. It might be a beer conversation.

Adam:

Yeah. I mean, really, I think it's up to the stakeholders to determine how valuable the information was in changing their decisions. But what's the counterfactual? What would have happened if there was no data? We don't know.

Adam:

It's up to the people who are asking for data to tell us that it's actually really valuable. One of the weird things is that I was saying we ideally need to evaluate the value of data based on this incremental benefit, but really it all comes down to what the stakeholders tell us. So say someone who is making a decision worth hundreds of millions says, "I need this data; it really helps my decision-making."

Adam:

Then it's really difficult to run an A/B test on that to work out exactly how important it is. We kind of just take them at their word. If someone who's making these big decisions says they need the data to make those decisions, then chances are they need the data.

Aaron:

In some ways, I prefer the tangibility of those roles where they're looking for savings or looking for value. One of the things I enjoy about going to events and meeting people from different industries: one of the guys I was talking to recently had found a 50,000,000 cost saving within their operating model, through what they could tune and tweak and make better just from observing the data. And that's glorious compared to "here's some information" and a kind of intangible decision-support role. It doesn't feel as easy or as satisfying unless you can really link it back to thin margins all over the place; that feels more tangible.

Adam:

How significant was that 50,000,000 for the company? It must have been a huge company if they made 50,000,000 in savings.

Aaron:

It was an energy company, and you're absolutely right. Funnily enough, I don't think it was something that would change the bottom line reported to the shareholders; it was a massive company. That's kind of scary in some ways: that level of difference would hardly even show up on the radar.

Aaron:

That was the other part of it for me. I was like, that's bonkers; how could it be that way? And, yeah, it's very easy then to justify a relatively modest growth in your data team if you're going to make that kind of return. Whereas I think a lot of people are in the situation where they're trying to grow their team by 1 or 2 or 5 people, and it's really hard to justify. Not only can you not link it back to a tangible revenue or profit benefit, you can't even really make a tangible measure of it, and that's tough.

Adam:

Yeah. I mean, I don't know how big a data team would need to be to justify 50,000,000, if they could say they'd do that every year. Not that they could.

Aaron:

Yeah, that one. Actually, there were only 5 people in the team. Imagine that: a team of 5 people finding that kind of saving.

Adam:

I think this is also sometimes part of the issue with saying you're going to invest in data: the size of the team doesn't necessarily correlate with how much of an impact they're going to make. That 50,000,000 saving, I don't know whether they'll make anywhere close to that next year. It might have been a one-off.

Adam:

It's interesting.

Aaron:

I imagine it's the typical model, isn't it? It's not always going to be that step change in benefit. But sure, in most companies, with those sorts of fine margins, you can always make small improvements, and they add up. But who knows?

Aaron:

We'll have to go to the next event and bump into them again.

Adam:

I'd love to hear about savings or optimizations that made a huge material difference to smaller companies, or just a big percentage change, because I'm sure they exist.

Aaron:

Yeah. In your roles, have you had that experience? Have you been on either the delivery end or the receiving end of something that's game-changing for a company?

Adam:

I don't know about game-changing. I worked at a startup probably about 10 years ago, and a lot of the decisions we were making were fairly large for the company: investment decisions, which marketing channels we should be putting our money into, which big deals we needed to close to make sure we had enough money to last until the next round of funding. There were lots of decisions that were fairly critical, and they were informed by data. In terms of game-changing, where I've been in a company that implemented something data-related that made a huge difference? I don't think so.

Adam:

I think a lot of it has been either fairly core to the business or incremental. I'm interested as well; I've heard stories, even at Farfetch, where things like that would happen, where people suddenly made the marketing investments far more efficient because they didn't have marketing analytics, and then they had it for the first time.

Aaron:

Yeah. That's usually the opportunity.

Adam:

Yeah. I think a lot of the data infrastructures that data teams have come into over maybe the last 6 or 7 years have been fairly mature, fairly developed. So there have been improvements. But in order to create that step change, you almost need something new that wasn't there before, whereas a lot of what I've been doing in the last 6 or 7 years has been incremental improvements.

Adam:

Who knows? Maybe data science and machine learning...

Aaron:

Like we said earlier about joining a team: if you join a part of the business that's perhaps gone through massive growth, or massive cost-pressure challenges, well, there's an opportunity and you can make a difference. As you were talking about that, I wondered about a topic I've been asking people about, and that's AI and its use. One of the big use cases I think people want is that kind of intelligence that spots those opportunities. I wonder if you think that's a reality. Do you think that's even viable for the generative AI technologies, or for AI generally?

Aaron:

Or do you think it takes a sensible, observant person? Is it somewhere in between? What do you think the future role of AI around data is?

Adam:

I think when it comes to step changes in companies, we're really far away from AI being able to generate some idea that's going to be really impactful for a company without being asked for it almost explicitly. Do you know what I mean?

Adam:

It's going to have to be directed towards it. I think there are going to be lots of improvements in AI, and its involvement in data and across different companies is obviously going to increase. But we're almost asking: at what point will AI come up with the next big idea that's going to improve things within a company? Like noticing that you're spending way too much in this area.

Adam:

"This doesn't make sense." Or, I don't know, "this new region is opening up, its total population is increasing, and it's actually perfect." Or, "this is the ideal demographic you should be going for."

Adam:

I think we're quite far away from anything like that.

Aaron:

I'd agree. And like you said, it's really good at identifying clusters in what you ask for. Funnily enough, it actually lacks the intelligence bit entirely, because it can't spot them without decent prompting from a human; not literally prompting, necessarily, but guiding your data, your questions, and your model towards something you've already thought could be a possibility. It's far from being able to just detect where the best next opportunity is across the multifaceted parts of a company.

Aaron:

It's just not there yet. Perhaps with more data and more sources it can get there, but that seems far off.

Adam:

Yeah. I think one of the issues it's going to face as well is that lots of companies just don't want to share that kind of data with it. I think LLMs, AI in general, would be particularly useful if people were very open about sharing company-related information. So if you knew exactly how other people in your industry were creating their data warehouses (I'm talking from a data perspective; obviously this is far more far-reaching than that).

Adam:

Which marketing channels they wanted to invest in, what their key demographics were, how they were marketing their, shall we say, USP. If you had all of this information, aggregated across all the companies in the world, I'm sure you could create loads of amazing insights for a company that's doing things differently, where you could start to say: you should be creating your material like this; your website could be optimized by doing these things.

Adam:

You should create your data infrastructure and your tech products like this. But I think there's always going to be this proprietary information, this gulf between...

Aaron:

Yeah, everyone's protectionist, aren't they? They're a long way off sharing completely openly every facet of information they have. And that's either because of competitive advantage, or they're just worried.

Aaron:

What are they actually letting out the door? So, yeah, I agree: you're not going to have that availability of information until something else changes.

Aaron:

The one thing that happened recently that probably got me thinking the most is the level of investment that went into OpenAI. There must be people taking a bet that human-level intelligence could be on the horizon, or could be possible. The investment is so huge that the vision has to be enormous. It can't simply be a profitable tool or something; it has to be people thinking this is going to change humanity.

Aaron:

That's how big the investment appears to be. And that certainly gets me thinking: well, maybe. Maybe it's possible. But when you're at the coalface of making data available, it still seems quite unlikely from where we're sitting now.

Adam:

Yeah. I mean, I agree.

Aaron:

Adam, thanks for coming on the show. It's been an absolute pleasure; I've enjoyed every minute of our discussion, which has covered a huge range of topics. Hope to see you again very soon.

Adam:

Thanks, Aaron. Thanks for having me.

Aaron:

Brilliant. See you soon.

Adam:

See you soon.