Data Matas

Data Engineering as a Team Sport: Coaching, Accountability, and Pragmatism
In this episode, Sam Wrench, Lead at Reality Mine, joins us to explore what actually makes data teams perform at a high level.
We cover:
Why data engineering breaks down when teams optimise for tools instead of people
How coaching principles translate directly from elite sport to data leadership
Why dogfooding your own data creates faster feedback loops and better quality
How to think about AI as a managed colleague, not an autopilot
Why pragmatic, security-first tooling often beats chasing industry hype
This is a grounded conversation for anyone building data platforms that real teams and real decisions depend on.

What is Data Matas?

A show to explore all data matters.

From small to big, every company on the market, irrespective of their industry, is a data merchant. How they choose to keep, interrogate and understand their data is now mission critical. 30 years of spaghetti-tech, data tech debt, or rapid growth challenges are the reality in most companies.

Join Aaron Phethean, veteran intrapreneur-turned-entrepreneur with hundreds of lived examples of wins and losses in the data space, as he embarks on a journey of discovering what matters most in data nowadays by speaking to other technologists and business leaders who tackle their own data challenges every day.

Learn from their mistakes and be inspired by their stories of how they've made their data make sense and work for them.

This podcast brought to you by Matatika - "Unlock the Insights in your Data"

Aaron Phethean (00:28)
Hello and welcome to today's show. We are joined by Sam Wrench from Reality Mine. Sam has a people-centric leadership style and it's not going to be any surprise that we discuss AI and how people in the end are responsible for that output. So really looking forward to the episode. Great to have you here. Without further ado, let's dive in.

Aaron Phethean (00:52)
Hello Sam and welcome to the show. A huge pleasure to have you on. We were chatting before the show about how to introduce you and it's fair to say it's quite an eclectic journey to here. Science background, GB coach of dodgeball and leading data teams now. So please tell us a little bit about you and how you got here in data.

Sam Wrench (01:14)
Yeah, I mean, as you say, it feels like quite a meandering career so far. Starting off on the science research side, for me the main goal was to be a doctor, to get a PhD, and that fueled the early career. And obviously throughout that, I spent a lot of time in some really interesting and complicated data sets.

Whilst on my PhD I got the opportunity to work with the GB and England dodgeball teams. So I was in a senior coaching capacity there and saw them through some really exciting times — some really great performances on the global stage. Definitely a whirlwind experience for me. I was quite new to coaching and kind of took the punt, and it was an absolutely great experience.

But then, coming out at the end of the PhD, I wanted to find something that utilized my experience with data and programming, and that led me towards working in data teams. And then naturally, through coaching, that desire to lead, to work with people and grow the people around me, came through and landed me in leadership positions. So yeah, that's the long story short, I guess.

Aaron Phethean (02:50)
Yeah. It definitely feels to me like — and we'll hear about Reality Mine and the company in a moment — the kind of coaching of high-performance teams, in a strange way, feels like absolutely amazing preparation for leading data teams, and leading teams generally. What's it like being that kind of high-performance coach, and does it relate to your day job?

Sam Wrench (03:17)
Yeah, massively. I mean, there's so much crossover. The one thing I will say about high-performing sports teams versus a data team is that the egos are definitely less hard to work with in the data teams. But the true nature of a coach — yes, it's to mentor, it's to help develop the strategy and tactics and to build that team out,

but the real side of it that no one really talks about is that you're building an environment and a culture in which people can perform at their best. So once you translate that into the business world, you start to see teams that just work well together, and that natural collaborative culture comes out. So yeah, loads and loads of crossover. It's been great to see.

Aaron Phethean (04:13)
Yeah, I coach a kids' rugby team and I find the whole coaching experience really valuable — I learn a lot about myself, and a lot about how people interact and work. And yeah, I think that's cool; it's probably set you up well to lead the team. So then, coming to Reality Mine, just tell us a little about the company and who they are first. That'd be really useful background for everyone.

Sam Wrench (04:43)
Yeah, so we've been around for just over a decade now. The core business is data capture — online behavior, capture and monitoring. Clients vary massively; a lot of it's obviously around market research, competitive intelligence and those sorts of things. But the business itself, and the product itself, is data.

You know, we capture it directly off devices. I will say at this point: the people that are part of our data capture know that their data is being captured, and it's all consent-based, with all the good stuff that we have to make sure we've got in place. So yeah, the core product is data, and that varies massively from...

Aaron Phethean (05:25)
Yeah, yeah.

Sam Wrench (05:33)
a very raw data feed all the way to more curated data sets. As the data team within that business, we're quite new, and we've been brought in to be an internal stakeholder to that data. So we're in a really great position where we're actually experiencing what our clients experience firsthand. If we're trying to

get a certain piece of insight from the data and we hit any challenges, it allows us to feed straight back into the business and say, hey guys, this is actually a little bit difficult to work with, or I think we can add a refinement here that might make the user journey a bit more fluid. So it's a really great place to be as a team. And the business itself has a global reach — we're working

with companies from all over the world. So yeah, it's a really exciting business to be with and a really interesting proposition to be part of.

Aaron Phethean (06:36)
That's cool, that's cool.

So, picking up on the team using the data — you're using the data yourselves, and I think there is something really interesting about using your own product and the feedback loop that arises from that. Internally, then, you've separated out the data

engineering and collection part of the business. Tell us a little bit more about you as the user of that data. What are you doing with it when you're trying to dig in, and what do you see? What is the type of data that you're dealing with? Paint a picture of that for us.

Sam Wrench (07:18)
Crikey. I mean, our remit as a team is so far-reaching. It goes all the way from creating automated processes to try to identify whether we're getting all the capture that we think we need. So for any data points that are an explicit capture, we have to build and maintain a rule set to do that.

And then through that pipeline is all the ETL, all the processing that we're doing. So part of what we're doing is building out visibility and programmatic monitoring of that side of things, so we know when things might be missing downstream — and obviously we want to make sure the data that we're using is of the best possible quality.

But then the other side of it is we'll work directly one-to-one with clients to understand what their use case is. Even if they don't have an explicit use case that they want to share with us — which is often the case in this sort of business, everyone's trying to get the edge, so we understand if they don't want to tell us explicitly — we can get an idea of how we can use our data. That's where my background

in retail really comes in, because I can sit there and go, well, the main questions I was asked when I was in a data team in retail are X, Y, Z. So we can then use that to try and pull that from our data. And we're using data from app usage — so, someone used X app and then went onto Y app — we get to see all the web sessions, and in certain circumstances we get to see all the in-app data as well. So

we can really go to a great depth and understand, you know, maybe what's driving certain product purchases, or abandonment of a basket. We get to see all of that firsthand. Honestly, I love it, because I'm very curious as a person. It's really exciting — I can now see what happens when someone abandons a basket, where they go next, what their

onward journey is. It's great. It's like a kid in a playground. It's brilliant.
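The programmatic completeness monitoring Sam mentions — checking that the capture you expect is actually arriving downstream — can be sketched roughly as follows. The event types, thresholds, and field names here are illustrative assumptions, not Reality Mine's actual rule set:

```python
from collections import Counter
from datetime import date

# Hypothetical rule set: event types we expect per panelist per day,
# and the minimum daily count that suggests capture is healthy.
EXPECTED_EVENTS = {"app_open": 5, "web_session": 3, "in_app_action": 1}

def completeness_report(events: list[dict], day: date) -> dict[str, list[str]]:
    """Flag panelists whose capture falls below the expected rule set.

    `events` is a list of {"panelist_id": str, "event_type": str, "day": date}.
    Returns {event_type: [panelist_ids below threshold]}.
    """
    counts: Counter = Counter(
        (e["panelist_id"], e["event_type"]) for e in events if e["day"] == day
    )
    panelists = {e["panelist_id"] for e in events}
    report: dict[str, list[str]] = {}
    for event_type, minimum in EXPECTED_EVENTS.items():
        low = [p for p in sorted(panelists) if counts[(p, event_type)] < minimum]
        if low:
            report[event_type] = low
    return report

# p1 has one app_open (below threshold); p2 has five (healthy).
events = [
    {"panelist_id": "p1", "event_type": "app_open", "day": date(2024, 1, 1)}
] + [
    {"panelist_id": "p2", "event_type": "app_open", "day": date(2024, 1, 1)}
    for _ in range(5)
]
print(completeness_report(events, date(2024, 1, 1)))
# {'app_open': ['p1'], 'web_session': ['p1', 'p2'], 'in_app_action': ['p1', 'p2']}
```

In practice a check like this would run on a schedule against the warehouse and feed an alert, but the shape is the same: an explicit rule set, a count per panelist, and a report of what's missing.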

Aaron Phethean (09:36)
Yeah.

I'm sure that definitely helps — being really interested in the domain and the subject matter. We were chatting before about the demand from clients for raw data versus insight. And it's become clear to me as you were describing it: in finance they talk about alpha, you know, finding that kind of opportunity. If they're not really communicating with you, telling you what they want to see, because that's

sensitive and that's where their edge will come from — it kind of makes sense that those types of customers want the raw data and really want to figure it out for themselves. And where you're building out the insights, having that conversation with clients who don't want to tell you could be quite strange, actually — having to dig into it yourself and make your own assumptions. It feels quite similar to a data team inside a company,

trying to deliver insights to the business but perhaps not having — certainly without conversation — a clear picture of what the business is after. How do you go about digging into it? What's your mental model of working on an insight?

Sam Wrench (10:49)
I will say that that will vary massively. I think the

starting point, if I can get it, is face-to-face time with people. I'm very much a people person — I want to be in the room, having that conversation. That probably comes from the time as a coach: the hands-on time is where you make the biggest impact. So yeah, it's getting that hands-on time. If we can't do that, you have to cast the net quite wide. You have to think about what the general

themes are within the industry. So for me, staying close not just to the data industry but broader — going to marketing meetups and staying close to those more diverse teams outside of my business — is paramount, because I get to see the pain points that they're hitting normally. And then we can bring that internally and go, okay, so if our data dropped into that person's team,

would they be able to use it to answer those questions? And that's when I use my team to explore some of those options. So yeah, it really has to be a best guess, and just really staying close to the industry — which is why I'm probably so active in the networking communities. I need to get that insight into what questions people are asking.

I think it would be impossible to finish this podcast without using the word AI. That's one of the biggest disruptors at the moment across the industry — everyone knows it. So we're looking at, okay, what impact can we see in our data from these emerging AI apps? That's kind of how we have to go about it: a lot of detective work and a lot of general curiosity and exploration, really.

Aaron Phethean (12:41)
Yeah.

And I definitely see that as a way for someone to be a high performer in a data team — really growing their knowledge of the business and being curious about it. On AI, we were talking pre-show about the different ways AI is being used. You know, there's coding-agent-style AI. There is

Sam Wrench (12:42)
Yeah

Aaron Phethean (13:04)
another kind of AI that I think you touched on there, that you're doing as a person — the more exploratory, what-is-in-the-data-that's-not-immediately-obvious kind. And one of the things you were sharing with me is that it doesn't really work without a person; I think you gave it a name, a kind of assistant. Tell us a little bit more about what you think the opportunity with AI on data and data sets is.

Sam Wrench (13:33)
Yeah, so I mean, I do think there are really two sides to AI, and it's probably a conversation I've had many times — we'll call them conflicts — in general conversations, you know, at meetups or whatever. One side that I think is incredibly powerful about AI is that we're now able to embed models quite easily on our internal systems, across our databases, or even just a knowledge base. So

I'm finding it's a lot easier to onboard people onto our data sets, because they can have a conversation with the information in a language that they understand. That's one side of it that, as a business, we're massively leaning into — using AI to increase efficiency and output.

I mentioned before about explicit capture within apps. That is something that an AI can do really, really well: figure out what it is that we want to capture, start to pull that out, and start to create these rules for explicit capture. So that's one area that we're really leaning into. And I suppose that leads into the other side of it, how we're utilizing AI

as a team ourselves: using these models and sending them out across our quite diverse and quite complicated data set to start to point out trends that we might not see normally, or correlations, and start to interrogate that data on its own accord. And I think that's the side of it that we talked about previously, that

I've seen emerging quite a lot in conversations and that I think is a really great way of looking at it: the ownership model around AI. It's so easy to get AI to write you an email, and you send it off to a client and there it goes — you never read it, it's just bang, bang, gone. The growing narrative is that, actually, if you're using AI, that's still your work.

So I've seen a lot of people have digital colleagues, where they're the line manager to that digital colleague, and any mistake it makes falls back to you. With that ownership and accountability, you're like, well, maybe I want to check that work before it goes across the line. So I love that side of it. It goes back, for me, to human in the loop — I love people being in that space. So yeah, it kind of changes the narrative a bit from fully automated to...

Aaron Phethean (16:01)
Yeah.

Yeah.

Sam Wrench (16:18)
It's like an intern, almost, that's going a bit wild. You want to rein them in and check their work a little bit. So yeah.
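The "digital colleague with a line manager" idea Sam describes can be made concrete with a small sketch: AI output is held until a named human owner signs it off, and the owner's name travels with the artifact so accountability stays with a person. All the names and the API here are illustrative, not a real system:

```python
from dataclasses import dataclass

@dataclass
class Draft:
    """A piece of AI-produced work awaiting human sign-off."""
    content: str
    produced_by: str   # e.g. the model or agent that generated it
    owner: str         # the human accountable for the output
    approved: bool = False

class ReviewQueue:
    """Nothing leaves the queue unless its accountable owner has reviewed it."""

    def approve(self, draft: Draft, reviewer: str) -> None:
        # Only the person who carries the accountability can sign off.
        if reviewer != draft.owner:
            raise PermissionError("only the accountable owner can sign off")
        draft.approved = True

    def send(self, draft: Draft) -> str:
        if not draft.approved:
            raise RuntimeError("unreviewed AI output cannot be sent")
        return f"sent on behalf of {draft.owner}: {draft.content}"

queue = ReviewQueue()
draft = Draft(content="Q3 client summary", produced_by="email-assistant", owner="sam")
queue.approve(draft, reviewer="sam")  # the mistake would fall back to sam, so sam checks it
print(queue.send(draft))              # sent on behalf of sam: Q3 client summary
```

The design choice is the point: the gate makes "that's still your work" a property of the system rather than a habit people have to remember.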

Aaron Phethean (16:21)
Yeah.

We see that progression as well. Actually, it reminds me of — I think it was the police recently who had reported on a football game that had never happened. And it was because they'd used Google or something that gave an AI response. And it was fairly

inconsequential in reality, in that it was just a match. The issue, I think, was around quite a sensitive topic, but what the AI had actually produced was quite inconsequential. Still, it's their work. And obviously the news item was "police uses AI in the news" — that's quite scary for people to think about. If I think about my kids and what they're doing,

they are using AI, but they're already growing very skeptical of the results, because they got burnt quite early on. I feel like that's the maturity everyone's going through quite quickly: AI is great, but it can be completely wrong, and that's still you. I find what you're talking about there very insightful. How do you deal with that?

Sam Wrench (17:40)
I think my biggest approach is really to try to drive home that accountability side of it. And I think what you're saying about your kids is a really good example. Imagine me in the workplace: I wasn't brought up on AI, but I've got it available to me now. So — hey, ChatGPT, write me this email — and I send it. Then maybe that goes to the exec, and they look at me and go,

what's this load of rubbish? You can see that it's been written by AI, so they'll either roll their eyes or they'll call you out on it. If a child going through schooling is sending AI assignments in, and they've got all the hallucinations in there, the teachers are going to be fairly stern on that one. And I think what we'll see is that those kids will grow up to be

using AI in a much more sophisticated way and a more correctly skeptical way. So that's what I'm trying to bring into it. If people aren't wanting to use a Copilot agent in agent mode, it's like, okay, well, use it in ask mode. Look at this code base you've never used before and ask it, what does it do? Or, you've just written some changes — ask it

what you've done, and see if it can tell you what you should have done versus what you've actually done. So I find it's a bit of a tailored approach. Some people are diving all in with AI, and that's great — they need to be led through the governance and maybe just to understand the risks. Whereas for those that are really skeptical of it, it's like, okay, well,

let's look at the small wins that we can start to bring into the business, changing your workflow. I mean, it sounds like I'm going back into the coaching thing now: each person has their own way of working, and you have to make sure that you're meeting them on that level and bringing them on the journey.

Aaron Phethean (19:48)
Yeah.

And I think everyone's on that journey. There is no really well-proven way to use it yet, because nobody's got the experience. There are some more experienced people sharing their ways of working. But that's kind of the most exciting thing: everyone's playing an entirely new game.

And to a degree, the data team's in a position of coaching everyone on how to play this entirely new game, when actually you're only just seeing the rules and constraints for the first time yourselves. So that's equally scary and challenging, I'm sure, and exciting for teams trying to deal with it. Then, coming back to Reality Mine and what you guys are doing and working on:

give us an idea of how that tool set works, and perhaps what the tools are, as far as you can. Share with us your experience so far of the tools — what's good and what's working for you.

Sam Wrench (20:44)
Yeah, I can't talk too much about the capturing and processing of the data, obviously — that's our secret sauce. But I can talk about what we're doing with the data and what tooling we're using. We're in an interesting place, really. So the controversial thing I'm going to say is that we're using QuickSight — AWS QuickSight —

Aaron Phethean (20:52)
Are you guys enjoying that area?

Sam Wrench (21:09)
to do our BI and dashboards. I was talking to a recruiter maybe a week or two ago, and they were like, do these new people have to use QuickSight? And I was like, yeah — I think there's 12 of them in the Northwest. Right, brilliant. So a very niche sort of version of the BI tooling, but the...

The reason we're having to do that is that our data is so vast, and there's a lot of stuff in there that I'm sure people wouldn't want getting out. Information security for us is absolutely paramount. So we have it written into contracts that we won't move panelist-level data outside of

the ecosystem that we capture in, which is AWS. So we're bound to using AWS tooling. There are obviously nuances within that tooling, and there are things that we have to work around, and there are challenges. But I go back to our stakeholders quite a lot, because in the tech team we know that there's this restriction on what we can and can't do,

but the more removed stakeholders in our business are saying, well, why can't we use this platform, why can't we use that platform? And I don't want to just say, oh, well, we can't because it's contractual. It's helping them understand that we can do what we need to do on this BI tool or another BI tool. They all work slightly differently. Some of them are a bit more drag-and-drop and there it is;

some of them take a bit more heavy lifting. And the thing that we've had to really drive home to people is that the BI tooling just sits on top of the data. The data on which it sits — that's where the work has to be. My team's only been in place for about 12 months now, and our

journey to date has been building consumable datasets: good-quality, well-aggregated, well-understood datasets that have a good level of understanding around them. So we can then stick a dashboard on top of it and go, there it is. And that is starting to sink in, so that

when they're having a conversation about building a dashboard, they're going, okay — whatever tool we use, what do we need to do underneath that to get that system to do what we need it to do? So yeah, it's a niche skill set. I will say we've been incredibly lucky, in that within the first four hires of the team we hired someone who's basically been using QuickSight since it came out.

Aaron Phethean (23:58)
Mm-hmm.

Sam Wrench (24:14)
They'd worked in one of the Amazon logistics centers. And they've been an absolute godsend, because they knew the tooling inside out and they've been able to work with the existing viz developers and grow that out. And I think that's where I'd sit too: if you're on a niche technology, it's not necessarily rip all the technology out and go for something new because it's easier. It's: if you can find someone that can really utilize that niche technology —

Aaron Phethean (24:16)
Go.

Sam Wrench (24:44)
If it's embedded in the business already and there are reasons why you have to use it, just get the right person working with that technology, and bring it to life in a more functional way — rather than, oh, we're going to get rid of it all and go cloud-first, disrupt the business and change everything, and all that sort of fun stuff. Yeah, exactly.

Aaron Phethean (25:06)
Without actually a different outcome, yeah.

It sounds like you have quite a unique challenge there with your users, in that they have good tech knowledge and they're actually suggesting to you the tools that they would like to use. Maybe that's a little bit different to a lot of teams, where users have their complaints about the report or the insight, or how it

looks, or the information you're sharing with them — but I'm not sure they all have a strong opinion on which tool would be better. What is it about your users? Why do they have this strong opinion about the tools that they would prefer to use?

Sam Wrench (25:42)
You see, I think maybe in this situation, because it's BI tooling, it's a bit of a stronger narrative. But in my experience it's actually fairly common that we'll get someone coming up to us and saying, oh, we need this model. Why do we need this model? Oh, because we had it at the last place I worked. It's like, okay, were they doing the same thing as what we're doing now? Not fully, but it was really good. And it's like,

okay, so, yeah. And that's the side of it that we miss out on. It's very easy as a tech team or data team to slip into the lane of: the business says, and you will do. The business says we're moving to this technology — okay, cool, we'll move to that technology, we'll do all the necessary migrations.

Aaron Phethean (26:12)
they understood it, so clearly they felt comfortable in that space.

Sam Wrench (26:38)
But we don't go back and ask: why are we proposing this move? Why are we having this big disruptive transformation, when actually what we have is fit for purpose, and we're just maybe not utilizing it well enough, or not finding the specific challenge that that person is facing? For example, we as a business don't do oodles of analysis at the moment on

our onboarding journey — people downloading our app so they can start the data capture. We don't do loads of that, and now we're like, okay, actually we need to. And that didn't come about because one day we suddenly decided we should care about our onboarding funnel. It was a new business stakeholder who came in and said, can we have this tooling, because it's really, really good for

checking the funnel metrics of an onboarding or recruitment funnel. And we were like, okay, cool — the problem isn't the tool. The problem is that we don't currently capture or process or prioritize that data. So it's now having that conversation of: we can probably do this in our own ecosystem as it lies today. We just need to capture a few additional data points, process a few more, and then start to bring that together. Yeah.
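The funnel analysis Sam describes needs only a few captured events, not a new tool. A rough sketch, computed from raw (user, stage) events; the stage names are invented for illustration:

```python
# Per-stage conversion for an onboarding funnel, computed from raw capture
# events. Stage names are hypothetical examples, not Reality Mine's actual funnel.
FUNNEL_STAGES = ["landing", "signup", "app_download", "capture_started"]

def funnel_conversion(events: list[tuple[str, str]]) -> dict[str, float]:
    """events: (user_id, stage) pairs. Returns conversion vs. the previous stage."""
    users_at = {stage: {u for u, s in events if s == stage} for stage in FUNNEL_STAGES}
    rates: dict[str, float] = {}
    prev = None
    for stage in FUNNEL_STAGES:
        if prev is None:
            rates[stage] = 1.0  # top of funnel: everyone who landed
        else:
            reached_prev = users_at[prev]
            # Only count users who also reached the previous stage.
            converted = users_at[stage] & reached_prev
            rates[stage] = len(converted) / len(reached_prev) if reached_prev else 0.0
        prev = stage
    return rates

events = [
    ("u1", "landing"), ("u2", "landing"), ("u3", "landing"), ("u4", "landing"),
    ("u1", "signup"), ("u2", "signup"),
    ("u1", "app_download"),
    ("u1", "capture_started"),
]
print(funnel_conversion(events))
# {'landing': 1.0, 'signup': 0.5, 'app_download': 0.5, 'capture_started': 1.0}
```

In a warehouse this would be a grouped query rather than Python, but the shape matches the point being made: once the extra data points are captured, the metric the stakeholder wanted falls out of the existing ecosystem.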

Aaron Phethean (27:58)
Yeah, yeah. Use what we've got. Yeah.

I feel like that is such great advice in loads of different domains and circumstances. The thing that occurs to me is that quite often in development you spend a lot of effort creating something, developing something — but if nobody knows about it, then what's the point of all that energy and effort? It's the same with a

data set that's tucked away in a corner that's really compelling, that does something quite insightful — if nobody knows about it or understands what's in it. So I feel like the takeaway advice is that people should spend more time telling others about the data they have. That education gives them the knowledge to ask the right questions and really do more with the data. That's cool. That's quite interesting.

I suppose, then, if you look at your users — some of them are customers trying to understand that data — they tend to be quite data literate, I suppose. They tend to already know their way around.

Sam Wrench (29:04)
So, I mean, you'd assume so, because traditionally we've been almost a raw-feed kind of business. We do an amount of pre-processing — we make sure the data is clean and usable — but we've known that it's landed in a business, into an engineering team, and they're going to do the processing that they need. And often

a lot of our previous clients were the middlemen in the situation, where we'd give them the raw or raw-ish data, they would do what they needed to with it and do the necessary enrichment, and then they'd sell it on to another vendor or to an end client. We now work a lot more directly with those end clients. And the one thing that's coming through in that

transition to working directly with the end client is that some people step in the door and say, we have a massive data engineering team that is bigger than your entire business, and we will take everything you can give us and we'll get the insight from it. And you go, okay, cool, that's fine. They effectively want to have the IP themselves — they want to develop the models themselves, they want to have everything on their side.

Aaron Phethean (30:24)
Yeah.

Sam Wrench (30:27)
So of course we're like, you know, we'll do all the commercials, and blah blah. And that's actually, realistically, quite an achievable thing for us, because it is our bread and butter: moving data from someone's device — someone's behavior — into someone's data lake, and off they go. But not all of our clients have massive data engineering teams. Some of them sit

more in the strategy part of the business, or even the marketing part, or wherever they might be.

Aaron Phethean (30:59)
Yeah, yeah. And they need something more refined,

don't they? Yeah, so it's like they definitely need somebody who's explained it — not dumbed it down, but already come to some sort of conclusions — to get the value out of it.

Sam Wrench (31:12)
Yeah, exactly. In that situation, maybe you've got a small data science team that sits in a remote part of the business, or maybe you've got a few analysts. And in that situation we need to be a lot more mindful — and this is really where my team pays its dues within our business — of how usable that data is from the day it leaves our ecosystem. And

Aaron Phethean (31:41)
Yeah.

Sam Wrench (31:42)
We have to do that due diligence to know what their data team looks like — who the people are that will be using this data primarily. Exactly. Yeah.

Aaron Phethean (31:49)
what's their level of competence.

I feel like it's quite relevant to any data team that some users may actually be asking for raw data — "leave it to me" — and

I think perhaps the risk of that, which I hear fairly often, is that they might draw conclusions that aren't accurate. You've almost given them too much power to make mistakes and conclude things. And in kind of legacy environments I see a lot of that, where there's a fairly well-established data set that's helping people make a decision that's wrong.

Maybe it was right at one point, but it's no longer right. And that kind of process is quite hard for teams to go through — having to get back to, what do we want this data set to answer? I wonder if you have any advice for how to deal with that. How do you go about explaining what they should and shouldn't be doing with a data set?

Sam Wrench (32:49)
Well, I think what you've explained there, the way I always explain it to my team, is that it becomes the classic dual responsibility model. Although we might be delivering a raw feed, we need to make sure that we can monitor that raw feed as best we can. So are we getting, you know, the right explicit capture and those sorts of things. So even though we're passing most of the

processing work onto the other side, we need to make sure that what we're delivering is still the same. And that mindset can sometimes be, well, if it's going over there, they'll be doing all of that due diligence and they'll be doing all the checks on it. Exactly. And you sort of lose that, you know, the client might sit there and go, okay, so we've had all the checks on this. And then the...

delivery is just raw. So it kind of loses that communication. Well, the best way we come at it is to try and get in as early as possible, ideally in the pre-sales process, to get a really clear idea of what data it is that they want. And if they can't give us a question that they want to answer, or a use case, it's to try and

explore what key data points within this journey they really need to see. And once we can start establishing that, then we can build up our side of things to make sure that what we're delivering is correct, and sort of onboard it into the business.

For us, it can be very varied. Sometimes we spend a lot of time working with them directly, all the nice things. Sometimes they want to just spend time figuring it out themselves and come back and ask us a few questions here and there. Or they'll say, we've seen this trend, is this a technical trend, something on your side, or is it something within the wider world? And we'll sit there and say, oh no, a political event has driven this increase in this app, those sorts of things.

Aaron Phethean (34:59)

So it strikes me that that is really quite difficult and requires quite a lot of understanding of what could go wrong. And I can now see why you might want to take the opportunity of pointing AI at that and helping draw those conclusions. But how on earth do you deal with investigating whether AI has got it right, any more than you deal with those conversations about the conclusions they might draw? It

feels incredibly challenging. Maybe there isn't an answer yet.

Sam Wrench (35:32)
No, I think that's it. The key thing at the moment for me is making sure that, even when we're passing over a bit of analytics to say, you know, we've checked this data and it looks good, it's understanding that a person still has to be part of that process. On our side, they need to be saying, I'm putting the rubber stamp on this, I'm making sure that this is right.

And if it comes back and what I've said is incorrect, there needs to be that sort of responsibility model. With AI, there was a quote a while ago that sort of stuck with me. I've got a feeling it was someone at OpenAI who said that if you're developing something with AI today and the particular thing that you want doesn't exist,

Aaron Phethean (36:04)
I get it.

Sam Wrench (36:23)
just develop the work stream regardless, and by the time you've delivered in six months, it will exist. So I was like, okay, that's confident. But it's true, technology's moving at such a pace now that we have to be careful with how we're using it. And with RealityMine, we're very ambitious with using AI, but we're also incredibly...

mindful of the risks of AI. And I think that's where a lot of businesses might come undone: they're like, let's go AI first, and they don't really onboard the business into this is what can go wrong when AI doesn't work, this is the risk. So I think at the moment the only answer, to make sure that the AI isn't going wild and doing stuff that we're not expecting, is to kind of keep that person in the room

with the AI and just make sure it's doing what it needs to do. If the AI tells us that it's spotted a trend, then I pass that on to a human analyst and say, can you just verify this trend? We can do that. And I'd rather do that than just pipe it over to clients and say, this has gone wrong, and then they go, okay, that's a big deal, and it's actually just a hallucination. So that's the only answer I've got at the moment: just keep the people close to it and

Aaron Phethean (37:38)
Yeah, let's have your opinion.

Sam Wrench (37:47)
give them that accountability that, you know, if this goes across the line, it is still your work, even though you've used an AI.

Aaron Phethean (37:55)
Yeah, that I think is perhaps a really great point to wrap up on, and really sound advice: you might have this super tool, but you should still be accountable and you should still own the output.

I think that's really great. Perhaps I'll give you the last word: any advice on how to go about that accountability, or how to get people to still maintain ownership of their work? Or is it simply just to make sure that they do?

Sam Wrench (38:29)
I think it really boils down to, for me, how you build your team. I know it's probably something that we touched upon at the start and haven't really gone back to, but how you put your team together will help you. Because I can be one voice in the team saying, you need to use your AI, but you need to check the AI, and people are like, well, then I don't have to do it myself. Exactly. So if we get the team right and get everyone

Aaron Phethean (38:36)
Mmm.

that needs to be accepted by everyone.

Sam Wrench (39:00)
supporting each other and working together on it. You know, I've had people within my team turn around and say, this document reads a bit AI. Brilliant, thank you for feeding that back to me. That's exactly what I've focused on: building a team that works as a unit and making sure that we're all jointly accountable for what goes out the door. Building that into the team is what's made it a lot more achievable for us.

Aaron Phethean (39:29)
Yeah, it feels like a really interesting next chapter for everyone: what do we accept, where do we accept that AI is okay, and where do we accept that it's not representing us the way we want to be? Well, thanks, Sam. I really think that's been quite a cool insight into how your world looks, in the world of data and the other emerging technologies, and how you deal with people and all that that brings. So yeah, thanks again for coming on the show.

Sam Wrench (39:56)
It's been great, it's been good talking to you. It's been a really interesting conversation, and hopefully we'll have another one soon.

Aaron Phethean (40:01)
Yeah, we will do. Thanks, Sam.