Future-proofing the humans behind the tech. Follow Phil Gamache and Darrell Alfonso on their mission to help future-proof the humans behind the tech and have successful careers in the constantly expanding universe of martech.
[00:00:00] Phil: You've said that most teams right now are thinking about this like a race to build AI copilots, and a lot of folks are kind of not thinking about building the cockpit first.
[00:00:11] John: Our biggest asset is this single source of truth data layer behind the agency: here's when this client started with this, here's when they ended, like everything operational too, plus performance metrics around clients. So a mix of all of that becomes the cockpit that is ultimately more effective when I then go and layer AI on top of it. Our OS, we call it Nova, is building contracts for clients using a library of all the services that we provide. There's so many different tools and platforms that they work on, and with one agency OS kind of tying it all together, it just provides a central spot for everybody to work off of.
[00:00:48]
[00:01:15] In This Episode
---
[00:01:15] Phil: What's up everyone? Today we have the pleasure of sitting down with John Saunders, VP of Product for Nova at Power Digital. Power Digital is a San Diego-based growth marketing firm, and Nova is their proprietary marketing technology. In this episode, we explore building an AI cockpit before AI copilots and how an agency operating system reduces silos.
[00:01:36] Why context-driven analytics replaces dashboards, chasing a single source of truth in marketing data, and how to tell if an AI tool is more than a wrapper. All that and much more after a super quick word from one of our awesome partners.
[00:01:49]
[00:02:43] Phil: John, thank you so much for your time today. Really excited to chat.
[00:02:47] John: Yeah, happy to be here and nerd out with you.
[00:02:49] Phil: So much of what we talk about in MarTech this day and age focuses on what's visible, what people can actually see, like the dashboards, the reports, the flashy new AI [00:03:00] features. But a lot of the work that actually moves the needle, the stuff our audience cares about, is often buried in this operational layer.
[00:03:07] That's where a lot of marketing ops and data ops folks spend most of their time. More specifically, one of the areas we chatted about before recording today is that your approach is just about rethinking how agencies actually function day to day. You said that strategy without systems is like burning budget. So let's, let's unpack some of that here.
[00:03:26] Like
[00:03:26] 1. How an Agency Operating System Reduces Silos
---
[00:03:26] Phil: when you talk about building an actual operating system or an operating engine for agencies, what do you mean by that exactly, and, and what's in scope? What do you think most teams are kind of missing?
[00:03:37] John: Yeah, I think, uh, the biggest departure for us from what you would traditionally see within other agencies, and really the track that we've been on for years, is not just focusing on data automation or routing data to the right spots, like the typical things you would think of with marketing ops, but more having a customized layer [00:04:00] within our agency that's built specifically for us.
[00:04:03] And to expand on that, what I would say is our OS, we call it Nova. Um, it's really focused on the specific functions that we have throughout the agency and helping to automate those, or at least give our teams a central place to conduct those operations. So what I mean by that is things like building contracts for clients using a library of all the services that we provide. There's different layers to that. The library is like guidelines for people on what they can sell to clients and for how much. Um, but then there's also automation built into that so that they don't spend as much time doing it. So with an agency OS, um, we're really trying to fix this problem with agencies where there's so many different tools and platforms that they work on that [00:05:00] inherently creates silos for most people. And with one agency OS kind of tying it all together and being focused on the operations of the agency, um, it just provides a central spot for everybody to work off of, which creates efficiency and alignment.
[00:05:18] Phil: Yeah, there's so many tools that agencies log into, and there's so many clients that use so many different tools. Like when I was on the agency side, really early in my career, I felt like half of my time was just remembering logins on all, like, the client tools and stuff like that. But like the dashboards in all of those tools too, it's easy to just kind of lose sight of that.
[00:05:39] And I, I dreamt of this world where all of that stuff could be in one place, where we could have like a, a command center for it. I mentioned dashboards there 'cause like
[00:05:47] 2. Why Context Driven Analytics Replaces Dashboards
---
[00:05:47] Phil: you've argued that dashboards are kind of dead weight, like people admire them briefly. Like you see the cool visualization and then, you know, maybe they throw a screenshot in a PowerPoint deck but then it kind of like dies somewhere.
[00:05:59] Instead [00:06:00] you've advocated for analytics that are embedded, context-aware, and opinionated. Can you walk us through what that looks like in practice and, and what should kind of replace dashboards inside high performing teams?
[00:06:12] John: Yeah, I think, um, if you're an agency, you have to have an opinion. Obviously that's what people pay you for: who has the best opinion, who's gonna drive the, the highest performance. So you have to kind of come to what that opinion is. But then when you think about dashboards and analytics, it's one thing to just purely show the raw data in a dashboard.
[00:06:34] I think for 90% of people out there, that's what they do. For us, anything that we show that is measurement or analytics related, uh, is gonna have our opinion tied to it in the context of how we would measure things. So everything that we build as part of our product, Nova, um, has this first step of: does this align with the methodology of how we should be looking at this data? [00:07:00] Um, and if not, then it's not as useful to the team as, as we want it to be. Right. The other piece of it, when it just comes to dashboards in general and analytics and how it's done today: um, I think we're in a really interesting time where it's table stakes to be able to have the data and be able to produce a visual around it. Um, but it's so much easier nowadays to have context-aware data. Um, meaning being able to put your files, your context, your communications with clients, um, into an engine that then fuels how analytics are shown to them. If I want to convey performance to a client, um, that performance isn't just the data itself. It's everything around that. It's the context to it, it's product launches that they had. It's all those, uh, things that don't show up right in a database. Um, and that's really where our, our [00:08:00] product is focused: melding the context aspect with the data aspect and having opinions, right? Like we're a very opinionated agency and that needs to show up, uh, in whatever we're showing the client. But not to go too far with it, I would just say, uh, dashboards I think will start to go away.
[00:08:20] I think that just having analytics dashboards, like, it's just, I think it's gonna be a thing of the past as we start to move into these more, um, natural conversations with data, um, and having more context-aware engines behind them.
[00:08:38] Phil: Very cool. Um, I, I come from a, a long line of, uh, dashboard practitioners. I actually started my career at a BI, uh, shop, a local startup. Our mandate at the time was centralizing data in this one tool. The company was called Klipfolio; they're, they're still kicking around. Um, but like data viz and this whole [00:09:00] idea of connecting things by API and having it all in one place, there's still today, 15, 20 years later, a ton of talk about this idea of a source of truth and then having a dashboard on top of that source of truth.
[00:09:15] 3. Building a Single Source of Truth in Marketing Data
---
[00:09:15] Phil: With like how fragmented client campaigns are, like cross channel, cross team, cross tool. Do you think it's even realistic for agencies to build like a single source of truth on top of the data warehouse? Or is the better goal like building something that's like a flexible interface around multiple sources of truth?
[00:09:36] Like what are your thoughts there?
[00:09:38] John: I think our product strategy in the past has always been positioned towards that goal of being the single source of truth for Power Digital, our agency. Um, and not to knock data visualizations too hard. I mean, that's a big piece of what Nova is today. And you know, I think that's where most people are. [00:10:00] Our whole goal has been to take all the various data sources that marketers have to work with every single day. I mean, they're countless, right? There's the big platforms, Meta, Google, TikTok, but then there's, you know, all these measurement platforms. There's, uh, one-off affiliate tools. The list goes on and on and on. So I do think that it's possible to get to a source of truth where you can layer in all of those different sources and be flexible enough to allow any business to use it as a source of truth. But I think that there will always be arguments about what the source of truth should be, or arguments with us on how we're looking at the data. Um, I think it's more about: is the source of truth real for us as an organization? Is everybody aligned that this is the source of truth? And so I actually see, see it as more of a people versus a product problem. If your people aren't [00:11:00] aligned that this is the source of truth, the platform, um, then you don't have one. But if they are aligned, then it is a source of truth for your opinionated approach.
[00:11:13] Phil: I like how you say that, because when we chat about source of truth on the show, a lot of people will say: we have sources of truth in our company, and the source is different based on the team that you ask. And so, like, for the sales team, my source of truth is Salesforce. I live in the CRM, not in the data warehouse.
[00:11:34] Like I'm not in the marketing tools. My source of truth, the shit that I care about, is in Salesforce. But like the end user's perspective of the source of truth is one thing. And I think the marketing operations, the revenue operations perspective of the source of truth behind the scenes, the, the operational layer, is the data warehouse for a lot of companies.
[00:11:55] And a lot of your, um, your philosophy around that is, is also around the [00:12:00] data warehouse. I know you guys are, are big, uh, Snowflake partners there. But, um, yeah, I, I'm curious to dig a, a bit further into your take on why, despite, you know, a lot of the products around the agency right now being dashboards and analytics,
[00:12:16] you think that's kind of going away. And I think you teased that out a little bit. Like there's already a lot of cool tools with, uh, natural language processing that allow you to have this kind of chat-like interaction with data. So instead of looking at a dynamic or a static dashboard and being overloaded with a bunch of signals and information, like when you start your day with the dashboard.
[00:12:37] You're typically trying to figure out like, all right, what happened yesterday? Is there something that I need to be doing today based on what this dashboard is telling me? And there's always this like cognitive load of the human digesting the information, trying to figure out is there like a red flag somewhere that I should dig into?
[00:12:54] Versus this, like, a bit more, not really far out futuristic, like a lot of these tools are [00:13:00] doing this today, but you get a ping when you start the day, like a Slack bot or something tells you, hey, uh, traffic grew 30% yesterday. Not a big deal, we know you had a campaign launch. So that, like, context, uh, awareness that you kind of mentioned there.
[00:13:14] But yeah, I wanna like, uh, let you unpack that a bit further, I think is super interesting.
[00:13:18] John: Yeah, we, um. I think where our platform has been is where a lot of people have been, where you serve up all the data and you say, we did it. We are flexible enough to where, to your point, the sales team uses Salesforce, but we have that data here too. And then client services has their data around what they're managing with clients.
[00:13:40] But we have that here too. And we did a really good job of consolidating all those sources into one spot so that everybody's working outta the same place. The thing that we overlooked is people don't wanna spend their time digging through all of the data.
[00:13:55] They don't wanna wake up in the morning and look at a chart and say, [00:14:00] okay, now let me spend the next 20 minutes analyzing it and coming up with what it means,
[00:14:05] Phil: Yeah, and sometimes you don't have the choice 'cause your CEO is already at your door and they're saying like, why are free trials down
[00:14:11] from last week? And you're like, all right, there goes my morning, I'm
[00:14:14] investigating data.
[00:14:16] John: Right. And it's, it's such a wonderful spot to be in right now, because there's so many tools that at least can take you maybe 80% of the way of: here's the data for the day, paint the picture for me. And that's what we're trying to do with Nova. Really, instead of giving everybody the data and expecting them to spend all their time going through it and taking action on it, to make that really impactful for us it's important to give them those insights, like hand feed it, make it so easy to where they don't have to spend any time other than reading through the high level insights, and then they're off to the races. Um, but even your example of, uh, [00:15:00] notifications and alerts on, like, anomalies. Like we have anomaly detection within Nova, and if you're managing a client, you're assigned to that client in Nova, and it's gonna track performance changes for that client. And it could be good and it could be bad. And that's where the context engine comes into play. It could say traffic dropped by 50% yesterday. Um, but if we can also have this objective context of, well, they had some maintenance going on yesterday with their site or whatever, and it was planned for, and that's why, um, then we can start to adjust and hand feed that to the, the client services team and say, well, this is why.
[00:15:42] Don't worry about it. We'll keep, you know, tracking what's going on here and detect changes and let you know. People, you know, they don't wanna spend their time being the person that needs to go and do that. They just want that front and center: what's going on.
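A minimal sketch of the pattern John describes here: flag an anomaly from a metric series, then check a log of contextual events (planned maintenance, campaign launches) before deciding how to message it to the client services team. All names, thresholds, and data are illustrative assumptions, not Nova's actual implementation.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean, stdev

@dataclass
class ContextEvent:
    day: date
    label: str           # e.g. "planned site maintenance", "campaign launch"
    explains_drop: bool  # whether this event is expected to depress the metric

def detect_anomaly(history: list[float], today_value: float, z_threshold: float = 2.0):
    """Return (is_anomaly, z_score) using a simple z-score over the trailing window."""
    mu, sigma = mean(history), stdev(history)
    z = (today_value - mu) / sigma if sigma else 0.0
    return abs(z) >= z_threshold, z

def explain(metric: str, today: date, history: list[float], today_value: float,
            events: list[ContextEvent]) -> str:
    is_anomaly, z = detect_anomaly(history, today_value)
    if not is_anomaly:
        return f"{metric}: no unusual movement yesterday."
    change = (today_value - mean(history)) / mean(history)
    context = [e for e in events if e.day == today and e.explains_drop == (change < 0)]
    headline = f"{metric} moved {change:+.0%} vs. the trailing average (z={z:.1f})."
    if context:
        reasons = "; ".join(e.label for e in context)
        return f"{headline} Likely explained by: {reasons}. No action needed, we'll keep tracking."
    return f"{headline} No known context, flagging to the client services team."

# Example: traffic drops ~50% on a day with planned maintenance on the calendar.
history = [10_000, 10_400, 9_800, 10_200, 10_100, 9_900, 10_300]
events = [ContextEvent(date(2025, 6, 10), "planned site maintenance", explains_drop=True)]
print(explain("Sessions", date(2025, 6, 10), history, 5_000, events))
```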
[00:15:58] Phil: Super cool. [00:16:00] Um,
[00:16:00] 4. Building an AI Cockpit Before AI Copilots
---
[00:16:00] Phil: you've said that most teams right now, in terms of building AI products, are, are thinking about this like race to build AI copilots. A lot of folks are, you know, messaging themselves like that: we are the co-pilot for the marketer or the sales team. And a lot of folks are kind of not thinking about building the cockpit first.
[00:16:21] Like what's in that cockpit analogy for you? Like what foundational systems or data models or feedback loops do you think need to be in place before AI can actually like meaningfully help marketers?
[00:16:36] John: Yeah, I think, um, AI is only as good as what you put into it, right? Garbage in, garbage out. Um, I think models are getting better and that, you know, becomes less of a problem. But one of the investments that we've made as an agency, and really I think what our biggest asset is, is how much time and effort we've put into, [00:17:00] again, this single source of truth data layer behind the agency. And when I say that, most people will immediately go to, like, a data layer of just performance metrics around clients. I'm talking about a data layer of here's when this client started with this, here's when they ended, like everything operational too. So a mix of all of that becomes the cockpit that is ultimately more effective when I then go and layer AI on top of it, because I can feed countless, countless pieces of data into this engine that then has more information to work off of and can effectively become a better co-pilot, um, because of that structure around it. So I think the difference is a lot of people, or a lot of tools out there, may market themselves as the co-pilot, but that co-pilot is based off of that company and what they have and the model that they put together. [00:18:00] It can run, it can work, but it's only gonna be as good as what people feed it. And if I can start from the place of: I have billions of data points to feed into that system, that's gonna be way more effective in the long term. Or even with instant gratification, the first time I use it, it's gonna be way more effective.
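A toy sketch of the "cockpit before copilot" idea: unify operational records (engagement start and end, services sold) with performance metrics in one layer, then flatten that layer into the context an AI assistant would receive. The field names and the assembly step are assumptions for illustration, not Nova's schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ClientRecord:
    # Operational facts, not just performance metrics.
    name: str
    engagement_start: date
    engagement_end: date | None
    services: list[str]
    # Performance facts pulled from ad platforms, CRM, etc.
    metrics: dict[str, float] = field(default_factory=dict)

def build_copilot_context(record: ClientRecord) -> str:
    """Flatten operational + performance data into the context block an LLM prompt could receive."""
    status = "active" if record.engagement_end is None else f"ended {record.engagement_end}"
    lines = [
        f"Client: {record.name} (started {record.engagement_start}, {status})",
        f"Services: {', '.join(record.services)}",
        "Last 30 days: " + ", ".join(f"{k}={v:,.0f}" for k, v in record.metrics.items()),
    ]
    return "\n".join(lines)

client = ClientRecord(
    name="Acme Outdoors",
    engagement_start=date(2023, 4, 1),
    engagement_end=None,
    services=["Paid Social", "SEO", "Lifecycle"],
    metrics={"spend": 84_000, "revenue": 312_000, "new_customers": 1_950},
)
print(build_copilot_context(client))
# The point of the cockpit: the copilot is only as good as this layer, so a richer,
# operational-plus-performance context makes the same model far more grounded.
```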
[00:18:23] Phil: You're almost like, I'm curious to get your take on, like
[00:18:26] 5. Why Data Accuracy and Transparency Build AI Trust
---
[00:18:26] Phil: what do you think makes this agency operating system worth trusting for the marketer? Like we're at this inflection point where a lot of folks are excited about some of the new advancements and they're playing around with it, but I think a lot of folks still have this trust issue. Uh, like we understand the capabilities that AI has to analyze data and surface insights.
[00:18:51] The next step is like recommendations on what to do based on that stuff. And that's where some of the trust factor comes in, because, like, sometimes [00:19:00] we don't give it all the context. Like sometimes it's not the AI's fault, it's that we, we didn't add that layer of, of, of context there. Like basically, with this system that you're painting, we're asking teams to rely on an internal system for guidance, not just analyzing something and telling us what happened.
[00:19:19] How does the AI or, or the, the product that you're building, like, earn that trust from the users? Like what signals tell someone, I actually don't need to double-check this thing the AI is telling me to do, I can just go ahead and act on it? What are your thoughts there?
[00:19:36] John: Yeah, I think, um, as soon as you have something that has people second-guessing the data or the insight that you're providing 'em, then you've lost that trust. I think you earn that trust through, one, data accuracy. That's a big piece of it. I mean, as a marketer, I don't know how many times I've logged into new tools that we've signed up for, and I look at [00:20:00] something, and if I see one number that's slightly off, I immediately go, I need to talk to the rep.
[00:20:08] I need to, I have a ton of questions. How are you looking at this? You know, so that's, I think, a big trigger for people in terms of trusting these different platforms that they use: first and foremost, data accuracy. But I think the second layer of trust comes from allowing people to move quickly. If they can trust that the data is there, but they can also trust that, hey, this is gonna save me time and this isn't just an added step in my process day to day, they're gonna trust that as kind of the backbone that they're working off of. Um, and it goes for clients too. I mean, our, our platform is available for clients, and for them to trust the platform, they need to see that there's activity from our team, and they need to see that, hey, working with this company, I know a lot of my time is [00:21:00] saved in this relationship because of this platform in front of me.
[00:21:02] So the trust, I think you, you can't just build products that add a step to people's process. Like all they want is to cut down on the number of steps. And as soon as you start to do that, then they're like, I am the number one advocate, I will market this for you. Like they will become, uh, a trusted user, or they'll trust you.
[00:21:25] Right.
[00:21:25] Phil: Yeah, so true. I, I think the other thing for me too is transparency. Like so much of AI recommendations are a complete black box, and it's like, trust us, just go do this thing, we have an algorithm that we're basing this decision off of. But there's a lot of AI products that are building transparency and activity logs behind the scenes.
[00:21:48] And when you get this recommendation and you still don't have a hundred percent of that trust, you can click through and see: how did it come up with this recommendation to do this? And some of the earlier forms of this [00:22:00] were in, um, machine learning lead scoring tools. Uh, like Amplitude, Mixpanel were some of the first ones to do this.
[00:22:07] I forget which one it was, but it was like, we put all of your leads into A, B, C, or D. A are gonna convert without a human touch, B need a bit of love, C and D probably aren't gonna convert. And it was like, all right, cool.
[00:22:21] John: Right.
[00:22:21] Phil: How the hell did you come up with that? And like there was nowhere to dig into that, and it was like a black box, and they were still kind of working on it.
[00:22:28] But a lot of tools today allow you to say, all right, how did it come up with that? You click on it, the activity log shows: alright, this person fits the ICP because they're at a 500-person-plus company, they're a senior director, they have budget for this, they're a Series C. Like, you see all of the qualifiers that led to that decision.
[00:22:44] What are your thoughts on building this as, like, anti black box, or, I don't know what the term is, like white box as much as possible?
[00:22:52] John: Yeah, I think, um, transparency is key and I think it has always been like that with analytics, especially [00:23:00] building products that are supposed to be analytics, uh, solutions, right? Or, or at least analytics platforms. Like if I look at a chart, first thing I wanna know is how does this work? Like, what are the calculations that you're using to get to that metric?
[00:23:15] So for all the products that we've built, um, for clients and for, for internal employees, like the first question that people always have is, what is the model behind that? Like, how does that work? And so we've had to build those layers into it. But I think the really interesting, uh, thing that's going on right now with AI, there's this big battle obviously with the underlying models of, you know, who's, who's the best at this point in time, right? And like they're constantly measuring against each other. But what it's created, for people who are using those models to build other products, is this flexibility of being able to switch between models. [00:24:00] If I wake up tomorrow and Gemini is outperforming OpenAI, now I have a little dropdown where I can select Gemini, right?
[00:24:10] And instantly change how a ton of things might be functioning. So there's that. But I think what has also been really effective, and kind of props to the product people behind this, is designing, especially for these chat systems, designing the thinking behind, uh, what these different solutions are doing.
[00:24:31] Like whether it's agents or, um, even like RAG systems. RAG systems will show you, like, the SQL query that was used to pull the, the answer out. Um, but then with agents, right, it's literally telling you, what I'm doing right now is I'm gonna go and look up this data from this source, and then I'm gonna analyze it this way, and here's the logs behind that. Like I think that transparency has been so key because of how much of a black box, um, [00:25:00] these things have been and how fast it's moving.
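A hedged sketch of the two patterns John calls out: a model dropdown that swaps the underlying provider without touching the rest of the product, and a step log that exposes what the system did (the query it ran, the analysis it performed) instead of hiding it. The provider names and stub functions are placeholders, not any vendor's real API.

```python
from dataclasses import dataclass, field
from typing import Callable

# Swappable "models": in a real product these would call OpenAI, Gemini, etc.
# Here they are stand-in functions so the sketch stays self-contained.
MODELS: dict[str, Callable[[str], str]] = {
    "openai-stub": lambda prompt: f"[openai-stub] answer to: {prompt}",
    "gemini-stub": lambda prompt: f"[gemini-stub] answer to: {prompt}",
}

@dataclass
class AgentRun:
    model_name: str
    steps: list[str] = field(default_factory=list)   # transparent log shown to the user

    def log(self, step: str) -> None:
        self.steps.append(step)

    def answer(self, question: str) -> str:
        self.log(f"Selected model: {self.model_name}")
        sql = "SELECT SUM(revenue) FROM orders WHERE order_date >= DATE '2025-01-01'"
        self.log(f"Ran query: {sql}")                 # RAG-style transparency: show the query
        self.log("Analyzing result against last year's baseline")
        return MODELS[self.model_name](question)

run = AgentRun(model_name="gemini-stub")              # the "dropdown" is just this one string
print(run.answer("How is revenue pacing this year?"))
print("\n".join(f"- {s}" for s in run.steps))         # the white-box part: show the work
```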
[00:25:03] Phil: Yeah, for non-technical users, you can get a taste of this using Deep Research on, on OpenAI. Like you see the whole step-by-step tasks that the, the, the tool is running there. And I was using it the other day for research for, for an episode, and, uh, plugged a question in, and one of the first steps was like, uh, reading neilpatel.com.
[00:25:24] And I was like, Nope. All right. Not trusting this answer, like, stop this research right now.
[00:25:29] You can kinda like, get a taste for what they're doing before.
[00:25:32]
[00:27:37] Phil: I, I feel like your perspective on this is really cool because you're not just building an agency, you're building internal products to support that agency, but also selling those internal products externally to, to other companies and, and other clients.
[00:27:51] And I think a lot of agencies and product companies have this debate at, at this inflection point throughout their journeys. Um, a [00:28:00] startup I worked at, it's still kicking around, they're called, uh, Close CRM, and they actually started out as an agency. They were like a sales agency, they were doing outbound stuff for SaaS clients, and eventually they started creating internal tooling because none of their clients wanted to use Salesforce.
[00:28:17] So they built an internal, hacked-together CRM, and then they pivoted to just selling that CRM, because, you know, that's a lot easier than selling services. And, um,
[00:28:28] 6. Building Internal Data Products for Agencies
---
[00:28:28] Phil: I'm curious to get your take on your, your philosophy for building internal data products for agencies. Uh, agencies specifically are interesting 'cause they build a lot of internal tooling and they're stitching together off-the-shelf platforms.
[00:28:42] Um, you've actually taken the route of building a product like Nova, and I'm curious to get your take on, like, when does it make sense for an agency to become a product company internally? Have you had these, these philosophical, uh, debates a lot? What are your thoughts there?
[00:28:58] John: We, we've constantly had debates [00:29:00] around, um, Nova and whether or not... it was born out of being an efficiency tool, and so it was always for us, always for Power Digital. Um, but then it hit a point, right, where we're doing some really complex analysis with what we can do with, uh, the client's data and unifying all of that.
[00:29:22] Like it then started to become, well, other people could really take advantage of this and would be interested in that. So there's definitely been healthy debates on should this also have kind of a SaaS arm to it, or at least give more accessibility to more businesses. And so that's something that we're, we're still considering. But I think on the side of, like, when does it make sense to build internal tools, I think that there's two different factors to it. One is your operations, and every company is different. Um, and it's [00:30:00] my belief that if you have a system that can grow and adapt with you, um, then you're going to, you're gonna get the most out of, uh, that tool versus tools that are off the shelf. Um, and why I think that is, I've just seen it happen, right, where we've bought tools before. It's not like Nova is the only tool that we have as an agency. Um, but there comes a certain point where we change the structure of teams or we change processes that we have within the company, and all of a sudden this tool that we're paying, you know, a lot of money for, uh, can no longer adjust to that.
[00:30:39] Right? But what has always been constant since we started building Nova, um, is that it can grow and adapt with where the company's going. And we've definitely made those changes as those things change. And so it just gives you that customization, right, to be for your business and no one else. Um, but I [00:31:00] think also on the external side of things, especially within the marketing world, um, what we've really shifted our focus to is our clients' first party data, with just the lack of signal that we get from platforms, also just the lack of alignment across these platforms. And, um, we talk a lot about contribution versus attribution and painting a picture of here's how each, uh, different channel and platform contributes to your overall business success.
[00:31:31] Like, that's, that's our methodology or opinion, right? And, uh, that's very hard to do outside of, uh, your own personal tool or, like, your own thought-up method of doing so. And, uh, with Nova we can ingest all of that first party data from our clients as well as all that ad platform data, and then have our own model that really paints that picture the way that, you [00:32:00] know, it should be painted, versus using different platforms and stitching it together and saying, we think this is what's going on. Um, so it's really helpful on that side of things. You know, we could get other measurement platforms, there's other solutions out there that work the same way, but for us, it allows us to be flexible, customized, um, but then shift with the market and have our own point of view on, on how we should be looking at this stuff.
[00:32:26] Phil: What do you mean by contribution versus attribution? Is that like incremental contribution to revenue versus multi-touch attribution fighting for who should get credit for this or that?
[00:32:38] John: Yeah, exactly. Um, one of the things that we've always said is all we care about at the end of the day is that we're driving business performance for our clients. We're platform agnostic, channel agnostic, um, but we obviously need to know which channels are working and which are not. Um, but it's a mistake to [00:33:00] throw all of your eggs in one basket or eggs in one platform, I guess.
[00:33:05] Um, and that's really, you know, with multi-touch attribution, how it's been done historically. You know, you tend to over index on one area, right? And you put your money into an area where it's either saturated or it's not actually, uh, performing the way that it's reporting back to you. And so a lot of our, a lot of our initiatives within Nova have been to paint the picture of incrementality and showing, uh, what's working, what's not, and what the realistic picture of X platform is. Like, what is it really doing for your business? Um, and that's been very, very effective with clients. And oftentimes it'll shift their budgets on day one, where they'll get the picture of, oh, this is what it's actually contributing to the business versus what they're telling me. Um, so that's always [00:34:00] been kind of our opinion: you gotta look at it in that, that view versus, uh, honestly, what those platforms are telling you.
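To make the contribution-versus-attribution distinction concrete, here is a deliberately simplified sketch: last-touch attribution hands all the credit to the final channel, while a contribution view spreads credit by each channel's estimated incremental lift. The journeys and lift factors are made up, and Nova's actual model is certainly more sophisticated (holdouts, MMM, etc.); this only illustrates why the two views can tell very different budget stories.

```python
# Last-touch attribution vs. a simple contribution view, illustrative numbers only.

journeys = [
    (["Organic", "Email", "Paid Search"], 120.0),   # (touchpoints in order, order value)
    (["Paid Social", "Paid Search"], 80.0),
    (["Organic", "Paid Search"], 60.0),
]

# Last-touch: the final touchpoint gets 100% of the credit.
last_touch: dict[str, float] = {}
for touches, value in journeys:
    last_touch[touches[-1]] = last_touch.get(touches[-1], 0.0) + value

# Contribution: weight each channel by an assumed incremental lift factor
# (e.g. from holdout tests), then allocate revenue proportionally per journey.
incremental_lift = {"Paid Search": 0.3, "Paid Social": 0.9, "Email": 0.6, "Organic": 0.5}

contribution: dict[str, float] = {}
for touches, value in journeys:
    weights = {t: incremental_lift[t] for t in touches}
    total = sum(weights.values())
    for channel, w in weights.items():
        contribution[channel] = contribution.get(channel, 0.0) + value * w / total

print("Last-touch credit:", {k: round(v) for k, v in last_touch.items()})
print("Contribution view:", {k: round(v) for k, v in contribution.items()})
# Paid Search looks dominant under last-touch but far smaller once incrementality
# is accounted for, which is the day-one budget shift John describes clients making.
```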
[00:34:09] Phil:
[00:34:09] 7. Reducing Complexity in Martech Product Development
---
[00:34:09] Phil: I feel like we could have a whole separate episode on
[00:34:11] attribution there, but the, like, the space is so complex that it's never been harder for agencies and marketing teams today to measure the impact of, of marketing.
[00:34:23] You're building a tool around this space here, and I wanted to ask you about this idea of shifting versus removing complexity as you're building a product. You've actually admitted that earlier in your career you built features that you thought were really powerful on paper, people loved them, but then when they were rolled out to end users they weren't used to the level you thought, and the feedback you got was that
[00:34:46] they added more complexity instead of actually removing that complexity. How do you think about that now when you're shipping new products in this analytics space, or even internal tools for Power Digital?
[00:34:59] John: Yeah, I, I [00:35:00] think you see it not just in the marketing space, but for product in general over the past few years with just the AI boom and different capabilities coming about. Honestly, I think it dates back further than that, but especially in the past year or so, you saw a lot of people come out with: here's this big, shiny new toy that is a new capability that you've never had before. A lot of those, a lot of those will land, and people will take those and they'll use them, and now they do have a good, like, a new capability. But I think the mistake that people make is thinking that just because it's a new idea, uh, people are gonna gravitate towards it and they're gonna use it. Let's all be honest here. Most people have their full-time job, and they've established a process for that job, and they've established their schedule and their routine. Adding a new thing into it is maybe the most difficult thing that you can do as a product person. [00:36:00] And I'm not discounting where people are successful with that.
[00:36:03] For sure. It happens all the time. Um, but at least for us in our product strategy, I mean, how we used to do things was: let's sit down and think of the next idea, let's come up with something that does not exist. Um, and a lot of the time that leads to very large products. Um, it obviously creates gaps in terms of what people are doing now versus what they would need to do with this new tool. Um, and so we failed a few times for sure in terms of giving something to users that's new and saying, this is the next best, greatest thing that you could use. And then they look at it and it's got like 15 inputs that they need to fill out, and they don't understand the data, and they don't understand the model behind it, and they don't know where it's coming from. And so within a few seconds they're like, eh, I am not gonna spend my time on it. So I think that's where the problem comes from, or the mistake that you can make. But where we've [00:37:00] shifted to as far as our, our product strategy and how we ship products: um, it's important for us to be iterating. Like I think we've done the work as far as building the, the structure and the backbone of what we need. Um, but now it's literally just about working with the actual employees within the agency to understand their day to day, and then iterating on the product in a way that anything we ship out there isn't large in scale, so people can understand it, um, but also so that it's backed by the user who informed it and that we're actually saving them time. So, to sum that up, instead of thinking big and shipping big, it's not necessarily thinking small, but shipping small. Like we want to put things out there that are smaller, easier to understand, and over time influence process with those small changes.
[00:37:57] Phil: Love it. It's like grounding product [00:38:00] decisions in user empathy and understanding that day to day. It's almost like, instead of building products in search of a problem because you think they're really cool, building products for existing problems and understanding those problems deeply by chatting with actual people and, and understanding their day to day, right?
[00:38:20] John: Right, exactly. And we don't want to, I, I mean, I think you saw that in 2024, when a new capability came about, right? Everybody could start to plug in these different models to their products. And you saw a lot of products that were basically ChatGPT with a different skin over the top of it, right? And that's where I think people have failed a little bit: they think big and they say, here's this, everybody's obsessed with AI, here's this new product that's this new branch of AI and it does this one thing. The most successful have been, you know, the underlying ChatGPTs, [00:39:00] the Geminis, because they're just improving, adding little iterations that allow people to do different things within their product, versus saying, I'm this big new thing over here that you can add into your process, you know?
[00:39:13] Phil: Yeah. Yeah. So true. What's, like,
[00:39:16] 8. How To Tell If An AI Tool Is More Than A Wrapper
---
[00:39:16] Phil: you're building a product, you're obviously super technical, and you, you understand the, the background for maybe spotting what is a GPT wrapper versus something that's a bit more innovative. Like what's, what's your litmus test for knowing whether a new AI feature, a new product, or an integration is genuinely useful versus just hype or another wrapper?
[00:39:38] Like how do you cut through the noise there?
[00:39:41] John: I, I think it goes back to the transparency and visibility aspect behind how these products think. If, if they're not giving me that, it almost is kind of step one of, uh, I don't know if this is as effective as it could be, because I wanna know how they're coming up with the answers, right? It's [00:40:00] magical that these tools can do this and spit out the answers for you so quickly. Um, but I need to know where it's coming from. And so I would say that's the first thing. Um, but there's a lot of tools out there where you just, you plug in your question, or it's positioned at a, a specific thing, and then what you get back is the same answer that you'd get from ChatGPT, right? So it, it also goes back to that point of opinions and sources. Are these tools actually rooted in a methodology? How much is feeding that context behind the answers that it's giving you? How structured is it? Those things are really easy to spot with some of these tools, whether they have 'em or not. And, uh, I think the most effective ones, uh, for us right now are the ones that do have an opinion and can show us the transparency behind how they work. I'd go a step further too to say, I think 2024 [00:41:00] was a lot of that vaporware. Uh, let me just throw this new skin on top of one of these models. Um, but I think this year it's been really fun to watch the, the agent architecture and, like, people who are doing that really well. Um, because the biggest thing for me personally that I look for right now is, can I plug my data architecture into your tool and make use of that? So it's exciting that that's where we are now versus just kind of the surface layer ChatGPT-like tools.
[00:41:30] Phil: Yeah, you, you've actually used terms like nudge and removing operational drag to describe some of the vision of, of the, the product capabilities that, that you're building. Um, maybe on this topic of figuring out what is kind of hype and, and what is a bit more interesting and, and serious, like what does that actually look like?
[00:41:51] Like systems that nudge and think and can you maybe give us an example of a, a workflow or system that intelligently guides a team [00:42:00] through execution without slowing them down? I'm curious your thoughts there.
[00:42:04] John: Yeah. One of the applications that, uh, we built a couple years ago, it's called Creative Affinity, and it's for our paid social, paid media, and creative teams. And the differentiator, 'cause there's obviously a ton of different creative reporting tools out there, but the differentiator for us is that first party data aspect. So what we built in the backend of our tool is an algorithm that's able to match an actual customer from their customer bank to the creative that that customer engaged with, purchased through, all that. And then what we show is the value of those customers tied back to the creative. That was the first step in kind of giving all of that data to everybody, and it was, again, that mistake of just give them all the data and hope that something comes from it. Um, they looked at it and they said, there's so much to dig through here. I think we had like a [00:43:00] million creative assets in this application. And they're like, I have no direction on where to go.
[00:43:05] Phil: Right.
[00:43:07] John: This would take me days to look through. And so very quickly it turned into: how do we nudge people along in the right direction, or how do we give them the right focus when there's this much data out there? And so we ended up building a feature called playbooks, and these playbooks are using the same data source, or using that data from the application, but condensing it down into what's most important to see, or what are the top creatives, or what are the insights that you should really grab from, from all of this. Um, and all of those playbooks that we built are basically the brains of the top strategists within our creative, paid social, and paid media teams. So it's, it's a matter of scale, because now we have basically their brains baked into [00:44:00] how they would look at the data and identify insights. But now I can give that to 200 other employees and say, here are the insights and here's the way to think about this data.
[00:44:09] So that's how, um, that's just one example of how we're kind of nudging people in the right direction of the way to think about things and even draw inspiration from, from that.
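A rough sketch of what the Creative Affinity matching step sounds like: join first-party customers to the creative they engaged with, roll customer value up to the creative level, then have a "playbook" reduce the full table to the kind of insight a strategist would pull. The join key, data, and playbook rule are assumptions for illustration, not the actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    lifetime_value: float
    first_touch_creative: str   # creative asset the customer engaged with before purchasing

@dataclass
class Creative:
    creative_id: str
    theme: str                  # e.g. "UGC testimonial", "product demo"

customers = [
    Customer("c1", 420.0, "cr-ugc-01"),
    Customer("c2", 95.0, "cr-demo-02"),
    Customer("c3", 510.0, "cr-ugc-01"),
    Customer("c4", 60.0, "cr-demo-02"),
]
creatives = {c.creative_id: c for c in [Creative("cr-ugc-01", "UGC testimonial"),
                                        Creative("cr-demo-02", "product demo")]}

# Step 1: roll customer value up to the creative that acquired them.
value_by_creative: dict[str, float] = {}
for cust in customers:
    value_by_creative[cust.first_touch_creative] = (
        value_by_creative.get(cust.first_touch_creative, 0.0) + cust.lifetime_value
    )

# Step 2: a "playbook" condenses the full table into what a strategist would actually say.
def top_creative_playbook(values: dict[str, float]) -> str:
    best_id = max(values, key=values.get)
    share = values[best_id] / sum(values.values())
    return (f"Top creative: {best_id} ({creatives[best_id].theme}) drives "
            f"{share:.0%} of matched customer value. Scale this theme first.")

print(top_creative_playbook(value_by_creative))
```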
[00:44:20] Phil: Super cool. Reminds me of something else that you said, like the, the best tools treat users as decision makers, not operators. How, how do you think that's changed your approach to product design for, for Nova and some of the other tools that you, you rolled out internally? Can you maybe share a couple of tactical ways where, um, the tools reduced the number of steps but increased the clarity for users, especially on the agency side, who, like you said, are overloaded with a bunch of insights and not really sure what to do with them?
[00:44:53] John: Yeah, yeah. We've built a backbone of, uh, what we would call like a knowledge graph, where [00:45:00] there's a little bit more context and description behind the metrics, uh, that are available within the platform. That's where that opinion layer comes from, and it gives a little bit more information behind what some of this data means. Um, and where we took that is really being able to serve up those insights to people, but not necessarily say this is exactly what to do next. And so they have different options, right? If somebody gets an alert through the platform that's like, hey, we just noticed that this customer segment from our customer insights application, we noticed this customer segment, uh, could drive a 20% increase in your lifetime value for customers who have bought three times.
[00:45:47] Like it's very specific, but we don't necessarily say, like, send that to the client or take this action right now. We give them the options. We say: draft an email, or draft a Slack message, [00:46:00] or, uh, put this into the next weekly update for the client, or try again, like, no, this doesn't, this doesn't make sense. And when they hit try again, then it's: add more context.
[00:46:13] Why does this not work? And so we start to build that system even more. But I would say from, like, the design side of things and how the, uh, the application kind of comes together for people: we want it to give people direction. We don't want it to be rigid in terms of, this is exactly what you need to do. And that's really important and is highlighted in the design of the application itself, uh, in the flexibility and the capabilities that they have to kind of make it personalized to them.
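A small sketch of the "decision maker, not operator" pattern John describes: the platform surfaces a specific insight plus a menu of next actions, and a "try again" path feeds the user's objection back in as extra context for the next pass. The action names and feedback handling are illustrative assumptions, not Nova's actual feature set.

```python
from dataclasses import dataclass, field

@dataclass
class Insight:
    message: str
    actions: tuple = ("draft_email", "draft_slack", "add_to_weekly_update", "try_again")
    feedback: list[str] = field(default_factory=list)   # context added when users hit "try again"

    def act(self, choice: str, note: str | None = None) -> str:
        if choice not in self.actions:
            raise ValueError(f"Unknown action: {choice}")
        if choice == "try_again":
            # The user explains why the insight missed; that context refines the next pass.
            self.feedback.append(note or "no reason given")
            return f"Re-running with added context: {self.feedback[-1]}"
        # The other options give direction without dictating one rigid next step.
        return f"Drafting '{choice}' for: {self.message}"

insight = Insight(
    message="Customers with 3+ purchases in the 'repeat gifters' segment could lift LTV ~20%."
)
print(insight.act("draft_slack"))
print(insight.act("try_again", note="This segment is already covered by next month's campaign."))
```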
[00:46:48] Phil: Super cool. Um,
[00:46:49] 9. How to Build Client Portals That Clients Actually Use
---
[00:46:49] Phil: I wanted to ask you, I, I'd be remiss not to, to pick the brain of, um, someone who's doing a ton of stuff supporting agencies but also being on the agency side. Like, one thing from my agency days that was a big pain for me was the client portals, and, and having specific areas to manage, like, almost like glorified shared drives for, for every single client.
[00:47:13] Um, what would it take to kind of turn them into real products? And I know you're kind of innovating in this space there, but I'm envisioning client portals that actually give clients leverage and a bit more of that self-serve angle, versus always having a strategist who sometimes answers very
[00:47:32] basic questions that clients could have self-served themselves. And how would that change the client-agency relationship? Just curious your thoughts there on, on client portals.
[00:47:42] John: Yeah. I think, again, table stakes for what you're giving to clients: if you're within an agency or you're a SaaS platform, whatever it is, it always seems to be analytics and dashboards, right? So I think that's table stakes. I don't think that really differentiates anybody. And [00:48:00] where I think you start to create some stickiness, especially on the client side of things, uh, is exactly what you said, the self-service aspect. Um, but what's also interesting is the conversations that we've had with clients to just get product feedback and understand what really brings the most value. Uh, a lot of them actually do want something like a shared drive, which is
[00:48:23] Phil: Hmm.
[00:48:23] John: where, yeah, there's, there's Google Drive, there's all these different tools for that. Um, but it's almost like they want a combination of: here's all the most pertinent files that, you know, mean something to our relationship. Uh, but then on top of that, here's your weekly update. And then on top of that, here's, again, who's on your team and here's how to contact them. So there is a little bit of the management aspect on top of analytics. Where I think we've taken it the furthest is we have a product, it's called Insights AI. The client can go in and they can ask questions like they would [00:49:00] ask a, a data analyst or a strategist, uh, and it will answer those questions for them. And right now I would say it's really focused on performance data, answering things like, how many discount codes, uh, do I have, and what was the top one used last year, all that. So they can get that information, but we're expanding to say, hey, what was the thing that was most important in our last monthly report with you all? Then it'll look up call transcripts, look up the report that we shared with them, do all of that, and be able to answer that question. So then it starts to go into that, that world of actual client management and self-service too.
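A simplified sketch of the expansion John outlines for Insights AI: route a client question either to structured performance data or to unstructured sources like call transcripts and shared reports, then answer from whichever source matched. The keyword routing and in-memory sources are stand-ins, not the real retrieval stack.

```python
# Toy router: decide whether a client question is about performance data or
# about documents (reports, call transcripts), then answer from that source.

performance_data = {
    "discount codes used last year": {"WELCOME10": 1240, "VIP20": 310, "SUMMER15": 95},
}

documents = [
    {"type": "monthly_report", "date": "2025-05",
     "summary": "Biggest takeaway: shifting 15% of Meta budget to retention email."},
    {"type": "call_transcript", "date": "2025-05-20",
     "summary": "Client asked us to prioritize the Q3 product launch creative."},
]

def answer(question: str) -> str:
    q = question.lower()
    if any(word in q for word in ("discount", "revenue", "spend", "roas")):
        codes = performance_data["discount codes used last year"]
        top = max(codes, key=codes.get)
        return f"You have {len(codes)} active discount codes; the top one last year was {top} ({codes[top]} uses)."
    if "report" in q or "call" in q or "most important" in q:
        source = "monthly_report" if "report" in q else "call_transcript"
        doc = max((d for d in documents if d["type"] == source), key=lambda d: d["date"])
        return f"From the latest {source} ({doc['date']}): {doc['summary']}"
    return "I couldn't map that question to a data source yet."

print(answer("How many discount codes do I have and what was the top one used last year?"))
print(answer("What was the most important thing in our last monthly report?"))
```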
[00:49:43] Phil: Super cool, John. It's been a super fun conversation. I'm looking at the time and, like, we're, we're flying through things here. Um,
[00:49:50] 10. Finding Happiness in Building and Experimentation
---
[00:49:50] Phil: one question we ask everyone on this show is this, like, happiness question. Um, you're obviously a VP of product, you're a team leader, an agency specialist, but you're also a soon-to-be dad.
[00:50:02] You're already a dog dad of two. Big golfer, big beach guy in San Diego. Uh, one question we ask everyone on the show is like, how do you remain happy and successful in your career, and how do you find balance between all the things you're working on while staying happy?
[00:50:15] John: Yeah, I mean, I think it's all of those personal things and passions that I have. Um, but I think at the root of it, and you can probably tell from this interview, going a little bit technical in some areas, I'm a builder at heart and I love to experiment with things. And I think what keeps me happy is all the innovation that, that we see.
[00:50:38] I mean, I, I think it's like perfect for me right now because I just love to test new things out and, um, whether that's applying it to things that are personal, uh, or applying it back to what I can do within my job, as long as I'm building, experimenting and moving forward, that's really what, what, keeps me happy.
[00:50:55] But yeah, golf, my wife, my dogs, it contributes [00:51:00] to that a lot too.
[00:51:01] Phil: Golf might be on pause, uh, a little bit
[00:51:04] John: Yeah, it might not be as that here.
[00:51:07] Phil: Uh, yeah. John, uh, plug, plug Nova a little bit, and, and Power Digital. Um, I'm sure some folks are, are curious, like not every agency is building products, and not every SaaS company is also servicing on, on the agency side. Uh, yeah, plug the company.
[00:51:21] John: Yeah. Yeah. Power Digital. Um, we're a full service performance marketing agency, I would say, uh, in the past few years really focused on measurement and, uh, being able to paint a very accurate picture of what's going on, not just within marketing, but in your business. Uh, and then Nova is at the core of it. If you're a client of Power Digital, you get access to Nova and all the, the great features that we built there. Um, but you can go to powerdigital.com, and, uh, all of our services and information about Nova live there.
[00:51:50] Phil: You've talked about a lot of the Nova features, uh, throughout the episode today already. But if you were to say what is the one Nova feature, recent or in the past, that [00:52:00] you think really blows people's, uh, people's minds when they first use it, what's the one that comes to mind?
[00:52:07] John: Hmm.
[00:52:09] Phil: There's so many that comes to mind.
[00:52:12] John: Man, there's too many. No, I I would say, uh, we have, it's more in a testing phase still at this point, but, uh, we have this tool, three personas, um, where people can actually simulate their marketing ideas against AI personas
[00:52:29] Phil: Hmm.
[00:52:30] John: to understand, uh, the potential success of a campaign or an audience that you're targeting. Like, you can test creative against it. Uh, so kind of like out-of-the-box personas that give you a better idea of, uh, how your campaign's gonna perform before you even put a dollar into it.
[00:52:48] Phil: Very cool, like a simulated focus group, but the people in the focus group are your ICP.
[00:52:54] John: Yeah, exactly.
[00:52:55] Phil: Very cool, John. This is awesome. We'll, uh, we'll share out links to Nova and [00:53:00] Power Digital for, for folks to check out. But yeah, really excited to, to keep tabs on how you guys are innovating in this space. Uh, awesome to have you join us today.
[00:53:08] Thank you so much for your time.
[00:53:09] John: Yeah. Thanks for having me. I really, really had a good time.