Making Sense of Martech

"We see the future of digital experiences as living — personalized and self-improving 24/7." - Josh Payne

In this episode, Jacqueline Freedman puts Josh Payne in The Hot Seat to unpack "living interfaces," AI-driven experimentation, and what agentic marketing really looks like in practice.

Josh shares lessons from collaborating with OpenAI, why most CRO tests fail, and where generative engine optimization (GEO) is headed next.

We delve into enterprise realities, including privacy and governance, avoiding spaghetti code, and what a "living interface" entails.

Highlights

  • Discover how "living interfaces" adapt websites in real time to lift conversion and usability.
  • Learn a practical framework for automating ~90% of the CRO workflow with agents.
  • Understand when to start with global experiments vs. personalization to reduce data-privacy friction.
  • Explore the rise of generative engine optimization (GEO) and why it will rival SEO in influence.
  • Adopt guardrails: define measurable evals, keep specs specific, and stay model-agnostic for compliance.
  • Hear what enterprises actually need from vendors: context, integration, and disciplined measurement.

Timestamps

03:50 - What "living interfaces" are and how they create real-time, self-improving digital experiences

04:57 - Partnering with OpenAI: visual grounding, UI generation, and early access to models

11:25 - Why most CRO tests fail—and how AI agents flip the economics of experimentation

14:50 - What "agent marketing" means: automating workflows and scaling high-leverage loops

17:45 - Common AI pitfalls and why clear goals, specs, and evals matter

24:20 - Navigating privacy concerns and model choices in enterprise AI

27:30 - How Josh filters the AI noise and stays sharp

Subscribe to Making Sense of Martech wherever you get your podcasts. Leave a review, share with your team, and send us your questions. We may feature you in an upcoming episode.

Creators and Guests

Host
Jacqueline Freedman

What is Making Sense of Martech?

Unfiltered takes on the biggest shifts in marketing technology. We spotlight what matters, who's leading (or lagging), and what's next. In Martech, clarity is power — and we're here to deliver it.

00;00;03;12 - 00;00;08;01

Speaker 1

Welcome to Making Sense of Martech,

00;00;08;12 - 00;00;26;12

Speaker 2

Where we interview leaders and put them in the hot seat. I'm Jacqueline Freedman, founder of Monarch and the global head of advisory for MarTech Weekly. Let's dive in and meet Josh Payne. A little bit about him first: Josh is a serial entrepreneur and technologist with a track record of building innovative companies at the intersection of AI and user experience.

00;00;26;15 - 00;00;50;27

Speaker 2

He's currently the CEO and co-founder of Keyframe, a platform pioneering living interfaces that adapt websites in real time to boost conversion and usability. He previously co-founded two other ventures, demonstrating his deep focus on automation and user-centric design. He's also an angel investor and a frequent speaker on AI, marketing, and product. And today, he's joining the hot seat to talk about where martech, machine learning, and experience design are headed next.

00;00;51;03 - 00;00;58;22

Speaker 2

Welcome. We like to start off with a few quick questions to warm things up, just like any good improv warm-up. But in the meantime, thanks for being here.

00;00;58;23 - 00;01;00;03

Speaker 3

Thanks for having me. Good to see you.

00;01;00;08 - 00;01;11;25

Speaker 2

Good to see you as well. All right, a rapid fire. So you're also a fellow Texan, and so I've got to ask: California versus Texas, which one has the best Mexican food? California? Texas? Which one?

00;01;12;21 - 00;01;29;12

Speaker 3

I might anger some people here, a lot of my new friends, but I've got to say, it's got to be Texas when it comes to Mexican food. Tex-Mex is called Tex-Mex for a good reason. California has some good spots, especially in the Mission District, which is where our office is based, so I'm grateful for that. But Texas is where it's at.

00;01;29;13 - 00;01;33;01

Speaker 2

It's true. I'm glad we're on the same page. Otherwise we're going to have to scrap the entire interview.

00;01;35;08 - 00;01;40;11

Speaker 2

All right. I have a feeling I know your answer on this, but AI-generated content in marketing: yay or nay?

00;01;40;15 - 00;01;57;07

Speaker 3

I don't even know why that's a question. Yeah. So the serious answer here is that you have to approach it the right way. I think everyone's in alignment that it's transforming marketing and pretty much every other field out there, and it's going to increasingly do that. But it's not just one of those things that you can slap ChatGPT on and be happy with.

00;01;57;08 - 00;02;04;13

Speaker 3

Right. It needs to be done the right way, with the right team, the right thinking and strategy, etc., but it can be incredibly powerful. Right.

00;02;04;14 - 00;02;12;07

Speaker 2

We're in full agreement there. Okay, so I kind of knew what your answer was going to be there. But what's an unpopular opinion you hold about the future of MarTech?

00;02;12;09 - 00;02;13;10

Speaker 3

An unpopular opinion.

00;02;13;16 - 00;02;14;16

Speaker 2

Yeah. Yeah.

00;02;14;18 - 00;02;18;25

Speaker 3

The future of MarTech? Tough question. Yeah, and I've got to qualify it with, like, an unpopular opinion, right?

00;02;18;25 - 00;02;19;14

Speaker 2

I mean, it's up to you.

00;02;21;15 - 00;02;33;26

Speaker 3

Yeah. My unpopular opinion is that MarTech is super fragmented and broken and kind of a pain to work with, and there needs to be a neocortex, so to speak, built from the ground up the right way. But that's not an unpopular opinion, I think. I don't know.

00;02;34;03 - 00;02;39;25

Speaker 2

Okay. Yeah. All right. If you could automate one aspect of your job entirely, what would it be?

00;02;39;28 - 00;02;56;10

Speaker 3

Oh, man, it would be the sales process. This is something that, you know, just takes up a ton of time. It's great to get to know people and people's needs and so on and so forth, but all of the stuff around it is what gets annoying. You know, all the follow-ups, the administrative work, the contract paperwork.

00;02;56;14 - 00;03;02;01

Speaker 3

It's quite a bit of time right now for me personally, you know, who knows? Maybe I'll do a weekend hackathon and solve that, too.

00;03;02;18 - 00;03;09;25

Speaker 2

Might be worth it. All right. And the last question of the rapid fire: what are you curious about? What are you exploring in your own time?

00;03;10;02 - 00;03;30;12

Speaker 3

I mean, it's not super relevant to MarTech, per se. I'm just very, very interested in how the universe works. And this isn't a recent development for me necessarily, but I spend a lot of my free time going down rabbit holes on, you know, various aspects of physics, especially astrophysics, trying to understand the workings of the universe at a more mathematical level.

00;03;30;19 - 00;03;38;01

Speaker 3

It's really, really fascinating to me. I was able to take a couple of classes in college in quantum physics and math, so I'm kind of just continuing that thread.

00;03;38;05 - 00;03;57;24

Speaker 2

That's amazing. I can't relate from a physics standpoint, but from a philosophical one, yes, certainly. All right, let's dive in a bit more to learn about what you and the Keyframe team are up to. So Keyframe aims to create living interfaces that adapt in real time. Can you elaborate on this concept and its significance in today's digital landscape?

00;03;57;26 - 00;04;20;11

Speaker 3

Absolutely. So we see the future of digital experiences as being what we like to call living. What we mean by that is that today's digital experiences are largely static. It takes a lot of effort to, first of all, get them off the ground, and then to actually personalize them and improve them over time. It takes an enormous lift from multiple functions of your organization, right?

00;04;20;12 - 00;04;39;21

Speaker 3

It's engineering to actually build the assets. It's design and creative. It's marketing, having the strategy of what to do. And, you know, the list goes on and on. What we see as a future that's coming to our doorstep is this idea that these interfaces and experiences that customers have with businesses start to become personalized and self-improving.

00;04;39;21 - 00;04;47;07

Speaker 3

So instead of having a team that has to come in and constantly update and do the manual work, this is going to be something that happens 24/7 in the background.

00;04;47;18 - 00;04;58;23

Speaker 2

Yes, that's great to hear. So, speaking of things working in the background, I'd love to hear a bit more about your partnership with OpenAI and how that came to fruition, but also the highlights and lowlights of that collaboration.

00;04;58;29 - 00;05;15;02

Speaker 3

Absolutely. Yeah, that was a really great collaboration that we had initially with them. The way that partnership started was that I have a friend at the company who is an angel investor in us, and I have a couple of other friends there as well. And we talk from time to time about the various ins and outs of building models and so on.

00;05;15;02 - 00;05;35;29

Speaker 3

And we were talking about some of the work we've been doing building our own vision-language models. The reason that's interesting to us is because, as we're helping to create variations and run experiments on websites, we need to know not only how a thing functionally works but also how it looks. That's a really important part of the experimentation process: the design layer and so on.

00;05;35;29 - 00;05;56;02

Speaker 3

So we had built a model that was, at the time, state of the art for visually informed coding. You have a baseline of brand identity that you want to adhere to, and we're able to produce code that's aligned with that. And as many companies have experienced, whenever OpenAI enters a certain line of business,

00;05;56;03 - 00;06;10;12

Speaker 3

it's probably not a good time for whatever startup is in that line of sight if they're not working with them. So one of my friends at the company, who obviously has a vested interest in us, said, "We're going to be rolling out some pretty interesting functionality with vision soon. You guys have done a lot of work here.

00;06;10;19 - 00;06;30;20

Speaker 3

Would you be interested in collaborating on this with us?" And I said, of course, because we're going to get steamrolled if we don't. Luckily, we were able to not get steamrolled and actually take full advantage of the incredible team that OpenAI has. And we did this initial collaboration focused specifically on creating a model that is very good at generating UI code given visual grounding.

00;06;30;23 - 00;06;43;17

Speaker 3

So that was kind of the first instance of collaboration with them. And since then, we've been working together: we get early looks at some of their functionality, keep in touch with the team regularly, and are able to incorporate some of that into what we do.

00;06;43;22 - 00;06;57;27

Speaker 2

That's fantastic, and mainly because I want to know more: with that collaboration, does that mean your customers, specifically at Keyframe, are getting kind of a front-row seat to some of the upcoming work? I would love to learn a little bit more about that.

00;06;57;28 - 00;07;19;17

Speaker 3

Yeah, we really pride ourselves on staying on top of whatever's happening in the industry. When a new model comes out, we're able to get customers on it pretty much immediately, sometimes before it's even officially launched. And that's something I think is a bit unique to us, because we are such nerds about AI and have built strong relationships with the large model providers.

00;07;19;17 - 00;07;27;10

Speaker 3

We're able to provide that early access to people and have them take full advantage of it to gain that competitive advantage in the market.

00;07;27;10 - 00;07;50;01

Speaker 2

That's fantastic. OpenAI, of course, is a large language model company that focuses on as many inputs as possible to better train what they're working with, versus DeepSeek, which is less breadth and more depth within specific topics. And so I'm curious whether your collaboration with OpenAI is almost the DeepSeek model on top of what already existed, in terms of that kind of collaboration?

00;07;50;02 - 00;08;11;01

Speaker 3

Interesting. I certainly wouldn't call it a DeepSeek model within OpenAI. I mean, they have their own reasoning models, which are, of course, quite powerful, and which we take full advantage of as well with our platform. What we brought to this collaboration was specifically a great dataset on which to train a really good model, and real-world use cases that provide tangible, hard business value.

00;08;11;02 - 00;08;33;10

Speaker 3

So it was the combination of those two things, and also, of course, talent: we have people on the team who have done very impressive work training models, even on bare metal. So the collaboration with them was, I think, useful for their alpha testing and for developing their developer experience and so on. But I wouldn't call us necessarily a DeepSeek or a functionality within OpenAI, although this was a collaboration that we certainly benefited a lot from.

00;08;33;10 - 00;08;44;11

Speaker 3

And I think that they also benefited on the developer experience side. There's certainly kind of a Chinese wall in terms of what's offered, so it's not like OpenAI is offering our model to other customers.

00;08;44;18 - 00;08;51;19

Speaker 2

Understood. What is happening at the intersection of AI and marketing that people should be paying attention to, but aren't currently?

00;08;51;25 - 00;09;27;27

Speaker 3

Yeah, there are a lot of interesting things happening with generative engine optimization. This isn't squarely in the space that we operate in, necessarily, but I'm actually very, very excited for the future of how people find brands and build trust. That's increasingly happening with these generative engines: ChatGPT, Anthropic's Claude, and so on. And what I'm starting to see is a couple of companies out there that are starting to build observability tooling. Then I think eventually they'll be able to build tooling that can help actively improve (I don't want to call it rankings, but) performance in these generative engines. There are other companies out there, like Profound, that are

00;09;27;27 - 00;09;38;22

Speaker 3

doing this, or Daydream; there are others that focus on this. And I think that's an area of marketing that some are aware of, certainly, but it's going to be a lot bigger than most people realize, for sure.

00;09;38;22 - 00;09;54;22

Speaker 2

I think the folks who have been SEO-forward, who are in that niche, are probably the only ones at the forefront of understanding what this next-generation evolution is. And that's a great flag for the later question: if we want to bring it back, an intro to one of those folks would be interesting, for what it's worth.

00;09;54;29 - 00;09;55;10

Speaker 3

Yes.

00;09;55;19 - 00;09;56;14

Speaker 2

If that makes sense.

00;09;56;22 - 00;09;57;28

Speaker 3

Sure. Yeah. I'm familiar with them.

00;09;58;14 - 00;10;12;26

Speaker 2

Oh, perfect. "Out-of-the-box performance on your website" is a very bold claim, especially in the general landscape of MarTech. What delivers immediate value without a heavy implementation lift? Walk me through the Keyframe product tooling and how customers are using it.

00;10;12;28 - 00;10;28;05

Speaker 3

That's a great question. Yeah, the beautiful part about Keyframe is that it's so easy to get started with. We usually offer two methods to customers. One is we have a very simple pixel you just add to your site; it's just like adding Google Analytics. And if that's too high of a lift, then it's probably not the right customer for us.

00;10;28;05 - 00;10;51;04

Speaker 3

It usually only takes, you know, a couple of hours of an engineer's time, maybe a couple of weeks of turnaround or whatever, but very, very little lift compared to other types of implementation. And then the other method is even easier: if you already have an experimentation platform like Adobe Target or Optimizely or what have you, we can just run experiments through there. That's where most of the effort and value, frankly, comes from: the actual content of the experiments you're running.

00;10;51;08 - 00;10;59;29

Speaker 2

Understood. And out of curiosity, if a company has their own homegrown testing and experimentation tool, is this something you also hook up with?

00;11;00;03 - 00;11;26;25

Speaker 3

Yeah, no, totally. I mean, just this morning I had a conversation with a big Fortune 100 company in the technology space, and one of the things they came away super excited about was that they have a fully homegrown experimentation and analytics suite; everything up until email deliverability and sending is fully homegrown. And we were able to reference another case study with a Fortune 50 retailer that also has a homegrown system for the most part, with the exception of using Salesforce for part of it.

00;11;26;25 - 00;11;31;00

Speaker 3

And we're able to come in within weeks and drive significant, you know, millions in incremental revenue.

00;11;31;04 - 00;11;42;05

Speaker 2

Well, that's a bold claim; I like the sound of it. So how do you see Keyframe's approach complementing or replacing traditional conversion rate optimization (CRO) and personalization flows?

00;11;42;07 - 00;12;02;27

Speaker 3

Well, today, conversion optimization is a very manual task. It involves multiple functions: engineering, like I mentioned, design, marketing, data analysis, conversion strategy. It's a bunch of different things that have to come together to make it happen. And it's work that people today, for the most part, I think, don't really like to do. Some people do like to do it.

00;12;02;27 - 00;12;22;06

Speaker 3

But the problem is that most of the tests you run fail to provide significant lift, and so most of the work that people put in gets thrown away. And no one likes to have their work thrown away. What machines are very good at is just pumping out lots and lots of work, as we know, and they're happy as can be whether or not the work is thrown away at the end of the day.

00;12;22;10 - 00;12;41;22

Speaker 3

And they're even happier with data, right? Data is what drives these engines, and it's also what drives effective experimentation. So that combination of things makes this a very potent technology to take on the future of conversion optimization, ultimately helping to significantly accelerate teams and their own workflows.
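The economics Josh describes, where most tests fail so value comes from volume, can be sketched with a toy expected-value model. All numbers and the function itself are illustrative assumptions, not figures from the episode:

```python
def expected_total_lift(num_tests, win_rate, avg_lift_per_win):
    """Expected cumulative conversion lift from a batch of A/B tests.

    Most tests fail (win_rate is small), so expected lift is driven
    largely by volume: automating the workflow multiplies num_tests
    without multiplying the cost of the work that gets thrown away.
    """
    return num_tests * win_rate * avg_lift_per_win

# Hypothetical numbers: a manual team running 10 tests a quarter
# versus an agent-driven pipeline running 100 at the same win rate.
manual = expected_total_lift(num_tests=10, win_rate=0.1, avg_lift_per_win=0.02)
automated = expected_total_lift(num_tests=100, win_rate=0.1, avg_lift_per_win=0.02)
```

Ten times the test velocity yields ten times the expected lift even when nine out of ten tests still fail, which is the economic flip the conversation points at.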

00;12;41;25 - 00;13;02;04

Speaker 2

That makes sense. And if I were to play that back in short form, it's almost as if the depersonalization of the output means you can both have better personalization and also optimize quicker, sooner, and better, since the machine itself doesn't have a preference or a longing for a project it's worked on in the same capacity.

00;13;02;08 - 00;13;05;06

Speaker 3

Yeah, there's no emotional side of it. It just does the work and it does it well.

00;13;05;07 - 00;13;21;29

Speaker 2

There you go. All right. Something I love finding when I'm in-house or working with clients is that aha moment for companies. And I'm curious: what specific user behavior have you uncovered that has changed how clients approach things entirely?

00;13;22;03 - 00;13;49;18

Speaker 3

Interesting. So there's one case study I'll mention here, a large travel company (I think we're allowed to talk about this one): TUI, the biggest travel company in Europe. TUI has a variety of business lines and many, many different teams across the different destinations they cover. And one of the key problems with how experimentation is run at really big companies like this is that you build up knowledge in different silos, and it's difficult to translate those learnings to other parts of the work.

00;13;49;20 - 00;14;07;08

Speaker 3

We've been working with them for the last several months, and coming in, one of the first things we did was run tests specifically for different parts of the user journey: things that would indicate whether a user is up-funnel in the journey, kind of just exploring, or higher-intent, down-funnel in the journey.

00;14;07;08 - 00;14;34;25

Speaker 3

And so what we discovered, for instance, was that for certain destinations the users were actually further down-funnel, and the experiments we ran that supported things like comparison shopping and seeing options did much better. Likewise, we were deployed on other pages where the users were earlier in the journey, and there the experiments that did better were the ones that simply moved people further down the funnel, without being committal at that point, just aiming to help them explore and learn more.

00;14;34;26 - 00;14;52;06

Speaker 3

This was an insight that I think is going to help inform how the different parts of the funnel are treated. So through this series of experiments, we were able to learn about user behavior in different parts of the funnel in ways that I think will inform how the teams roll out their own testing internally throughout the different destinations they have.

00;14;52;06 - 00;15;04;00

Speaker 3

And one of our value propositions is being able to take learnings and apply them across different places. That's exactly what happened with the first set of destinations we tested on, and expanding to other destinations is going to be one of the first things we do next.

00;15;04;05 - 00;15;23;25

Speaker 2

Awesome, that's exciting. So recently, at the End User Conference, you discussed agent marketing. Obviously, agents are all the rage; everyone is talking about them, but a lot of people don't know what they're actually talking about. So, because you're in this space, I'm curious: how do you see agents transforming marketing strategies, and what does this mean for marketers?

00;15;24;07 - 00;15;41;18

Speaker 3

That's a great question. Yeah. So "agent marketing" is obviously a bit of a buzzword, but I think it does have some meaning. To dig into what that meaning is, we have to break it down. Agents are, of course, systems that can act: they have access to tools and can do things on our behalf. They perform jobs; they have agency, so to speak.

00;15;41;26 - 00;16;14;17

Speaker 3

So what do we mean when we say agent marketing? My definition really is: defining workflows and then having AI take those on and run them. The reason I brought that up at the conference is because, from our perspective, what we're delivering is an entire workflow, specifically conversion optimization. The way we deliver that, though, I think can be useful to other marketers in their own jobs: we took the entire workflow and identified the areas where we could get the highest leverage by optimizing and automating. For us, that was things like generating code, which is what we worked with OpenAI to

00;16;14;17 - 00;16;31;29

Speaker 3

create this model for, specifically. Other things include creating interesting ideas based on data we've seen in the past; machines are great at looking at lots of data. Copy, of course, is helpful. So is doing analysis, market analysis, pulling together a bunch of data. Deep research has definitely reached product-market fit; I use it frequently.

00;16;32;09 - 00;16;55;21

Speaker 3

We have kind of a similar concept going, pulling lots of data from lots of sources to do marketing analysis that would otherwise take weeks. And so by piecing apart these different parts of the workflow of delivering conversion optimization, we were able to create a system that automates 90% of that entire workflow, which has been beneficial for clients because of the increased velocity of experimentation.

00;16;55;21 - 00;17;25;29

Speaker 3

And the reason I mentioned it at that conference is because I think you can take that approach to basically any type of knowledge work, but in particular to marketing. So when we think about agents in marketing, what we're trying to do is identify those specific workflows or routines. I mention loops because I think most workflows are loops, at least in marketing: you have the content operations to get something out, then you have the learnings that you bring back, and those inform the next round of content operations, and so on, ad infinitum.

00;17;26;00 - 00;17;34;04

Speaker 3

This is applicable not just to web conversion optimization, but to paid media, to lifecycle marketing, and to basically every part of the marketing stack.
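The ideate, experiment, learn loop Josh describes can be sketched in a few lines. The function names and stub implementations below are hypothetical placeholders for whatever tooling runs each step, not Keyframe's actual system:

```python
def marketing_loop(ideate, run_experiment, analyze, rounds):
    """One agentic workflow: ideate -> experiment -> learn -> repeat.

    `learnings` accumulates across rounds, so each ideation step is
    informed by everything measured so far, which is the feedback
    loop that makes the workflow "living" rather than one-shot.
    """
    learnings = []
    for _ in range(rounds):
        idea = ideate(learnings)           # generate a variant from past data
        result = run_experiment(idea)      # e.g. run an A/B test on it
        learnings.append(analyze(result))  # feed the outcome back in
    return learnings

# Stub implementations just to show the shape of the loop.
out = marketing_loop(
    ideate=lambda ls: f"variant-{len(ls)}",
    run_experiment=lambda idea: {"idea": idea, "lift": 0.01},
    analyze=lambda r: r["lift"],
    rounds=3,
)
```

The same loop shape applies to paid media or lifecycle marketing by swapping in different `run_experiment` and `analyze` steps.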

00;17;34;14 - 00;17;49;09

Speaker 2

Awesome. I'm on the same page personally. I think the value of agentic AI, particularly for marketing, is that it frees up more time for you to do what you're actually good at. I see it as a net positive, but the consequences are coming whether we like it or not.

00;17;50;10 - 00;17;51;11

Speaker 3

That's right. Exactly.

00;17;51;21 - 00;18;09;09

Speaker 2

So, speaking of AI and how folks are making it part of their own landscape and their own ecosystem: what are you seeing as some common pitfalls that folks and companies face while integrating AI into their existing tech stack? And what can they do to mitigate risks here?

00;18;09;13 - 00;18;29;01

Speaker 3

So the first thing I think is important is to define your goals with respect to incorporating AI. And I don't mean that broadly, because everyone just takes that and says, okay, we have a broad goal to save time or be more successful with it, right? You need to define precise, measurable goals that can ideally be turned into what are called evals.

00;18;29;02 - 00;19;00;19

Speaker 3

Evals are what we measure AI against. If we have an eval for something, we can build a system that's going to do it with a certain amount of accuracy, and then know with some certainty how well it's going to perform in real life. So, once you have those goals defined... and this has been true since the beginning, though it's been abstracted a bit more: there used to be this concept of prompt engineering, which was really, really important to getting the right outputs from a model. That's become less important now, because models have become smarter, especially reasoning models. But specificity and creating

00;19;00;19 - 00;19;19;24

Speaker 3

great specs has remained very, very important. I think it's just moved higher up the ladder in terms of what the important aspects are. So what's really important is to first have your spec define what's going to be delivered, and then, of course, have the implementation done properly.

00;19;20;03 - 00;19;36;17

Speaker 3

You've got to have all the people on board who are going to be consuming it, if it's a copilot type of relationship; and if it's more autopilot and agentic, ensuring that you have the proper measurements and controls in place is really, really important. But yeah, the most important thing is specificity and having proper goals set up.
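A "measurable eval" as described here can be sketched as a tiny harness: each case pairs an input with a pass/fail check that encodes the precise goal. The cases, checks, and toy generator below are hypothetical illustrations, not any vendor's actual tooling:

```python
def run_evals(generate, cases):
    """Score a generation function against measurable eval cases.

    Each case pairs a prompt with a predicate encoding a precise,
    measurable goal (e.g. "output mentions the brand name").
    Returns the fraction of cases that pass.
    """
    passed = sum(1 for prompt, check in cases if check(generate(prompt)))
    return passed / len(cases)

# Hypothetical goals: the headline must be on-brand and short.
cases = [
    ("Write a headline for Acme's sale", lambda out: "Acme" in out),
    ("Write a headline for Acme's sale", lambda out: len(out) <= 60),
]

def toy_generate(prompt):
    # Stand-in for a real model call.
    return "Acme Summer Sale: 20% off everything"

score = run_evals(toy_generate, cases)
```

With an eval like this in place, you can measure any model or prompt change against the same fixed bar, which is the "certain amount of accuracy" Josh refers to.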

00;19;36;25 - 00;19;44;08

Speaker 2

That's perfectly summed up. Versus folks in the C-suite who just say "implement AI," and that's the only aspect of the strategy.

00;19;44;08 - 00;19;44;15

Speaker 3

Yeah.

00;19;45;21 - 00;19;48;27

Speaker 2

Because I keep seeing and hearing that constantly as opposed to.

00;19;49;09 - 00;20;04;26

Speaker 3

It's a trope now, an unfortunate trope, because it's weakening the actual value, right? The true value of this. I mean, it's so insanely powerful; most people just don't realize it because they're not close enough to it. I just had a conversation this morning with someone who is leading the AI strategy for a $400 billion conglomerate.

00;20;04;26 - 00;20;21;00

Speaker 3

So she sees lots and lots of different businesses. It's a conglomerate that we work with as well, specifically on CRO. But she was talking to me about other initiatives they're trying to pursue. And there's a disconnect here: no one wants to get disrupted; everyone identifies that this has the potential to be highly disruptive.

00;20;21;03 - 00;20;25;25

Speaker 2

I mean, you don't want to be worried about your moat going away quickly.

00;20;26;19 - 00;20;43;20

Speaker 3

Crazy to think, right? Who would have guessed? It's not just something you can snap your fingers at, or throw money at, and it's fixed. What it involves is bringing in people who are able to fully understand the power of AI when it's used properly and implemented the right way. Only then can you have... it's like a combination, right?

00;20;43;28 - 00;21;07;02

Speaker 3

One by itself is totally useless. It's about combining real use cases, especially enterprise use cases (I think that's where a lot of the pain is right now, and also the opportunity for AI), with the knowledge of how to deploy it properly. It's not rocket science, but it does require, I don't want to call it being AI-native necessarily, a knowledge of what these systems are capable of.

00;21;07;02 - 00;21;24;09

Speaker 3

And a firsthand experience; there's a bit of an intuition aspect to it. It's rarer than I think a lot of people would believe, which to me is a little bit crazy, because, at least in the circles here in this space, there are so many smart people who know a ton about AI. But maybe that's just because the world is small in itself, and there actually aren't that many.

00;21;24;21 - 00;21;40;09

Speaker 2

It can be. But I've definitely had that realization: the folks I speak to in the industry talk about this constantly, but outside of work it's barely discussed, except for maybe teaching my dad how to use ChatGPT for things you would Google that are more complex, for example.

00;21;40;09 - 00;21;55;02

Speaker 3

Yeah, I mean, it's true. It's pretty shocking to me. I don't know the stats exactly, but many people haven't even touched AI, and it's so easy to try out. I don't want to say there's no excuse, because sometimes there are excuses, but it's just kind of shocking to me, and a little bit sad. But we'll get there.

00;21;55;07 - 00;21;56;05

Speaker 3

It's just like the Internet, right?

00;21;56;06 - 00;22;03;01

Speaker 2

Exactly. I take the philosophy of Scott Galloway: AI is not going to take your job; a person who knows how to use AI is going to take your job.

00;22;03;01 - 00;22;19;02

Speaker 3

Yeah, I agree with that. I think AI eventually will take jobs, but in the near term, yeah, it's people who are using it. I mean, I use it so much now, and I'm way more efficient and faster with how we do things because of it. It takes a little discipline, for sure. There's a wrong way to do this as well, right?

00;22;19;02 - 00;22;35;20

Speaker 3

Like, there's this new term, vibe coding, right? Where people just throw problems out there and let the machines handle everything. And what you historically end up with, in the coding world, is what we call spaghetti code: stuff that's duct-taped together, where you don't really know how it works and you can't maintain it, and so on.

00;22;36;01 - 00;22;52;08

Speaker 3

That's true of other things as well. If you don't actually understand what's happening in the systems you're controlling, it becomes less useful. Maybe the outcome was delivered and you're good to put a bow on it, but for the most part, work is continuous and has to fit into a bigger picture that you should be aware of and cognizant of.

00;22;52;21 - 00;22;58;17

Speaker 3

So there is some discipline required to use AI properly, but most people just aren't putting the effort in.

00;22;58;22 - 00;23;28;24

Speaker 2

I agree entirely. It seems like a really poignant parallel. I come originally from the email marketing world, before I expanded more broadly, and it's the equivalent of using your email service provider's pre-coded templates, which are really bloated, versus hand-coding it yourself. There are pros and cons to each: one is more scalable, and folks who aren't technical can use pre-coded templates. But if you're hand-coding, you're the expert on what needs to be there and what doesn't.

00;23;28;24 - 00;23;33;19

Speaker 2

And so it's cleaner, it's quicker. It's not spaghetti code, I guess, is the best way to phrase it.

00;23;34;09 - 00;23;35;24

Speaker 3

Yeah. I like "spaghetti emails."

00;23;35;25 - 00;23;40;18

Speaker 2

Yeah, frozen spaghetti on a stick. So I get the metaphor.

00;23;40;18 - 00;23;59;05

Speaker 3

It's very true. Not to say that we produce spaghetti ourselves, but experimentation is kind of throwing stuff at the wall and seeing what sticks, and that's what machines are great at, as long as you have, like I said, the discipline to implement things, understand why, and understand the right implementation from there on out.

00;23;59;05 - 00;24;04;29

Speaker 3

But sometimes you've just got to throw some stuff at the wall and see what sticks, rather than endlessly debating and deliberating on something.

00;24;05;01 - 00;24;26;26

Speaker 2

Agreed. So we've talked about some really exciting, guardrail-based topics and how companies can dip deeper into the water. The first thing I think of as it relates to risk mitigation, and risks in general, is privacy and data ethics. And I'm really curious about your thoughts here, because I know I personally have a lot of concerns.

00;24;26;26 - 00;24;31;05

Speaker 2

And also, of course, every enterprise out there is going to have those concerns as well.

00;24;31;11 - 00;24;47;27

Speaker 3

Yeah, it's a great point. There are a couple of ways to look at this, and a lot of facets here, to be honest. With how we work with enterprises, we typically don't touch any data that would be thought of as PII or sensitive, at least to start. In the conversion optimization space, there are a couple of different techniques.

00;24;47;27 - 00;25;09;07

Speaker 3

There's normal, global CRO, where you're just running experiments and trying to increase conversion rate or click-through rate. Then there's personalization, where you're taking in user attributes to figure out how to match the right experience to the right user. On our side, we typically start these engagements with the first part, where we're just listening to what users are doing on the page, not who they are.

00;25;09;09 - 00;25;23;22

Speaker 3

And then once the relationship develops, we're able to dig in and figure out: all right, what are the best ways to personalize? What are the opportunities here? What are the levers we can pull? What's the data we can consume to do the personalization on? That's when it becomes a little more interesting from a data privacy perspective.
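The distinction Josh draws between global experiments and personalization can be sketched in code. This is a hypothetical, minimal illustration (all names are invented, not Coframe's actual implementation): a global experiment needs only an anonymous visitor ID to bucket traffic and measure conversion, with no user attributes involved.

```python
import hashlib
from collections import defaultdict

def assign_variant(visitor_id: str, variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor into a variant via hashing,
    so the same visitor always sees the same experience."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

class GlobalExperiment:
    """Global CRO: track exposures and conversions per variant.
    Note that no user attributes (PII) are consumed, only behavior."""
    def __init__(self):
        self.exposures = defaultdict(int)
        self.conversions = defaultdict(int)

    def expose(self, visitor_id: str) -> str:
        variant = assign_variant(visitor_id)
        self.exposures[variant] += 1
        return variant

    def convert(self, visitor_id: str) -> None:
        self.conversions[assign_variant(visitor_id)] += 1

    def rate(self, variant: str) -> float:
        """Conversion rate for a variant (0.0 if no exposures yet)."""
        return self.conversions[variant] / max(1, self.exposures[variant])
```

Personalization would extend this by keying `assign_variant` off user attributes rather than a hash, which is exactly where the data privacy questions begin.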

00;25;23;23 - 00;25;45;14

Speaker 3

Now, there are a bunch of things to be aware of when it comes to data privacy, especially with AI in the loop. Many companies are okay with OpenAI, Anthropic, Google, etc. getting access to some of the data. Some companies, especially ones in highly regulated environments, are not okay with that, and they have their own hosted versions of Llama and Mistral and so on.

00;25;45;15 - 00;26;06;19

Speaker 3

Sometimes, but rarely. That typically ends up being a path that, if you're a highly regulated company, you're exploring with all your AI vendors, so not just the core model providers but also the ones building on top. So for instance, when we talk to highly regulated industry players, we come in right out of the gate as composable.

00;26;06;22 - 00;26;19;10

Speaker 3

We can plug into your systems and your models. We can be single-tenant, right? We've explored on-prem, and although we don't really advertise it, there are ways to solve for it. But it depends on what level you're getting in at and what data you're sharing.
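The model-agnostic, composable design Josh describes is a common pattern: application code depends on a narrow interface, and a regulated customer can swap a hosted provider for a self-hosted model. A rough sketch, with all class and method names hypothetical and the backends stubbed rather than making real network calls:

```python
from abc import ABC, abstractmethod

class ModelBackend(ABC):
    """Minimal interface the application codes against."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class HostedAPIBackend(ModelBackend):
    """Stand-in for a hosted provider (OpenAI, Anthropic, Google, ...)."""
    def complete(self, prompt: str) -> str:
        return f"[hosted response to: {prompt}]"  # placeholder, no network call

class SelfHostedBackend(ModelBackend):
    """Stand-in for an on-prem Llama/Mistral deployment."""
    def __init__(self, model_name: str):
        self.model_name = model_name

    def complete(self, prompt: str) -> str:
        return f"[{self.model_name} response to: {prompt}]"  # placeholder

def generate_variant(backend: ModelBackend, page_copy: str) -> str:
    """Application logic is unchanged whichever backend is plugged in."""
    return backend.complete(f"Rewrite for higher conversion: {page_copy}")
```

The point of the pattern is that compliance decisions (which provider, single-tenant, on-prem) become a configuration choice rather than a rewrite.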

00;26;19;17 - 00;26;39;29

Speaker 2

Makes total sense. All right, shifting gears to the perspective of being on the vendor side. What is something you wish other vendors understood better about working with marketers, technical and non-technical? There are different personas, and if you haven't been a marketer, sometimes you don't understand how to speak to marketers. I'm curious not just about yourself, but the greater industry.

00;26;40;06 - 00;26;42;04

Speaker 2

What do you wish they understood?

00;26;42;08 - 00;27;06;06

Speaker 3

So I'll caveat this by saying that I've never been a marketer thinking, "I wish this vendor understood me better." But something I've heard marketers talk about when they talk about vendors is context: understanding their business, understanding their actual problems. Most businesses are very different from each other, and vendors will come in with a solution that, for the most part, is one-size-fits-all.

00;27;06;08 - 00;27;24;07

Speaker 3

We, to an extent, are like that as well, although we have kind of a forward-deployed motion, so we're able to go in and deeply understand, and that context is really important for success. People talk about ROI. The way to actually achieve ROI is to seamlessly integrate with the business process and drive toward the actual goal they care about.

00;27;24;07 - 00;27;33;29

Speaker 3

That's something I think some vendors do really well, but most are pretty focused on their own lane, and I've heard marketers have complaints about that.

00;27;34;04 - 00;27;36;26

Speaker 2

Yeah, I think it boils down to bad discovery.

00;27;37;03 - 00;27;37;12

Speaker 3

Yeah.

00;27;37;19 - 00;27;42;21

Speaker 2

And not listening to the customer. That's true for every industry.

00;27;42;24 - 00;27;43;03

Speaker 3

Yeah.

00;27;43;06 - 00;27;55;17

Speaker 2

All right. How do you plan on staying ahead in this rapidly evolving landscape? Of course, you're among the groups talking about this constantly, but where else do you look for resources to stay ahead?

00;27;55;21 - 00;28;19;25

Speaker 3

Ooh, that's a good one. I think the best way to stay ahead is actually the blogosphere or the Twittersphere, to be honest. SF is a fantastic place to connect with people. There's the AI Engineer Summit happening right now, or upcoming, and things like that. TED AI happens here, and there are a bunch of fantastic communities around. But people are always posting their advancements on Twitter or in blogs and things like that.

00;28;19;25 - 00;28;39;03

Speaker 3

So I don't think people are necessarily missing out in terms of staying on top of things, but who you're following is what's going to make that successful or not. And then, like I mentioned before, having the discipline to keep finding the high-quality sources of truth and information and not just get distracted.

00;28;39;05 - 00;28;45;02

Speaker 3

It's really easy to get distracted; there are a lot of shiny objects out there. Trying to follow where the puck is headed is really the key.

00;28;45;06 - 00;28;51;21

Speaker 2

So who are some of the shining stars you'd recommend we follow, pay attention to, or subscribe to?

00;28;51;27 - 00;29;13;10

Speaker 3

I think swyx is one who's really good; he's the one who runs the AI Engineer Summit. Harrison Chase from LangChain is great. Jerry Liu from LlamaIndex is a great voice. There are tons of people who are really smart and kind of at the edge, I would say. And then a strategy for finding out what else is good out there is just to see who these folks are also following, right?

00;29;13;15 - 00;29;14;26

Speaker 3

And you just kind of follow the graph.

00;29;15;04 - 00;29;20;15

Speaker 2

Yeah. I've got to ask, though: where are the ladies at? Any folks on that side of the spectrum you're paying attention to?

00;29;20;20 - 00;29;51;00

Speaker 3

Yeah, I have a friend, Natalie Mira, who's at a company called Sierra. She's one of the leading people I've come across in the current forward-deployed, AI agent solutions space. She joined really early, and I think she was behind the blog post they put out for the tau-bench benchmark and was an early contributor to it. She has amazing thoughts about how you can incorporate AI and agents into real enterprise use cases, in their case customer support.

00;29;51;18 - 00;29;54;10

Speaker 3

So definitely someone I recommend giving a follow.

00;29;54;14 - 00;30;08;23

Speaker 2

All right, last question for you. Who is someone we should have on the podcast? You've mentioned a couple of options already, between some of your customers, thought leadership folks who want to talk about the space, or some of the other companies you work with. Who do you recommend?

00;30;09;01 - 00;30;32;00

Speaker 3

Yeah, a couple of folks come to mind. Harrison Chase is just great overall. He's not specifically martech, but I think he's really, really good, and he's certainly a thought leader in the space. Dylan Babbs is a great guy as well. He's the founder of Profound, which I think is going to be a successful company in the generative engine optimization space.

00;30;32;00 - 00;30;40;16

Speaker 3

He'd also be a great person to host on the customer success side; they're working with a lot of really large enterprises, which I think is adjacent to martech, I would say.

00;30;40;21 - 00;30;44;26

Speaker 2

Oh, for sure. Yeah, well, I'll take you up on those introductions then.

00;30;46;11 - 00;30;47;01

Speaker 3

Sounds good.

00;30;47;27 - 00;30;52;00

Speaker 2

Well, thank you so much for being here. Where can folks find you and learn more about Coframe?

00;30;52;06 - 00;31;08;19

Speaker 3

You can head to Coframe.com. We have a nice way to test out some of our functionality: put your website into our platform, and then we can jump on a call and show you how the platform works on your own site, which is a cool part of our process. Me personally, I'm not super active on social.

00;31;08;19 - 00;31;12;20

Speaker 3

I'm on Twitter. You can search for Josh Payne and you'll probably find me.

00;31;12;22 - 00;31;13;28

Speaker 2

Well, thank you so much.

00;31;14;00 - 00;31;15;10

Speaker 3

Yeah, thank you for having me.