Demand Geniuses is the podcast for revenue-focused B2B marketers. We bring you the latest insights and expert tips, interviewing geniuses of the B2B marketing world for actionable advice you can implement to accelerate growth and progress your career. The role of marketing in B2B go-to-market strategy has changed drastically: it's more important to revenue generation than ever as buyer engagement becomes more digital. We equip you with the information you need to thrive in this new, revenue-critical role.
Tom Rudnai (00:21)
Hello everyone, welcome to what is now, I believe, episode 21 of Demand Geniuses. It's kind of crazy that we've gone that far that quick actually, but to celebrate the occasion, I've got with me Oren Greenberg. So, well, first of all, Oren, hello.
Oren (00:34)
Hello.
Tom Rudnai (00:34)
Thank you for joining us. I guess one thing I've learned over the course of 21 episodes, or soon to be 21 episodes, is that rather than me butchering the introduction of whoever I've got joining me, it's better to just let you do it yourself. So do you want to give us all a little bit of an introduction into who you are and your background?
Oren (00:51)
Sure. So I'm Oren. I've been a marketer for 20-plus years now, working with a lot of different types of businesses. I've worked with some very large corporates, Investec Bank, Canon, Lenovo, and also a lot of startups and scale-ups. I now mostly focus on helping companies with AI, specifically B2B marketers: mostly training, workshops, and building custom projects. And yeah, that's pretty much it.
Tom Rudnai (01:14)
One thing that stood out in your background, you spent, I think, was it about six years at Wonga back in the last decade?
Oren (01:21)
Yeah, I think it was about a year and a half as an employee. And then through my agency, they were a client for a good stretch. And then I worked with a few of their subsidiaries as well, working in multiple countries, not just the UK.
Tom Rudnai (01:37)
Okay, well that must have been a formative part of your career then, and obviously quite different to where you've wound up now: a very high-volume B2C search environment, and then I know you transitioned onto more of a B2B world. Was that a deliberate transition? And yeah, talk me through that journey.
Oren (01:55)
Yeah, sure. So, Wonga is an interesting one, a controversial company, and I've definitely had my fair share of flak for that. I'd say Wonga had a lot of really smart people, and it was a real pleasure working at a company that was trying to disrupt a very traditional space, with very smart individuals. What happened with my
agency was, I was essentially very focused on generating pipeline and trying to get new clients in. So I focused on LinkedIn, and I had a lot of success on LinkedIn. And as my notoriety grew on LinkedIn, which is primarily a B2B channel, I ended up following that route.
It was a conscious decision to pivot from B2C to B2B, but I didn't expect that I was going to make that pivot. I only pivoted once I saw, okay, this is where the growth is going to be for me, and I'm going to pursue that. And before Wonga, I did do B2B: I actually managed 12,500 SME B2B accounts on Google Ads, which was, I think, at the time probably one of the largest spends on Google Ads in the UK.
And that was quite intricate.
Tom Rudnai (03:07)
I guess what got me curious was seeing the job title at Wonga which was obviously head of search, right? And I'd imagine search was a very, very big component of what they were doing there given the nature of the business, right?
Oren (03:16)
It was initially, but then they started investing in TV ads and above the line, like, you know, buses. And once advertising spend shifted to buses and TV ads, search became progressively smaller as a chunk. So if I were to quantify it as a percentage,
I'd say search and display combined became about seven to eight percent of total media spend. At one point we had an ad in the UK on average every eight seconds. So it was pretty prolific, a sizable marketing budget in terms of UK ad spend, especially for a startup.
Tom Rudnai (04:02)
Does it become a little bit more of a paid search strategy then, to make sure that you're getting all of that traffic and capitalising on that kind of above-the-line investment?
Oren (04:12)
So actually a few interesting things happened there. Brand search started to eclipse non-brand. And in that category, payday loans, there was substantial search volume. So to watch it eclipse non-brand due to brand-building activities was fascinating, because there's this binary conversation in the marketing community: should we do brand or should we do performance marketing? Digital marketing and performance marketing is the future, and brand is dying.
And then you actually see it happen and you're like, wow, brand out-eclipsed performance marketing a hundred to one. So that was quite fascinating. What was also really interesting was that the quality of the customer was materially different. Customers that came from the branded efforts had four to five times the lifetime value, and they were just a better-quality customer. It was the credit rating and the ability to hold to that credit rating; the default rate was lower. So overall, what I think is really interesting is that as marketers we think about diversification strategy and how to acquire customers through different channels. But really what I'd encourage marketers to think about is how to attract the right customer, and to be almost channel-agnostic: think about who is the customer I want to attract, rather than going into your comfort zone where, if you're an SEO, you see everything through the lens of SEO; if you're paid search, you see paid search; and social, social. And yes, there is a propensity for different quality of customers through different channels, and sometimes you've got to let go of what you're comfortable with and go with the right channel. But that's still a secondary consideration to understanding who the right customer is: doing your market research, getting the quantitative insights, getting the qualitative insights, and then really thinking about where you can go and find them, and hunting where they congregate rather than just sticking to your comfort zone. I've seen marketers make that mistake over and over again over the years.
Tom Rudnai (06:10)
Yeah, it's really interesting. It makes me think back to Brendan Hufford, who we had on the podcast a couple of weeks ago. He made a point that's actually quite a good segue into talking a little bit more about AI: historically, we have all organized ourselves as marketers around channels. Like, I'm the head of SEO, I'm the head of paid. What AI is increasingly doing is making it much easier to redistribute content across all of those channels, or to repackage whatever campaign or message you're trying to put out there.
So actually, the way that we organise ourselves needs to evolve a little bit. We need to maybe own a stage of awareness instead. So I think there are loads of different ways you can approach it.
Oren (06:48)
I think there's merit when you're talking about the workflow for content. I don't think there's merit in saying the channels themselves have shifted because of AI, but the way marketers are working with those channels has shifted due to AI. I mean, if that hasn't shifted, then there's a problem.
So let's say, from a B2B perspective, LinkedIn: has LinkedIn changed because of AI? It has in the sense that there's more AI slop, AI content slop, on LinkedIn. So there's an increase in the noise level. But are people using LinkedIn differently because of AI? Not really. It's the same mobile app people open up every day, still following different people, engaging with different content.
Tom Rudnai (07:15)
It was.
Oren (07:28)
The topic of AI on LinkedIn has obviously increased, but it hasn't changed people's behavior on LinkedIn per se, other than the AI slop. And that's applicable to lots of other channels as well. Has Instagram changed because of AI, in the way people engage with it? Not really. The only channel I think it's material to talk about is search, in the sense that we're seeing two conflicting pieces of research. One says that search is declining because people are moving to AI, like ChatGPT, Gemini, Copilot, et cetera, to do their searches. The flip side is research showing that people are actually now using search more because of AI; they're just using both, but for different purposes, where about 30 percent of AI usage is generative. People are creating stuff in AI, which they didn't do with search, but search is still part of their everyday. So I think it's very interesting how there are two conflicting pieces of research with dichotomous points of evidence. The one showing decline is obviously about market share and the shift to Bing from Google, and I think that's undisputed. The second one is from Rand Fishkin and SparkToro, whose research shows that search is actually increasing and there's just this new behavior around using AI. So I think that's how behaviors have shifted.
As for the most practical impact: I think there are a lot of other things happening with marketers, like vibe coding; automation with Zapier and n8n; and the utilization of foundational models in the day-to-day. When I say foundational models, I mean Gemini, Claude, ChatGPT, et cetera. I do think content workflows, definitely. I think some marketers' roles have really been impacted.
I think everyone to do with translation has really been impacted. I think it was ElevenLabs who said they let go of their translation agency; they now just use AI to do all the translation.
Content marketers have definitely been impacted significantly, particularly at the lower end. I think there's a lot of really good-quality talent who are deeply frustrated, and justifiably so, because people are using AI to generate content and don't understand that the quality of the content has dropped so much, because the AI doesn't have original thinking per se.
I did do this interesting thing the other day: I created a set of criteria for creativity, because there was a debate on one of my posts on LinkedIn. I was saying AI is not creative, someone else was saying it was, and then I spent like two hours researching and thinking about it. Essentially I came up with a framework of eight criteria for creativity, and AI does fit five of the eight, but there are three really critical ones that it doesn't meet. So I'd say, yes, AI is capable of some creativity, but the areas it misses out on are things like human emotion, and the process and methodology: with AI you only have input and output, it kind of skips the middle. And in a sense I think it's quite lacking as a result of not being as fully creative as human content writers are. That's off the top of my head on AI's impact. There are a few other areas I haven't mentioned, like data analysis, but I'd say those are the more obvious, blatant ones.
Tom Rudnai (10:42)
The image I have in my head is of an iceberg, right? And I think what you're describing is that above the waterline of the iceberg, we've not seen too much impact. Or at least, I would argue, a lot of that impact happened a long time ago, in terms of how algorithms started ruling our lives and things like that, with the exception being search.
What we are now seeing is below the waterline: the stuff that you maybe don't see on LinkedIn, in terms of the workflows of how it's all created, is where there is a lot of disruption, and everyone's scrambling to understand how to react to that. That's just the image I got in my head as you were talking. I guess one thing I'm curious about: you're on the front lines a lot more, working with businesses, and I think larger businesses, who are trying to manage their way through this disruption.
How many businesses out there do you see that are leveraging AI in a way that is genuinely transformational?
Oren (11:29)
Very, very few. I'd say the most obvious ones are the AI-native platforms themselves, and they're doing really interesting stuff. I mean, obviously they're building AI platforms, but the way that they run the business with AI is quite fascinating. Their first port of call is: can we use AI to solve this? Everything for them is through this lens of what AI can do, whereas normal, traditional businesses don't default to that mindset.
And they're not confident, comfortable, or familiar with the capabilities, so naturally there's a lot more resistance as well. There are two conflicting frames that executives have at the moment. The first frame is: this is really impacting me. It's impacting clients in, say, the M&A process.
It's impacting them in terms of the job descriptions for the marketers they're hiring. It's impacting them in terms of their peer groups and what they're saying and doing. So they're getting that pressure. But then they have a conflicting mental frame, which is: I use ChatGPT and, gosh, it hallucinates, and these hallucinations are catastrophic, and if that happened on a business level, that would be horrifying. And I think that's the tension people have at the moment. It's kind of like...
this increasing reliance on it, and the inability to often assess whether it's accurate or not, because the cognitive effort to assess the accuracy kind of contradicts the whole point of using it in the first place, which is to increase efficiency and be more productive. If you're going to end up spending all your time figuring out whether it gave you a correct answer, that's not very good. Someone left a comment on one of my posts the other day saying, sometimes I have to unpick the AI's work,
and the unpicking took me longer than if I had just done it myself from scratch. And I think everyone has this experience, and this confusion around how to utilize it, how accurate it is, how reliable it is. So I think businesses have a lot of conflict. I spoke to someone the other day, asked them a question about AI, and they said: don't talk to me about AI, it's demonic. And I was like, huh. And it's a content marketer. I thought it's such an interesting view, how some people have gone completely binary.
Like, AI is evil, we need to completely avoid it. And other people are completely immersed in it. I mean, I started back pre-2020, when there was no ChatGPT; OpenAI had models like DaVinci and Babbage. And at the time it wasn't called AI, it was called natural language processing: NLP, which is part of machine learning. I was using it to classify skills and job titles for marketers for some of my projects. It was a completely different world, and I never predicted the explosion.
My view, I guess, is that I've seen the evolution and the transformation, and I was very early to use these tools, before their current popular form. So I have quite a biased view of their utility, and I don't have the emotional relationship a lot of people seem to have with AI, where they get very emotional about it. For me, I'm very aware of its limitations. But what's most fascinating to me is that these companies are spending money on buying all these licenses, and they haven't upskilled or trained any of the marketers in their team. So people are using AI, paying for it, and disappointed, and then they're like, this AI is not great. And I see it with marketers. I'll do a session with them and they'll try to do something, and they're like, yeah, the output's not good. I'm like, no, no, your prompt was really poor. And they're like, what do you mean? I'm like, this is how you properly prompt. And then you teach them how to prompt, and they're like, wow, this result is amazing. And then you realize there's this capability gap, this skills gap. The problem is there's so much hype and so much FOMO fueling people, and so much fog, that it's really hard to see clearly who's full of it, who's actually got the capability and competence, who's got high integrity, which tools are great, which tools are just a wrapper, and which tools are a wrapper on top of a wrapper. I think it's kind of like crypto: it's the wild west of AI at the moment. It's hard to make heads or tails of it. That's what I'm seeing.
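[Editor's note: the "your prompt was really poor" gap described above usually comes down to structure. As a rough, purely illustrative sketch (the function, field names, and example prompt below are hypothetical, not a framework from the episode), here is the difference between a throwaway prompt and a structured one:]

```python
# Illustrative sketch of the prompting skill gap: the same request made
# as a one-liner versus assembled from labelled sections. The sections
# (role, context, task, constraints, output format) are one common
# convention, not a quote of any specific method from the conversation.

def build_prompt(role: str, context: str, task: str,
                 constraints: list[str], output_format: str) -> str:
    """Assemble a structured prompt from labelled sections."""
    constraint_lines = "\n".join(f"- {c}" for c in constraints)
    return (
        f"Role: {role}\n"
        f"Context: {context}\n"
        f"Task: {task}\n"
        f"Constraints:\n{constraint_lines}\n"
        f"Output format: {output_format}"
    )

# The kind of prompt that produces disappointing output:
weak_prompt = "Write a LinkedIn post about our product."

# The same request, structured:
structured_prompt = build_prompt(
    role="You are a B2B demand-generation marketer.",
    context="We sell pipeline analytics to revenue teams at mid-market SaaS firms.",
    task="Draft a LinkedIn post announcing our new attribution report.",
    constraints=[
        "Under 120 words",
        "No hashtags, no emojis",
        "Lead with a concrete pain point, not the product",
    ],
    output_format="Plain text, three short paragraphs.",
)

print(structured_prompt)
```

The point is not the exact labels but that the model is given enough grounding to work with; the weak prompt leaves role, audience, and constraints for the model to guess.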
Tom Rudnai (15:35)
No, I think there's so much there that resonates with the impression I get of how a lot of people on the ground feel. I'm worried we might have lost connection for a second there, so I'll summarize for the audience just in case. What you described is this tension that your ordinary marketer feels between two forces, right? Pressure and risk. There's a lot of pressure from the top down
on output, right? You're expected to do more with less, that old phrase that every marketer probably dreads. And interestingly, you have this risk aversion coming from the bottom up, which is kind of the opposite of how it should be within an organization, right? You should have the executives very focused on minimizing risk and everyone else focused on flourishing, being creative and doing cool things. But I think it leaves you feeling trapped, because
there's so much noise about what you should be doing, from internal and external sources, and, as you said, you're not enabled to overcome that. It's something I was thinking about a couple of weeks ago: AI still broadly lives within the realm of a copilot, right? Yet all of the money is currently being invested towards the copilot and none towards the pilot, which is just fundamentally flawed, right? If you want to get more value from the copilot, teach the fucking pilot how to use it.
Which I guess is the same thing you've spotted.
Oren (16:54)
Well, there's the copilot, which I think is the most common use case for AI. But then there's vertical, specialized AI tooling. There's agentic AI, which is a different category of how large language models are used. Then you've got the vibe coding component of AI, which is exploding, as we can see from the annual recurring revenue of companies like Cursor and Lovable, et cetera. And then you've got
workflows and processes being built inside companies leveraging AI. That could be agentic or not, but it's not necessarily an external tool so much as an internalized model that the engineers or data scientists are building in-house. So there are a lot of different use cases for AI from what I'm seeing. The copilot is just the most familiar and common, probably 80 to 90 percent of where people are using AI. What I really mean is
someone's paying for ChatGPT and using it in their day-to-day, and some of that is for work. That's probably the most common. But there's so much more power. AI is like a car, but everyone's driving it like it's a bicycle. They don't really understand the power and the capability, because the pilot hasn't been upskilled. And because of that, even when they are using it like a car, they keep bumping into things, because they don't know how to drive the car, let alone when they're treating the car like a bicycle.
And the reason is that it's exploded so quickly, companies haven't devised an effective strategy. Part of it is they don't know who to listen to, and it's so new. The biggest problem is that it's changing at such a rapid rate that no one can keep on top of it. I just did an analysis of the entire AI landscape, and I'm building out a subset of AI marketing within it, and there are about 28,000 AI companies now.
That's a staggering number. You can't get your head around who's who and what's what, what's good and what's bad, let alone that the number's constantly growing. So it's not a static number where you could hope to get through it one day. There's such a proliferation and explosion that people are overwhelmed.
Tom Rudnai (19:03)
The rate of change, you cannot keep up. And that's something I hear all the time from folks at networking events: how am I meant to keep up with this? What's your take on who should own that process within a company? Do you think it's better
to have a head of AI who owns, at a company level, staying on top of these things, or do you think it should be something that sits with department leaders?
Oren (19:25)
Yeah, it's a great question. I remember this thing about the chief digital officer, the CDO. And I don't know if you know, but the job title for CDO went like this, and then you can guess what happened: it went like this. And the reason is, what was the chief digital transformation officer meant to do? He was meant to help facilitate digital transformation. But what exactly was digital transformation? Learning to use email, or migrating services to the cloud,
or upskilling people on how to use Outlook in the cloud rather than desktop Outlook? What was that person really meant to be doing with digital transformation? When you think about how companies are built, they're built of departments. You've got your product team, your tech and engineering teams, your design team, your marketing team, your sales team, et cetera. Digital transformation happens in two areas: each department's tools, kit and tech stack, and the combined software between those departments, the CRM
and so on. So is that person meant to be influencing and coordinating, or is it better to infuse the digital transformation into those departments and upskill the people in those individual departments in their respective domains? And it's the same thing with AI. What is this head of AI going to do? What really needs to happen? There needs to be an exploration of the specific pieces of kit a company is using that are cross-departmental, an
exploration of the processes, systems and tech stack for each department, and the upskilling of the pilots in all of their respective domains. How is one person sitting on top going to orchestrate and materialize that? What is that person really going to be doing? So I don't see why you need a head of AI, unless you're saying our core business has some sort of product and we need to infuse AI into it.
And really, that means I need a data scientist, because that's what an AI expert is, and I need to integrate them into my engineering and/or product team somehow. But should that person sit above those teams and orchestrate and influence all the other teams? I don't see why. I think as long as there's a cultural mandate to become AI-native and AI-first, a mandate to upskill and train and invest and bring people along on that journey,
and a focus on co-creating solutions to the cross-departmental friction and challenges around the CRM and the other tooling, then that to me is sensible. That's a practical, viable route which encourages collaboration and transformation from the bottom up rather than the top down. That's just my perspective as an AI advisor. And funnily enough, with clients, I specialize in
Tom Rudnai (22:13)
Mm.
Oren (22:20)
B2B marketing and AI marketing, a very specific domain. But across a lot of my clients, I now train lots of other departments, and the reason is they want everyone upskilled to be able to use the AI tools they've purchased, like Copilot or Gemini, et cetera. So what's really interesting, the way I see it, is: how do you just upskill the teams
and then let them find and understand the best solutions for getting their jobs done, thinking in a jobs-to-be-done framework. I think there are something like 85 different types of CXO job titles now, and I think there's an excessive proliferation of thinking that another senior C-level exec hire is going to solve
these problems, which require much more finesse and nuance.
Tom Rudnai (23:11)
I think it's going to depend a little bit on the type and size of organization. It's something I think is a viable route for us, right? We're a small startup, but I don't view it as a particularly technical role. I actually view it as what I would look to do instead of a COO, right? I think that's the skill set: a change maker, someone who puts systems in place as you grow. But it isn't really technical; it's a very different skill set to the CDO type.
And I think if you are going to look at it, that's how you need to view it, I would say, in order to make it successful. But I think that also makes sense for us, because we'll be scaling within an AI world. So my focus is finding someone who can stop me hiring for a role that's going to be redundant six months or a year later, who can make sure we're making smart decisions to stay as lean as possible. Because to me, our entire edge as a business at the moment is our leanness.
Oren (24:03)
I think this is all sensible. I just don't think it requires a full-time hire. I don't think there's the depth or complexity to those problems and challenges that they need a full-time AI person. You just need the right consultancy, advisory and guidance, because these are directional rather than granular. Unless, once again, AI is deeply integrated into your business in a meaningful way and you need someone who owns that process.
And, for whatever reason, they can't sit in product, can't sit in engineering. One of the businesses I support split the AI experimental features out into a separate unit. It's almost like how you have marketing teams and growth teams, and there are multiple definitions of growth: one is that growth is a repackaging of performance marketing, and there's the more authentic, original definition, which is people running experiments to try and
10x or 100x specific parts of the business. And I can see saying, well, I'm not going to have a chief AI officer, but I'm going to have almost another horizontal or vertical element cutting across the organization to run experiments and help adoption. In the right organization, I can see that working. But I wouldn't say you need a chief AI officer. I wouldn't put that at C level,
for the reason that, yes, AI is powerful, but it's not any more powerful than other forms of technology that we have. It's very disruptive, it's new, but we're still operating on a lot of other software, infrastructure and platforms that work really well. So I think it's just another tool rather than one tool to change all tools. There's a really interesting piece of research about weather forecasting: they used AI for weather forecasting, and they had an old model,
and the old model had a higher prediction capability than the AI weather-forecasting model. And then you realize AI is not the right tool for every problem, but people seem to think that it is. And the reason is they don't understand what AI really is, what its limitations are, or how it works, which is why they view it as a hammer and everything as a nail. And that's not the right approach or mentality, which is why I don't believe in this top-down approach.
Tom Rudnai (26:19)
No, that's really interesting. And to me it's all linked to one of the other questions I wanted to ask you, which was around build versus buy when it comes to implementing AI change, right? If you're taking a buy approach, then it's not something that needs to live in-house, because you want breadth of knowledge, and going external is the best place to find that. Well, not always, but often. If you're taking more of a build approach, then it becomes more of a full-time job, and it's about, as you say, understanding the strengths and weaknesses of different models for different use cases,
and what problems we should even point AI at in the first place, right? Finding the right nails to hit with your shiny new hammer. How do you approach that? And I'm sure there's probably not a catch-all answer for everything, but if you're an organisation, how do you approach build versus buy across all the shiny new tools out there, versus just rolling up your sleeves with ChatGPT?
Big question.
Oren (27:10)
I think it's a sensible question, but it's a question with no answer. It's like, I'd have to conduct an audit to understand the problem, and depending on what the problem is, I'd recommend what the solution is. It's almost like asking, what's the remedy for all diseases? There are over 10,000 diseases; 8,000 of them are rare, 2,000 are common. If it's one of those common problems, there's a cure,
and if that common cure happens to be the equivalent of a ChatGPT, roll it out to every user. But if it's, well, actually we have this very specific back-office problem and we need a bespoke or custom build, then that first solution really wasn't suitable. So it's not possible; there is no one-answer cure to that question. I wish I had one. Everything in my head is always custom.
Because every organization is unique. I know companies don't think this way, and executives don't like to think of themselves this way, because they like to think there are things they can learn from other organizations that give them a competitive advantage. And sometimes there are, but it's very hard to find generalist principles that are broadly applicable and really going to drive massive transformation in an organization. It's usually rapid incremental changes that drive velocity, and big bets and risks, that tend to really move the needle.
And those big bets and risks, and even those incremental changes, depend on things like: who is your team? Who's your competitor? What's your positioning? What's your brand? How differentiated are you? What's your market share? What's your share of voice? What's your strategy? What's the size of your competitors? There are so many variables to winning this game, this basketball game. And unlike basketball, when you're running a business, the pace of change
across so many different conditions is so dynamic that it's very, very hard to come up with static answers that are material. They do exist, but there are very few of them, kind of like gravity or magnetism, like physical laws. There are a few of these principles and they broadly apply. But when you start looking at Newtonian physics in lots of different areas, and you try to translate Newtonian physics to quantum mechanics, things start to break down. Then you realize, we want one theory of everything, we want to unify,
and there still is no unifying theory. There's a reason for that: we don't have the full picture of all the variables. So we have to deal with the variables we can constrain and manage in a contained manner, rather than trying these very broad things that sound smart and impressive but very often fail. I see these large company-wide initiatives and they take forever. I've seen companies take eight years to change their CRM.
I talked to a company just the other day, a 240 million ARR business, and they've been trying to change their website for the last four years. Just the website, for four years, because there are so many departments and they can't get them to agree. And it's not the first company I've seen with this challenge. That's just the website; imagine what happens if you're trying to change the CRM. So I don't believe in broad, big, elephant-like solutions. I believe in nimble, fast, efficient: I think more sniper than scattergun.
Tom Rudnai (30:27)
I guess we come back then to: I'm an ordinary marketer, I know it's my job to make sure we're using AI as well as we can to achieve my goals, but we come back to that issue of how do I stay on top of everything I should and could be doing, and prioritise it alongside my day job? So I think that's probably a good place to come back to. What's your advice to a person in that position
about how to start, and how would you approach the problem they find themselves with?
Oren (30:54)
To prioritize, how to get on top of AI?
Tom Rudnai (30:58)
To make sense of all the noise. I guess what you're saying is that this isn't something that necessarily needs to be tackled with large-scale change programmes; it's something that actually should be put on the plates of your department heads. So as a department head who already has a full-time job, as well as keeping up with this fast-evolving trend, how do I go about that?
Oren (31:21)
Yeah, I hear you. One of the ways I went about solving this problem is I've created a program called the AI Marketing Lab to help marketers with that specific problem. So my approach is training and upskilling marketers to learn everything from the foundations of prompt engineering to automation to agentic AI, so they feel confident and comfortable with it. So I think that's a solution. It doesn't have to be my training program; go find a training program that you want to do.
That's upskilling and training. But if you want a simpler, easier way, and you're self-taught, there's a huge amount of content widely available on YouTube and TikTok. I mean, if you look at n8n, which is growing in popularity, they have some really fantastic videos on YouTube that show you how to do stuff. But it's a time sink and an energy sink; where is that on your priority list? I think the problem is there's so much software now,
with the fragmentation of the different areas of marketing they cater to, that you can't do all of it. So you've just got to focus and ask: what is the problem I'm trying to solve? Is it a specific tool or piece of software, and do I go down that rabbit hole? Or is it a problem no tool covers, and how custom is that problem? And then you can reach out to someone who knows, someone positioned as a data scientist or an AI consultant,
and ask them: how complex or hard is this problem I'm trying to solve? So for example, the other day I wanted to write a book on prompt engineering for B2B marketers. So now I have a crew of six agents, I've set it up in Cursor, and I'm using CrewAI as a framework. And I'm building that myself to scratch the itch, because there was no off-the-shelf solution that could help me do it in a more efficient way. So sometimes you will end up having to build custom solutions.
And if you can't, then you need to outsource that, and that comes with its own can of worms. Or you stay within a safe, well-defined route. I mean, there are a lot of tools out there. I don't want to name any of them because I don't want to bad-mouth anyone, but there are a lot of very popular AI marketing tools that are just so ludicrously expensive, when I can do what they do with a Google Colab Python script that takes me 10 minutes to whip up
and costs a hundredth of the price. The reason marketers gravitate towards these tools is because the UX and UI is so beautiful and easy. But that's not really what the issue is. The problem is the cognitive load, the time sink, and the juggling of multiple responsibilities. Marketers are time-poor and energy-poor, they need quick solutions, and they're happy to pay a premium for something that's easy to use and quick rather than learn how to do it themselves, even if it only took
10 or 15 minutes of fiddling around with a Python script in Google Colab to get that result. So I think marketers are going to split into two groups. I was speaking to someone the other day: a hundred marketers in their organization, 30 of them are AI native, 70 of them aren't. And apparently all the product marketers are the AI-native ones, which is interesting. I wouldn't have thought that product marketing specifically would be the first sub-specialization of marketing to pick up and become AI native. But there you go.
Tom Rudnai (34:15)
Hmm.
Oren (34:41)
And that's not necessarily representative; it's just that organization. But what was really interesting is that the ones who are more AI native tend to be a bit more technical, more performance-marketing oriented. And it's the bulk of marketers, the ones who are more brand-centric or content-centric, who aren't as technical or aren't as comfortable. And I think for them it's just a lot easier to go and find those off-the-shelf tools and use those, rather than build something that's more efficient or cheaper
but obviously requires a lot more mental strain.
Tom Rudnai (35:12)
Yeah, and the challenge, I think, is you often end up creating downstream impacts on your own workload. One thing that always strikes me about these tools is that the product fundamentals within them, things like integration, are really poor. So the amount of time I spend copying and pasting, downloading this as a PDF, feeding it into that AI; it's really bad.
But one thing I was thinking as you were talking was that the way we set goals as organizations probably isn't that helpful either, because all of our goals, as marketers at least, revolve around output, right? You're judged on what you put out and the impact it has; you're not judged on efficiency. So it doesn't really encourage you to go and play around and say, hey, we've got a way more efficient way to do the same thing, which is still very valuable in terms of
what actually creates enterprise value, right? So if there's one thing leadership can take away to help enable that kind of departmental ownership, it's probably that: create goals around it.
Oren (36:10)
Yeah, I think all the executives I work with only care about outcomes, and outcomes are always translated into commercial KPIs. Marketers, regardless of seniority, have the challenge of translating how activity, an input, becomes an output that becomes an outcome. Executives really only care about the outcome; they don't care about the way you initiate that process.
So when you're communicating internally, you need to say: I'm doing this because I'm trying to get this output, which will lead to this outcome. And then people go, okay, I understand what you're trying to do here. For example: hey, I'm using AI to be more efficient so I can undertake more activity, which will shift the KPI on marketing-qualified leads, or sales in e-commerce, and that will result in increased revenue.
Now the executive can understand what you're saying and why. Marketers tend to talk in jargon: click-through rate, impression share, share of voice. And a lot of executives, especially COOs, CTOs, CFOs, CPOs, don't understand what that means. Some CEOs do, because they came up through sales or marketing and they kind of get it. But a lot of other CXOs don't, and the marketer keeps falling into that trap of
talking about time and efficiency and activity, the task list and the to-do list and how busy they are. And executives don't really care. If you could do it in half the time but still get double the result, most executives would say, great, do that.
Tom Rudnai (37:41)
Yeah, now we know you're talking our language of demand geniuses. What was the impact it had? That's what you need to be reporting upwards. I wanna get into some quick fires before I let you go, Oren, but I have one more question first, which I think is probably the most valuable thing any marketer can take away from this: can you give us a little overview of what makes a great prompt? Because that comes back to the skill you said earlier a lot of people are lacking.
Oren (38:03)
I'd say there are lots of issues, but if I went through the biggest problems: prompt stuffing, so giving the prompt far too much context that is irrelevant to your request. Not structuring the prompt in the right way: who's the audience, background context, what you're trying to do. Give the AI some sort of specific role, give it a very concrete
output that you're expecting, and ideally give it an example of an output. Another really common mistake I see is not understanding that it's an iterative process. Most large language models are generalized models, and as a result the first response you get is the average one. So the person thinks that average answer is all the AI can do, but what's actually typically happening is
the user hasn't given enough specificity to narrow the parameters enough to get the desired output. So what you really need to do is
bear with the AI as it gives you a wrong answer, and give it an opportunity to understand what it is that you want. Sometimes that can take three attempts, sometimes it can take 15 attempts. But what you'll find is it always comes back to you not thinking about some edge case or variable or lens on the problem
that the AI has taken into consideration, because it has so many lenses it can view through at the same time. I see people always approaching with one very specific lens they believe is the truth, while the AI has 15 lenses and they're just viewing the answer through the wrong one. I still need to work on my analogy or metaphor here; hopefully that's come across clearly. There are lots of other problems I could talk about. I just produced a video about the order of importance. If you put things in the wrong order in your prompt,
it can impact the quality by 14%. So if you put the most important piece of information at the very bottom of your prompt, you're not going to get as good a result as putting the most important thing at the top. And that's just because of the way transformers work with the order of words, the tokenization of those words, and the way large language models understand meaning within patterns of language. It's linear, and it doesn't
Tom Rudnai (40:02)
No,
Oren (40:16)
read backwards in the same way. It doesn't take everything and evaluate it with equal weighting; it gives more precedence to what comes first, because most of the content on the internet tends to put what's most important first, and that's been the training data. That's the hypothesis as to why this is happening.
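The structure Oren describes, with the role and the most important instruction up front and supporting context after, can be sketched as a small template builder. The section names and their exact ordering here are illustrative, not a fixed standard from the conversation.

```python
# Sketch of a structured prompt builder following the ordering Oren
# describes: most important instruction first, supporting context after.
# Section names and ordering are illustrative assumptions.

def build_prompt(role: str, task: str, audience: str,
                 context: str, example_output: str) -> str:
    sections = [
        ("Role", role),                     # who the model should be
        ("Task", task),                     # the most important instruction
        ("Audience", audience),             # who the output is for
        ("Context", context),               # only context relevant to the task
        ("Example output", example_output), # concrete shape of the answer
    ]
    return "\n\n".join(f"## {name}\n{body}" for name, body in sections)

prompt = build_prompt(
    role="You are a B2B demand-generation copywriter.",
    task="Write three subject lines for a webinar invitation.",
    audience="VP Marketing at mid-market SaaS companies.",
    context="The webinar covers AI upskilling for marketing teams.",
    example_output="1. <subject line>\n2. <subject line>\n3. <subject line>",
)
```

Keeping the task above the context, rather than buried under it, is exactly the ordering effect he refers to.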
Tom Rudnai (40:30)
The order-of-importance point is very, very interesting; I've not heard that before. I have a pretty basic framework, which is just task, context, output: what do you want it to do, what's the context it needs, and what do you want back from it? And that seems to produce pretty good outcomes for me. But I think the point you made about being willing to iterate and do a bit of trial and error is also really important.
Cool, I'm gonna get into a couple of quick fires now and we've got about four minutes left so we'll see how many we can get through. I guess first one, which is a very good one to ask you, is there a single AI use case or AI tool that has just blown your mind that you think people should go and check out?
Oren (41:04)
Good question. The first one that popped to mind when you said it was Claude from Anthropic, which I find is better than ChatGPT at pretty much everything. The only caveat is I use ChatGPT, especially the mini models, for a lot of API utilization. And ChatGPT has persistent memory, whereas with Claude you have to manage it yourself. I did a
post today, I think. Yeah, today I posted on LinkedIn about how you can create a local memory bank for Claude on your desktop. So you can create your own repository, which acts like a project, like Gemini Gems or ChatGPT projects, and it's persistent and you can call it into any conversation, which Claude can't do with content inside Projects. But I'd say across classification, data labeling,
creative writing, problem analysis, and articulation, just the clarity of what Claude is saying, it outperforms ChatGPT in almost all tasks. The biggest problem is that the API is very expensive. So yeah, I'd say Claude by far is remarkable.
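The local memory bank Oren posted about could be sketched roughly like this: persistent notes kept as plain-text files and prepended to each conversation. The file layout and the model name are assumptions; the `messages.create` call mirrors the Anthropic Python SDK, but the client is passed in so the pure parts run on their own.

```python
# Hedged sketch of a "local memory bank" for Claude: keep notes as .md
# files and inject them into every request. Layout and model name assumed.
from pathlib import Path

def load_memory(bank_dir: str) -> str:
    """Concatenate every .md note in the bank into one context block."""
    notes = sorted(Path(bank_dir).glob("*.md"))
    return "\n\n".join(p.read_text() for p in notes)

def build_messages(bank_dir: str, user_msg: str) -> list[dict]:
    """Prepend the persistent notes to the user's actual request."""
    memory = load_memory(bank_dir)
    content = f"Persistent notes:\n{memory}\n\nRequest:\n{user_msg}"
    return [{"role": "user", "content": content}]

def ask_claude(client, bank_dir: str, user_msg: str,
               model: str = "claude-3-5-sonnet-latest") -> str:
    # client is an anthropic.Anthropic() instance; model name is assumed.
    resp = client.messages.create(
        model=model, max_tokens=1024,
        messages=build_messages(bank_dir, user_msg),
    )
    return resp.content[0].text
```

Because the notes live on disk rather than inside a Project, the same memory can be pulled into any conversation or script.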
Tom Rudnai (42:12)
That's great, I imagine there's a lot of people that haven't played around with that yet. Another one, for you personally in your career, what skill or trait would you say has been the biggest needle mover for you?
Oren (42:21)
I'm going to go kind of sideways here. I spend a lot of time, money, and energy on my own projects, and I believe that allows me to
pioneer; that's why I was into AI before anyone had heard of ChatGPT. It's because of that bent towards experimentation and exploration. A lot of marketers do the nine-to-five, then leave and go watch Netflix. And I think it's good to invest, to have that entrepreneurial spark where you're trying things. I know a few marketers doing some incredible stuff with vibe coding, so I'd encourage those initiatives; don't be afraid to take those risks.
That's served me well. I think it's infused into my personality type, so it's very easy for me to say, go do that. But if someone's listening and they're risk-averse or conservative, then don't; that's not right for you. It worked for me, but you have to remember people are different, at different places in life, with very different priorities around family obligations or what's important to them. So it's very hard to make a broad statement, but that's just what worked for me.
Tom Rudnai (43:25)
No, I mean, that's the question, right, is what helps you? And it might be different things for different people. And then one recommendation before I give you a chance to plug anything that you're doing, one recommendation you'd have for people to check out, whether it's a book, a thought leader, a podcast that you really enjoy that someone else puts out there.
Oren (43:41)
ooo
Who is this guy? Ethan Mollick is probably one of the first. I guess I see him as similar, in that the content I put out on my socials is there to help others, and it's less about selling.
The commercial benefit to me comes as a symptom of the meaningful relationships built through the content I put out. And when I consume his content, it feels like this person has a curious mind, genuinely sharing what he thinks is interesting and his journey into learning and understanding AI. So he's the first that popped to mind. There are a few, but he's definitely one of them. He's really great.
Tom Rudnai (44:26)
Awesome, I have not heard of him, so I will go and check him out: Ethan Mollick. And then finally, just for you, anything that you're doing that you think is worth a plug?
Oren (44:35)
I feel like I've plugged myself a fair bit more than I'd anticipated in our conversation so far, so I'm going to refrain. I will say, feel free to connect on LinkedIn, and feel free to check out the gap analysis. If you want the email prompt engineering course, I'll send that; just go through the gap analysis process. That's plenty.
Tom Rudnai (44:38)
There we go, well I'll do it for you since you're being modest: orengreenberg.com. Go and check that out if you want some more help getting up to date on AI. Look, it's been fantastic to have you on. Really appreciate it. You've put me to shame in terms of the quality of your podcast background, so I need to up my game. I will be better. And thank you to everyone who's listening at