The NeuralPod, by NeuralRec.ai
The NeuralPod is all about deep diving into technical machine learning topics and showing their real-world impact, bridging the gap between AI-first companies and adoption.
We chat with subject matter experts across reinforcement learning, recommender systems, deep learning and generative AI, as well as business leaders, VCs, and operations and strategy experts.
Who are NeuralRec.ai? NeuralRec is a recruitment staffing agency. We build niche ML teams and represent some of the globe's best ML talent.
Chris: Erik Schwartz, welcome. Uh, great to have you on today. Thank you for joining us.
Erik Schwartz: Good morning, Chris.
Thanks so much for having me on.
It's really a pleasure to be here.
Chris: Yes.
I'm excited to, uh, chat to you.
I know we've had casual conversations in the past, haven't we? And I've always found your approach interesting.
Just to let people know what drew us here today: we were chatting around DeepSeek, and I thought you had a really logical approach to what's going on. But that almost seems like old news now with the AI summit in Paris recently.
Yeah.
Erik Schwartz: That was all the hype last week, and now we're on to something completely new.
It's the pace, the pace that we're moving at.
I just got off a call a few minutes ago, before this conversation, and I was saying to somebody else — it was an older gentleman who was like, you know, Erik, I'm really having trouble trying to keep up, and for us in the older generation, I'm really looking for some resources and trying to find ways to stay in touch.
I said, you know, it's not you, right?
I'm not even referring to myself.
The younger generation, like our kids, the new entrants into the workforce — they're having trouble keeping up as well, because we're moving at a pace right now that's completely unprecedented in
Chris: Yeah.
Erik Schwartz: anything we've seen — that I've ever seen in my career or lifetime.
Chris: Yeah.
It's interesting because, uh, you know, as a business owner, I've only got so much time in the day, and which AI do I spend it on to actually learn? Because it's not for want of wanting to learn these tools — it's just there are so, so many out there.
And I think, you know, the podcast originally started for me as just a little bit of fun, but actually now it's turning into: I want to help amplify the right voices in the community, because people are asking me who they should turn to and follow.
So, um, yeah, thanks for coming on and sharing your thoughts.
So, um, yeah, just to let everyone know at home, we'll keep this a general conversation, but I think you've got a great background in RecSys as well, which we'll touch on.
And it'd be good to get your future predictions and where you see the market going — because when we originally started, the question was: what do you see in the next couple of years? But I don't think I was really setting people up for success with that question. It's more: what do you see in the next six to 12 months?
But it'd be great just to, um, have you talk people through your career history, how you got to today and into your role, and, you know, the lessons learned along the way.
Kind of two questions there.
Erik Schwartz: Yeah, of course, Chris.
I'd be happy to share that with you
and the group and everyone listening.
So as you mentioned, my background is in information retrieval.
I came out of university — I studied electrical engineering in school — and I was always really drawn to the applied use of technology and electronics.
And I think the thing that resonated the most with me was really computer engineering.
I did courses on learning how to code, learning things like Fortran and Pascal — if people remember those languages from back then.
But I really loved the microcomputer side: really putting bits into a microprocessor, storing logic in there and getting it to compute.
So the compute really resonated with me, but I loved this whole assembly of a system of things coming together to actually get something done. Putting a little LED display on a little microcomputer and getting the numbers to count up from one to eight felt groundbreaking, you know. It seems so overly simple now.
But I started my career after university.
This is like the mid nineties, when I came out of school, and the first web browsers were just starting to come out.
This is even before Netscape, which has since come and gone and lived its life.
There was a browser that came out called Mosaic — the first internet web browser that was ever out there.
It came from the supercomputer center out of Champaign, Illinois.
And, um, I didn't know what it was.
I didn't know what the impact of it was, but it was so simple and so easy that it resonated with me from the start: I could sit there and write something with HTML and disseminate it broadly across the Internet.
Now, obviously, these are all common things right now.
But that just seemed so wildly powerful — I didn't know what this was, but I wanted to do that.
So I took a job right after school. I went down to Washington, D.C. and got a job at the Naval Research Laboratory.
If you've ever flown into Washington, D.C., you fly into Reagan Airport, which is the airport right in D.C. — which was actually in the news because of a horrible accident a couple of weeks ago, with a helicopter and an airplane colliding. That was right at that airport.
But we were across the river, in a building that had a huge radar dish on top of it.
My boss at the time used to call it the biggest birdbath in Washington, D.C., because it was an old defunct radar dish. This is where they had invented radar — they were sending radio signals across the Potomac, and every time a boat would come across they would get this big blip that would block their signals. That was the start of radar, at this lab.
And so we started building digital libraries at that lab. The library was physically out of space — they had research journals and reports that they were creating at the laboratory, and they had no space to store the paper anymore.
So we were digitizing it and putting it online so the researchers on campus could get access to it.
And that started my passion and love for information retrieval.
So basically we took this content, we put it into a search engine, and we made it available for these guys to find reports.
We started exploring: how do we get access to the full text, so they can look for words inside that content?
We were doing OCR — recognizing characters.
But we quickly ran into some problems where the words that they would use wouldn't always match the words that were in the paper.
They might have a thought or a concept in their head about a problem area they wanted to solve, but those words weren't necessarily in the paper.
And so we started building these very simplistic ways of interpreting what they meant — what was their intent — to connect them with content.
And so I like to say that I've been in the same problem space for 30 years.
And now the technology has come around to actually solve those things with large language models.
You have things like ChatGPT, you have Claude, and all these great tools that are amazing at understanding what we're looking for, helping us find that information and pulling it back out.
So that's a bit about my background, right, and how I got there.
I spent my time building digital libraries — we were building them for the government and for libraries back in the United States, back in the 90s and 2000s.
I started working for search engine vendors in the 2000s: a company called Convera, which had a product called RetrievalWare.
They were acquired by a company called FAST, based out of Norway, which was then acquired by Microsoft.
And so in 2008 we all joined Microsoft.
I was a Microsoft employee for a couple of years on the SharePoint team, helping manage customers through that acquisition.
Then in 2010, I was in Washington, D.C., and Microsoft really wanted everyone who was building products to be in Seattle, in Redmond, at the time.
So instead I started working for large corporates, where we ran and built large search engine platforms, or knowledge discovery platforms, that powered the business.
I worked for Comcast for a number of years, where we built a large data and access platform based on search that powered their entire video experience.
Every time you sat down and watched TV, whether it was mobile, web or set-top box, you engaged with the search engine to find the content or the channel that you wanted to watch.
And then in 2018, I moved over here to the United Kingdom.
I came to London to build a knowledge discovery platform for Elsevier.
While we were at Elsevier, we built one of the first interfaces on top of a large language model: a tool called Scopus AI, which took ChatGPT — a large language model — and put it on top of a trusted database of over 90 million abstracts of academic research.
So we proved that we could use large language models on top of a trusted data source to deliver trusted results for academics.
Which brings me up to today: I'm now the Chief AI Officer at a company called Tricon.
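[A minimal sketch of the pattern Erik describes here — grounding a large language model on a trusted corpus so answers come from that data rather than the model's own memory. The toy retriever, the call_llm wrapper and the corpus shape are illustrative assumptions, not Scopus AI's actual implementation.]

```python
# Minimal retrieval-augmented generation sketch: answer only from a trusted corpus.
# search_abstracts and call_llm are hypothetical stand-ins, not a real API.
from dataclasses import dataclass


@dataclass
class Abstract:
    source_id: str
    title: str
    text: str


def search_abstracts(query: str, corpus: list[Abstract], k: int = 3) -> list[Abstract]:
    """Toy lexical retriever: rank abstracts by query-term overlap."""
    terms = set(query.lower().split())
    ranked = sorted(corpus, key=lambda a: -len(terms & set(a.text.lower().split())))
    return ranked[:k]


def call_llm(prompt: str) -> str:
    """Placeholder: swap in whichever LLM client you actually use."""
    return f"[grounded answer would be generated from]\n{prompt}"


def answer_from_trusted_source(query: str, corpus: list[Abstract]) -> str:
    hits = search_abstracts(query, corpus)
    context = "\n".join(f"[{a.source_id}] {a.title}: {a.text}" for a in hits)
    prompt = (
        "Answer using ONLY the abstracts below and cite their IDs.\n"
        f"{context}\nQuestion: {query}"
    )
    return call_llm(prompt)
```

The point of the pattern is that the model only ever sees retrieved, citable records, which is what makes the answers traceable back to a trusted source.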
Chris: Nice.
Yes, thank you for sharing that — some interesting points.
I think, um, to bring it back a second: you mentioned going through a boom, and we're obviously going through a bit of an AI boom now. Are there any similarities you see? Because we're going into uncharted territory here — maybe what went on during the dotcom boom, or anything like that, or things you've seen in the past where you're starting to see similar signs now in how the market is shaping up?
Erik Schwartz: In tech, the pattern of the mid-to-late nineties is absolutely repeating itself.
And I think the one voice of caution that I'll give, right — we're sitting here and it's been since November of 2022, when ChatGPT came onto the field in a big way. When ChatGPT was first launched, it was November 2022.
Everyone's head was blown when they realized what could happen, and it's really only been a little over two years since that launched.
And now we're at a place where AI applications are starting to come really mainstream. We're starting to really build them into our businesses, and we're seeing AI in lots of different affordances and products.
So it was very similar to what was happening with the Web 1.0 wave in the early nineties, but we have to keep in mind that that took a good five years, from about '93 to about '96, '97, before we started seeing real products and services on the internet.
And so we're seeing a similar cycle, where there's a ton of startups, lots of little ideas, lots of interest and lots of valuation.
And, you know, that's going to get aggregated. It's going to get pulled up, right?
For example, when OpenAI releases Deep Research, or they release computer use or a little tool called Operator, or Claude releases Canvas — a way to visualize real code, or visualizations, or documents on screen — suddenly dozens of little businesses sort of disappear, because they are subsumed by this feature that's part of a large language model platform.
And it's an exact pattern that we've seen before, where people will go out, try different ideas, experiment with them, maybe even get some funding, but then they get absorbed by these larger platforms.
Chris: Yeah, and it seems like, um, you know, last year there was a big focus on foundational model training, etc. This year, with what's happened with DeepSeek, everyone's kind of pushing towards the application side.
Obviously your background is in products. What are your thoughts on where these applications should sit, and how you would potentially position in the current market, and where the needs are, I guess?
Erik Schwartz: Yeah, no, it's a big question.
I think at this point it's very much a crystal ball question — it's everyone's guess in terms of what exactly is going to happen.
For the foreseeable future, we're still going to have a big dependence on these large language models. They're really powerful and they're really big enablers.
Over the last couple of years there's been this sort of criticism of what people call wrapper apps, where someone builds some code, builds a use case, builds a scenario on top of these large language models.
And, you know, it creates this risk that we spoke about a moment ago, where these large language models can add a new feature and suddenly those applications become meaningless.
But at the same time, what was happening a lot over the last two years is that Google would release a Gemini model, Meta would release a Llama model, Anthropic would release a version of Claude, even X — Twitter — would release a version of Grok.
And they were all kind of the same. They were all maybe trying to catch up with OpenAI. Maybe they'd leap over slightly in terms of the benchmarks, in terms of the capabilities.
But they were all more or less the same in terms of what they could do.
You know, some people might like Claude, for example, for coding purposes, or they might like ChatGPT for question-and-answer services, or they might like to draw pictures or something inside of X with Grok. So people started having things that they liked to do better with each of them, but there wasn't a huge amount of specialization happening.
And so the tide kind of turned from "is it just a wrapper app?" to "actually, am I building a moat around my own value, my own experience? I'm managing customers, I'm creating a business around these customers, and I have the flexibility to point to different models as they evolve."
So I think we're going to see this, just like the pendulum swings back and forth — we're going to see this dynamic here for a little bit longer.
But I think the thing that's clearly going to happen is that people will start to embed AI capabilities inside their products and services and workflows.
And these will become a lot smarter, and this is where we get into agents, and into this whole notion of: what is an agent, what's it going to do, and how's it going to change things for me?
Chris: Yeah, yeah, it's interesting.
I had a chat recently with a friend trying to define, um, what an agent is.
I think that's a big thing. I don't know if you'd like to go on record and define what you think an agent is right now.
Erik Schwartz: Yeah, yeah, exactly.
I think there's a lot of confusion between intelligent workflow and bots and agents and assistants.
But I have a very simple definition that I use for agents.
Today, if you think about ChatGPT or any of these tools we just talked about a moment or two ago, those are really chatbots in the simplest way: I ask a question, it gives me an answer back.
And we're starting to see these chatbots do things a little bit more autonomously.
For example, we talked about Deep Research a minute ago. With Deep Research, you can give it a task and it'll go out on the internet, it'll find some things, it'll do some reasoning, and it'll come back, bring some content back and generate a report.
What's really interesting about it — and this is where we're starting to get into what an agent is; I'll get to a definition in just a second — is that it didn't just take the single question, go answer it and fill in the response. It actually started to refine its plan as it gathered information.
It went out, asked a question and said: oh, okay, I'm finding some good resources, or I'm finding things that maybe aren't that relevant — maybe I need to ask a better question.
This is agentic behavior, right?
So it's the ability for a system to say: let me give you a goal.
Right — I'll give you a task, and I need you to go do something. That task could be something like: set up a meeting with Chris, or find some time to do a podcast.
The goal could be something like: hey, plan me a vacation somewhere warm, because I haven't seen the sunshine in months in the United Kingdom, and here's my budget. Go do something like that.
But it's a goal.
And then the agent needs to be able to take that goal and build a plan: how am I going to solve this goal? How am I going to figure out how to get the information I need to solve this goal?
Typically, the way it does that is it leverages different tools. The tool could be the internet. The tool could be a database. The tool could be your web browser. The tool could be any number of services provided inside of our enterprise, or publicly available, to help accomplish that goal.
So it's planning, it's got tools, and it's got the ability to refine its plan as it gathers information from the tools.
The other bit that an agent also has is memory.
It's got what we call long-term memory and short-term memory.
The long-term memory is to say: hey, I remember Erik — Erik loves to go to the beach, his wife loves to go to the beach. So when he says he wants to go on holiday, let's not look for a ski trip, let's look for a beach trip, because I know that's what he's looking for, and I know that's going to make his wife super happy.
So that's this notion of long-term memory — it's like this notion of personalization.
Or: always remember, before I send the credit card in, to make sure I get permission. That's long-term memory.
Short-term memory is: what am I working on right now, and how do I refine my plan?
I've gathered some flight information, I've gathered some hotel information — what else do I need? Oh, I need to get transport between them; I need to connect the dots.
So how do I make sure there's a smooth agenda going through?
So going back: it's a system that can take a goal, build a plan, refine its plan, use tools, and then take action with those tools to go accomplish that goal.
That's how I define an agent.
So it's really intelligent workflow with the ability to think as it moves.
And this is exactly what we're getting into with Deep Research — this agentic behavior. I give it a task and it can go off and take 5, 20, 30 minutes to find the right sets of information relevant to the goal, bring back a report that's cohesive, and write it out to me at a level that's appropriate for my audience — which is typically a PhD-level research report that I can ask for.
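[A minimal sketch of the agent loop Erik lays out — a goal, a plan that gets refined, tools, and long- and short-term memory. The hard-coded planner and toy tools below are hypothetical stand-ins; in a real agent the planning step would be an LLM call.]

```python
# Sketch of the agent pattern: goal -> plan -> call a tool -> refine, with memory.
# The planner is a stub; a real agent would let an LLM choose the next action
# based on the goal, long-term memory and everything gathered so far.

def plan_next_step(goal: str, long_term: dict, short_term: dict) -> tuple[str, str]:
    """Pick the next tool to call, or decide we are done."""
    if "flights" not in short_term:
        return "search_flights", goal
    if "hotels" not in short_term:
        return "search_hotels", goal
    return "finish", ""


TOOLS = {
    "search_flights": lambda goal: {"flights": f"flight options for: {goal}"},
    "search_hotels": lambda goal: {"hotels": f"hotel options for: {goal}"},
}


def run_agent(goal: str, long_term: dict, max_steps: int = 10) -> dict:
    short_term: dict = {}                      # what we are working on right now
    # Long-term memory personalises the goal before planning even starts.
    goal = f"{goal} (preferences: {long_term.get('preferences', 'none')})"
    for _ in range(max_steps):
        action, arg = plan_next_step(goal, long_term, short_term)
        if action == "finish":
            break
        result = TOOLS[action](arg)            # take action with a tool
        short_term.update(result)              # refine the plan with what came back
    return short_term


profile = {"preferences": "beach, not skiing", "approval_needed_for": "payments"}
print(run_agent("plan a warm holiday within budget", long_term=profile))
```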
Chris: Yeah, that's really interestingly put and very well said. So thanks for sharing that.
I think a lot of companies now are looking at adoption, as they obviously were last year. It feels like maybe people are looking to adopt agents before they know what problem they're potentially going to solve, because there's a little bit of FOMO in the market right now, I guess.
I think you've got a unique perspective: you come from a technical engineering background, then products.
What should companies be looking for when adopting this technology? How would you actually evaluate a product for your needs?
And, um, yeah, you've come from a RecSys and personalization background — how do you see it affecting personalization, or the retail markets, for example, or e-commerce?
Erik Schwartz: Yeah, no, I think we're really just at the beginning of the answer.
And I think if we come back and have this conversation in six months, we'll have a very different conversation — not a year from now, but six months from now, we'll have a very different conversation.
The reason why is that agents are still very much an emerging field.
I think we have a fairly well agreed-upon pattern in terms of what an agent is meant to do, in terms of the goals and the sort of things that I laid out.
However, there are some challenges with the way the large language models work in making these things really repeatable — making sure that every time I ask for a task, it actually solves it at the level of quality that I need.
And how do we put the right frameworks around it? How do we make sure it's asking the right questions, that it's keeping the human in the loop, that it's not making assumptions where it shouldn't?
So there's a number of things around these patterns. You know, when I say to you, Chris, go book me a holiday — for a human, that's pretty straightforward to do. You know the parameters you want to work within, where you need to get checks and where you need to get validation.
With agents, we still need to learn that; we still need to put those controls in place. And so that's evolving.
But to answer your question specifically, what we're seeing in the marketplace right now is that there are a number of businesses — and a lot of these are smaller and medium-sized businesses — that are saying, God, this AI thing is coming, I need to put agents in my business. And actually what they really need to do is start with some simple automation.
There are still people taking orders via email, reading the orders and typing them into a spreadsheet.
These things are inherently solvable, and they don't require an AI agent to automate. But it's been someone's job for the last 10 years to read that email and put it into the order system so that an order can be shipped to somebody.
So these manual processes are still there in our businesses, and they can be replaced by simple automation platforms — something like Make or Zapier.
And there are lots of people out there who are really pro at putting these things together, and they don't require a whole lot of oversight to get that automation going — which means you can free up capacity and talent within your organization to take on harder problems.
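[A minimal sketch of the simple automation Erik is pointing at — turning an order email into a structured record instead of retyping it into a spreadsheet. The email layout, field names and submit_order stub are invented for illustration; in practice this is the kind of glue a Make or Zapier workflow handles.]

```python
# Parse a fixed-format order email into a structured record instead of having
# someone retype it by hand. Format and endpoint are hypothetical.
import re


def parse_order_email(body: str) -> dict:
    """Pull customer, SKU and quantity out of a simple order email."""
    customer = re.search(r"Customer:\s*(.+)", body)
    sku = re.search(r"SKU:\s*(\S+)", body)
    qty = re.search(r"Quantity:\s*(\d+)", body)
    if not (customer and sku and qty):
        raise ValueError("email did not match the expected order format")
    return {
        "customer": customer.group(1).strip(),
        "sku": sku.group(1),
        "quantity": int(qty.group(1)),
    }


def submit_order(order: dict) -> None:
    """Placeholder: in practice, POST this to the order system's API."""
    print(f"submitting order: {order}")


email_body = """Customer: Acme Ltd
SKU: WIDGET-42
Quantity: 12"""
submit_order(parse_order_email(email_body))
```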
So that being said, I think over the next year we're going to see — take getting an Airbnb: you're not going to have to go and think about all the little details. If I want a nice night out, I want to plan a date, it'll be that simple. I'll be able to go and say, I just want to plan a date, and there'll be applications that start to emerge and say: I'm going to be a date-planning service, and I'm going to take care of everything for you.
It's going to take care of making sure, first of all, that your date's available and has freedom in their calendar. It's going to find a nice restaurant, a place they're going to like. It's going to take care of the car reservation. It's going to make sure there's a box of chocolates or a bouquet of flowers waiting in the car for you.
And just like we do with delivery or any of these online services — Uber Eats, or Citymapper, one of my favourites — as you go through your journey, it's going to keep track of where you are. Is everything going all right? Do I need to adjust my plan? Oh, the Uber didn't show up, or there's a lot of traffic — you know what, instead of taking an Uber, just take the bus. You can have a romantic time on top of that; everyone loves taking the bus around London. It's nighttime, you've got extra time in your plan, so we'll move the booking back a little bit — hop on the bus, take the sightseeing tour around London, take in the sights.
But the ability to adjust on the fly is what's going to come. And we're going to see this in products and services and commerce in the coming months — not weeks, maybe months and years.
Chris: Yeah, no, I think you touched on an interesting point there: the market's changing every six months.
Personally, from a business perspective, I've found that when you're trying to roadmap technology and AI, you're constantly pivoting and shifting with the way the market is changing.
You obviously come from, as I say, a technical background, and now you're more strategic. How do you actually strategize and plan a roadmap for such a quick-moving market?
Erik Schwartz: And, you know, if I told you that, I'd give away all my value — I'm just kidding.
But a couple of things. The most important thing, as we start to introduce any technology like this into our organizations and into our business, is that we have to keep it really focused on where there's a return on that investment.
There are still a lot of concerns and a lot of risk; people have a lot of fear around AI. They don't understand it.
So there are still opportunities for us to go and introduce AI literacy programs, where we teach executives how AI works: what is it capable of doing today, and what's it not capable of doing?
We teach the staff how to use AI to help them with the tasks that take a lot of work for them.
We help organizations think about where there's lots of friction or manual tasks — like this example of email coming in and being typed into an order form — and identify those really manual tasks.
There are a lot of businesses out there today, Chris — big businesses, really serious big businesses — that run their business entirely on spreadsheets.
I'm talking manufacturing, I'm talking construction, I'm talking events businesses. These really large businesses are distributed, they're not repeatable, and they haven't even built basic automation into their flows yet.
They're really running on a lot of manual process, and it works, right? But they can carve a lot more value out of those businesses, and do it a lot more efficiently with a lot fewer staff, by looking at how they can do some automation.
So it's about literacy. It's about understanding the risks. It's about exposing the tools. It's about focusing on business basics, which is: how do I apply the right technology to the right problem?
Don't just throw AI in there because it's cool and sexy — let's make sure we're solving the right problems and there's a real return on investment.
And then we do it in an intelligent way, so that we're not throwing a really expensive model into something that's got to scale, because that doesn't work. If I've got to call ChatGPT, or these new reasoning models that are getting quite pricey, on every workflow, then that's going to be expensive. I'm not going to be able to build this cool dating app experience if I've got to call an expensive model every time — you know, it might be 20 bucks every time I go to get that plan updated.
So we have to be intelligent about how we use these things and how we use these tools.
And then what's also going to happen is these models are going to get smaller and smaller, so we'll have purpose-built models that people can use for their businesses, inside their computers and even inside their phones.
And that's where we'll go in the next year or two: we'll have our own personal AIs that are really shrunk down and really understand us — understand a lot about us on a personal basis.
Chris: Well put again, and I think that segues nicely into Tricon.
First of all, Chief AI Officer — less than a year ago, no one was calling themselves that. So I'd love to understand what you think a good Chief AI Officer does, the roles and responsibilities, and how you would define it.
And then I think that will lead nicely into what you're actually working on as well.
Erik Schwartz: Yeah.
So when I took this role about nine months ago, back in April of last year, I was fully cognizant that in a year or two this role may or may not exist.
But the reason for this role today is that it creates a focal point in the organization, to help us understand exactly where AI fits and help us manage the controls around the pace of change with AI.
My strong belief is that in a few years things will settle down, and over time these tools and capabilities will roll back into the more traditional leadership roles we have in the organization — the Chief Product Officer, the Chief Technology Officer. These responsibilities will fall back into those roles as we understand the technology and we've deployed the change.
But in the short term, it's about:
How do we help organizations — and how do we help our own organization — build the right policies around AI? Where do we use it? Where do we not use it?
How do we stay on top of the technology? How do we integrate it safely into our business? How do we put the right ethics around it? How do we start thinking about training both our employees and our customers in how to use it?
And then how do we use it as a conversation starter to really drive new business within our organizations?
And today, a lot of customers will come to us and say: I want to put AI in my business.
Going back to this whole automation thing — I want to put AI into my business. The short answer is that actually what they really need to do is get their data estate in order.
They need to figure out where their data is. They need to stop moving it from place to place to place; they need to settle it down. They need to figure out exactly what the data is. They need to clean their data. They need to organize their data. And then they need to bring it into applications that help drive value.
And then we can come back once we've got that data — because AI is a really data-hungry application. It loves lots of data, but it's a garbage-in, garbage-out problem. If I throw messy data into the AI, we're going to have messy experiences on top of that.
So organize your data, get it clean, get your estate in order, and then we can layer AI capabilities on top of it to really build good experiences.
Chris: So these companies that you're helping — what would you say their biggest challenges in adoption are? Is it the education piece? Is it the data piece? Is it ethics? I imagine more the data or education?
Erik Schwartz: Of the companies that I'm working with right now, lots of companies are in lots of different places in their AI journey.
Some have been investing in AI for quite some time.
Others have been doing proofs of concept — doing some experimentation, testing it out, and getting it to work. It's actually pretty straightforward to get the AI to work, to build an application that works — but they really haven't tied it back to their business yet.
Others have put policies in place where they've basically said: don't use it, don't use AI at all. Others are embracing AI: let's put it into our products as quickly as possible.
Other organizations are really focusing on literacy and ethics and making sure they're compliant with the regulations that are out there.
So lots of different companies are on lots of different journeys.
One of the biggest barriers that I see today as I talk to clients is this whole notion of AI literacy.
There's a lot of fear, both at the board and executive level and on the front line — the staff, the teams that are working on it. The staff are afraid their jobs are going to be replaced by AI, and the C-suite and the executive leaders are saying: listen, there's a ton of downward pressure that we're getting from the board and from external factors to incorporate AI into our products. We've got to get it into our products as soon as possible.
There's an unnatural rush to put it in there.
So it's really about understanding where it fits, what it can do and what it can't do, and creating that literacy program — giving people exposure to AI. How do I connect with AI? How do I use AI? What problems could it solve for me?
Other organizations need to work on their policies and put those policies in place.
And then other organizations understand where they want to fit it and understand how to use it, but still have to get the data estate in line.
So one of the things we do when we go and engage with a customer is really figure out where they are in that journey — whether they need to focus on the AI literacy piece, the staff training component, getting their data estate in line, or figuring out strategy, whether it's policy or regulation and compliance. There's a number of dimensions we can look at when we look at the health of a given organization as it adopts AI, and then we take those things and move them on a strategic journey from there.
So at Tricon, because we provide a full set of services, we'll typically start with the business strategy: what is the business, what's the thing that's really going to move the needle, and how do we measure that for the organization?
And let's start with that proof of concept, to get to know each other and build the business case. A lot of projects inside organizations fail because they don't tie the technology back to the business, and that becomes a big problem.
So we start there, and then we grow it out over time: finding that low-hanging fruit, finding those opportunities for automation, helping them put policies, regulations, compliance and good governance in place, and making sure that everyone's at the table — because everyone from legal to marketing to sales, everyone in the organization, wants to be involved in these conversations, and they should be.
AI is a great leveller in a lot of ways, because I can speak to it with language — with English, or whatever your native language is.
It doesn't necessarily require any special skills to learn how to use AI. It's just like the way you learn how to talk to somebody else.
You can speak to it just the way you would talk to, say, a new graduate student — train it like a really smart graduate student that's just come out of university. It's really eager, it's got lots of information, but maybe it doesn't have the smarts to figure out how to navigate the corporate environment.
But you can learn that, right? It's really pretty straightforward to learn how to have a conversation with somebody to get them to optimize how they perform. And that's how we start thinking about AI inside the organization.
Chris: Yeah, and I think you touched earlier on a lot of fear around AI.
And there was an interesting stat posted a while back that 80 percent of generative AI projects fail.
You've obviously taken a lot of things from prototype to production, and I appreciate you may not want to share all the secret sauce here. But what are things that you've done, or can do, when you're taking these things through to production to increase the chances of project success and a successful launch?
Erik Schwartz: Yeah, no, it's a great question, and you're right — it's going to depend on the use case.
One example that I'll share with you is a large, more traditional company that we work with. They were quite uncertain as to where AI was going to fit inside their business.
So one of the things we created for them is a safe space to learn how AI works — so they weren't worried about taking their content and sending it out to something like ChatGPT or DeepSeek or any bots, and not knowing what's going to happen with their content.
We created a safe environment, a sandbox, where they could put their content, their information, their data into the AI and start to see how it behaves. They knew it was a safe place, and they knew they could try different things.
What this enabled is it allowed their teams to start to say: ah, now I see — I didn't know we could do that. I didn't know I could actually do these things, and now I can experiment with different ideas that we can bring into our products.
So that was one thing: creating this enabling place, the safe place for experimentation, trial and learning.
The second part is that it's super critical, especially as we start to introduce AI into our systems, that we keep humans in the loop.
Basically what this means is that we don't take a product or an application and let customers talk to the AI directly. A really safe best practice around implementing AI is to use AI to create things offline: generate a bunch of things offline, look at them, have a human approve them, and then put them out online.
One example — this is not a customer, but something we're working towards in the compliance sector — is that there's a lot of effort right now that goes into human evaluation of whether something is legally or regulatorily compliant. Those rules are fairly well established.
So what we can do is take a piece of content, or a drawing, or a piece of marketing literature, run it through that set of rules, and have the AI go through it and identify: yep, here are all the things that pass muster, here are some areas of concern — and we can flag those for a reviewer.
So instead of spending three weeks having to read the content, understand it, get deep into the literature and compare it to what's there, we can codify that into a set of rules, put that in front of an analyst, and in an hour they can go through it: accept, accept, deny, accept, change this to this — boom, the content's done and reviewed.
So we're preventing this sort of massive scale where end users come in and automatically generate content, which could cost millions of dollars for large websites, and we're also reducing a massive amount of cost and removing a ton of friction from the business by enabling 80 to 90 percent of this workload through intelligent automation.
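[A minimal sketch of the rule-flagging, human-in-the-loop review Erik describes — codified checks run over content, with anything that doesn't pass muster flagged for an analyst. The two rules are invented examples, not any real regulatory requirement.]

```python
# Codified compliance checks with a human in the loop: the machine flags areas
# of concern, and an analyst accepts, denies or edits.
from typing import Callable

# Each rule: (description shown to the reviewer, check that returns True on a violation)
RULES: list[tuple[str, Callable[[str], bool]]] = [
    ("Unsubstantiated superlative claim", lambda t: "best in the world" in t.lower()),
    ("Missing risk disclaimer", lambda t: "capital at risk" not in t.lower()),
]


def review(content: str) -> list[str]:
    """Return the list of flagged concerns for a human reviewer to work through."""
    return [desc for desc, violates in RULES if violates(content)]


marketing_copy = "Our fund is the best in the world. Invest today!"
for flag in review(marketing_copy):
    # The analyst sees each flag and decides: accept, deny, or change this to this.
    print("needs human review:", flag)
```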
Chris: And it's interesting — we spoke earlier about somebody potentially sat in the business using an Excel spreadsheet, and you can automate that, or use agentic AI, whatever it may be.
At what point is the benefits realization? How do you actually measure ROI? Because it's great that you might have swapped the process for AI, but if it's doing the exact same thing, how do you measure that it's been a successful project, et cetera, and show the value that you've delivered?
Erik Schwartz: That's going to be where the rubber hits the road over the next couple of years — really putting that measurement on the ROI.
So first and foremost: what are the measures of success? As a good product person, how do we successfully measure something?
Where we love to start, because it's quicker, is to look at top-line impact. Is this generating revenue — new revenue? Is this generating new customer engagement?
And then you can honestly look at: yes, we implemented X and now we're seeing new customers come on board, we're seeing new customer acquisition. These are leading indicators we can put on top of a business project, that we can measure and understand quickly, that it's having impact.
When we start looking at operational efficiencies — time saved, or labor costs — those are all trailing indicators. They can sometimes take six months or a year to really understand. Sometimes they're even qualitative.
But put that measurement in place and say: listen, let's put a hypothesis there that we think there'll be time saved.
I was working with one business — again, an advanced business — that's looking at how they set up their new events and how they acquire customers through their new events business.
The process was, effectively, once a week they would go to a spreadsheet, download the spreadsheet, copy it into another spreadsheet, and then look for new exhibitors that they wanted to join their new event.
So that team that had to spend that weekly effort moving these files around the organization, copying them from in front of and behind the firewall — they're going to know: hey, suddenly I don't have to do that anymore. I have a list of approved people with invites who've already signed up and paid for my event, and I didn't have to do all that work.
So those qualitative measurements are going to start to surface in the business, and they're going to have more time to think: okay, instead of automatically accepting all the people that apply for my event, let's be a little bit more selective, let's change the criteria a little bit so we get better quality experiences. Let's focus on what that experience inside the event is going to look like.
So the mindset is going to shift away from this mind-numbing work of getting everyone into the funnel, towards actually making sure we focus on where we can add value.
So again, to summarize: there'll be leading indicators that we identify and put in front of a project — typically the same type of leading indicators you would put on any sort of product investment or change management investment — and then there'll be the trailing indicators that show the overall operational efficiencies you'll get, and those will be both quantitative and qualitative.
Chris: Yeah, that's super interesting and well put.
Just a final bit on product — this is, I guess, unrelated to Tricon, but obviously new startups are popping up literally daily at the moment. In terms of product-market fit, I think the thing with machine learning is there's a lot of founders from technical engineering backgrounds who potentially haven't really gone deep into the product side.
So what do you look at? Or, if you had a startup, how would you look at product-market fit at the moment, when it's becoming tighter and tighter every day?
Erik Schwartz: Product-market fit — it's always a tricky question.
I think a lot of companies, regardless of the tech backend they have, or the staffing they use, or how they implement, or how they go to market, are always stretching for PMF. They're always trying to find that niche for their own businesses.
So what does this allow you to do? What does the technology allow you to do?
If you talk to any startup founder: how many startups fail? What's that percentage failure rate? And those failures are fine — typically, from a product mindset, those are learnings, and those learnings are iterations. I tried something, it didn't work. I tried something, it didn't work. I tried something again, it didn't work. I tried something — this is resonating right now.
And so that's the tried and true methodology we can use to apply and find product-market fit for any new business we want to create.
So what does AI enable us to do? It enables us to do those iterations a lot quicker, and we can do them a lot cheaper.
It doesn't mean that they go away. It doesn't mean that we'll get it right the first time.
But then, as the AI gets smarter and smarter, it's going to say: oh, here's 10 ideas — instead of choosing one through 10, let's start at number seven, because we've tried that before and that works.
So that's going to be the evolution we're going to see.
And instead of hiring a team and having to get expensive engineering — you know, the whole measure twice, cut once; engineering, your staff and your labor, is one of the most expensive parts of building any new digital product — if I can go into a tool and just use words, prompts, to define what the tool is, to tweak it and manage what that thing is going to be, I can build that with a much smaller team, and I can scale it out and build on top of these tools.
Like, you know, Replit's come out with a whole set of tools right now where I can use prompts on my mobile device to talk to it and build any app that I want.
Now, again, they're not full scale, but they allow you to prove that idea out — whether there is product-market fit. You can build that MVP, that minimum viable product. You can scale it up to a certain level and validate whether it's going to work, at a much cheaper cost.
And then once you get that validation, you can go and make the right investment, and you know that it's going to scale.
Chris: Thanks for sharing — super interesting.
And, um, yeah, it wouldn't be a NeuralPod episode without asking a couple of recommender systems or search questions.
How do you see agentic AI affecting those specific markets and technologies?
Erik Schwartz: I think the SEO guys are a little bit nervous at this point — there certainly is some nervousness.
Although Google's not reporting any lack of traffic or any decrease in ad revenue, I think we're going to see that shift.
A lot of people you talk to now, instead of going to Google to look for something, go into ChatGPT and say: hey, give me an answer to this.
And so suddenly those ad impressions aren't happening anymore, and that organic traffic isn't happening through SEO.
Now, the traffic is still happening — something's visiting that website on your behalf — but it's coming through an agent, so the information is out there and it's being exchanged in a different way.
So it raises a really interesting question: do we need to be building websites for human consumption now, or do we need to be building websites for AI consumption?
How's that going to shift? Do I need to make it agent-accessible or not?
Now, you know, we relocated out to the southeast coast of England, so we're out in Kent, and I still see a lot of opportunity where there's a very light digital footprint — or very little digital literacy — in that space.
People don't really have websites for their businesses, or they don't put their hours online, or their menus online. Maybe they've gone as far as putting together a Facebook page, and that's about it.
And so I think there are still real opportunities for these systems to come online and really propagate. But it's going to evolve, it's going to change — it's early days on this question. But I think it's going to happen.
So that's one sort of thing — this is on the discovery side: customer discovery, finding people, finding services, finding goods in my neighborhood.
But once these local shop owners and local providers start figuring out how to use these tools, and realize that they can go into their phone and say, hey, put up a website that just talks about my business — that's it, and it'll be online, it'll be out there, it'll be up to date with all their products, all their services. Help me tweak my offers and adjust my pricing to be more competitive — I could say that to my app, and then it'll make me more discoverable.
And whether it's discoverable through a search engine or through social media, all those things will be taken care of automatically for me by an intelligent agent.
So that's a real opportunity. That's one side of the spectrum when we think about search and RecSys.
We think about search rexus, right?
The other side of the problem that
I'm super excited to go kind of solve.
And you asked me a little bit before,
like, what are, you know, we were
talking before this, we started
the recording about my predictions
for, um, for the rest of the year.
Um, We finally have a real path
forward to solve the enterprise
search right inside the organization.
This has been a problem for decades.
Um, and especially I talked a little
bit about my history working for
search engine vendors, working
at Microsoft, working with the
SharePoint team for a couple of years.
This is a really unsolved problem.
And what What the problem is an
enterprise search is large corporates
that generate a ton of information
and they put it in documents and power
points and wikis and slack channels
and teams messages like an email.
So it's all this information
and no one can find anything and
no one can find any projects.
And there's all this sort of waste
that happens inside the organization.
How do I find the policies?
What are the, you know, a vendor,
you know, uh, gave me a, gave me
a, uh, offered me a free dinner.
Can I accept this?
Right?
Or I got tickets to a football
game, to a football match.
Can I, is this legal within
our policy to accept it?
Or, um, hey, I need to take a holiday.
Um, how do I, how do I request like
sort of all these things about how
to, how does business work, right?
These were all sort of the problems.
Or how do I find an expert that
you can help me solve a problem
inside of my larger organization?
These are all problems that we've been
trying to solve with Enterprise Search.
For decades.
And they've been solved really poorly.
Now it still comes back to: you've got to know somebody, you talk to somebody, they refer you. We tap into our wetware inside the organization. You find that one person inside the team who knows everyone, who's a great connector; you talk to them and they know all the people you can talk to. And that's how we get stuff done inside these organizations.
That's going to get replaced. We're going to have an enterprise AI that has all the information, all the experts, all the data flowing in. Our customer data will be protected. It'll understand the rules, it'll understand the financial data, it'll understand who can get access to that and who can't.
We can allow executives to start to have conversations with the data. Help me really understand why we missed a forecast in Q3 for the Southeast region. Instead of having to spin up an analyst and have them go off for a month preparing for a meeting, let's just ask — let's find out right now, in the middle of the call, what happened and why we missed the number. And oh, there it is: an email didn't get across, or someone went on holiday for two weeks, we missed the end of the quarter and that's why it got pushed — some explanation will come out, and you're like, oh, okay, well that makes sense.
But I never would have known that — I never would have been able to surface this information with all these legacy tools that we have.
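[A minimal sketch of the permission-aware enterprise search idea Erik describes — one index over company content, with answers filtered by who is allowed to see what. The documents, roles and keyword matching are illustrative assumptions; a real system would inherit access controls from the source systems and put an LLM over the retrieved results.]

```python
# One index over company content, with results filtered by the caller's roles.
from dataclasses import dataclass


@dataclass
class Doc:
    text: str
    allowed_roles: set


INDEX = [
    Doc("Gifts policy: football tickets over 50 pounds need sign-off.", {"everyone"}),
    Doc("Q3 Southeast forecast missed due to late channel data.", {"finance", "exec"}),
]


def search(query: str, user_roles: set) -> list:
    """Return only documents the user is permitted to see that match the query."""
    terms = set(query.lower().split())
    visible = [d for d in INDEX if d.allowed_roles & (user_roles | {"everyone"})]
    return [d.text for d in visible if terms & set(d.text.lower().split())]


# An exec can ask about the missed forecast; other staff only see policy documents.
print(search("q3 forecast missed", {"exec"}))
print(search("football tickets policy", {"staff"}))
```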
Chris: Yeah, and that's pretty transformative, especially for companies who have been around for a long time and have a lot of legacy systems.
If they can bring all that data together — how far off do you actually think we are from solving that and having a potential working solution?
Erik Schwartz: So, AI is being deployed inside the enterprise right now. I think it's still very much in the sandbox phase.
Certain companies are starting to embrace it and starting to make discovery a bit more accessible, but there are still massive data problems, still massive policy problems, still massive volume problems that we have to tackle.
But the tools are there. We can start to see a roadmap now.
The challenge we had before was that the data problems were always there, but the tools just didn't allow us to ask the right questions. They didn't allow us to integrate these systems in a timely manner. Now we can see that path.
My crystal ball is that in the next couple of years we'll start to see some real impact from enterprise applications of these capabilities. What's going to be slower is the change management process inside the organization, not the technology. The technology is not going to be the barrier anymore.
Chris: Fair. And to touch on that, we'll potentially have an agentic Eric and an agentic Chris working for us in the not so distant future. Companies like Meta have built businesses on ad revenue. If agent Chris is now surfing the internet on my behalf, how do you think that will affect a business like Meta, who have built a business on ad revenue?
Erik Schwartz: Yeah. You know, we've all watched television, watched TV shows, watched movies, and we all hate the interruptions from ads. We've got ad blindness on websites and in lots of other places, and it's not a pretty thing, right? It distracts us from the content we're trying to get to. But the ad model has been pretty resilient to a number of technology advances over the last couple of decades. Everyone's been trying to come up with a better way to monetize content, a better way to generate revenue from it, and it's been tricky. It's been super hard, and nobody has come up with a better solve than ads yet.
That being said, I think there are a number of enabling technologies starting to come into play. Again, going back into Eric's crystal ball here and looking at the future: it's going to take a lot, and there's a lot of inertia around ads, just because it's a tried and true business model and people are making a lot of money. Google's an ad business, right? Ninety-plus percent of their revenue, and I don't have the exact numbers on hand, comes from advertising, so it's going to take a lot to move away from that, especially for businesses generating billions of dollars in revenue.
However, there is a confluence of things starting to come together. We think about artificial intelligence, and we think about distributed technologies, and I'm not necessarily talking about crypto, but the underlying blockchain technology, right? This notion of a distributed ledger, where I can attribute anything to anyone over time.
Take the example of buying a house. If you buy a house in the UK, what is the biggest bottleneck in buying that house? It's all the time the lawyers have to spend validating that you've got all the pieces of information that chain up: did they put the right windows in, are the right certifications in place, is everything compliant? Is there some historical status on the style of the house that has to be preserved? All these things need to be checked.
Now imagine that, over time, all these things for the house are put on the blockchain. Then you go into the blockchain and pull out all the records for this house, and in those records are all the compliance certificates, all the additions, the new windows that were put in, the new boilers, the heating certification, the roofing, all of it. All I had to do was pull it back, and it was just there in a record for me. I could see exactly who did the work and go talk to them, because there was this distributed ledger, a big distributed database of all this information. So anyway, a long, belabored point, but what I'm getting at is that I can now think about applying this model of a distributed ledger to a number of different businesses.
Think about health for a minute. What if I can tie every doctor's appointment I have to a record like that? And again, it's something that I own. I can give rights to somebody else to access it, but now I've got something that's incredibly valuable. Maybe I've got a DNA scan, or a particular genome, that could be incredibly valuable to a drug company trying to build a targeted cancer cure, or a medicine for a really rare disease. If I can capture all that medical information and share it in a controlled way, it becomes incredibly valuable, and I can create a value exchange with an organization that's looking for it.
So we start thinking about how we can leverage distributed technologies, blockchain technologies, on top of AI to deliver hyper-personalized solutions. That creates a new marketplace for value exchange, which allows us to then move off ads. Now, this is 10, 20 years in the future. It's going to take some time, just because these systems are large and have a lot of inertia, but you can see a path forward now.
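For a flavour of what "attribute anything to anyone over time" could look like, here is a toy append-only ledger of property records in Python. It is a sketch of the idea rather than a real blockchain (no consensus, no distribution): the record fields, hashing scheme, and example events are all illustrative assumptions.

```python
import hashlib
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Record:
    asset_id: str    # e.g. a property identifier
    event: str       # "new boiler installed", "windows certified", ...
    issued_by: str   # who attests to the event
    timestamp: float
    prev_hash: str   # hash of the previous record, giving tamper evidence

    def digest(self) -> str:
        return hashlib.sha256(json.dumps(asdict(self), sort_keys=True).encode()).hexdigest()

class Ledger:
    """Append-only chain of records; each entry commits to the one before it."""

    def __init__(self) -> None:
        self.records: list[Record] = []

    def append(self, asset_id: str, event: str, issued_by: str) -> Record:
        prev = self.records[-1].digest() if self.records else "genesis"
        rec = Record(asset_id, event, issued_by, time.time(), prev)
        self.records.append(rec)
        return rec

    def history(self, asset_id: str) -> list[Record]:
        return [r for r in self.records if r.asset_id == asset_id]

    def verify(self) -> bool:
        """Check that no past record has been altered."""
        prev = "genesis"
        for rec in self.records:
            if rec.prev_hash != prev:
                return False
            prev = rec.digest()
        return True

if __name__ == "__main__":
    ledger = Ledger()
    ledger.append("12-acacia-road", "double glazing installed and certified", "WindowCo Ltd")
    ledger.append("12-acacia-road", "boiler replaced, gas safety certificate issued", "HeatRight Ltd")
    for rec in ledger.history("12-acacia-road"):
        print(rec.event, "|", rec.issued_by)
    print("chain intact:", ledger.verify())
```

Because each record commits to the hash of the one before it, a buyer's lawyer could check that the history hasn't been quietly edited without having to trust whoever held the file.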
Chris: Yeah, I think blockchain is probably another podcast altogether.
Erik Schwartz: Yeah, we're going to leave the listeners hanging for that one, and we can come back to it another time. I think we should bring a Web3 expert in to talk about it, because I've really just skimmed the surface of that area. But I can see these things coming together. And we haven't even layered quantum on top of that, which potentially lets us compute these things near-instantaneously.
Chris: Yeah, we're moving at such a rapid pace. We originally got speaking about DeepSeek last week, which, as I said at the start of the podcast, is why we came together today. It feels like old news already, but let's discuss it. I think you raised some interesting points when we originally spoke about it democratizing the market and changing things, potentially in a positive way. What are your thoughts on the short and long term effects of it coming to market?
Erik Schwartz: Yeah. So we spoke at the beginning of this call about the different models that were coming out and how every model was sort of catching up with ChatGPT, maybe catching up and maybe surpassing it, but then ChatGPT would put out a new version, and there wasn't really anything fundamentally new coming out. In terms of the benchmarks, we were incrementally improving them bit by bit, month over month, quarter over quarter.
And then OpenAI released these things called reasoning models; they've been labelled o1 and o3. This gets into the first step of the agentic journey. What a reasoning model basically does is take a lesson from what our parents told us: Eric, don't just answer with the first thing that pops into your head when I ask you a complicated question. We learned this in school. Stop and think about it for a minute. Think about what the answer to the question is. This is what these models are doing now. They've been told, instead of just giving back the first answer that comes to you, stop and think about it for a minute. Have a look around and reason: what are the different possible solutions, different answers, different rationales for the question? So that's one thing that's starting to happen.
And DeepSeek is a reasoning model. Now, what was innovative about DeepSeek is that instead of training on all the raw data that ChatGPT or Anthropic or Gemini or Meta are building on, and the raw data is, you know, here's a website, here's a product, here's an apple, here's a picture of an apple, what they started doing is training on solutions, on plans.
They would take a plan, and there was some evidence that they were actually taking these completed plans either from their own internal model, a version called V3 that they were using themselves, or from ChatGPT. They were looking at these plans and saying, let me train on this entire plan: a series of steps, a workflow from start to finish. Here's the goal, here are the things I did, here was the outcome. They used that whole thing as a block of training. Then they created a mathematical system to reward the model for creating plans that were as efficient as the original plan it was trained on, and to throw out the other plans.
So instead of getting it to think in words and sentences, they got it to think in full plans. That's a big deal, right? And that's what we're starting to see in these reasoning models: it's standing on the shoulders of giants. To come up with a plan, the model first trains on the little elemental things a plan is built from, and then it trains on top of whole plans. It's like Lego blocks: you don't sit there and pour your own plastic. You have blocks of a certain size and you can stack them up to make houses and cars.
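To illustrate the flavour of that plan-level reward idea (this is not DeepSeek's actual training code, just a toy sketch under stated assumptions), here a "plan" is a list of steps with a known outcome, and the reward prefers candidates that reach the reference outcome in no more steps than the reference plan. All names and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Plan:
    goal: str
    steps: list[str]
    outcome: str

def plan_reward(candidate: Plan, reference: Plan) -> float:
    """Score a candidate plan against a reference (teacher) plan.

    +1 for reaching the same outcome, plus an efficiency bonus that grows as
    the candidate gets as short as (or shorter than) the reference; wrong
    outcomes score 0. A real system would feed this scalar into an RL update.
    """
    if candidate.outcome != reference.outcome:
        return 0.0
    efficiency = min(1.0, len(reference.steps) / max(1, len(candidate.steps)))
    return 1.0 + efficiency

if __name__ == "__main__":
    reference = Plan(
        goal="report why the Q3 forecast was missed",
        steps=["pull CRM data", "compare forecast vs. actuals", "summarise the gap"],
        outcome="gap explained",
    )
    candidates = [
        Plan(reference.goal,
             ["pull CRM data", "compare forecast vs. actuals", "summarise the gap"],
             "gap explained"),
        Plan(reference.goal,
             ["pull CRM data", "email five teams", "wait a week", "compare", "summarise the gap"],
             "gap explained"),
        Plan(reference.goal, ["guess"], "no explanation"),
    ]
    # Reward efficient plans highly and throw out the rest, as Erik describes.
    for plan in sorted(candidates, key=lambda p: plan_reward(p, reference), reverse=True):
        print(f"{plan_reward(plan, reference):.2f}  {plan.steps}")
```

The point is that the training signal scores whole trajectories rather than individual tokens, which is the shift Erik is describing.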
So that was one big thing I think was really revolutionary. The other thing they did, which got everyone head-scratching a bit, and this is where it got a little competitive, is that, and we weren't a hundred percent sure what chips they were using, due to export controls put in place by the US they didn't have access to the latest and greatest GPUs from Nvidia that the American labs have. These GPUs cost something like $40,000 a piece. And let's be clear, they had a lot of chips, but they were memory-constrained chips: chips with slightly less memory and slightly slower bandwidth than the others.
So they did some really incredible math to take advantage of that constrained environment, and they were able to train the model at a fraction of the cost. Instead of a run costing tens or hundreds of millions, they did a training run that cost about $5 million in compute, which is an order of magnitude smaller than the big labs.
So, one, they were able to train on plans, building on top of logic that was built on top of plans, and reward the system for good plans. Two, they were able to work within a constrained environment and do it a lot cheaper. And three, they made it completely open source, so people could repeat it, put it out there again, and replicate the results.
People came at it from all kinds of angles. Is this a psyop? It was super convenient that it launched on the same day Trump announced his tariff plans: was this done by the government? That's 4D chess in my world. I think there's some evidence this was a natural progression. If you look over the last year, year and a half, maybe the timing was a bit off, but they had been working on this thing for a good year or two, and you can see the progress they were making and the direction they were going. So in my mind it was a case of necessity being the mother of invention: when we give ourselves constraints, we find ways around those constraints. And that's what happened.
And I think the other thing is that it came from China. Suddenly we're saying, hey, we've been paying all this attention to what's going on in America, and we know TikTok is happening, with dancers doing some cool stuff over there, but suddenly this thing came out of labs in China. That has really levelled the playing field in terms of who's contributing the most valuable frontier models.
Chris: Yes, it's super interesting. It's going to be interesting to see how, like I say, not the next year but the next six months plays out. I think that's as far as anyone dares look into the future at the moment. And just a closing question, Eric: what are you most excited to see as we progress into this year? It feels like a super exciting start to the year. What's exciting you the most?
Erik Schwartz: There's so much. I'm super excited about a number of things. One, I'm really enjoying this frenetic pace of change. It's been a lot of fun just to watch it and see the tools evolve. And we haven't really cracked into audio creation, image creation, video creation, right? There's this whole notion of multimodality that we haven't really spoken about at all, and that's coming at an incredible pace.
We're also seeing a ton of stuff in the robotics field. Robotics was always these quirky arms that you kind of moved up and down, doing some cool things in manufacturing and cars. But now it's not just your Roomba anymore: real humanoid robots are coming to market quickly, learning to navigate very complex environments based on these models. So cool. So we have multimodality, and we have multiple form factors coming out.
And I think what I'm most excited about, and this is something we didn't hit on yet, literally came out yesterday from OpenAI. They realised internally that they're releasing all these things piecewise: I've got a reasoning model here, a planning model here, something that can control my web browser there, and something that can go do some deep research. Well, this is all going to come together in a really cohesive package.
As a user, I'm not going to have to go off and say, for this particular request, create a task and run it every day; for this one, go use a web browser to figure it out; and for this one, use a reasoning model. I'm going to be able to sit down with the AI and say, here's the problem I want to solve, here's the goal I have in mind. Thinking agentically, it's going to figure out how to use its internal models, their strengths and weaknesses, which bits to use for what, assemble that for you, and then go solve the problem. So that problem of having to figure out and navigate all these pieces yourself is going to go away.
I think that's the thing that's the most exciting.
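A minimal sketch of what that kind of routing could look like, with the caveat that the capability names, keyword rules, and dispatch logic are all invented for illustration; in the packaged products Erik describes, a model would choose the tool rather than hard-coded keyword rules.

```python
from typing import Callable

# Hypothetical capabilities the assistant could dispatch to.
def reasoning_model(task: str) -> str:
    return f"[reasoning] worked through: {task}"

def web_browser(task: str) -> str:
    return f"[browser] looked up: {task}"

def scheduled_task(task: str) -> str:
    return f"[scheduler] will run daily: {task}"

# Ordered routing table: (trigger keywords, handler). Empty keywords = default.
ROUTES: list[tuple[tuple[str, ...], Callable[[str], str]]] = [
    (("every day", "daily", "remind"), scheduled_task),
    (("latest", "price", "news", "look up"), web_browser),
    ((), reasoning_model),
]

def solve(request: str) -> str:
    """Pick a capability for the request; the user never names the tool."""
    lowered = request.lower()
    for keywords, handler in ROUTES:
        if not keywords or any(k in lowered for k in keywords):
            return handler(request)
    return reasoning_model(request)

if __name__ == "__main__":
    print(solve("Check the latest GPU prices for me"))
    print(solve("Send me a summary of my inbox every day"))
    print(solve("Why did our Q3 forecast slip?"))
```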
And then this is going to show up on my phone, and I'm going to have access to it anytime I want. I'll be able to create applications, I'll be able to solve problems. It's The Matrix: I need to learn how to fly a helicopter, tell me how to ride a motorcycle, and it'll effectively be there, right?
Chris: Nice. Yeah, it's similar for me: as a solo founder, agentic workflows are a blessing, even with the market changing so much since I set up the business last year. Using AI in the business, and assistants, is what I'm really looking forward to. We're also using Replit: a customer had a particular problem and we've used a no-code app built on Replit to help them solve it. You know, I can't code, and having things like that at your fingertips is amazing. So,
Erik Schwartz: No-code tools. Lovable is out there, Replit is out there. Those of us without these heavy skills aren't going to need them anymore to go build things and solve some simple problems. So it's exciting. I agree.
Chris: Certainly. Well, look, thank you so much for coming on and for sharing your knowledge. I've actually learned a lot from you today, so I'd like to say thank you, and I appreciate your time. And if anyone's got a question about AI adoption in their organization, are you happy for them to reach out to you?
Erik Schwartz: Absolutely. Yeah, please do reach out. Let's connect on socials and have that conversation. And, this is exciting, we do run a monthly dinner in London, and we'd love to have people come and share their stories there. I'll post that online as well. It's a great place to learn. Join the community; we're all learning from each other, right? That's one of the biggest things I say when I go into an organization. Someone says, Eric, you've got this great expertise, and I say, you know what, you've got the business understanding. You know how your business works, and together we learn how to solve these things. And that's, I think, the most fulfilling thing right now: when you work with new clients, when you meet new friends and new colleagues, when you join the community and we share how we solve things.
Chris: Nice. Yeah, what I'll do is put the link to the dinner in the description as well, and if anyone's got any questions, they can reach out. But thank you so much. It's been great to chat.
Erik Schwartz: Chris, it's been great to chat to you as well. Thanks so much, and thanks for the amazing questions, really good. I love where you're going with all these things, and I think they're on top of a lot of people's minds, so it's great to be able to put them out there and have a conversation about them.
Chris: Thank you so much.