WEBVTT

NOTE
This file was generated by Descript 

00:00:01.170 --> 00:00:05.560
Victoria: Welcome to The Chemical Show,
the podcast where chemical means business.

00:00:05.930 --> 00:00:09.710
I'm your host, Victoria Meyer,
bringing you stories and insights

00:00:09.710 --> 00:00:13.620
from leaders driving innovation and
growth across the chemical industry.

00:00:14.529 --> 00:00:18.830
Each week, we explore key trends,
real world challenges, and the

00:00:18.830 --> 00:00:20.580
strategies that make an impact.

00:00:20.879 --> 00:00:21.959
Let's get started.

00:00:23.243 --> 00:00:26.873
Today we've got a great
conversation focusing on the use

00:00:26.873 --> 00:00:29.023
of technology and AI in business.

00:00:29.373 --> 00:00:33.863
I've got Alan Spanos, who is the
director of data solutions with ICIS

00:00:34.203 --> 00:00:39.073
and Chad Alan, who is the director
of technology strategy at ICIS.

00:00:39.443 --> 00:00:42.223
We're going to be talking
about technology, the role of

00:00:42.263 --> 00:00:44.013
AI in business and analytics,

00:00:44.423 --> 00:00:46.393
risks and solutions, and more.

00:00:47.073 --> 00:00:48.543
Welcome to The Chemical Show, guys.

00:00:49.273 --> 00:00:49.413
Alan: Thanks.

00:00:49.413 --> 00:00:50.463
We appreciate it.

00:00:50.493 --> 00:00:50.963
Thanks for having us.

00:00:51.643 --> 00:00:54.333
Victoria: Absolutely. Alan, let's
start with you. Can you just give a

00:00:54.433 --> 00:00:58.743
brief intro to who you are and how
you got here in your role at ICIS?

00:00:59.324 --> 00:00:59.813
Alan: Sure thing.

00:00:59.813 --> 00:01:02.063
So I didn't start my career in chemicals.

00:01:02.113 --> 00:01:07.253
Um, I started in a management consultancy
company after doing aerospace engineering

00:01:07.263 --> 00:01:13.843
at university. And then I spent about 14
years in aviation in the UK in a variety

00:01:13.843 --> 00:01:18.708
of roles, moving on to data and
data engineering roles after that and

00:01:18.708 --> 00:01:23.648
leading an internal practice in
that area within the UK's largest airline.

00:01:24.108 --> 00:01:27.478
And then at the end of the pandemic,
I moved on to work for  ICIS,

00:01:27.878 --> 00:01:30.958
um, performing the same kind
of role, but for our customers.

00:01:30.958 --> 00:01:33.938
So making sure they have what
they need to do their job.

00:01:34.738 --> 00:01:35.078
Victoria: Awesome.

00:01:35.918 --> 00:01:36.658
Chad, how about you?

00:01:37.558 --> 00:01:40.658
Chad: So I'm, I'm from a technology
background, so I've been working,

00:01:40.698 --> 00:01:44.188
uh, in the parent company
for ICIS for about 18 years.

00:01:44.608 --> 00:01:47.018
Um, I've been with ICIS
for about eight years.

00:01:47.278 --> 00:01:50.198
Uh, so I've come from a computer
science background and,  just really

00:01:50.198 --> 00:01:53.948
into technology, uh, but,  got into
ICIS because of the interesting

00:01:53.948 --> 00:01:56.848
data and all of the very interesting
problems you can solve here.

00:01:57.848 --> 00:02:01.218
Victoria: So your role today is
director of data technology strategy.

00:02:01.218 --> 00:02:01.558
What is it?

00:02:02.008 --> 00:02:04.258
What does that mean in the
grand scheme of things?

00:02:05.338 --> 00:02:05.998
Chad: Great question.

00:02:06.108 --> 00:02:10.068
Um, so it, it means I do a lot of
looking at sort of the long-term view

00:02:10.428 --> 00:02:13.478
on the business and how
technology helps support the

00:02:13.478 --> 00:02:16.018
transformation of the business where
it needs to go, looking at sort of

00:02:16.018 --> 00:02:19.028
emerging trends and, you know, things
that are going to make the difference

00:02:19.028 --> 00:02:20.708
both for us and for our customers.

00:02:21.148 --> 00:02:21.488
Victoria: Awesome.

00:02:21.578 --> 00:02:23.708
Well, that'll be perfect then
for today's conversation.

00:02:23.998 --> 00:02:26.868
Before we jump into a little
bit more about technology,

00:02:27.348 --> 00:02:32.098
can you guys just explain a bit
about who ICIS is and who they serve?

00:02:32.998 --> 00:02:34.588
Alan: Yeah, I can jump in on that one.

00:02:34.588 --> 00:02:40.148
So, um, ICIS is effectively an
information and analytics provider for

00:02:40.178 --> 00:02:41.928
the chemical and energy industries.

00:02:42.408 --> 00:02:47.008
We have a variety of different services
and products that we offer from price

00:02:47.008 --> 00:02:50.948
assessments for different raw materials
and commodities, but also analytical

00:02:50.948 --> 00:02:55.408
services for things like supply
and demand analytics, and price

00:02:55.428 --> 00:02:59.008
forecasts and margin analytics in the
industries that we serve. Um, we're also

00:02:59.008 --> 00:03:04.328
part of RELX, which is a global data
analytics information provider, uh,

00:03:04.368 --> 00:03:10.768
working across many industries from
health care, science, legal and data

00:03:10.778 --> 00:03:15.028
services, which is what we work in,
uh, covering areas like chemicals, but

00:03:15.028 --> 00:03:17.308
also aviation and other industries.

00:03:17.953 --> 00:03:18.273
Victoria: Awesome.

00:03:18.273 --> 00:03:18.963
Thank you for that.

00:03:18.973 --> 00:03:22.703
In fact, uh, I had not appreciated
just how big the parent company

00:03:22.703 --> 00:03:25.933
was and all the pieces it
touches, which is really exciting.

00:03:26.233 --> 00:03:29.653
And then, listeners of The Chemical
Show will know that I have talked

00:03:29.653 --> 00:03:34.333
with one of your colleagues, John
Richardson,  from ICIS many times.

00:03:34.343 --> 00:03:38.903
So we'll, in fact, link to some of John's
episodes on, uh, on our show notes.

00:03:38.903 --> 00:03:42.833
And as we promote this episode,
uh, because John, of course, always

00:03:42.853 --> 00:03:46.568
brings a wealth of insights into
what's going on in the world of

00:03:46.588 --> 00:03:48.218
chemicals and polymers and more.

00:03:48.298 --> 00:03:53.248
So let's just get into this in
terms of what do you see as the role

00:03:53.318 --> 00:03:56.098
of technology in chemicals today?

00:03:57.248 --> 00:04:01.378
Alan: From a business perspective,
the industry is very competitive.

00:04:01.448 --> 00:04:04.398
So I think,  what you see when we
talk to our customers and the industry

00:04:04.398 --> 00:04:07.538
in general is that everybody's
looking for competitive advantage.

00:04:07.538 --> 00:04:11.228
And one way of delivering that
is trying to be as efficient and

00:04:11.228 --> 00:04:15.338
creative as possible with new
technology that you're able to deploy.

00:04:15.793 --> 00:04:21.173
Um, so we see customers talking about
fantastic innovation in their operational

00:04:21.173 --> 00:04:26.313
processes, making sure they can really
get the most out of what they're producing

00:04:26.333 --> 00:04:27.773
and who they're producing it for.

00:04:28.133 --> 00:04:34.613
And also more recently, and in the space
that we operate in,  making the most

00:04:34.613 --> 00:04:36.133
of understanding the market correctly.

00:04:36.133 --> 00:04:38.483
So, um, how much should you produce?

00:04:38.883 --> 00:04:40.223
Who should you sell it to?

00:04:40.603 --> 00:04:41.743
How should you use it?

00:04:42.033 --> 00:04:45.223
Those are all types of questions
that we can help with,  and we're

00:04:45.303 --> 00:04:49.383
seeing customers digitize those
processes,  and try to apply AI to

00:04:49.383 --> 00:04:53.763
them as much as possible to make better
decisions and to really do things

00:04:53.763 --> 00:04:55.643
faster in such a changing environment.

00:04:55.897 --> 00:04:57.007
Victoria: Yeah, that's interesting.

00:04:57.472 --> 00:04:58.872
You know, you talk about technology.

00:04:58.872 --> 00:05:01.512
In fact, what comes to mind is
how the supply guys

00:05:01.512 --> 00:05:05.032
especially would run things like
an LP model, linear programming.

00:05:05.262 --> 00:05:08.092
And yet, I don't know if
we talk about that anymore.

00:05:08.122 --> 00:05:10.812
Is that a predecessor to AI?

00:05:10.812 --> 00:05:12.512
Is it part of AI?

00:05:12.892 --> 00:05:14.552
Where does something
like that fit in?

00:05:14.897 --> 00:05:18.857
Alan: Yeah, so I think, um, in my whole
career, probably everything I've done up

00:05:18.857 --> 00:05:24.437
until now, today we would call AI, and
it's such an umbrella term for, kind of,

00:05:24.707 --> 00:05:26.597
lots of things. In my mind, it's mathematics.

00:05:27.017 --> 00:05:32.977
So, you start from incredibly simple
formulas, linear programs, or just

00:05:33.167 --> 00:05:37.807
basic mathematics you'd learn at primary
school, in effect, in the UK, all the

00:05:37.807 --> 00:05:41.307
way up to the modern version of what
you'd call generative AI these days,

00:05:41.317 --> 00:05:47.107
which would be much more complex and,
I suppose difficult mathematics, right?

00:05:47.107 --> 00:05:50.597
And I think the difference these
days is we have so much more data

00:05:50.597 --> 00:05:53.217
available to pump into those models.

00:05:53.577 --> 00:05:55.877
And also the compute
power is a lot cheaper.

00:05:55.877 --> 00:06:00.963
So often people will give you the kind of
adage that when I was running an Atari

00:06:01.363 --> 00:06:04.980
as my games console back in the
day, my phone has got 10 times

00:06:04.980 --> 00:06:08.010
more processing power than that,
or a thousand times, right?

00:06:08.010 --> 00:06:09.260
And I just put it in my pocket.

00:06:09.290 --> 00:06:13.350
So you just have a lot more compute power
and a lot more data and better mathematics

00:06:13.350 --> 00:06:14.610
to solve those problems these days.
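
NOTE
The "LP model" mentioned above sits at the simple end of the spectrum Alan describes. Below is a minimal sketch of such a linear program in Python; the products, constraints, and margins are invented purely for illustration, and real planning models are far larger and use dedicated solvers.

```python
# Toy linear program: choose production volumes x, y of two products
# to maximize margin subject to capacity limits (invented numbers).
from itertools import combinations

# Each constraint is a*x + b*y <= c (the last two encode x >= 0, y >= 0).
constraints = [
    (1, 2, 14),   # shared feedstock: x + 2y <= 14
    (3, 1, 18),   # reactor hours:    3x + y <= 18
    (-1, 0, 0),   # x >= 0
    (0, -1, 0),   # y >= 0
]

def profit(x, y):
    return 30 * x + 40 * y  # margin per unit of each product

def solve_lp():
    # An optimum of a linear program lies at a vertex of the feasible
    # region, so enumerate intersections of constraint-boundary pairs
    # and keep the feasible one with the highest margin.
    best = (0.0, 0.0)
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries: no single intersection point
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
            if profit(x, y) > profit(*best):
                best = (x, y)
    return best
```

Solving this toy instance gives x = 4.4, y = 4.8 for a margin of 324; a real model swaps the vertex enumeration for a solver such as scipy.optimize.linprog but rests on the same idea.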

00:06:14.970 --> 00:06:15.470
Yeah, and it's,

00:06:15.480 --> 00:06:17.170
Chad: it's all, it all builds, right?

00:06:17.240 --> 00:06:21.090
It's not like, it's not like AI emerged
overnight and it was, it was everything

00:06:21.090 --> 00:06:24.550
was a whole brand new, you know, so
all those techniques, we would have

00:06:24.550 --> 00:06:29.100
looked at any kind of data analytics,
historically, all that mathematics stuff.

00:06:29.100 --> 00:06:29.550
That is the

00:06:29.730 --> 00:06:30.850
core of what fed into AI.

00:06:30.850 --> 00:06:34.340
A lot of the core things around AI
and how it works are actually very

00:06:34.340 --> 00:06:37.950
old ideas, but the technology in
terms of the compute power and all

00:06:37.950 --> 00:06:39.340
of that just needed to catch up.

00:06:39.450 --> 00:06:43.428
And once it became powerful enough
and the cloud computing revolution

00:06:43.438 --> 00:06:48.178
happened, all of a sudden we could
start just throwing tons and tons of

00:06:48.178 --> 00:06:50.358
processing power at solving problems.

00:06:50.718 --> 00:06:53.538
And that's over the years become
more and more powerful to the

00:06:53.538 --> 00:06:57.298
level you're now seeing with like
generative AI and really getting into

00:06:57.308 --> 00:07:00.168
what people have always thought of
as being artificial intelligence.

00:07:01.473 --> 00:07:04.053
Victoria: Yeah, I mean, and in fact,
I think when I first heard the term

00:07:04.073 --> 00:07:09.103
artificial intelligence, which was
probably 20 or more years ago, I was like,

00:07:09.143 --> 00:07:12.033
what, holy heck, are you talking about?

00:07:12.423 --> 00:07:14.843
Um, and I think people still are.

00:07:14.853 --> 00:07:19.963
What the holy heck are you talking about,
and are machines taking over the world?

00:07:20.193 --> 00:07:23.543
So I think we'll get, we'll get into
that, 'cause I think that falls into

00:07:23.543 --> 00:07:27.153
some of the risks maybe later, but
when you think about,  what's going on

00:07:27.153 --> 00:07:29.293
with technology, what are your clients asking?

00:07:30.508 --> 00:07:31.638
And how are they engaging?

00:07:31.648 --> 00:07:34.798
Because I think of ICIS as a data company.

00:07:34.958 --> 00:07:38.608
Um, at least that's  my lens on it
is that there's always this wealth

00:07:38.608 --> 00:07:41.488
of data that, uh, clients go to you.

00:07:41.488 --> 00:07:46.338
I go to you, other people go to
you for data and information.

00:07:46.658 --> 00:07:49.388
So what's different
today in terms of what

00:07:49.718 --> 00:07:50.888
your clients are asking.

00:07:51.968 --> 00:07:54.478
Alan: Yeah, I think if I start
with,  under that umbrella of

00:07:54.488 --> 00:07:59.158
AI, kind of what you'd term as
more traditional AI use cases.

00:07:59.298 --> 00:08:03.758
I actually run the strategy and work
with customers every day looking

00:08:03.758 --> 00:08:06.118
at data solutions as my job title.

00:08:06.118 --> 00:08:07.368
And what does that mean?

00:08:07.368 --> 00:08:08.188
That kind of means

00:08:08.543 --> 00:08:11.963
customers who aren't comfortable, or
whose needs aren't met,

00:08:11.963 --> 00:08:13.813
just reading content from our website.

00:08:14.253 --> 00:08:16.863
They want it in bulk and they want
to feed it into systems and they

00:08:16.863 --> 00:08:19.313
want to run their own AI models
and things like that from it.

00:08:19.323 --> 00:08:23.053
What we see these days is that's
becoming more common that customers

00:08:23.213 --> 00:08:24.473
in the industry want to do that.

00:08:25.083 --> 00:08:29.903
And probably the main driving
reasons are we're needing to make

00:08:29.933 --> 00:08:33.253
decisions more quickly because the
environment around us is changing.

00:08:33.843 --> 00:08:37.793
And to do that manually
these days is just not sufficient.

00:08:37.803 --> 00:08:39.833
You just can't run that
in a spreadsheet anymore.

00:08:40.653 --> 00:08:42.083
People want to be more efficient.

00:08:42.143 --> 00:08:44.843
As I say, with the
competition in the industry,

00:08:44.853 --> 00:08:49.733
everybody's looking for marginal gains in
terms of shaving time off of processing

00:08:49.733 --> 00:08:54.603
and running their operational processes
and with the data available, they

00:08:54.603 --> 00:08:56.633
can now make smart decisions as well.

00:08:56.643 --> 00:09:02.423
So they want to run smarter algorithms
against our data and their own.

00:09:02.813 --> 00:09:06.413
And that's really what we're seeing:
customers are pushing forward and those

00:09:06.443 --> 00:09:08.103
solutions are becoming more popular.

00:09:08.873 --> 00:09:13.173
I think Chad can give you a good
answer as well about kind of the more

00:09:13.503 --> 00:09:16.443
generative AI kind of use cases and
what we're doing there as well.

00:09:16.587 --> 00:09:18.090
Victoria: Yeah.

00:09:18.187 --> 00:09:20.223
Chad: I think that's,
that's a different space.

00:09:20.233 --> 00:09:23.773
So I think the perception
of us as a data provider:

00:09:23.813 --> 00:09:24.183
that's true.

00:09:24.183 --> 00:09:26.993
We certainly are a data
provider and we do create that.

00:09:26.993 --> 00:09:29.823
But we do see ourselves very
much in an analytic space, right?

00:09:30.083 --> 00:09:31.433
We do some pretty cool stuff there.

00:09:31.598 --> 00:09:33.398
The gen AI I think is a
slightly different beast.

00:09:33.454 --> 00:09:37.664
I think the explosion of ChatGPT and
what that actually means because,  it's

00:09:37.664 --> 00:09:41.454
really going into the whole thing of
looking at individual productivity, right?

00:09:41.784 --> 00:09:44.694
And  in some ways it's quite
difficult 'cause it's sort of

00:09:44.694 --> 00:09:47.274
like it can do so much, but, you
know, where do you get started?

00:09:47.794 --> 00:09:51.214
So we're seeing like different levels
with our customers in terms of like.

00:09:51.439 --> 00:09:55.539
We have some who are quite savvy with
it and get it very quickly in that.

00:09:55.579 --> 00:09:59.089
And you can see them immediately
saying, I use ChatGPT to do this.

00:09:59.089 --> 00:10:03.229
I'm using Ask ICIS to
do all of these tasks.

00:10:03.419 --> 00:10:06.049
And what they're getting out of it
is a lot of productivity, right?

00:10:06.059 --> 00:10:10.609
It means they spend less time doing
the jobs than they did before.

00:10:10.989 --> 00:10:13.379
And in other areas, people are still
trying to get their heads around it.

00:10:13.899 --> 00:10:18.899
The difference between  using a
chatbot based solution versus using

00:10:18.899 --> 00:10:21.809
a traditional search where they're
still just trying to type in a keyword

00:10:22.189 --> 00:10:24.449
and just trying to get results.

00:10:24.449 --> 00:10:28.409
But it is evolving to that
space of kind of asking questions and

00:10:28.409 --> 00:10:31.929
trying to have a conversation with
an artificial person, if you want.

00:10:31.989 --> 00:10:33.199
But those who can really get it,

00:10:33.579 --> 00:10:36.539
they can really get an advantage and
there's a lot there about the skill set

00:10:36.539 --> 00:10:40.539
of actually asking good questions is
something that everyone really

00:10:40.539 --> 00:10:43.839
has to start learning in the world of
the AI revolution, as I like to say.

00:10:44.139 --> 00:10:44.779
Victoria: Absolutely.

00:10:44.839 --> 00:10:47.769
In fact, that whole aspect
of prompt engineering, right?

00:10:47.789 --> 00:10:51.239
Figuring out what questions to
ask, and then to keep asking.

00:10:51.909 --> 00:10:55.519
Um, I know myself just from my
use of ChatGPT over the past

00:10:55.519 --> 00:10:57.679
two years, and how it's evolved.

00:10:57.729 --> 00:11:02.559
I can get to the types of answers I'm
looking for rather quickly because I

00:11:02.559 --> 00:11:07.879
know exactly how to start asking the
questions, how to amend those questions,

00:11:07.889 --> 00:11:11.409
how to keep driving it to a solution,
but it's,  you know, the first question

00:11:11.409 --> 00:11:16.249
you ask is usually, just like in real
life, frankly, the first question you

00:11:16.249 --> 00:11:18.059
ask is probably not the right question.

00:11:18.249 --> 00:11:20.029
You have to keep asking questions.

00:11:20.494 --> 00:11:21.914
Chad: Yeah, absolutely.

00:11:22.024 --> 00:11:25.184
That's very much the case
of, advanced users with, with

00:11:25.184 --> 00:11:26.684
looking at using AI technology.

00:11:26.684 --> 00:11:29.604
They'll ask a question and if it doesn't
give you the answer to the one, you just

00:11:29.604 --> 00:11:33.584
go and reframe your question a bit, you
know, knowing that it's smart and it

00:11:33.604 --> 00:11:37.464
has this amazing amount of knowledge, but
it's not always that smart and maybe it

00:11:37.484 --> 00:11:40.894
just needs a little more context and help
from you to, to get the answer you want.

00:11:41.724 --> 00:11:44.064
Victoria: So let's talk a
little bit about Ask ICIS.

00:11:44.064 --> 00:11:48.974
So I know that you've recently rolled
out this Ask ICIS, which I guess is

00:11:48.974 --> 00:11:54.754
an AI-based tool to really, I guess,
interrogate the data and get answers.

00:11:54.754 --> 00:11:55.054
Right?

00:11:55.054 --> 00:11:58.464
So can you talk about that
and how you guys are using it.

00:11:59.364 --> 00:12:03.284
Chad: Yeah,  Ask ICIS is our
generative AI assistant, as

00:12:03.284 --> 00:12:04.524
we call it, that sort of

00:12:05.234 --> 00:12:08.414
specializes in the chemicals
and energy sector.

00:12:08.974 --> 00:12:12.734
And it's there to, you know, like
ChatGPT, you can ask it questions and it

00:12:12.734 --> 00:12:14.014
will answer those questions for you.

00:12:14.194 --> 00:12:16.754
Unlike ChatGPT, it's not
just a general purpose tool.

00:12:16.904 --> 00:12:19.094
You know, it's not there for
answering every question.

00:12:19.104 --> 00:12:23.564
It's focused specifically in the markets
and it's built on the data and content

00:12:23.564 --> 00:12:25.764
that we're taking and creating within ICIS.

00:12:25.774 --> 00:12:29.804
So when you ask a question about,
like what's happening in the styrene

00:12:29.804 --> 00:12:31.654
market, or what's going on there.

00:12:31.874 --> 00:12:35.754
It understands more context about
what styrene is and what you've used

00:12:35.754 --> 00:12:37.514
it for and it's pulling content from

00:12:38.184 --> 00:12:41.384
our price assessments or
forecasts or from our news

00:12:41.634 --> 00:12:43.004
to help answer those questions.

00:12:43.054 --> 00:12:45.124
And it gives you a much more
powerful answer as a result.

00:12:46.024 --> 00:12:46.734
Victoria: So what is it?

00:12:46.744 --> 00:12:47.764
So what are the use cases?

00:12:47.774 --> 00:12:51.174
How are you seeing your clients
using it and really getting value?

00:12:52.374 --> 00:12:55.739
Chad: So I think the big thing
there is people are looking at

00:12:55.739 --> 00:12:57.269
a really key time saver, right?

00:12:57.269 --> 00:13:01.639
It's that they can get the job done
a lot quicker  than, um, just going

00:13:01.639 --> 00:13:02.809
and doing the work themselves.

00:13:02.869 --> 00:13:03.169
You know?

00:13:03.289 --> 00:13:05.899
So we have some people who are
saying that they, they go and use

00:13:05.899 --> 00:13:07.399
it and they can save hours of work.

00:13:07.769 --> 00:13:11.099
It would normally take them hours to go look
at our content, look at other content

00:13:11.199 --> 00:13:14.149
elsewhere, and ask that, and now they
can just have a single session with Ask

00:13:14.149 --> 00:13:19.054
ICIS, ask some of those questions and
get what they need, you know,  and it

00:13:19.054 --> 00:13:23.044
accesses all the content we have around
market dynamics and companies, topics,

00:13:23.044 --> 00:13:25.284
events,  supply and demand fundamentals.

00:13:25.284 --> 00:13:25.864
And so

00:13:26.084 --> 00:13:28.694
Victoria: I'm assuming is it, I'm
assuming it's available only to

00:13:28.694 --> 00:13:32.554
people that are already subscribing
to your services and data.

00:13:32.554 --> 00:13:33.074
Is that right?

00:13:33.194 --> 00:13:33.924
Yeah, that's correct.

00:13:34.358 --> 00:13:36.978
Alan: I think that the important
thing that we do with Ask

00:13:36.978 --> 00:13:37.418
ICIS,

00:13:37.438 --> 00:13:41.728
is that some of the, like, Bard, I
think does it now from Google and some

00:13:41.738 --> 00:13:45.648
of the other Gen AI tools is originally
they didn't tell you what information

00:13:45.648 --> 00:13:47.118
they were using to give you the answer.

00:13:47.658 --> 00:13:50.508
So, in effect, they would just
say, hey, here's the truth.

00:13:50.548 --> 00:13:51.908
And a lot of the time it wasn't right.

00:13:51.928 --> 00:13:53.108
It was a lie, right?

00:13:53.118 --> 00:13:55.178
It was the AI getting it wrong.

00:13:55.178 --> 00:13:58.748
And what we've done from the beginning
and some of the learnings we've taken

00:13:58.758 --> 00:14:02.708
from our sort of sister companies
across RELX was to ground the answers

00:14:02.708 --> 00:14:04.028
in the articles that we already have.

00:14:04.028 --> 00:14:04.088
So,

00:14:04.658 --> 00:14:08.878
we'll give you an answer and say, hey,
you've asked about styrene in this market.

00:14:08.938 --> 00:14:10.198
Here's what we think is going on.

00:14:10.538 --> 00:14:13.298
And here are the relevant articles that
we've used to create this response.

00:14:13.298 --> 00:14:16.538
And then you can click through if
you subscribe to those and see.

00:14:16.883 --> 00:14:19.533
And if you don't, you can talk
to us about whether you want to get a

00:14:19.533 --> 00:14:20.773
subscription to those articles.

00:14:20.783 --> 00:14:25.073
So trust for us is a really big
thing in using AI, and I know

00:14:25.073 --> 00:14:26.253
we'll go on  to talk about that.

00:14:26.253 --> 00:14:29.073
But I think I would never
trust the answers if I didn't

00:14:29.073 --> 00:14:29.943
know where they came from.

00:14:29.943 --> 00:14:31.190
And that's really

00:14:31.191 --> 00:14:31.607
important.

00:14:31.607 --> 00:14:34.903
Chad: Yeah, that was when we
were developing Ask ICIS.

00:14:34.913 --> 00:14:38.343
That was really the heart of where
we started, which is, you know, our

00:14:38.373 --> 00:14:39.983
customers really value the content we do.

00:14:39.983 --> 00:14:41.183
They know they can trust us.

00:14:41.413 --> 00:14:45.133
They know we do our research, we
do know what we're doing, and the

00:14:45.133 --> 00:14:46.863
tool must do the same in that sense.

00:14:46.873 --> 00:14:50.693
So, it's very much built into the
actual design, to make sure that we

00:14:50.693 --> 00:14:56.233
know that if you see it, it's something
that somebody in ICIS has said.

00:14:56.233 --> 00:15:00.453
Because that's, trust is the absolute
most important thing, and when you

00:15:00.463 --> 00:15:02.963
just use a ChatGPT or something like that,

00:15:03.463 --> 00:15:08.633
they talk very confidently, right,
when you use these services, and that

00:15:08.793 --> 00:15:11.913
always gives you the suggestion that
it's always right, but it's not.

00:15:11.913 --> 00:15:15.973
And it's very hard to tell
when it's right or not.

00:15:15.973 --> 00:15:20.183
If you can't go and say, can I
just look where that content came

00:15:20.183 --> 00:15:22.313
from to see if that's correct.

00:15:22.823 --> 00:15:25.673
Victoria: Well, and  I really
appreciate that as well, because

00:15:25.913 --> 00:15:29.153
especially the referencing back to
the specific articles, because as

00:15:29.153 --> 00:15:32.203
we know, things change over time.

00:15:32.628 --> 00:15:32.968
Right?

00:15:33.128 --> 00:15:36.728
And so when you get an answer
and say, Oh, the answer is X.

00:15:36.758 --> 00:15:37.208
Well, okay.

00:15:37.208 --> 00:15:40.758
The answer X was correct back in January.

00:15:41.218 --> 00:15:44.598
But these other three big things
happened and now the answer is Y.

00:15:44.608 --> 00:15:47.888
So understanding and being able
to kind of trace some of that I'm

00:15:47.888 --> 00:15:51.158
sure is super important  to your
clients and really to everyone.

00:15:51.158 --> 00:15:51.348
Right?

00:15:51.348 --> 00:15:55.458
Because we, when we ask a question,
and somebody is coming in to Ask

00:15:55.618 --> 00:15:59.598
ICIS and be like, Oh, I'm asking this
question because I have to go talk to

00:15:59.598 --> 00:16:04.678
my president, the board, the boss, a
customer, whatever,  you want to go

00:16:04.678 --> 00:16:06.808
back to that individual with confidence.

00:16:06.861 --> 00:16:08.158
And I think that's a, that's

00:16:08.158 --> 00:16:13.738
Alan: a common requirement, I think,
for any analytics or AI, back from

00:16:13.738 --> 00:16:16.638
before people talked about generative
AI, you know, we got some great

00:16:17.053 --> 00:16:20.633
data science teams working on price
forecasts as well as what we do with

00:16:21.103 --> 00:16:25.493
price assessments and one of the key
things that we work on with customers

00:16:25.493 --> 00:16:29.253
is explainability of those analytics
because, as you say, if you're using

00:16:29.253 --> 00:16:34.903
a third party to create them, if your
director or CFO or CMO asks you a

00:16:34.903 --> 00:16:36.748
question about, well, how did that work?

00:16:37.088 --> 00:16:37.648
How did you get to that?

00:16:37.648 --> 00:16:38.588
That's a strange number.

00:16:38.588 --> 00:16:39.498
I don't understand it.

00:16:39.978 --> 00:16:40.038
Alan: You

00:16:40.038 --> 00:16:41.218
need to be able
to explain that to them.

00:16:41.248 --> 00:16:43.408
If you can't, you lose trust immediately.

00:16:43.408 --> 00:16:48.758
And I think no AI tool is really
going to be used by customers or any

00:16:48.768 --> 00:16:50.548
users if it can't explain itself.

00:16:50.548 --> 00:16:51.338
It's very important.

00:16:51.478 --> 00:16:52.258
Victoria: Yeah, that's great.

00:16:52.268 --> 00:16:53.558
And that's actually a great segue.

00:16:53.558 --> 00:16:59.248
Let's talk a little bit about maybe some
of the concerns and the risks that we

00:16:59.408 --> 00:17:03.898
think about as we think about technology
and AI and how those are being mitigated.

00:17:04.923 --> 00:17:05.523
Chad: Yeah, sure.

00:17:05.783 --> 00:17:09.613
I mean, the big, the big thing that people
worry about is hallucinations, right?

00:17:09.673 --> 00:17:11.833
It's the phrase everyone's kind of using.

00:17:11.883 --> 00:17:14.443
It's a bit of a disservice in
some, some cases  to the machine.

00:17:14.453 --> 00:17:18.413
It says it's hallucinating and
that it's just hallucinating.

00:17:19.393 --> 00:17:22.563
They're actually making very,
very, very educated guesses, right?

00:17:23.033 --> 00:17:23.963
That's how they actually work.

00:17:23.963 --> 00:17:27.763
But the one thing that people worry about
a lot is if it goes and tells you so.

00:17:28.063 --> 00:17:28.243
Right?

00:17:28.243 --> 00:17:29.953
Can you actually handle that?

00:17:30.013 --> 00:17:31.783
Can it actually manage that hallucination?

00:17:31.843 --> 00:17:31.903
Yeah.

00:17:32.323 --> 00:17:36.213
And that's something  as far as how we,
we took and built it  in a pattern,  it's

00:17:36.603 --> 00:17:42.803
fairly common in the market to make sure
that,  when we train, um, Ask ICIS, we

00:17:42.893 --> 00:17:45.863
tell it  that it prioritizes our view.

00:17:46.588 --> 00:17:49.608
In our content right over any other view.

00:17:49.648 --> 00:17:53.458
So it's it's always pulling from our
content first, which we make accessible

00:17:53.468 --> 00:17:58.138
to it and effectively does a search
and pulls back the information and then

00:17:58.138 --> 00:18:01.728
it tries to answer the question that
customer has based on that information.

00:18:02.048 --> 00:18:06.478
It looks at and it's it's not
drawing on as much the the LLM.

00:18:07.293 --> 00:18:12.823
To answer that question, it's using how
its ability to read and its ability to

00:18:12.823 --> 00:18:14.853
reason to help answer that question.

00:18:14.853 --> 00:18:17.703
But the data it's pulling
from is ICIS content.
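
NOTE
The retrieval-grounded pattern Chad describes (search proprietary content first, answer only from what was retrieved, and return the sources) can be sketched as follows. The function names and the naive keyword search are illustrative assumptions, not the actual Ask ICIS implementation.

```python
import re

def answer_question(question, articles, complete):
    """Retrieval-grounded answering: `articles` is a list of
    {"id", "text"} dicts; `complete` is any LLM callable prompt -> str."""
    # 1. Retrieve from our own content first (naive keyword overlap here;
    #    production systems use proper search or vector retrieval).
    terms = set(re.findall(r"\w+", question.lower()))
    hits = [a for a in articles
            if terms & set(re.findall(r"\w+", a["text"].lower()))]
    context = "\n\n".join(a["text"] for a in hits)
    # 2. Constrain the model to the retrieved context, so the LLM supplies
    #    reading and reasoning ability rather than facts of its own.
    prompt = (
        "Answer using only the context below. If the context is "
        f"insufficient, say so.\n\nContext:\n{context}\n\n"
        f"Question: {question}"
    )
    # 3. Return the grounding articles so the answer can be verified.
    return {"answer": complete(prompt), "sources": [a["id"] for a in hits]}
```

The key point is the last line: every answer ships with the IDs of the articles it was grounded in, which is what makes the output checkable.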

00:18:18.073 --> 00:18:18.413
Victoria: Got it.

00:18:18.443 --> 00:18:21.353
So it's, so it's a little bit
less about generative, right?

00:18:21.353 --> 00:18:26.443
So I think this aspect of generative
being creative and so creating

00:18:26.503 --> 00:18:29.573
answers that don't exist, but rather

00:18:29.573 --> 00:18:34.373
generating and then testing back to
what is already known to be true.

00:18:35.043 --> 00:18:38.643
Chad: So it's, it's generative in the
sense and it's creative in the sense of

00:18:39.043 --> 00:18:42.553
it makes content that's more specific
to what you're trying to find out.

00:18:42.603 --> 00:18:42.933
Right.

00:18:43.123 --> 00:18:45.393
But  it could be made out of
six pieces of our content.

00:18:45.723 --> 00:18:46.033
Right.

00:18:46.043 --> 00:18:48.993
So it could have pulled it from
three news articles, a price report

00:18:49.013 --> 00:18:53.003
and forecast because you are asking
about something that hit all of those

00:18:53.003 --> 00:18:53.483
areas.

00:18:53.843 --> 00:18:57.983
And so it then creates a unique
piece of content for that particular

00:18:57.983 --> 00:19:00.323
answer, but it's all based on our content.

00:19:00.563 --> 00:19:01.343
Victoria: Yeah, that's helpful.

00:19:01.463 --> 00:19:05.723
What are the other risks that you see, or
the concerns that people raise?

00:19:06.423 --> 00:19:09.253
Chad: I guess one of the other
ones that comes across a lot is

00:19:09.253 --> 00:19:10.703
the, is AI going to take my job?

00:19:11.093 --> 00:19:13.913
That's the, the big one
that, that most people say.

00:19:14.373 --> 00:19:18.033
Victoria: Yeah, do we still need the
ICIS consultants if we have this?

00:19:18.563 --> 00:19:19.973
Chad: Certainly, that would be my answer.

00:19:20.033 --> 00:19:23.363
And I think you need to
look at AI and the tools out there.

00:19:24.048 --> 00:19:27.648
They make it easier to do your job,
or to do somebody's job,

00:19:27.648 --> 00:19:31.544
but it's not just a like-for-like
replacement. Really, our content

00:19:31.544 --> 00:19:34.934
creators in our business
answer very hard questions, right?

00:19:35.144 --> 00:19:38.204
They're looking at very
complicated situations and

00:19:38.264 --> 00:19:39.944
applying subject matter expertise.

00:19:40.294 --> 00:19:42.554
Most of that's not in the question
that's being asked, right?

00:19:42.714 --> 00:19:46.174
So when we actually look at the
content and these things,

00:19:46.175 --> 00:19:47.505
there's a lot more to add there.

00:19:47.505 --> 00:19:52.215
So, you know, it's fine
to say, as a piece of information, that

00:19:52.505 --> 00:19:58.135
a tornado wiped out Interstate 95 when
Hurricane Milton was hitting Florida.

00:19:58.425 --> 00:19:59.125
That's great.

00:19:59.225 --> 00:20:01.875
It's more important to know that
that's a major transport

00:20:01.875 --> 00:20:03.935
hub to the Port of Tampa, right?

00:20:03.995 --> 00:20:06.775
Which could potentially
affect some customers there.

00:20:07.165 --> 00:20:09.475
So that's hard to figure out.

00:20:10.485 --> 00:20:13.914
Alan: I think as well, with
any AI, not just the sort of

00:20:13.965 --> 00:20:15.405
more modern generative AI,

00:20:15.405 --> 00:20:18.904
in a lot of cases it's really only as
good as what goes into it in the first

00:20:18.905 --> 00:20:20.685
place, from an information perspective.

00:20:20.685 --> 00:20:22.755
So we're in a privileged place.

00:20:22.795 --> 00:20:26.565
I think, at ICIS, in that we have a lot of
good information that we can feed into

00:20:26.565 --> 00:20:29.965
these things and actually make them more
intelligent to help you make a decision.

00:20:30.465 --> 00:20:34.985
I think as well, not just with Ask
ICIS and the tools that we create,

00:20:35.025 --> 00:20:41.045
but also internally in how we use AI,
a lot of it, in our industry

00:20:41.665 --> 00:20:45.805
for sure, and every industry
has this challenge, is how do

00:20:45.805 --> 00:20:46.995
you scale up your workforce?

00:20:47.015 --> 00:20:49.885
And how do you make them more
literate in using these kinds of tools?

00:20:50.045 --> 00:20:50.305
Victoria: Right?

00:20:50.365 --> 00:20:53.445
Alan: And both from a benefit
perspective, as Chad was talking

00:20:53.445 --> 00:20:56.735
about, making you more efficient and
faster at doing your job and so on.

00:20:56.735 --> 00:21:02.085
But also the risks and pitfalls. Very,
very early on, because we're,

00:21:02.085 --> 00:21:07.335
at heart, a data analysis company, and as a
group in general, we had guidance around

00:21:07.335 --> 00:21:09.425
employees using generative AI tools.

00:21:10.195 --> 00:21:12.735
Don't put confidential
information in ChatGPT.

00:21:12.745 --> 00:21:15.735
It's just going to harvest it and
start spitting it out to other people, and

00:21:16.025 --> 00:21:16.255
Chad: I'm

00:21:16.255 --> 00:21:18.345
Alan: sure a lot of companies'
employees are doing

00:21:18.345 --> 00:21:19.435
that and they don't realize it.

00:21:19.445 --> 00:21:20.495
So, um, yeah.

00:21:20.980 --> 00:21:25.140
Making sure that you get good training
out to your employees, that you teach

00:21:25.140 --> 00:21:29.790
them what these tools can do, why they're
good, why they're bad, and what the

00:21:29.790 --> 00:21:31.540
risks around them are, is very important.

00:21:31.880 --> 00:21:35.200
And we have to always keep up to
date with the latest and greatest new

00:21:35.200 --> 00:21:36.780
tools that are available to do that.

00:21:37.330 --> 00:21:40.815
Next year there'll be another ChatGPT-like
thing that will come out, and we've

00:21:40.815 --> 00:21:45.755
got to make sure that we use it in the
right way. I like to think of AI

00:21:45.755 --> 00:21:47.765
analytics tools like a builder's toolkit.

00:21:48.225 --> 00:21:52.330
Someone gives you a brand new
drill, you don't just try and do

00:21:52.330 --> 00:21:53.370
something with it straight away.

00:21:53.370 --> 00:21:54.560
You read the manual, right?

00:21:54.560 --> 00:21:58.000
Like, at least, you
should do that.

00:22:00.650 --> 00:22:02.727
Or you drill a few holes in
the walls and see what happens.

00:22:02.862 --> 00:22:06.630
I mean, hey, maybe, maybe I'm
that guy who reads the manual.

00:22:08.460 --> 00:22:11.070
But I think it's important.
They're fascinating.

00:22:11.070 --> 00:22:11.690
They're brilliant.

00:22:11.720 --> 00:22:13.680
They're important
tools, they really are.

00:22:14.370 --> 00:22:20.000
Chad and I will often geek out on it; it's
a very exciting time to be in business and

00:22:20.040 --> 00:22:24.200
to use these tools in your personal life
as well. But you should make sure that

00:22:24.200 --> 00:22:28.040
you do a little bit of due diligence to
understand which tool to use, at what,

00:22:28.340 --> 00:22:32.680
and in what way. And also, I mean,
not just risks like that; there's a big risk

00:22:32.730 --> 00:22:34.530
that you run up lots of costs.

00:22:34.955 --> 00:22:36.675
Some of these tools
can be quite expensive.

00:22:36.675 --> 00:22:41.025
My mantra in the AI space
has always been this:

00:22:41.025 --> 00:22:45.365
if you can write a linear program
to solve a problem to a level of

00:22:45.365 --> 00:22:48.675
accuracy that's acceptable, why
would you deploy a neural network?

00:22:49.055 --> 00:22:51.055
It's like 20 times more expensive, right?

00:22:51.055 --> 00:22:55.325
So having that cost-benefit analysis when
you're deploying these tools is important

00:22:55.325 --> 00:22:56.405
from a business point of view as well.

00:22:56.445 --> 00:22:56.755
Victoria: Yeah.

00:22:56.815 --> 00:22:58.585
Well, especially, I think,
when you think about

00:22:59.140 --> 00:23:05.600
the broader social context of it:
the power usage that goes along with AI,

00:23:05.600 --> 00:23:09.200
as you say, these sophisticated neural
networks. I don't need to ask it how to

00:23:09.200 --> 00:23:14.320
make a peanut butter and jelly sandwich,
because I can get the answer anywhere

00:23:14.320 --> 00:23:20.190
else. But asking it things that I can't
find the answers to becomes critical, so

00:23:20.190 --> 00:23:23.510
that I'm using the resources wisely. Yeah.

00:23:23.740 --> 00:23:24.300
A hundred percent.

00:23:24.990 --> 00:23:27.280
You folks talked a
little bit about training.

00:23:27.580 --> 00:23:32.550
Do you guys provide training to your
customers when they start using Ask ICIS?

00:23:33.100 --> 00:23:37.390
Because obviously you want them to use
it effectively and get the benefit.

00:23:38.120 --> 00:23:38.970
Alan: Yeah, absolutely.

00:23:38.970 --> 00:23:42.780
So we have a customer service
team, in effect, or customer success

00:23:42.780 --> 00:23:46.650
team, covering not just Ask ICIS;
they're available to our customers

00:23:46.650 --> 00:23:47.900
for any product that we ship.

00:23:48.410 --> 00:23:52.620
As part of onboarding and general queries,
if anybody has any questions, we

00:23:52.620 --> 00:23:56.170
can do that. And interestingly, a use case
for those types of algorithms is that you

00:23:56.170 --> 00:23:57.850
can ask them how to use them as well.

00:23:57.880 --> 00:23:58.230
So it's

00:23:59.700 --> 00:24:00.750
Victoria: actually really good.

00:24:00.750 --> 00:24:01.760
I hadn't thought about

00:24:01.760 --> 00:24:01.940
Alan: that.

00:24:01.940 --> 00:24:06.870
Yeah, Chad, just don't Ask ICIS
specifically anything that he helps.

00:24:07.180 --> 00:24:11.280
Chad: Yeah, I think it's something
I know that as we develop it, it's

00:24:11.280 --> 00:24:12.880
always about making it easier and easier.

00:24:12.880 --> 00:24:15.030
I mean, I think a lot with
these chatbots is trying to

00:24:15.030 --> 00:24:16.290
make them as simple as possible.

00:24:16.300 --> 00:24:19.955
Like, it is just a box: you
type in a question and you try

00:24:19.955 --> 00:24:21.035
to have a conversation with it.

00:24:21.655 --> 00:24:22.795
But I think there are always elements:

00:24:22.795 --> 00:24:24.245
How can we make it easier?

00:24:24.245 --> 00:24:25.915
How do we help people to onboard?

00:24:25.915 --> 00:24:27.375
And that's something
we're always looking at.

00:24:27.515 --> 00:24:31.125
And we can evolve things, the tool and
how we approach things, of course, too.

00:24:31.975 --> 00:24:32.325
Victoria: Awesome.

00:24:32.795 --> 00:24:33.075
Love it.

00:24:33.295 --> 00:24:34.495
So what's next guys?

00:24:34.495 --> 00:24:40.040
What do you see coming? Ask ICIS is the big
thing for 2024, and obviously continuing to

00:24:40.040 --> 00:24:44.490
roll it out into 2025, which is when this
episode is going to be published.

00:24:44.860 --> 00:24:45.280
What else?

00:24:45.300 --> 00:24:45.870
What's next?

00:24:45.870 --> 00:24:49.570
What should we be looking for
from, from your team, from ICIS?

00:24:50.270 --> 00:24:54.850
Alan: More innovation, I think,
is what we want to kind of promise

00:24:54.850 --> 00:24:58.350
to our customer base and ourselves
that we're going to keep trying to push

00:24:58.350 --> 00:25:03.110
the boundaries of what's possible and
what's sensible in our space as well.

00:25:04.115 --> 00:25:07.295
And I think whilst my head of product
would kill me if I said anything

00:25:07.295 --> 00:25:10.375
specific,  from my perspective,
that's more investment in some of

00:25:10.375 --> 00:25:15.395
these tools, like Ask ICIS, growing
its capability, but also, in my space,

00:25:16.370 --> 00:25:20.060
new analytics for customers to use to
help them make decisions across our

00:25:20.060 --> 00:25:24.280
industries, but also keeping up to
speed with developments in the software

00:25:24.280 --> 00:25:27.710
and analytics market if they want to
take data directly from us as well.

00:25:27.710 --> 00:25:30.120
We've done some investments
on that recently as well.

00:25:30.430 --> 00:25:33.930
Um, and we need to make sure those
tools are up to date when our

00:25:34.050 --> 00:25:35.900
customers come asking us to help them.

00:25:36.370 --> 00:25:37.650
So yeah,  keep watching us.

00:25:37.650 --> 00:25:40.300
We'll try and post as much as we
can on socials and things like

00:25:40.300 --> 00:25:41.760
that personally and as a group.

00:25:41.760 --> 00:25:43.850
And, yeah, I can't be more specific.

00:25:43.850 --> 00:25:49.160
I'm afraid. Chad, if you want to be
braver than me...

00:25:49.170 --> 00:25:53.480
Chad: You know, as always, as it is
with all generative AI tools

00:25:53.520 --> 00:25:54.930
that people use today, right?

00:25:55.300 --> 00:25:58.360
It's not always going to be about
specific features or things you're

00:25:58.360 --> 00:26:02.430
looking at, but just: how do you
make the experience better?

00:26:02.600 --> 00:26:05.040
It's a lot of fine-tuning.

00:26:05.415 --> 00:26:09.755
Of how things work or behave and
respond, and looking at particular use

00:26:09.755 --> 00:26:13.535
cases: how does it answer particular
kinds of questions, things like that.

00:26:13.535 --> 00:26:13.705
Right.

00:26:13.705 --> 00:26:16.965
So it's hard to say
exactly what's there, because we're

00:26:16.965 --> 00:26:19.675
looking at feedback from customers
and seeing what's important.

00:26:19.715 --> 00:26:20.924
And that's what we're looking at.

00:26:21.105 --> 00:26:22.565
Yeah, that's what drives it.

00:26:22.735 --> 00:26:26.165
Alan: What I would say is, well,
if you want an advance look at any of

00:26:26.195 --> 00:26:30.225
the stuff we're developing at the moment,
we have a customer advisory panel.

00:26:30.345 --> 00:26:34.955
You can apply to it, and we'll be testing
tool prototypes and things like

00:26:34.955 --> 00:26:36.515
that with those customers.

00:26:36.525 --> 00:26:38.565
So, um, how does somebody

00:26:38.565 --> 00:26:39.445
Victoria: get on that panel?

00:26:39.445 --> 00:26:42.715
Is there a website or an email,
or can I just... how do we direct

00:26:42.755 --> 00:26:43.275
Alan: people to you?

00:26:43.745 --> 00:26:46.035
Yeah, we can supply
that. I don't know if they're

00:26:46.035 --> 00:26:48.585
still recruiting people, but I
can give you that information.

00:26:48.595 --> 00:26:49.085
Well, and,

00:26:49.135 --> 00:26:51.905
Victoria: and for sure, we're going
to include a link to Ask ICIS.

00:26:51.905 --> 00:26:54.475
And I know if people are already
ICIS customers, they can probably

00:26:54.475 --> 00:26:57.475
talk to their salesperson, their
relationship focal point, to learn more.

00:26:57.475 --> 00:26:57.845
Awesome.

00:26:58.025 --> 00:27:00.725
And I think, uh, you know, if
nothing else, as we head into

00:27:00.725 --> 00:27:06.865
2025, learning how to use AI
effectively, including tools like Ask

00:27:06.865 --> 00:27:11.405
ICIS, becomes a real differentiator
for companies and for individuals.

00:27:11.945 --> 00:27:12.925
Chad: Yeah, absolutely.

00:27:13.345 --> 00:27:14.135
There's a lot.

00:27:14.225 --> 00:27:17.485
I think customers, people working
in this industry, need to think about

00:27:17.495 --> 00:27:21.698
how we can use this technology
to help improve our productivity.

00:27:21.848 --> 00:27:24.638
How can we help our individual
people be more productive?

00:27:24.808 --> 00:27:28.288
More time to do the things that add
value to your business, and less time

00:27:28.298 --> 00:27:30.648
doing, you know, some mundane tasks.

00:27:30.728 --> 00:27:34.558
That's where these tools are very powerful
to get people to focus on the stuff

00:27:34.558 --> 00:27:36.208
they tend to like doing more as well.

00:27:36.668 --> 00:27:40.768
Alan: So, honestly, we feel like with
these types of tools, and us as

00:27:40.768 --> 00:27:43.798
a partner with our customers,
we want to augment what they do.

00:27:44.098 --> 00:27:45.308
We want to make them better.

00:27:45.718 --> 00:27:50.968
Even Ask ICIS as a name: think of us
basically as, you're the superhero in our

00:27:50.968 --> 00:27:52.808
story and we want to be a sidekick, right?

00:27:52.808 --> 00:27:57.258
We want to help you, we want to help
you get done what you need to do, and we

00:27:57.258 --> 00:27:59.668
want to make you look great to your boss.

00:27:59.898 --> 00:28:01.256
That's effectively what we would like.

00:28:01.256 --> 00:28:02.290
And that is a

00:28:02.290 --> 00:28:03.448
Victoria: great way to end.

00:28:03.448 --> 00:28:06.708
So, Alan and Chad, thank you
so much for your time today.

00:28:06.708 --> 00:28:07.308
This has been great.

00:28:08.148 --> 00:28:08.548
Alan: It's a pleasure.

00:28:08.728 --> 00:28:09.538
It's a pleasure.

00:28:09.538 --> 00:28:09.858
Victoria: Thank you very

00:28:09.858 --> 00:28:10.178
much.

00:28:10.178 --> 00:28:10.538
Absolutely.

00:28:10.538 --> 00:28:11.798
And thank you everyone for listening.

00:28:11.798 --> 00:28:14.298
Keep listening, keep following,
keep sharing, and we will

00:28:14.298 --> 00:28:15.808
talk with you again soon.

00:28:17.908 --> 00:28:20.068
Thanks for joining us
today on The Chemical Show.

00:28:20.408 --> 00:28:24.868
If you enjoyed this episode, be
sure to subscribe, leave a review,

00:28:25.138 --> 00:28:28.668
and most importantly, share it
with your friends and colleagues.

00:28:29.528 --> 00:28:32.418
For more insights, visit TheChemicalShow.com

00:28:32.428 --> 00:28:34.948
and connect with us on LinkedIn.

00:28:35.598 --> 00:28:39.548
You can find me at Victoria King
Meyer on LinkedIn, and you can also

00:28:39.548 --> 00:28:41.498
find us at The Chemical Show Podcast.

00:28:41.908 --> 00:28:45.138
Join us next time for more
conversations and strategies

00:28:45.398 --> 00:28:47.188
shaping the future of the industry.

00:28:47.558 --> 00:28:48.268
We'll see you soon.