WEBVTT

NOTE
This file was generated by Descript 

00:00:04.454 --> 00:00:05.894
Welcome to DevOps and Docker talk.

00:00:05.924 --> 00:00:07.334
And I am Bret.

00:00:07.534 --> 00:00:14.404
If you've been listening to this podcast for a while, this is
probably the seventh year or so that I've been doing this podcast.

00:00:14.604 --> 00:00:18.054
It hasn't changed a terrible amount in that time.

00:00:18.384 --> 00:00:36.018
We've mostly had guests on the show from various cloud native and DevOps related product
companies, talking about tools and solutions, and you often will hear my co-host Nirmal Mehta
on the show, but every so often I just monologue, and this is one of those, specifically.

00:00:36.018 --> 00:00:45.298
This is about what I'm seeing and what I'm doing right now and
for the rest of the year, and it's gonna happen in three parts.

00:00:45.298 --> 00:00:46.528
First, I'm just talking about

00:00:46.828 --> 00:00:58.398
what's about to happen for me over the next three weeks, going to
London for KubeCon, and then what I'm planning to change in this
podcast, as well as my other content on YouTube, for the rest of the year.

00:00:58.598 --> 00:01:06.178
And last, I'm gonna talk about some industry trends that I'm seeing
that will force me, I think, to change the format of this show.

00:01:06.378 --> 00:01:07.338
All right, let's get into it.

00:01:08.984 --> 00:01:13.754
Today is March 22nd, 2025, and I am days away from leaving for London.

00:01:14.099 --> 00:01:21.149
I'll be spending time there, first with my wife and mother, and
attending three conferences over nine days as a part of that.

00:01:21.629 --> 00:01:23.909
So it's a little bit of fun and a lot of work.

00:01:24.179 --> 00:01:34.036
First, the day we land, I'm actually gonna go to RedMonk's Monki Gras, which
I've never been to before, but I'm a fan of RedMonk and I know that their events can be

00:01:34.236 --> 00:01:39.396
small and intimate and full of powerful people doing cool things in tech.

00:01:39.596 --> 00:01:49.456
And no surprise, this year's version in London is a packed agenda of
AI everything, which is the reason I think I'm really wanting to go.

00:01:49.656 --> 00:01:54.476
I'll hint at why I think that is later in this podcast. So I start with that.

00:01:54.476 --> 00:01:55.436
That's a two day thing.

00:01:55.436 --> 00:01:58.106
Then I have some fun with the wife and mother.

00:01:58.166 --> 00:02:02.322
My mom has never actually been to England, so it's pretty exciting to take her for the first time.

00:02:02.522 --> 00:02:07.172
And if you didn't know some behind the scenes, my wife
and I actually work together in this business full time.

00:02:07.258 --> 00:02:15.418
And have so for many years, and it's not our first business, I think it's something
around our sixth, uh, I think it's certainly the most successful long running one.

00:02:15.898 --> 00:02:24.468
And then we go to work at Rejekts, which, if you've not been to KubeCon,
Rejekts is the conference of rejected conference talks from KubeCon.

00:02:24.828 --> 00:02:28.458
But I've already started to really like it and I've only been to one.

00:02:28.488 --> 00:02:31.038
I like that it's at least much smaller than

00:02:31.238 --> 00:02:36.488
KubeCon, so you can actually see everyone in a matter of hours at the conference.

00:02:36.488 --> 00:02:45.009
And it tends to be a little bit of the who's who of people in the community there, that
you maybe don't get access to at the big conference because you can never find them.

00:02:45.009 --> 00:02:52.389
Or there's just, you know, there's 10 plus thousand people at KubeCon,
so when there's only hundreds or possibly up to a thousand at rejects.

00:02:52.589 --> 00:02:58.049
It's a much more accessible conference and you can sort of
slow down, have longer conversations, and it's pretty nice.

00:02:58.249 --> 00:03:06.939
Um, but unfortunately with Rejekts, which this year in London sold out in
days, I missed the Bluesky skeet announcing that the

00:03:07.139 --> 00:03:10.499
tickets were open, and within days I went back and they were gone already.

00:03:10.499 --> 00:03:16.019
So I'm hoping I can get in the door. Maybe I can
find someone I know and sneak in the back or something.

00:03:16.019 --> 00:03:16.829
Hopefully that'll happen.

00:03:17.099 --> 00:03:29.546
And then there's the KubeCon Day Zero, which is sort of mini conferences within
the big conference. There's a growing number of them, but platform engineering
recently has been the biggest, and it's definitely a trend there.

00:03:29.746 --> 00:03:40.786
And I like to visit the Argo CD day, ArgoCon, because I'm a fan of deployments and
automation, and Argo CD seems to be the clear winner right now in the ecosystem.

00:03:40.986 --> 00:03:52.931
And then that same day, I'm gonna go off to Portainer's workshop,
which is gonna take place on a yacht, where we're gonna work hands-on
with Sidero's Talos Linux and their Kubernetes manager, Omni.

00:03:53.131 --> 00:04:03.631
So I'm excited to learn some of that stuff 'cause I haven't really been able
to put my hands on Talos Linux yet, which is the self-proclaimed Kubernetes
Linux and the tool from Sidero, I think that's how you pronounce it.

00:04:03.881 --> 00:04:05.621
And their Omni tool for managing all of that.

00:04:05.821 --> 00:04:27.281
And then three days of KubeCon, where for the fourth year in a row we're probably
gonna have keynotes largely about AI, while the average person walking around the hall
isn't even running their own inference clusters and doesn't really do anything with
AI except use OpenAI or LLM models to answer questions and write text in an IDE.

00:04:27.618 --> 00:04:35.384
So it's a bit of a weird disconnect, but I'm going to talk about why I
think that's about to change for us infrastructure and DevOps people.

00:04:35.828 --> 00:04:36.548
But first.

00:04:36.748 --> 00:04:39.868
Let me talk about what we're gonna be doing on this show this year.

00:04:39.868 --> 00:04:52.004
And if you've been paying attention, if you're an avid listener
of various tech podcasts, you might have noticed that I haven't been
shipping as many episodes so far this year, and there's an intention behind that.

00:04:52.004 --> 00:04:53.414
It doesn't mean we're slowing down.

00:04:53.444 --> 00:04:58.104
If anything, I'm hoping we're gonna speed up and do more, and I needed to take

00:04:58.304 --> 00:05:03.494
my small little team and spend some serious time focusing on a few things.

00:05:03.744 --> 00:05:13.524
We've spent the last two and a half months focusing on improving our content workflow
so that we can ship things faster and smaller and more dynamically, in the moment,

00:05:13.524 --> 00:05:16.164
as well as improving our sponsorship offerings.

00:05:16.707 --> 00:05:18.567
Hey, do you work at a company that wants to sponsor us?

00:05:18.597 --> 00:05:18.987
Lemme know.

00:05:19.694 --> 00:05:33.784
Um, there's an increasing number of companies that want to sponsor us in certain ways,
and we're trying to figure out how to make that work while also ensuring my journalistic
integrity, and I'm not a journalist, but whatever that integrity thing is, right?

00:05:33.784 --> 00:05:34.654
Like, I don't wanna be a corporate

00:05:34.654 --> 00:05:41.284
shill. So we've been trying to figure out that stuff, spending a lot of time
honing in on what we feel comfortable with and what we think is genuine.

00:05:41.704 --> 00:05:44.824
But not to be a complete sellout to anyone who wants to give us a buck.

00:05:45.024 --> 00:05:58.861
And something else fun we've been doing is improving the studio, which is in one of our spare
bedrooms. We've basically taken over the entire room over the last few years and expanded the
studio a little bit so we can have different camera setups and just produce better content.

00:05:59.297 --> 00:06:12.547
And recently, specifically this month in March, I've spent some time
leaning into AI, more than just, hey, I'm using ChatGPT now more than
I'm using Google or Stack Overflow, which is what's happening everywhere.

00:06:12.877 --> 00:06:17.317
Uh, if you didn't know, Stack Overflow is down 60% in traffic year over year.

00:06:17.557 --> 00:06:30.457
Google is down in traffic as well, but I'm not sure that they're talking publicly
about how much that is. But we are definitely seeing a shift where AI is answering
questions better and faster than we can find through traditional search methods.

00:06:30.657 --> 00:06:42.087
That's a little bit scary because that's kind of what the first wave has been. There are
more waves of AI coming, and I think it's coming for a lot of the jobs, but in a non-scary way.

00:06:42.287 --> 00:06:42.827
Oh no.

00:06:42.827 --> 00:06:47.747
What's gonna happen to our job roles and will we be able to find a job if the AI takes it?

00:06:47.987 --> 00:06:49.277
I'm not one of those people.

00:06:49.277 --> 00:06:55.817
I am more in the middle where I think there will be some churn and
some people that have to shift their job roles to something adjacent.

00:06:56.017 --> 00:07:00.839
But I think we're about to see AI come for infrastructure.

00:07:01.229 --> 00:07:16.429
If you haven't been hearing about the term AI agents, it's basically
something that was invented in 2024, as far as I can tell. It's giving
AI tooling, allowing it to control things, not just say or write things.

00:07:16.629 --> 00:07:24.939
And that's the moment where I feel like it becomes way more
interesting for DevOps and platform engineering and cloud ops.

00:07:25.189 --> 00:07:26.899
I think we're really, really early.

00:07:27.099 --> 00:07:33.817
And if you look at some of the things like Google Trends, you will notice that
even just the term AI Agents basically wasn't being searched for six months ago.

00:07:34.057 --> 00:07:48.737
But we're starting to see exponential growth in the interest in AI agents. I expect
that to continue as the industry starts to realize that allowing the AI to do things on
our behalf, rather than just tell us how to do it, is where the real value's gonna happen.

00:07:48.937 --> 00:07:52.017
But you know, we're DevOps, we're platformers.

00:07:52.347 --> 00:07:57.357
We need deterministic output, and AI is nowhere near doing that.

00:07:57.687 --> 00:07:58.317
So.

00:07:58.517 --> 00:08:00.917
I am gonna be talking more and making some videos.

00:08:00.917 --> 00:08:12.087
You'll probably hear me make a few dedicated podcasts about this, but I believe
that we're possibly on the precipice of AI coming for our infrastructure.

00:08:12.434 --> 00:08:20.954
But that's not gonna happen if we can't control it, if we
can't rely on it to be correct a hundred percent of the time.

00:08:21.154 --> 00:08:37.214
And I think one of the things that's gonna happen there is we're gonna have
to learn how to give the AI a locked down playground to figure out what it
needs to do before it actually does it in the real world. And I don't know how
this is gonna look, and I could be totally off on how this actually happens.

00:08:37.514 --> 00:08:38.684
But something happened this month.

00:08:39.227 --> 00:08:40.358
The short version here is this:

00:08:40.723 --> 00:08:52.238
I have been ignoring AI largely in terms of doing anything in my job other
than helping me write some code or Terraform or, well, it doesn't need
to help me write Dockerfiles 'cause it's not as good as I am at that yet.

00:08:52.633 --> 00:08:56.533
But I've tried, and you know, it even knows things like Bake files, right?

00:08:56.533 --> 00:08:59.473
It knows a lot of the things that are even new, because

00:08:59.473 --> 00:09:02.833
the models are getting better at having more current
knowledge, and they can also now search the web.

00:09:03.033 --> 00:09:05.943
But it was never terribly interesting for me to talk about.

00:09:06.143 --> 00:09:12.323
But I think what's about to happen is it's going to do things, not just talk about 'em.

00:09:12.533 --> 00:09:18.863
So what happened is, on March 1st, Solomon Hykes, the founder of Dagger, a programmatic CI/CD company,

00:09:19.178 --> 00:09:21.788
who previously founded Docker, you may have heard of him,

00:09:22.208 --> 00:09:27.578
uh, he reached out to me about something that had happened
in their community, and he wanted me to be aware of it.

00:09:27.948 --> 00:09:29.148
It was totally out of the blue.

00:09:29.478 --> 00:09:32.088
We had actually talked at KubeCon last year for

00:09:32.288 --> 00:09:36.218
an hour or so about what they were doing in the CI space.

00:09:36.218 --> 00:09:56.058
I've been watching them for years, kind of wanting to see where they went and not spending a
terrible amount of my own time learning it, because I was sort of waiting for that inflection
moment where they're starting to become a popular tool and they've got it fleshed out enough
that it makes sense to programmatically write your CI in your favorite language rather than

00:09:56.058 --> 00:10:13.781
using YAML. And what he's telling me about is something that they have been rapidly iterating
on, day and night, for the last month or so, and speaking at meetups all over about, because
it's caught the attention of the AI fan people. And the short version of that for this podcast

00:10:13.981 --> 00:10:31.051
is that AI is coming for us, and it's still a crazy chaos monkey in and of itself. I've
witnessed this on my own machine just last week, where I was using Cursor and gave it what's
called YOLO mode, which means I can basically give the Cursor AI a set of instructions.

00:10:31.306 --> 00:10:36.616
And it will iterate over and over and over again,
including command line tooling and command line builds.

00:10:36.816 --> 00:10:39.276
And it does that over and over until it gets a successful result.

00:10:39.696 --> 00:10:46.226
And it was trying to build an iOS app for me, which it was
doing very poorly, and it would constantly fail at the build.

00:10:46.676 --> 00:10:53.156
And it tried this over 20 times in a row to build this
thing, edit some code, build this thing, always failing.

00:10:53.546 --> 00:10:53.756
It

00:10:53.756 --> 00:10:57.116
started to believe that my computer was broken, so it

00:10:57.316 --> 00:11:06.746
stopped just short of executing a sudo rm -rf of a specific
system directory where Xcode binaries were installed, before

00:11:06.746 --> 00:11:07.766
I said, ah, ah, ah.

00:11:07.766 --> 00:11:09.596
That is definitely not something I want you to do.

00:11:09.836 --> 00:11:29.160
Luckily, in Cursor there is a setting under the YOLO mode which prevents it from doing any
rm commands. But there was a clear indication of this thing being so confused it doesn't
know when to quit. My friend Nirmal Mehta actually pointed out to me that that's one of the
problems with our models today: they don't know when to quit, and it wasn't quitting.
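
NOTE
[Editor's aside] A minimal sketch of that "refuse destructive commands" guard. Purely illustrative: the pattern list and function names here are made up for this example and are not Cursor's actual implementation.
```python
import re
# Patterns an agent runner might refuse to execute (illustrative denylist).
DENYLIST = [
    r"\brm\s+-[a-zA-Z]*r[a-zA-Z]*f",  # rm -rf and variants like rm -Rf
    r"\bsudo\s+rm\b",                 # any sudo rm at all
    r"\bmkfs\b",                      # re-formatting disks
    r"\bdd\s+if=",                    # raw disk writes
]
def is_destructive(cmd: str) -> bool:
    """Return True if a proposed shell command matches a dangerous pattern."""
    return any(re.search(p, cmd) for p in DENYLIST)
def run_agent_step(cmd: str) -> str:
    """Gate each agent-proposed command before it ever reaches a shell."""
    if is_destructive(cmd):
        return f"refused: {cmd}"
    return f"would run: {cmd}"
```
The same gate could wrap any "iterate until the build passes" loop like the one described above.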

00:11:29.490 --> 00:11:37.950
And this exact thing could easily happen whenever we let AI control
any sort of infrastructure or run in CI or do anything like that.

00:11:37.950 --> 00:11:38.250
Right.

00:11:38.580 --> 00:11:39.060
So.

00:11:39.345 --> 00:11:45.305
The fundamental problem today is that we have these things
that are kind of smart, but they don't know when they're lying.

00:11:45.455 --> 00:11:51.725
They don't know when to quit, and we need to give
them a safe place to play until they figure it out.

00:11:52.175 --> 00:12:05.935
Well, containers happen to be the perfect place to do that, and since
we're all already used to containers, it turns out that Dagger's open
source may have accidentally stumbled onto a method or a workflow

00:12:06.310 --> 00:12:19.170
that allows you to easily use LLMs inside a pipeline of tasks that you give it, and
that also has access to a giant set of tools that Dagger calls the Daggerverse.

00:12:19.170 --> 00:12:20.040
It's essentially a

00:12:20.040 --> 00:12:25.320
Docker Hub-esque place for all the different tools and functions that people build

00:12:25.520 --> 00:12:26.930
that you can use in Dagger.

00:12:27.050 --> 00:12:30.020
It would be kind of like the Dagger version of GitHub Actions.

00:12:30.220 --> 00:12:45.970
And they have lots of them, and they all happen to have a common API that Dagger
has learned how to give the LLMs access to, which means the LLM can instantly
understand the purpose of that tool and be able to use it to solve the problems.

00:12:46.170 --> 00:12:47.910
And it does all this in containers.

00:12:48.110 --> 00:13:07.270
So it is extremely early days for this, but the big minds over at Dagger are thinking
heavily about how they can use this to give AI work, let it iterate in a safe place, and
then return the final result while also having access to all the tools we want to give it.
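
NOTE
[Editor's aside] The "give AI work, let it iterate in a safe place" idea can be sketched without any Dagger specifics. This is not Dagger's API, just plain `docker run` flags assembled in Python to show what a locked-down playground for an agent's build step might look like.
```python
# Build a `docker run` argv for a throwaway, locked-down sandbox the agent
# can iterate inside. Generic illustration only -- not Dagger's actual API.
def sandbox_argv(image: str, project_dir: str, command: str) -> list[str]:
    return [
        "docker", "run",
        "--rm",                        # discard the container afterward
        "--network=none",              # the agent gets no network access
        "--read-only",                 # immutable root filesystem
        "--tmpfs", "/tmp",             # scratch space it may write to
        "-v", f"{project_dir}:/work",  # only the project dir is writable
        "-w", "/work",
        image,
        "sh", "-c", command,
    ]
```
Only the final result leaves the sandbox; a failed or runaway attempt just gets thrown away with the container.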

00:13:07.470 --> 00:13:23.785
And this led to some thinking on my part, and I'm probably gonna make a different podcast about
this, because it comes down to some of the research I've been doing on AI agents and how they're
coming at us, from the industry of AI to those that are making and deploying the software.

00:13:23.815 --> 00:13:24.145
Right.

00:13:24.625 --> 00:13:26.035
And there was a great talk

00:13:26.235 --> 00:13:38.695
that, well, I'll save the details for another podcast, but essentially they were
talking about the idea that now that engineers are getting comfortable with AI
writing the code, they're able to iterate faster and produce more work output.

00:13:38.895 --> 00:13:45.755
But because operators and DevOps engineers have largely been standing on the sidelines and have

00:13:45.955 --> 00:13:49.615
generally had this attitude of "the AI will never touch my infrastructure."

00:13:49.815 --> 00:13:56.585
This has meant that one part of the pipeline of the
software lifecycle has sped up, but the others haven't.

00:13:56.615 --> 00:13:59.345
And that's gonna create the natural tension that we often had

00:13:59.345 --> 00:14:08.685
before we had DevOps, where you had devs creating software, and then they would have
to wait on the operators to get around to building the servers and deploying the code.

00:14:08.885 --> 00:14:16.185
And we possibly could be in a moment where that's going to happen again due to mismatched velocity.

00:14:16.385 --> 00:14:22.715
And I haven't witnessed that myself because, quite honestly,
I'm not working with any AI prompting teams in production.

00:14:22.885 --> 00:14:27.415
So I haven't witnessed this, but I can imagine that it's a real thing for some companies

00:14:27.790 --> 00:14:31.270
where they're aggressively taking advantage of AI in the dev groups.

00:14:31.373 --> 00:14:35.393
So anyway, that whole story has happened over the last three weeks.

00:14:35.573 --> 00:14:47.353
I'm very excited to be involved with this, and I'm appreciative to Solomon
for the heads up, because it caused me to spend weeks deep diving into
the state of these agents, the different players in the industry, and

00:14:47.353 --> 00:14:50.743
even how AWS is coming hard at this stuff.

00:14:50.943 --> 00:14:56.213
So I think it's real and I think it's gonna change a
lot of the content I create this year and going forward.

00:14:56.413 --> 00:14:59.763
But not to fear, I am still strictly a

00:14:59.763 --> 00:15:05.313
podcast focused on everything infrastructure, DevOps, containers, and the like.

00:15:05.513 --> 00:15:09.413
It's just gonna have the assistance of robots at this point.

00:15:09.613 --> 00:15:10.873
So stay tuned on that front.

00:15:11.073 --> 00:15:22.113
and the last thing I'll say is that you probably have noticed that we
haven't had guests on the show for a few months, and that's due to us
coming out of KubeCon last year and taking a break during the holidays.

00:15:22.313 --> 00:15:31.493
That led into this whole sort of rebooting of our content machine, and me spending
some time focusing on business improvements as well as the AI rabbit hole.

00:15:31.493 --> 00:15:51.903
So what we're doing now is going through all the CNCF projects, all the cloud native
ambassadors, all the Docker captains, with two main focuses. First, finding new, exciting
CNCF projects that are becoming popular or are graduating, and bringing one of the experts
from that project on the show to talk about where they're at and what the product does.

00:15:52.103 --> 00:15:59.333
At the same time, we're definitely gonna be leaning into this AI
agents wave and how that's going to affect all of us specifically.

00:15:59.533 --> 00:16:01.213
And I personally can't wait.

00:16:01.493 --> 00:16:12.498
I have a feeling in my gut that's telling me this could be the next big wave for
infrastructure tech like we saw over a decade ago with containers and distributed computing.

00:16:12.698 --> 00:16:17.838
But I'm not smart enough to make any grand declarations
of exactly how this is all gonna come together.

00:16:18.048 --> 00:16:20.748
I just want to be there to witness it and to report it.

00:16:20.885 --> 00:16:22.655
So thanks for sticking with me on this one.

00:16:22.855 --> 00:16:37.025
Expect some more content coming out of KubeCon over the next few weeks as I try to read
the tea leaves at the conference about not just what they're saying at the keynotes, but
also what people are doing on the ground day to day in the world of cloud native DevOps.

00:16:37.945 --> 00:16:38.815
See you in the next one.