1
00:00:00,450 --> 00:00:04,530
Yeah, I talked to the CTO of a big biopharma.

2
00:00:04,680 --> 00:00:09,420
It's about 24,000 employees, and he said, Copilot?

3
00:00:09,600 --> 00:00:13,380
No, I mean, they're all in on 365.

4
00:00:13,890 --> 00:00:14,670
SharePoint.

5
00:00:15,465 --> 00:00:18,705
Of course email, the whole shebang.

6
00:00:19,185 --> 00:00:24,255
And, uh, he said, oh, 365, uh, Copilot? Just a hard no.

7
00:00:24,255 --> 00:00:27,285
What, what are his end users going to do with it?

8
00:00:27,765 --> 00:00:30,585
And Microsoft isn't discounting it, like, it is

9
00:00:31,370 --> 00:00:38,570
a full, whatever it was, 35, 25 bucks per user, per month, times 24,000 users.

10
00:00:39,020 --> 00:00:43,040
And I think I, uh, one of my more popular AI posts on LinkedIn

11
00:00:43,040 --> 00:00:47,480
was sharing how my son got Copilot thrust upon him.

12
00:00:48,050 --> 00:00:51,230
And he said he'd rather just get the $25 a

13
00:00:51,230 --> 00:00:53,690
month directly from his company as a stipend.

14
00:00:53,690 --> 00:00:57,560
He, he just, it's, it's a tool he does not use.

15
00:01:03,420 --> 00:01:04,950
Welcome to Screaming in the Cloud.

16
00:01:05,310 --> 00:01:06,510
I'm Cory Quinn.

17
00:01:06,630 --> 00:01:09,540
It's been a month of Sundays since I had him on the show,

18
00:01:09,540 --> 00:01:13,590
but Keith Townsend, founder at The CTO Advisor, is back.

19
00:01:13,920 --> 00:01:14,500
Keith, how have you been?

20
00:01:15,540 --> 00:01:16,140
Been good.

21
00:01:16,200 --> 00:01:17,220
I've been doing good.

22
00:01:17,550 --> 00:01:21,810
Cory, I haven't screamed in the cloud in a long time, so I'll, I'll do the...

23
00:01:23,429 --> 00:01:25,020
I like that. Crying Out

24
00:01:25,020 --> 00:01:26,759
Cloud is one of the few cloud security

25
00:01:26,759 --> 00:01:29,190
podcasts that's actually fun to listen to.

26
00:01:29,280 --> 00:01:32,190
Smart conversations, great guests, and zero fluff.

27
00:01:32,250 --> 00:01:34,080
If you haven't heard of it, it's a cloud and

28
00:01:34,080 --> 00:01:37,590
AI security podcast from Wiz, run by cloud

29
00:01:37,590 --> 00:01:39,570
sec pros, for cloud sec pros.

30
00:01:39,810 --> 00:01:42,000
I was actually one of the first guests on the

31
00:01:42,000 --> 00:01:43,920
show, and it's been amazing to watch it grow.

32
00:01:43,979 --> 00:01:48,960
Make sure to check them out at wiz.io/crying-out-cloud.

33
00:01:49,110 --> 00:01:51,600
So since we've spoken, you got acquired by the

34
00:01:51,600 --> 00:01:54,149
Futurum Group and then have left the Futurum Group.

35
00:01:54,149 --> 00:01:56,550
It, it's almost like, uh, the acquisition door

36
00:01:56,550 --> 00:01:58,560
is one of those things to which you're a cat.

37
00:01:58,560 --> 00:02:01,229
Like I'm gonna always be perpetually on the wrong side of it.

38
00:02:01,619 --> 00:02:02,640
What are you up to these days?

39
00:02:02,640 --> 00:02:03,539
What's, what's exciting?

40
00:02:03,539 --> 00:02:03,960
What's fun?

41
00:02:03,960 --> 00:02:04,500
What are you seeing?

42
00:02:05,009 --> 00:02:08,820
The exciting thing is, uh, beyond AI,

43
00:02:09,180 --> 00:02:13,109
is that the enterprise has kind of grown up,

44
00:02:13,109 --> 00:02:14,940
and I think we'll talk about this a little bit.

45
00:02:15,060 --> 00:02:18,300
They've realized that they're not going to convert

46
00:02:18,870 --> 00:02:21,989
To any one platform and they just have to deal with

47
00:02:21,989 --> 00:02:24,120
the complexity, at least most enterprises have.

48
00:02:24,209 --> 00:02:26,579
And that is, you know, kind of the, the, the

49
00:02:26,579 --> 00:02:28,590
people who have followed me know the brand.

50
00:02:28,590 --> 00:02:29,519
That's what I deal with.

51
00:02:29,519 --> 00:02:33,239
The mucky middle, not necessarily the sexy technologies.

52
00:02:33,660 --> 00:02:37,799
The, the, the grinding through and just figuring out the

53
00:02:37,799 --> 00:02:41,820
old and crusty, which is interfacing with the new, is sexy.

54
00:02:42,945 --> 00:02:46,395
I stand corrected on a position I took a long time ago

55
00:02:46,395 --> 00:02:49,065
that, uh, multi-cloud was sort of a worst practice.

56
00:02:49,070 --> 00:02:51,405
It, it may be, but it's what everyone's doing.

57
00:02:51,465 --> 00:02:54,735
Uh, even at anything beyond the trivial scale. And I've

58
00:02:54,735 --> 00:02:58,215
seen companies fighting this since I came up in tech.

59
00:02:58,275 --> 00:03:01,305
When I got my first CCNA cert from Cisco back in the

60
00:03:01,305 --> 00:03:03,945
day, the entire certification assumed it was a giant

61
00:03:03,945 --> 00:03:06,255
Cisco universe that never had to interoperate with

62
00:03:06,315 --> 00:03:07,155
Anything else.

63
00:03:07,395 --> 00:03:11,325
AWS was the same way for a long time, but they just got beaten up in

64
00:03:11,325 --> 00:03:14,925
this year's Gartner report for not playing well with multi-cloud.

65
00:03:15,075 --> 00:03:17,805
So I'm sure they're going to finally pay attention to it because

66
00:03:17,805 --> 00:03:20,984
if you want AWS to care about something, make Gartner talk

67
00:03:20,984 --> 00:03:24,555
about it, and suddenly you can see their roadmap accommodate it.

68
00:03:24,625 --> 00:03:27,295
Every company of scale uses a little bit of everything.

69
00:03:27,475 --> 00:03:32,305
There's definitely going to be strong, uh, biases and significant outliers.

70
00:03:32,455 --> 00:03:34,165
They're not gonna look at the three hyperscalers

71
00:03:34,165 --> 00:03:36,475
and split their workloads into thirds.

72
00:03:36,895 --> 00:03:40,555
There's gonna be something that is inherently dominant there, but every

73
00:03:40,555 --> 00:03:43,375
company needs to be able to walk and chew gum at the same time in this era.

74
00:03:44,035 --> 00:03:47,065
Yeah, we talked about it many moons ago.

75
00:03:47,065 --> 00:03:49,270
It was, you need to be an expert in something.

76
00:03:50,295 --> 00:03:54,165
And then you need to be fairly decent in everything else.

77
00:03:54,165 --> 00:03:57,435
And I think early cloud, that was really tough to be

78
00:03:57,435 --> 00:04:00,825
an expert in one thing and decent in other things.

79
00:04:01,125 --> 00:04:02,804
Now it's just, it's table stakes and we

80
00:04:02,804 --> 00:04:04,605
can do it a little bit better now with AI.

81
00:04:04,605 --> 00:04:06,135
But it, it is table stakes.

82
00:04:06,280 --> 00:04:11,565
It, it has gotten easier, I will say, in that it's possible now for a single

83
00:04:11,565 --> 00:04:15,135
person to wrap their head around a lot of different things, just because

84
00:04:15,135 --> 00:04:19,060
you don't need to be a deep wizard just to use the platforms anymore.

85
00:04:20,010 --> 00:04:25,110
Yeah, I've, I've been waist deep in GCP the past

86
00:04:25,680 --> 00:04:30,510
48 hours, and, uh, it's a horrid experience.

87
00:04:30,659 --> 00:04:36,630
Like, not that it's a bad platform, I'm just more familiar with AWS and

88
00:04:36,659 --> 00:04:42,659
even working with AI, I am more familiar with calling OpenAI, uh, APIs.

89
00:04:43,500 --> 00:04:48,330
Uh, I've gone super fast with Cursor, and it's allowed me to, like...

90
00:04:49,155 --> 00:04:51,405
All the Google command line tools, I'm not even

91
00:04:51,405 --> 00:04:55,875
paying any attention to any of the gcloud, whatever.

92
00:04:56,880 --> 00:05:01,050
Cursor takes care of it for me, and I'm just orchestrating. What I know

93
00:05:01,050 --> 00:05:03,990
to be: when it gets in this troubleshooting loop, I know it's time for

94
00:05:03,990 --> 00:05:09,210
me to step in and offer some wisdom. And it is an amazing world.

95
00:05:09,210 --> 00:05:11,520
It allows me to build apps that I, that I've,

96
00:05:11,550 --> 00:05:13,860
quite frankly, I'm not qualified to build.

97
00:05:13,860 --> 00:05:16,830
And if, uh, if you're depending on one of these

98
00:05:16,830 --> 00:05:19,440
apps for production, you should probably be fired.

99
00:05:20,355 --> 00:05:21,405
Oh, absolutely.

100
00:05:21,705 --> 00:05:26,145
People love to talk smack about the AI code that it spits out, but it's also,

101
00:05:26,355 --> 00:05:29,565
look, I'm also not spending three hours anymore trying to figure out the right

102
00:05:29,565 --> 00:05:33,285
parameters to an undocumented library 'cause someone couldn't be bothered to

103
00:05:33,285 --> 00:05:36,945
give an example of how to use the thing, like I just breeze past that stuff.

104
00:05:37,035 --> 00:05:39,915
Now I have other problems, but I, I'm not a front end guy.

105
00:05:40,185 --> 00:05:44,350
However, I can build something now and then have AI slap a front end

106
00:05:44,789 --> 00:05:47,190
In front of it that actually looks halfway

107
00:05:47,190 --> 00:05:50,550
decent, and it's been a big unlock for me.

108
00:05:50,550 --> 00:05:52,289
I have a whole bunch of silly utilities I

109
00:05:52,289 --> 00:05:54,240
use just as a part of my daily workflow.

110
00:05:54,450 --> 00:05:58,500
I was using Retool for years to write my newsletter, and now I have something

111
00:05:58,500 --> 00:06:03,330
that's custom-featured on the same API, and it works on my phone now.

112
00:06:03,330 --> 00:06:05,909
I don't need to wait till I'm at a desktop to write the thing.

113
00:06:06,405 --> 00:06:09,330
I, I have a thing that... I built

114
00:06:09,360 --> 00:06:12,690
basically an AI-powered, uh, RSS reader.

115
00:06:13,170 --> 00:06:16,920
So, you know, building the RSS reader is super,

116
00:06:17,160 --> 00:06:21,090
relatively trivial, but building one that runs

117
00:06:22,784 --> 00:06:26,534
the feeds through OpenAI, which then, you know,

118
00:06:26,534 --> 00:06:30,255
filters what I should be paying attention to and gives

119
00:06:30,255 --> 00:06:33,195
me summaries, AI-generated summaries, and all of that.

120
00:06:33,465 --> 00:06:36,075
That was a pretty trivial tool to build

121
00:06:36,075 --> 00:06:38,684
that's been super helpful in my workflow.

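Keith's filter-and-summarize loop can be sketched in a few lines. This is a hypothetical illustration, not his actual code: the feed titles, model name, and prompt wording are invented, and the only real API assumed is OpenAI's chat completions REST endpoint.

```python
import json
import urllib.request

def build_filter_prompt(items):
    # Number each feed item so the model can answer with just index numbers.
    lines = "\n".join(f"{i}. {item['title']}" for i, item in enumerate(items))
    return [
        {"role": "system",
         "content": "You triage RSS items. Reply with a JSON array of the "
                    "index numbers worth reading, and nothing else."},
        {"role": "user", "content": lines},
    ]

def ask_openai(messages, api_key, model="gpt-4o-mini"):
    # One call to the chat completions endpoint; the model name is illustrative.
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

items = [
    {"title": "Vendor press release: now with AI"},
    {"title": "Deep dive: debugging production DNS"},
]
messages = build_filter_prompt(items)
# ask_openai(messages, api_key="sk-...") returns the model's reply text.
```

A real version would fetch and parse the feeds first, then run a second pass asking for summaries of only the items the model kept.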
122
00:06:40,230 --> 00:06:42,300
Yeah, I should really look into doing something like that.

123
00:06:42,300 --> 00:06:44,610
I have about a hundred and change RSS feeds

124
00:06:44,610 --> 00:06:46,590
these days that I wind up just consuming.

125
00:06:46,590 --> 00:06:49,410
And a lot of 'em don't publish often, but I still wind up getting

126
00:06:49,410 --> 00:06:51,630
a few hundred items a day, and I just have it hanging out in

127
00:06:51,630 --> 00:06:54,660
the corner, and I just glance at it periodically here and there.

128
00:06:54,660 --> 00:06:56,110
I'm running FreshRSS on, like, a

129
00:06:56,335 --> 00:06:59,995
Kubernetes test cluster at home, and it just sits there aggregating all

130
00:06:59,995 --> 00:07:04,885
the stuff I care about, but 90 to 95% of it is just, okay, good to know.

131
00:07:04,885 --> 00:07:07,465
I don't need to read this, especially in the era of AI

132
00:07:07,465 --> 00:07:11,635
generated slop, but the things that are good are really good.

133
00:07:12,115 --> 00:07:16,525
So having something that would act as a filter on that is not a terrible idea.

134
00:07:17,065 --> 00:07:20,095
Well, that's the other thing that I've

135
00:07:20,095 --> 00:07:22,855
found to be incredibly enabling, not just about

136
00:07:24,450 --> 00:07:29,730
AI on the development side, I've developed AI processes, like

137
00:07:29,760 --> 00:07:36,630
I've tuned ChatGPT and Gemini to the nth degree, and I can

138
00:07:36,630 --> 00:07:41,490
churn out a really high quality blog post in about an hour.

139
00:07:42,540 --> 00:07:44,055
This is something that used to take me

140
00:07:46,140 --> 00:07:48,990
Hours, if not days, depending on the topic.

141
00:07:49,469 --> 00:07:54,450
And I can, and this is the thing I have to be careful on, I can speak on almost

142
00:07:54,450 --> 00:07:59,669
any topic now because I know enough about enough things to be dangerous, and the

143
00:07:59,669 --> 00:08:06,750
AI, uh, extends my ability to hallucinate to another area, and, you know, wow.

144
00:08:06,750 --> 00:08:09,900
The CTO Advisor said it, so it must be true.

145
00:08:09,900 --> 00:08:13,650
And it's been, uh, it's been something that...

146
00:08:14,220 --> 00:08:17,910
Trying to stay within my lane and just go deeper in my, my

147
00:08:17,910 --> 00:08:22,110
expertise has been what I've attempted to discipline myself in.

148
00:08:23,534 --> 00:08:23,655
Yeah.

149
00:08:23,655 --> 00:08:27,674
What I've been doing that I find works pretty well is I'll

150
00:08:27,765 --> 00:08:30,885
write a draft of something and then have AI turn it from my

151
00:08:30,885 --> 00:08:33,554
stream of consciousness into something a lot more structured.

152
00:08:33,855 --> 00:08:36,645
And it, it's taken a while for me to get the prompts right

153
00:08:36,885 --> 00:08:40,814
and Lord knows after that I have some editing to do because

154
00:08:40,814 --> 00:08:44,054
if I just drop that out as written, it doesn't work.

155
00:08:44,385 --> 00:08:46,845
But it becomes a good collaborator.

156
00:08:47,170 --> 00:08:49,750
And it definitely cuts the cycle time on this.

157
00:08:50,079 --> 00:08:54,130
And I've also found it to be very helpful for writing the initial

158
00:08:54,130 --> 00:08:56,829
outline of conference talks, which is something I've been terrible at.

159
00:08:57,010 --> 00:09:00,340
I generally know what I wanna say in a conference talk, but how

160
00:09:00,340 --> 00:09:03,010
to get there and how to tie it into the broader theme of whatever

161
00:09:03,010 --> 00:09:08,140
past me pitched on the CFP has always been a bit of a challenge, and

162
00:09:08,560 --> 00:09:12,520
getting that structure just really serves as a massive unlock for me.

163
00:09:13,060 --> 00:09:13,300
Yeah.

164
00:09:13,300 --> 00:09:14,985
If you think about it, the...

165
00:09:16,770 --> 00:09:21,330
This generation of generative AI is a derivative

166
00:09:21,330 --> 00:09:23,730
of Transformers, language transformers.

167
00:09:23,730 --> 00:09:28,860
So the thing that I tell, uh, anyone who is considering an AI

168
00:09:28,860 --> 00:09:34,050
project is that generative AI's expertise is in language.

169
00:09:34,140 --> 00:09:37,530
It is an incredible language tool.

170
00:09:38,370 --> 00:09:42,510
It can write much better than most of us.

171
00:09:43,680 --> 00:09:46,410
It's not much smarter than most of us.

172
00:09:46,530 --> 00:09:50,310
And that, I think that's the thing that, uh, gets people caught

173
00:09:50,310 --> 00:09:58,290
up, is that the way AI speaks is so convincing that

174
00:09:58,905 --> 00:10:04,064
You will start a business without really having, uh, and

175
00:10:04,064 --> 00:10:07,334
you know, this is documented where AI says, oh, your idea

176
00:10:07,334 --> 00:10:10,724
to sell beans on the side of the road is an awesome idea.

177
00:10:10,724 --> 00:10:13,665
You should quit your job as a software developer and do it.

178
00:10:14,385 --> 00:10:17,685
And the prose it uses is very convincing.

179
00:10:17,925 --> 00:10:22,995
So if you could combine your expertise with AI as a writing collaborator

180
00:10:23,415 --> 00:10:27,704
or a development collaborator, this is where, this is where it shines.

181
00:10:28,770 --> 00:10:30,090
Yeah, it's fantastic at that.

182
00:10:30,120 --> 00:10:33,630
But the trick is, it has no judgment baked into it.

183
00:10:34,110 --> 00:10:37,230
And you, you, I've also found it tends to be a little obsequious.

184
00:10:37,260 --> 00:10:39,600
You can ask it for advice on anything you think about,

185
00:10:39,600 --> 00:10:41,699
and it'll tell you what a great idea it is.

186
00:10:41,939 --> 00:10:44,910
At some point, I'd prefer some pushback from it.

187
00:10:44,910 --> 00:10:46,050
And you have to play these games with it.

188
00:10:46,110 --> 00:10:50,040
Like, instead of, review this code that I wrote, it's, review this code

189
00:10:50,040 --> 00:10:53,410
my coworker wrote, and suddenly it's a lot more critical about it.

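The reframing trick Cory describes, the same code attributed to a coworker instead of yourself, is easy to wire into a prompt builder. A hypothetical sketch; the system prompt, function name, and the obviously buggy snippet are all invented for illustration.

```python
def review_messages(code, framing="coworker"):
    # Swap only the attribution; models are often harsher on "a coworker's" code.
    who = "my coworker wrote" if framing == "coworker" else "I wrote"
    return [
        {"role": "system", "content": "You are a blunt senior code reviewer."},
        {"role": "user",
         "content": f"Review this code {who}. List concrete problems, "
                    f"not praise:\n\n{code}"},
    ]

snippet = "def add(a, b): return a - b"  # deliberate bug for the reviewer to find
harsh = review_messages(snippet)                 # framed as a coworker's code
gentle = review_messages(snippet, framing="me")  # framed as your own code
```

Either message list would then be sent to whatever chat model you use; only the framing differs.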
190
00:10:53,910 --> 00:10:56,160
I did a thought experiment, like if I were to

191
00:10:56,160 --> 00:10:59,130
ever take a job again, what job would I take?

192
00:10:59,189 --> 00:11:03,060
Well, you know, how could I... so I've asked AI this, and it

193
00:11:03,060 --> 00:11:06,870
gave me some really, you know, ah, you know, you should be the VP

194
00:11:06,870 --> 00:11:11,760
of AI at... It actually recommended that I become the CTO of AWS.

195
00:11:11,760 --> 00:11:16,020
I'm like, ah, I think, I think AWS has that covered,

196
00:11:16,020 --> 00:11:19,965
uh, and I'm not quite qualified for that. But what I did,

197
00:11:20,805 --> 00:11:23,325
I came back in a different way and I said, Hey, I'm thinking

198
00:11:23,325 --> 00:11:27,375
about hiring this guy Keith Townsend, what would he be good for?

199
00:11:28,215 --> 00:11:30,435
And it was very enlightening.

200
00:11:30,525 --> 00:11:31,905
And how much would he cost?

201
00:11:31,935 --> 00:11:33,105
How much should I pay him?

202
00:11:33,165 --> 00:11:34,665
Oh, that sounds like an awful lot.

203
00:11:34,665 --> 00:11:36,765
How can I negotiate the salary down?

204
00:11:37,335 --> 00:11:39,645
And then I flipped it and said, Hey, I'm Keith Townsend.

205
00:11:39,645 --> 00:11:40,725
What did it do?

206
00:11:40,725 --> 00:11:43,755
And it was like, oh, you son of a... but it is,

207
00:11:43,935 --> 00:11:47,835
it is a, it is a great way to get that pushback.

208
00:11:48,840 --> 00:11:54,960
Yeah, I think that a lot of folks are looking, just, it's a personality

209
00:11:54,960 --> 00:11:58,050
aspect, are looking for confirmation of their existing biases that they

210
00:11:58,050 --> 00:12:02,040
go into things with, and that's what it's effectively tuned on.

211
00:12:02,415 --> 00:12:04,995
I've learned over the years that when people want to grab coffee

212
00:12:04,995 --> 00:12:08,955
with me and get my thoughts on a next job that they're thinking of

213
00:12:08,985 --> 00:12:12,615
or a product they wanna build, I've learned that I have to lead with:

214
00:12:12,825 --> 00:12:15,945
Are you looking for advice or are you looking for validation?

215
00:12:16,275 --> 00:12:19,845
Because if I think they're looking for advice and they're really looking for

216
00:12:19,845 --> 00:12:23,550
me to agree with them, suddenly we are not as friendly as we used to be.

217
00:12:24,005 --> 00:12:26,885
And it's not even an intentional thing that people do.

218
00:12:26,915 --> 00:12:28,625
It's an unconscious thing.

219
00:12:28,985 --> 00:12:31,355
'cause if you ask people objectively, do you

220
00:12:31,355 --> 00:12:33,185
want advice or just me to agree with you?

221
00:12:33,425 --> 00:12:35,735
No one's gonna say, oh, I just want you to agree with me.

222
00:12:36,005 --> 00:12:38,105
But by even asking that, it suddenly makes people

223
00:12:38,105 --> 00:12:40,655
realize, oh wait, what do I actually want here?

224
00:12:40,685 --> 00:12:40,925
Okay.

225
00:12:40,925 --> 00:12:43,145
I do want critical feedback and it makes the

226
00:12:43,145 --> 00:12:44,765
rest of the conversation a lot smoother.

227
00:12:44,855 --> 00:12:47,555
And I don't leave as many damaged friendships in my wake.

228
00:12:48,965 --> 00:12:49,355
Yeah.

229
00:12:49,355 --> 00:12:51,695
The, uh, it's the Kobe Bryant,

230
00:12:52,260 --> 00:12:55,500
uh, approach. Like, you have this thing in your teeth.

231
00:12:55,530 --> 00:12:57,930
Like, do you want me to tell you you have the thing in your teeth,

232
00:12:57,930 --> 00:13:01,140
or do you just want to keep talking with the thing in your teeth?

233
00:13:01,320 --> 00:13:02,970
If you want to, do you want to...

234
00:13:02,970 --> 00:13:07,170
Uh, I, I, I tell people this all the time.

235
00:13:07,230 --> 00:13:11,100
I'm way more interested in winning than I am in not losing.

236
00:13:12,495 --> 00:13:16,515
And, uh, it's a very nuanced distinction.

237
00:13:16,515 --> 00:13:20,145
And, uh, anyone who's an entrepreneur kind of understands

238
00:13:20,145 --> 00:13:23,595
this by nature. Like, why don't you go work for someone?

239
00:13:23,625 --> 00:13:26,105
And the idea is, well, then I'm kind of, you know,

240
00:13:26,790 --> 00:13:29,190
Not necessarily selling, but I'm trying to get into

241
00:13:29,190 --> 00:13:33,570
protective mode where, uh, winning is someone else's

242
00:13:33,720 --> 00:13:37,350
problem versus a shared problem, versus being an entrepreneur.

243
00:13:37,710 --> 00:13:42,150
Winning is... uh, the problem is a hundred percent your problem.

244
00:13:42,150 --> 00:13:46,235
You may have a team that helps you get there, but the problem is your problem.

245
00:13:47,535 --> 00:13:49,665
Yes, very much so.

246
00:13:50,115 --> 00:13:54,825
What is your area versus what is something that you're willing to delegate?

247
00:13:54,855 --> 00:13:57,045
It's... I have found that

248
00:13:58,605 --> 00:14:00,915
In a lot of these cases as well,

249
00:14:00,915 --> 00:14:03,855
enterprises have taken a more reasoned approach, but they also are

250
00:14:03,855 --> 00:14:07,395
lagging in some ways in AI compared to a bunch of the upstarts.

251
00:14:07,545 --> 00:14:09,555
And it makes perfect sense, and I've been saying

252
00:14:09,555 --> 00:14:13,065
this for a while, that take Cursor as an example.

253
00:14:13,065 --> 00:14:16,365
AWS has a couple of options there now where they have their Q

254
00:14:16,365 --> 00:14:18,705
Developer and they have their Kiro thing that apparently they

255
00:14:18,705 --> 00:14:21,825
can't price well, according to a bunch of articles this week. But

256
00:14:22,560 --> 00:14:27,120
If Cursor does something absurd and starts saying problematic things after

257
00:14:27,120 --> 00:14:30,930
a release, it becomes a bit of a, heh, you know, that's how AI works, and

258
00:14:31,110 --> 00:14:33,720
they wind up fixing it and if anything, it becomes a PR boost for them.

259
00:14:33,750 --> 00:14:35,790
'cause people have heard about them as it makes the news cycle.

260
00:14:36,209 --> 00:14:40,050
But these big companies, if their products start doing this, they

261
00:14:40,050 --> 00:14:43,170
take reputational damage and they view this as an existential threat.

262
00:14:43,350 --> 00:14:46,650
So they're first and foremost looking at this through the lens of guardrails.

263
00:14:46,650 --> 00:14:48,720
How do we make sure it never goes off script?

264
00:14:49,170 --> 00:14:51,540
And that's not really where innovation tends to come from.

265
00:14:52,650 --> 00:14:57,540
Yeah, I, I published to the ChatGPT store.

266
00:14:58,230 --> 00:15:01,440
I published this, uh, RAG I had created. I had

267
00:15:01,740 --> 00:15:05,250
taken all of my blog posts over the past 10 years,

268
00:15:06,360 --> 00:15:09,780
formatted into a JSONL file.

269
00:15:10,020 --> 00:15:12,540
Uh, did a bunch of schema work.

270
00:15:12,900 --> 00:15:15,930
Like, I've spent a lot of time collecting

271
00:15:15,930 --> 00:15:19,200
and massaging the data, categorizing the data.

272
00:15:19,650 --> 00:15:23,130
I fed it to, uh, a custom GPT.

273
00:15:23,760 --> 00:15:26,460
It should have been very simple.

274
00:15:26,460 --> 00:15:34,350
Basically, ask virtual Keith a question. It worked in the builder.

275
00:15:34,949 --> 00:15:35,670
Published it.

276
00:15:35,670 --> 00:15:40,410
Someone said, Hey, wouldn't you be embarrassed if it said something that

277
00:15:40,410 --> 00:15:46,380
that puts you in a position that, uh, quite frankly, is embarrassing?

278
00:15:46,620 --> 00:15:47,640
I'm like, no, not really.

279
00:15:47,790 --> 00:15:49,469
I'm just, you know, it's just...

280
00:15:49,500 --> 00:15:53,310
Like, if you took this free advice from this free tool

281
00:15:53,475 --> 00:15:56,189
and it told you to do something bad and you did it

282
00:15:56,189 --> 00:15:58,890
anyway, man, well, you know, you're not really a grownup.

283
00:15:59,445 --> 00:16:02,325
I, I'd be, I'd be absolutely embarrassed if I were passing

284
00:16:02,325 --> 00:16:06,885
it off as me and not disclosing that it was AI powered.

285
00:16:06,975 --> 00:16:08,625
There's a universe of difference there.

286
00:16:08,685 --> 00:16:10,305
There's a universe of difference.

287
00:16:10,305 --> 00:16:11,205
So two points.

288
00:16:11,205 --> 00:16:13,815
One, it did do exactly that.

289
00:16:13,815 --> 00:16:15,225
It said embarrassing stuff.

290
00:16:15,945 --> 00:16:18,885
And then two, I learned exactly how fragile it was.

291
00:16:18,885 --> 00:16:27,495
Like, here's the corpus of data. Only, uh, perform prompt

292
00:16:28,110 --> 00:16:31,439
Responses off of this corpus and nothing else.

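The "answer only from this corpus" setup Keith describes can be approximated outside the GPT builder by retrieving excerpts yourself and pinning the prompt to them. A toy sketch, assuming a JSONL corpus of posts: the two posts and URLs are placeholders, and the keyword-overlap retrieval is a crude stand-in for real embeddings.

```python
import json

# Two fake posts standing in for ten years of blog archives.
corpus_jsonl = """\
{"title": "Hybrid by default", "url": "https://example.com/hybrid", "body": "Most enterprises run a mix of platforms and live with the complexity."}
{"title": "AI is a language tool", "url": "https://example.com/ai", "body": "Generative AI is a language tool first, not a reasoning engine."}
"""
posts = [json.loads(line) for line in corpus_jsonl.splitlines()]

def retrieve(question, posts, k=1):
    # Crude keyword overlap; a real system would rank by embedding similarity.
    q = set(question.lower().split())
    ranked = sorted(posts, key=lambda p: -len(q & set(p["body"].lower().split())))
    return ranked[:k]

def grounded_prompt(question, excerpts):
    # Pin the model to the retrieved excerpts and their bracketed URLs.
    cited = "\n".join(f"[{p['url']}] {p['body']}" for p in excerpts)
    return ("Answer ONLY from the excerpts below and cite the bracketed URL. "
            "If they don't cover the question, say so.\n\n"
            f"{cited}\n\nQuestion: {question}")

hits = retrieve("is generative ai a language tool", posts)
prompt = grounded_prompt("is generative ai a language tool", hits)
```

Even with a pinned prompt like this, models can still invent quotes, which is exactly the fragility Keith hit; the practical mitigation is checking every cited URL and quote against the corpus before showing the answer.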
293
00:16:32,595 --> 00:16:36,795
The links, the URLs, are in the JSON, and it

294
00:16:36,795 --> 00:16:40,995
would just generate quote after quote after quote.

295
00:16:41,175 --> 00:16:44,805
That looked nothing like what I would say.

296
00:16:44,835 --> 00:16:48,195
Nothing I've said, and it would still hallucinate.

297
00:16:48,540 --> 00:16:51,720
Uh, references, even though the references were there.

298
00:16:51,720 --> 00:16:59,460
So that is modern-day AI, at least the AI that big enterprises are trying to avoid.

299
00:16:59,490 --> 00:17:03,480
So if you're a Fortune 500 shop and this is what you have to

300
00:17:03,480 --> 00:17:07,349
work with and you're told, don't do anything embarrassing.

301
00:17:07,950 --> 00:17:08,220
Yeah.

302
00:17:08,580 --> 00:17:11,520
Would you give an intern with very little judgment access

303
00:17:11,520 --> 00:17:14,340
to speak on your behalf without someone editing it?

304
00:17:14,430 --> 00:17:15,660
Probably not.

305
00:17:15,660 --> 00:17:19,440
I mean, this ties back to something I've been saying for a very

306
00:17:19,440 --> 00:17:22,650
long time, and it seems that corporate comms departments haven't gotten it.

307
00:17:22,950 --> 00:17:27,060
You can outsource the work, but you cannot outsource the responsibility.

308
00:17:27,420 --> 00:17:31,400
There's a... well, one of our contractors got breached.

309
00:17:31,400 --> 00:17:32,000
Like, wow.

310
00:17:32,090 --> 00:17:32,360
Sure

311
00:17:32,360 --> 00:17:35,780
do wonder what company hired those contractors who got breached.

312
00:17:35,780 --> 00:17:37,190
It's terrific.

313
00:17:37,310 --> 00:17:39,080
I trusted you with the information.

314
00:17:39,080 --> 00:17:42,560
You are the one that did a bad job of vendor selection and they got popped.

315
00:17:42,770 --> 00:17:42,980
Yeah.

316
00:17:42,980 --> 00:17:46,040
That, that sounds like an internal problem, uh, from my

317
00:17:46,040 --> 00:17:49,705
perspective, but you seem to think it's this get-out-of-jail-free pass.

318
00:17:49,895 --> 00:17:50,345
It's not,

319
00:17:51,175 --> 00:17:55,160
Yeah, I, I remember the, uh, early days of, uh, e-commerce.

320
00:17:55,490 --> 00:17:56,300
I'm that old.

321
00:17:57,555 --> 00:18:03,345
That, uh, I ordered something online, it didn't come, and I

322
00:18:03,345 --> 00:18:07,005
called the vendor, I don't even remember who it was, and they

323
00:18:07,005 --> 00:18:11,745
said, oh, well we gave it to UPS and they didn't deliver it.

324
00:18:12,315 --> 00:18:14,355
And I said, well, how is that my problem?

325
00:18:15,495 --> 00:18:16,455
Did you not?

326
00:18:16,485 --> 00:18:17,805
Did I hire UPS?

327
00:18:17,805 --> 00:18:19,065
Did I select UPS?

328
00:18:19,305 --> 00:18:20,265
It's your vendor.

329
00:18:20,265 --> 00:18:21,254
It's you.

330
00:18:21,675 --> 00:18:25,605
And I think, uh, in this world of abstraction, on top of abstraction,

331
00:18:25,605 --> 00:18:29,565
on top of abstraction, from an architecture perspective, we're

332
00:18:29,565 --> 00:18:33,285
getting that. Like, us architects, we're starting to, not just

333
00:18:33,285 --> 00:18:38,385
starting, we realize that we're responsible for the system.

334
00:18:38,730 --> 00:18:42,300
Throughout the whole lifecycle of the system, whether or not we're

335
00:18:42,300 --> 00:18:47,310
outsourcing infrastructure to AWS, whether or not we're abstracting

336
00:18:47,310 --> 00:18:52,470
data and data management to Google with BigQuery or whatever

337
00:18:52,470 --> 00:18:57,630
the platform, ultimately we're responsible for the outputs of

338
00:18:57,630 --> 00:19:01,740
the system, and AI has just immensely complicated that problem.

339
00:19:02,640 --> 00:19:03,750
Yes, massively.

340
00:19:05,100 --> 00:19:05,340
It's...

341
00:19:06,104 --> 00:19:08,909
It's hard to say what the future's gonna hold around this, but

342
00:19:09,705 --> 00:19:13,514
I, I do think that some things are going to be more or less permanent, where

343
00:19:13,965 --> 00:19:17,264
enterprises are still going to feel the need to keep up with the Joneses.

344
00:19:17,264 --> 00:19:21,314
I think everyone has still lost their minds and has stuffed the AI hype into

345
00:19:21,314 --> 00:19:25,215
every product left and right, to the point where I'm scared to update apps now,

346
00:19:25,425 --> 00:19:29,415
just on the basis of what are they going to shove down my throat. I mean, Zoom.

347
00:19:29,705 --> 00:19:30,935
Great example of this.

348
00:19:31,024 --> 00:19:34,835
I can't join a meeting anymore on Zoom without it popping up, talking

349
00:19:34,835 --> 00:19:39,304
about Zoom docs and its email collaboration suite, and its chat.

350
00:19:40,085 --> 00:19:43,205
As a consultant, I get to talk to an awful lot of companies out there,

351
00:19:43,205 --> 00:19:47,915
and I have yet to discover a single company using it for those things.

352
00:19:48,034 --> 00:19:49,264
Who's buying this?

353
00:19:49,264 --> 00:19:52,024
And I'm starting to have the creeping sensation that maybe it's nobody.

354
00:19:52,905 --> 00:19:59,325
Well, the, uh, you know, the big, slow companies are my jam.

355
00:19:59,355 --> 00:20:03,945
There's, this is where I operate in, and it used to

356
00:20:03,945 --> 00:20:08,025
be just, it was IT that would hold back a company.

357
00:20:08,655 --> 00:20:13,665
But customer after customer I've talked to, it is not just IT anymore.

358
00:20:13,665 --> 00:20:18,555
Like when it comes to AI, the folks who

359
00:20:18,615 --> 00:20:21,585
bought cloud before IT would bless cloud.

360
00:20:22,514 --> 00:20:23,115
AI?

361
00:20:23,145 --> 00:20:29,024
They don't want to touch it with a 10-foot pole until it's blessed by IT.

362
00:20:29,325 --> 00:20:31,034
They don't want to be the next person in

363
00:20:31,034 --> 00:20:34,405
the news. Uh, I find it strange whether it's

364
00:20:35,024 --> 00:20:39,705
business users not really understanding AI, or business users understanding

365
00:20:39,735 --> 00:20:45,705
AI enough to know that it's too dangerous for even them to use at this point.

366
00:20:45,705 --> 00:20:49,514
So it is, it is IT. It has been an interesting journey.

367
00:20:49,514 --> 00:20:53,955
I'm, I'm surprised at how resistant to adoption they are; even

368
00:20:53,955 --> 00:20:58,305
when they find AI tools being forced upon them in

369
00:20:58,305 --> 00:21:03,165
Salesforce or SAP or something, they don't use these tools.

370
00:21:04,274 --> 00:21:06,764
Skip the stressful war rooms and go straight

371
00:21:06,764 --> 00:21:09,615
to the answers with Sumo Logic Dojo AI.

372
00:21:09,855 --> 00:21:12,585
When you add Dojo AI agents to your team, they

373
00:21:12,585 --> 00:21:14,985
get context from your live production data.

374
00:21:15,165 --> 00:21:17,955
They deliver potential threats and actionable

375
00:21:17,955 --> 00:21:20,475
insights, guiding your investigation to the root

376
00:21:20,510 --> 00:21:24,980
cause. Use natural language to ask your most pressing questions as

377
00:21:24,980 --> 00:21:28,520
it brings in the right agent for that point in the investigation workflow.

378
00:21:28,760 --> 00:21:35,090
Learn more at sumologic.com/solutions/dojo-ai.

379
00:21:35,360 --> 00:21:37,250
You ask a bunch of people what they want to be when they grow up.

380
00:21:37,280 --> 00:21:39,440
No one's gonna say a cautionary tale.

381
00:21:39,740 --> 00:21:43,370
I, I think that it's pretty clear that a lot of this value is

382
00:21:43,370 --> 00:21:46,940
not there in that these companies have to mandate the use of it.

383
00:21:47,000 --> 00:21:48,590
It's a push, not a pull.

384
00:21:48,920 --> 00:21:49,770
I, I think that.

385
00:21:50,340 --> 00:21:52,950
There's a lot of value in AI that is going to be realized

386
00:21:52,950 --> 00:21:55,950
on a personal level, maybe not the enterprise level.

387
00:21:55,950 --> 00:22:00,330
I was talking with Ed Zitron on this show a couple months back, and part of

388
00:22:00,629 --> 00:22:04,680
his opinion on this, and I think he's onto something, is there is value there,

389
00:22:04,680 --> 00:22:08,460
but what if that value is, like, a $50-billion-a-year market that doesn't

390
00:22:08,460 --> 00:22:12,810
justify all of the investment and hype and nonsense that's gone into this.

391
00:22:12,810 --> 00:22:14,190
So we're, we're headed for a reckoning.

392
00:22:14,740 --> 00:22:18,280
Yeah, I talked to the CTO of a big biopharm.

393
00:22:18,940 --> 00:22:25,510
It's about 24,000 employees, and he said Microsoft 365.

394
00:22:25,570 --> 00:22:27,520
The, uh, Copilot?

395
00:22:27,700 --> 00:22:29,320
No, just they're used.

396
00:22:29,350 --> 00:22:31,480
I mean, they're all in, in 365.

397
00:22:31,930 --> 00:22:32,770
SharePoint.

398
00:22:33,630 --> 00:22:36,780
Of course, email the whole shebang.

399
00:22:36,960 --> 00:22:42,360
And, uh, he said, O365, uh, Copilot? Just a hard no.

400
00:22:42,360 --> 00:22:45,360
What, what are his end users going to do with it?

401
00:22:45,840 --> 00:22:48,660
And Microsoft isn't discounting it, like, it is

402
00:22:49,450 --> 00:22:56,680
a full, whatever it was, 35, 25 bucks per user per month, times 24,000 users.

403
00:22:57,100 --> 00:23:01,150
And I think I, uh, one of my more popular AI posts on LinkedIn

404
00:23:01,150 --> 00:23:05,560
was sharing how my son got Copilot thrust upon him.

405
00:23:06,130 --> 00:23:09,340
And he said he'd rather just, just get the $25 a

406
00:23:09,340 --> 00:23:11,770
month directly from his company as a stipend.

407
00:23:11,770 --> 00:23:15,640
He, he just, it's, it's a tool he does not use.

408
00:23:16,260 --> 00:23:18,120
What I also find weird is that all of these

409
00:23:18,120 --> 00:23:19,710
tools are converging on the same things.

410
00:23:19,710 --> 00:23:21,600
It's, how many chatbots do I really need

411
00:23:21,600 --> 00:23:23,550
to interface with in the course of a day?

412
00:23:23,910 --> 00:23:27,660
And it really bugs me when it starts insisting upon itself.

413
00:23:27,840 --> 00:23:30,420
Where if I open a Google Doc, 'cause that's what we use for some of

414
00:23:30,420 --> 00:23:33,660
our collaboration internally, and it's like, Hey, write with Gemini.

415
00:23:33,960 --> 00:23:37,140
It, it feels like the intrinsic message that

416
00:23:37,140 --> 00:23:39,540
it's saying is that you can't write this yourself.

417
00:23:39,780 --> 00:23:42,360
So much of the marketing that I'm seeing implies that

418
00:23:42,360 --> 00:23:45,540
the user is lazy, unethical, or a combination of the two.

419
00:23:46,200 --> 00:23:48,450
And that does not sit well.

420
00:23:48,450 --> 00:23:49,710
It's, you're bad at your job.

421
00:23:49,710 --> 00:23:52,680
Let the computer do it has never been compelling to me.

422
00:23:52,890 --> 00:23:55,830
You know, Microsoft was disappointed that people weren't

423
00:23:55,830 --> 00:24:01,830
using LinkedIn's AI writing, and I've tried it a few times.

424
00:24:02,070 --> 00:24:02,970
It's really bad.

425
00:24:03,840 --> 00:24:07,170
Well, no, it's not that it's, it's not just that it's bad.

426
00:24:07,170 --> 00:24:10,080
It's bad because it's not interactive.

427
00:24:10,830 --> 00:24:13,290
The, it'll suggest a post.

428
00:24:13,290 --> 00:24:16,290
And if you've used AI to write, this is not how

429
00:24:16,290 --> 00:24:20,010
you use AI to write. It's iterative.

430
00:24:20,010 --> 00:24:22,770
You're like, oh, uh, I don't like this.

431
00:24:22,800 --> 00:24:24,480
Uh, you're missing a theme.

432
00:24:24,480 --> 00:24:30,095
You've taken out my voice, or whatever you're trying to get it to, uh, get to.

433
00:24:30,690 --> 00:24:33,090
It can help me in my social posts.

434
00:24:33,390 --> 00:24:38,130
Matter of fact, I use ChatGPT for a good majority of my social posts,

435
00:24:38,550 --> 00:24:42,810
not because it's integrated into the platform, but because it has

436
00:24:42,810 --> 00:24:47,700
learned my voice and I can push back and I can, I can critique it

437
00:24:47,700 --> 00:24:49,890
and I can say, ah, you don't know what the hell you're talking about.

438
00:24:49,890 --> 00:24:52,680
I'm just gonna copy and paste what I originally started with.

439
00:24:52,725 --> 00:24:55,215
Or it can help me tease out ideas.

440
00:24:55,395 --> 00:24:57,165
That's not what the LinkedIn experience is,

441
00:24:57,165 --> 00:24:59,475
that's not what the experience is in SAP.

442
00:24:59,715 --> 00:25:02,084
That's not what the experience is in Salesforce.

443
00:25:02,115 --> 00:25:04,754
It is not iterative, it is not collaborative.

444
00:25:05,084 --> 00:25:10,245
It is kind of like, oh, you're, you don't know your job as a

445
00:25:10,245 --> 00:25:13,814
logistics professional, let me tell you how to do your job.

446
00:25:15,300 --> 00:25:15,780
Right.

447
00:25:15,870 --> 00:25:17,760
And that is wildly frustrating to me.

448
00:25:17,774 --> 00:25:20,879
I, I will use it for social posts, but it's in

449
00:25:20,879 --> 00:25:23,520
the context of here's a thing that I wanna write.

450
00:25:24,405 --> 00:25:27,465
Make this punchier, make it fit, wordsmith this.

451
00:25:27,495 --> 00:25:28,905
I have 300 characters.

452
00:25:28,905 --> 00:25:29,955
This is 320.

453
00:25:30,135 --> 00:25:30,525
Get it.

454
00:25:30,585 --> 00:25:32,085
Uh, come up with some turns of phrase.

455
00:25:32,175 --> 00:25:34,305
And I've also found it's terrific if you don't just

456
00:25:34,305 --> 00:25:37,365
ask it to do that, but say, give me 10 options, and then

457
00:25:37,365 --> 00:25:40,065
you can pick phrases from the various ways it frames it.

458
00:25:40,095 --> 00:25:41,925
And that works super well.

459
00:25:42,045 --> 00:25:43,575
But it's iterative, it's collaborative.

460
00:25:43,605 --> 00:25:45,585
It is not something that I can automate.

461
00:25:45,915 --> 00:25:49,935
Uh, yeah, just watch this RSS feed and comment on stuff that's germane.

462
00:25:50,175 --> 00:25:51,975
While I sail around the world without a,

463
00:25:52,155 --> 00:25:53,655
without a computer for the next two years,

464
00:25:53,805 --> 00:25:57,585
I have some 82,000 uh, tweets.

465
00:25:58,544 --> 00:26:03,225
I pulled them, I normalized the data, I fed it back into

466
00:26:03,225 --> 00:26:08,774
ChatGPT so that it, it can capture my voice, and I still would never

467
00:26:08,774 --> 00:26:12,615
trust it to automate and put out tweets in, in my voice.

468
00:26:13,245 --> 00:26:16,095
It would make me not just unhirable, but

469
00:27:16,784 --> 00:27:20,264
unengageable, if bad results were to occur.

470
00:26:21,254 --> 00:26:21,645
Yeah.

471
00:26:21,735 --> 00:26:23,175
Oh, it knows who I am.

472
00:26:23,264 --> 00:26:25,544
And when I ask it to comment on various bits

473
00:26:25,544 --> 00:26:28,665
of AWS news, it doesn't get it right at all.

474
00:26:29,054 --> 00:26:32,385
Oh, it knows who it, it, it absolutely knows who you are because

475
00:26:32,385 --> 00:26:36,105
when I say, Hey, how does this compare to my contemporaries?

476
00:26:36,105 --> 00:26:38,324
They'll say, oh, well, Corey Quinn would

477
00:26:38,324 --> 00:26:39,855
be a little bit snarkier than that.

478
00:26:39,885 --> 00:26:40,105
I'm like, ah.

479
00:26:41,460 --> 00:26:43,680
Yes, that applies to almost any sentence

480
00:26:43,740 --> 00:26:45,630
ever uttered in a professional context.

481
00:26:45,630 --> 00:26:46,320
Yes, that is.

482
00:26:46,380 --> 00:26:47,430
That is my shtick.

483
00:26:47,520 --> 00:26:49,740
I am aware of this, but it also doesn't seem

484
00:26:49,740 --> 00:26:51,030
to understand there's a time and a place.

485
00:26:51,540 --> 00:26:53,175
No, it does not.

486
00:26:54,180 --> 00:26:58,260
It also, I, I keep running into guardrails as well, where like, anytime,

487
00:26:58,260 --> 00:27:01,770
like, so, like the most recent scandals, for example with, uh, Meta,

488
00:27:02,070 --> 00:27:05,310
uh, they're saying it's okay to be romantic or sensual with teenagers.

489
00:27:05,310 --> 00:27:06,360
Like, great, terrific.

490
00:27:06,360 --> 00:27:09,480
That is, it's, it's been blowing up in a couple of corners of the internet in

491
00:27:09,480 --> 00:27:13,590
the last few days and none of the AI things will touch it understandably so,

492
00:27:13,590 --> 00:27:16,950
because it views, oh, there's, there's some stuff we don't make jokes about.

493
00:27:17,100 --> 00:27:18,120
Yes, I get that.

494
00:27:18,270 --> 00:27:19,740
I am not asking you to make those jokes.

495
00:27:19,740 --> 00:27:23,850
I'm asking you to basically skewer meta for its complete lack of ethics.

496
00:27:23,850 --> 00:27:28,110
There's a difference here, but the guardrails keep, uh, keep cropping up.

497
00:27:28,110 --> 00:27:30,150
And I get why companies have them in there.

498
00:27:30,150 --> 00:27:31,800
I'm not suggesting that they shouldn't.

499
00:27:32,310 --> 00:27:34,200
I wish it weren't as easy to get around it sometimes.

500
00:27:34,770 --> 00:27:34,890
Yeah.

501
00:27:34,890 --> 00:27:41,755
The, it's, it's to the point where, um, I am way more interested in

502
00:27:42,915 --> 00:27:50,415
private AI than I have been in the past, like I am very much a platform

503
00:27:50,475 --> 00:27:55,545
first for most modern technologies, if there's a cloud service for it,

504
00:27:56,295 --> 00:28:01,155
unless it's just an always-on service where there's no economic value in it.

505
00:28:01,425 --> 00:28:04,245
Architecturally, I'm like, you know, let it be somebody else's problem.

506
00:28:04,430 --> 00:28:09,465
I, I don't want to manage AI drivers.

507
00:28:10,125 --> 00:28:13,545
No desire to do it whatsoever until I run into

508
00:28:13,545 --> 00:28:16,275
these, uh, I was telling my wife about my

509
00:28:17,399 --> 00:28:21,480
custom GPT problem and how I'm forced to build a

510
00:28:21,480 --> 00:28:26,310
more elegant solution for a platform that can handle millions

511
00:28:26,310 --> 00:28:33,000
of users a minute. I have to build a custom application for

512
00:28:33,725 --> 00:28:41,135
half a dozen users in a week, but I want this tailored experience

513
00:28:41,135 --> 00:28:45,514
that when someone comes to, you know, critique or, uh, either

514
00:28:45,514 --> 00:28:50,254
critique my content or critique someone else's content, or ask kind

515
00:28:50,254 --> 00:28:53,915
of what would Keith think, conceptually, before they even engage me.

516
00:28:54,720 --> 00:28:56,490
I want it to be a reliable service.

517
00:28:56,520 --> 00:28:59,820
I want it to be an experience that will meet

518
00:28:59,820 --> 00:29:02,820
the criteria and finish of the CTO Advisor.

519
00:29:03,330 --> 00:29:09,540
And that requires this, you know, this, this stitching together of this

520
00:29:09,540 --> 00:29:17,490
low-level AI so that, you know, so that the experience, the desired UI, is

521
00:29:17,940 --> 00:29:18,750
achieved.

522
00:29:19,170 --> 00:29:20,820
And I think this is the problem that a lot

523
00:29:20,820 --> 00:29:22,620
of enterprises are running into.

524
00:29:22,620 --> 00:29:24,960
They're looking at these big AI platforms.

525
00:29:26,010 --> 00:29:28,650
It's not meeting their immediate business needs.

526
00:29:28,710 --> 00:29:31,830
They could probably get away with a 7 billion parameter model,

527
00:29:32,220 --> 00:29:34,920
but they just need the expertise to either run it in the

528
00:29:34,920 --> 00:29:37,710
public cloud, which all the cloud providers are getting there.

529
00:29:37,710 --> 00:29:40,650
They're, they're providing the tools or they need to,

530
00:29:40,650 --> 00:29:43,260
uh, they need to hand-stitch this stuff themselves.

531
00:29:44,639 --> 00:29:46,590
I, I'm interested in the local inference piece just

532
00:29:46,590 --> 00:29:49,860
because I want it to still be usable once the good times

533
00:29:49,860 --> 00:29:52,409
dry up and they stop throwing money into these things.

534
00:29:52,770 --> 00:29:55,320
Uh, because again, some of these AI tools are great

535
00:29:55,320 --> 00:29:57,689
for 20, 30 bucks a month, but not 2,000 or 3,000.

536
00:29:57,960 --> 00:30:02,220
So I, I want a good enough version that I can run on my own hardware

537
00:30:02,429 --> 00:30:08,040
because I, now that I've had the benefit of having AI write the front

538
00:30:08,040 --> 00:30:10,980
end code, I don't know how to write, for example, I don't wanna go back,

539
00:30:11,790 --> 00:30:12,179
right.

540
00:30:13,410 --> 00:30:17,700
Yeah, I don't, I could, I'm watching it like in the background now.

541
00:30:18,630 --> 00:30:21,390
Uh, Cursor's troubleshooting the app that I'm building.

542
00:30:21,390 --> 00:30:26,040
I the, you know, it will tell you, oh, now go and try it.

543
00:30:26,610 --> 00:30:27,630
Like, wait, wait a minute, minute.

544
00:30:27,630 --> 00:30:30,030
I'm, I'm the grown up here.

545
00:30:30,090 --> 00:30:31,020
I'm in charge.

546
00:30:31,320 --> 00:30:32,490
You try it.

547
00:30:32,910 --> 00:30:36,870
The, uh, why? You just write a test script and

548
00:30:36,870 --> 00:30:39,510
keep working through the problem until you've solved it.

549
00:30:39,960 --> 00:30:42,360
I don't wanna be engaged in your troubleshooting.

550
00:30:43,560 --> 00:30:46,440
I don't wanna go back to the days where I have to troubleshoot.

551
00:30:46,440 --> 00:30:49,260
I don't wanna have to use my connections to call

552
00:30:49,650 --> 00:30:52,980
someone at Google Cloud to get support for free.

553
00:30:53,430 --> 00:30:55,350
I just want AI to do that for me.

554
00:30:56,460 --> 00:30:59,910
Yeah, I, I want it to have some boundaries.

555
00:30:59,910 --> 00:31:01,800
Like I could never let it run loose on my laptop.

556
00:31:01,800 --> 00:31:03,660
I give it its own VM somewhere to run in

557
00:31:03,930 --> 00:31:05,550
because, you know, there, there's client data.

558
00:31:05,550 --> 00:31:07,560
I don't wanna smack it into, but there's a,

559
00:31:07,590 --> 00:31:09,660
but aside from that, just go iterate on this.

560
00:31:09,660 --> 00:31:10,200
Have fun.

561
00:31:10,200 --> 00:31:12,570
And if it doesn't work, oh well, but I don't wanna

562
00:31:12,570 --> 00:31:14,595
come back to a $10,000 monthly inference bill either.

563
00:31:15,615 --> 00:31:24,135
Well, this has been one of my fears with the AI services, uh, that somehow

564
00:31:24,165 --> 00:31:31,514
that I put an AI chatbot or something out there and someone figures out

565
00:31:31,545 --> 00:31:36,980
how to jailbreak it, and now they're just running their app through my.

566
00:31:38,580 --> 00:31:43,770
Through my high quality, you know, before where I would've used, uh, before

567
00:31:43,770 --> 00:31:52,050
where someone would've used Flash-Lite or some, uh, form of mini type of model,

568
00:31:52,889 --> 00:32:00,000
they now have access to my Gemini Pro or my 5 Pro, and they run it ragged.

569
00:32:00,000 --> 00:32:06,570
And I get some $10,000 bill because of my poor security, because I've used

570
00:32:06,960 --> 00:32:08,010
AI to develop.

571
00:32:08,250 --> 00:32:15,000
So what I've been learning is how to be a better software

572
00:32:15,000 --> 00:32:19,110
manager, a better product manager, as I've been building

573
00:32:19,110 --> 00:32:22,770
with AI assistance, and it's been a fascinating journey.

574
00:32:23,790 --> 00:32:24,540
Oh, absolutely.

575
00:32:24,690 --> 00:32:31,710
And I think that it's helpful to, to keep some form of, I guess, uh.

576
00:32:33,305 --> 00:32:34,260
Uh, distance in there.

577
00:32:34,260 --> 00:32:36,450
Like right now with inference being as easy to find as it is, most

578
00:32:36,450 --> 00:32:38,909
places, I feel like a lot of folks have not started really looking for

579
00:32:38,909 --> 00:32:42,149
how to get free inference, uh, the scammers and whatnot, but it'll come.

580
00:32:42,360 --> 00:32:43,169
It always does.

581
00:32:43,320 --> 00:32:44,520
And, and I'm the same way.

582
00:32:44,520 --> 00:32:47,340
Whenever I build something, I'm using the best top of line

583
00:32:47,340 --> 00:32:51,600
model because I'm not scaling this out to millions of users.

584
00:32:51,600 --> 00:32:53,190
I wanna make sure it works.

585
00:32:53,430 --> 00:32:56,040
I don't want to do it for the least amount of money possible because

586
00:32:56,040 --> 00:32:59,700
we're talking, well, I could do this for 15 cents a month instead of 70.

587
00:32:59,940 --> 00:33:00,540
And.

588
00:33:00,605 --> 00:33:03,335
I don't care about less than the cost of

589
00:33:03,335 --> 00:33:05,315
a candy bar when it comes to these things.

590
00:33:05,315 --> 00:33:07,415
I want it to work, and I want it to work well.

591
00:33:07,715 --> 00:33:12,215
Yeah, I think that, you know, we, we started down into kinda like the

592
00:33:12,305 --> 00:33:20,225
economic return of AI for entrepreneurs, solopreneurs, and small companies.

593
00:33:20,915 --> 00:33:23,015
There's been no doubt that.

594
00:33:23,835 --> 00:33:27,735
So right now it's just me and my wife still, but we both are avid

595
00:33:27,735 --> 00:33:33,255
users of AI and we just haven't found the need to hire yet because

596
00:33:33,555 --> 00:33:37,790
when you're dealing with an area that you have expertise in.

597
00:33:39,825 --> 00:33:44,415
and you're, uh, dealing with AI, you 10x yourself, and this is one

598
00:33:44,415 --> 00:33:48,195
of the things that I've talked to other really smart people about

599
00:33:48,195 --> 00:33:52,155
and that haven't really figured out, is how do you codify that?

600
00:33:52,275 --> 00:33:56,685
Like how do you, the way that Corey is using AI, how

601
00:33:56,685 --> 00:34:00,165
do I scale that to a hundred people in my organization?

602
00:34:00,795 --> 00:34:01,575
And that is.

603
00:34:02,010 --> 00:34:04,050
That has been the tricky part, right?

604
00:34:04,050 --> 00:34:07,680
Is you, you're able to do it because you have x

605
00:34:07,680 --> 00:34:11,370
number of years of systems architecture experience.

606
00:34:11,610 --> 00:34:13,230
You understand systems.

607
00:34:13,620 --> 00:34:16,230
AI is just, oh, you, oh, I get this.

608
00:34:16,230 --> 00:34:19,500
This is like having a bunch of junior developers that

609
00:34:19,530 --> 00:34:23,639
have, you know, really, really massive amounts of, of.

610
00:34:24,659 --> 00:34:27,540
Photographic memory and I can put them to work.

611
00:34:27,540 --> 00:34:31,049
I have to put guardrails around them, but I can build these incredible systems.

612
00:34:31,679 --> 00:34:32,069
Yeah.

613
00:34:32,130 --> 00:34:34,650
Intelligence stat of 18, wisdom stat of three.

614
00:34:35,279 --> 00:34:38,009
The, you know, you, you've done it with your career with

615
00:34:38,009 --> 00:34:41,880
AWS, you understand that, you know what, yeah, sure.

616
00:34:41,880 --> 00:34:45,029
One EC2 instance is as reliable as one VM

617
00:34:45,029 --> 00:34:49,679
in the data center, but three EC2 instances is.

618
00:34:50,295 --> 00:34:53,925
Yeah, I, I will say my use of AI is a whole bunch of onesie, twosies.

619
00:34:53,925 --> 00:34:57,825
I don't have any sustained processes that use it the same way every time.

620
00:34:58,005 --> 00:35:00,315
So I look at this and effectively say, oh,

621
00:35:00,315 --> 00:35:02,325
what, what job could I replace with this?

622
00:35:02,325 --> 00:35:04,905
It's, these aren't things I would hire people to do.

623
00:35:04,905 --> 00:35:07,665
I'm not gonna hire an artist, for example, to come up with a

624
00:35:07,665 --> 00:35:11,415
picture of a giraffe and a data center aisle, uh, for a dumb

625
00:35:11,415 --> 00:35:14,085
post on social media that gets three likes and one retweet.

626
00:35:14,115 --> 00:35:16,095
'cause no one else has the same sense of humor that I do.

627
00:35:16,425 --> 00:35:18,585
Uh, but I absolutely will make a robot do it.

628
00:35:20,024 --> 00:35:22,334
Yeah, I'm, I'm in the same space.

629
00:35:22,334 --> 00:35:22,845
I think

630
00:35:25,245 --> 00:35:28,365
the stuff that AI is really not good at is, like, if I needed to

631
00:35:28,365 --> 00:35:32,475
do a deck and I wanted that deck to look really good, I'm, I'm,

632
00:35:32,580 --> 00:35:35,685
I'm giving, you know, someone's paying me to deliver their keynote.

633
00:35:36,615 --> 00:35:37,455
I'm not gonna run it through AI.

634
00:35:37,605 --> 00:35:40,455
I'll, I'll get the outline from AI 'cause AI is good at

635
00:35:40,455 --> 00:35:44,625
language, but I'm gonna just send the, I'm going to send

636
00:35:44,625 --> 00:35:48,705
the deck off to a deck jockey to make it look good.

637
00:35:49,455 --> 00:35:53,774
The, uh, so, but it does enable me to do stuff,

638
00:35:53,834 --> 00:35:55,634
to your point that I would not have done.

639
00:35:55,935 --> 00:35:59,384
So I'm doing this, uh, daily series, the, uh, which is actually what

640
00:35:59,384 --> 00:36:03,410
sparked this, which is The Cloud Every Day, where I'm looking at

641
00:36:03,915 --> 00:36:09,105
platform engineering from a slightly different angle every day.

642
00:36:10,250 --> 00:36:14,930
I just, I asked AI for the, for a 10-day, uh, editorial calendar.

643
00:36:16,634 --> 00:36:21,285
And then, you know, I can now go through and say, Hey, gimme a rough draft,

644
00:36:21,464 --> 00:36:25,665
work with it for about 30 minutes a day, and I can churn out quality posts.

645
00:36:25,785 --> 00:36:28,185
I don't have to hire a ghost writer to do that now.

646
00:36:28,515 --> 00:36:28,785
Yeah.

647
00:36:28,964 --> 00:36:30,435
And you're not having, and you're not sitting

648
00:36:30,435 --> 00:36:32,055
there just copy and pasting it either,

649
00:36:32,505 --> 00:36:34,605
you know, a hundred or so people see it a day.

650
00:36:34,935 --> 00:36:37,485
Not worth, not worth paying somebody else

651
00:36:37,485 --> 00:36:39,915
to do, but worth 15 minutes of my time.

652
00:36:41,190 --> 00:36:42,030
Exactly.

653
00:36:42,060 --> 00:36:43,350
'cause you never know what's gonna hit.

654
00:36:43,470 --> 00:36:46,590
And I, the idea of just doing a copy paste from

655
00:36:46,590 --> 00:36:49,110
AI and calling it a day is horrifying to me.

656
00:36:49,530 --> 00:36:52,110
But yeah, collaborating, helping it do a yes.

657
00:36:52,110 --> 00:36:55,830
And for me, the hardest thing that I have to overcome is the empty

658
00:36:55,830 --> 00:36:58,380
text editor in front of me when it's time to write something.

659
00:36:58,620 --> 00:36:58,950
So.

660
00:36:59,015 --> 00:37:02,404
Have it do a draft of something and even if, especially if it's wrong,

661
00:37:02,615 --> 00:37:05,615
but now, 'cause now I can correct the robot and I can mansplain to it or

662
00:37:05,615 --> 00:37:10,475
human-splain to it, and suddenly I have a much stronger post for it.

663
00:37:11,535 --> 00:37:11,925
Yeah.

664
00:37:11,925 --> 00:37:14,775
And the great thing is even if you're not comfortable

665
00:37:14,775 --> 00:37:16,695
with that, you can just say, Hey, gimme an outline.

666
00:37:17,145 --> 00:37:19,665
The, when I did the hundred days of AI, I said, you

667
00:37:19,665 --> 00:37:22,965
know what, give me a, give me a hundred days of AI

668
00:37:22,965 --> 00:37:25,515
topics to cover every day for the next hundred days.

669
00:37:25,965 --> 00:37:28,695
And I look up like, oh, here's today's topic.

670
00:37:29,205 --> 00:37:30,255
Ah, I don't like that.

671
00:37:30,825 --> 00:37:32,650
On to another topic, and then,

672
00:37:33,735 --> 00:37:36,825
Inevitably, three days later, that topic comes up and I have to figure

673
00:37:36,825 --> 00:37:42,225
out a new topic, but I digress. I, I'm a fan, with limits.

674
00:37:42,765 --> 00:37:42,885
Yeah.

675
00:37:42,885 --> 00:37:45,615
To bring this back full circle though, these are personal

676
00:37:45,615 --> 00:37:48,795
productivity accelerants for which AI is becoming invaluable.

677
00:37:49,335 --> 00:37:50,775
These are not enterprise stories.

678
00:37:51,315 --> 00:37:55,785
I still struggle to see the massive, wide scale enterprise upside,

679
00:37:55,785 --> 00:37:58,215
where we're gonna basically be able to make every one of our

680
00:37:58,215 --> 00:38:01,695
employees way more productive and replace entire teams with agents.

681
00:38:02,145 --> 00:38:04,365
I, I think that they're wish casting.

682
00:38:05,325 --> 00:38:11,205
Yeah, I that I've seen some really great examples of AI

683
00:38:11,205 --> 00:38:16,155
doing jobs no one else would've done or didn't want to do.

684
00:38:17,145 --> 00:38:20,055
Well, you know, I'm, I keep seeing, you know what?

685
00:38:20,085 --> 00:38:23,444
Dell Technologies is laying off due to AI.

686
00:38:23,835 --> 00:38:26,924
Dell Technologies is not laying off due to AI.

687
00:38:26,924 --> 00:38:30,345
Dell Technologies is laying off 'cause they overhired during COVID,

688
00:38:30,585 --> 00:38:34,305
and they're trying to get back to a number that meets, that keeps

689
00:38:34,305 --> 00:38:38,805
Michael Dell, you know, happy that, that, that has nothing to do with ai.

690
00:38:40,125 --> 00:38:42,584
They need an AI story more than they need anything else.

691
00:38:42,584 --> 00:38:44,774
When it comes to these things, like I, and I still think

692
00:38:44,774 --> 00:38:47,174
that there's this constituency that, that's using AI.

693
00:38:47,174 --> 00:38:49,575
Like I got an email the other day saying, I noticed you

694
00:38:49,575 --> 00:38:52,575
work with AWS costs, so let's talk about how to lower the

695
00:38:52,575 --> 00:38:56,234
cost of your voice phone system was the opening sentence.

696
00:38:56,384 --> 00:38:59,415
And I honestly couldn't figure out if it was AI run amok or just

697
00:38:59,415 --> 00:39:03,015
someone with not a lot of neurons to bang together to make sparks.

698
00:39:03,254 --> 00:39:05,654
Like on some level it almost doesn't matter

699
00:39:05,654 --> 00:39:07,875
'cause I'm not responding to the bad pitch.

700
00:39:08,525 --> 00:39:10,625
But it is something that makes me think

701
00:39:10,625 --> 00:39:13,445
that you still have to use it appropriately.

702
00:39:13,865 --> 00:39:16,715
You still have to use it appropriately, and it's not something that

703
00:39:16,720 --> 00:39:19,895
that, you know, we haven't really talked about agentic and all that, but

704
00:39:20,285 --> 00:39:27,845
just regular chat, I have, without a doubt, I have won deals because AI

705
00:39:28,560 --> 00:39:32,610
Was able to flesh out details in the email that I typically ignore.

706
00:39:33,150 --> 00:39:34,950
I am not a big emailer.

707
00:39:34,950 --> 00:39:37,440
I, what I mean by big emailer is, I like to use

708
00:39:37,440 --> 00:39:40,440
email, but I don't like pages and pages of emails.

709
00:39:40,800 --> 00:39:44,790
Evidently, my customers like pages and pages of details.

710
00:39:45,735 --> 00:39:48,225
Short email means that you're angry or annoyed

711
00:39:48,495 --> 00:39:49,905
and AI expands.

712
00:39:49,905 --> 00:39:52,065
Like, I don't really know what else to say in this email.

713
00:39:52,065 --> 00:39:56,835
And now AI will say, oh, you missed, you know, these three different topics and

714
00:39:57,015 --> 00:40:01,005
whether they read it or not, whatever, but it, it has been super effective.

715
00:40:02,160 --> 00:40:06,420
Yes, I am a big fan of what some of these things are unlocking.

716
00:40:06,810 --> 00:40:09,600
Uh, Keith, I really wanna thank you for taking the time to speak with me.

717
00:40:09,690 --> 00:40:12,900
If people wanna learn more, where's the best place for them to find you?

718
00:40:13,380 --> 00:40:15,090
TheCTOAdvisor.com.

719
00:40:15,180 --> 00:40:17,910
I'm still trying to buy CTOAdvisor.com.

720
00:40:18,330 --> 00:40:20,370
The guy wants $1,200 for it.

721
00:40:20,400 --> 00:40:21,720
I refuse.

722
00:40:21,750 --> 00:40:23,970
Now someone's gonna go out and pay the 1200 bucks.

723
00:40:23,970 --> 00:40:25,380
I don't really care but.

724
00:40:25,595 --> 00:40:27,339
He, he wants 1200 bucks for it.

725
00:40:27,370 --> 00:40:28,089
I refuse.

726
00:40:28,095 --> 00:40:30,850
The, the site has been abandoned

727
00:40:30,850 --> 00:40:34,930
for years, but, ah, TheCTOAdvisor.com,

728
00:40:35,259 --> 00:40:35,740
oh, oh.

729
00:40:35,740 --> 00:40:38,529
I have a couple of site domains like that and I've made overtures and when

730
00:40:38,529 --> 00:40:41,140
you come back with a number that has two commas in it, we're done here.

731
00:40:41,140 --> 00:40:44,080
There is no conversation that we are going to have that's

732
00:40:44,080 --> 00:40:46,089
going to lead to an outcome that we're both happy with.

733
00:40:46,600 --> 00:40:46,779
I bet.

734
00:40:46,779 --> 00:40:49,180
We'll of course, put links to that in the show notes.

735
00:40:49,240 --> 00:40:51,040
Keith, thank you so much for your time,

736
00:40:51,400 --> 00:40:52,480
Corey, thanks again.

737
00:40:52,480 --> 00:40:52,810
It's always fun.

738
00:40:54,255 --> 00:40:56,475
Keith Townsend, The CTO Advisor.

739
00:40:56,625 --> 00:40:59,775
I'm Cloud Economist Corey Quinn, and this is Screaming in the Cloud.

740
00:40:59,955 --> 00:41:02,445
If you've enjoyed this podcast, please leave a five

741
00:41:02,445 --> 00:41:04,725
star review on your podcast platform of choice.

742
00:41:04,785 --> 00:41:08,085
Whereas if you've hated this podcast, please leave a five star review on

743
00:41:08,085 --> 00:41:11,985
your podcast platform of choice, along with an angry comment written by AI.

744
00:41:12,105 --> 00:41:15,765
Heck, while you're at it, have it leave angry comments on all the platforms.