1
00:00:00,000 --> 00:00:03,720
Eric Anderson: I think you have to be
as good at the AI game as these frontier

2
00:00:03,720 --> 00:00:05,250
labs, but I think that's possible.

3
00:00:05,370 --> 00:00:07,410
Like they clearly don't have a lock.

4
00:00:07,410 --> 00:00:07,860
Hold on.

5
00:00:07,860 --> 00:00:10,110
Talent, you know, talent's
just leaking everywhere.

6
00:00:15,330 --> 00:00:17,010
Corey Quinn: Welcome to
Screaming in the Cloud.

7
00:00:17,160 --> 00:00:18,300
I'm Corey Quinn.

8
00:00:18,570 --> 00:00:22,260
My guest today is one of those
rare breeds we don't see a

9
00:00:22,260 --> 00:00:24,240
lot of as guests on this show.

10
00:00:24,450 --> 00:00:29,310
Eric Anderson is a partner at
Scale, which is a VC firm. Eric,

11
00:00:29,770 --> 00:00:30,730
Thank you for joining me.

12
00:00:30,880 --> 00:00:31,660
Eric Anderson: Thank you, Corey.

13
00:00:32,290 --> 00:00:35,380
Corey Quinn: This episode is
sponsored in part by my day job, Duckbill.

14
00:00:35,380 --> 00:00:38,590
Do you have a horrifying AWS bill?

15
00:00:38,860 --> 00:00:40,750
That can mean a lot of things.

16
00:00:40,960 --> 00:00:46,030
Predicting what it's going to be,
determining what it should be, negotiating

17
00:00:46,030 --> 00:00:51,490
your next long-term contract with AWS,
or just figuring out why it increasingly

18
00:00:51,490 --> 00:00:55,990
resembles a phone number, but nobody
seems to quite know why that is.

19
00:00:56,290 --> 00:00:56,980
To learn more,

20
00:00:56,980 --> 00:00:57,490
visit

21
00:00:57,610 --> 00:00:59,860
duckbillhq.com.

22
00:01:00,160 --> 00:01:03,040
Remember, you can't duck the Duckbill

23
00:01:03,070 --> 00:01:07,870
Bill, which my CEO reliably informs
me is absolutely not our slogan.

24
00:01:09,000 --> 00:01:12,390
Usually we talk a lot more to folks who
are the engineering type, the founder

25
00:01:12,390 --> 00:01:15,600
type, occasionally the marketing type
who weasel their way through, but

26
00:01:15,600 --> 00:01:20,280
we've only had a handful of VCs in
the years this show has been running.

27
00:01:20,520 --> 00:01:25,949
So for those listening in the audience who
might not be entirely clear on what the

28
00:01:25,949 --> 00:01:31,169
role of a general partner at a VC firm is
and are only able to gather it contextually

29
00:01:31,169 --> 00:01:34,800
and badly from the platform formerly
known as Twitter, what is it you'd say?

30
00:01:34,800 --> 00:01:35,850
It is you do here?

31
00:01:36,615 --> 00:01:39,075
Eric Anderson: We fund and
support startups, it's a lot of

32
00:01:39,075 --> 00:01:42,675
trying to find what's the next
big thing, who's building it.

33
00:01:42,795 --> 00:01:45,585
Uh, convince 'em to take
your money and then make 'em

34
00:01:45,585 --> 00:01:47,324
successful as best they can.

35
00:01:47,324 --> 00:01:48,675
Corey Quinn: It always seems
counterintuitive for that to

36
00:01:48,675 --> 00:01:51,074
be the framing because like,
please, please take my money.

37
00:01:51,074 --> 00:01:54,345
Sounds like it's, it doesn't sound
like it's a real problem, if that

38
00:01:54,345 --> 00:01:58,245
makes much sense, but I've, I have
a bunch of friends who've raised and

39
00:01:58,245 --> 00:02:01,185
talking to them about the process
and seeing folks go through it.

40
00:02:01,505 --> 00:02:02,104
It's weird.

41
00:02:02,104 --> 00:02:03,455
It's, it's feast or famine.

42
00:02:03,455 --> 00:02:05,795
It's either no one will fund you
or everyone wants to fund you.

43
00:02:05,795 --> 00:02:11,555
And how do you decide which of the various
economic suitors you decide to go with?

44
00:02:11,555 --> 00:02:14,795
So a lot of it comes back to back-channel
references, track record, history,

45
00:02:14,855 --> 00:02:18,125
similar to the same way that VCs wind
up trying to pick the founders that

46
00:02:18,125 --> 00:02:20,795
they want to take bets on these days.

47
00:02:20,795 --> 00:02:25,595
It seems like it is difficult to
separate out the world of VC and

48
00:02:25,595 --> 00:02:30,125
funding and startups from the
monstrosity that has become AI.

49
00:02:30,705 --> 00:02:33,945
But you've been doing this
longer than AI has been a thing.

50
00:02:33,975 --> 00:02:39,045
Historically, you ran a product team, I'm
not sure which one, over at AWS, which

51
00:02:39,045 --> 00:02:40,695
I will accept apologies for in a moment.

52
00:02:40,935 --> 00:02:45,075
Uh, then you were at GCP doing similar
things for a while and then decided,

53
00:02:45,075 --> 00:02:46,815
you know, building things seems hard.

54
00:02:46,815 --> 00:02:49,635
Let's go instead do the corporate
version of betting on the

55
00:02:49,810 --> 00:02:50,805
ponies, on the horse track.

56
00:02:50,865 --> 00:02:51,915
Uh, what was the progression there?

57
00:02:52,195 --> 00:02:55,375
Eric Anderson: Spot instances
was the, was the AWS thing.

58
00:02:55,495 --> 00:02:56,665
Corey Quinn: Ah, yes, yes, yes.

59
00:02:56,665 --> 00:02:56,935
Eric Anderson: Yeah.

60
00:02:56,935 --> 00:03:00,865
Everyone's favorite, like intellectual
product to, to, to tinker on.

61
00:03:00,925 --> 00:03:01,675
And then it was

62
00:03:03,045 --> 00:03:06,285
uh, kind of BigQuery, but mostly
this thing called Dataflow.

63
00:03:06,554 --> 00:03:09,495
But I, I say BigQuery just 'cause
people know it a little bit better.

64
00:03:09,885 --> 00:03:13,274
And the progression was really, I don't
know, it was incidental at the time.

65
00:03:13,274 --> 00:03:16,005
I was just, uh, I was interested
in startups and I wanted to prove

66
00:03:16,035 --> 00:03:18,285
my, my mettle in Silicon Valley.

67
00:03:18,345 --> 00:03:22,995
And so I was, I felt like the best way to
do that was going to Amazon or Google

68
00:03:22,995 --> 00:03:24,495
and working on the most technical thing.

69
00:03:24,975 --> 00:03:27,015
I didn't study CS, I studied mechanical.

70
00:03:27,015 --> 00:03:30,825
And that, that was always a bit of like
an uphill battle with these, like, hiring

71
00:03:30,885 --> 00:03:31,545
firms.

72
00:03:31,695 --> 00:03:33,975
I was like, oh, just stick
me on EC2 and I can show you.

73
00:03:33,975 --> 00:03:35,864
I, I can, I'll survive.

74
00:03:36,075 --> 00:03:38,084
Corey Quinn: It kind of feels like
that's almost the problem we have at

75
00:03:38,084 --> 00:03:42,795
Google as well, where you have a cash
cow that is advertising and everything

76
00:03:42,795 --> 00:03:45,345
else is almost incidental to that,
oh, you wanna build a moon base?

77
00:03:45,345 --> 00:03:45,584
Fine.

78
00:03:45,584 --> 00:03:46,394
Go build a moon base.

79
00:03:46,394 --> 00:03:46,755
Good luck.

80
00:03:46,785 --> 00:03:48,495
We're still selling ads primarily.

81
00:03:48,674 --> 00:03:52,245
I, I've always gotten the sense
that the AWS world, that EC2 was

82
00:03:52,245 --> 00:03:53,745
kind of that 800 pound gorilla.

83
00:03:53,835 --> 00:03:54,674
Eric Anderson: Yes, yes.

84
00:03:54,674 --> 00:03:59,085
That was like, like you build products
and then you monetize them via EC2.

85
00:03:59,385 --> 00:04:02,055
You know, when I was first there, it
was before they broke out revenues.

86
00:04:02,115 --> 00:04:06,165
You know, so it was kind of, no one quite
knew how interesting this thing was.

87
00:04:06,465 --> 00:04:08,685
Corey Quinn: Everyone thought the thing
was losing money hand over fist and then

88
00:04:08,685 --> 00:04:12,495
one day they make an announcement and oh
my God, those are damn near SaaS margins.

89
00:04:12,555 --> 00:04:14,445
Eric Anderson: And I would get
this mini announcement, right?

90
00:04:14,445 --> 00:04:17,175
I got an email every Friday that
told me how many cores we had sold.

91
00:04:17,175 --> 00:04:19,875
Like a little team summary, and
I would get out my calculator

92
00:04:19,875 --> 00:04:20,960
and be like, this is crazy.

93
00:04:20,960 --> 00:04:21,440
They're

94
00:04:21,700 --> 00:04:22,990
printing cash.

95
00:04:23,349 --> 00:04:23,620
Corey Quinn: Yeah.

96
00:04:23,620 --> 00:04:27,430
And what I love about Spot is that from,
I've heard this from multiple folks who

97
00:04:27,430 --> 00:04:32,920
were there at the time and afterwards,
that it really is just unused capacity.

98
00:04:32,920 --> 00:04:36,849
It's not like they wind up building stuff
specifically to shore up their spot.

99
00:04:37,030 --> 00:04:38,590
It's just stuff that would otherwise

100
00:04:38,625 --> 00:04:42,075
be going to waste sitting there as
more or less air conditioner ballast.

101
00:04:42,195 --> 00:04:45,975
Suddenly they found a way to monetize
this and they managed to do it in a way

102
00:04:46,125 --> 00:04:50,535
that doesn't completely destroy anyone
ever paying for on demand anything.

103
00:04:50,715 --> 00:04:52,455
And I think that's kind
of a neat approach.

104
00:04:52,485 --> 00:04:54,495
'cause there are sub use cases
for which it's phenomenal.

105
00:04:54,755 --> 00:04:56,075
Others for which it's terrifying.

106
00:04:56,285 --> 00:05:00,545
Uh, you were there back in the
days when it was inter, our wide

107
00:05:00,545 --> 00:05:04,415
swings in pricing before they,
they stabilized it significantly,

108
00:05:04,415 --> 00:05:05,825
which frankly was for the best.

109
00:05:05,975 --> 00:05:08,855
I didn't really want to become
a high frequency trader in this

110
00:05:08,855 --> 00:05:10,235
one incredibly niche thing.

111
00:05:10,505 --> 00:05:12,365
Eric Anderson: Apparently
this was Bezos's idea.

112
00:05:12,365 --> 00:05:14,555
I mean, I never spoke with him, but

113
00:05:14,945 --> 00:05:16,145
Corey Quinn: I have no
trouble believing that.

114
00:05:16,235 --> 00:05:16,445
Eric Anderson: Yeah.

115
00:05:16,445 --> 00:05:19,985
The banker guy was like, why aren't
we doing this kind of marketplace?

116
00:05:20,955 --> 00:05:22,544
And yeah, the wild swings.

117
00:05:22,575 --> 00:05:26,025
I didn't appreciate the fact that there
was this, you, you kind of imagine

118
00:05:26,025 --> 00:05:31,005
like unused capacity is this nebulous,
singular blob, but they carve up all the

119
00:05:31,005 --> 00:05:34,275
instances into all these instance types,
into all these regions, availability zones.

120
00:05:34,275 --> 00:05:39,015
Suddenly you're like, we
have 400 SKUs of EC2, right?

121
00:05:39,015 --> 00:05:41,570
And each one is a little tiny spot market.

122
00:05:42,450 --> 00:05:43,170
It's a mess.

123
00:05:43,470 --> 00:05:43,710
Corey Quinn: Yeah.

124
00:05:43,710 --> 00:05:45,360
You, you're definitely dating
yourself with that reference.

125
00:05:45,360 --> 00:05:47,790
There's over 700 now in us-east-1 alone.

126
00:05:47,820 --> 00:05:50,790
It's, I, I did the math on all of
this where I wind up tracking a

127
00:05:50,790 --> 00:05:54,090
lot of, I, I, I had Claude build
me some nonsense because why not?

128
00:05:54,270 --> 00:05:58,380
Where it tracks the, uh, the pricing pages
for everything that gets dropped out.

129
00:05:58,870 --> 00:06:01,600
It's multiple gigabytes for
all the pricing information

130
00:06:01,600 --> 00:06:02,710
broken down by service.

131
00:06:02,710 --> 00:06:06,850
But the EC2 specific one, I had
to refactor some of the code

132
00:06:06,850 --> 00:06:08,080
because it kept timing out

133
00:06:08,110 --> 00:06:08,500
Lambdas.

134
00:06:08,500 --> 00:06:10,000
There's no way you can
grab the whole thing.

135
00:06:10,210 --> 00:06:14,530
It OOM-killed an EC2 box a couple of
times because yeah, that thing's

136
00:06:14,530 --> 00:06:18,820
enormous and it's, and I'm also
bad at programming, but that's okay.

137
00:06:18,820 --> 00:06:21,280
Gonna fix that particular
problem for me rather nicely.

138
00:06:21,640 --> 00:06:26,200
And just being able to track all of
this, it, it is a monstrous surface area.

139
00:06:26,955 --> 00:06:30,645
And not just tracking the pricing, let
alone actually making the thing work.

140
00:06:30,795 --> 00:06:33,975
Eric Anderson: So, so keeping stock,
you know, like, like basically Spot

141
00:06:33,975 --> 00:06:35,415
is like an inventory problem, right?

142
00:06:35,445 --> 00:06:38,565
And keeping inventory of one
product is a lot different than 400.

143
00:06:38,565 --> 00:06:39,135
I mean, it's like.

144
00:06:39,735 --> 00:06:40,034
Corey Quinn: Right.

145
00:06:40,034 --> 00:06:42,344
My, my customers all tend to be
extraordinarily large scale, which

146
00:06:42,344 --> 00:06:46,605
kind of puts the lie to a lot of
the historical way the cloud was

147
00:06:46,605 --> 00:06:49,094
positioned and sold even in 2018.

148
00:06:49,094 --> 00:06:51,675
I had a client when the i3s
came out like, okay, we're gonna

149
00:06:51,675 --> 00:06:53,895
spin up 1200 of those in Ohio.

150
00:06:54,045 --> 00:06:56,715
And the response from AWS
was, can you give us about

151
00:06:56,715 --> 00:06:58,065
six weeks on that, please?

152
00:06:58,065 --> 00:06:59,355
That would be great.

153
00:06:59,715 --> 00:07:02,205
The cloud does not scale

154
00:07:02,205 --> 00:07:03,705
infinitely. Uh, source: tried it.

155
00:07:03,705 --> 00:07:04,395
It didn't go so well.

156
00:07:05,295 --> 00:07:05,685
Doing

157
00:07:05,685 --> 00:07:09,405
the capacity planning comes back
around at significant scale.

158
00:07:09,405 --> 00:07:12,255
It starts to resemble a lot of
the old school data center stuff.

159
00:07:12,465 --> 00:07:15,435
It's, oh, just turn this thing on
for an experiment and turn it off.

160
00:07:15,495 --> 00:07:16,185
That's still there.

161
00:07:16,215 --> 00:07:17,415
That's incredibly powerful.

162
00:07:17,415 --> 00:07:21,735
I think it has been a tremendous boon
to getting companies from idea to

163
00:07:21,735 --> 00:07:26,475
start up to success, but also from
idea to, oh wait, that won't work.

164
00:07:26,475 --> 00:07:29,745
Nevermind, turn it off, and I owe
24 cents at the end of the month.

165
00:07:30,075 --> 00:07:31,965
Both of those are incredibly powerful

166
00:07:31,965 --> 00:07:32,590
things.

167
00:07:32,859 --> 00:07:37,330
Uh, but as you continue to succeed
and, and grow and grow and grow it,

168
00:07:37,330 --> 00:07:40,630
it starts to resemble multi-year
capacity and contractual planning.

169
00:07:41,080 --> 00:07:44,049
So these days, what are you
finding that's exciting you?

170
00:07:44,049 --> 00:07:48,669
What is it that you are, that you're
looking at and saying, yes, that is

171
00:07:48,669 --> 00:07:49,900
something worth paying attention to?

172
00:07:50,910 --> 00:07:52,170
Eric Anderson: Certainly
the coding agents.

173
00:07:52,470 --> 00:07:57,120
This is, I mean, there's all this
talk about AGI, which is, is a

174
00:07:57,450 --> 00:07:59,700
poorly, it's an unhelpful phrase, right?

175
00:07:59,700 --> 00:08:02,880
I mean, what I, I, when we get
there, does anything change?

176
00:08:03,540 --> 00:08:04,260
Probably not.

177
00:08:04,260 --> 00:08:08,050
And then the overlap between better
than human or less than better.

178
00:08:09,180 --> 00:08:15,270
But regardless, whatever it is, I think
coding agents are, maybe it, uh, yeah.

179
00:08:15,270 --> 00:08:18,690
Software has forever changed and it,
I feel like it, it happened more

180
00:08:19,065 --> 00:08:22,005
in the last three months
than, than I guess before that

181
00:08:22,340 --> 00:08:24,345
Corey Quinn: I, I think that even looking
at something like Claude Code as a

182
00:08:24,345 --> 00:08:28,245
software, as a coding agent is a bit of
a misnomer and a bit of a weird approach.

183
00:08:28,425 --> 00:08:31,605
It can integrate with effectively
everything, and the interaction

184
00:08:31,605 --> 00:08:34,455
model is human language, where

185
00:08:35,179 --> 00:08:37,909
you can tell it to go out and
grab a bunch of different APIs.

186
00:08:37,909 --> 00:08:41,089
Research the best way to do this,
construct a research report.

187
00:08:41,089 --> 00:08:44,900
You can treat it just like you can
the, the Claude Chatbot expression.

188
00:08:45,110 --> 00:08:50,089
When you have access to the entire CLI,
when you have access to every API out

189
00:08:50,089 --> 00:08:55,189
there on the planet, suddenly it starts
to look a lot less like a coding agent

190
00:08:55,189 --> 00:08:59,089
and a lot more like an orchestrator
where you can tie together all sorts of

191
00:08:59,089 --> 00:09:01,730
things with a, we're still at a point
where you need a little bit of coding

192
00:09:01,730 --> 00:09:03,920
knowledge to make things work, but

193
00:09:04,305 --> 00:09:08,295
software is no longer the bottleneck
for an awful lot of stuff.

194
00:09:08,655 --> 00:09:08,925
Eric Anderson: Yeah.

195
00:09:08,925 --> 00:09:12,825
I don't, like, I don't use
slides anymore or PowerPoint.

196
00:09:13,005 --> 00:09:16,755
It's actually, I think, easier to just
ask Claude to generate a presentation

197
00:09:16,755 --> 00:09:21,765
and it does it in like an HTML, you
know, webpage-ish thing, which is

198
00:09:21,765 --> 00:09:23,145
like, how would you ever edit it?

199
00:09:23,145 --> 00:09:27,225
You don't, you ask Claude to edit it
or, or your, your agent of choice.

200
00:09:27,704 --> 00:09:28,725
Corey Quinn: Yeah, I, I do much the same.

201
00:09:28,725 --> 00:09:32,084
I use a Slidev theme for my
company branding and the rest, but I

202
00:09:32,084 --> 00:09:35,865
built an entire custom plugin that has
multiple, uh, different skills for how

203
00:09:35,865 --> 00:09:37,785
I do slides, how I think it should work.

204
00:09:37,995 --> 00:09:41,324
And I'll give it the, I'll
give it an outline and, "Great,

205
00:09:41,324 --> 00:09:42,765
turn this into a slide deck."

206
00:09:42,944 --> 00:09:47,439
Suddenly all the problems I
had as a presenter, I, I, I'm a

207
00:09:47,439 --> 00:09:49,060
public speaker probably too much.

208
00:09:49,060 --> 00:09:51,760
I have this ongoing love affair
with the sound of my own voice.

209
00:09:51,760 --> 00:09:57,579
He said on his own podcast. Where this
became, er, my biggest challenge was, I

210
00:09:57,579 --> 00:10:00,730
would work on a part of the slide deck
here, then part of the slide deck here.

211
00:10:00,880 --> 00:10:03,730
Then I'd go and give the thing,
and I'm circling the same point three

212
00:10:03,730 --> 00:10:06,610
different times at different
points throughout the presentation.

213
00:10:06,880 --> 00:10:11,500
It is a terrific editor for, okay, now
go back and fix the narrative flow.

214
00:10:11,650 --> 00:10:14,770
Make sure that it does the
things in the right order.

215
00:10:15,600 --> 00:10:16,800
It's almost, but not quite,

216
00:10:16,800 --> 00:10:18,840
at a point where I can have it
just build my slide deck for me.

217
00:10:18,990 --> 00:10:22,260
It's great for a first attempt at
that, but it'll just make things up.

218
00:10:22,319 --> 00:10:24,750
But it turns out, if you say
things with a straight enough

219
00:10:24,750 --> 00:10:25,829
face, people will believe you.

220
00:10:26,910 --> 00:10:27,240
Eric Anderson: Yeah.

221
00:10:27,540 --> 00:10:31,020
Slop through the mouth of a human
is like all the, all the content

222
00:10:31,020 --> 00:10:32,189
and all the credibility in one.

223
00:10:32,340 --> 00:10:33,000
Corey Quinn: Exactly.

224
00:10:33,060 --> 00:10:35,069
Now it's I, I think
that it's an assistant.

225
00:10:35,069 --> 00:10:38,460
I think human in the loop is still gonna
be required for the foreseeable future

226
00:10:38,490 --> 00:10:39,870
when it comes to most of these things.

227
00:10:39,960 --> 00:10:41,235
I think that as soon as you take

228
00:10:42,180 --> 00:10:46,650
that judgment piece out of
it and let it speak for you.

229
00:10:47,250 --> 00:10:48,570
There are problems.

230
00:10:48,720 --> 00:10:51,360
You are risking your own
credibility every time you do it.

231
00:10:51,360 --> 00:10:55,770
Like I have an EA-style bot that I
built, Billy the Platypus, uh, whatever

232
00:10:55,770 --> 00:10:58,710
I wind up, uh, turning it loose on
various pitches that people send me.

233
00:10:58,890 --> 00:11:02,790
It's technically professional,
technically, but it, he is

234
00:11:02,790 --> 00:11:03,960
just basically a total jerk.

235
00:11:03,990 --> 00:11:06,150
That's sort of the entire
persona that's built into it.

236
00:11:06,360 --> 00:11:09,870
There's a reason he sends as
Billy the Platypus and not as me.

237
00:11:10,050 --> 00:11:12,060
The first time that gets
even slightly wrong.

238
00:11:12,090 --> 00:11:14,820
I suddenly have a serious
reputation problem.

239
00:11:15,120 --> 00:11:16,950
Eric Anderson: Things that get
me excited along those lines

240
00:11:17,100 --> 00:11:19,500
are like, we won't look at code.

241
00:11:19,500 --> 00:11:20,790
I mean, I, I'm excited about this.

242
00:11:20,880 --> 00:11:23,490
Like there was a time when I thought we
used the coding agents and then you review

243
00:11:23,490 --> 00:11:24,960
it and someone else does the code review.

244
00:11:25,020 --> 00:11:26,790
Like as long as you do the
code review, you're safe.

245
00:11:26,790 --> 00:11:27,120
Right?

246
00:11:27,690 --> 00:11:31,290
And now we have the agents doing the
code review, and now maybe the risk is

247
00:11:31,290 --> 00:11:32,820
like, well, what about like performance?

248
00:11:33,464 --> 00:11:36,615
You know, you get these like terribly,
you know, no one's refactoring;

249
00:11:36,615 --> 00:11:38,625
the code is just slop on top of.

250
00:11:38,775 --> 00:11:41,055
But I think we, I think we'll
have agents refactoring the code.

251
00:11:41,714 --> 00:11:44,295
And so I'm excited about the
idea, like, what, what

252
00:11:44,295 --> 00:11:46,454
does the world look like when no
one looks at the code anymore?

253
00:11:46,694 --> 00:11:49,305
Like, what, what emerges,
what are the opportunities?

254
00:11:49,305 --> 00:11:53,564
And so I think there's some, maybe some
cool concepts around like, yeah, you know,

255
00:11:53,564 --> 00:11:56,969
performance improvement bots or you know,
someone that goes through and kind of

256
00:11:57,765 --> 00:12:02,175
refactors, optimizes, deploys this thing,
you know, constantly keeps it updated with

257
00:12:02,175 --> 00:12:04,395
latest libraries, patching, I don't know.

258
00:12:04,665 --> 00:12:04,875
Corey Quinn: Yeah.

259
00:12:04,875 --> 00:12:06,675
That sort of maintenance
bot on it on some level.

260
00:12:06,705 --> 00:12:10,005
Uh, there's also, this, this is an
early optimization in some ways too.

261
00:12:10,035 --> 00:12:12,765
Most of the stuff that I have
it build and go nuts on only

262
00:12:12,765 --> 00:12:14,325
lives in my internal network.

263
00:12:14,445 --> 00:12:18,075
It's stuff that improves quality of
life for the way that I do things.

264
00:12:18,075 --> 00:12:21,345
It improves my own workflows, but I don't

265
00:12:22,199 --> 00:12:23,189
make this public.

266
00:12:23,189 --> 00:12:27,689
I don't expose it on the internet and the
performance issues of, for example, when

267
00:12:27,689 --> 00:12:31,079
I write my newsletter every week and I've,
uh, got it the way I want, it goes ahead

268
00:12:31,079 --> 00:12:34,709
and does the rendering, the formatting,
checks, all the links, et cetera, and

269
00:12:34,770 --> 00:12:38,219
they're small performance improvements,
like, huh, you're checking 35 links.

270
00:12:38,219 --> 00:12:42,870
Maybe you could do that in parallel rather
than sequentially, but even then, it's not.

271
00:12:43,219 --> 00:12:43,849
Okay.

272
00:12:43,849 --> 00:12:47,180
I, it doesn't, I, I'm not sitting
here with a stopwatch waiting for this

273
00:12:47,180 --> 00:12:50,569
thing to finish on my stuff 'cause
it's saving me a fair bit of time

274
00:12:50,569 --> 00:12:51,979
checking those links manually.

275
00:12:52,280 --> 00:12:54,890
I can grab a cup of coffee
while it does it at some point.

276
00:12:54,890 --> 00:12:58,849
Yes, I'll do the easy optimizations,
but performance on a lot of

277
00:12:58,849 --> 00:13:03,770
those back of house workflow
style tools does not need to be

278
00:13:04,015 --> 00:13:04,855
top-notch.

279
00:13:04,855 --> 00:13:08,095
In fact, one of the things I like
with my own expression of how I think

280
00:13:08,095 --> 00:13:12,475
about things is I'll have, like all my
development stuff with Claude Code now

281
00:13:12,475 --> 00:13:17,365
exists in an EC2 box where it has root,
where it lives in its own AWS account

282
00:13:17,365 --> 00:13:21,085
where it has admin access and there's
nothing of value in this AWS account.

283
00:13:21,085 --> 00:13:22,195
Let me be very clear on that.

284
00:13:22,195 --> 00:13:25,765
It's just a bill risk where it can do
anything that it wants, but it doesn't

285
00:13:25,765 --> 00:13:27,205
have access to anything sensitive.

286
00:13:27,295 --> 00:13:30,055
And I'll just go and I'll tab
over to it and kick it to the

287
00:13:30,055 --> 00:13:32,485
next step, and then I'll go back
to doing whatever I was doing.

288
00:13:32,485 --> 00:13:32,575
It's,

289
00:13:32,865 --> 00:13:35,535
it's sort of a drive-by,
and now do the next step.

290
00:13:35,835 --> 00:13:38,235
And I'm sure that there's an
orchestration story that's coming,

291
00:13:38,235 --> 00:13:42,015
uh, as an overlay on top of that
anytime now. Everyone's trying to build

292
00:13:42,015 --> 00:13:45,135
one and get those funded, it seems this
week, but there's gonna be something

293
00:13:45,135 --> 00:13:49,395
that emerges and is the next iteration
of this, and we'll see how it goes.

294
00:13:49,455 --> 00:13:53,385
I like the fact that if you don't
like how things work, give it a month.

295
00:13:53,775 --> 00:13:58,215
Now that said, I think it's really
hard to come up with a durable pitch

296
00:13:58,275 --> 00:14:01,545
in the AI space right now that is,

297
00:14:02,085 --> 00:14:05,985
that is fundable just from the
perspective of that's a feature release

298
00:14:05,985 --> 00:14:10,575
from Anthropic or OpenAI before
suddenly you have to do a massive pivot.

299
00:14:10,605 --> 00:14:12,255
Like we saw this historically.

300
00:14:12,705 --> 00:14:13,155
Oh wow.

301
00:14:13,155 --> 00:14:16,694
Suddenly ChatGPT can speak to
PDF, and suddenly a whole bunch

302
00:14:16,694 --> 00:14:17,895
of companies had a problem.

303
00:14:18,105 --> 00:14:20,805
But that was also relatively easy
to predict that that was coming.

304
00:14:21,045 --> 00:14:21,825
How do you think about it?

305
00:14:21,885 --> 00:14:26,505
Eric Anderson: The thing that has
proven the most defensible is like, I

306
00:14:26,505 --> 00:14:29,745
mean, I, I agree with you certainly,
but I'm impressed that Cursor.

307
00:14:30,255 --> 00:14:35,655
The, the way OpenAI and Anthropic became
so big is, like, they became so scary.

308
00:14:35,655 --> 00:14:39,795
It's just, it's just the sheer growth
rate and, and like Cursor captured

309
00:14:39,795 --> 00:14:40,724
a little bit of that lightning.

310
00:14:40,724 --> 00:14:41,025
Right.

311
00:14:41,025 --> 00:14:44,665
And, and, and then became
maybe threatening to Anthropic

312
00:14:44,689 --> 00:14:45,584
and Anthropic kind of.

313
00:14:45,885 --> 00:14:49,334
But like, I, I feel like when I
talk to my portfolio, it's like.

314
00:14:49,770 --> 00:14:52,860
Yes, we should be afraid of them unless
we can just grow faster than that.

315
00:14:52,860 --> 00:14:57,360
Like is there, is there a way you can kind
of find the vein and shoot to, like, Cursor

316
00:14:57,360 --> 00:15:02,160
scale to the point that, that you kind of
can own something and, and some of these

317
00:15:02,160 --> 00:15:04,440
things are growing just incredibly fast?

318
00:15:04,620 --> 00:15:05,250
Corey Quinn: Oh yeah.

319
00:15:05,400 --> 00:15:08,940
I used to use Cursor a fair bit and
then I pivoted to Claude Code and I

320
00:15:08,940 --> 00:15:11,430
haven't gone back since, just because

321
00:15:11,925 --> 00:15:14,775
Cursor was great when I was looking
at the code and, okay, now make this

322
00:15:14,775 --> 00:15:16,605
section, do this other thing instead.

323
00:15:16,845 --> 00:15:20,475
But increasingly I don't look at
the code that this stuff puts out.

324
00:15:20,805 --> 00:15:23,714
Also, again, this is all
backend stuff that I'm building

325
00:15:23,714 --> 00:15:25,425
for ease of use in my stuff.

326
00:15:25,545 --> 00:15:28,515
I suppose now is a good time to
detour into a germane story that

327
00:15:28,515 --> 00:15:29,714
is, you know, our sponsor break.

328
00:15:29,775 --> 00:15:33,314
'cause my own company sponsors this.
At Duckbill, we have a history as a

329
00:15:33,314 --> 00:15:37,814
consultancy helping companies fix their
horrifying AWS and other cloud bills.

330
00:15:37,895 --> 00:15:42,335
via contract negotiation and
diving into FinOps strategy.

331
00:15:42,395 --> 00:15:44,165
Then now we're doing it
with software as well.

332
00:15:44,165 --> 00:15:48,905
Our product is called Skyway, and yeah,
we're using Cursor and Claude Code

333
00:15:48,905 --> 00:15:53,704
and the rest to build this thing, but
it is not itself an AI play directly.

334
00:15:53,704 --> 00:15:57,335
It's providing foundational, normalized
data warehouse infrastructure for other

335
00:15:57,335 --> 00:16:01,235
things such as MCPs to wind up talking
to and getting that data out of it.

336
00:16:01,475 --> 00:16:06,185
But by and large, that is still
a place that is relatively not

337
00:16:06,750 --> 00:16:07,890
where AI excels.

338
00:16:07,890 --> 00:16:10,290
And it's not because I have a
bias in this perspective that I'm

339
00:16:10,290 --> 00:16:13,920
saying that I've done a number of
experiments and continue to do them.

340
00:16:14,280 --> 00:16:18,089
It's not there yet for data sets
of this scale and this sensitivity.

341
00:16:18,390 --> 00:16:23,400
So if you're interested in learning more,
duckbillhq.com, please give us a shout

342
00:16:23,400 --> 00:16:26,490
and you might even have to deal with
me, should we have that conversation.

343
00:16:26,520 --> 00:16:27,000
Don't worry.

344
00:16:27,030 --> 00:16:28,740
We have people who are
actually good at this stuff.

345
00:16:29,099 --> 00:16:31,740
But yeah, there, there are
some areas where it seems like

346
00:16:32,245 --> 00:16:34,225
everyone's like, oh, you're
really building a B2B SaaS.

347
00:16:34,285 --> 00:16:36,355
Isn't this going to be disrupted by AI?

348
00:16:36,505 --> 00:16:39,295
Well, when you're, when you're
talking about things like normalized

349
00:16:39,295 --> 00:16:42,865
infrastructure spend across a wide
variety of providers, yeah, AI can

350
00:16:42,865 --> 00:16:47,065
help build the tooling and whatnot, but
telling Claude to go out and hit your

351
00:16:47,065 --> 00:16:51,745
billing data for all of your providers
and put it into a database for you sure

352
00:16:51,745 --> 00:16:53,905
would be terrific if that were to work.

353
00:16:53,965 --> 00:16:56,605
And it does for 80 to 90% of it.

354
00:16:56,875 --> 00:16:59,365
And then the edge and corner
cases absolutely cut you to

355
00:16:59,365 --> 00:17:02,815
ribbons because that's why this
is an area of enterprise concern.

356
00:17:03,355 --> 00:17:05,185
If it were simple, it
wouldn't be worth doing.

357
00:17:05,485 --> 00:17:05,665
Eric Anderson: Yeah.

358
00:17:05,665 --> 00:17:09,625
I think maybe an analogy that, that
your listeners might appreciate.

359
00:17:09,745 --> 00:17:11,454
You know, we, we've been
through this before, right?

360
00:17:11,454 --> 00:17:16,135
When AWS was kind of in its early
heyday, everyone was afraid of AWS.

361
00:17:16,135 --> 00:17:18,475
All the investors would
go down to re:Invent, and

362
00:17:18,795 --> 00:17:21,705
they'd announce this new
database and a bunch of startups

363
00:17:21,705 --> 00:17:23,595
would, would die because of it.

364
00:17:23,955 --> 00:17:26,925
And, and so we all thought the cloud
was gonna be vertical and, and Amazon

365
00:17:26,925 --> 00:17:28,935
was just taking over all the things.

366
00:17:29,385 --> 00:17:34,335
And then yet, uh, you know, come years
later, like four or five years later.

367
00:17:34,475 --> 00:17:35,855
Kind of late to the party.

368
00:17:35,855 --> 00:17:39,094
We got Datadog, kind of horizontal
monitoring across all the stack.

369
00:17:39,094 --> 00:17:42,754
We got eventually, just a couple years
ago, Wiz security monitoring across

370
00:17:42,754 --> 00:17:44,074
all, all, you know, all the clouds.

371
00:17:44,225 --> 00:17:46,564
We get the proprietary
databases, Snowflake and

372
00:17:46,564 --> 00:17:48,795
Databricks and, and ClickHouse.

373
00:17:48,814 --> 00:17:51,365
I think these are, these are
the things people prefer to use.

374
00:17:51,665 --> 00:17:54,754
So I'm optimistic that, uh,
and, and I guess I'm referring

375
00:17:54,754 --> 00:17:56,044
mostly to the infrastructure stack.

376
00:17:56,534 --> 00:17:57,625
That, that it doesn't go off.

377
00:17:57,824 --> 00:17:59,070
Kubernetes was a big unlock

378
00:17:59,070 --> 00:17:59,820
Corey Quinn: for this as well.

379
00:17:59,820 --> 00:18:03,480
I mean, back when I started doing
this, I was of the opinion this was game,

380
00:18:03,480 --> 00:18:07,469
set, and match to AWS, the end, and
there's gonna be a bunch of also-rans.

381
00:18:07,889 --> 00:18:09,899
I do not have that opinion these days.

382
00:18:09,929 --> 00:18:11,699
Uh, they're obviously not going away.

383
00:18:11,699 --> 00:18:13,409
They're not going anywhere, but.

384
00:18:13,670 --> 00:18:18,470
It's impossible to ignore Azure, GCP, and
even Oracle Cloud, but all the value seems

385
00:18:18,470 --> 00:18:20,510
to be at just one level up the stack.

386
00:18:20,720 --> 00:18:24,230
Take Vercel, for example, for
front end. It, it does all the

387
00:18:24,230 --> 00:18:26,150
things that you can do on AWS.

388
00:18:26,150 --> 00:18:30,590
Clearly, Vercel runs on AWS with
about a 20 to 30% markup on top

389
00:18:30,590 --> 00:18:34,310
of it, but I have a lot of stuff
running on Vercel instead of on AWS.

390
00:18:34,540 --> 00:18:35,080
Why?

391
00:18:35,229 --> 00:18:38,020
Well, because I don't know anything
about front end, but that's

392
00:18:38,020 --> 00:18:40,899
what the LLM picks, and okay,

393
00:18:40,899 --> 00:18:44,379
I, I don't have a strong enough opinion
to override it on that space, so.

394
00:18:44,379 --> 00:18:44,800
Okay.

395
00:18:44,800 --> 00:18:46,149
I guess we're putting the front end there.

396
00:18:46,449 --> 00:18:48,879
Eric Anderson: Yeah, so I think, I think
there's a chance, you know, there's a way

397
00:18:48,879 --> 00:18:51,590
to compete against open AI or Anthropic.

398
00:18:51,610 --> 00:18:57,129
I think certainly the thing that they're
weak on is just the diversity of like,

399
00:18:57,129 --> 00:19:00,820
they can go into Claude Code, they can
go into Claude Bot, they can go into

400
00:19:01,005 --> 00:19:05,355
the Claude Cowork, they're
fighting a battle on many fronts.

401
00:19:05,835 --> 00:19:09,915
And, and so if you can, if you can be
laser focused, if you can realize

402
00:19:09,915 --> 00:19:13,305
what is the front that is actually the
most valuable, like in the case of the

403
00:19:13,305 --> 00:19:17,685
cloud, it turned out to be like the data
warehouse was the front to fight on.

404
00:19:18,150 --> 00:19:21,750
Um, that was the area to win
where both Amazon was weak

405
00:19:21,750 --> 00:19:23,040
and the value would accrue.

406
00:19:23,100 --> 00:19:25,380
So if you can figure out the
right front and then just be

407
00:19:25,380 --> 00:19:26,880
laser focused and be good.

408
00:19:26,880 --> 00:19:31,290
I mean, I think, I think you have to be
as good at the AI game as these frontier

409
00:19:31,290 --> 00:19:32,820
labs, but I think that's possible.

410
00:19:32,970 --> 00:19:35,430
Like they clearly don't
have a lockhold on

411
00:19:35,430 --> 00:19:37,770
talent, you know, talent's
just leaking everywhere.

412
00:19:38,340 --> 00:19:41,700
So yeah, you find great talent,
you figure out where value's gonna

413
00:19:41,700 --> 00:19:44,640
accrue in an interesting space,
and then you're just laser focused.

414
00:19:44,820 --> 00:19:46,050
And if you can catch the growth.

415
00:19:46,500 --> 00:19:47,730
I think there's a viable path.

416
00:19:48,000 --> 00:19:48,210
Corey Quinn: Yeah.

417
00:19:48,240 --> 00:19:51,150
There's also the question of
what are the underserved niches?

418
00:19:51,150 --> 00:19:54,750
I've always liked finding the expression
of these things that, that works.

419
00:19:54,930 --> 00:19:59,280
There is no amount of money I can raise
from anyone that is going to mean that

420
00:19:59,280 --> 00:20:03,450
I am now the third massive frontier
lab that's building this stuff out.

421
00:20:03,450 --> 00:20:06,330
I'm discounting the ones at Google,
for example, like the, that, that's

422
00:20:06,330 --> 00:20:07,650
not exactly the same thing here.

423
00:20:07,860 --> 00:20:12,360
Uh, but I'm not gonna, I'm not gonna
outrun these players at that.

424
00:20:12,775 --> 00:20:14,905
The capabilities are
growing by leaps and bounds.

425
00:20:14,965 --> 00:20:18,115
So where are the areas that I know well
that I can bring some of these things to

426
00:20:18,115 --> 00:20:21,355
bear on, uh, industry specific expertise.

427
00:20:21,715 --> 00:20:22,015
Uh.

428
00:20:22,590 --> 00:20:26,399
Opportunity passes everywhere, and I
think that that is the way to think

429
00:20:26,399 --> 00:20:29,189
about it to, to no small extent.

430
00:20:29,820 --> 00:20:34,620
I also don't necessarily know that
I want to be building the exact same

431
00:20:34,620 --> 00:20:36,120
thing that everyone else is building.

432
00:20:36,389 --> 00:20:39,330
I, I like finding a
direction to take things in.

433
00:20:39,720 --> 00:20:42,360
It's weird because I find myself
for one of the first times in my

434
00:20:42,360 --> 00:20:45,960
life being something of a centrist
on this because I don't believe

435
00:20:45,960 --> 00:20:49,470
the doomsayers who say that we're going
to build AGI and we're going to,

436
00:20:49,845 --> 00:20:53,295
at this point, trample everything
out there because computers

437
00:20:53,295 --> 00:20:54,195
will think for themselves.

438
00:20:54,735 --> 00:20:56,355
We're not summoning God through JSON here.

439
00:20:56,775 --> 00:20:59,565
And I also am on the other side
where I don't think it's just a

440
00:20:59,565 --> 00:21:03,105
jumped up auto complete because
it is clearly far more than that.

441
00:21:03,315 --> 00:21:05,115
I'm, I'm between those two extremes.

442
00:21:05,115 --> 00:21:07,065
And it's a weird place to find myself,

443
00:21:07,305 --> 00:21:07,725
Eric Anderson: right.

444
00:21:08,055 --> 00:21:08,325
Yeah.

445
00:21:08,505 --> 00:21:12,615
Because usually the world's just
either really wrong or it's obvious.

446
00:21:12,615 --> 00:21:15,135
And in this case it's
neither, it's, it's like.

447
00:21:15,570 --> 00:21:19,470
This thing is for real and it's,
but you know, it, it's subject to

448
00:21:19,470 --> 00:21:21,510
physical laws like, like everybody else.

449
00:21:21,540 --> 00:21:21,750
Yeah,

450
00:21:21,870 --> 00:21:22,110
Corey Quinn: yeah.

451
00:21:22,560 --> 00:21:25,200
OpenAI alone has committed
to do more infrastructure spend

452
00:21:25,200 --> 00:21:28,590
between now and 2030 than there
is deployable global VC capital.

453
00:21:29,130 --> 00:21:33,540
I have some questions about what that's
going to look like because they're not the

454
00:21:33,540 --> 00:21:35,310
only lab that is doing this sort of thing.

455
00:21:35,550 --> 00:21:38,160
What does it look like
five years from now?

456
00:21:38,190 --> 00:21:40,620
What is the economic story of this?

457
00:21:40,620 --> 00:21:42,600
We're clearly looking at
something bubble shaped.

458
00:21:42,965 --> 00:21:44,705
What does the correction look like?

459
00:21:45,035 --> 00:21:45,215
It.

460
00:21:45,215 --> 00:21:46,385
I'll tell you what, it's not.

461
00:21:46,535 --> 00:21:49,895
It's not, and now we're going
to act as if LLMs never existed.

462
00:21:49,985 --> 00:21:52,145
You're not putting that
genie back in the bottle.

463
00:21:52,415 --> 00:21:54,845
Maybe this price of inference
is going to skyrocket.

464
00:21:54,875 --> 00:21:58,175
Maybe the ability to run things
that are almost as good locally

465
00:21:58,175 --> 00:21:59,585
is going to be the approach.

466
00:22:00,389 --> 00:22:03,330
Even with having coding assistance
build this stuff, maybe I don't need

467
00:22:03,330 --> 00:22:07,290
the top tier frontier, bleeding edge
state-of-the-art model to wind up

468
00:22:07,290 --> 00:22:11,760
doing what is effectively a fancy,
uh, sed string replacement in a file.

469
00:22:11,760 --> 00:22:14,790
Maybe that can be the local thing
and the deep architecture planning

470
00:22:14,790 --> 00:22:16,080
is something that gets outsourced.

471
00:22:16,500 --> 00:22:20,129
These are all things that people way
smarter than I am, are focused on.

472
00:22:20,370 --> 00:22:23,310
I'm just curious to see where it goes
because the benefit as a developer

473
00:22:23,310 --> 00:22:25,620
myself is accruing rapidly and massively

474
00:22:25,889 --> 00:22:26,939
Eric Anderson: a bubble is inevitable.

475
00:22:27,145 --> 00:22:31,014
There's, there's no avoiding a bubble
because the growth rates are so

476
00:22:31,044 --> 00:22:37,345
incredibly high. Like, when
Anthropic went from one to 7 billion

477
00:22:37,345 --> 00:22:41,010
in revenue in a year, so they have
to plan for another seven x increase.

478
00:22:41,985 --> 00:22:43,365
Or at least a five x increase.

479
00:22:43,365 --> 00:22:46,965
Like they can't just like, not
buy the capacity they need.

480
00:22:47,385 --> 00:22:49,455
And, and could it be higher,
could it be lower? They have no idea.

481
00:22:49,455 --> 00:22:52,125
So they have to, they have
to, they have to procure the

482
00:22:52,125 --> 00:22:56,475
capacity to satisfy at least some
portion of that expected demand.

483
00:22:57,075 --> 00:23:01,125
And until these crazy growth
rates give, we have to plan.

484
00:23:01,530 --> 00:23:05,190
We, we have to overbuy, you know,
until, like, eventually they

485
00:23:05,190 --> 00:23:06,810
give. We won't know when they give.

486
00:23:07,470 --> 00:23:10,410
And, and then when they give, we'll
have realized we have overbought,

487
00:23:10,470 --> 00:23:13,320
but until they give, we'll
feel like we have under bought.

488
00:23:13,710 --> 00:23:17,100
So a bubble is inevitable, but I
don't, and, and so I think,

489
00:23:17,100 --> 00:23:19,890
in terms of, like, bubble
prediction isn't all that helpful.

490
00:23:20,565 --> 00:23:23,835
Unless you can kind of call
the point at which we saturate,

491
00:23:23,865 --> 00:23:24,165
Corey Quinn: right?

492
00:23:24,165 --> 00:23:26,925
Economists have successfully
produced, uh, predicted five

493
00:23:26,925 --> 00:23:28,005
of the last three recessions.

494
00:23:28,005 --> 00:23:29,265
I mean, this is always the problem.

495
00:23:29,265 --> 00:23:30,495
You smack into it.

496
00:23:30,495 --> 00:23:31,815
You can't time the market.

497
00:23:31,815 --> 00:23:34,935
It can remain irrational longer
than you can remain solvent, but

498
00:23:34,935 --> 00:23:36,255
there are, there are limits on this.

499
00:23:36,255 --> 00:23:39,475
I, I spend 200 bucks a month
for the Claude Pro Max Plan

500
00:23:39,495 --> 00:23:40,605
with a smile on my face.

501
00:23:40,845 --> 00:23:44,145
I'm not gonna spend $5,000 a
month on that because at some

502
00:23:44,145 --> 00:23:47,115
point there is a limit and.

503
00:23:48,120 --> 00:23:51,090
It's, there has to be something that
gives. You're into population limits of

504
00:23:51,090 --> 00:23:53,490
people willing to drop that
kind of money on these things,

505
00:23:53,700 --> 00:23:55,170
but where is that boundary?

506
00:23:55,200 --> 00:23:55,830
I don't know.

507
00:23:56,040 --> 00:24:00,810
A lot of the funding sort of asks and the
messaging has been around that your boss

508
00:24:00,810 --> 00:24:05,760
is going to replace you with AI and then
split your salary with the AI company.

509
00:24:05,790 --> 00:24:09,000
Like I, I don't think that
that necessarily tracks

510
00:24:09,060 --> 00:24:09,390
Eric Anderson: Yeah.

511
00:24:09,390 --> 00:24:12,900
This, this idea that all pricing
holds and like whoever deploys

512
00:24:12,900 --> 00:24:14,370
the AI gets to keep all the money

513
00:24:14,800 --> 00:24:15,730
is crazy.

514
00:24:15,790 --> 00:24:19,270
Like there's certainly gonna be some
amount of commodification where people

515
00:24:19,270 --> 00:24:23,050
are like, oh, I don't have to, like,
I wanna keep some of my money too.

516
00:24:23,050 --> 00:24:24,190
I'm not just gonna give it all to you.

517
00:24:24,190 --> 00:24:27,970
And you deploy the AI. I'm expecting
that all of the people I buy

518
00:24:27,970 --> 00:24:29,260
things from are deploying AI.

519
00:24:30,130 --> 00:24:32,140
They're gonna bargain, you
know, they're, they're gonna

520
00:24:32,140 --> 00:24:33,580
compete in a marketplace where.

521
00:24:34,155 --> 00:24:38,265
Everyone lowers their prices
because they can. Eventually, margins

522
00:24:38,325 --> 00:24:39,975
get to the point where they always were.

523
00:24:40,245 --> 00:24:42,765
You know, there's a certain amount of
money people are willing to work for

524
00:24:43,125 --> 00:24:45,975
and if you have monopoly power, you get,
you get to charge a little bit extra.

525
00:24:46,155 --> 00:24:48,615
Like, like the lawyers today,
there's a lot of talk that, like

526
00:24:48,765 --> 00:24:50,205
lawyers used to bill by the hour.

527
00:24:50,205 --> 00:24:53,025
They can't do that anymore
because now AI is doing the work.

528
00:24:53,025 --> 00:24:53,385
So they have to

529
00:24:54,645 --> 00:24:56,715
bill by the, by the project.

530
00:24:57,075 --> 00:24:59,895
And then they just keep
the extra money, I think.

531
00:24:59,895 --> 00:25:02,325
I think we're all just gonna be
like, no, you're not really doing

532
00:25:02,325 --> 00:25:03,855
any work, so I'm gonna pay you less.

533
00:25:03,855 --> 00:25:05,295
And then we get back
to billing by the hour.

534
00:25:05,865 --> 00:25:07,695
Corey Quinn: Everyone acts like
this changes everything, and

535
00:25:07,695 --> 00:25:09,045
I'm not convinced that it does.

536
00:25:09,225 --> 00:25:10,545
There are strong

537
00:25:11,100 --> 00:25:13,770
indications that this is,
there are ways forward on this.

538
00:25:13,950 --> 00:25:14,700
Uh, take a couple.

539
00:25:14,700 --> 00:25:15,930
You're a board observer for Honeycomb.

540
00:25:15,930 --> 00:25:18,960
Uh, we've been working with them both with
the client and other ways for a long time.

541
00:25:19,169 --> 00:25:23,370
I love the way that they do AI because
they don't splatter AI all over

542
00:25:23,370 --> 00:25:24,810
their messaging and their marketing.

543
00:25:24,990 --> 00:25:26,879
They have built it in useful ways.

544
00:25:26,909 --> 00:25:28,560
Their MCP is a thing of beauty.

545
00:25:28,740 --> 00:25:31,440
The fact that you can ask in
plain language what the hell is

546
00:25:31,440 --> 00:25:34,530
going on in your environment and
it will tell you is glorious.

547
00:25:34,620 --> 00:25:38,040
But when, whenever someone talks
about AI to the exclusion of all else,

548
00:25:38,115 --> 00:25:40,785
I'm sorry to break the hearts
of marketers out there, but as

549
00:25:40,785 --> 00:25:42,495
a customer, that's off-putting.

550
00:25:42,735 --> 00:25:47,325
I don't care if you are using AI,
incredibly smart if statements, or

551
00:25:47,325 --> 00:25:49,035
just interns that type very quickly.

552
00:25:49,305 --> 00:25:52,575
I just care about the outcome
that you are delivering for value.

553
00:25:52,845 --> 00:25:56,505
That's the important piece from where
I sit and I'm not, I'm very far from

554
00:25:56,535 --> 00:25:59,685
alone in that it's similar to, I
don't care how the sausage is made.

555
00:25:59,685 --> 00:26:00,615
I care that it tastes good.

556
00:26:00,795 --> 00:26:01,125
Eric Anderson: Yeah.

557
00:26:01,185 --> 00:26:01,455
Yeah.

558
00:26:01,455 --> 00:26:03,075
There's a kind of a second-order

559
00:26:03,405 --> 00:26:07,875
wave, I think, of AI use, like the first
is the obvious, where we use it in the

560
00:26:07,875 --> 00:26:13,065
context of our current, what's a good
example, you know, the, the AI workers.

561
00:26:13,125 --> 00:26:17,835
Like, oh, let's have an AI SDR or
an AI data scientist, because that's,

562
00:26:17,865 --> 00:26:21,045
that's like the current framework of
our society, and we can plug them in in

563
00:26:21,045 --> 00:26:25,665
those holes, but presumably we should
discover new ways of organizing society

564
00:26:26,250 --> 00:26:28,080
that weren't possible until we had AI.

565
00:26:28,080 --> 00:26:30,750
And once we discover those new ways, then
we'll have products that take shapes.

566
00:26:30,750 --> 00:26:32,790
We don't, we can't really
imagine at this point.

567
00:26:33,330 --> 00:26:35,640
And so I think you're right, like
the way Honeycomb feels like a,

568
00:26:35,640 --> 00:26:39,060
a tease towards this future where
like, hey, maybe not, everything's a

569
00:26:39,060 --> 00:26:42,930
chat bot, actually, uh, like a side
panel alongside the traditional app.

570
00:26:43,365 --> 00:26:46,485
Maybe it's like infused within
applications in ways that we

571
00:26:46,485 --> 00:26:49,215
didn't really think possible
before because it wasn't possible.

572
00:26:49,455 --> 00:26:51,825
Corey Quinn: Maybe I don't wanna learn
your dumb proprietary SQL version

573
00:26:51,825 --> 00:26:53,445
to get value out of your platform.

574
00:26:53,445 --> 00:26:57,705
Maybe I, maybe your robot can do that
for me, similar to when I have a problem,

575
00:26:57,705 --> 00:26:58,965
I need to reach out to a company.

576
00:26:59,085 --> 00:27:00,764
Don't make me talk to an AI bot.

577
00:27:01,240 --> 00:27:04,899
But have that AI bot provide valuable
context to this human support agent that

578
00:27:04,899 --> 00:27:09,010
I'm talking to, to power through that
a lot more quickly and provide context

579
00:27:09,010 --> 00:27:10,389
like, I had the last seven tickets.

580
00:27:10,389 --> 00:27:13,209
This guy either knows what he's
talking about or is a complete

581
00:27:13,209 --> 00:27:14,679
buffoon. Adjust accordingly.

582
00:27:15,010 --> 00:27:18,639
And they can provide, they can get
to answers a lot more effectively

583
00:27:18,639 --> 00:27:22,209
that way, as opposed to making
me run the AI gauntlet before.

584
00:27:22,209 --> 00:27:22,780
Finally.

585
00:27:22,840 --> 00:27:22,929
Huh.

586
00:27:23,290 --> 00:27:24,879
It looks like you can't
solve this problem yourself.

587
00:27:24,879 --> 00:27:26,350
You're gonna have to talk to a human.

588
00:27:26,379 --> 00:27:27,340
No kidding.

589
00:27:27,970 --> 00:27:30,520
Uh, all these companies have been
talking about chatbots, like it's

590
00:27:30,520 --> 00:27:32,560
somehow the pinnacle of user experience.

591
00:27:32,770 --> 00:27:32,949
No.

592
00:27:32,949 --> 00:27:36,879
People talk to chatbots or humans when
the user experience has failed them.

593
00:27:36,939 --> 00:27:38,379
You're already starting a step behind.

594
00:27:38,500 --> 00:27:38,740
Eric Anderson: Yes.

595
00:27:38,745 --> 00:27:38,925
Yes.

596
00:27:39,250 --> 00:27:41,649
What if the people in the call center
were three times more effective

597
00:27:41,830 --> 00:27:45,159
because they just solved their problem
three times faster rather than talk

598
00:27:45,159 --> 00:27:48,669
to three customers at once, which is
what I feel like happens most of the time.

599
00:27:48,669 --> 00:27:50,770
They're like, yeah, let
me check on that for you.

600
00:27:50,770 --> 00:27:51,490
Two minutes later.

601
00:27:52,000 --> 00:27:53,470
Like, how is it taking this long?

602
00:27:54,040 --> 00:27:54,430
Corey Quinn: Right.

603
00:27:54,460 --> 00:27:57,760
Or they wind up asking you questions
you answered three messages ago.

604
00:27:57,760 --> 00:27:59,590
It's like, I thought I had
a short context window.

605
00:27:59,800 --> 00:28:00,280
It's awful.

606
00:28:00,850 --> 00:28:01,270
Ugh.

607
00:28:01,450 --> 00:28:04,060
I wanna thank you for taking the
time to speak with me about all this.

608
00:28:04,090 --> 00:28:07,000
If people wanna learn more, where's
the best place for them to find you?

609
00:28:07,090 --> 00:28:08,650
Eric Anderson: You can
find me on Scale's website.

610
00:28:08,740 --> 00:28:10,180
I'm fairly active on LinkedIn.

611
00:28:10,180 --> 00:28:12,700
I need to get my Twitter,
what do we call it now?

612
00:28:12,700 --> 00:28:13,480
Game up.

613
00:28:13,480 --> 00:28:18,280
But yeah, I, I, LinkedIn or, uh, Scale's
website would be a good place to start.

614
00:28:18,665 --> 00:28:21,335
Corey Quinn: And we'll of course
put links to that in the show notes.

615
00:28:21,665 --> 00:28:23,765
Thank you so much for taking
the time to speak with me.

616
00:28:23,795 --> 00:28:24,575
I appreciate it.

617
00:28:24,875 --> 00:28:25,355
Eric Anderson: Thanks, Corey.

618
00:28:25,535 --> 00:28:27,545
Corey Quinn: Eric
Anderson, partner at Scale.

619
00:28:27,965 --> 00:28:31,205
I'm Cloud Economist Corey Quinn,
and this is Screaming in the Cloud.

620
00:28:31,475 --> 00:28:32,825
If you've enjoyed this podcast,

621
00:28:32,825 --> 00:28:36,035
please leave a five-star review on
your podcast platform of choice,

622
00:28:36,215 --> 00:28:39,365
whereas if you've hated this podcast,
please leave a five-star review

623
00:28:39,365 --> 00:28:43,175
on your podcast platform of choice,
along with an angry comment talking

624
00:28:43,175 --> 00:28:47,315
about how your minor incremental
feature to an AI foundation model

625
00:28:47,530 --> 00:28:51,520
is way different than the others and no
one could ever possibly compete with you.