1
00:00:08,355 --> 00:00:16,066
Hi everyone, welcome to the Monkey Patching Podcast where we go bananas about all things
physical world AI, AI-generated art and more.

2
00:00:16,066 --> 00:00:19,510
My name is Murilo, I'm joined by my friend Bart, hey Bart.

3
00:00:21,453 --> 00:00:22,826
Doing good, how are you?

4
00:00:22,826 --> 00:00:24,267
Doing fine as well, doing fine as well.

5
00:00:24,267 --> 00:00:27,771
Quite busy these days with the new startup that we launched.

6
00:00:27,771 --> 00:00:28,552
Top of mind.

7
00:00:28,552 --> 00:00:29,573
Everybody check it out.

8
00:00:29,573 --> 00:00:30,526
topofmind.cloud.

9
00:00:30,526 --> 00:00:31,286
There we go.

10
00:00:31,286 --> 00:00:33,506
I don't know, it's been a little while since we recorded last.

11
00:00:33,506 --> 00:00:37,657
I don't know if you had a website back then, but you definitely have one now,
right?

12
00:00:37,657 --> 00:00:38,309
Yeah, we do.

13
00:00:38,309 --> 00:00:40,394
I'm not sure if we had it back then, actually.

14
00:00:40,394 --> 00:00:45,178
Everything is a few weeks old, but we're moving very quickly.

15
00:00:45,178 --> 00:00:50,736
Maybe again, for people that didn't listen last time, what is Top of Mind in a
one-liner?

16
00:00:50,736 --> 00:00:59,846
I want to say an application, but it's more than an application. We're basically building
a system for people that try to store a lot of knowledge,

17
00:00:59,846 --> 00:01:04,070
try to store a lot of information, and also are very active with their network.

18
00:01:04,070 --> 00:01:07,793
Think about salespeople, think about investors, think about business leaders.

19
00:01:07,793 --> 00:01:11,786
Like you want to be aware of: I'm having a talk with Murilo.

20
00:01:11,786 --> 00:01:14,746
He's saying that he's going to run a marathon in two months.

21
00:01:14,746 --> 00:01:17,226
Like, it would be nice if you get reminded in two months,

22
00:01:17,226 --> 00:01:18,666
I'm really looking forward to this marathon.

23
00:01:18,666 --> 00:01:20,917
Like, that you have this context top of mind.

24
00:01:20,917 --> 00:01:28,768
And what we're basically building is a system that allows you, with very, very low
friction, to just ingest any kind of information.

25
00:01:28,768 --> 00:01:32,495
And then it basically structures it for you and surfaces it for you at the right time.

26
00:01:32,495 --> 00:01:34,379
Yeah, very cool.

27
00:01:34,379 --> 00:01:36,612
You're also, you said you're busy, right?

28
00:01:36,612 --> 00:01:38,446
Maybe a small shout out, right?

29
00:01:38,446 --> 00:01:41,450
Like you're also looking for people to join the team.

30
00:01:41,450 --> 00:01:42,992
Yes, we have two openings.

31
00:01:42,992 --> 00:01:47,136
We have a total team of four people now, of which two people are technical, and that's not
enough.

32
00:01:47,136 --> 00:01:49,318
So we're looking for a new software engineer.

33
00:01:49,318 --> 00:01:51,150
Next to that, also a marketing engineer.

34
00:01:51,150 --> 00:01:54,733
Okay, okay, so if anyone is interested, yeah, I think.

35
00:01:55,313 --> 00:01:58,796
Well, we've got to pay the podcast bills somehow, Bart, just...

36
00:01:58,796 --> 00:01:59,866
So yeah, very cool.

37
00:01:59,866 --> 00:02:03,277
So you've been busy with, yeah, I guess just sorting everything out.

38
00:02:03,277 --> 00:02:06,849
And also you have some users already, like test users.

39
00:02:06,849 --> 00:02:08,970
Yeah, we went live with test users.

40
00:02:08,970 --> 00:02:12,490
I think actually last time we recorded, I said we were going to go live with test users.

41
00:02:12,490 --> 00:02:13,881
And we went live, yeah.

42
00:02:13,981 --> 00:02:16,503
I think for now, this is the second week.

43
00:02:16,503 --> 00:02:18,986
I think it's positive overall.

44
00:02:18,986 --> 00:02:25,388
I think the biggest challenge that we have actually is that our back end is basically an
AI agent.

45
00:02:25,388 --> 00:02:29,370
So you ingest a lot of knowledge, and this agent can do a lot

46
00:02:29,370 --> 00:02:31,552
for you because it has access to a lot of different tools.

47
00:02:31,552 --> 00:02:35,857
It can structure your knowledge, but it also can use a lot of external sources.

48
00:02:35,857 --> 00:02:44,306
And because of all these combinations of tools that it has access to, it's hard to get an
overview of what are actually all the features that this app has.

49
00:02:45,027 --> 00:02:51,988
And we need to make sure that we surface the right features to the right users, if that
makes sense.

50
00:02:51,988 --> 00:02:53,029
I see what you're saying.

51
00:02:53,029 --> 00:02:58,968
Basically what you're, well, maybe to make sure I understand, you're saying like, this
person would leverage this kind of feature a lot.

52
00:02:58,968 --> 00:03:02,103
So we need to make sure this person knows that this feature is there.

53
00:03:02,103 --> 00:03:08,091
But because there's so much that it does, maybe it's not easy for the person to know that
they can just, I don't know, use this.

54
00:03:08,091 --> 00:03:18,762
So what our agent now does is, you could say, for example, at the end of the
week: give me a summary of all the new information that I collected about people, about

55
00:03:18,762 --> 00:03:19,893
organizations this week.

56
00:03:19,893 --> 00:03:30,525
Or I can say at the beginning of the week: give me an overview of which new invites I have
in upcoming meetings, and build a profile on them.

57
00:03:30,525 --> 00:03:36,071
Or I can say: I think Apple is a very nice company, send me an update if there's any daily
news on them.

58
00:03:36,071 --> 00:03:40,767
All of these are possible today already, but there's not like an "enable
this".

59
00:03:40,767 --> 00:03:41,858
Like there's not a button to click.

60
00:03:41,858 --> 00:03:43,913
Like you need to say it to the application.

61
00:03:43,913 --> 00:03:49,961
And I think what we need to do is to see what are the features that 80% of users like,
and automatically enable them.

62
00:03:49,961 --> 00:03:50,962
Yeah, I see what you're saying.

63
00:03:50,962 --> 00:03:52,134
Okay, very, very cool.

64
00:03:52,134 --> 00:03:52,874
Very cool.

65
00:03:52,874 --> 00:04:00,064
So again, if someone is interested, it's an AI product looking for AI-native engineers,
right?

66
00:04:00,064 --> 00:04:05,732
So I think if you want to work on AI and if you want to work with AI, think this is a cool
opportunity, right?

67
00:04:07,282 --> 00:04:08,057
Yeah, no?

68
00:04:08,057 --> 00:04:13,103
Like I said, we'll adjust the advertisement fees, I'll send it to you later.

69
00:04:13,103 --> 00:04:14,615
But no, very cool.

70
00:04:14,615 --> 00:04:22,425
I've also been playing more with AI, but not so much Claude Code, which I was using before,
but also with Claude Cowork as well.

71
00:04:22,425 --> 00:04:22,990
yeah.

72
00:04:22,990 --> 00:04:28,690
I have some, let's say, less technical work like preparing proposals or slides or all
these things in my day job.

73
00:04:28,690 --> 00:04:33,741
And I've been using it more, and actually, we were discussing this a bit
before we started recording.

74
00:04:33,741 --> 00:04:40,601
Not only, I think the output is better because I feel like I have a sparring partner that
can brainstorm with me, and it's much faster, and it maybe thinks of things that I

75
00:04:40,601 --> 00:04:41,192
didn't think of.

76
00:04:41,192 --> 00:04:49,123
I think it's also faster because I spent a lot of time preparing slides, making sure
the boxes are the same size, creating text boxes, whatever.

77
00:04:49,123 --> 00:04:50,354
And also.

78
00:04:50,354 --> 00:04:51,087
It's more fun.

79
00:04:51,087 --> 00:04:55,491
So I feel like I spend less time doing these things, so it's been interesting.

80
00:04:55,491 --> 00:04:56,862
Yeah.

81
00:04:56,862 --> 00:05:01,038
It is a bit like a workflow that you need to get into, but once you're into it, it really
makes you more efficient.

82
00:05:01,038 --> 00:05:02,079
Yeah, it really does.

83
00:05:02,079 --> 00:05:07,266
I mean, I was even reflecting to myself, because with Cowork you give access to
a directory, right?

84
00:05:07,266 --> 00:05:09,237
So in some ways it kind of feels like Claude Code.

85
00:05:09,237 --> 00:05:15,021
Like when you go to a terminal and you talk to it, and you
create and edit files, and then you can go in and change it again.

86
00:05:15,021 --> 00:05:17,734
And then you can read the files and edit them, like iterate on this.

87
00:05:17,734 --> 00:05:26,377
But Claude Cowork feels, I don't know, it feels like it fits better with the
non-technical work somehow, even though I feel like I could do the same things with Claude

88
00:05:26,377 --> 00:05:27,198
Code.

89
00:05:27,424 --> 00:05:27,675
Right.

90
00:05:27,675 --> 00:05:31,101
So I was also reflecting, like, is it just a UX thing?

91
00:05:31,101 --> 00:05:33,135
Is it just because there's an app instead of the terminal?

92
00:05:33,135 --> 00:05:34,437
What is, what is this?

93
00:05:34,437 --> 00:05:34,838
Right.

94
00:05:34,838 --> 00:05:35,388
So.

95
00:05:35,388 --> 00:05:38,151
I think it's also a bit primed for desktop work.

96
00:05:38,151 --> 00:05:45,001
I think it has some built-in skills around, for example, generating PowerPoint
presentations, stuff like that, which you don't have by default in Claude Code.

97
00:05:45,001 --> 00:05:45,812
Yeah, for sure.

98
00:05:45,812 --> 00:05:54,571
But we also use Granola, and I also have a Granola MCP, and sometimes we
have meetings to discuss a bit what the format of the proposal should be, and then you

99
00:05:54,571 --> 00:05:55,472
can just hook up.

100
00:05:55,472 --> 00:05:59,846
I don't know, yeah, I think maybe there's also some prompting that
makes it better.

101
00:05:59,846 --> 00:06:03,831
These built-in skills maybe, but yeah.

102
00:06:03,831 --> 00:06:06,453
And also, have you used Claude Projects as well?

103
00:06:08,188 --> 00:06:10,525
But that's the one that has existed already for a long time, right?

104
00:06:10,525 --> 00:06:18,715
Yeah, so you can have projects, and you can spin from
Projects to Cowork as well.

105
00:06:18,715 --> 00:06:21,335
But then to me, I was a bit like, what's the difference between them?

106
00:06:21,335 --> 00:06:30,155
And I did a quick search, and they're saying Projects is more so you have the same
context throughout, and then Cowork is more for automating, according to them.

107
00:06:30,155 --> 00:06:32,506
Like if you want to convert from one thing to another.

108
00:06:32,506 --> 00:06:34,017
And they interact with each other as well.

109
00:06:34,017 --> 00:06:38,381
So you can go from a project and say, do this in Cowork, and then let me know when it's
done, and all these things.

110
00:06:38,381 --> 00:06:44,236
So I'm also giving it a try, but yeah, still need to organize myself a bit and see what
works really well for me.

111
00:06:44,236 --> 00:06:45,817
Yeah, do you use Cowork or...?

112
00:06:45,817 --> 00:06:47,598
uh

113
00:06:47,659 --> 00:06:52,339
I use it, but not as a daily driver.

114
00:06:52,339 --> 00:06:53,170
Let's play that.

115
00:06:53,170 --> 00:06:53,721
Okay.

116
00:06:53,721 --> 00:06:54,161
Very cool.

117
00:06:54,161 --> 00:07:00,952
I actually ran out of, I reached the limit with Claude Cowork when I was using
it.

118
00:07:00,952 --> 00:07:04,563
So I had to wait, I don't know, three hours.

119
00:07:04,563 --> 00:07:05,894
Team plan, but yeah.

120
00:07:05,894 --> 00:07:09,520
Actually, like, the team plan doesn't have Max.

121
00:07:09,520 --> 00:07:15,365
I realized, so I was setting up our team plan, like the Anthropic team plan for Claude
Code.

122
00:07:15,365 --> 00:07:16,726
And it doesn't have a max plan.

123
00:07:16,726 --> 00:07:23,223
Like it only has sort of the teams version of the Pro plan, which is at most 5x, I
think, while the Max plan is 20x.

124
00:07:23,223 --> 00:07:27,687
So you basically need to subscribe as an individual contributor to get the 20x plan.

125
00:07:27,687 --> 00:07:29,389
And I think if you, if you.

126
00:07:29,389 --> 00:07:34,050
use this to do AI-native coding as a daily driver, you need 20x.

127
00:07:34,050 --> 00:07:37,324
Otherwise, you're going to run out of your limits very soon.

128
00:07:37,324 --> 00:07:39,175
Yeah, yeah, I see what you're saying.

129
00:07:39,175 --> 00:07:41,315
the max is...

130
00:07:41,315 --> 00:07:42,815
but so...

131
00:07:42,815 --> 00:07:44,906
So I'm looking here at the page as we're speaking.

132
00:07:44,906 --> 00:07:50,966
So you have Pro, and then you have Max, and then it says you choose between 5x or 20x.

133
00:07:50,966 --> 00:07:53,546
And I think the 5x is $100 here.

134
00:07:53,546 --> 00:07:55,646
If I'm not mistaken, the 20x is $180.

135
00:07:55,646 --> 00:07:57,106
So I'm telling you that.

136
00:07:57,106 --> 00:07:57,906
So this is Max.

137
00:07:57,906 --> 00:08:04,477
And then if you go for the Team plan, you have this premium seat, which is 5x, which is $100.

138
00:08:04,857 --> 00:08:06,668
Yeah, okay, okay, that's interesting.

139
00:08:06,668 --> 00:08:08,910
Huh, interesting, yeah, yeah.

140
00:08:08,910 --> 00:08:10,011
Maybe one last thing.

141
00:08:10,011 --> 00:08:12,773
Did you see that little tidbit as well, as we're talking about Claude?

142
00:08:12,773 --> 00:08:20,749
I think Claude also expanded the usage for Claude Code on the weekends or something like
that, to try to encourage people to tinker more with Claude on the weekends, I think.

143
00:08:20,749 --> 00:08:21,229
Did you see that?

144
00:08:21,229 --> 00:08:21,800
Something like that.

145
00:08:21,800 --> 00:08:23,460
I don't have a link for that.

146
00:08:25,903 --> 00:08:34,063
Yeah, it was like, the weekend usage doesn't count, because with Claude
you have a session limit and then you have a weekly limit.

147
00:08:34,063 --> 00:08:34,464
Right.

148
00:08:34,464 --> 00:08:34,975
And

149
00:08:34,975 --> 00:08:39,965
Now they extended it on the weekends, and if you use it on the weekend it doesn't count
toward the weekly limit, or something like this.

150
00:08:39,965 --> 00:08:45,668
I think actually in Claude Code there was a hint or tip rendered at some point about
this.

151
00:08:45,668 --> 00:08:46,631
Rings a bell, yeah.

152
00:08:46,631 --> 00:08:48,262
Yeah, that was funny.

153
00:08:48,262 --> 00:08:54,046
Like, I mean, they're kinda encouraging people to tinker on the weekend,
right, not work, but yeah.

154
00:08:54,046 --> 00:08:55,937
And also for Codex, they did something similar.

155
00:08:55,937 --> 00:08:57,996
They said, Codex, I think it's still free.

156
00:08:57,996 --> 00:08:58,782
I'm not sure.

157
00:08:58,782 --> 00:09:05,667
But they said if you're using the app instead of the terminal, you get 2x
usage as well.

158
00:09:05,667 --> 00:09:07,149
Okay, okay.

159
00:09:10,298 --> 00:09:11,331
Only the terminal.

160
00:09:11,331 --> 00:09:18,090
Yeah, I use the Codex app because of the extra usage, and I want to try
Codex, but I also prefer the terminal.

161
00:09:18,090 --> 00:09:19,082
I'm not too sure.

162
00:09:19,082 --> 00:09:24,423
Yeah, I feel a bit limited when I'm not in the terminal using Claude Code.

163
00:09:24,423 --> 00:09:30,963
It is nice though if you're on the road and you can quickly create a PR
for something, right?

164
00:09:31,242 --> 00:09:34,323
One thing I have also used is the remote control from the terminal.

165
00:09:34,323 --> 00:09:36,343
You know, I've used the remote control.

166
00:09:36,343 --> 00:09:45,218
So it's like, I sent something to work on and then I went to the
gym or something, and then it wants clarifications or something, or just like,

167
00:09:45,218 --> 00:09:47,689
it's this, it's that, go ahead, do this, do this.

168
00:09:47,853 --> 00:09:49,114
I was quite happy with it as well.

169
00:09:49,114 --> 00:09:50,096
So cool.

170
00:09:50,096 --> 00:09:52,309
What do we have for today Bart?

171
00:09:52,309 --> 00:09:57,634
Yann LeCun is back with a billion-dollar bet that AI should learn from the physical world,
not just from text.

172
00:09:57,634 --> 00:10:00,087
It's a direct challenge to the language model consensus.

173
00:10:00,087 --> 00:10:09,146
The new startup AMI launches at a 3.5 billion valuation and aims first at
industries like manufacturing, biomedicine and robotics.

174
00:10:09,146 --> 00:10:14,516
Cool. Maybe for people that have never heard of him, or that aren't as deep in the AI world,
who is Yann LeCun?

175
00:10:14,516 --> 00:10:16,691
Why does it matter that it's this guy?

176
00:10:16,691 --> 00:10:19,388
I think Yann LeCun is probably one of the...

177
00:10:19,388 --> 00:10:24,591
not sure the godfather, but at least one of the grandfathers of modern AI, I would say.

178
00:10:24,591 --> 00:10:38,553
He is probably most known to the general public through his work at Facebook, where among
other things, and I don't think he's solely responsible, but among other things, the

179
00:10:38,553 --> 00:10:40,715
Llama models came out of his team.

180
00:10:40,715 --> 00:10:44,390
Yeah, he did a lot of stuff on the research as well, right?

181
00:10:44,390 --> 00:10:47,542
He worked at the university, like on computer vision.

182
00:10:47,542 --> 00:10:50,271
Yeah, exactly.

183
00:10:50,271 --> 00:10:57,866
And maybe more recently as well, he was very vocal in the beginning; he
had a very famous tweet, I think it was him.

184
00:10:57,866 --> 00:10:59,978
He said like, I'm not interested in LLMs anymore.

185
00:10:59,978 --> 00:11:01,570
I'm moving on to something else.

186
00:11:01,570 --> 00:11:03,406
He also said a few times that

187
00:11:03,406 --> 00:11:06,140
He didn't think that LLMs are the path to AGI.

188
00:11:06,140 --> 00:11:11,338
So he's been quote unquote critical of what's today's state-of-the-art AI,
right?

189
00:11:11,338 --> 00:11:12,808
Which is LLMs, basically.

190
00:11:12,808 --> 00:11:13,990
All very critical, right?

191
00:11:13,990 --> 00:11:16,195
Like in the last two years, I want to say.

192
00:11:16,195 --> 00:11:18,370
Something like that.

193
00:11:18,370 --> 00:11:19,034
yeah.

194
00:11:19,034 --> 00:11:22,980
And what is this now that he has a new startup?

195
00:11:22,980 --> 00:11:23,824
What is it about?

196
00:11:23,824 --> 00:11:34,864
So his point basically is that the current architecture, the current text-based LLM
models, are limited in how much further they can evolve, because they have a

197
00:11:34,864 --> 00:11:38,490
lot of difficulty understanding the physical world, basically.

198
00:11:38,490 --> 00:11:43,743
And what they're building, what AMI is building, AMI apparently stands for Advanced
Machine Intelligence.

199
00:11:43,743 --> 00:11:44,945
Sounds very advanced, right?

200
00:11:44,945 --> 00:11:45,657
Yeah.

201
00:11:45,657 --> 00:11:54,265
But they're building a new type of model which they call the JEPA architecture, the Joint
Embedding Predictive Architecture, which I think we should probably do a deep dive session

202
00:11:54,265 --> 00:11:55,366
on this at some point.

203
00:11:55,366 --> 00:12:02,954
But it learns abstract representations rather than predicting like something pixel by
pixel or like word by word.

204
00:12:02,954 --> 00:12:05,894
um

205
00:12:05,894 --> 00:12:12,657
How that exactly translates to an architecture, I would be interested to do a deep dive;
I'm not sure at this point.

206
00:12:12,657 --> 00:12:15,855
um

207
00:12:15,855 --> 00:12:16,885
Any initial results?

208
00:12:16,885 --> 00:12:26,664
Because again, to me it's like, we have LLMs, which are basically the transformer
architecture that exploded, and now there are variations of this architecture, as I

209
00:12:26,664 --> 00:12:27,355
understand.

210
00:12:27,355 --> 00:12:34,280
I know that I'm not as much in the know for these latest and greatest architectures, but
it took a long time to get there.

211
00:12:34,280 --> 00:12:36,444
And I feel like to introduce a new one is also...

212
00:12:36,444 --> 00:12:41,168
It's not because it's new that it's gonna be better than LLMs, right?

213
00:12:41,168 --> 00:12:42,525
It's difficult to say, but they're...

214
00:12:42,525 --> 00:12:46,587
There are clearly a lot of people that believe in it, if you can raise one billion at this
moment, right?

215
00:12:46,587 --> 00:12:52,430
For something for which, as far as we know, there is no real practical proof
yet.

216
00:12:52,430 --> 00:12:59,635
But what they did, I think, so AMI now consists of a lot of people from Yann LeCun's old
team at Facebook, the FAIR team.

217
00:12:59,635 --> 00:13:04,058
And the FAIR team there actually did a lot, like was focusing more on research.

218
00:13:04,058 --> 00:13:06,841
Like we all know Facebook AI from Llama, right?

219
00:13:06,841 --> 00:13:10,193
But Llama actually came from the GenAI team, and Yann LeCun was probably involved.

220
00:13:10,193 --> 00:13:12,545
He was not the one driving the Llama models.

221
00:13:12,545 --> 00:13:20,032
His FAIR team was, from what I understand, a lot more involved in fundamental research,
but also in reinforcement learning and these kinds of things.

222
00:13:20,967 --> 00:13:27,913
And how I understand it is that there already, like, he was investigating these
types of newer architectures.

223
00:13:27,913 --> 00:13:28,775
Okay.

224
00:13:29,202 --> 00:13:30,003
And,

225
00:13:30,219 --> 00:13:41,879
He's been getting a lot of flak in the last year, I want to say, because he's
been extremely negative for the last two years on the advances that quote unquote

226
00:13:41,879 --> 00:13:43,670
traditional LLMs can still make.

227
00:13:43,670 --> 00:13:54,456
But at the same time, we're in a space where, in the last two years, these
traditional LLMs have become, one, multimodal, and, two, better than he ever thought possible.

228
00:13:54,456 --> 00:13:55,196
Probably, right?

229
00:13:55,196 --> 00:13:59,287
Like if we see how good these things are in certain tasks today.

230
00:13:59,287 --> 00:14:04,458
And the other thing is also, let's be honest, Facebook was never able to compete on LLMs.

231
00:14:05,267 --> 00:14:10,278
The only reason why they were relevant is because they had an open-weight model that
everybody could easily and cheaply use.

232
00:14:10,278 --> 00:14:11,118
Yeah, that's true.

233
00:14:11,118 --> 00:14:11,809
That's true.

234
00:14:11,809 --> 00:14:13,691
Yeah, maybe the...

235
00:14:13,691 --> 00:14:16,283
He says here, I just did a quick search, right?

236
00:14:16,283 --> 00:14:21,108
One year ago, Yann LeCun said: if you're interested in human-level AI, don't work
on LLMs.

237
00:14:21,108 --> 00:14:27,003
So I think he was also saying, a lot of these things are marginal gains on LLMs; like,
it's not gonna be a breakthrough.

238
00:14:27,003 --> 00:14:29,015
It's not gonna be the path to AGI, which...

239
00:14:29,015 --> 00:14:35,206
I mean, I can also grant that it's probably not going to be the path to AGI, but like,
AGI is also very idealistic, right?

240
00:14:35,206 --> 00:14:39,372
I mean, who's to say that whatever he's building now is the path to AGI, right?

241
00:14:40,174 --> 00:14:41,295
Exactly.

242
00:14:41,467 --> 00:14:42,790
I like every time...

243
00:14:42,822 --> 00:14:46,675
There is also this point that, at least, he's trying a new architecture, right?

244
00:14:46,675 --> 00:14:54,143
Like they're what, what I think a lot of these, these big players, are so heavily invested
in their transformer architecture.

245
00:14:54,363 --> 00:14:56,155
These models are so huge.

246
00:14:56,155 --> 00:14:58,607
It takes months and months and months to train a new model.

247
00:14:58,607 --> 00:15:02,219
It's also super risky to experiment with a lot of new things, right?

248
00:15:02,219 --> 00:15:03,381
Like, and that's at least

249
00:15:03,381 --> 00:15:03,981
what he's doing.

250
00:15:03,981 --> 00:15:06,353
And the question is a bit like, is it the right timing?

251
00:15:06,353 --> 00:15:06,514
Right?

252
00:15:06,514 --> 00:15:13,652
It could be that Yann LeCun is this brilliant researcher and that he's
very right in theory, but maybe he's wrong in timing.

253
00:15:13,652 --> 00:15:13,902
Right?

254
00:15:13,902 --> 00:15:17,994
And then for a startup, it just means that you're wrong.

255
00:15:17,994 --> 00:15:18,784
Right.

256
00:15:18,905 --> 00:15:20,737
Because you need to go to market at some point.

257
00:15:20,737 --> 00:15:29,248
Also, I'm thinking, well, part of the reason why LLMs are good is because they basically
said, fuck everyone, I'm gonna train on all your data.

258
00:15:29,248 --> 00:15:34,308
Do you think if he needs that level of scaling data, do you think he could do that again?

259
00:15:34,308 --> 00:15:40,168
Like, even if he has this new architecture, what if he needs to have all the
data on the internet to train it?

260
00:15:40,968 --> 00:15:43,128
I'm also wondering if like,

261
00:15:43,895 --> 00:15:44,443
GPT

262
00:15:44,443 --> 00:15:53,096
two, three, whatever, were able to get away with it because no one really knew what was
right from wrong, legally speaking, let's say.

263
00:15:53,096 --> 00:15:55,569
But I feel like if someone tried to do the same thing now, they wouldn't be able to.

264
00:15:55,569 --> 00:15:58,445
So I'm also wondering, there's also a disadvantage there, probably.

265
00:15:58,445 --> 00:15:59,096
Fair question.

266
00:15:59,096 --> 00:16:03,672
What I'm wondering is, where does the majority of the data for this new JEPA
architecture come from, right?

267
00:16:03,672 --> 00:16:06,004
Like, what is this physical world understanding, right?

268
00:16:06,004 --> 00:16:07,946
Like, is it video?

269
00:16:07,946 --> 00:16:09,707
Is it LiDAR sensor data?

270
00:16:09,707 --> 00:16:11,340
Is it like, is it text?

271
00:16:11,340 --> 00:16:14,994
Is it, I don't know, like, is it audio, right?

272
00:16:14,994 --> 00:16:16,036
But it's a fair point.

273
00:16:16,036 --> 00:16:19,499
I still don't think the landscape's changed that much though, right?

274
00:16:19,499 --> 00:16:20,541
Like if you...

275
00:16:20,541 --> 00:16:23,032
There have been some court cases.

276
00:16:23,032 --> 00:16:31,494
But what we're basically seeing is that a lot of the court rulings are saying
that, ah yeah, it was more or less fair use of the copyrighted material.

277
00:16:31,494 --> 00:16:33,314
So there's not a major issue.

278
00:16:33,314 --> 00:16:35,325
And if there is a major issue, then let's settle.

279
00:16:35,325 --> 00:16:39,146
Then we're going to pay you something to stop this court case.

280
00:16:39,146 --> 00:16:44,262
So I'm not sure if that landscape really changed from a few years ago, to be honest.

281
00:16:44,262 --> 00:16:46,393
maybe, I don't know.

282
00:16:46,393 --> 00:16:49,844
Maybe one last thought that I have on this, just on this.

283
00:16:49,844 --> 00:16:53,675
Human level AI will come from mastering the physical world, not language.

284
00:16:53,675 --> 00:16:58,695
I'm also thinking from a maybe philosophical standpoint.

285
00:16:58,695 --> 00:17:00,175
I don't fully agree with this.

286
00:17:00,175 --> 00:17:06,946
I actually think that the thing that makes humans human, I think it's more abstract than
the physical world.

287
00:17:06,946 --> 00:17:10,657
I feel like animals master the physical world.

288
00:17:10,657 --> 00:17:12,939
but they don't have the complex language that we do.

289
00:17:12,939 --> 00:17:22,037
I think it's more like, the thing that makes human-level intelligence human-level is
actually the ability to understand complex concepts, right?

290
00:17:22,037 --> 00:17:24,453
And I don't think it's really tied to the physical world itself.

291
00:17:24,453 --> 00:17:27,370
Again, maybe I'm being a bit too philosophical here.

292
00:17:27,370 --> 00:17:28,411
I understand what you're saying.

293
00:17:28,411 --> 00:17:29,223
I understand what you're saying.

294
00:17:29,223 --> 00:17:34,721
But at the same time, combined with the physical world, it's again a step forward.

295
00:17:34,956 --> 00:17:35,996
That I agree.

296
00:17:35,996 --> 00:17:37,316
I feel like that I agree.

297
00:17:37,316 --> 00:17:37,956
That I agree.

298
00:17:37,956 --> 00:17:38,716
But I...

299
00:17:38,716 --> 00:17:42,988
We need something like this to actually take the next step in robotics as well.

300
00:17:42,988 --> 00:17:44,125
That I also fully agree.

301
00:17:44,125 --> 00:17:45,426
That I also fully agree.

302
00:17:45,426 --> 00:17:46,335
I think...

303
00:17:46,335 --> 00:17:53,224
Still, and maybe that's a good example of where something like
this might be much more valuable.

304
00:17:53,224 --> 00:17:58,408
Like something as simple as controlling a browser is still super shitty with an LLM.

305
00:17:58,408 --> 00:18:05,043
Like whatever the LLM, even the latest, like a GPT-5.4, which
should be better at this, they're just shit at it, right?

306
00:18:05,043 --> 00:18:06,203
Like it's like.

307
00:18:07,574 --> 00:18:08,115
Yeah.

308
00:18:08,115 --> 00:18:09,376
They go pixel by pixel.

309
00:18:09,376 --> 00:18:11,637
Let's move the mouse and then let's try to click.

310
00:18:11,637 --> 00:18:13,428
shit, I was in the wrong location.

311
00:18:13,428 --> 00:18:16,429
I mean, this is clearly not performing, right?

312
00:18:16,429 --> 00:18:18,130
Even though these are huge, huge, huge models.

313
00:18:18,130 --> 00:18:26,346
And so there is clearly some mastering of whatever physical world is out there,
aside from this binary or whatever text string, right?

314
00:18:26,346 --> 00:18:27,898
Like there's more to it than that.

315
00:18:27,898 --> 00:18:29,249
No, that I fully agree.

316
00:18:29,249 --> 00:18:36,976
I do think that LLMs became popular because of text and now they're trying to transpose
that into different domains, but I'm not sure if that's the way to go.

317
00:18:36,976 --> 00:18:38,740
But yeah, let's see.

318
00:18:38,740 --> 00:18:47,254
I think, again, at the end of the day, whether it's gonna succeed or not, whether I think
it's a good idea or not, or whatever I think about the guy, I also think it's good that

319
00:18:47,254 --> 00:18:48,506
people are trying new things.

320
00:18:48,506 --> 00:18:49,477
I think that's always gonna be good.

321
00:18:49,477 --> 00:18:52,041
It's intelligent people, people that have resources and...

322
00:18:52,041 --> 00:18:52,843
and the mind for it.

323
00:18:52,843 --> 00:18:54,649
So let's see.

324
00:18:54,649 --> 00:18:57,200
It's something, a very nice realization for Yann LeCun.

325
00:18:57,200 --> 00:18:59,980
It's one of the largest seed rounds ever.

326
00:18:59,980 --> 00:19:04,891
So $1 billion at a $3.5 billion pre-money valuation.

327
00:19:04,891 --> 00:19:08,502
But it's the largest ever for a European company.

328
00:19:08,502 --> 00:19:10,622
And its headquarters is in Paris.

329
00:19:11,422 --> 00:19:12,543
he is French, I think, no?

330
00:19:12,543 --> 00:19:13,664
Yann LeCun.

331
00:19:14,020 --> 00:19:17,064
And, but he was, I thought he was in the US actually.

332
00:19:17,064 --> 00:19:23,229
Well, Facebook, I thought he was in the US, and I also thought that he was teaching at
New York University, but his company is in Paris now.

333
00:19:23,229 --> 00:19:24,431
Maybe he moved back.

334
00:19:24,538 --> 00:19:29,602
HQ is in Paris, with offices in the US in New York, and in Montreal and Singapore.

335
00:19:29,602 --> 00:19:34,277
Wow, but I definitely agree that this is very good for Europe.

336
00:19:34,277 --> 00:19:35,719
So yeah, we'll see.

337
00:19:35,719 --> 00:19:36,639
Best of luck to him.

338
00:19:36,639 --> 00:19:42,465
And I hope to be proven wrong as well by my, let's say, skepticism towards this, but we'll
see.

339
00:19:42,465 --> 00:19:44,246
Next, what do we have?

340
00:19:44,246 --> 00:19:52,813
We have Capwing reflecting on an unusual AI experiment: paying artists royalties when
their styles helped power generated images.

341
00:19:52,813 --> 00:19:56,106
A live test of whether ethical AI art can really work.

342
00:19:56,106 --> 00:20:05,356
The post looks back on test.design, from its launch in May 2024 to its shutdown in
January 2026, with the hard lessons in between.

343
00:20:05,356 --> 00:20:13,198
I think this brings us back to what we said about the copyrights and using the data for AI
and trying to be ethical about it.

344
00:20:13,198 --> 00:20:14,879
oh

345
00:20:14,917 --> 00:20:16,121
So this is test.design.

346
00:20:16,121 --> 00:20:20,745
They actually tried to make an ethical artist friendly AI marketplace, but it didn't work.

347
00:20:20,745 --> 00:20:24,016
Yeah, yeah, so I'll try to summarize.

348
00:20:24,016 --> 00:20:26,527
I have a feeling it's been two weeks since I read it.

349
00:20:26,527 --> 00:20:40,100
But what they tried to do is, in summer of 2024, they set up a marketplace of fine-tuned
Stable Diffusion models, fine-tuned for specific artists, where as an artist, you could

350
00:20:40,100 --> 00:20:44,382
say: you can use my data to fine-tune this Stable Diffusion model.

351
00:20:44,382 --> 00:20:52,990
And everybody that uses that Stable Diffusion model via that website, I assume it's via
an API or something like that, pays a few cents, and there are some royalties going

352
00:20:52,990 --> 00:20:54,163
from that to the artist.

353
00:20:54,163 --> 00:20:57,411
But then basically, me as a user says, I want this style.

354
00:20:57,411 --> 00:20:58,382
So I'm gonna use this.

355
00:20:58,382 --> 00:21:01,600
It's almost like the model becomes a proxy for the artist.

356
00:21:01,600 --> 00:21:02,320
Exactly.

357
00:21:02,320 --> 00:21:02,741
exactly.

358
00:21:02,741 --> 00:21:04,584
Let's say Rembrandt was still alive.

359
00:21:04,584 --> 00:21:15,671
He could sign an agreement with test.design, as it was called, where he could say, okay,
you, test.design, can use my works to fine-tune a Stable Diffusion model.

360
00:21:15,671 --> 00:21:22,777
And that Rembrandt Stable Diffusion model you can then offer on your website and your
users can then interact with it.

361
00:21:22,777 --> 00:21:25,960
And basically for, let's say a few cents, I don't know what it would cost.

362
00:21:25,960 --> 00:21:30,187
They can generate an AI generated image using that fine-tuned stable diffusion model.

363
00:21:30,187 --> 00:21:35,074
And of the payment that the user makes to the platform, a few cents go to the original
artist.
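The split described here (the user pays the platform per generated image, and a share flows back to the artist) can be sketched in a few lines. This is a hypothetical illustration; the discussion doesn't give test.design's actual prices or royalty rates, so all numbers below are invented:

```python
# Hypothetical per-image royalty split, as described above: the user pays
# the platform a few cents per generated image, and a share of that goes
# to the artist whose fine-tuned model was used.
# All rates and prices here are invented for illustration.

def split_payment(price_cents: int, artist_share: float = 0.5) -> tuple[int, int]:
    """Return (artist_cents, platform_cents) for one generated image."""
    artist_cents = round(price_cents * artist_share)
    return artist_cents, price_cents - artist_cents

# A 10-cent generation with a hypothetical 50/50 split:
artist_cents, platform_cents = split_payment(10)  # → (5, 5)

# Payout to one artist after 1,000 generations with "their" model:
monthly_payout_cents = 1000 * artist_cents  # 5,000 cents, i.e. $50
```

Even at generous volumes, cents-per-image payouts stay small, which hints at why the economics discussed next were so hard.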

364
00:21:35,074 --> 00:21:46,395
That also solves a bit of the problem, because we talked in the past about how, if you
ask ChatGPT to generate an image, it's hard to know how much of the style came from this

365
00:21:46,395 --> 00:21:49,415
data or that data and consequently from this artist, that artist.

366
00:21:49,415 --> 00:21:56,515
So they just kind of solved that by saying, we're going to have multiple models and each
model is exclusively trained on this artist.

367
00:21:58,315 --> 00:22:00,155
And it didn't work.

368
00:22:00,834 --> 00:22:01,484
It didn't work.

369
00:22:01,484 --> 00:22:03,396
I think there were two challenges.

370
00:22:03,396 --> 00:22:07,089
One was getting artists to basically sign over their rights to them.

371
00:22:07,089 --> 00:22:11,171
Like you as a platform can now use my intellectual property to train something.

372
00:22:11,171 --> 00:22:17,036
I think that was one of the challenges: they had to basically convince artists, or
copyright holders in general.

373
00:22:17,036 --> 00:22:20,750
I think the other, and that's probably the major reason why it failed, is that

374
00:22:20,750 --> 00:22:23,722
customers just don't care about it.

375
00:22:23,722 --> 00:22:32,314
I think there was, at that point in 2024, some uncertainty on, like, if I generate
something with AI, what will happen?

376
00:22:32,314 --> 00:22:34,735
Like, do I actually own the copyright or not?

377
00:22:34,735 --> 00:22:38,898
And there was this noise about court cases and

378
00:22:38,898 --> 00:22:40,419
There was a lot of legal uncertainty.

379
00:22:40,419 --> 00:22:44,663
And I think over time, it's not necessarily that the legal uncertainty went away.

380
00:22:44,663 --> 00:22:48,757
But we see that the leadership of major countries doesn't care about this.

381
00:22:48,757 --> 00:22:53,140
And that also means that corporations don't really need to care about this compliance.

382
00:22:53,140 --> 00:22:55,803
So there's not really a willingness to pay for something like this.

383
00:22:55,803 --> 00:23:01,799
So basically, because the country leadership didn't necessarily care.

384
00:23:01,799 --> 00:23:03,859
I mean, maybe not care, but maybe it wasn't a priority.

385
00:23:03,859 --> 00:23:11,512
Maybe, but like, I don't see the people that should care about this actually acting on
it, right?

386
00:23:11,512 --> 00:23:17,395
Like we're not seeing a majority of court cases being ruled in favor of the original
copyright holders.

387
00:23:17,395 --> 00:23:26,952
Like, on the contrary, we're seeing a majority of legislation coming up that says a lot
of these things have been fair use of copyrighted material.

388
00:23:26,952 --> 00:23:27,212
Right?

389
00:23:27,212 --> 00:23:29,958
Like we had the huge court case, I think it was Anthropic.

390
00:23:29,958 --> 00:23:38,883
I'm not sure if it was Anthropic actually, where the books, I think a lot of them came
from Anna's Archive, where basically the ruling was:

391
00:23:38,883 --> 00:23:39,872
You need to buy one book.

392
00:23:39,872 --> 00:23:43,332
It was okay to train on this, but what you did wrong is you didn't buy the books.

393
00:23:43,429 --> 00:23:44,869
You torrented the books.

394
00:23:44,869 --> 00:23:47,889
So you just need to buy a single book and you can do whatever you want with it, right?

395
00:23:47,889 --> 00:23:54,380
And that's a bit like, if I take a legal picture of a Rembrandt, I can do whatever I
want with it.

396
00:23:54,380 --> 00:23:57,651
That's a bit of what seems to be becoming the norm.

397
00:23:57,651 --> 00:24:05,553
And if that becomes the norm, then something like test.design becomes something that
only appeals to your ethical norms, right?

398
00:24:05,553 --> 00:24:06,670
Like the only people

399
00:24:06,670 --> 00:24:10,273
purchasing this are the ones saying, yeah, because I feel it's more honest.

400
00:24:10,900 --> 00:24:16,614
And I mean, there is probably a very small audience for this, but it's not viable, right?

401
00:24:16,614 --> 00:24:18,177
You can't live off of this.

402
00:24:18,177 --> 00:24:22,343
Probably the people that would do this are the people that would just donate to artists
directly,

403
00:24:22,343 --> 00:24:30,739
Well, yeah, I think what they tried to do is basically sell ethics in a marketplace
where the customer doesn't care about ethics.

404
00:24:30,739 --> 00:24:31,278
Yeah.

405
00:24:31,278 --> 00:24:33,634
I think you can abstract everything they did away to that.

406
00:24:33,634 --> 00:24:34,734
Yeah, yeah, yeah.

407
00:24:34,734 --> 00:24:35,725
Yeah, it's true.

408
00:24:35,725 --> 00:24:47,045
So yeah, what I think is interesting is that this is a concrete example: everyone will
complain that these things are not right and not ethical.

409
00:24:47,045 --> 00:24:53,116
But when you give people the power to do it, they don't want to do it either, right?

410
00:24:53,116 --> 00:24:56,856
Well, yeah, you as a consumer, let's say you want to generate some AI art.

411
00:24:56,856 --> 00:25:00,936
I mean, you're going to pay your $20 Gemini license and just do whatever you want, right?

412
00:25:00,936 --> 00:25:06,476
Like you're not going to pay something to an artist for a single image, which means
you're going to pay more.

413
00:25:06,476 --> 00:25:08,576
Like you're not going to do that out of your own free will.

414
00:25:08,576 --> 00:25:11,679
You need a strong legal system to enforce something like that.

415
00:25:11,679 --> 00:25:19,747
Yeah, no, but I do remember in the beginning especially, there were a lot of blog posts
and articles saying, this is not fair, you know, this is stealing from the

416
00:25:19,747 --> 00:25:20,067
artist.

417
00:25:20,067 --> 00:25:21,419
The artists don't see anything.

418
00:25:21,419 --> 00:25:23,511
I feel like there was a lot of that, I felt, right?

419
00:25:23,511 --> 00:25:27,136
Of course, a lot of the public opinion was like, this is not right.

420
00:25:27,136 --> 00:25:29,639
And people need to do something about it.

421
00:25:29,639 --> 00:25:32,642
And that's probably where test.design came up.

422
00:25:32,642 --> 00:25:34,374
But the reality is like it's

423
00:25:34,374 --> 00:25:44,396
People are very enthusiastic about it when they need to give their opinion, but when it's
time to act on this, then people are not as excited, let's say, right?

424
00:25:44,396 --> 00:25:46,768
When you need to pay for it yourself, it's less interesting.

425
00:25:46,768 --> 00:25:52,932
Yeah, I think in the end people just want the companies to pay, but you cannot control
that, right?

426
00:25:52,932 --> 00:25:56,014
For companies to pay, you need a strong legal system that enforces it.

427
00:25:56,014 --> 00:26:00,193
I mean, companies are owned by shareholders and shareholders want short-term gains.

428
00:26:00,193 --> 00:26:01,964
It's as simple as that.

429
00:26:03,737 --> 00:26:07,460
As long as we're in a capitalistic society, that's simply the reality.

430
00:26:07,460 --> 00:26:08,600
Yeah, exactly.

431
00:26:08,600 --> 00:26:09,893
It's just like physics, right?

432
00:26:09,893 --> 00:26:11,532
That's just the way it is.

433
00:26:11,532 --> 00:26:12,233
Can't fight it.

434
00:26:12,233 --> 00:26:12,743
All right.

435
00:26:12,743 --> 00:26:13,775
All right.

436
00:26:13,775 --> 00:26:17,819
But still, I also thought it was an interesting experiment at least.

437
00:26:17,819 --> 00:26:20,852
And I thought it was also very nice that they were very open about this, right?

438
00:26:20,852 --> 00:26:23,684
Like this is our lessons learned.

439
00:26:23,881 --> 00:26:33,374
Yeah, and to be clear, I hope at some point something like this does come up, that there
are royalties going to original artists, original authors, but today is not the day.

440
00:26:33,374 --> 00:26:36,591
I agree, I agree. To better days, to better days.

441
00:26:36,591 --> 00:26:37,772
What is next?

442
00:26:38,460 --> 00:26:45,935
Anthropic is trying to measure AI's effects on jobs with new occupation-level datasets,
moving the debate from vibes to something close to evidence.

443
00:26:45,935 --> 00:26:49,027
Its early read is a tension worth watching.

444
00:26:49,027 --> 00:26:55,152
AI use is spreading fastest in some knowledge work tasks, but the broader employment
impact is still far from settled.

445
00:26:55,152 --> 00:26:57,355
It's a new report from Anthropic.

446
00:26:57,355 --> 00:27:00,176
It's by now already a bit more than a week old.

447
00:27:00,176 --> 00:27:03,065
What is it about, Murilo?

448
00:27:03,065 --> 00:27:03,946
So stale, right?

449
00:27:03,946 --> 00:27:14,451
Like, one week old... So what I understood is they basically looked at what the impact
of AI on the workforce is.

450
00:27:14,451 --> 00:27:16,693
And a lot of it is theoretical, right?

451
00:27:16,693 --> 00:27:21,515
So basically, places where AI could have an impact, but also some places where it's
already having an impact.

452
00:27:21,515 --> 00:27:28,671
So they did a bit of analysis, trying to basically see how the workforce is going to
change or how it's going to be impacted

453
00:27:28,671 --> 00:27:30,811
with AI going forward.

454
00:27:31,911 --> 00:27:36,851
So actually I think there was one image here that I thought was quite interesting.

455
00:27:36,931 --> 00:27:39,251
This one, I wanna say.

456
00:27:39,551 --> 00:27:42,551
There are capabilities, so maybe for people listening, it's like, what do you call it?

457
00:27:42,551 --> 00:27:43,691
Like a radar chart?

458
00:27:43,691 --> 00:27:47,051
It's kind of like a radar chart.

459
00:27:47,051 --> 00:27:50,722
And for people that don't know what a radar... yeah, that's the name.

460
00:27:50,722 --> 00:27:54,124
And for people that don't know what a radar chart is: if you play FIFA.

461
00:27:54,124 --> 00:27:59,885
At least before, you had the different skills and it kind of comes up looking like a
net, right?

462
00:27:59,885 --> 00:28:03,069
So you have a circle with a whole bunch of concentric circles.

463
00:28:03,069 --> 00:28:12,575
And then there are different points that kind of reflect, from zero to 100 I guess,
how... I don't know, maybe I'll just link the

464
00:28:12,575 --> 00:28:12,875
article.

465
00:28:12,875 --> 00:28:15,386
If people are interested, they can click on it.

466
00:28:15,386 --> 00:28:16,086
Yeah.

467
00:28:16,310 --> 00:28:17,878
It's actually quite intuitive when you watch it.

468
00:28:17,878 --> 00:28:19,508
But what do we see on the chart?

469
00:28:19,508 --> 00:28:28,290
So basically you see the different domains on the different axes. So management,
business, finance, computer, math, architecture, engineering, and it goes on and

470
00:28:28,290 --> 00:28:30,965
on, including sales, office, agriculture even.

471
00:28:30,965 --> 00:28:33,119
And then there's a blue area.

472
00:28:33,119 --> 00:28:35,462
which shows the theoretical AI coverage.

473
00:28:35,462 --> 00:28:40,919
And then below smaller, there's a red area, which is the observed AI coverage.

474
00:28:40,919 --> 00:28:52,019
So for example, management, there is a very high theoretical AI coverage, meaning that
there's a lot of stuff that management people could use AI for, but the reality is that

475
00:28:52,019 --> 00:28:54,834
very few people actually do it today.

476
00:28:54,834 --> 00:28:56,066
So I guess it's something that...

477
00:28:56,066 --> 00:29:00,801
As time goes by and these things become more commonplace, I would expect this to expand,
right?

478
00:29:00,801 --> 00:29:03,785
To get closer to the blue area as well.

479
00:29:03,785 --> 00:29:07,709
So yeah, maybe one thing like grounds maintenance, that's almost zero, right?

480
00:29:07,709 --> 00:29:11,905
But I think the management, business and finance, computer math, architecture,
engineering.

481
00:29:11,905 --> 00:29:19,349
Life and social sciences, legal arts and media, and office and admin, and sales, they all
have very high theoretical AI coverage.

482
00:29:19,349 --> 00:29:29,674
So I guess the message here is if you're in one of those areas or part of the job is one
of those areas, I think looking into AI is gonna be very relevant for your job going

483
00:29:29,674 --> 00:29:30,775
forward, right?

484
00:29:30,775 --> 00:29:33,307
And of course, computer math, business and finance.

485
00:29:33,307 --> 00:29:37,673
Legal education, no, yeah, arts and media, sales and office and admin.

486
00:29:37,673 --> 00:29:39,437
It's already a reality today.

487
00:29:39,437 --> 00:29:39,756
Right.

488
00:29:39,756 --> 00:29:40,188
So.

489
00:29:40,188 --> 00:29:50,963
Yeah, I think what they say is that the most exposed occupations, because this chart is
more by domain, like the most exposed occupations today, because they already have 75%
task coverage,

490
00:29:50,963 --> 00:29:53,455
are computer programmers.

491
00:29:53,455 --> 00:29:56,229
And then closely followed by customer service reps.

492
00:29:56,229 --> 00:29:57,850
Also recognizable, right?

493
00:29:57,881 --> 00:30:00,733
And data entry people.

494
00:30:00,733 --> 00:30:03,374
People that do data entry on whatever, right?

495
00:30:04,615 --> 00:30:08,136
You get in some information and you need to insert it into some systems.

496
00:30:08,197 --> 00:30:12,138
Very administrative tasks done in a lot of big institutions.

497
00:30:12,617 --> 00:30:13,188
Indeed.

498
00:30:13,188 --> 00:30:14,440
So it's already very relevant.

499
00:30:14,440 --> 00:30:16,573
Maybe we discussed a bit the...

500
00:30:16,573 --> 00:30:24,362
I think we discussed during the recording that you think computer science is not going
to be as thriving as a career, right?

501
00:30:24,362 --> 00:30:24,963
Like it's going to...

502
00:30:24,963 --> 00:30:27,376
the workforce is going to decrease quite a lot because of AI.

503
00:30:27,376 --> 00:30:28,558
This sounds very pessimistic.

504
00:30:28,558 --> 00:30:32,079
um

505
00:30:32,079 --> 00:30:36,601
So I've been a bit both positive and negative about this.

506
00:30:37,984 --> 00:30:41,230
Maybe the negative part: I think what we will see is that

507
00:30:41,230 --> 00:30:46,014
your typical software engineer will become five times more efficient if they adopt these
skills.

508
00:30:46,014 --> 00:30:47,505
I think that will take some time.

509
00:30:47,505 --> 00:30:51,288
Like people need to pick it up, but I think they will at least become five times more
efficient.

510
00:30:51,288 --> 00:31:00,517
That means that your typical engineering team suddenly can output five times more stuff,
but your customers as a company are probably not asking for five times more stuff.

511
00:31:00,517 --> 00:31:02,470
So there's, I think there will be like,

512
00:31:02,470 --> 00:31:04,422
this lag effect.

513
00:31:04,422 --> 00:31:12,448
I think at some point that will equalize, but I think that lag effect will cause some
displacement.

514
00:31:12,448 --> 00:31:21,074
And I think also in this report, they're saying we cannot yet prove that there is less
employment, but we see more hesitation around hiring juniors, which is, I think,

515
00:31:21,074 --> 00:31:23,487
something that we all recognize from hearing around.

516
00:31:23,487 --> 00:31:25,593
em

517
00:31:25,593 --> 00:31:34,098
In Belgium, you do hear it. I mean, in the US you also heard a lot of people saying
they're gonna cut their workforce with AI and they're planning to reduce the workforce.

518
00:31:34,098 --> 00:31:36,492
So this is also a reality too.

519
00:31:36,492 --> 00:31:37,312
It's a reality.

520
00:31:37,312 --> 00:31:41,732
I think some of them are also overstated, because it's a convenient explanation for why
you're cutting jobs, right?

521
00:31:41,732 --> 00:31:47,472
So there's a lot of noise in that messaging as well.

522
00:31:47,472 --> 00:31:49,212
But we do hear it a lot here.

523
00:31:49,212 --> 00:31:51,892
It's not all noise, right?

524
00:31:53,092 --> 00:31:53,992
It's...

525
00:31:54,096 --> 00:31:56,359
Well, that's for this particular signal.

526
00:31:56,359 --> 00:32:00,854
I do agree with you that, as a general trend, this will probably happen, like growing
pains, right?

527
00:32:00,854 --> 00:32:06,030
Like the market will probably adjust what it is asking for, but there will be some time
where they're going to be like, whoa, wow.

528
00:32:06,030 --> 00:32:07,392
Why there's like

529
00:32:07,392 --> 00:32:09,432
you can produce way more than what we're asking, right?

530
00:32:09,432 --> 00:32:11,672
And there's gonna take some time before they start asking more.

531
00:32:11,672 --> 00:32:22,023
But I also think that for this very particular signal, like the company saying they're
gonna cut the workforce because of AI, I also wonder if there's a bit of a herd mentality.

532
00:32:22,023 --> 00:32:27,323
Like they see one place doing this and the people are like, okay, why are they doing this
and we are not, right?

533
00:32:27,323 --> 00:32:29,803
If you can actually be more productive with AI, why are we not?

534
00:32:29,803 --> 00:32:35,965
And then there's a bit of a, you start discussing more of these things and then there's
more people that kind of jump in it, right?

535
00:32:35,965 --> 00:32:42,361
Yeah, I think it's also a way to force yourself as a company to pick up these skills, of
course.

536
00:32:43,742 --> 00:32:52,149
If you have a very big engineering team, you're in a very comfortable situation, and
nobody's really incentivized to become five times more efficient.

537
00:32:52,149 --> 00:32:56,911
Unless you have a problem: you suddenly have fewer people, and you need to do it to survive.

538
00:32:57,783 --> 00:33:00,616
so you said like, do you think everyone's gonna be 5X?

539
00:33:00,616 --> 00:33:03,089
Or do you think the good ones are gonna be 5X?

540
00:33:03,089 --> 00:33:12,650
Because I'm also wondering like, I think everyone's gonna be more productive, but I'm
wondering if like, everyone is gonna be 2X and then the very good ones are gonna be 5X?

541
00:33:12,748 --> 00:33:14,828
The very good ones will be 100X.

542
00:33:15,548 --> 00:33:20,279
Where we used to say the good ones are 10X engineers, I think it will be 100X engineers.

543
00:33:20,279 --> 00:33:21,816
Okay, okay, cool.

544
00:33:21,816 --> 00:33:31,698
The thing that I am positive about, I actually did a write-up on my blog the other day
on this, is that I was talking a while back to a startup, and they're

545
00:33:31,698 --> 00:33:40,149
non-technical founders and they actually had quite a good product-market fit, but
they were like, oh yeah, you don't need to show the article, I'm a bit shy about these

546
00:33:40,149 --> 00:33:40,719
things.

547
00:33:40,719 --> 00:33:41,414
I insist.

548
00:33:41,414 --> 00:33:47,978
And they were non-technical founders, and they actually have a very strong offering, but
a very limited offering.

549
00:33:47,978 --> 00:33:52,490
They had a difficult time hooking customers for the long run.

550
00:33:52,490 --> 00:33:56,872
And they had a very clear pathway, like we need to build these and these and these
features to hook them in for the long run.

551
00:33:56,872 --> 00:33:58,192
And the offering was very strong.

552
00:33:58,192 --> 00:33:58,524
And...

553
00:33:58,524 --> 00:34:05,104
But the challenge was like, ah, yeah, but it takes so much money to develop these features
and it takes so much time to develop these features.

554
00:34:05,104 --> 00:34:08,735
So we can only get this feature to market like six months from now.

555
00:34:08,735 --> 00:34:13,797
And I was thinking to myself, like, I think this landscape will completely change.

556
00:34:13,797 --> 00:34:20,888
To me, it's a bit of a parallel with online advertisement, like before online
advertisement existed.

557
00:34:20,888 --> 00:34:23,939
Imagine us, we're creating a brand now, Top of Mind.

558
00:34:23,939 --> 00:34:29,241
If I wanted to become known in, let's say, India, I probably had to go there.

559
00:34:29,241 --> 00:34:30,441
I'd need to contract people there.

560
00:34:30,441 --> 00:34:32,561
I'd need to do a lot of manual advertisement there.

561
00:34:32,561 --> 00:34:36,821
It's going to take me six months to get to, I don't know, 10,000 people that view my
brand.

562
00:34:36,866 --> 00:34:41,168
But now with online advertisement, I can put credit, like it's not free,
right?

563
00:34:41,168 --> 00:34:43,688
But I can put credits towards it.

564
00:34:43,688 --> 00:34:47,728
And I can get those 10,000 views on my brand next week.

565
00:34:48,062 --> 00:34:49,913
And I think that dynamic...

566
00:34:49,913 --> 00:34:55,256
We're also going to see in software development, where software development used to be
like prohibitively expensive.

567
00:34:55,256 --> 00:34:57,667
Like you needed very big investors to get something done.

568
00:34:57,667 --> 00:35:02,857
Suddenly we're in this new stage where, I mean, it's not, the cost is not zero, of course,
but like it's much more doable.

569
00:35:02,857 --> 00:35:05,451
It's like just something that you can, that you also have to do.

570
00:35:05,451 --> 00:35:10,032
It's not the major investment to get stuff started.

571
00:35:10,379 --> 00:35:13,430
And to make that parallel, maybe that's why I'm optimistic.

572
00:35:13,430 --> 00:35:15,692
The advertisement market as such.

573
00:35:15,692 --> 00:35:18,265
only grew over time and roles shifted, right?

574
00:35:18,265 --> 00:35:24,204
Like the jobs that were there completely evolved from what they were before, but the
space as such only grew.

575
00:35:24,204 --> 00:35:25,075
And that's a positive thing.

576
00:35:25,075 --> 00:35:27,447
And I hope that we see that with the tech space as well.

577
00:35:27,672 --> 00:35:29,914
I see what you're saying.

578
00:35:29,914 --> 00:35:36,048
I think, as you're saying this, I'm also wondering if the signal-to-noise ratio is going
to be at a different scale as well.

579
00:35:36,048 --> 00:35:44,456
Like you're going to have a lot of these features, a lot of these companies, a lot of
these things, but because there's so much more, it's going to be harder to find the...

580
00:35:44,456 --> 00:35:48,511
the true, like the things that are valuable, the things that you should pay attention to,
right?

581
00:35:48,511 --> 00:35:49,833
What is actually robust and what is not?

582
00:35:49,833 --> 00:35:50,656
I mean, it...

583
00:35:50,656 --> 00:35:53,789
You mean like applications being offered, SaaS platforms being offered.

584
00:35:53,789 --> 00:35:54,999
That's what you mean?

585
00:35:55,062 --> 00:35:56,863
Yeah, but that I agree with you.

586
00:35:56,863 --> 00:36:03,758
There is actually a nice report, maybe for next time: the survey from RevenueCat that
just came out.

587
00:36:03,758 --> 00:36:10,101
RevenueCat is a bit of the man in the middle when it comes to integrating with the iOS
payment system.

588
00:36:10,101 --> 00:36:12,373
But it's very big in the SaaS ecosystem.

589
00:36:12,373 --> 00:36:20,387
And they have a lot of data on how subscriptions are evolving, what churn is, how many
new entrants there were in the last year, which is a crazy amount.

590
00:36:20,387 --> 00:36:25,143
It's worth a read to get a bit of a view on how these dynamics are.

591
00:36:25,143 --> 00:36:26,344
Yeah, exactly.

592
00:36:26,985 --> 00:36:28,158
Let's go for it next time.

593
00:36:28,158 --> 00:36:29,769
we'll cover next time for sure.

594
00:36:29,769 --> 00:36:34,351
But yeah, like you said, six months to build
something.

595
00:36:34,351 --> 00:36:39,893
And my first thought was kind of the same as yours: six months,
is that right?

596
00:36:39,893 --> 00:36:48,595
Like if you have a team of four people that are efficient with AI, I cannot think of
something that, if you know what you want...

597
00:36:48,595 --> 00:36:52,319
Like if you have a very concrete view of what you want, it probably won't take six
months, right?

598
00:36:52,319 --> 00:36:53,991
Because I think that's a big part of it.

599
00:36:53,991 --> 00:36:56,226
So the horizon shrinks for sure, right?

600
00:36:56,226 --> 00:36:59,052
So yeah, to be seen, be seen.

601
00:37:00,255 --> 00:37:01,457
And last but not least,

602
00:37:01,457 --> 00:37:12,512
Nvidia's GTC conference is shaping up as a pivot point, with Jensen Huang expected to show
how CPUs and new inference chips fit alongside the company's GPU empire.

603
00:37:12,512 --> 00:37:15,012
One striking forecast hangs over the story.

604
00:37:15,012 --> 00:37:21,435
Inference could make up 75% of a $1.2 trillion AI data center market by 2030.

605
00:37:21,435 --> 00:37:25,204
So Nvidia, they have the GTC, which actually stands for, for what?

606
00:37:25,204 --> 00:37:26,199
It's like a...

607
00:37:26,199 --> 00:37:29,568
Basically their conference, but I forgot what it stands for.

608
00:37:29,568 --> 00:37:32,768
But basically, the expectation, I think they have...

609
00:37:32,768 --> 00:37:38,213
coming up, the GTC, and they're expecting Nvidia to announce their first CPU chip, right?

610
00:37:38,213 --> 00:37:42,778
So Nvidia historically has been, well, it's been the market standard, right?

611
00:37:42,778 --> 00:37:44,550
Industry standard for GPUs.

612
00:37:44,550 --> 00:37:51,286
And now they're also entering the CPU stage, where it's actually, I think, Intel and ARM,
right?

613
00:37:51,286 --> 00:37:53,276
That really dominate the space.

615
00:37:56,022 --> 00:38:05,340
Still in the AI play, let's say, they're saying that CPUs, well, they said CPUs have
become the bottleneck for AI inference, right?

616
00:38:05,340 --> 00:38:12,693
And actually, I read, it's not in this article, but I think I saw somewhere else
that they were even thinking of how to link this with...

617
00:38:12,693 --> 00:38:24,675
Because they also acquired Groq with a Q, so not Grok the AI, but it
was linking a bit how that acquisition also leads to the CPU technology, right?

618
00:38:24,675 --> 00:38:29,366
Um, yeah, my understanding of this is, well, they are two a bit different things, right?

619
00:38:29,366 --> 00:38:31,426
Groq and the CPU, right?

620
00:38:31,426 --> 00:38:34,037
So Groq is an acquisition that they did.

621
00:38:34,037 --> 00:38:35,877
We actually covered it.

622
00:38:35,877 --> 00:38:41,328
I want to say six months ago, a $20 billion acquisition.

623
00:38:41,328 --> 00:38:47,479
And what Groq does is, generally, it creates LPUs, language processing
units.

625
00:38:48,054 --> 00:38:50,936
And they really focus on being very good at inference.

626
00:38:50,936 --> 00:38:54,360
So they're probably not very efficient at training, but they're very good at inference.

627
00:38:54,360 --> 00:39:04,770
And Jensen Huang's outlook for the future is actually that we will see more of a
commoditization of inference, like on LLMs.

628
00:39:04,770 --> 00:39:11,477
We will see this growth of large AI factories that do inference at a very, very large
scale.
scale.

629
00:39:11,477 --> 00:39:15,242
And a lot of these new AI factories will use chips that are really...

630
00:39:15,242 --> 00:39:19,447
Well, focused on inference, very good at inference, like this Groq offering.

631
00:39:19,447 --> 00:39:20,817
Nvidia actually didn't have it.

632
00:39:20,817 --> 00:39:25,251
Their acquisition is a bit of their answer to this offering, because they didn't
have it.

633
00:39:25,251 --> 00:39:28,363
But they do have competitors, like Google's TPU.

634
00:39:28,363 --> 00:39:33,703
There's Trainium, I think it's called Trainium or something like that, by AWS.

635
00:39:33,703 --> 00:39:42,603
There are actually some competitors, and Nvidia did have a lot of
competitors in the space, so Groq is a bit like they bought one of the

636
00:39:42,603 --> 00:39:43,154
competitors.

637
00:39:43,154 --> 00:39:50,134
But what we will also see like in these very big AI factories is not just a lot of very
efficient inference.

638
00:39:50,134 --> 00:40:00,345
What we also see is an optimization for agentic workloads, where you actually will have,
for example, a lot of tool calls, a lot of agentic workloads that typically run on a CPU.

639
00:40:00,345 --> 00:40:09,013
And I think that is a bit the reasoning why they're now saying the CPU is becoming the
bottleneck, because data centers are expected to do much more, not just inference,

640
00:40:09,013 --> 00:40:11,048
but also these agentic workloads.

641
00:40:11,048 --> 00:40:11,819
I see.

642
00:40:11,819 --> 00:40:19,906
So there's the CPU and Groq, and they're connected in the sense that both are there to
support AI usage, right?

643
00:40:19,906 --> 00:40:28,234
So Groq is more the model inference, and then the CPU is more for the stuff around the
model inference, like tool calling or interacting with the operating system and all these

644
00:40:28,234 --> 00:40:30,825
different things, which according to them is the...

645
00:40:30,825 --> 00:40:39,036
And in terms of CPUs, and I'm not too knowledgeable on the exact specs, but they
have a CPU, it's called Grace.

646
00:40:39,036 --> 00:40:44,827
They launched it already, I think, I want to say four years ago, but now they announced a
new generation, it's called Vera.

647
00:40:44,827 --> 00:40:46,107
Yeah, exactly.

648
00:40:46,107 --> 00:40:49,749
2021 was Grace, and now Vera is in production.

649
00:40:49,749 --> 00:40:55,633
And I think I also saw somewhere that they have a multi-year deal with Meta as well.

650
00:40:55,633 --> 00:41:02,660
And like you mentioned, servers and all these things, but they already have a very
concrete client there, right?

651
00:41:02,660 --> 00:41:04,399
Well...

652
00:41:04,693 --> 00:41:09,231
Are you hopeful that we will see more commoditization of these elements?

653
00:41:09,231 --> 00:41:10,433
Yes, I am.

654
00:41:10,433 --> 00:41:14,275
I think, well, again, commoditization, there are degrees to it, right?

655
00:41:14,275 --> 00:41:23,172
But I do think, yes. I saw a report a while ago that Chinese open
models are on average six months behind US closed-source models.

656
00:41:23,172 --> 00:41:24,672
Maybe this will close the gap a bit.

657
00:41:24,672 --> 00:41:27,564
So there's that, you still need specialized infrastructure.

658
00:41:27,564 --> 00:41:29,177
But I also think it's a bit the...

659
00:41:29,177 --> 00:41:34,470
And maybe I'm not the best person to say it, but my impression is that research and
industry, they swing a bit like this, right?

660
00:41:34,470 --> 00:41:44,355
Like first research shows something is possible and then the industry makes sure it's
usable and then the industry makes sure it's usable for everyone, right?

661
00:41:44,355 --> 00:41:47,882
So I think that's also the next step. I think there's a big... I don't know.

662
00:41:47,882 --> 00:41:52,272
I do think in the future we will see more of these things, not just in model size, like.

663
00:41:52,272 --> 00:41:57,554
We talked about inference, but maybe models being smaller for maybe more specific tasks.

664
00:41:57,554 --> 00:42:02,762
I saw, I think we covered months ago, the tiny models as well, the tiny recursion models.

665
00:42:02,762 --> 00:42:06,455
So I know there are some things that people are looking into.

666
00:42:06,455 --> 00:42:10,068
There's still a lot of people that want to run things more locally as well.

667
00:42:10,068 --> 00:42:11,753
I think this is a reality today.

668
00:42:11,753 --> 00:42:13,803
It's not gonna be a reality tomorrow, but...

669
00:42:13,803 --> 00:42:16,838
But I do think it's something that people are asking themselves, right?

670
00:42:16,838 --> 00:42:19,852
And I think if people are asking themselves, I think people are looking into it.

671
00:42:19,852 --> 00:42:22,587
And I think at some point in the future, it will be more commoditized.

672
00:42:22,587 --> 00:42:25,963
So again, I think it will, but I'm not sure how much, right?

673
00:42:25,963 --> 00:42:28,003
I'm not sure if you're still gonna need like...

674
00:42:28,003 --> 00:42:29,814
Maybe it's not closed source models.

675
00:42:29,814 --> 00:42:32,086
Maybe more open models will do the trick.

676
00:42:32,086 --> 00:42:37,910
Maybe you don't need racks and racks of GPUs to run
inference.

677
00:42:37,910 --> 00:42:40,812
Maybe it's CPUs, maybe it's these cheaper chips.

678
00:42:40,812 --> 00:42:43,315
But I do think something will change as well.

679
00:42:43,315 --> 00:42:48,469
I also hear that the cost of running LLMs is still super high.

680
00:42:48,469 --> 00:42:55,275
Actually, there was another article, maybe we'll cover it next time, where they were
talking about OpenAI and Anthropic and even

681
00:42:55,275 --> 00:43:03,905
xAI, how they're not profitable, they're not close to being profitable, so they're also
appealing a bit to government subsidies in the US. And the only one

682
00:43:03,905 --> 00:43:14,095
that is in a very luxury position is Gemini, because they can take the costs, right? They
can eat up the losses, because they have very, very healthy revenue from

683
00:43:14,095 --> 00:43:16,386
their other sources, their other products. So

684
00:43:16,386 --> 00:43:20,649
I do think to counterbalance that even the big players are probably also looking into
this.

685
00:43:20,649 --> 00:43:23,091
I don't know if OpenAI announced like a chip or something.

686
00:43:23,091 --> 00:43:24,592
I know Google has the TPU.

687
00:43:24,592 --> 00:43:28,704
I know Anthropic had some deals with, I think, AWS, or even Google as well.

688
00:43:28,704 --> 00:43:29,906
I think we covered that as well.

689
00:43:29,906 --> 00:43:35,231
So I do think something will change there, but I'm not sure what it's going to look like.

690
00:43:35,231 --> 00:43:38,653
How far down the commoditization spectrum are we going to get?

691
00:43:38,653 --> 00:43:39,754
What do you think?

692
00:43:40,943 --> 00:43:42,624
I'm hopeful.

693
00:43:42,624 --> 00:43:52,873
I think if you look at most benchmarks, like the big one, the Artificial Analysis
one, for example, the top models are very, very close in terms of performance.

694
00:43:52,873 --> 00:43:58,467
I also see it now because we have a very large test suite for the thing that we're
building on a lot of different cases.

695
00:43:58,467 --> 00:43:59,849
And like, if you compare

696
00:43:59,849 --> 00:44:07,260
I don't know, Gemini 3.1 with, for example, Sonnet or Opus, and then you compare it to GLM
5.

697
00:44:07,260 --> 00:44:20,042
There are some differences, but they're not significant enough to say that this is a
six-month difference in pace of evolution, to me.

698
00:44:20,042 --> 00:44:21,533
So I'm actually very hopeful.

699
00:44:21,533 --> 00:44:30,059
Even if, let's say, you're using Gemini as an app on your phone, if someone behind the
screen switched it to Sonnet or to OpenAI or to GLM or

700
00:44:30,059 --> 00:44:32,735
to Kimi, I doubt that a lot of people would even notice.

701
00:44:32,735 --> 00:44:33,736
I think so too.

702
00:44:33,736 --> 00:44:41,085
I think they would only notice when you hit a certain problem and then you hit your head
against it a few times and then if you switch models it just kind of goes through.

703
00:44:41,605 --> 00:44:44,158
Yeah, that's a thing that could happen.

704
00:44:44,158 --> 00:44:49,730
I still believe that there are some models that are better than others, but I think it's
very, it's a very close race.

705
00:44:49,730 --> 00:44:59,585
That's why I'm also hopeful, in a sense, that we will not get these
two or three very, very big, almost monopolistic players that basically rule the world.

706
00:44:59,585 --> 00:45:05,657
But that, in the end, there's enough competition to basically move the value that
gets created to the bigger ecosystem, right?

707
00:45:05,657 --> 00:45:06,796
Like people are.

708
00:45:06,796 --> 00:45:13,614
through these things, we are able to create more value for society at large, not just for
these few major players.

709
00:45:14,075 --> 00:45:15,837
I am hopeful for that, to be honest.

710
00:45:15,837 --> 00:45:20,579
If you see the evolution now... but let's see in five years.

711
00:45:20,579 --> 00:45:25,441
I'm mostly wondering, for example, we talked about Claude Cowork and we talked about Claude Code a
few times.

712
00:45:25,441 --> 00:45:31,757
Like I think for sure the models are really good, but I also think a lot of it is the
application behind it, right?

713
00:45:31,757 --> 00:45:38,522
Like making sure it has the right context, making sure it keeps the right things. Like
ChatGPT came up with Codex, right?

714
00:45:38,522 --> 00:45:40,674
And they say it's optimized for long-running tasks.

715
00:45:40,674 --> 00:45:42,835
And I don't think it's because they have a much better model.

716
00:45:42,835 --> 00:45:47,441
I think it's probably the engineering behind it that made it better for long-running
tasks.

717
00:45:47,441 --> 00:45:49,603
It's what people these days call the harness, right?

718
00:45:49,603 --> 00:45:57,998
Like you have an LLM model and you have a harness around it, whether it's Claude
Code, whether it's Codex, whether it's Gemini CLI, whether it's whatever solution that

719
00:45:57,998 --> 00:46:01,972
you're building, it's this harness that keeps it in check, gives it direction.

720
00:46:01,972 --> 00:46:06,337
And I think Claude Code is a very, very strong harness.

721
00:46:06,337 --> 00:46:07,919
Yeah, no, I agree.

722
00:46:07,919 --> 00:46:08,611
I agree.

723
00:46:08,611 --> 00:46:17,701
Yeah, and again, maybe one thing we talked about a while ago, those
things are interesting to me. Like, in the beginning of GenAI, I

724
00:46:17,701 --> 00:46:24,047
wasn't very excited, because it was like prompting, and prompting is like, that's your
job now.

725
00:46:24,047 --> 00:46:30,492
But I feel like if it's creating the harness and how to make these applications work, I
think that's way more exciting for me.

727
00:46:32,364 --> 00:46:35,588
Someone shared with me a position, it was a prompt... what was it?

728
00:46:35,588 --> 00:46:38,590
Full stack prompt engineer or something.

729
00:46:39,953 --> 00:46:45,080
Like, I'm sorry, like if I see this, if this is my job title, I would be ashamed.

730
00:46:45,080 --> 00:46:47,354
Like, you know, like this doesn't look interesting at all.

731
00:46:47,354 --> 00:46:52,038
But I feel like if it's building an
application, I mean, prompts are part of it, right?

732
00:46:52,038 --> 00:46:59,334
But like context engineering and all these things: what are the tools,
what to keep, how to make sure the context doesn't overflow, et cetera, et cetera.

733
00:46:59,334 --> 00:47:00,005
I think that's more...

734
00:47:00,005 --> 00:47:03,299
Boom. I think that's it for the main articles.

735
00:47:03,299 --> 00:47:06,076
Do you want to quickly go over the two tidbits that we have?

736
00:47:06,076 --> 00:47:07,969
Actually, I just ran out of time.

737
00:47:07,969 --> 00:47:09,650
It's 11 o'clock here.

738
00:47:10,191 --> 00:47:11,964
So let's keep them for next time.

739
00:47:11,964 --> 00:47:13,616
We'll keep them for next time.

740
00:47:13,616 --> 00:47:14,177
All righty.

741
00:47:14,177 --> 00:47:16,041
I think this is it for today.

742
00:47:16,041 --> 00:47:18,044
Again, we're still cooking some changes.

743
00:47:18,044 --> 00:47:21,998
We've mentioned a few times, but for the listeners, things may change in the future.

744
00:47:21,998 --> 00:47:25,881
But it's something we can hopefully announce in the coming weeks.

745
00:47:26,365 --> 00:47:27,586
Thank you Bart.

746
00:47:27,626 --> 00:47:30,200
Any last final words of wisdom?

747
00:47:30,200 --> 00:47:31,381
Keep coding.

748
00:47:31,760 --> 00:47:32,550
Keep coding away.

749
00:47:32,550 --> 00:47:34,002
Burn those tokens.

750
00:47:34,924 --> 00:47:36,085
Keep plotting.

751
00:47:36,655 --> 00:47:37,216
Thank you.

752
00:47:37,216 --> 00:47:38,101
Ciao everyone.

753
00:47:38,101 --> 00:47:38,883
Ciao.