1
00:00:00,092 --> 00:00:01,612
Just a little bit?

2
00:00:04,752 --> 00:00:05,622
Okay.

3
00:00:06,549 --> 00:00:07,261
Hi everyone.

4
00:00:07,261 --> 00:00:13,196
Welcome to the Bright Signal Podcast where we cut through the noise and bring you the
latest tech news and interviews.

5
00:00:13,337 --> 00:00:14,118
My name is Murilo.

6
00:00:14,118 --> 00:00:16,219
I'm joined by my friend Bart.

7
00:00:16,240 --> 00:00:18,663
Hey Bart and Rafael.

8
00:00:18,663 --> 00:00:19,803
Hey Rafael.

9
00:00:20,485 --> 00:00:21,386
How are we doing?

10
00:00:21,386 --> 00:00:24,151
Good, good, good, good, good.

11
00:00:24,151 --> 00:00:29,983
I'll get to you, Rafael, in a second. I think it's the first time maybe people are
hearing about you, but maybe quickly before that.

12
00:00:29,983 --> 00:00:32,244
Also we have a new intro, right?

13
00:00:32,244 --> 00:00:36,785
Some new changes. I think Bart and I alluded to it in the previous episodes.

14
00:00:37,265 --> 00:00:39,165
And a new name, exactly.

15
00:00:41,586 --> 00:00:43,047
yeah, indeed, indeed.

16
00:00:43,047 --> 00:00:48,748
Maybe I'll let Rafael introduce himself first and then we can talk about what people can
expect going forward.

17
00:00:48,748 --> 00:00:50,929
But maybe, Rafael, who are you?

18
00:00:50,930 --> 00:00:56,710
So yeah, thank you, Murilo, for the introduction, and also Bart.

19
00:00:56,710 --> 00:01:01,330
I'm really happy to be here for the first time.

20
00:01:01,690 --> 00:01:15,311
And for those of you who don't know me, which means 100 % of the audience, except my mom,
if she's listening, I'm working in finance mainly, but also teaching economics at an

21
00:01:15,311 --> 00:01:17,051
university here in Belgium.

22
00:01:17,137 --> 00:01:20,730
and also passionate about tech, technology as a whole.

23
00:01:20,730 --> 00:01:25,074
I'm not as techie as Bart and Murilo, as you will see in the discussions.

24
00:01:25,074 --> 00:01:35,182
But I like to keep myself updated on mostly the entrepreneurial ecosystem, but also tech
in economics, tech in finance, or finance in tech.

25
00:01:35,182 --> 00:01:37,616
It depends on which way you are seeing it.

26
00:01:37,616 --> 00:01:40,602
And also I had to find an excuse to be on the podcast.

27
00:01:40,602 --> 00:01:46,614
So yeah, that's me, and I'm really happy to go bananas about tech news with you guys.

28
00:01:46,615 --> 00:01:47,789
Welcome, Rafael.

29
00:01:48,253 --> 00:01:49,089
Thanks.

30
00:01:49,251 --> 00:01:50,261
indeed, indeed, indeed.

31
00:01:50,261 --> 00:01:51,602
Happy to have you.

32
00:01:51,602 --> 00:01:54,274
And I also think this is a nice addition.

33
00:01:54,274 --> 00:02:02,146
I think you mentioned maybe you're not as familiar with all the techie things, but on
the other side, you're definitely way more familiar than I am with

34
00:02:02,146 --> 00:02:05,787
all the financial news and all these different things.

35
00:02:07,388 --> 00:02:09,428
Exactly, exactly. The money.

36
00:02:09,428 --> 00:02:10,849
So now we're going to be rich Bart.

37
00:02:10,849 --> 00:02:13,509
We're going to be rich.

38
00:02:13,509 --> 00:02:15,130
Oh, we won't

39
00:02:15,130 --> 00:02:18,765
speak only about the money, but yeah, that's the goal.

40
00:02:19,066 --> 00:02:20,038
For sure, for sure.

41
00:02:20,038 --> 00:02:22,995
But then now we have also a new name, Bright Signal Podcast.

42
00:02:22,995 --> 00:02:24,758
What is the Bright Signal Podcast?

43
00:02:24,758 --> 00:02:26,300
Maybe I'll ask you now, Bart.

44
00:02:26,301 --> 00:02:28,122
That's a bit unprepared, as you ask me.

45
00:02:28,122 --> 00:02:29,953
I should have prepared this a bit better.

46
00:02:31,054 --> 00:02:38,240
We've been thinking a bit about how to evolve going forward from the Monkey Patching
podcast, right?

47
00:02:38,240 --> 00:02:48,450
I think we started the Monkey Patching podcast very, maybe very techie and then slowly
evolved a bit more to global data and AI news.

48
00:02:48,451 --> 00:02:52,505
It became a bit of a combination of the more global news and small tech updates.

49
00:02:52,505 --> 00:02:57,289
I think, speaking a bit for you as well here, Murilo, I think we enjoyed what we were
doing.

50
00:02:57,289 --> 00:03:02,014
I think we also had some humble successes in terms of subscriber count and stuff.

51
00:03:02,014 --> 00:03:04,826
But we also saw that it's a very crowded space, right?

52
00:03:04,826 --> 00:03:11,060
Like a lot of people are doing global tech and AI news, simply because it's so hot these
days.

53
00:03:11,293 --> 00:03:17,618
We started thinking about how we can differentiate ourselves.

54
00:03:17,618 --> 00:03:19,843
What Bright Signal is, is a bit of a...

55
00:03:19,843 --> 00:03:25,589
We try to differentiate more by having a more complementary co-host team.

56
00:03:25,589 --> 00:03:32,716
Where Rafael brings a lot of background from the more financial and commercial side of
things.

57
00:03:32,716 --> 00:03:41,310
But we also have a slightly different focus where we don't only do news. We will still
do news roughly one or two times a month, still to be decided the exact

58
00:03:41,310 --> 00:03:45,335
schedule, where we have very consciously split the news into global news and EU news.

59
00:03:45,335 --> 00:03:47,155
I think that is a big thing.

60
00:03:47,155 --> 00:03:54,140
So we'll cover the global news, EU news, but also like still small tech innovations.

61
00:03:54,188 --> 00:04:04,354
during our updates. And in all the other sessions, we try to focus on interviews with
tech startups from Europe, or investors in the broad sense of the word, whether it be VCs,

62
00:04:04,354 --> 00:04:09,978
whether it's an angel, to also get a bit of their point of view on investing in the
European ecosystem.

63
00:04:09,978 --> 00:04:12,040
I think that kind of summarizes it.

64
00:04:12,040 --> 00:04:13,094
Anything to add to that, Murilo?

65
00:04:13,094 --> 00:04:25,254
Yeah, no, I just echo what you say. I think it also comes from us being interested in
people that are passionate, that have nice ideas as well, and we also focus a

66
00:04:25,254 --> 00:04:26,294
bit on the EU, right?

67
00:04:26,294 --> 00:04:35,414
I think we're all in Belgium, and we also noticed that a lot of the news is from the US.

68
00:04:35,737 --> 00:04:41,187
Yeah, but at the same time there is a lot of stuff that happens in the EU and we also
wanted to shed a light there, right?

69
00:04:41,187 --> 00:04:43,588
So I think that's also a bit the goal here.

70
00:04:43,588 --> 00:04:45,589
So I'm actually very excited today.

71
00:04:45,589 --> 00:04:47,543
It will be the news.

72
00:04:47,543 --> 00:04:53,656
I think listeners are going to notice a slight change in format, but I think largely the
content is the same.

73
00:04:53,656 --> 00:04:56,998
I think, like you said, it's some things that we were already discussing.

74
00:04:56,998 --> 00:05:04,431
I'm happy to have Rafael now, because there's a lot of stuff about the big numbers we saw
from OpenAI and Anthropic, all these different players, you know, that I was always

75
00:05:04,431 --> 00:05:14,700
a bit like, what does this mean? I kind of get it, but... So I'm also happy to have
Rafael joining us. And yeah, we're already planning some startups

76
00:05:14,700 --> 00:05:24,512
as well, so very much looking forward to what's coming. But without further ado, we
should just kick it off. Who would like to start?

77
00:05:28,022 --> 00:05:36,291
Chinese AI models from labs like DeepSeek and Minimax have overtaken US rivals in token
consumption on OpenRouter since February.

78
00:05:36,343 --> 00:05:44,384
Driven by prices as low as 2-3 dollars per million output tokens, compared to roughly 15
dollars for Anthropic's Claude Sonnet 4.5.

79
00:05:44,384 --> 00:05:50,590
The cost advantage stems from cheaper energy and more efficient model architectures, and
it's reshaping developer behavior.

80
00:05:50,590 --> 00:05:57,737
One Hong Kong developer now routes 80 % of his work through Moonshot's Kimi model to avoid
spending 900 dollars a day on Claude alone.

81
00:05:57,737 --> 00:06:04,092
Alibaba has moved to capitalize, creating a new Alibaba Token Hub business group led by
CEO Eddie Wu,

82
00:06:04,192 --> 00:06:08,584
betting that token economics will define the next phase of AI competition.

83
00:06:08,585 --> 00:06:10,011
China on the rise.

84
00:06:10,448 --> 00:06:14,975
Yeah, maybe we should explain what's an AI token.

85
00:06:15,017 --> 00:06:17,831
Did you know that before the article?

86
00:06:17,833 --> 00:06:28,383
We use them a lot, so I think, yeah, we consume a lot of AI tokens. But maybe, if you'd
like to explain: what is an AI token?

87
00:06:28,384 --> 00:06:31,906
So personally, as I'm not a developer, I don't really use AI.

88
00:06:31,906 --> 00:06:35,758
I'm just like a subscriber of AI models, et cetera.

89
00:06:35,758 --> 00:06:45,042
So yeah, I guess the price of AI tokens is just the price of AI usage.

90
00:06:45,042 --> 00:06:46,283
So basically it's a fuel.

91
00:06:46,283 --> 00:06:47,684
It's the fuel for AI.

92
00:06:47,684 --> 00:06:53,676
So each time you just type on ChatGPT or Claude, hello, how are you?

93
00:06:53,794 --> 00:06:55,889
It just costs AI tokens.

94
00:06:55,890 --> 00:07:00,163
Yeah, also, the way it works a bit behind the scenes, right?

95
00:07:00,163 --> 00:07:09,011
Is that you have text like this, and the text is broken down into almost like syllables,
as we call it in machine learning, right?

96
00:07:09,011 --> 00:07:11,313
If you're building these things, it's called tokenization.

97
00:07:11,313 --> 00:07:15,517
So then, for example, software can maybe be split into soft and ware.

98
00:07:15,517 --> 00:07:16,599
This is just hypothetical.

99
00:07:16,599 --> 00:07:18,299
I'm not saying this is how it is.

100
00:07:18,379 --> 00:07:21,402
And each one of these things is converted to numbers that are sent to the machine.

101
00:07:21,402 --> 00:07:23,083
But then each one of these things is a token.

102
00:07:23,083 --> 00:07:25,385
So these are the AI tokens and

103
00:07:25,651 --> 00:07:29,323
when you pay for them, you pay for each token that you use.

104
00:07:29,323 --> 00:07:37,565
So if you have a very big text that you want to send to models, and the text that's sent
back, basically those are the in and out tokens, right?

105
00:07:37,565 --> 00:07:39,689
They actually have different prices.

106
00:07:39,709 --> 00:07:47,973
And if you have models that kind of talk to themselves, they think out loud, they also
produce those tokens in between, which also have a different price.

107
00:07:47,973 --> 00:07:51,014
But basically if you have a lot of tokens, you spend more money.

108
00:07:51,014 --> 00:07:54,316
And if the tokens are expensive, then this scales as well.

109
00:07:54,316 --> 00:07:55,096
And I think this is also

110
00:07:55,096 --> 00:08:08,696
what it's saying here. OpenAI and Anthropic, they still have the best models today, but
the Chinese models, they're actually open source and they are way more token

111
00:08:08,696 --> 00:08:18,016
efficient, right? Well, I'm not sure if they're token efficient, but they're cheaper per
token, that's for sure, right? And I think

112
00:08:18,017 --> 00:08:19,130
They're not like in there.

113
00:08:19,130 --> 00:08:27,969
I think, well, looking a bit at Bart here to see if you agree with me, I think they're
not as good as the closed models, but they're still pretty good, right?

114
00:08:27,971 --> 00:08:29,612
Maybe just to add to the token stuff.

115
00:08:29,612 --> 00:08:36,415
So just as a very, very rough proxy: if you type, four characters is roughly one
token.

116
00:08:36,696 --> 00:08:40,279
And typically models have a pricing on input tokens.

117
00:08:40,279 --> 00:08:45,002
So how many characters do you send to the model, and how much comes out?

118
00:08:45,002 --> 00:08:46,442
Those are output tokens.

119
00:08:46,442 --> 00:08:47,803
And again, for output tokens:

120
00:08:47,803 --> 00:08:50,851
If it's one token, you get four characters out, roughly.

121
00:08:50,851 --> 00:08:52,186
This is very rough.

122
00:08:52,523 --> 00:08:58,071
And what we see in this is that China outputs a lot of, well, you could call them open
source, or you could call them open weight models.

123
00:08:58,071 --> 00:09:01,415
I think there's a bit of a debated topic on open source in this context.

124
00:09:01,415 --> 00:09:05,340
But they are cheaper to use, I would say.

125
00:09:05,340 --> 00:09:08,191
OpenRouter, sorry?

126
00:09:08,191 --> 00:09:17,088
Much cheaper compared to the prices if you compare Minimax or Anthropic. I think it's
like, I don't know, 15, and what numbers did they give here?

127
00:09:17,088 --> 00:09:19,360
Like two to three dollars versus 15.

128
00:09:19,360 --> 00:09:20,082
So, yeah.

129
00:09:20,082 --> 00:09:23,222
But the.

130
00:09:23,223 --> 00:09:25,754
The question is a bit how much cheaper are they actually, right?

131
00:09:25,754 --> 00:09:28,476
Like for a developer to use it, they are much cheaper.

132
00:09:28,476 --> 00:09:29,926
If you use them through OpenRouter.

133
00:09:29,926 --> 00:09:38,490
So OpenRouter is a bit of a system that lets you use a central entry point, a bit of a
proxy, where you can say, I want to talk to this model from Anthropic, or I want to talk

134
00:09:38,490 --> 00:09:42,884
to this model from DeepSeek or to this model from Alibaba or whatever.

135
00:09:42,884 --> 00:09:46,967
And based on whatever model you talk to, you pay a cost, right?

136
00:09:47,132 --> 00:09:52,958
It's very expensive to talk directly to Anthropic's models or to Gemini models or to
OpenAI models.

137
00:09:52,958 --> 00:09:56,683
But I would argue that they also price them commercially.

138
00:09:56,683 --> 00:10:00,086
There's also development costs on this, we need to take this into account.

139
00:10:00,086 --> 00:10:08,456
While if you actually look at OpenRouter, where these other models are hosted, they are
typically just hosted by server farms running these open weight models.

140
00:10:08,456 --> 00:10:09,376
It's not...

141
00:10:09,462 --> 00:10:17,306
DeepSeek themselves often, it's not Alibaba themselves. These are server farms that are
hosting open-weight models, and they just need to make a margin on the

142
00:10:17,306 --> 00:10:28,840
compute, right? They don't need to take into account the whole development cost. So I
think it's logical to some extent that today they are cheaper, right? I think performance-wise,

143
00:10:28,841 --> 00:10:32,393
they are very comparable, to be honest.

144
00:10:32,747 --> 00:10:41,287
So the thing that we are developing, the startup I'm working on: we capture any type of
information and we try to structure

145
00:10:41,287 --> 00:10:42,987
that automatically for you.

146
00:10:42,987 --> 00:10:47,287
While we're in the test phase, we're also using OpenRouter, so we can very easily switch
to a different model.

147
00:10:47,447 --> 00:10:50,727
The difference is really minimal in a lot of different cases.

148
00:10:50,727 --> 00:11:00,547
From the moment it becomes very complex, you tend to end up with the state-of-the-art
models, but even there, the gap with the Chinese models is becoming very small.

149
00:11:00,548 --> 00:11:01,186
Yeah.

150
00:11:01,186 --> 00:11:05,346
So you're also doing what's written in the news.

151
00:11:05,606 --> 00:11:08,406
It's kind of the 80-20 rule.

152
00:11:08,406 --> 00:11:13,886
So 80 % of the job is doable with a Chinese AI application.

153
00:11:13,886 --> 00:11:22,739
And then the rest, the most complex part can also be done by more complex AI, which could
cost more.

154
00:11:22,740 --> 00:11:24,192
Which today still costs more.

155
00:11:24,192 --> 00:11:26,825
But the question is whether it will still cost more tomorrow.

156
00:11:26,825 --> 00:11:32,885
I think the bigger thing here is also the geopolitical competition that is happening.

157
00:11:32,885 --> 00:11:40,912
Because what we see through OpenRouter is only what it costs for these server farms to
basically host an open model.

158
00:11:40,912 --> 00:11:42,105
That's what we see with OpenRouter.

159
00:11:42,105 --> 00:11:43,856
But we don't really see the cost that...

160
00:11:43,912 --> 00:11:46,363
these Chinese companies have in training these models.

161
00:11:46,363 --> 00:11:48,183
It's not very transparent.

162
00:11:48,183 --> 00:11:50,705
There's a lot of discussion that it is also being sponsored by the government.

163
00:11:50,705 --> 00:11:51,877
We don't really know.

164
00:11:51,877 --> 00:12:02,454
But what you do see is that the US has a very strong competitive advantage on AI, and
everybody thought, like a year ago, that it's miles ahead of everybody

165
00:12:02,454 --> 00:12:04,785
else, but actually China is very, very close.

166
00:12:04,785 --> 00:12:05,336
Right.

167
00:12:05,336 --> 00:12:06,596
And China,

168
00:12:06,733 --> 00:12:11,793
by doing this, by serving all these things at a very low cost, is making people think
twice, right?

169
00:12:11,793 --> 00:12:20,653
Like, from the moment it's like 10 times as cheap, it's not something that you can just
ignore anymore, right?

170
00:12:20,913 --> 00:12:30,393
If one costs a dollar and the other costs 95 cents, then you think, well, no, let's do the
dollar because we know the context, it's Americans, you can probably trust this.

171
00:12:30,453 --> 00:12:36,313
But from the moment that the price gap has become that big, at this stage, it's more
instrumental, right?

172
00:12:36,314 --> 00:12:37,118
Yeah, for sure.

173
00:12:37,118 --> 00:12:38,151
Also,

174
00:12:38,275 --> 00:12:40,755
And also like very recently, right?

175
00:12:40,755 --> 00:12:46,335
We're not going to cover this today, but the source code for Claude Code also came out,
right?

176
00:12:46,335 --> 00:12:51,375
And I think a lot of people are looking at how the actual application works.

177
00:12:51,375 --> 00:12:58,255
And I saw in a lot of different dissections, let's say, that the model wasn't maybe

178
00:12:58,256 --> 00:13:00,197
there wasn't the key differentiator, let's say.

179
00:13:00,197 --> 00:13:06,570
I mean, they said the model of course is better, but they were also saying that there's
a lot of scaffolding around these applications, right?

180
00:13:06,570 --> 00:13:11,933
And I mean, you can actually use Minimax models today, which I think is actually the
most popular, I wanna say.

181
00:13:11,933 --> 00:13:12,904
Yeah, Minimax.

182
00:13:12,904 --> 00:13:14,846
with Claude Code as well, right?

183
00:13:14,846 --> 00:13:21,832
You can also have a subscription for coding with Minimax, and I actually see a lot of
people that are opting for this instead of Anthropic.

184
00:13:21,832 --> 00:13:27,638
Yeah, I'm not sure, like I'm wondering if they're still gonna be that much ahead.

185
00:13:27,638 --> 00:13:34,762
I mean, that much ahead. Like you said, the gap is already being bridged a bit, but how
long are they gonna stay ahead until people just say, okay, I'm just gonna go for the cheaper

186
00:13:34,762 --> 00:13:37,828
one because it's good enough, quote unquote, right?

187
00:13:37,828 --> 00:13:44,551
Yeah, I think that the US has a competitive advantage, as you said, Bart, on the models
as such.

188
00:13:44,551 --> 00:13:56,316
But what's important in this article, and it shows a bit what's going on in the AI race,
is that sometimes it just comes down to one thing, which is the cost.

189
00:13:56,316 --> 00:13:58,937
And in this case, it's the energy costs.

190
00:13:58,937 --> 00:14:00,657
I think that China has...

191
00:14:00,761 --> 00:14:03,223
a huge competitive advantage on this part.

192
00:14:03,223 --> 00:14:10,908
We see huge investments in China on electricity generation, like with renewable energy.

193
00:14:10,908 --> 00:14:18,341
There is an interesting figure for electricity generation of the US compared to China.

194
00:14:18,341 --> 00:14:23,278
If you look at the numbers of electricity generation in terawatt-hours.

195
00:14:23,278 --> 00:14:32,171
In 2010, China and the US were at the same level, meaning 4,000 terawatt-hours.

196
00:14:32,171 --> 00:14:40,905
And right now, China has doubled its electricity generation, which is also an investment
in AI.

197
00:14:40,905 --> 00:14:47,601
um

198
00:14:48,026 --> 00:14:52,621
China has an efficient electricity grid, while the West...

199
00:14:52,623 --> 00:15:00,932
Yeah, the infrastructure is a bit aging, and I think that energy is something that most
people overlook, to be honest.

200
00:15:00,932 --> 00:15:04,267
It is also interesting, recently Mr.
201
00:15:04,267 --> 00:15:11,002
Frank, so an Nvidia CEO, um compared the AI race to a five-layer escape.

202
00:15:11,143 --> 00:15:14,625
At the top, there are the AI applications.

203
00:15:14,818 --> 00:15:17,882
and then the LLMs, the infrastructure.

204
00:15:17,943 --> 00:15:21,388
And at the very bottom, there is just the energy.

205
00:15:21,388 --> 00:15:23,412
And this is part of the competition as well.

206
00:15:23,412 --> 00:15:29,081
So yeah, I think that the rise of China is also due to its efficiency in producing energy.

207
00:15:29,082 --> 00:15:34,362
Yeah, and I think maybe even there, like if you look at these five layers, like you start
with applications, you end up with energy.

208
00:15:34,362 --> 00:15:44,782
But I would even argue that potentially at the application level there, China also has
some advantages. They are way ahead in terms of robotics and these types of

209
00:15:44,782 --> 00:15:50,142
applications, and integrating AI into home appliances goes very quickly.

210
00:15:50,162 --> 00:15:56,142
And a lot of the whole supply chain to even build this into hardware only exists in
China.

211
00:15:56,143 --> 00:16:00,063
It's interesting to see how this moves forward.

212
00:16:00,064 --> 00:16:03,084
Indeed. Maybe moving on?

213
00:16:03,365 --> 00:16:06,105
Yes, maybe Figma.

214
00:16:06,106 --> 00:16:15,220
Figma introduced write access for AI agents via its MCP server, allowing tools like
Claude Code and Cursor to design directly on the Figma canvas.

215
00:16:15,220 --> 00:16:19,492
A significant upgrade from the previous read-only integration.

216
00:16:19,492 --> 00:16:27,796
The feature means AI agents can now generate components, variables, and full screens using
a team's existing design system and real Figma primitives.

217
00:16:27,796 --> 00:16:36,060
It's a strategic bet that could shape Figma's role in product development as AI agents
increasingly become the starting point for prototyping,

218
00:16:36,060 --> 00:16:37,801
rather than Figma itself.

219
00:16:38,061 --> 00:16:41,194
So maybe to start, what is Figma?

220
00:16:41,194 --> 00:16:45,186
And I think you have the most experience with Figma out of the three of us, Bart.

221
00:16:45,186 --> 00:16:47,346
So I'll let you take this one.

222
00:16:47,433 --> 00:16:52,645
Well, I'll explain from my own experience, it's probably not giving the full picture.

223
00:16:52,645 --> 00:16:56,938
I would describe Figma as a mock-up and design tool.

224
00:16:56,938 --> 00:17:04,061
Let's say if you want to start building an application, you can very quickly mock-up like
this is how the screen of the application could look like on Figma.

225
00:17:04,061 --> 00:17:09,283
I think you can do way more with Figma, but to me that's a bit like the very logical use
case for Figma.

226
00:17:09,285 --> 00:17:14,885
challenge that they have in the current climate is that it's a bit twofold.

227
00:17:14,886 --> 00:17:26,586
But I'll maybe go into the MCP part first, because the article is a bit like, Figma
allowed read access to everything that they have on the platform via MCP.

228
00:17:26,586 --> 00:17:31,526
That means through your ChatGPT, through your Claude, through whatever AI tool, you can
talk to it.

229
00:17:31,526 --> 00:17:33,766
But you could only read from the platform.

230
00:17:33,767 --> 00:17:36,447
A bit with the idea, and I think a lot of platforms

231
00:17:36,753 --> 00:17:45,458
I Hope taught that, that their platform will still have a significant place in a new
world, right?

232
00:17:45,458 --> 00:17:47,190
In an AI native world.

233
00:17:47,190 --> 00:17:56,717
And I think what Figma now does, in also allowing write access, is basically realize
that users will maybe not use their...

234
00:17:56,941 --> 00:18:03,107
platform like the front end to the platform anymore in the future and they will just use
an AI agent to do something on the platform.

235
00:18:03,107 --> 00:18:04,328
Which is a big shift, right?

236
00:18:04,328 --> 00:18:10,792
Like you have much less control over your user if it's just the agent of the user creating
something on your platform.

237
00:18:10,792 --> 00:18:19,719
So I think that is interesting to think about, like, if you're building a SaaS,

238
00:18:19,720 --> 00:18:29,682
does the visual entry point of that SaaS, like, how can you still expect your users to
keep using that going forward, or will they just ask a question on ChatGPT, and ChatGPT

239
00:18:29,682 --> 00:18:32,355
will read or write something to the platform.

240
00:18:32,356 --> 00:18:37,530
Yeah, I think a lot of the times the entry point is becoming more and more of these
chatbots, right?

241
00:18:37,530 --> 00:18:40,192
Claude, Cursor, ChatGPT.

242
00:18:40,192 --> 00:18:48,235
But I also think, well, maybe I'm looking a bit at you again, Bart, but I think Figma
was really like: someone would say, this is the UI I want, and then you give

243
00:18:48,235 --> 00:18:53,338
it to a front-end developer, and then they'll go bananas and work and do the animations
and all these different things, right?

244
00:18:53,338 --> 00:18:59,953
But I feel like now, maybe because Figma wasn't playing ball, let's say, to make it
easier to play with the agents, now it's

245
00:18:59,953 --> 00:19:08,028
easier for someone to actually just talk to Claude and say build something like this in
HTML or whatever and then they will kind of do the same mock-up.

246
00:19:08,029 --> 00:19:09,060
Exactly, exactly.

247
00:19:09,060 --> 00:19:21,267
So I think that is the argument you can make that this is just a Figma specific case in a
sense that it has become very easy for everybody to just make a mock-up with either Claude

248
00:19:21,267 --> 00:19:25,111
Code or Lovable or whatever, and you don't even need Figma anymore.

249
00:19:25,399 --> 00:19:31,259
And I think maybe them not allowing people to write, like only read made it even worse for
them, right?

250
00:19:31,259 --> 00:19:41,039
Like I think, so maybe that's also a bit them saying, okay, we're not playing hardball
anymore because we know it's a losing battle, right?

251
00:19:41,259 --> 00:19:41,798
Maybe.

252
00:19:41,798 --> 00:19:47,746
that Figma is becoming just a supplier to Claude and ChatGPT, for example.

253
00:19:47,747 --> 00:19:54,641
Yeah, maybe it looks more like a plugin to Claude than an actual platform itself.

254
00:19:55,105 --> 00:20:01,447
So they lose basically all their direct relationship with the final users.

255
00:20:01,448 --> 00:20:05,528
Well, that's indeed what they are at risk of, right?

256
00:20:05,529 --> 00:20:18,349
I think the argument that you can make is that, like from a UX, UI design perspective, is
that companies still want some unified approach to this, hopefully.

257
00:20:18,429 --> 00:20:27,394
And that is the value that Figma brings, that it allows you in the same way always to look
at new mock-ups, that there is a team way of working on this.

258
00:20:27,394 --> 00:20:35,505
which you would all need to define yourself if you do this in Lovable, of course. But
the arguments become much slimmer than they used to be. Because Figma used to

259
00:20:35,505 --> 00:20:38,609
be the king of mock-up design.

260
00:20:38,610 --> 00:20:39,581
Yeah, exactly.

261
00:20:39,581 --> 00:20:54,097
And also, to some extent, pricing power for the company. I think for SaaS, valuations
are built on pricing power, meaning that

262
00:20:54,097 --> 00:20:57,780
it's also built on the stickiness of recurring revenue.

263
00:20:57,780 --> 00:21:00,531
When you have a base of users, you just know that

264
00:21:00,531 --> 00:21:09,160
for these users to replace all the stuff and just go to another SaaS, it will be a bit
complicated and costly.

265
00:21:09,160 --> 00:21:13,955
Now you just need to ask ChatGPT or Claude to do that, which is...

266
00:21:14,234 --> 00:21:16,926
I think that for Figma is a very challenging thing.

267
00:21:16,926 --> 00:21:28,369
Because on one side, not opening up full write access for their MCP potentially makes
the users think, yeah, why am I even paying for this?

268
00:21:28,369 --> 00:21:37,728
I can also do this in Lovable or whatever, like I maybe don't need Figma. But at the
same time, if they do open it up and users start to just instrument Figma using an AI agent,

269
00:21:37,728 --> 00:21:39,479
then they're maybe going to think, yeah.

270
00:21:39,767 --> 00:21:42,409
I mean, I'm just using it as a generative layer.

271
00:21:42,409 --> 00:21:46,271
Like, is it even worth the $20 I'm paying per user per month?

272
00:21:46,271 --> 00:21:50,470
So I think it's a very challenging situation that they're in.

273
00:21:50,470 --> 00:21:52,705
It sounds a bit like a lose-lose.

274
00:21:52,706 --> 00:22:01,408
Do you think Figma is the only case where a SaaS service can, not disappear, but become
less useful?

275
00:22:01,408 --> 00:22:06,325
Or does it mean that no SaaS could be ever made again?

276
00:22:06,512 --> 00:22:10,904
No, no, I'm very against the whole idea that SaaS is not relevant anymore.

277
00:22:10,904 --> 00:22:12,286
I think SaaS are very relevant.

278
00:22:12,286 --> 00:22:20,211
I think why Figma specifically is under pressure is because what GenAI today is
extremely good at is the whole AI coding part.

279
00:22:20,211 --> 00:22:24,182
And Figma is exactly on that path, like somewhere in between that path.

280
00:22:24,182 --> 00:22:29,595
Like you start a project, you need Figma somewhere in between, and then you're actually
going to build the application.

281
00:22:29,710 --> 00:22:36,305
They have the challenge that they're on a path where you actually don't really need
them; it becomes a nice-to-have.

282
00:22:36,305 --> 00:22:42,449
But I think a lot of SaaS platforms, they're crucial for compliance, they're crucial for
having very clear structure processes.

283
00:22:42,449 --> 00:22:45,513
I think it's more of the specific field that Figma is in.

284
00:22:45,514 --> 00:22:46,866
Yeah, I think I agree.

285
00:22:46,866 --> 00:22:53,517
I think with AI, it changed a bit the software development life cycle, let's say, right?

286
00:22:53,517 --> 00:22:58,044
And I think the other example given in the article is Linear, which is like a

287
00:22:58,287 --> 00:23:00,778
ticketing or like project management tool, right?

288
00:23:00,778 --> 00:23:09,755
But because AI now changes the way we are developing software, I also think people are
rethinking how we do these things, right?

289
00:23:09,755 --> 00:23:17,518
The very traditional way, like you had this, you hand it to someone else and you have to make
this right in the UI, because if you don't, then you're going to have to make changes and then it's

290
00:23:17,518 --> 00:23:20,310
gonna be a lot of back and forth and sending over between teams.

291
00:23:20,310 --> 00:23:21,540
That's not the case anymore, right?

292
00:23:21,540 --> 00:23:24,692
So I think the software today that is,

293
00:23:24,692 --> 00:23:29,947
the traditional software development lifecycle tools may need to be adapted a bit, right?

294
00:23:29,947 --> 00:23:31,708
They may need to reinvent themselves.

295
00:23:31,708 --> 00:23:37,824
And I think there may be new tools as well that are more appropriate for the new coding
era, let's say.

296
00:23:37,824 --> 00:23:42,057
But it's also hard, I think, to tell today what's good and what's not good because there's
so many apps.

297
00:23:42,057 --> 00:23:45,290
I think also AI coding made it easy for people to build apps.

298
00:23:45,290 --> 00:23:47,652
So I think there's a lot of noise, right?

299
00:23:47,652 --> 00:23:50,173
So it's hard to find that bright signal.

300
00:23:50,175 --> 00:23:52,784
Anyways, anything else you want to say here before I move on maybe?

301
00:23:52,784 --> 00:23:53,935
Then what do we have next?

302
00:23:53,935 --> 00:23:55,421
Raphael?

303
00:23:55,421 --> 00:24:05,346
Oracle begins laying off up to 30,000 employees, roughly 18 % of its workforce, with
termination emails sent at 6 a.m.

304
00:24:05,346 --> 00:24:10,228
local time across the U.S., India, Canada, and Mexico.

305
00:24:10,269 --> 00:24:13,750
No prior warning from HR or direct managers.

306
00:24:13,750 --> 00:24:20,544
The cuts are directly tied to Oracle's aggressive expansion into AI data center
infrastructure.

307
00:24:20,740 --> 00:24:28,074
with the layoffs expected to free up $8 to $10 billion in cash flow to fund the build-out.

308
00:24:28,195 --> 00:24:37,281
The move follows a pattern of massive tech layoffs this year, including 30,000 at Amazon
and a 40 % headcount reduction at Block.

309
00:24:37,282 --> 00:24:40,199
That's not a nice way to start your day, right?

310
00:24:40,199 --> 00:24:40,871
A 6 a.m.

311
00:24:40,871 --> 00:24:43,314
email saying goodbye.

312
00:24:45,504 --> 00:24:46,785
That's massive.

313
00:24:47,187 --> 00:24:55,870
I think it's the biggest massive thing that's happened in the US workforce.

314
00:24:56,116 --> 00:24:57,523
Okay, interesting.

315
00:24:57,523 --> 00:24:59,835
This will never fly in the EU, I think.

316
00:24:59,835 --> 00:25:02,757
Because they said it was like without prior warning or anything.

317
00:25:02,757 --> 00:25:06,060
They just said your role is not useful anymore.

318
00:25:06,060 --> 00:25:10,614
Because I think it was part of the argument is that it's a restructuring of the company,
right?

319
00:25:10,695 --> 00:25:13,658
And then the roles are not required anymore.

320
00:25:13,658 --> 00:25:16,301
So they just kind of say like, okay, we're done here.

321
00:25:16,663 --> 00:25:18,163
Yeah, it's weird, huh?

322
00:25:18,163 --> 00:25:23,183
I think also the only time I heard something like this was with Twitter, now X, right?

323
00:25:23,183 --> 00:25:25,323
When Elon Musk started firing people.

324
00:25:25,323 --> 00:25:28,743
It was just an email saying, it's like, okay, pack your things and you're done.

325
00:25:28,883 --> 00:25:32,223
It's very, very rough, very strange.

326
00:25:32,224 --> 00:25:38,278
And I think it was needed to free up cash to invest in AI infrastructure.

327
00:25:38,399 --> 00:25:46,823
Is it the big AI replacement for you or is it just a way to free up cash and just to
invest in technology?

328
00:25:46,824 --> 00:25:48,486
I don't think it's an AI replacement thing.

329
00:25:48,486 --> 00:25:54,932
I think what they're doing is that they're betting that their business will be AI compute
going forward.

330
00:25:54,932 --> 00:26:00,614
And this comes in a context where they took on a huge amount of new debt.

331
00:26:00,614 --> 00:26:03,846
I think it's almost 60 billion earlier in the year.

332
00:26:03,846 --> 00:26:07,238
And apparently their free cash flow went negative.

333
00:26:07,238 --> 00:26:10,198
So there were already rumors that there were going to be layoffs.

334
00:26:10,335 --> 00:26:13,635
So these are the layoffs that are happening because of that.

335
00:26:13,775 --> 00:26:22,495
And I think this is because they think that these service departments apparently will not
bring the value that they hoped to bring over the coming years, and that they

336
00:26:22,495 --> 00:26:26,395
will just put all their eggs in the AI compute basket.

337
00:26:26,396 --> 00:26:33,556
It's rumored that this frees up eight to 10 billion in annual cash flow.

338
00:26:33,557 --> 00:26:37,497
But that's 30,000 layoffs.

339
00:26:37,498 --> 00:26:39,621
Yeah, I also, yeah, I saw this here.

340
00:26:39,621 --> 00:26:45,086
I mean, the article also says indeed 58 billion in debt in just two months.

341
00:26:45,247 --> 00:26:50,894
It also says that they tried to go to banks, but many banks reported stepping back from
financing this data center project.

342
00:26:50,894 --> 00:26:52,556
So they're really trying to find money.

343
00:26:52,556 --> 00:26:54,377
They probably already have debt.

344
00:26:54,647 --> 00:26:58,469
And I mean, here it says a 95 % jump in net income.

345
00:26:58,469 --> 00:27:01,850
So apparently they do have money coming in.

346
00:27:01,850 --> 00:27:08,274
They are seeking loans and they are cutting people because they want to invest massively
into these data centers.

347
00:27:08,274 --> 00:27:12,218
The thing for me is like, I feel like we've been hearing about this so much, right?

348
00:27:12,218 --> 00:27:15,740
From Google, from Anthropic, from OpenAI.

349
00:27:15,740 --> 00:27:16,661
We had, what's the name?

350
00:27:16,661 --> 00:27:20,314
There was a government-related project, Stargate, I want to say, right?

351
00:27:20,314 --> 00:27:21,434
There was an OpenAI announcement.

352
00:27:21,434 --> 00:27:24,166
a while ago with the administration.

353
00:27:24,166 --> 00:27:26,037
I hear so much of this.

354
00:27:26,037 --> 00:27:27,532
thing or I'm not sure what.

355
00:27:27,532 --> 00:27:28,382
data centers.

356
00:27:28,382 --> 00:27:32,464
But I don't think it was just OpenAI; OpenAI was a big player there as well.

357
00:27:32,464 --> 00:27:34,515
And then like the funding got pulled.

358
00:27:34,515 --> 00:27:34,926
Yeah.

359
00:27:34,926 --> 00:27:42,759
And I remember it got pulled when the Chinese models came out and the performance disrupted
things, and then, I don't know, but I feel like there's a lot of discussion on this.

360
00:27:42,759 --> 00:27:44,640
I remember one time there was one article.

361
00:27:44,640 --> 00:27:49,858
I don't remember if we covered it, but it was like Anthropic investing in Google or
something like that.

362
00:27:49,858 --> 00:27:57,007
It was just like one big player investing in another, one investment on another, and it really
feels like it's a bubble that is going to burst.

363
00:27:57,007 --> 00:28:04,880
Because you have all these investments being made, and it looks like there's a lot of money
circulating through everyone, but it's really just within that group, right? And now Oracle is

364
00:28:04,880 --> 00:28:12,974
apparently making that switch, right? Do you... I don't know, I don't know how to feel about
this, to be honest. I'm not sure if it's...

365
00:28:13,111 --> 00:28:17,678
It's not even a distressed company, because they have good results, I think.

366
00:28:17,678 --> 00:28:20,460
But I think they are just betting on AI.

367
00:28:20,460 --> 00:28:29,170
They put their balance sheet at risk on AI, and this is a bet, as Amazon is doing right now
with its investment in OpenAI.

368
00:28:29,170 --> 00:28:34,935
And I think that investors know that this is a bet when you look at the Oracle

369
00:28:34,936 --> 00:28:39,076
stock, it has been punished last year.

370
00:28:39,216 --> 00:28:48,956
I think it lost more than 50 % of its value from September or November until now.

371
00:28:49,296 --> 00:28:52,296
I think that investors are not really convinced.

372
00:28:52,298 --> 00:29:02,285
I agree with Raphael, it looks like they're so confident that AI will be the future that
they're willing to cut 18 % of their workforce to fund it, basically.

373
00:29:03,867 --> 00:29:14,966
And it's a bit like, instead of having the operational expense of all these people on
payroll, we are going to use this amount of money to put it into capital expenditure and

374
00:29:14,966 --> 00:29:19,339
just see it as an investment going forward and hardware and computer we can buy with it.

375
00:29:19,743 --> 00:29:26,186
And do you think it's confidence in AI, or is it fear that they're gonna be irrelevant
because everyone else is moving to AI?

376
00:29:26,186 --> 00:29:28,971
Do you think it's a push or a pull?

377
00:29:29,248 --> 00:29:30,330
I think it's a combination, right?

378
00:29:30,330 --> 00:29:32,244
Like you're preparing yourself for the future.

379
00:29:32,244 --> 00:29:36,222
And everybody is going into that direction.

380
00:29:36,223 --> 00:29:40,976
But I think in this ecosystem, there are a lot of different bets.

381
00:29:40,976 --> 00:29:51,146
When you look, even for Oracle, in September 2025 they just made an announcement of a deal
with OpenAI.

382
00:29:51,146 --> 00:30:01,332
OpenAI would purchase 30 billion dollars of computing power from Oracle.

383
00:30:01,332 --> 00:30:02,923
And people were just like, OK...

384
00:30:02,952 --> 00:30:10,300
Does OpenAI actually have 30 billion dollars to spend on computing power from Oracle?

385
00:30:10,301 --> 00:30:11,972
That's the question.

386
00:30:11,973 --> 00:30:14,244
Yeah, maybe one last thing before moving on.

387
00:30:14,244 --> 00:30:16,326
Like these are the teams that were hit the hardest.

388
00:30:16,326 --> 00:30:22,530
I was just wondering, like, reading this, I would actually say, okay, they're
actually deprioritizing this and prioritizing that.

389
00:30:22,530 --> 00:30:30,754
Maybe just to read from the article: Revenue Health Sciences, the SaaS and Virtual
Operations Services, and NetSuite's India Development Center.

390
00:30:30,755 --> 00:30:34,758
Does this, like, I don't know, do these numbers tell you a story, or?

391
00:30:34,758 --> 00:30:38,615
Not really, because for me it was hard to see what this meant.

392
00:30:38,616 --> 00:30:48,696
Well, probably it's a bit easier to cut these people, also with the promise that a lot of
these service roles can potentially be automated.

393
00:30:48,697 --> 00:30:52,637
So it's a bit maybe also de-risking there.

394
00:30:52,638 --> 00:30:57,958
But maybe it's also like Oracle is built on a lot of legacy services, right?

395
00:30:57,958 --> 00:31:02,978
Like maybe they have seen a decline in the last years in these types of services.

396
00:31:02,978 --> 00:31:05,098
I don't know, to be honest.

397
00:31:05,099 --> 00:31:06,371
We'll see, we'll see.

398
00:31:06,371 --> 00:31:09,836
I know that there is a lot of speculation currently on Reddit.

399
00:31:09,836 --> 00:31:13,239
A lot of fired people are raising their voices on the forum.

400
00:31:13,239 --> 00:31:17,271
So I think we will know more in the coming days also.

401
00:31:17,272 --> 00:31:19,159
We'll see, we'll pay attention to it.

402
00:31:19,384 --> 00:31:20,707
What is next?

403
00:31:20,774 --> 00:31:29,102
A Los Angeles jury found Meta and Google liable for intentionally building addictive
social media platforms that harmed a young woman's mental health.

404
00:31:29,102 --> 00:31:30,944
It awarded her $6 million in damages.

405
00:31:30,944 --> 00:31:41,513
The woman, known as Kaylee, testified she started using Instagram at age 9, YouTube at 6,
encountering no age verification and was later diagnosed with anxiety, depression and body

406
00:31:41,513 --> 00:31:42,693
dysmorphia.

407
00:31:42,694 --> 00:31:48,038
The verdict is expected to have implications for hundreds of similar cases now winding
through US courts.

408
00:31:48,172 --> 00:31:53,082
It came just one day after a separate New Mexico jury also found Meta liable for...

409
00:31:53,083 --> 00:31:54,954
Yeah, I'm happy to see this.

410
00:31:54,955 --> 00:31:55,717
Bye.

411
00:31:55,718 --> 00:32:07,839
I think social media, especially social media as we know it today, which is very much driven
by the algorithm, is very much made to make you addicted to it.

412
00:32:07,840 --> 00:32:11,820
It's also, I think it's very influential to kids.

413
00:32:12,080 --> 00:32:17,760
I don't think it improves kids' self-image, their well-being.

414
00:32:17,761 --> 00:32:21,597
And the challenge is that there was or is almost no regulation on that.

415
00:32:21,597 --> 00:32:32,520
And I think what everybody is saying is that this Meta-YouTube moment is a bit like what
we've seen with cigarettes, the Marlboro moment, where at least

416
00:32:32,520 --> 00:32:38,276
we have like a formal precedent saying that this is indeed harmful.

417
00:32:38,277 --> 00:32:41,342
and basically opening up for regulation.

418
00:32:41,343 --> 00:32:45,148
And I think that is something to be hopeful for.

419
00:32:45,149 --> 00:32:45,730
That I agree.

420
00:32:45,730 --> 00:32:50,734
I think also here, well, maybe the actual story is not like that.

421
00:32:50,734 --> 00:32:51,862
It's like it's for children, right?

422
00:32:51,862 --> 00:32:53,496
It's really specific for children.

423
00:32:53,496 --> 00:32:53,746
Right.

424
00:32:53,746 --> 00:32:55,374
And that's the main case here.

425
00:32:55,374 --> 00:33:05,324
I don't think it's just saying that if it was like someone that started using YouTube and
TikTok and whatever at age 20, I don't think that would fly as much.

426
00:33:05,324 --> 00:33:06,184
Right.

427
00:33:06,185 --> 00:33:07,926
But I do.

428
00:33:08,086 --> 00:33:09,867
But I do agree that that

429
00:33:09,947 --> 00:33:10,307
I don't know.

430
00:33:10,307 --> 00:33:12,918
I think these things are designed to be very addictive.

431
00:33:12,918 --> 00:33:23,852
I remember, like, years ago there was a whistleblower saying that there was research
within Meta about Instagram and how this was linked to mental illnesses for young

432
00:33:23,852 --> 00:33:25,092
teenagers basically.

433
00:33:25,092 --> 00:33:30,874
Like even showing how the addictiveness and like the body image issues and all these
different things.

434
00:33:30,874 --> 00:33:31,214
Right.

435
00:33:31,214 --> 00:33:36,636
So basically saying they kind of knew about it, but they didn't really act upon this.

436
00:33:36,636 --> 00:33:38,491
So I do think it's good.

437
00:33:38,491 --> 00:33:45,943
I do think, I mean, I agree with you Bart, that it's good that it's a landmark, right,
for legal precedent to maybe regulate more.

438
00:33:46,305 --> 00:33:48,869
I do think it's a, there's a.

439
00:33:49,336 --> 00:33:55,969
What I don't necessarily fully agree with is that it's only one, I mean, I think they carry a big
part of the blame, but I also think, like you said, there maybe should be more

440
00:33:55,969 --> 00:33:56,829
legislation.

441
00:33:56,829 --> 00:33:59,060
Maybe there should be other things around in place, right?

442
00:33:59,060 --> 00:34:05,483
Not just, I don't think it's realistic to expect just one person to take all the blame for
everything, right?

443
00:34:05,483 --> 00:34:13,392
But I don't think that, like, Meta is the only one that is to blame here, Meta
and YouTube, like that.

444
00:34:13,392 --> 00:34:14,665
is happening here, right?

445
00:34:14,665 --> 00:34:19,933
Like, like I think what everybody's hoping here is that this will open up for broader
regulation on this topic.

446
00:34:19,933 --> 00:34:20,633
that I fully agree.

447
00:34:20,633 --> 00:34:21,313
That I fully agree.

448
00:34:21,313 --> 00:34:28,293
I think maybe it was just my perception of the article, because they're saying like, oh, they
were fined this and this, and everyone was celebrating that it's a win.

449
00:34:28,293 --> 00:34:31,099
So we're really divided on it.

450
00:34:31,099 --> 00:34:37,100
I think both TikTok and Snapchat in this case, they settled before it went to court.

451
00:34:37,100 --> 00:34:37,693
Yeah.

452
00:34:37,693 --> 00:34:38,467
yeah, I see.

453
00:34:38,467 --> 00:34:39,924
No, but that I agree.

454
00:34:39,924 --> 00:34:41,287
That I agree as well.

455
00:34:41,893 --> 00:34:42,223
yeah.

456
00:34:42,223 --> 00:34:44,804
I'm very happy to see this happen.

457
00:34:44,804 --> 00:34:48,510
I think this has very far-reaching implications for kids.

458
00:34:48,510 --> 00:34:55,518
I think it's very addictive, it's very easy to get your dopamine hits from here by
scrolling on social media.

459
00:34:55,518 --> 00:35:03,483
Because you get your dopamine hit there, it's also like maybe you don't need to go out as
much, maybe you don't need to meet as many people because that's difficult, right?

460
00:35:03,491 --> 00:35:05,402
Maybe it's easier to just scroll on social media.

461
00:35:05,402 --> 00:35:11,396
I think it has very far-reaching implications and I'm very happy to see this playing out.

462
00:35:11,396 --> 00:35:20,603
at the same time, what is also fascinating in this article is that the product, which is
in this case Meta, so Instagram, worked exactly as designed.

463
00:35:20,603 --> 00:35:25,766
So Instagram made a child spend 16 hours a day on the app.

464
00:35:25,766 --> 00:35:27,146
It was not really a bug.

465
00:35:27,146 --> 00:35:28,797
It was the intention.

466
00:35:28,797 --> 00:35:34,428
So on the other hand, I also agree with you about, I think...

467
00:35:34,801 --> 00:35:38,245
And I think it's really common in technology regulation.

468
00:35:38,245 --> 00:35:46,113
Something needs to happen, a really specific case needs to happen in order to get the
broader picture regulated also.

469
00:35:46,113 --> 00:35:52,290
And I think that this case can maybe have some implications on regulation as a whole.

470
00:35:52,290 --> 00:35:52,801
Exactly.

471
00:35:52,801 --> 00:36:02,091
And I think there's also this camp that says that this doesn't need to be regulated, you
just need to have parental controls on this and it's up to the parents to control this.

472
00:36:02,091 --> 00:36:04,122
But I don't agree with this.

473
00:36:04,122 --> 00:36:06,394
I think as a society we also have a role in this.

474
00:36:06,394 --> 00:36:09,099
To me it's very much similar to cigarettes, right?

475
00:36:09,099 --> 00:36:13,034
Like, if parents can control them, we also don't need any regulation on age for
cigarettes.

476
00:36:13,034 --> 00:36:14,345
Like, then it's up to the parents, right?

477
00:36:14,345 --> 00:36:19,882
But we all know that as a society, we're not the greatest at this, right?

478
00:36:19,882 --> 00:36:25,888
And I think it's complicated for parents to also enforce that and also for platforms to
enforce.

479
00:36:25,888 --> 00:36:31,632
I think the age limit threshold is 13 on Meta.

480
00:36:31,632 --> 00:36:34,373
I think it was written in the article.

481
00:36:34,374 --> 00:36:40,659
But well, I guess that a child can just bypass this threshold.

482
00:36:40,660 --> 00:36:41,500
But yeah, but I agree.

483
00:36:41,500 --> 00:36:43,500
I think I was thinking the same thing.

484
00:36:43,700 --> 00:36:46,760
if you look at social media as a drug, right?

485
00:36:46,760 --> 00:36:51,521
Then I mean, cause I think for me, like, I try to look at it objectively, right?

486
00:36:51,521 --> 00:36:56,181
But I think if you compare this to tobacco, to alcohol, I think it should be treated the same
way, right?

487
00:36:56,181 --> 00:36:58,721
I think there should be legislation.

488
00:36:58,721 --> 00:37:00,821
I think there should be controls.

489
00:37:00,821 --> 00:37:05,501
I think there should be also more, how do you say, education around it in a way, you
know?

490
00:37:05,501 --> 00:37:08,112
I feel like every time you buy a cigarette, there's like...

491
00:37:08,112 --> 00:37:09,543
I mean, at least in Brazil, right?

492
00:37:09,543 --> 00:37:12,526
And it's like, this is how the lungs of a smoker look, right?

493
00:37:12,526 --> 00:37:14,387
Like, to really try to bring awareness.

494
00:37:14,387 --> 00:37:16,041
I do think all these things should be there.

495
00:37:16,041 --> 00:37:23,836
And maybe also, I'm kind of glad that I grew up in the time that I did because I did catch
a bit of social media, but I didn't catch it when I was really young.

496
00:37:23,836 --> 00:37:28,199
And even me as an adult, I can see how addictive it is, right?

497
00:37:28,199 --> 00:37:29,200
I do get very hooked.

498
00:37:29,200 --> 00:37:30,240
And I think...

499
00:37:30,403 --> 00:37:33,665
Yeah, you mentioned dopamine hit and it's not just like not going out.

500
00:37:33,665 --> 00:37:37,386
But also if I spend a lot of time on the screen, I do feel drained.

501
00:37:37,386 --> 00:37:45,810
I do feel like there's, like, a little quote-unquote depression, kind of like a dip, you
know. It's not just a dopamine hit and you're happy and then you

502
00:37:45,810 --> 00:37:46,853
go and you do something else.

503
00:37:46,853 --> 00:37:48,982
It's like I do feel like it weighs you down.

504
00:37:48,982 --> 00:37:49,602
Right.

505
00:37:49,602 --> 00:37:56,927
I was even reading a book about addiction, actually, it's called Dopamine Nation, where
they talk about a lot of these mechanisms in your brain.

506
00:37:56,927 --> 00:37:57,137
Right.

507
00:37:57,137 --> 00:37:59,608
So I do think it's I do think it's good.

508
00:37:59,608 --> 00:38:11,928
Myself, for example, I have an Instagram account, I have Facebook, I have Reddit, I have X, but
nowadays I don't have anything on my phone. I deleted everything because for me it

509
00:38:11,928 --> 00:38:13,248
was difficult.

510
00:38:13,249 --> 00:38:15,990
Too much time. Also, 16 hours a day?

511
00:38:15,990 --> 00:38:17,910
Not 16 hours, not 16 hours.

512
00:38:17,910 --> 00:38:20,810
But like you, I don't need to spend 16 hours, but I feel drained.

513
00:38:20,810 --> 00:38:26,990
I feel like I wasted my, I wasted so much time and I feel drained and I don't know, it's
not a good feeling.

514
00:38:27,110 --> 00:38:33,870
So, I mean, and me, as an adult, I imagine like a kid to, you know, to expect them to
self-regulate and all these things.

515
00:38:33,870 --> 00:38:36,383
It's like, it's hard for me, so imagine for a kid.

516
00:38:36,383 --> 00:38:42,438
And for me, it raises also a broader question, which is who should be accountable for
what?

517
00:38:42,438 --> 00:38:44,248
Is it the parents?

518
00:38:44,248 --> 00:38:48,053
Is it the governments, for not educating children enough?

519
00:38:48,053 --> 00:38:49,444
Is it the platforms?

520
00:38:49,444 --> 00:38:53,847
And I think that in the tech ecosystem, these kind of questions are everywhere.

521
00:38:53,847 --> 00:38:57,799
Also for data protection, who is accountable for data?

522
00:38:57,799 --> 00:38:58,892
Is it the final...

523
00:38:58,892 --> 00:38:59,954
Is it the final user?

524
00:38:59,954 --> 00:39:01,786
Is it the data center location?

525
00:39:01,786 --> 00:39:05,452
Is it the government of the country the user is in?

526
00:39:05,452 --> 00:39:13,404
So I think it will be answered by governments shortly, but right now these are questions.

527
00:39:13,405 --> 00:39:14,337
Indeed.

528
00:39:14,338 --> 00:39:17,027
Maybe moving on, we have...

529
00:39:17,029 --> 00:39:27,024
OpenAI and Anthropic are both racing to lock in enterprise clients through joint ventures with
private equity firms, where PE firms bring their portfolio companies as customers and the

530
00:39:27,024 --> 00:39:29,765
AI labs get distribution at scale.

531
00:39:29,765 --> 00:39:41,200
OpenAI is offering partners a guaranteed minimum return of 17.5 %, well above market, and is
in advanced talks with TPG, Bain Capital, Advent International and Brookfield to raise

532
00:39:41,200 --> 00:39:43,071
about 4 billion.

533
00:39:43,071 --> 00:39:46,654
Anthropic is pursuing its own version, courting Blackstone,

534
00:39:46,654 --> 00:39:49,734
Hellman & Friedman, and Permira.

535
00:39:49,734 --> 00:39:51,114
A lot of names.

536
00:39:51,294 --> 00:39:56,234
Though some major buyout firms have already walked away questioning whether the economics
actually work.

537
00:39:56,235 --> 00:40:07,195
A lot of, basically, they're looking for... Oh, maybe for dummies like me that don't know about
this: what is private equity? What are they saying here, right? Like, what does this all

538
00:40:07,195 --> 00:40:10,015
mean? What's the financial situation of Anthropic?

539
00:40:10,016 --> 00:40:13,608
Private equity is basically funds buying companies.

540
00:40:13,608 --> 00:40:17,390
It can be minority stakes, can be majority stakes. In this case,

541
00:40:17,390 --> 00:40:19,552
it will be of course minority stakes.

542
00:40:19,552 --> 00:40:30,701
And the goal of these private equity firms, funds, is just to buy a company, make it
grow over, I don't know, five, six, seven, eight years, it depends on the fund, and to sell it

543
00:40:30,701 --> 00:40:31,934
at the end of the...

544
00:40:31,934 --> 00:40:33,034
this period.

545
00:40:33,354 --> 00:40:47,754
So there, basically, what they're saying in this article is that OpenAI is willing to offer
these private equity funds a 17.5 % guaranteed return.

546
00:40:48,153 --> 00:40:50,874
I've never seen that, honestly.

547
00:40:50,875 --> 00:40:57,709
So basically, when a PE fund is buying a company, they just...

548
00:40:57,709 --> 00:41:02,523
They are not making a bet, they are making an investment, and the profits are uncertain.

549
00:41:02,523 --> 00:41:11,147
But it's the job of the PE fund to make the company grow by, I don't know, changing the
management, buying other companies.

550
00:41:11,287 --> 00:41:16,714
That is called buy-and-build in finance. Or to just improve the efficiency of the operations.

551
00:41:16,714 --> 00:41:21,747
There, OpenAI is just saying, okay, buy me and I will just give you yields.

552
00:41:21,747 --> 00:41:23,471
which is surprising.

553
00:41:23,472 --> 00:41:27,052
Can they guarantee the 17.5 %?

554
00:41:27,092 --> 00:41:30,832
Can they actually say this is guaranteed?

555
00:41:30,833 --> 00:41:33,753
With no risk? Or, I don't know.

556
00:41:33,754 --> 00:41:38,135
Maybe to specify a little bit because there's a lot of news on this round of OpenAI.

557
00:41:38,135 --> 00:41:39,667
So there is one.

558
00:41:39,669 --> 00:41:50,127
big round that I understand is that they are doing a 122 billion round at a post-money
valuation of 852 billion, which they just announced a few days ago.

559
00:41:50,127 --> 00:41:59,574
And next to that, there's also this, they have the Skoll Investment Joint Venture
with a number of PE firms, which is smaller and, like, pre-money valued at 10

560
00:41:59,574 --> 00:42:00,124
billion.

561
00:42:00,124 --> 00:42:07,074
in this vehicle, and I don't know exactly what the structure was in this vehicle where they're
promising this return.

562
00:42:07,074 --> 00:42:09,693
And well, you can promise everything, right?

563
00:42:09,693 --> 00:42:10,234
Of course.

564
00:42:10,234 --> 00:42:15,294
In this case, it is the external vehicle that will promise the 17.5%.

565
00:42:15,294 --> 00:42:23,179
I think, if I understood the structure well, it will not be OpenAI promising the 17.5%,
which...

566
00:42:23,180 --> 00:42:26,381
So they don't put the balance sheet at risk.

567
00:42:26,381 --> 00:42:27,371
Exactly.

568
00:42:27,373 --> 00:42:38,463
And I think what is actually going on here, which actually happens a lot in large private
equity, especially US funds, is that from the moment that as a fund you invest something

569
00:42:38,463 --> 00:42:43,248
in, you also bring the other portfolio companies as customers.

570
00:42:43,248 --> 00:42:50,814
And what OpenAI is trying to do here, because Anthropic is also trying to court all these
private equity firms,

571
00:42:50,864 --> 00:42:54,738
is saying the return is going to be so good, you basically can't say no.

572
00:42:54,738 --> 00:43:00,703
So come now with all your portfolio companies and all those portfolio companies will then
become OpenAI customers.

573
00:43:00,703 --> 00:43:08,709
Because we're definitely, in the last months, in a race on who will become the B2B
player in the AI world, right?

574
00:43:08,709 --> 00:43:10,512
Mm.

575
00:43:10,512 --> 00:43:23,376
But on the other hand, of course, it will be a good thing, not only for OpenAI because they
will just raise funds, they will just spread, as you say, their Gen.AI solution into

576
00:43:23,376 --> 00:43:28,479
the portfolio of the large PE funds and also drive enterprise adoption.

577
00:43:28,480 --> 00:43:34,214
But it's also good for the funds, I think, because they will integrate the newest

578
00:43:34,214 --> 00:43:45,294
OpenAI solutions in their portfolio companies, maybe increase the productivity, and it's
also good marketing as well.

579
00:43:45,295 --> 00:43:51,795
And also for the underlying companies of the funds, they can use the newest OpenAI
solutions.

580
00:43:51,835 --> 00:43:54,955
So it can be a virtuous circle.

581
00:43:55,512 --> 00:44:02,518
Exactly, probably they are also promising that they can use it at a reduced rate or
something, so there is something in it for everybody.

582
00:44:02,520 --> 00:44:03,307
Exactly.

583
00:44:03,307 --> 00:44:05,688
Yeah, but people are still saying no.

584
00:44:07,609 --> 00:44:16,233
But apparently like they're still, like you mentioned this is an offer you can't refuse,
it's all sweet and all these things, but apparently, well, from reading the article,

585
00:44:16,633 --> 00:44:18,444
they're still having a bit of a hard time, right?

586
00:44:18,444 --> 00:44:21,248
Like a lot of people are still saying no and all these different things, no.

587
00:44:21,248 --> 00:44:29,472
But the thing is, like, we're getting these promises of, what was it, 17 or 19 %
return quite late on.

588
00:44:29,472 --> 00:44:38,035
But as long as the market keeps going up, as long as valuations keep increasing, all of that
is fine. From the moment that stops, like, you still have these guaranteed

589
00:44:38,035 --> 00:44:39,346
returns that need to be paid out.

590
00:44:39,346 --> 00:44:45,598
So investors that came in earlier get squeezed out; it becomes a bit of a house of cards,
right?

591
00:44:45,598 --> 00:44:47,239
But I think that

592
00:44:47,241 --> 00:44:50,876
that House of Cards has been building for a few years now already.

593
00:44:51,186 --> 00:44:55,176
Yeah, and also this is why some PE funds are just...

594
00:44:55,178 --> 00:44:58,371
just backing off this deal.

595
00:44:58,371 --> 00:45:01,544
I think Thoma Bravo was mentioned in the article.

596
00:45:01,544 --> 00:45:07,971
They are just saying, okay, this is too much for me, especially when you know that OpenAI
is just burning cash right now.

597
00:45:07,971 --> 00:45:10,555
They are not breakeven.

598
00:45:10,555 --> 00:45:15,862
So I think that the loss last year was more than 10 billion or just less.

599
00:45:15,862 --> 00:45:18,023
I don't know, but they are losing money.

600
00:45:18,025 --> 00:45:24,700
The breakeven is forecasted at the earliest in 2030 or 2028, something like that.

601
00:45:24,700 --> 00:45:27,113
So once again, it's kind of a bet.

602
00:45:27,114 --> 00:45:34,798
Yeah, and also, I saw a while ago, I don't know if it has been updated, but
they projected losses to go really deep and then in one year to just turn

603
00:45:34,798 --> 00:45:40,090
everything around, which is seems a bit from me looking from the outside and not knowing
how much these things work.

604
00:45:40,090 --> 00:45:42,722
It feels a bit unrealistic, but okay.

605
00:45:42,722 --> 00:45:45,323
And it also feels like OpenAI is trying to find its way still, right?

606
00:45:45,323 --> 00:45:48,595
Like they put out Sora, they deprioritized some other features.

607
00:45:48,595 --> 00:45:53,578
Now they're also focusing on B2B, which is what Anthropic has been focused on from the
beginning.

608
00:45:53,578 --> 00:45:56,322
So it feels a bit like they're searching a

609
00:45:56,322 --> 00:45:58,539
bit, but we'll see.

610
00:45:58,539 --> 00:45:58,940
We'll see.

611
00:45:58,940 --> 00:46:00,417
We'll see what happens.

612
00:46:00,417 --> 00:46:13,660
what's important, I think that we will talk about that in the next episodes, Anthropic and
OpenAI are just battling and they are maybe going IPOs and they are maybe preparing some

613
00:46:13,660 --> 00:46:15,272
stuff to go IPO.

614
00:46:15,272 --> 00:46:18,236
Some articles mentioned one year, some others six months.

615
00:46:18,236 --> 00:46:19,937
So we'll see also.

616
00:46:19,938 --> 00:46:26,689
Yeah, and I think they're expecting three very big ones, like Anthropic, OpenAI and SpaceX
this year.

617
00:46:26,689 --> 00:46:32,478
But I think the people running OpenAI are smart people, and they probably have the best
bankers.

618
00:46:32,478 --> 00:46:34,041
I think it will be a good IPO.

619
00:46:34,043 --> 00:46:36,955
And they are talking about a one trillion IPO.

620
00:46:37,000 --> 00:46:38,341
But let's come back to that later.

621
00:46:38,341 --> 00:46:45,347
Next one is on me, I think.

622
00:46:45,347 --> 00:46:52,754
Fivetran has transferred SQL Mesh, the open source data transformation framework it
acquired through its purchase of Tobiko Data.

623
00:46:52,754 --> 00:46:57,305
They moved it to the Linux Foundation for community governance.

624
00:46:57,305 --> 00:47:07,105
Six founding member organizations will support the project, which helps data teams manage
complex SQL transformation pipelines with built-in testing, versioning and automation.

625
00:47:07,246 --> 00:47:15,756
The move signals Fivetran's belief that the transformation layer of the modern data stack
should evolve through open collaboration rather than corporate ownership.

626
00:47:15,756 --> 00:47:19,038
um

627
00:47:19,038 --> 00:47:27,962
A bit about this: so, Fivetran bought Tobiko Data, which had SQL Mesh, and then Fivetran
bought dbt Labs, which has dbt.

628
00:47:27,962 --> 00:47:30,513
And they're a bit competing, right?

629
00:47:30,513 --> 00:47:32,845
They're two different solutions to the same problem, let's say.

630
00:47:32,845 --> 00:47:35,587
And I think there were a lot of questions about what they're gonna do with it.

631
00:47:35,587 --> 00:47:37,908
And I think after some time, we have a bit of an answer, right?

632
00:47:37,908 --> 00:47:43,911
They're focusing maybe on dbt, with dbt Cloud, and SQL Mesh stays open source, they
donate it, right?

633
00:47:43,911 --> 00:47:44,771
Which...

634
00:47:45,005 --> 00:47:50,153
You mentioned it also on our pre-production chat, right?

635
00:47:50,153 --> 00:47:52,661
That it's a good move from Fivetran, right?

636
00:47:52,661 --> 00:47:54,913
They still gain points with the open source community.

637
00:47:54,913 --> 00:47:56,935
They don't need to maintain it anymore.

638
00:47:57,166 --> 00:48:05,327
Yeah, well, you said that they will focus on dbt; I don't necessarily think this is a thing
about dbt versus SQL Mesh.

639
00:48:05,814 --> 00:48:09,228
No, but I think they had both and they let one go.

640
00:48:09,228 --> 00:48:16,798
So to me it signals implicitly that dbt is the product now. The other one is not there
anymore.

641
00:48:16,798 --> 00:48:18,339
And they will focus on that.

642
00:48:19,601 --> 00:48:20,242
Interesting.

643
00:48:20,242 --> 00:48:21,222
Why not?

644
00:48:21,223 --> 00:48:24,203
To me, that makes it a bit more complex.

645
00:48:24,203 --> 00:48:31,883
Like, I think last year, Fivetran and dbt kind of merged, but also not
really, but on paper they merged, right?

646
00:48:31,884 --> 00:48:36,944
Fivetran also bought Tobiko, which had SQL Mesh.

647
00:48:36,944 --> 00:48:44,524
SQL Mesh is a big competitor to dbt Core, which is dbt's open source toolkit for data
transformation.

648
00:48:44,526 --> 00:48:52,893
And what we're seeing now, I think, is that maintaining open source packages as a
commercial company...

649
00:48:52,893 --> 00:48:53,613
It's hard.

650
00:48:53,613 --> 00:49:03,461
I think it costs a lot of money to do it, but it's also very tricky from the point of view
of how you are seen by the market.

651
00:49:03,473 --> 00:49:04,893
Are you governing it correctly?

652
00:49:04,893 --> 00:49:05,793
Are you doing it correctly?

653
00:49:05,793 --> 00:49:07,353
Are you interacting with the community correctly?

654
00:49:07,353 --> 00:49:08,433
All these things.

655
00:49:08,573 --> 00:49:19,293
And I think what Fivetran is doing now, and looking very good doing it, is saying we are the
good parents of SQL Mesh and we will donate it to the Linux Foundation, which also has a very

656
00:49:19,293 --> 00:49:20,513
good name in the community.

657
00:49:20,513 --> 00:49:23,913
And it seems a very good signal, which I agree with.

658
00:49:23,913 --> 00:49:27,633
Like it's probably the best for the future of SQL Mesh.

659
00:49:27,633 --> 00:49:33,361
But what Fivetran is basically doing here is saying that we don't really...

660
00:49:33,361 --> 00:49:38,743
care about this specific tool, we care about the market that is using this tool.

661
00:49:38,923 --> 00:49:43,565
And the market that is using this tool can actually use our platform to run it on.

662
00:49:43,946 --> 00:49:49,969
But the nuances of maintaining this, or something like this, let's put that out in
the community.

663
00:49:50,442 --> 00:49:51,896
I think that is what has happened.

664
00:49:52,014 --> 00:49:58,942
And SQL Mesh was already open, like it was still open source even when it was
officially under Fivetran, right?

665
00:49:58,942 --> 00:50:02,077
So I think it's more like officially changing hands, right?

666
00:50:02,077 --> 00:50:04,694
Like passing it from one organization to another.

667
00:50:04,694 --> 00:50:07,928
Yeah, but they still capture the market, right?

668
00:50:07,928 --> 00:50:11,313
Everybody that is using this can still do it on their platform.

669
00:50:11,313 --> 00:50:14,848
And the headache of maintaining an open source package,

670
00:50:14,848 --> 00:50:16,810
They're just giving that away,

671
00:50:17,273 --> 00:50:18,955
Yeah, good move from them.

672
00:50:18,955 --> 00:50:28,842
And maybe, the reason why you said you don't agree with that: maybe they're focusing
on dbt because with the merge with dbt Labs it's not very clear where one stops and the other

673
00:50:28,842 --> 00:50:29,423
one begins.

674
00:50:29,423 --> 00:50:32,364
That's why you don't fully agree with what I said before.

675
00:50:32,365 --> 00:50:42,935
I think long-term, what they will focus on is just being a platform where you can host these
transformation runtimes, where you can store all your data, that will become your data

676
00:50:42,935 --> 00:50:48,229
lake, data warehouse, whatever, lake house, your one-stop shop for everything data
transformation and data storage.

677
00:50:48,229 --> 00:50:55,813
And what is in their best interest is that there are big communities that are very much in
love with packages that you can...

678
00:50:55,905 --> 00:50:57,268
run easily on their platform.

679
00:50:57,268 --> 00:50:59,853
And SQL Mesh is one of them, dbt core is another one of them.

680
00:50:59,853 --> 00:51:03,217
But I don't think they care necessarily about those specific tools.

681
00:51:03,217 --> 00:51:07,152
They care about having the market that uses those specific tools run on their platform.

682
00:51:07,561 --> 00:51:11,888
Yeah, I see what you're saying. Interesting.

683
00:51:12,330 --> 00:51:13,992
Shall we move on to the next?

684
00:51:13,826 --> 00:51:30,884
Mistral AI raised $830 million in its first-ever debt financing to build a data center in
Bruyères-le-Châtel, near Paris, powered by 30,800 NVIDIA GPUs with 44 megawatts of

685
00:51:30,884 --> 00:51:32,084
capacity.

686
00:51:32,224 --> 00:51:37,736
Seven banks backed the transaction, including BNP Paribas, Crédit Agricole and HSBC.

687
00:51:37,736 --> 00:51:41,523
with the facility expected to be operational by Q2 2026.

688
00:51:41,523 --> 00:51:53,198
The company has also committed 1.4 billion to build AI infrastructure in Sweden, targeting
200 megawatts of compute capacity across Europe by 2027.

689
00:51:53,199 --> 00:51:57,905
I really liked that Raphael read it, because he actually pronounced the names correctly.

690
00:51:57,905 --> 00:52:02,443
I was just like, wow, this looks so great.

691
00:52:02,443 --> 00:52:04,356
uh

692
00:52:04,356 --> 00:52:05,303
You didn't even do it on purpose.

693
00:52:05,303 --> 00:52:06,494
Exactly, exactly.

694
00:52:06,494 --> 00:52:07,816
It just happened, you see.

695
00:52:07,816 --> 00:52:12,901
So, Mistral is the big EU player for foundational models, right?

696
00:52:12,901 --> 00:52:21,189
So, I guess they're building a data center near Paris, which, I mean, I know I
mentioned earlier that there are a lot of data center stories, like,

697
00:52:21,189 --> 00:52:23,030
everyone's talking about data centers.

698
00:52:23,165 --> 00:52:26,137
I think it's good that we hear one in Europe, right?

699
00:52:26,137 --> 00:52:27,538
It also mentions Sweden.

700
00:52:27,538 --> 00:52:36,851
One question that I had reading this is what's the story of this versus, because Bart,
when we talked to Charlotte from the Flemish government, we talked about

701
00:52:36,851 --> 00:52:38,291
the AI factories, right?

702
00:52:38,291 --> 00:52:42,462
Which seem like data centers spread around, and they want them to be collaborative.

703
00:52:42,462 --> 00:52:45,873
I guess the Mistral AI one is not part of the AI factories.

704
00:52:45,873 --> 00:52:47,834
So it's not, it's really proprietary, right?

705
00:52:47,834 --> 00:52:48,724
I guess.

706
00:52:48,725 --> 00:52:50,453
But how does it all play out, right?

707
00:52:50,453 --> 00:52:51,326
Like, I think...

708
00:52:51,326 --> 00:52:51,575
I don't know.

709
00:52:51,575 --> 00:52:55,695
I can imagine that they can use some EU funding for this, right?

710
00:52:55,695 --> 00:52:56,436
True.

711
00:52:56,436 --> 00:53:05,701
But if I remember correctly from the discussion with Charlotte, AI factories are really
supposed to be for European companies, and they were trying to

712
00:53:05,701 --> 00:53:10,885
give subsidies as well to encourage people to use these supercomputers and all these
different things.

713
00:53:10,885 --> 00:53:14,788
But this one is a bit separate from that as well, I guess, right?

714
00:53:15,389 --> 00:53:19,178
This one is more for commercial use, really more Mistral-focused and all these different things.

715
00:53:19,178 --> 00:53:28,998
I don't know the exact numbers, but they have been apparently doing very well servicing
corporate needs across Europe in the last years.

716
00:53:29,278 --> 00:53:35,538
Also because it's probably the only real sovereign provider in Europe.

717
00:53:35,539 --> 00:53:41,799
And what they're doing now, well, I think it's a signal that they're doing very well
because they raised 830 million in debt.

718
00:53:41,799 --> 00:53:45,835
I think you can only do that if you have like good...

719
00:53:45,835 --> 00:53:47,375
figures to show.

720
00:53:47,376 --> 00:53:56,036
And they're doing this to basically build a data center that powers 13,800 GPUs.

721
00:53:56,136 --> 00:54:06,316
There's also a way to think about how CapEx-heavy these data centers are, because it's
like, what is it?

722
00:54:06,316 --> 00:54:10,596
60, 65,000 per GPU, right?

723
00:54:10,597 --> 00:54:13,268
Like, these are huge investments.

724
00:54:14,168 --> 00:54:20,833
It requires much more capital, this CapEx, and also much longer payback
periods.

725
00:54:20,833 --> 00:54:32,522
So they are in direct competition with current hyperscalers that have just invested a
lot in this kind of solution.

726
00:54:32,522 --> 00:54:37,666
So for me, the question is also whether the European sovereign angle is enough

727
00:54:38,165 --> 00:54:42,380
to be a differentiator that justifies these investments.

728
00:54:42,380 --> 00:54:44,362
But I think it's good news.

729
00:54:44,595 --> 00:54:52,339
Yeah, but I think, like, if you compare this to the OpenAI story before, where
it's very much VC-driven, right?

730
00:54:52,339 --> 00:54:57,171
Like it's really betting on growth.

731
00:54:57,171 --> 00:55:02,353
Like this is debt driven, like you're betting that there's actually going to be cashflow.

732
00:55:02,433 --> 00:55:09,437
Like it sounds more mature as a company, like that there is a clearer product market fit
already or that they're just...

733
00:55:09,438 --> 00:55:15,355
growing at a more reasonable rate, and that there is actually a current need for that
in Europe.

734
00:55:15,355 --> 00:55:18,617
That's what it looks like to me.

735
00:55:18,989 --> 00:55:24,805
Yeah, and this makes sense, I think, because data centers can also provide
predictable cash flow.

736
00:55:24,805 --> 00:55:27,968
So debt makes sense compared to equity.

737
00:55:27,968 --> 00:55:30,511
I think they raised some equity before, of course.

738
00:55:30,511 --> 00:55:35,398
So yeah, that's also, I would say, a good investment for the investors as such.

739
00:55:35,398 --> 00:55:38,655
And also, I really like this news

740
00:55:38,655 --> 00:55:44,381
as it creates more data sovereignty for Europe.

741
00:55:44,381 --> 00:55:55,850
And yeah, Mistral is selling right now a solution where data never leaves Europe, which
is really important for EU companies right now.

742
00:55:55,850 --> 00:55:57,081
Yeah, I fully agree.

743
00:55:57,081 --> 00:55:58,135
It's good news.

744
00:55:58,135 --> 00:56:04,719
One side news from Mistral as well: I heard that the state-of-the-art voice model for
Dutch is actually from Mistral as well.

745
00:56:04,719 --> 00:56:10,321
So I think maybe they're also focusing a bit more on the European languages and all these
different things, which I think makes a lot of sense.

746
00:56:10,321 --> 00:56:16,387
So it's also like there are a few gaps, maybe not huge gaps, but there are
gaps that I think Mistral is also filling.

747
00:56:16,387 --> 00:56:18,658
And I think it's just good, like competition is good.

748
00:56:18,658 --> 00:56:21,170
And I think bringing these things to Europe as well is also good.

749
00:56:21,170 --> 00:56:23,351
So yeah, really happy to see this as well.

750
00:56:23,719 --> 00:56:36,091
And the investment question is whether the argument of, data never leaves Europe is strong
enough to compete with also maybe more efficient US and Chinese models.

751
00:56:36,092 --> 00:56:37,263
Fair point.

752
00:56:37,263 --> 00:56:49,916
Maybe to counteract that, I think, with the whole geopolitical climate that we've seen
evolve in the last two years, maybe efficiency alone is no longer a strong enough argument

753
00:56:49,917 --> 00:56:50,982
not to choose sovereignty.

754
00:56:50,982 --> 00:56:53,630
Do you use, guys, Mistral AI?

755
00:56:53,631 --> 00:56:54,896
Me? Not really.

756
00:56:55,155 --> 00:56:56,396
I should say yes, right?

757
00:56:56,396 --> 00:56:59,770
But we're not right now.

758
00:56:59,770 --> 00:57:04,608
I think the problem is that they are typically slightly behind the state of the art.

759
00:57:04,608 --> 00:57:12,115
typically, at least in the situation that we're in, currently where we're building, you
want to test what the performance of the state of the art is.

760
00:57:12,116 --> 00:57:25,338
Yeah, I also think, from my side and what I see also from colleagues, that a
big use for us is generating code.

761
00:57:25,338 --> 00:57:27,259
I think the best one is Claude, right?

762
00:57:27,259 --> 00:57:29,982
And most of everything I do is in English.

763
00:57:29,982 --> 00:57:31,983
Most of what most people do is in English.

764
00:57:32,184 --> 00:57:36,884
Here I also see sometimes a bit of Dutch, a bit of French, but I think via text it's okay.

765
00:57:36,884 --> 00:57:45,444
So I do see Mistral a bit more on the niche side, like I said, like a colleague shared
about the voice model that is state-of-the-art for Dutch.

766
00:57:45,444 --> 00:57:51,584
And I think if you have something customer facing, then you need a voice bot and you have
different dialects here in Belgium, right?

767
00:57:51,584 --> 00:57:53,204
And then it's a bit more niche.

768
00:57:53,204 --> 00:57:56,224
Then I think there's a bigger case to use these models.

769
00:57:56,604 --> 00:58:00,764
But I think for most of the things, just staying with Claude is sufficient.

770
00:58:00,764 --> 00:58:01,624
Yeah.

771
00:58:01,625 --> 00:58:02,893
Should we move to the next one?

772
00:58:02,893 --> 00:58:03,614
The next one?

773
00:58:04,529 --> 00:58:04,839
Yeah.

774
00:58:04,839 --> 00:58:07,139
More EU news, right?

775
00:58:07,479 --> 00:58:09,099
Go for it, Murilo.

776
00:58:09,243 --> 00:58:22,580
The AI note-taking app Granola secured $125 million in Series C funding led by Index
Ventures and Kleiner Perkins, reaching a $1.5 billion valuation, a six-fold increase from

777
00:58:22,580 --> 00:58:25,182
its $250 million valuation less than a year ago.

778
00:58:25,182 --> 00:58:34,276
The company is expanding from a meeting note-taker into a broader enterprise AI platform with
agentic capabilities and newly announced public and enterprise APIs.

779
00:58:34,276 --> 00:58:37,457
Customers include Vanta, Gusto, Asana, Cursor, and Mistral

780
00:58:37,457 --> 00:58:38,889
AI, bringing total funding

781
00:58:38,889 --> 00:58:41,575
to 192 million.

782
00:58:41,575 --> 00:58:43,118
Yeah, I think now they are a unicorn.

783
00:58:43,118 --> 00:58:44,401
Yeah, hitting unicorn status.

784
00:58:44,401 --> 00:58:45,503
So really cool.

785
00:58:45,503 --> 00:58:47,691
So Granola, for people that don't know:

786
00:58:47,691 --> 00:58:52,551
I would describe it as a note taking tool thingy.

787
00:58:52,551 --> 00:58:57,691
It pops up when you have a meeting, it takes a transcript, you can still take notes next
to it.

788
00:58:57,691 --> 00:59:02,031
And at the end of the meeting, it basically enhances the notes you have with the transcript.

789
00:59:02,031 --> 00:59:04,691
And you can also chat with your notes as well.

790
00:59:04,691 --> 00:59:06,951
It also has MCP tools, so you can connect it to Claude.

791
00:59:06,951 --> 00:59:09,171
So if you say, Claude, what did we talk about yesterday?

792
00:59:09,171 --> 00:59:10,471
It will be able to fetch it.

793
00:59:10,471 --> 00:59:15,331
So it's a very nice product, I would say, from the UK.

794
00:59:15,332 --> 00:59:17,452
And yeah, I think, yeah.

795
00:59:17,618 --> 00:59:21,479
Again, I kind of like Granola, personally.

796
00:59:21,539 --> 00:59:24,199
I don't know if everyone else here used it, but yeah.

797
00:59:24,199 --> 00:59:25,219
What do you think, Bart?

798
00:59:25,221 --> 00:59:26,102
Yeah, I like it as well.

799
00:59:26,102 --> 00:59:30,447
I think, of course, I'm a bit biased, because what we're building with Top of Mind is a bit sideways related.

800
00:59:30,447 --> 00:59:37,794
But what Granola does very well in this age of AI is that you can capture information
very frictionlessly.

801
00:59:38,255 --> 00:59:41,238
I think that's what they really excel at, like they're

802
00:59:41,238 --> 00:59:44,898
out of the way, but they do capture every day all the meetings that you have.

803
00:59:44,978 --> 00:59:46,658
And it takes a concern away.

804
00:59:46,658 --> 00:59:49,277
Like, I need to take notes? No, Granola is capturing it.

805
00:59:49,277 --> 00:59:51,538
So you can look at them later.

806
00:59:51,539 --> 00:59:57,739
And they have been highly, highly efficient at building this, at growing this.

807
00:59:57,739 --> 01:00:02,619
It's actually a very impressive story as well.

808
01:00:02,620 --> 01:00:06,840
The history is a bit that they raised 4.25 million.

809
01:00:06,840 --> 01:00:21,341
I want to say in 2023, before they had any users, like just two months after they were
created, and then they built for one year a bit in stealth mode, came out, and then basically

810
01:00:21,341 --> 01:00:28,381
34 months after that initial 4.25, they're valued at 1.5 billion.

811
01:00:28,381 --> 01:00:30,801
I mean, it's crazy, yeah?

812
01:00:31,104 --> 01:00:35,424
I think that we should also challenge these valuations.

813
01:00:35,424 --> 01:00:40,864
The company was valued at 250 million, now 1.5 billion.

814
01:00:40,864 --> 01:00:46,084
It would be interesting to see how they compute it also in the cap table.

815
01:00:46,085 --> 01:00:46,366
True.

816
01:00:46,366 --> 01:00:53,174
What I find very interesting, and I can't put my finger on why, but I don't see any advertisement
for them anywhere.

817
01:00:53,174 --> 01:00:57,478
But everybody knows, everybody that I talk to knows about Granola.

818
01:00:57,939 --> 01:01:01,293
So what I also would like to see is like how many users do they actually have, right?

819
01:01:01,293 --> 01:01:02,040
I think.

820
01:01:02,040 --> 01:01:07,883
And I do think actually like they're very viral and they're your user base very quickly.

821
01:01:07,883 --> 01:01:11,766
But I think there's challenges to convert those free users to paid users.

822
01:01:11,766 --> 01:01:16,828
And that I would be very interested in to see how well they're doing there.

823
01:01:16,828 --> 01:01:23,100
Because I'm still a free user, I've been using them for months and they keep bugging me
that I need to go to pro, but they're not limiting my functionality.

824
01:01:23,100 --> 01:01:28,032
Like I think they were afraid to start blocking users with the risk of actually losing
them.

825
01:01:28,033 --> 01:01:36,118
Yeah, what I noticed recently is that they're trying to add a time limit, like if a
meeting is older than one month, then you lose it.

826
01:01:36,118 --> 01:01:38,420
So I think they're starting to do these things, I agree.

827
01:01:38,420 --> 01:01:40,230
I think it's a very sticky thing.

828
01:01:40,230 --> 01:01:42,051
It's very easy, like you said, it's very frictionless.

829
01:01:42,051 --> 01:01:43,441
So it's very sticky.

830
01:01:43,441 --> 01:01:48,014
So I feel like it's very easy to get started with this and it's very easy to rely on it,
to come to expect it.

831
01:01:48,014 --> 01:01:50,176
You know, like if you don't have it, then you're gonna miss it.

832
01:01:50,176 --> 01:01:55,368
But I agree, they're also being very generous, like it was free for a
long time and you could do whatever.

833
01:01:55,368 --> 01:01:58,929
And now they're trying to limit it a bit with expiration dates on the older meetings.

834
01:01:58,990 --> 01:02:03,453
And every time you create a new workspace, they give you one month of pro for free.

835
01:02:03,453 --> 01:02:11,326
So it's like, if you just keep changing workspaces and migrating your meeting notes once a
month, you can just use it for free, you know. So I think they're going to tighten it up,

836
01:02:11,326 --> 01:02:13,778
but yeah, let's see how many people

837
01:02:13,778 --> 01:02:14,867
they lose along the way.

838
01:02:14,867 --> 01:02:17,720
Yeah, but they also announced a public API.

839
01:02:17,720 --> 01:02:25,995
So I think also they want to move from being an end-user app to being just
infrastructure as such.

840
01:02:25,995 --> 01:02:26,931
It can also be that

841
01:02:26,931 --> 01:02:34,849
the play that they're doing is that they will become your repository of all your
personal data, which will become very valuable, right?

842
01:02:34,849 --> 01:02:41,056
Like also if you can share this with other AI assistants and those kinds of things, I
think that is the play that they're trying to do, which makes sense.

843
01:02:41,056 --> 01:02:42,307
And this is the endgame.

844
01:02:42,308 --> 01:02:47,307
Yeah, and maybe for another time, you mentioned it's tangential with Top of Mind.

845
01:02:47,307 --> 01:02:51,972
But I'll be very curious to hear your thoughts particularly on this part, but we'll leave
it for another.

846
01:02:54,136 --> 01:02:55,310
Yeah, another day.

847
01:02:55,310 --> 01:02:59,246
time because I will actually have to run to my next meeting.

848
01:02:59,246 --> 01:03:00,246
All right.

849
01:03:00,288 --> 01:03:04,233
But then I think we stop here today.

850
01:03:04,695 --> 01:03:05,396
Thanks, everyone.

851
01:03:05,396 --> 01:03:06,338
Welcome, Rafael, again.

852
01:03:06,338 --> 01:03:07,888
em

853
01:03:07,888 --> 01:03:08,320
Raphael.

854
01:03:08,320 --> 01:03:13,706
It was fun having you on in the discussion, and looking forward to building Bright Signal.

855
01:03:13,981 --> 01:03:17,506
Looking forward to it and looking forward to the other format as well.

856
01:03:17,969 --> 01:03:18,611
Exactly.

857
01:03:18,611 --> 01:03:22,628
I think next week we will also take the Easter break, right?

858
01:03:22,628 --> 01:03:26,187
And then hit the ground running.

859
01:03:26,187 --> 01:03:30,067
I hope so, we need to see the practicalities, but the next time we're back, it's for an
interview.

860
01:03:30,538 --> 01:03:30,924
Yes.

861
01:03:30,924 --> 01:03:31,576
I think so.

862
01:03:31,577 --> 01:03:36,295
Alright, thanks everyone, thanks Bart, thanks Rafael, thanks everyone for listening.

863
01:03:36,418 --> 01:03:37,460
Ciao!

864
01:03:37,807 --> 01:03:38,749
Ciao.