TBPN

Diet TBPN delivers the best of today’s TBPN episode in 30 minutes. TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays 11–2 PT on X and YouTube, with each episode posted to podcast platforms right after.

Described by The New York Times as “Silicon Valley’s newest obsession,” the show has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella.

TBPN is made possible by:

Ramp - https://Ramp.com

AppLovin - https://axon.ai

Cisco - https://www.cisco.com

Cognition - https://cognition.ai

Console - https://console.com

CrowdStrike - https://crowdstrike.com

ElevenLabs - https://elevenlabs.io

Figma - https://figma.com

Fin - https://fin.ai

Gemini - https://gemini.google.com

Graphite - https://graphite.com

Gusto - https://gusto.com/tbpn

Kalshi - https://kalshi.com

Labelbox - https://labelbox.com

Lambda - https://lambda.ai

Linear - https://linear.app

MongoDB - https://mongodb.com

NYSE - https://nyse.com

Okta - https://www.okta.com

Phantom - https://phantom.com/cash

Plaid - https://plaid.com

Public - https://public.com

Railway - https://railway.com

Restream - https://restream.io

Sentry - https://sentry.io

Shopify - https://shopify.com/tbpn

Turbopuffer - https://turbopuffer.com

Vanta - https://vanta.com

Vibe - https://vibe.co


Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive


Speaker 1:

Last time we had Mark Cuban on the show, we were debating ads in LLMs. And since then, we've gotten a bunch of data points about ads in LLMs, and I think some of his takes have probably aged well. Some of our takes have probably aged well. It'll be an interesting time to reevaluate what's actually happening. There have been a lot more data points.

Speaker 2:

I don't know, John.

Speaker 1:

We just have more evidence.

Speaker 2:

You said that ads would be fine.

Speaker 1:

Well

Speaker 2:

And now, the world is ending.

Speaker 1:

Yes. Here's a white pill. Samsung is investing $70,000,000,000 to advance their fab capacity. They're getting back in the AI chips game. They've always been in the AI chips game.

Speaker 1:

So brief history of Samsung. You probably know them from the phones, from the TVs. They, of course, are a major player in HBM, high bandwidth memory. They are a massive company, over a quarter million employees. They're close to touching a trillion dollars in USD market cap.

Speaker 1:

They pull in around 200,000,000,000 USD a year in revenue, maybe $250,000,000,000 this year. Really good. All that's USD. I like to think in USD because I'm an American. They're the global leaders in memory and OLED displays as well.

Speaker 1:

So a lot of the displays that you see in other electronics, even if it has a different brand name, it's still Samsung actually making that OLED display. But they're second in smartphones to the iPhone and Apple, and they're second in the semiconductor foundry business to TSMC. Semiconductors still make up 30 to 40% of their business, and they supply HBM to NVIDIA for the H100 and Blackwell systems. So it's not like they're sitting out the AI bull market.

Speaker 1:

They are definitely participating. They're incredibly important in the AI build out. But if TSMC is bottlenecked, and TSMC is sort of risk off and they're not going to be, you know, guiding to, like, insane CapEx numbers while every American hyperscaler is, well, that creates an opportunity for Samsung. And so Samsung is stepping up. They're announcing that, hey, we're gonna put another 70,000,000,000 to work on this particular business. So Tesla has been working with Samsung on the foundry side in AI for a while.

Speaker 1:

So Samsung's never really been on the frontier with a direct competitor to the H100 or the Blackwell chip. That's been more of, like, AMD's game, and AMD also fabs at TSMC. So there hasn't really been this, like, neck and neck battle between TSMC and Samsung. But it's like, you can do AI inference on a Samsung chip, and we know that because Tesla went to Samsung years ago and said, we need a chip that can take in pictures from the road, decide where the lines are...

Speaker 2:

They want their chips with the dip.

Speaker 1:

They want their chips with the dip, and that's what Samsung does too. And so the FSD system, if you have a Tesla, you might be familiar with, like, HW3, Hardware 3. That has been deployed into millions of cars, and it was fabbed on Samsung's 14 nanometer process, which is a lagging node. We're not in the three nanometer, the crazy frontier stuff, but it's working and it's on the road.

Speaker 1:

And according to a US regulatory probe, there were 3,200,000 vehicles, Teslas, on the road in America with FSD systems that were basically all running Samsung chips inside. Now, to be clear, Tesla, just like any foundation model lab, has training and then also inference. They're a little bit different than many of the labs that you know and love because they do training in a data center using what's called the Dojo chip, and that is fabbed at TSMC. So they train the system. They take all the data in from every Tesla camera, every road, all the information that they have.

Speaker 1:

Every time that there's a disengagement, that's feedback to the reinforcement learning system. It says, hey, we were in FSD mode, but then someone grabbed the wheel or someone stepped on the brake. You made a mistake. Understand what happened to get you to that point where you made that mistake. And so all that data gets collected in the Tesla data center, runs on these Dojo chips. They do the training, and then they deploy the model onto the Samsung chips in the actual cars.

Speaker 1:

So with the backdrop of NVIDIA's massive GTC news cycle, they've done so much press around GTC and so many different launches. You know that NVIDIA is just gonna suck a lot of the air out of the semiconductor discussion this week. Out of the clean room. Out of the clean room, yes, which is recycling all of the air every three seconds or something like that.

Speaker 2:

Yeah. But I think this is particularly important, especially this morning. I guess the CCP put something out in the last twenty-four hours basically saying, hey, Taiwan is gonna have an energy crisis

Speaker 3:

Yeah.

Speaker 2:

Due to the broader global energy crisis. So, we need to reunify. There's an opportunity for peaceful reunification.

Speaker 1:

But peaceful reunification, even if it was completely peaceful and all the Taiwanese people just say, hey, we wanna be part of China, they all vote for it democratically, that's going to be rough for the American chip buying industry. And so having another chip on the board, metaphorically, to make physical chips is probably a good thing. So, yeah, Samsung has been doing well over the last five days.

Speaker 1:

Stock's up 11% during a time when the NASDAQ is down 2.2% and geopolitical tensions continue to rise. The compute bottleneck, we know it's important. We've been discussing this constantly, and it's going to be very constraining over the next few years. So every increase in CapEx in the supply chain is a step in the right direction. And so Samsung gets the first gong hit of the day for making a big bet.

Speaker 2:

Cursor is out with Composer two. Composer two. It is frontier level at coding, priced at 50¢ per million input tokens and $2.50 per million output tokens. They also have a fast version. They say we're able to significantly improve the model quality and cost to serve.

Speaker 2:

These quality improvements come from our first continued pre-training run, providing a far stronger base to scale our RL.
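As a back-of-envelope illustration of those listed prices, here is a quick cost sketch; the token counts in the example are made up for illustration, not Cursor figures:

```python
# Cost sketch at the quoted Composer prices:
# $0.50 per million input tokens, $2.50 per million output tokens.
INPUT_PRICE = 0.50 / 1_000_000   # dollars per input token
OUTPUT_PRICE = 2.50 / 1_000_000  # dollars per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single request at the quoted rates."""
    return input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE

# A hypothetical coding request: 20k tokens of context in, 2k generated.
print(f"${request_cost(20_000, 2_000):.4f}")  # $0.0150
```

At those rates, output tokens dominate only when generation is long relative to context, which is rarely the case for coding agents.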

Speaker 1:

It's not one of these graphs that's just like, oh, look, we made some arbitrary x and y axes, and we're in the top right corner, of course, because the axes are good and cool.

Speaker 2:

Yeah, TBPN bench is like technology podcasts

Speaker 3:

Yes.

Speaker 2:

Publish at least three hours of content every week. Yes.

Speaker 1:

So naturally Exactly.

Speaker 2:

Naturally, we are

Speaker 3:

We're right at

Speaker 2:

the top. And actually, there's no one else on there.

Speaker 1:

Yes. But yeah, I mean, this seems fair. It is a little bit odd to read this because the cost is on the x-axis, but it's inverted. So the further you are to the right, the cheaper you are, which makes sense because people associate an x and y graph with wanting to be in the top right quadrant, and they certainly are. And it does seem like, in terms of this Pareto frontier, you want to be on the frontier, you want to be pushing out across every single curve.

Speaker 1:

Maybe if you are interested in sparing no expense, you'll go with the GPT 5.4 High or Medium model, and you can align Cursor to GPT. I'm sort of surprised that Opus is not doing as well on CursorBench. That feels surprising based on the general vibes around Opus 4.6. But Cursor has specific needs for specific customers. And I don't know.

Speaker 1:

What else do you think is going on here?

Speaker 4:

Yeah. I mean, the cost is really big. Like, this is basically like 10 x cheaper than

Speaker 2:

Yeah.

Speaker 4:

An Opus. So I think also, Cursor has kind of been, like, not really a dark horse, like, everyone knows about it. But in the coding race, it's like, everyone's like, okay, there's Codex versus Claude Code. Yep. If you imagine that Claude Code and Codex are kind of like these environments for getting a ton of, like, really good data for training coding models.

Speaker 4:

Like, Cursor's had that for way, way longer than OpenAI or Anthropic.

Speaker 3:

Yeah.

Speaker 4:

So you should imagine that, like, at least, you know, in the near term, like, they actually have, like, really, really good data

Speaker 1:

Yeah.

Speaker 4:

That they can, you know, train these good models on. And, obviously, like, this is a very specific model. They've said it, like, you're not gonna write poems with this model. It's this very specific, almost, like, point solution model where it's

Speaker 2:

Don't listen to them, Tyler. Write a poem with the model. Poem bench.

Speaker 1:

Poem bench. Yeah. I would be interested to know, like, how many sacrifices were made, because at a certain point, like, I remember talking to an AI researcher, actually a semiconductor researcher, who was saying that he actually does believe that importing, like, the Odyssey and, like, Homeric epics is key to humanoid robots learning to walk.

Speaker 4:

Yeah. Well, I think, like, if you look back at just the general history of, like, machine learning and AI, the lesson is that big general models always beat these small specific models. Yes. But if you kind of zoom in on the time scale, you can still train GLM, some open source model, on a very specific task like accounting or something. And it'll climb, and you can actually make it better than the frontier models right now.

Speaker 1:

At that specific thing. Especially at cost. Yeah. Especially at cost.

Speaker 3:

Yeah. Yes.

Speaker 4:

Yeah. Very much so. But, like, in the long term, if you zoom out, it seems like it's basically always gonna be these big, you know, general models.

Speaker 2:

But

Speaker 1:

And I wonder if that's true. I mean, we talk about this a lot, where the big general model outperforms the smaller model. But at the limit, like, if you were to think about a Python if statement, just, like, flow control that is truly deterministic: if you piped the same question as the if statement, like, is this number bigger than this number, into 5.4, it's gonna get it right all the time. It's gonna be very expensive compared to an if statement, which takes, like, no compute whatsoever.

Speaker 1:

But the if statement is 100% accurate.
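The flow-control point can be sketched in a few lines of Python; the `client.ask` interface below is hypothetical, included only to illustrate the cost asymmetry:

```python
# A plain comparison: deterministic, effectively free, correct every time.
def is_bigger(a: float, b: float) -> bool:
    return a > b

# The same question routed through a model (hypothetical client object):
# tokens, latency, and a nonzero chance of error, for an answer the
# inputs already fully determine.
def is_bigger_via_llm(client, a: float, b: float) -> bool:
    answer = client.ask(f"Is {a} bigger than {b}? Answer yes or no.")
    return answer.strip().lower() == "yes"

print(is_bigger(7, 3))  # True
```

The deterministic path wins whenever the question is fully specified by its inputs; the model only earns its cost on questions that aren't.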

Speaker 2:

The legendary poster sent cop says, all shits and giggles on that headline till Anthropic or OpenAI decide to cut off their access to Cursor, referencing the Bloomberg article, Cursor is taking on Anthropic and OpenAI with a new AI coding model.

Speaker 1:

Would that matter? Like, at this point, if they have Composer two, and it's a small model, but it's good at writing code, it performs well on CursorBench, and the Cursor users are satisfied with the Composer two model, and Cursor does get their access cut off. And when you install Cursor, you roll it out to your organization, you just get Composer two. And you know what?

Speaker 1:

It's, you know, maybe there's taste that wouldn't pull you.

Speaker 2:

At this point right now, I don't think we have any visibility into how much of Cursor's revenue is tied to using OpenAI or Anthropic models. George says, I'm hearing tons of complaints from Cursor customers at enterprise companies. A silent change put almost all models Cursor uses behind Max Mode. Devs used to manage to spread out monthly credits over a month, and now see all of it used up in one to two days

Speaker 1:

Oh, interesting.

Speaker 2:

They are furious and switching.

Speaker 1:

It does feel like there's a little bit of like an economic war here. Yeah.

Speaker 2:

And this is what came up earlier this month around the labs sort of subsidizing.

Speaker 1:

Yeah. You know who's

Speaker 2:

They're not in an easy position, but they're such a talented team. Nikita says, we're rolling out summaries for articles now. Just tap the summarize button if you want to know if it's worth your time to read it. Yes. Yes.

Speaker 2:

And yeah, it's basically, Grok, turn this into a regular... I am excited about the listen button. I've had this, you know, on my commute. There are so many moments where I'm like, I wish I could just have somebody read this.

Speaker 1:

I actually wound up doing this with a number of Will Manidis' long form essays. I would copy them, put them into ElevenReader from ElevenLabs, and have it read to me in sort of a

Speaker 2:

Silly voice.

Speaker 1:

A silly voice, yes. It was a good time.

Speaker 2:

Well, I was actually trying to use the X app to just take an article, paste it into Grok, and say, hey, can you read this to me? It said, cannot find the post.

Speaker 1:

This is a response to, you know, every article people would post, people would always say, Grok, summarize this. And now there's just a button. I recently learned that you can only ask Grok, like, at Grok, is this true, if you're paying for X.

Speaker 1:

Sort of underrated how well X has seemingly... I don't know how big the subscriber base is, but that was a crazy idea, to have a paid social network.

Speaker 2:

I think it's because people are deeply addicted to X. It is very valuable to them to be on there, to participate. And the paid functionality, the way that it was marketed and the way that it generally worked, was like, you were gonna have a bad time on X.

Speaker 2:

Yeah. Like, if X was valuable to you and you didn't pay the $10 a month, it was gonna be, like, significantly less valuable to you. Depending on what kind of business you're running or what you use X for, it might be the equivalent of, like, losing thousands of dollars a month of value, or you could just pay the $10. Yep. So it was a good trade.

Speaker 2:

Yep. But

Speaker 1:

It was also just weird how the targeting never seemingly got dialed in to the point where you could actually target the CEOs of companies who were on X. Like, I mean, you see Travis Kalanick on X, like, replying to things. He's raising money, he's growing a business. Like, there's a lot of value in advertising to him because he's gonna be picking a corporate card soon, or he probably already has, or he might be in that market. He might be picking a payroll suite.

Speaker 1:

Like, there's all these things where, if you could deliver that to that audience, it would be incredibly valuable, and the CPM should be, like, through the roof. But I think for privacy reasons and a variety of other reasons, really monetizing that long tail has been very difficult across every platform. So they've just gone with scale, and the products that have sold the most on social networks have been very broadly marketed. The criticism that we saw from the Oscars is always, like, YouTube ads are generic. It's just, like, for

Speaker 1:

a pillow, or, like, injury, or, like, something that applies to every single person. But there's always this, like, hyper targeted opportunity there.

Speaker 2:

Yeah. The other thing is, the paid program with X has seemingly worked.

Speaker 3:

Mhmm.

Speaker 2:

In that we know a lot of people that happily pay and have no plans to churn. But it would be a failure in the context of, like, Meta scale. Right? I think the last reported number that I saw was something like one to one and a half million paid subs at $10 a month.

Speaker 1:

Oh, on x?

Speaker 2:

Yeah. So you're talking about somewhere in the range of 100 to 200,000,000 of, like, ARR. If Zuck had launched a product like that, he would just wind it down. Right? Reels went from zero to 50,000,000,000
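Those figures line up with the subscriber estimate mentioned above; a quick check (all numbers are the show's estimates, not official X figures):

```python
# 1 to 1.5 million paid subs at $10/month, annualized.
annual_price = 10 * 12             # $120 per subscriber per year
arr_low = 1_000_000 * annual_price
arr_high = 1_500_000 * annual_price
print(arr_low, arr_high)  # 120000000 180000000, i.e. the ~$100-200M ARR range
```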

Speaker 2:

Of run rate in, like, a handful of years. Right? That's what a home run looks like. And so I think it makes sense for X, but it certainly is not a home run from a, you know, consumer application standpoint.

Speaker 2:

And they still need the, you know, the overall business.

Speaker 1:

Olivia Morrison: a big story that most people are missing in the AI race for the consumer, ChatGPT versus Claude, is ads. Right now, most consumer AI revenue is coming from power users who are willing to pay high subscription costs. This currently skews positive for products like Claude, but this will not be the end state. Google makes $460 per user per year in the United States, mostly on ads. I didn't know that their ARPU was so high.

Speaker 1:

Meta makes around $250. I mean, I guess those Google ads are really, really valuable, and it's so intent driven that it makes sense. ChatGPT's ad based ARPUs will be even higher, as they will ultimately have deeper, more frequent user engagement. Even at the $460 level, monetizing everyone in the US via ads is 152,000,000,000 in annual revenue. By contrast, if you're able to monetize even 5% of the population at a $200 a month subscription, which is a stretch, that's only 40,000,000,000.
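The arithmetic behind those two totals checks out, taking a US population of roughly 330 million (my assumption; the ARPU and subscription figures are from the post):

```python
us_population = 330_000_000

# Ads path: Google's quoted US ARPU of $460/user/year, applied to everyone.
ads_total = us_population * 460
print(f"${ads_total / 1e9:.0f}B")   # $152B

# Subscription path: 5% of the population paying $200/month.
subs_total = us_population * 0.05 * 200 * 12
print(f"${subs_total / 1e9:.0f}B")  # $40B
```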

Speaker 1:

That's actually a crazy difference. Because a $200 a month subscription is, like, super high. Like, you know, you're talking 20 times Netflix or something else that's, you know, premium and really important.

Speaker 2:

Yeah. The $200 subscription at the time was crazy. But even at that point, some of the people that were more AI-pilled were like, oh, it's actually possible that someday you could spend $20,000 a month.

Speaker 1:

I was like, give me the $20,000 a month. So she says, I suspect this will be even more drastic outside of the United States, where users are even less willing to pay directly for subscriptions. And the earliest data from a very small rollout shows ChatGPT ads are already outperforming in effectiveness. It just gets better over time. So interesting.

Speaker 1:

It's an interesting story.

Speaker 2:

Apple is way behind in AI and still making a fortune from it. Let's see.

Speaker 1:

It begs the question, are they actually behind?

Speaker 2:

Its AI revenue is set to top 1,000,000,000 this year, reassuring investors wary of rivals' sky-high spending.

Speaker 1:

And keep in mind have a

Speaker 2:

We have a chart here showing gross revenue from Gen AI apps as well as Apple's commission.

Speaker 1:

Look at this. The beginning of 2025 was really the boom of Gen AI app growth. 400,000,000. Is this monthly App Store revenue? Wow.

Speaker 1:

They're really cooking. And then sort of a flat line.

Speaker 2:

Yeah. It's so interesting that it actually dropped.

Speaker 1:

Well, we did read that article a few days ago about how Apple has been pushing back against some of the vibe coding apps. And there's this question about where the bounds are. Obviously, Apple's had pretty strict App Store rules around adult content and what else you can do, even just the app reconstituting itself, pushing changes, because they wanna review every line of code that goes in the App Store. If someone's pushing ten, twenty, thirty thousand lines of code a day, that's a lot of code for Apple to review; it's gonna slow things down. So that could be a little bit of what we're seeing.

Speaker 1:

Maybe they've capped out on their ability to review all the vibe coded apps that are flooding the app store.

Speaker 2:

Apple's on pace to surpass 1,000,000,000 in AI revenue this year, a tidy sum that demonstrates the company's AI advantage even as it struggles to deliver an AI strategy of its own. Its Siri chatbot is still weak by modern AI standards. What Apple does have that the other AI players don't is a dominant position making devices. However fancy OpenAI, Google, Anthropic, and xAI make their chatbots, iPhones are still a primary way to deliver them to customers.

Speaker 2:

That means they typically pay the App Store tax: roughly 30% of subscription fees in the first year and 15% a year thereafter. The rates vary. Gen AI apps paid Apple nearly $900,000,000 in App Store fees in 2025. Almost 1,000,000,000 of revenue and very, very little CapEx. Three fourths of the revenue Apple rakes in from Gen AI apps in its App Store comes from ChatGPT.
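The commission structure described there (roughly 30% of subscription fees in year one, 15% thereafter; the article notes actual rates vary) can be sketched as:

```python
def app_store_cut(monthly_fee: float, months: int) -> float:
    """Apple's cumulative cut of one subscription over `months` months."""
    total = 0.0
    for month in range(months):
        rate = 0.30 if month < 12 else 0.15  # 30% year one, 15% after
        total += monthly_fee * rate
    return total

# A hypothetical $20/month subscriber kept for two years:
print(app_store_cut(20.0, 24))  # 108.0 (72.0 year one + 36.0 year two)
```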

Speaker 2:

Woah. Next, about 5% is xAI's Grok. Cooking. There we go, Grok.

Speaker 1:

Cooking. I mean, there's so many different, like, funnels. They did the essay competition. They did the video competition. And I've talked to people that are just, like, in the Apple ecosystem, in the Tesla ecosystem.

Speaker 1:

So they're, like, yeah, I talked to Grok on my way to work. I'm not kidding.

Speaker 2:

Yeah. Grok in the iPhone App Store did 12,000,000 last month.

Speaker 1:

Yeah. And I know the true, like, AI heads will be like, Grok's behind on this benchmark or model or whatever. Tyler, is that a correct characterization?

Speaker 2:

Yeah. Grok did more revenue last month than Claude in the iPhone App Store.

Speaker 1:

I've started having conversations with it. I mean, I'm using ChatGPT, but I wanted to get up to speed on Taiwan and just, like, what was the reason for the original civil war and stuff. And so I was just having a conversation back and forth. At no point was I like, oh, this really needs to be, like, you know, GPT 5.4 Pro. It's like, these are things that exist just, like, with one search to Wikipedia, or it's probably baked into the weights of 3.5. So, like, if I'm just going to be chatting with someone who's, like, reasonably smart, I would say Grok is there.

Speaker 1:

And so what do you think?

Speaker 4:

Yeah. But you like, you could be talking to someone who's really, really

Speaker 1:

No. Like, not if you're asking, like, basic knowledge retrieval questions that, like, any model's gonna one-shot and just be right.

Speaker 4:

Yeah. You're just describing stuff that you could just, like, actually Google.

Speaker 1:

But I can't Google via voice in my car on the drive. And for someone who's driving a Tesla and has a Grok integration right there, they're just, sure. Like, this

Speaker 4:

is great. Okay. Yeah. That's fair. But, like, I don't think those people have actually tried, like, GPT 5.4 Pro.

Speaker 1:

It is good, but it's slow. And truthfully, like, you can fire off the exact same query to 5.4 Pro and 5.4 and 5.4 Fast. And if the query is simple enough, the answer will be exactly the same.

Speaker 1:

Because if I ask 5.4 extended thinking, like, what is the capital of California, it thinks for ten minutes, and it just tells me, like, Sacramento.

Speaker 2:

See? There you go. That's why you need to think. I told you.

Speaker 1:

I run my life on GPT-2. I hallucinate a lot.

Speaker 2:

People have said I have the mind of GPT-2.

Speaker 1:

It's true. It's true. Anyway, let's continue. Apple's revenue from generative AI apps rose from about 35,000,000 in January to a high of 100,000,000 in August. Do nothing win.

Speaker 1:

Do nothing win. Sales have fallen from their peak partly because ChatGPT downloads have declined, according to the data. As a proportion of Apple's total sales, $1,000,000,000 is small, yet Gen AI apps are the growth driver for Apple's services business, which investors have focused on in recent years because it has grown faster than device sales and boasts higher profit margins. Apple's dominant share at the top of the smartphone market affords it another luxury: time to get its own AI strategy right. So they're making money while they figure everything else out.

Speaker 1:

Apple's AI plan runs counter to the strategies of competitors that are spending hundreds of billions of dollars on chips and data centers to build frontier language models. Apple is spending a fraction of that, aiming instead to use all of the personal information people store on their iPhones, together with the chips that it designs itself, to power an on-device AI strategy. If they can act as a toll road for providers of AI, then they'll probably end up looking good long term for not having the big CapEx overhang. I have to imagine that Apple is not capturing any revenue from enterprises, developers, Claude Code, Codex, any of those developers. They're probably not.

Speaker 1:

Even if they are winding up using, like, a ChatGPT subscription in Codex, they're probably setting that new subscription up on desktop.

Speaker 2:

Toll road on the actual

Speaker 1:

Yeah. Side. But it's a toll road on consumer, which is consumer sales. All the more reason to get into ads, honestly, because Apple does not tax those.

Speaker 2:

And AI is exciting for Apple because they need a new product that they can just randomly bill you, like, $2.99 anytime they need a little cash. What are you talking about? Cash.

Speaker 1:

$2.99?

Speaker 2:

Like, don't you just get random bills from Apple, like, here and there for $2.99?

Speaker 1:

Like, $2.99?

Speaker 2:

Yeah. Like, I feel like every time I check my email, it's like, Apple has charged you, like, some random amount for something. For some subscription. In other news

Speaker 1:

Okay.

Speaker 2:

Rolls-Royce has scrapped plans to go all electric by 2030 as, quote, drivers prefer V12 engines. Would you look at that?

Speaker 1:

Look at

Speaker 2:

I mean, and this is just total shock. Total shock. Yeah? Total shock. Yeah.

Speaker 2:

Drivers totally had to experience, you know, EVs forced upon them for the last few years to know that they preferred combustion engines after all. Of course, I'm kidding.

Speaker 1:

Elon has been saying that the Roadster reveal will blow your mind. If it has a V12...

Speaker 2:

We've painted man. We've been

Speaker 1:

People are going crazy. If he drops a V12, that would completely break the Internet. Yeah. It'd be incredible.

Speaker 2:

Let's talk about this Tesla that you were following yesterday.

Speaker 1:

Oh, yes. Did you drop this in the chat already? I sent it to you. No. We shouldn't share the actual picture.

Speaker 1:

I saw a Tesla that was a very funny mix: it had the anti-Elon club on it, but also an 1199 license plate. And it was a Plaid, and it just, like, mixed every possible political ideology together.

Speaker 2:

And and it had a vanity plate

Speaker 1:

It was very sci fi.

Speaker 2:

Sci-fi. It was like, I do want to go to Mars, but not with Elon.

Speaker 3:

Yep.

Speaker 2:

The license plate basically said, beam me up.

Speaker 1:

Beam me up.

Speaker 2:

So they want to go to Mars, but not with Elon. They support

Speaker 1:

They have an incredible amount of disposable income based

Speaker 2:

on Law enforcement. They enjoy high trim levels, but they do not agree with Elon's actions.

Speaker 1:

Well, maybe they work for a rival AI lab or something. And so they're extremely sci-fi-pilled, but they just feel like they're competing with X. Yeah.

Speaker 2:

California has now spent over $100,000,000 on a new bridge to nowhere. It is a wildlife bridge, which I've driven by hundreds of times. I've been seeing it. I've been experiencing the traffic that it causes.

Speaker 2:

I'm not against the concept of a wildlife bridge. In fact, I think it's fantastic.

Speaker 1:

It does feel like

Speaker 2:

But it is a

Speaker 1:

concrete jungle, this is beautiful. Totally. This has a lot of opportunity to actually improve the visual aesthetics of this particular part of the state.

Speaker 2:

Caleb Hammer says, bro, this state cannot be real.

Speaker 1:

Isn't isn't Caleb Hammer

Speaker 2:

It's very real.

Speaker 1:

Isn't Caleb Hammer he's like a finance

Speaker 2:

Yeah. He's got like the the number one.

Speaker 1:

He's like the one person you'd come to to be like, should I spend $100,000,000 on a bridge? And he would say

Speaker 2:

Something like that. And it's actually quite a bit more than 100 at this point. And the funny thing is, it's just kind of a bridge, but it's lacking the entrances to the bridge. I feel like it's basically just, like

Speaker 1:

just a little bit of wood to, like, smooth it out so that it looks like there's at least going to be the start of a ramp to get on the bridge. Like, the bridge looks solid. The actual center part looks solid. It doesn't feel that hard to finish this bridge. I'm optimistic that this gets done in the next hundred years, tops.

Speaker 2:

Apparently, Colorado built a wildlife bridge for a low cost of $15,000,000. No. That's not bad. Like, functionally, something very similar. The interesting thing is, apparently, the bridge is in some part for cougars.

Speaker 2:

Cool. And the wild thing is, on one side of the bridge, you have a bunch of, like, residential homes. And on the other side, you have a bunch of cougars.

Speaker 2:

The cougars are now gonna be able to basically hang in all the backyards. So we'll see how this goes. But I'm excited for this to be finished up.

Speaker 1:

Mhmm. What else is Anthropic doing? They're hiring for a policy manager who will be in charge of chemical weapons and high yield explosives. This reads like you're going to be building high yield explosives, which sounds like an unusual job posting, but it is in fact for a policy manager who will hopefully be stopping people from

Speaker 2:

No. No. No. I read this as somebody whose job it is to decide how Claude is used to create chemical weapons and high yield explosives.

Speaker 1:

I think it's probably like this person decides, like, where's the edge? If you're asking, like, okay, I have a firework, and I wanna make sure it doesn't go off. Like, should I, you know, throw it in the trash or put it in the recycling or take it to a special place? Like, Claude should answer that. But if you go to it and you ask it, like, how do I build C4 or something like that, like, there's all these, like, policy edges where if you're talking about Counter-Strike and you say, like, let's plant the bomb, it shouldn't flag that as, okay, you're actually trying to plant a bomb.

Speaker 1:

Like, you know, you're asking about a video game. We know how to interpret that appropriately. But there needs to be, like, a human in the loop to decide, like, where that frontier is and how that particular decision tree comes out.

Speaker 2:

Martin Shkreli.

Speaker 3:

What does he say?

Speaker 2:

He's coming on Monday Okay. For the great debate.

Speaker 3:

The

Speaker 2:

great peptide debate. He says, good music is the moat of AI.

Speaker 1:

Mhmm.

Speaker 2:

And Lil Wayne had some thoughts on AI music.

Speaker 4:

Let's play

Speaker 2:

this clip.

Speaker 1:

Let's play this two minute clip. How you handle AI in

Speaker 3:

this business now? Challenge. The challenge?

Speaker 1:

Bro, this is wild.

Speaker 3:

I love it. AI is a beautiful thing. I love that AI is what it is. Yeah. Because, man, I love to be able to stand right next to whoever AI is.

Speaker 3:

He, she, they, whatever or whatever AI is, stand right next to it. Know I'm still better. Hey, man. I'm gonna keep telling you what you do again. Yeah.

Speaker 3:

Go go run your list. I do this. I do that. Yeah. Love it.

Speaker 3:

I love the challenge of it. The first time I seen it, my friends was a little worried. They was like, man, Roy, they got this AI stuff where you can just ask it to give you a verse like Lil Wayne. And so I did it. I said, well, give me a verse.

Speaker 3:

He gave me her best shot. Yeah. I did it on a couple of devices, asked her to give me one, and they all... y'all suck. I'm gonna be okay.

Speaker 1:

I fuck with that. Yeah. Beanie Sigel, I think he using it because he like

Speaker 3:

was losing his voice a little bit.

Speaker 2:

Yeah. Another rapper to mog.

Speaker 1:

Basically. That's his take. It's so funny.

Speaker 2:

That's great. There is some breaking news that we do gotta talk about. Jeff Bezos in talks to raise $100,000,000,000 for AI manufacturing fund. Amazon founder traveled to The Middle East and Singapore in fundraising effort linked to Project Prometheus.

Speaker 1:

That is incredible. Very We have the red lights going. Exciting. Breaking news. Advanced talks.

Speaker 1:

I don't care if it's just advanced talks. To Jeff Bezos.

Speaker 2:

He's meeting with some of the world's largest asset managers to raise funds for the project. A few months ago, he traveled to The Middle East to discuss the new fund with sovereign wealth representatives. It's being described as a manufacturing transformation vehicle.

Speaker 4:

I am against TK. Right?

Speaker 2:

Maybe. I mean, TK is not as directly focused on manufacturing. Like, this is a vehicle. Right? Mhmm.

Speaker 2:

No. No. You're saying manufacturing transport? Like, it's a vehicle, a fund for transforming manufacturing.

Speaker 1:

Oh. Yeah. Yeah. Yeah. Yeah.

Speaker 1:

Yeah. It's like an investing vehicle.

Speaker 2:

It's aiming to buy companies in major industrial sectors such as chipmaking, defense, the vehicle industry. Aerospace. It would dwarf the size of some of the world's largest buyout funds and rival SoftBank's $100,000,000,000 Vision Fund. I gotta wonder how much do you think Jeff is pitching in himself? He's like, I'm good for 30.

Speaker 2:

Yeah. You know, something in that range.

Speaker 1:

Yeah. Fine.

Speaker 2:

But this is such a white pill. We basically need to reindustrialize Yeah. America. Yeah. We're not gonna do it by just copying Yeah.

Speaker 2:

Everything from the past. There's some element of transformation that needs to happen as well as new efforts. Yeah. This is tremendous news.

Speaker 1:

Yeah. And I mean, there has been, like, a venture capital boom in reindustrialization, but most of the funds that we talk to that are in that category are $50,000,000, a couple $100,000,000, certainly nothing at this scale. And this has got to be incredible news for the founders that we talked to that are part of the reindustrialization effort. Thank you for watching. Leave us five stars on Apple Podcasts and Spotify.

Speaker 2:

It's an honor.

Speaker 1:

We'll see you tomorrow.

Speaker 4:

Going flash bang.

Speaker 1:

Goodbye.