TBPN

Sign up for TBPN’s daily newsletter at TBPN.com

  • (02:05) - Something Mini is Coming
  • (10:28) - Anthropic's $20B Round
  • (20:11) - 𝕏 Timeline Reactions
  • (28:51) - Where the WB Deal Stands Now
  • (36:44) - Higgsfield Accused of Ragebait
  • (49:20) - Ackman’s Meta Move
  • (01:03:54) - Bryan Johnson, a tech entrepreneur and founder of Braintree, is dedicated to extending human lifespan through his "Blueprint" protocol, which includes consuming three tablespoons of high-polyphenol extra virgin olive oil daily, accounting for 15% of his caloric intake. In the conversation, he discusses the health benefits of this regimen, emphasizing the importance of specific types of olive oil for longevity, and introduces his "Immortals" program—a comprehensive, data-driven health initiative designed to optimize biological age through personalized protocols and advanced monitoring.
  • (01:34:01) - Matthew Zeitlin, a correspondent at Heatmap News, discusses the significant energy demands of data centers, noting that large facilities can consume as much electricity as entire cities. He highlights the challenges utilities face in meeting this demand, including the need for substantial infrastructure upgrades and the potential impact on electricity prices. Zeitlin also addresses the environmental concerns associated with data centers, such as increased emissions and water usage, and emphasizes the importance of balancing technological advancement with sustainable energy practices.
  • (02:03:14) - Joon Sung Park, co-founder and CEO of Simile, discusses the company's development of a foundational model of human behavior capable of simulating society at both individual and population levels to predict outcomes. He highlights applications such as enabling Fortune 500 companies to interact with agents representing real people, allowing for more accurate market predictions and decision-making. Park also emphasizes the potential of these simulations to model entire markets or nations, aiming to enhance policy and product development through a deeper understanding of societal dynamics.
  • (02:15:15) - David Risher, CEO of Lyft and co-founder of Worldreader, discusses the company's record bookings and profits, attributing this success to a strong focus on customer satisfaction. He highlights Lyft's strategic initiatives, including international expansion and the integration of autonomous vehicles, particularly through a partnership with Waymo in Nashville. Risher also addresses the challenges posed by market volatility and emphasizes the importance of maintaining a clear focus on data and long-term goals amidst fluctuating stock prices.
  • (02:31:41) - Todd McKinnon, co-founder and CEO of Okta, discusses his journey from leading engineering at Salesforce to founding Okta in 2009, aiming to address identity management challenges in the emerging cloud computing era. He highlights the company's growth to a $3 billion annual revenue with 6,000 employees, emphasizing the increasing importance of identity management amid trends like AI and remote work. McKinnon also shares insights on the surge of interest in AI agents across industries, stressing the need for secure implementation and the evolving role of software engineers in a rapidly expanding software landscape.
  • (02:47:11) - Alexander Ksendzovsky, a neurosurgeon and neuroscientist, has been cultivating neurons on electrodes for two decades. He discusses leveraging the superior energy efficiency and adaptability of biological neurons to enhance artificial neural networks, aiming to address the energy challenges in AI computing. Ksendzovsky emphasizes that this approach is not science fiction but a current, deployable technology, with his company actively integrating biological networks to improve AI model performance and efficiency.
  • (02:55:15) - Andrew Huberman is an American neuroscientist and tenured professor of neurobiology at Stanford University School of Medicine, where he studies brain function, behavior, and visual system plasticity. He is the creator and host of the Huberman Lab podcast, which translates neuroscience and health research into practical tools for sleep, focus, fitness, and mental health. Huberman is known for bringing academic research to a broad audience through long-form interviews and solo deep dives on science-backed protocols.

TBPN.com is made possible by:

Ramp - https://Ramp.com

AppLovin - https://axon.ai

Cisco - https://www.cisco.com

Cognition - https://cognition.ai

Console - https://console.com

CrowdStrike - https://crowdstrike.com

ElevenLabs - https://elevenlabs.io

Figma - https://figma.com

Fin - https://fin.ai

Gemini - https://gemini.google.com

Graphite - https://graphite.com

Gusto - https://gusto.com/tbpn

Kalshi - https://kalshi.com

Labelbox - https://labelbox.com

Lambda - https://lambda.ai

Linear - https://linear.app

MongoDB - https://mongodb.com

NYSE - https://nyse.com

Okta - https://www.okta.com

Phantom - https://phantom.com/cash

Plaid - https://plaid.com

Public - https://public.com

Railway - https://railway.com

Restream - https://restream.io

Sentry - https://sentry.io

Shopify - https://shopify.com/tbpn

Turbopuffer - https://turbopuffer.com

Vanta - https://vanta.com

Vibe - https://vibe.co


Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays from 11–2 PT on X and YouTube, with full episodes posted to Spotify immediately after airing.

Described by The New York Times as “Silicon Valley’s newest obsession,” TBPN has interviewed Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella. Diet TBPN delivers the best moments from each episode in under 30 minutes.

Speaker 1:

You're welcome to TBPN. Today is Thursday, February 12. We are live from the TBPN Ultradome, the Temple of Technology, the Fortress of Finance, the Capital of Capital. Ramp.com, baby. Time is money.

Speaker 1:

Save both. Easy-to-use corporate cards, bill pay, accounting, and a whole lot more all in one place. We have a great show for you today, folks. We're taking a little tour of the health world. We got Bryan Johnson and Andrew Huberman on the show.

Speaker 1:

Of course, we also have Matthew Zeitlin, Joon Sung Park, David Risher from Lyft coming in, and Todd McKinnon from Okta. We have a number of other folks joining the lightning round. It should be a fun show. But first, I have to tell you what Linear is. Not only do they sponsor the lineup, but they are the system for modern software development.

Speaker 1:

70% of enterprise workspaces on Linear are using agents. So something big happened yesterday. There was a massive viral article. And do you know how big this Something Big Is Happening got? How much that grew?

Speaker 2:

It went big. I was getting texts about it.

Speaker 1:

It has over 100,000 likes now. We were looking at it at, like, 50,000

Speaker 2:

likes. Views?

Speaker 1:

I think it's over 100,000,000 now. It's quickly becoming the most read thing in the world. I don't know. It's crazy.

Speaker 2:

Wonder 75,000,000.

Speaker 1:

75,000,000 views? Views. Okay. And 100,000 likes.

Speaker 2:

They're saying that only 200 people actually read it, but everyone has an opinion.

Speaker 1:

Everyone has an opinion. I had an opinion. I wrote a piece about it yesterday, saying AI is not like COVID. I kinda got one-shotted by the COVID analogy just from a mathematical perspective. Spent a lot of time talking about the difference between exponential growth and sigmoidal curves, logistic curves.
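The distinction being drawn here, exponential growth versus a sigmoidal (logistic) curve, can be sketched in a few lines of Python. This is purely illustrative; the parameters are arbitrary and not from the essay being discussed:

```python
import math

def exponential(t, x0=1.0, r=1.0):
    # Unbounded growth: keeps compounding at the same rate forever.
    return x0 * math.exp(r * t)

def logistic(t, k=100.0, x0=1.0, r=1.0):
    # Sigmoidal growth: nearly indistinguishable from exponential
    # early on, then flattens as it approaches the carrying capacity k.
    return k / (1 + ((k - x0) / x0) * math.exp(-r * t))

for t in [0, 2, 4, 8, 12]:
    print(f"t={t:>2}  exp={exponential(t):>12.1f}  logistic={logistic(t):>6.1f}")
```

Early on the two curves look almost identical, which is exactly why extrapolation arguments (COVID case counts, AI adoption) are tricky: the data that distinguishes them only shows up near the inflection point.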

Speaker 1:

But, you know, I enjoyed it. I enjoyed talking to the author on the show. And Will Manidis had a great rebuttal as well, talking about these tool-shaped... what are they? Tools? Tool-shaped objects.

Speaker 1:

Just the idea of, like, getting your workflow set up. It can be a little bit of a nerd snipe sometimes. There's a whole bunch of news. But the best reaction to Something Big Is Happening has got to be John Palmer.

Speaker 1:

Something small is happening. I was reading this this morning, actually laughing out loud. We'll read through it. John Palmer says

Speaker 2:

Yeah, you were reading it to yourself.

Speaker 1:

Yes.

Speaker 2:

And then you just started belly laughing. Yes. And so then I started filming. So then I asked you

Speaker 1:

what you were reading.

Speaker 2:

I was filming you. No way. Sent that to John

Speaker 1:

Oh, I missed that.

Speaker 2:

To get his reaction.

Speaker 1:

It was done. Reacting to his. Yeah. Before we read through this, let me tell you about Restream. One livestream, 30 plus destinations.

Speaker 1:

If you want a multistream, go to restream.com. So think back to 2014, a little over a decade ago. If you wanted a burrito, you had to get in your car and drive to the restaurant. You had to stand in line. You had to talk to a person.

Speaker 1:

If someone told you that one day a stranger would bring a burrito to your front door because you pressed a button on your phone, you would have said that sounds dystopian and also incredible. Then over the course of about three years, it just became normal. And this was before Chipotle started skimping out on the portions too. So it was really, really good. I think we're at that same inflection point right now, except instead of burritos, it's your entire job.

Speaker 1:

I've spent the last two weeks reading about AI and watching clips of people building things with it. I live in this world now. The future is being shaped by a remarkably small number of people, a few 100 researchers at a handful of companies. I'm not one of these people, says John Palmer. I work in crypto, but I am close enough to feel the ground shake.

Speaker 1:

I follow several AI researchers on X, and I also recently purchased a Mac Mini. I haven't set it up yet, but I think you can understand what I'm saying. I know this is real because it happened to other people first. Here's the thing nobody outside of tech understands yet. The people sounding the alarm aren't making predictions.

Speaker 1:

They're telling you what already happened to them. I am relaying this information to you secondhand with conviction. For years, AI had been improving steadily. I absorbed these improvements fine because I wasn't really paying attention, to be honest. Then, apparently, in 2025, everything got much faster and then faster again.

Speaker 1:

I know this because I watched a three hour podcast about it last Wednesday. The host was visibly shaking. I could tell because he wasn't even able to finish a single pint of Guinness. Taking shots. Then on February 5, two major AI labs released new models on the same day, and something clicked.

Speaker 1:

Not for me personally. I was at a birthday dinner for my wife's coworker. But for the people who tried the models that day, it was apparently a huge deal. One guy said he tells his AI what to build, walks away for a few hours and comes back to find the work done. This will very likely be what my life is like as well, I set up my acne.

Speaker 1:

AI will be doing things while I'm in the other room. Potentially huge. Huge. Huge. Let me give you an example so you can understand what this looks like in practice.

Speaker 1:

A guy on Twitter built an entire app just by describing it. The AI wrote the code, opened the app, tested the buttons, decided it didn't like the layout, fixed it, and only then said, it's ready. I'm going to use something like that with my Mac Mini, some type of cool app or something. I'm still figuring it out, but I have all the hardware. And I'm definitely gonna optimize my setup with all types of skills and plug-ins.

Speaker 1:

I'm not exaggerating. That's what my Tuesday

Speaker 2:

could look like potentially.

Speaker 1:

But I tried AI, and it wasn't that good. People would say this. They say this, I tried AI. It wasn't that good. Here's John Palmer's response.

Speaker 1:

He says, I hear this constantly. I understand it because I also thought this. But the models today are apparently unrecognizable from what existed six months ago. Some of the people saying this have hundreds of thousands of followers. And make sure you're not on the free plan.

Speaker 3:

You have

Speaker 1:

to get on the paid plan.

Speaker 2:

That's key.

Speaker 1:

How fast is this actually moving? Let's answer that question. In 2022, AI couldn't do basic math. By 2023, it could pass the bar exam. By 2024, it could write working software.

Speaker 1:

And by 2025, some of the best engineers in the world had handed over most of their coding work to AI. In February 2026, you can make AI send you a morning summary of the top posts on Reddit from your Mac Mini. Here we go. What should you actually do? I'm not writing this to scare you, says John Palmer.

Speaker 1:

I'm writing this because the single biggest advantage you can have right now is being early. Early to understand it. Early to buy the hardware. Early to subscribe to the paid tier and ask it to make you a meal. You can start using it for real work.

Speaker 1:

If you're an engineer, give it a Not

Speaker 2:

a meal. A meal plan.

Speaker 1:

A meal plan. Oh, a meal plan. Yeah. Actually, make a meal. That would be cool.

Speaker 1:

If you're an engineer, give it a GitHub repo. If you're in finance, give it a messy spreadsheet. If you're in other industries, definitely figure out what you can give it. Then give it that. But give it something for sure.

Speaker 1:

I just need to think about it more. Mention it to your coworkers. Right now, there's a brief window where most people are still ignoring this. You can be the person ignoring it less, which puts you ahead. Be the person who walks into a meeting and says, I used AI to do this analysis in an hour instead of three days.

Speaker 1:

It's fine if it actually took you three days. The point is people will respect you. Get a Mac Mini. This is point number three.

Speaker 2:

And by the way, you can actually get a Mac Mini on DoorDash.

Speaker 1:

You can. Just like a burrito. It's important. Every single account I've seen posting this stuff has a Mac Mini. It's the ultimate tool for this stuff.

Speaker 1:

You gotta get your financial house in order. I'm not a financial adviser, says John Palmer. That said, I did just spend $599 on a Mac Mini plus AppleCare and $50 on Claude tokens. But if it gets too expensive, have a backup plan to switch to an open source Chinese model. Fourth, watch the podcasts.

Speaker 1:

Actually, don't just watch. Study. TBPN, Dwarkesh, etcetera. The good thing is that these shows are several hours long, so you can basically fill up your entire day just watching them. Or making them. Or making them, in our case.

Speaker 1:

Rethink what you're telling your kids. The standard playbook, good grades, good college, stable job, it's over. Tell them it's likely not looking good for them. Go to your four-year-old and tell them, it's not looking good.

Speaker 2:

It's looking good, buddy.

Speaker 1:

It's not looking good.

Speaker 2:

It's not looking good.

Speaker 1:

What I know: I know this isn't a fad. The technology works, and the richest institutions in history are pouring trillions into it. I know this because I've seen it mentioned pretty frequently over the past few months. I know the people who will come out of this best are the ones who start engaging now, not with fear, but with curiosity, a sense of urgency, and, ideally, a Mac Mini.

Speaker 1:

The future is already here. It just hasn't knocked on your door yet. It's about to. And when it does, I will be ready. My Mac Mini will be unboxed.

Speaker 1:

My agent will be configured. I will describe what I want in plain English, and it will appear. I just have to optimize everything first to make sure my setup is super legit. If this resonated with you, share it with someone in your life who should be thinking about this too. Most people won't wanna hear about it until it's too late.

Speaker 1:

You can be the reason someone you care about buys a Mac Mini. So funny. A banger. Such a banger. So so good.

Speaker 1:

I really I

Speaker 2:

really No words.

Speaker 1:

No words. Let's see how Codex is doing. I went and described in plain English to Codex 5.3 Medium: build me an app, an amazing app, the best app ever, one that will win awards and make me a legend in Silicon Valley. An award-winning app.

Speaker 1:

An award-winning app. And it appears to be working. Let's see. We'll check in on this. Do I need to run it?

Speaker 1:

No. It's running. I don't know. It's describing it. It hasn't said what it's thinking of doing, but it says

Speaker 2:

You don't need to know that, John. You don't need to know that. And more seriously, John Palmer's post feels like a response to Will's essay, Tool Shaped Objects, which is fantastic. We won't read through Will's whole essay because I think you should read it yourself.

Speaker 1:

He might be coming on the show soon.

Speaker 2:

Yeah. We're gonna get him on the show tomorrow

Speaker 1:

Yeah.

Speaker 2:

Which we're excited about.

Speaker 1:

But we do love reading a Will Manidis essay laced with tons and tons of advertising. We know that's one of his favorite things, and, you know, we'll read one right now for the New York Stock Exchange. Wanna change the world? Raise capital at the New York Stock Exchange.

Speaker 2:

Let's do it. Amphropic. Amphropic. Amphropic is nearing the completion of a deal to raise more than 20,000,000,000 in a funding round led by investors including Peter Thiel's Founders Fund, D. E.

Speaker 2:

Shaw, and Dragoneer. Let's dig in. Of course, I'm being a little bit silly with the pronunciation. Sydney over on X said, British AI researchers be like, Amphropic. So I'm gonna start integrating that into my sentences as well. Anthropic is nearing the completion of a deal to raise more than $20,000,000,000 in a funding round led by the group I just mentioned.

Speaker 2:

The deal is set to value Anthropic at about $350,000,000,000, nearly doubling its prior value, and could be announced as early as this week. The funding round includes a range of investors, including Coatue, Singapore's GIC, Microsoft, and Nvidia, and has become a who's who of Silicon Valley and Wall Street investors. I think it's worth talking about this from a PT lens. Yeah. I don't know if you want to break it down, given that

Speaker 1:

Yeah. So Founders Fund is known for the monopoly thesis. Like, there's a power law. You wanna be in the best company in the category. You want the best ownership.

Speaker 1:

Put all

Speaker 2:

your money in that one.

Speaker 1:

Yeah. I'll go all in, famously. They did some other social networking deals, but obviously, like, Facebook was the big one at seed, and that was enough to return that fund many times over. And so the whole thesis of the fund for a long time has been, like, these concentrated bets. AI has been an interesting one.

Speaker 1:

They're now in three major labs because they have a huge position in SpaceX, which now owns xAI. And so they're in that lab, sort of coincidentally. Peter Thiel, I believe, is listed on Wikipedia as a cofounder of OpenAI, or at least he was, like, an initial donor and got, like, a cofounder title. And so he's been involved in OpenAI for a long time. And then they invested, I believe, in, like, the $60,000,000,000 round right when ChatGPT was launching; there was a big funding round that happened.

Speaker 1:

And it was an odd deal because the company had not yet converted. It was very unclear what Microsoft would get, what the employees would get, what the investors would get, what the nonprofit would get because the nonprofit was going to keep a stake. And so it was this, like, wild bet. And I know a lot of investors

Speaker 2:

were scared off.

Speaker 1:

It was a leap of faith. A lot of investors were scared off because there was, you know, what am I investing in? Like, I'm not getting shares. I'm getting units. And it was a very odd structure.

Speaker 1:

And I know many investors that were scared off by that. They were like, yeah,

Speaker 2:

Well, product's I'm kind of a unit.

Speaker 1:

An absolute unit. I'm an absolute unit,

Speaker 2:

but I'm used to buying shares.

Speaker 1:

Yeah, exactly. So you had to do a lot of sort of leaps of faith. At the time, I was enjoying ChatGPT. I thought it was an important product. I didn't really have a full thesis around, like, aggregation theory, and, you know, maybe there would be a winner-take-all in consumer AI, any of that.

Speaker 1:

I was just looking at the team, because at the time, it was Sam Altman, who'd founded and sold a company in, like, an acquihire, then worked at YC, been a successful investor. And it was just, like, all green flags. Like, if it weren't for all the baggage of, like, OpenAI and stuff, if someone with that resume came to you and was like, I'm starting a company, you'd just be like, yeah. You check all the boxes. Great. And then you have Greg Brockman, who was the CTO of Stripe, a hugely successful company.

Speaker 1:

And so I was like, wait. You got that guy to come be your cofounder? Okay.

Speaker 2:

That's profit.

Speaker 1:

And then you have Ilya, and you had a whole bunch of other folks who were just really, really top tier in terms of, like, founder quality, like, worth backing. And so you did have to put the valuation out of your mind because it was so disconnected from, like, startup valuations.

Speaker 2:

Yeah.

Speaker 1:

But in terms of just, is the team high quality, it seemed like it checked all the boxes and that they would figure it out. They, of course, ultimately did. There was another sort of interesting thing just about, like, the thesis at the time: how big can this be, what is the market, what are the opportunities for really, really huge categories. A lot of Founders Fund investments are like SpaceX: it's this massive category that's very, very hard, but if you crack it, it's going to be an entirely new market.

Speaker 1:

Social media was certainly that as well. And so that made a lot of sense. Obviously, they're in X. Now they're in Anthropic. Sequoia is also in all three.

Speaker 1:

And the interesting thing is, like, PT has given a number of talks where he's sort of been anti-AI. You've seen this whole thing about, like, crypto is libertarian, AI is communist, and that AI is a consolidating force. He's never been one to say, like, oh, wow.

Speaker 1:

The tools are so amazing. It's been a lot of, like, is this just doing your homework for you?

Speaker 2:

He's not putting out threads about No. Using his Mac Mini.

Speaker 1:

I don't even know if he has a Mac Mini.

Speaker 2:

You should send John Palmer's piece to PT. Mhmm. Just flip it over there. Yeah. And say, hey, I think you should read this.

Speaker 1:

I think you should read this.

Speaker 2:

You know, I think you kind of understand this. You did invest in DeepMind, yeah, just, like, you know, more than a decade ago.

Speaker 1:

Yeah. The DeepMind deal was wild. So Demis met PT at a party, I think. And instead of pitching him on DeepMind, he said, let's play chess. And so they played chess, and then Founders Fund invested in DeepMind very early on. Do we know who won?

Speaker 1:

I don't know, actually. That might be deep lore. I'm not sure. But DeepMind, the initial pitch was, like, this counterforce to Google a little bit, or, like, this renegade group of AI researchers. It's the same thesis over and over again.

Speaker 1:

It's like, let's get the pure play going, and then it always gets tied in with a hyperscaler at some point. But Google acquired

Speaker 2:

Beat them, join them.

Speaker 1:

DeepMind in January 2014 for $600,000,000, and Founders Fund owned more shares than the cofounders, who had split up their stake, and all the employees. Because, again, they had developed a very concentrated position.

Speaker 2:

Apparently, here's some of the lore. Knowing Thiel was a chess player, Demis struck up a deep technical conversation about the relative strengths of the bishop versus

Speaker 1:

the knight. So they're just talking about chess. They didn't actually play.

Speaker 2:

Maybe.

Speaker 1:

Really quickly, let me tell you about Sentry. Sentry shows developers what's broken and helps them fix it fast. That's why 150,000 organizations use it to keep their apps working.

Speaker 2:

Yeah. Apparently, the first investment was 2 and a half mil.

Speaker 1:

You can't get deals like that anymore.

Speaker 2:

Can't do anything with 2 and a half mil.

Speaker 1:

It's a nightmare. It's a nightmare. I mean, nightmare

Speaker 2:

check size.

Speaker 1:

The initial Facebook check, was it $100,000 or was it a million dollars? It was, like, not a lot for 10% of Facebook or something like that. It was crazy. Yes, wild, wild times. But so, yes, reading into this: does this say that the Founders Fund team thinks there's a divergence in the market? That there are actually two markets going on here, one in consumer AI, advertising, and knowledge retrieval, the other in code, developer tooling, and codegen, and that Anthropic has actually carved out a separate space, so they're less competitive than people think?

Speaker 1:

Maybe they're just more competitive, and it just makes sense to get into both. We'll have to talk about them once the deal closes and everyone's ready to chat. Tyler, do you have something to add about

Speaker 4:

I was gonna say, didn't Scott Wu run back that playbook? Right? Because he played. Didn't he challenge PT in chess and then Napoleon in poker? Wasn't that

Speaker 1:

the Yeah. Yeah. Yeah. Yeah. Yeah.

Speaker 1:

And it was like, if I win, I get my terms. If you win, you get your terms.

Speaker 2:

And we have some breaking news.

Speaker 1:

What's the breaking news?

Speaker 2:

Anthropic has formally announced the round.

Speaker 1:

Let's go.

Speaker 2:

They said, we've raised $30,000,000,000, so they upsized it; 20 was the reported figure. They are now at a $14,000,000,000 run rate. Wow. With this figure growing over 10x annually in each of the past three years. We'll see if they've got another 10x in them.

Speaker 2:

Dylan Patel, on the show last week, felt fairly confident.
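The growth trajectory being discussed here is just a sequence of annual multiples. A quick sanity check in Python, using the rounded figures cited on the show (the year labels are an assumption based on the chart described later):

```python
# Rounded run-rate figures as cited on the show (USD).
run_rate = {2024: 100e6, 2025: 1e9, 2026: 14e9}

years = sorted(run_rate)
for prev, curr in zip(years, years[1:]):
    # Year-over-year multiple between consecutive run-rate figures.
    multiple = run_rate[curr] / run_rate[prev]
    print(f"{prev} -> {curr}: {multiple:.0f}x")

# One more 10x from the current $14B run rate:
print(f"next 10x: ${run_rate[2026] * 10 / 1e9:.0f}B")
```

So "roughly 10x a year" is consistent with the cited numbers, and one more 10x from $14,000,000,000 is where a $140,000,000,000 figure would come from.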

Speaker 1:

Yeah. I mean, the big shift, the biggest update for me, is that the forward-deployed engineer thing is, like, very real.

Speaker 1:

And even if the tools can build themselves, there are still so many blockers to actually rolling these out that you're gonna need a lot of people inside these organizations who are actually playing with Mac Minis on the weekend, getting obsessed with these tools, and then bringing them into the enterprise. Because it's very easy for a lot of people in Fortune 500 companies to just show up and do their job the way they always do it; no one gets fired for not switching things up. But if you roll out some tool and it doesn't work the way you intended, or it sucks up a bunch of your time... even if you're just thinking, I have eight hours of work to do in my traditional system, where do you get the extra hours to automate your work? It takes time, and that's where the forward-deployed engineers come in. So it's a great gig for anyone who wants to go implement enterprise agentic workflows, because that will be the theme. Let me tell you about Phantom Cash.

Speaker 1:

Fund your wallet without exchanges or middlemen and spend with the Phantom card.

Speaker 2:

Let's pull up this chart. Yes. Andrew Kern over on X pulled it up. We can see Anthropic's run-rate revenue growth. Wow.

Speaker 2:

You can see here quite the pop, from $0 in 2023 to $100,000,000 in 2024 to a clean $1,000,000,000 in January 2025, and now sitting at $14,000,000,000

Speaker 1:

This is acceleration. This is true Are you

Speaker 2:

feeling it yet? True acceleration. Are you feeling it?

Speaker 1:

Dario sort of underplayed 2027, I guess. Or, I guess, he said he was like, I don't think we have another 10x. If they go to $100,000,000,000

Speaker 4:

Well, he said, like, we've been 10x-ing. Yeah. You know, we'll see if we do it again. Yeah. Like, wink wink.

Speaker 2:

But

Speaker 1:

If they hit a $140,000,000,000 run rate by the end of the year, that would be insane. But strange times. Strange times. You love this effect.

Speaker 5:

That's all, folks.

Speaker 2:

I mean, Tyler's been saying stuff along these lines for some time.

Speaker 1:

Okta. Okta helps you assign every AI agent a trusted identity so you get the power of AI without the risk. Secure every agent. We have the CEO of Okta coming on the show.

Speaker 2:

Pumped.

Speaker 1:

Back to the timeline. There are more

Speaker 2:

More on Anthropic. Nathan Lambert was quoting their announcement from yesterday, saying, we're committing to covering electricity price increases from our data centers (this is Anthropic)

Speaker 1:

Oh, yes.

Speaker 2:

To ensure rate payers aren't picking up the tab. We'll pay 100% of grid upgrade costs, work to bring new power online, and invest in systems to reduce grid strain.

Speaker 1:

Makes so much sense.

Speaker 2:

Yeah, Nathan's saying that maybe this should have been their Super Bowl ad. Yeah. It felt like that when you look at how much energy the industry clearly put into the Super Bowl. Yeah. And what the net effect was for the average American.

Speaker 2:

I don't think the average American came away from that with, like, more positive feelings towards Yeah. AI. Yeah. If anything, they were like, I'm sick of this stuff.

Speaker 1:

Yeah. Yeah. Yeah. The meme was like, oh, beer ads, like, refreshing. Yeah.

Speaker 1:

Yeah. It would have been good to focus on this. Microsoft did this, what, a couple months ago. The news was on January 14. So just one month ago, Microsoft announced that they were going to do this, because there was a 267% increase in wholesale electricity prices near data centers, and local communities were starting to push back.

Speaker 1:

Some $64,000,000,000 in projects faced delays or cancellation because of opposition. So it's pretty easy to say, hey, my electricity bill is going up; I don't want a data center in my county, in my city. And you call your city or county representative, and they say, I don't like the slop either. I see a bunch of junk that doesn't seem that valuable.

Speaker 1:

Why do we need it here? It's not going to create that many jobs. It's not going to be that good for taxes, blah, blah, blah. And other companies do this, too: Google had their clean transition tariff, and Amazon already pays a surplus above electricity costs. But Microsoft was unique because they timed it and did a whole press cycle around their initiative, and the volume was much, much higher.

Speaker 1:

The other interesting fact from that previous report was that the anti-AI issue is remarkably bipartisan. Data Center Watch claims that opposition in affected districts is 55% Republican and 45% Democrat. So 55-45. So it's not really a right issue or a left issue. Like, there's pushback on both sides.

Speaker 1:

Like, when the data center comes in, it's not just, oh, it's all green flags in a Republican district. No, both are pushing back. They're also going in on super PACs, right? So Anthropic is putting $20,000,000 into a super PAC operation that is meant as a direct counter to the OpenAI super PAC operation. The midterms are a battleground between these two rival AI labs, says Teddy Schleifer at The New York Times.

Speaker 1:

And so we're going to see some attack ads. It's like Sam and Dario are running for election, and we're gonna see some

Speaker 2:

Be the president of AI.

Speaker 1:

Yeah. This person can't restore the economy. What do you think, Tyler?

Speaker 4:

Yeah. But, I mean, we just said that, like, overwhelmingly on both sides, there's a lot of pushback. Right? So what are the actual differences between their approaches? Yeah.

Speaker 4:

That is interesting. Maybe, I guess, Anthropic will be more, like, about the China chip issue.

Speaker 1:

Yeah. It feels like they should be more in alliance than against each other.

Speaker 4:

Yeah. There's, like, a crypto super PAC. Yeah. It's, like, the big one. Yeah.

Speaker 4:

There's no big AI one right now.

Speaker 1:

Yeah. Yeah. It would be weird if there was, like, a USDC super PAC and then a USDT super PAC, and they were fighting each other and just constantly surfacing skeletons in the closet. Because every time there's a successful attack on OpenAI, yes, people might be like, oh, okay. I like Anthropic a little bit more.

Speaker 1:

But most people just come away being like, I don't like AI. Yep. So if they're attacking each other, that could just be bad for both, maybe? I don't know. But we'll see what they run.

Speaker 1:

We'll see.

Speaker 2:

We'll find out.

Speaker 1:

We'll see what

Speaker 2:

they run. Ben over on X says, if you were falsely accused of a crime and it went to trial, who would you prefer to listen to the arguments and give the verdict? Mhmm. And the options are a jury of your peers or Claude Opus 4.6. There have been 2,000 votes, and 63% of people prefer Claude. Yes.

Speaker 1:

Might be the audience here.

Speaker 2:

Yeah. Yeah, a little

Speaker 1:

Ben does work at Orchid Health, engineering whole genome embryo screening. So I wouldn't be surprised if that's his audience. But I don't know. I don't know. I think the jury of your peers will be around for a very, very long time.

Speaker 1:

I think that that will be something. I mean, they still have courtroom stenographers. And we have

Speaker 4:

I think Vlad's gonna be around a long time, too.

Speaker 1:

Yeah. Okay. We're thinking in centuries, maybe. But I mean, the courtroom sketch artist. We've had cameras for a hundred years.

Speaker 1:

Like, the jury's

Speaker 6:

Yeah.

Speaker 4:

I'm not saying that I think this will ever happen.

Speaker 1:

Yeah. Yeah. Okay. But you'd prefer it. You you would ideally.

Speaker 1:

Yeah. I mean, how many jurors will be sitting there being like, okay, Claude, how should I vote? I haven't really been paying attention. Like, I've been playing Candy Crush. I've been vibe coding my optimal workflow on Claude Bot.

Speaker 1:

Really quickly, Fin.ai, the number one AI agent for customer service. If you want AI to handle your customer support, go to fin.ai.

Speaker 2:

Will OpenAI or Anthropic IPO First? As of late last year, it was sitting at 75%

Speaker 1:

Yeah.

Speaker 2:

of people on Kalshi thought that it would be OpenAI. Yeah. Now it's flipped.

Speaker 1:

It's flipped.

Speaker 2:

Almost 70% of people think that Anthropic will get out the door.

Speaker 1:

It is narrowing. It is narrowing.

Speaker 2:

But I expect these to kind of fluctuate Yeah. Just with the headlines.

Speaker 1:

It's all red meat for Michael Grimes. I got so many text messages about his move back, being like, the most bullish thing for late-stage growth technologies is that he's back at Morgan Stanley. Like, if you had any doubt that there will be major, major blockbuster IPOs, it's Michael Grimes being back at Morgan Stanley. So very excited for that. Really quickly, Gusto, the unified platform for payroll, benefits, and HR, built to evolve with modern small and medium-sized businesses.

Speaker 2:

Teja over on X says, I turned on Meta AI's auto reply for Facebook Marketplace. I listed an item for $75. And when someone asked if it was available, the AI replied asking if they wanted it for free.

Speaker 1:

It's a lead. It's all just a bait tactic. So, hold on.

Speaker 2:

This is getting community noted.

Speaker 1:

Yeah. We gotta dig in. It says Meta AI only suggest one reply. So this is a typo in here. Only suggest one reply for every message of the listing.

Speaker 1:

Then you can say, edit, or say that this one is manually written by the user. So some fake news, but still funny.

Speaker 2:

No. I mean, Teja is responding. Teja: today I learned, do not trust community notes. This happened to me. No idea why the community note appeared.

Speaker 2:

Meta AI only sends one message. I cannot edit it, but I can unsend it.

Speaker 1:

Whole Mars Catalog says: stop talking trash about it publicly, and it might stop giving away your stuff. That's on you. That's on you.

Speaker 2:

Roko's Basilisk.

Speaker 1:

He's Roko's Basilisk.

Speaker 2:

What's up with Warner Brothers?

Speaker 1:

first. Consul builds AI agents that automate 70% of IT, HR, and finance support, giving employees instant resolution for access requests and

Speaker 2:

passwords. The battle over Warner Brothers intensified this week as Paramount CEO David Ellison and a vocal investor made new moves to thwart rival Netflix's planned takeover.

Speaker 1:

The storied Hollywood asset and the home of Batman, Harry Potter, and the White Lotus.

Speaker 2:

Paramount, Warner Brothers Discovery's long-rebuffed suitor, enhanced its $77,900,000,000 all-cash offer for the entire company Tuesday in an effort to bring Warner to the negotiating table and ultimately abandon its deal with Netflix, also an all-cash transaction, valued at $75,000,000,000. Cleveland investor Ancora Holdings entered the fray by acquiring a small stake in Warner with an eye toward increasing its holdings and pressuring the company to negotiate with Paramount. I wonder if they're friends with the Ellisons. Interesting. Paramount's gamble.

Speaker 2:

Paramount is hoping its latest pitch can persuade Warner to ditch its agreement to sell its movie and TV studios and HBO Max streaming service to Netflix. Unlike Netflix, Paramount also intends to buy Warner Discovery's cable networks, which include CNN, TBS, and Food Network.

Speaker 7:

Mhmm.

Speaker 2:

But Warner has so far shown little interest in engaging with Paramount, telling investors that Ellison's previous offer was not even comparable to the Netflix agreement. On Tuesday, Warner said its board will review Paramount's new offer, but it isn't modifying its recommendation regarding its agreement with Netflix. The company advised shareholders to not take any action now regarding Paramount's amended tender offer. The bar for Warner to reopen talks with Paramount is very high given its contract with Netflix.

Speaker 1:

Massive termination fee.

Speaker 2:

Paramount said it would pay the $2,800,000,000 termination fee, let's give it up for multi-billion-dollar termination fees, that Warner would owe Netflix should the agreed-to deal collapse, and pay a ticking fee of 25¢ a share to Warner shareholders for each quarter its deal hasn't closed.

Speaker 1:

Woah.

Speaker 2:

We got ticking fees.

Speaker 1:

Ticking time bomb.
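The ticking fee math is simple enough to sketch. This is a back-of-the-envelope illustration only; the share count below is hypothetical, not Warner's actual float.

```python
def ticking_fee_total(shares, quarters, fee_per_share=0.25):
    """Total ticking-fee payout: 25 cents per share for each
    quarter the deal stays open past its deadline."""
    return shares * fee_per_share * quarters

# Hypothetical: 2.5 billion shares outstanding, deal delayed four quarters
payout = ticking_fee_total(2_500_000_000, 4)  # $2,500,000,000
```

At that rate, each quarter of delay on a hypothetical 2.5-billion-share base costs about $625,000,000, which is why a ticking fee concentrates minds.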

Speaker 2:

Some stakeholders are holding out for Paramount to further boost its bid. I'm not surprised given that they've repeatedly said with their offers, this is not our best and final offer. Analysts from Raymond James wrote in a note to clients that many Warner Discovery shareholders still expect Paramount and its backers to raise its $30 a share bid by $2 to $3 a share. So everyone is waiting around saying, let's get those numbers up.

Speaker 1:

And remind me of how Paramount is doing right now. They have UFC, correct? And the experience watching UFC was a downgrade from pay per view, but cheaper? What was the takeaway?

Speaker 2:

Well, I mean, I don't think we can give an analysis of Paramount as an entire business based on that. It sounds like the experience was not as good as paying for the pay-per-view.

Speaker 1:

And was that because you had to jump through all the hoops to sign up for Paramount Plus and you weren't already a member?

Speaker 2:

No. Like, the viewing experience was different Mhmm. Because it's now subscription and ad supported. Okay. Whereas previously, pay-per-views had ads Yeah.

Speaker 2:

But not nearly as much. Okay. So the thing you're missing now watching is, like, in round commentary Sure. Walkouts, things that like hardcore fans Yeah. Really like.

Speaker 1:

Is there less Joe Rogan commentary?

Speaker 2:

Basically. Brutal. Brutal. This is extremely bad. Well, market's reacting.

Speaker 2:

Paramount's down 7% today.

Speaker 1:

That might be the good day.

Speaker 2:

Of course, just reacting to this analysis of the UFC viewing experience. Yes. But no, the stock dip was apparently in reaction to the breakup fee. People being like, great.

Speaker 2:

You're gonna pay even more.

Speaker 1:

Yeah. Let me tell you about AppLovin. Profitable advertising made easy with axon.ai. Get access to over 1,000,000,000 daily active users and grow your business today. So for now, Paramount's move Tuesday to sweeten its offer could push Netflix to increase its bid for Warner.

Speaker 1:

Paramount needs to test Netflix's pain threshold without having to bid too high. The revised offer does exactly that without further constraining its balance sheet. The ball is now with Netflix. Warner and Netflix are focusing on getting their deal approved by the Justice Department as well as regulatory authorities in Europe. As part of its investigation, the DOJ is looking into whether Netflix has engaged in anticompetitive practices.

Speaker 1:

Last week, President Trump reversed course and said in an interview with NBC News that he didn't think he should weigh in on the deal and would instead leave it to the Justice Department. I've decided I shouldn't be involved. The Justice Department will handle it, Trump said in the interview. He previously expressed concerns about the size and scale of a Netflix-Warner combination, saying the combined company would command significant market share, but he's clearly been watching YouTube. And he just says YouTube's too dominant.

Speaker 1:

Too many watch hours on YouTube. It's fine. Let it rip. That talking point probably worked on him. We'll see.

Speaker 1:

Paramount said Tuesday.

Speaker 2:

We did talk to Ashlee about this, what was it, on Tuesday? Mhmm. Ashlee wasn't super excited about having Yeah. Basically one fewer buyer on the documentary side. But he said it was already so cooked that

Speaker 1:

Yeah. He was like, sorry. He was blackpilling hard. If you want some white pills, our newsletter, tbpn.com, has a list of AI white pills you may have missed. There are a number of them, so go check them out.

Speaker 1:

Digging through the archive, finding... Basically, the vibes on the timeline have been miserable. Everyone's like, you know, fast takeoff, permanent underclass. Tyler Cosgrove is not blackpilling. He thinks that we're in the fast takeoff. There have been a lot of white pills.

Speaker 1:

Lot of white pills. Recursive, and they're raising. White pill: they hooked an AI agent up to a bio lab.

Speaker 2:

That is a white Yes.

Speaker 1:

No. I agree. I agree. But I understand that a lot of people might see it as a black pill.

Speaker 2:

Erica over on X posted: Eileen Gu to join Benchmark as a senior associate when she returns from the Olympics. She photoshopped this. Of course, we did not put this out, but it's a good bit. Eileen competes for China in the Olympics, even though she's a US citizen.

Speaker 2:

Citizen.

Speaker 1:

And an American student.

Speaker 2:

An American student. Bill Gurley responded.

Speaker 1:

I know a lot of Americans do this. I have a buddy who was able to go play ping pong in the Olympics because he had dual citizenship. Or he wasn't even a dual citizen. His parents were from another country, so he was able to go Yeah. Be on that team.

Speaker 1:

Because a lot of the other teams are like way less competitive. So it's like, you wouldn't make the US team, but you wanna play anyway, you go with the other country. But

Speaker 2:

I think with Aileen, it's because the sponsorship dollars Mhmm. She can make

Speaker 1:

Oh, this

Speaker 2:

is by being a Chinese star. Oh. There's all these Chinese brands.

Speaker 1:

I thought you couldn't

Speaker 2:

She's a global star.

Speaker 1:

You can just get paid? You can get

Speaker 2:

a bunch of it. Not not to compete in the Yeah. Olympics but broader endorsement deals. Yeah.

Speaker 1:

You're on the Wheaties box.

Speaker 2:

Getting a gold medal is like UBI because you'll be able to do

Speaker 1:

You'll be turned into a Labubu potentially.

Speaker 2:

Well, yeah. In this case in this case, yeah. But Bill Gurley actually responded

Speaker 1:

What'd he say?

Speaker 2:

And said, I can't believe you actually found out about this. We were planning to keep it a secret, but he wrote it in Chinese.

Speaker 1:

He's leaning in. He's leaning into the meme. Interesting.

Speaker 2:

Turn post notifications

Speaker 1:

on for when Delian responds.

Speaker 2:

Gurley will be coming on the show.

Speaker 1:

Yes. We're very excited.

Speaker 2:

His new book

Speaker 1:

His book launches.

Speaker 2:

Drops. We've

Speaker 1:

got copies. Don't know how embargoed it is, so I haven't been talking about it.

Speaker 2:

So we're not gonna share it

Speaker 1:

with you. We're very excited.

Speaker 2:

But moving on. Yes. We should talk about Public.

Speaker 1:

We should talk about Public. Public is investing for those who take it seriously. Stocks, options, bonds, crypto, treasuries, and more with great customer service.

Speaker 2:

We got to talk about this

Speaker 1:

amazing ad that Leaf over at Public put together, all about teaching kids the power of compound interest, a very different positioning from many other platforms. And, you know, it was all AI generated. He made it in, like, two seconds, I guess. But it was still emotional because the idea hit me, because I'm a father.

Speaker 2:

But Yeah. It was good stuff. Anyway, we've got to talk about Higgsfield. Yes.

Speaker 2:

We've had the founder of Higgsfield on the show a couple times.

Speaker 1:

They've been accused of rage bait, basically.

Speaker 2:

Yes. I think that's what's more More rage bait. Okay. Go down. But Rishi in Forbes Mhmm.

Speaker 2:

Has the daily cover story.

Speaker 8:

Oh, wow.

Speaker 2:

AI racist videos and payment problems: the dark side of this AI startup's super-fast growth. Interesting. Influencer marketing has helped AI company Higgsfield hit $300,000,000 in annual revenue run rate in just eleven months, but misleading marketing tactics and a social media strategy based on shock have led to backlash among creators. In late January, Tim Soret, a London-based video game director, received a message on a social media site from the marketing team at Higgsfield, a fast-growing AI generation startup. This is the biggest moment in Higgsfield history, and we want you to be a part of it, it read. The $1,300,000,000 startup's tools are used by some 15,000,000 creators and ad agencies to churn out four and a half million video clips every day.

Speaker 2:

It was about to launch a new tool called Vibe Motion, which uses AI models to convert text prompts into motion graphics.

Speaker 1:

You like good vibes? You have motion? Use Vibe Motion.

Speaker 2:

The offer: if Soret shared the startup's social media post along with a video clip from preassembled marketing materials, the company would pay him $200. Mhmm. But Soret, who has spent years designing graphics both manually and with AI tools, could tell something was off. Videos Higgsfield had shared with him lacked the visual quirks of AI.

Speaker 2:

He's like, this is simply too good.

Speaker 1:

Interesting.

Speaker 2:

And he quickly realized that some clips in the media kit weren't generated with AI at all.

Speaker 1:

I said this. There was huge alpha in just taking a cinema camera, filming yourself, and being like, this is AI generated, and I'm raising. It's fraud. That's not alpha. Don't do it.

Speaker 1:

But there was a moment, I mean, this happens a lot with OpenAI stuff, where people will see a video and they'll be like, wow, they definitely have a new model. And it's like, they also have a production budget. Like, they can just hire actors.

Speaker 6:

Well,

Speaker 2:

when Sam Blond launched Monaco yesterday, was his video AI? Because it was fifty-fifty for me. Yeah. I think, like, there's definitely, there's definitely

Speaker 1:

I think you would just rip a direct

Speaker 2:

Oh, I know. You obviously could. Yeah. But clearly, the models are good enough where you could and it's kind of a funny thing if you're launching an AI company Yeah. To show Why not?

Speaker 2:

I made this with AI. It's not good.

Speaker 1:

It's good enough.

Speaker 2:

Anyways, this guy gets the media kit Mhmm. Says, hey, can you share some of these AI outputs? Yep. But he found that there were video templates that appeared to have been lifted from the stock site Envato, with Higgsfield's own logos pasted on. While Soret didn't share the videos, others did, circulating the stock video templates on X to promote Higgsfield.

Speaker 1:

Wow.

Speaker 2:

All this hype is fake and it's bought, Soret told Forbes. Higgsfield's co-founder and chief strategy officer Mahi told Forbes the media kit was created by an employee on the startup's marketing team for ideation purposes and was inadvertently shared with creators, saying the company's processes went haywire. Mhmm. With its library of 400 presets of camera motions and visual effects, SF-based Higgsfield offers an easy way for creators and advertisers to produce cinematic short videos through text-based prompts. You guys already know this.

Speaker 2:

Mahi says, we fully admit that we push the envelope. We learn from what works on platforms like X, and very explicitly, it's more controversial content that gets attention. That's translated into hockey-stick revenue growth. Last month, the startup claimed it doubled its annualized revenue to $200,000,000 in just two weeks Yeah. Driven largely by subscriptions from its 300,000 paying users.

Speaker 2:

Wow. By early February, its annual run rate crossed $300,000,000. Mhmm. CEO Alex told Forbes that he hopes to reach $1,000,000,000 by the end of the year after raising $80,000,000 from firms like Accel and Menlo in mid-January. The startup is now in talks to raise funding again.

Speaker 2:

But that rapid growth appears to be driven by aggressive, shocking, and sometimes misleading marketing tactics. Anyways, they're moving very quickly. It was interesting when we asked Alex about his margins. He gave kind of a concerning response.

Speaker 1:

Yeah. Well, when you're reselling a model, you are at risk there. At the same time, with a different business model, with a different customer acquisition flow, it's possible to have good margins on top of models if it's deeply embedded. But I've actually been getting served a whole bunch of Higgsfield sponcon, or, like, how-to videos, that are amazing. A lot of them are testaments to the power of the underlying models, like Nano Banana.

Speaker 1:

I saw this cool video where a creator was talking about how hard it is to film yourself. Like, let's say you want to do one of those cool, like, I'm-locked-in videos, which I really enjoy, and you want the camera to be here getting a profile shot of you, then flip around, go behind you over your shoulder, then spin around, show your face, then go inside the keyboard. That can be a really, really hard shot to get. You might need an automated camera robot, like a KUKA robot arm, to do that move. MKBHD has one, but they're very, very expensive, hundreds of thousands of dollars.

Speaker 1:

Not every creator can afford one. And so he was basically showing this workflow where he will film himself on a tripod here, move the tripod, move the tripod, move the tripod, and then put the start and end frames into Nano Banana Pro, and it interpolates those and does the sweeping move. And because it's all in motion, there's motion blur, and it has two reference frames of, like, Jordi's side and then Jordi's back. It actually looks perfect. And it doesn't need to come up with as much imagination, so it delivers really well.
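The start-frame/end-frame workflow described above can be sketched in miniature. To be clear, this toy is a linear cross-fade, not what a real video model does (a real model hallucinates actual camera motion between the keyframes); it only illustrates the contract: two reference frames in, a sequence of in-between frames out. Frames here are just lists of pixel rows.

```python
def interpolate_frames(start, end, n_frames):
    """Naive linear blend between two keyframe images.

    start, end: equal-sized grids (lists of rows of pixel values).
    Returns n_frames grids sweeping from start to end.
    """
    frames = []
    for i in range(n_frames):
        t = i / (n_frames - 1)  # 0.0 at the start frame, 1.0 at the end
        frames.append([
            [(1 - t) * s + t * e for s, e in zip(srow, erow)]
            for srow, erow in zip(start, end)
        ])
    return frames

# Two tiny 2x2 grayscale keyframes: all-black and all-white
black = [[0.0, 0.0], [0.0, 0.0]]
white = [[255.0, 255.0], [255.0, 255.0]]
clip = interpolate_frames(black, white, 5)  # middle frame is mid-gray
```

The generative model's job is exactly the part this sketch fakes: inventing plausible motion between the two anchors instead of blending pixels.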

Speaker 1:

And a lot of those have been sort of those little fun tools or, like, workflows have been promoted by influencers. And they'll say, like, comment, and I'll send you the prompts, or I'll send you the workflow, or I'll break it down, or I have a course or something like that, info products on top of these things. And I haven't really jumped in, but I've seen those and been like, I would love to make that video of me or my dog or whatever, my car. It's it's like it's a fun thing. And for the right person, that's gonna actually work as a marketing ad or work as content for them.

Speaker 1:

Really quickly, let me tell you about ElevenLabs. Build intelligent, real-time conversational agents. Reimagine human-technology interaction with ElevenLabs. Also, I have to tell you about Eleven Reader, because I used it this morning in the car to read, or dictate, the Will Manidis essay, and I really enjoyed it. So shout out Eleven Reader.

Speaker 2:

Can it do Will's voice yet? No. We should get Will to license his voice.

Speaker 1:

We really should.

Speaker 2:

Will, the author.

Speaker 1:

I had the option to pick a few different voices.

Speaker 2:

And

Speaker 4:

can clone your own voice, but you can't clone someone else's yet

Speaker 1:

because it

Speaker 4:

makes you read, like, certain text.

Speaker 1:

So the iconic voice collection is served to you and says, do you wanna do Michael Caine, the British icon; Burt Reynolds, the masculine iconic storyteller; Richard Feynman, raw genius; Sir Laurence Olivier; or Dr. Maya Angelou. And I picked Burt Reynolds, the masculine iconic storyteller. So he read it well

Speaker 2:

Here's maybe where Higgsfield Please. Kind of crossed the line. Yeah. Apparently, they were sharing a Google Drive folder Mhmm. With the creators that had popular children's characters like Shrek, Moana, and Mickey Mouse saying Mhmm.

Speaker 2:

Saying things that are

Speaker 1:

on the show, but

Speaker 2:

sound a little racist. Yeah. And then non-consensual deepfakes Mhmm. Of public figures like Sydney Sweeney and Zendaya and President Trump. So, yeah, the challenge here is that companies like Higgsfield Yeah.

Speaker 2:

Are competing with these Chinese open source models that have no rules. Okay. They'll generate whatever you want. Yep. And clearly, there's an insane amount of demand for video models that have no rules.

Speaker 1:

Of course.

Speaker 2:

And so this isn't surprising Yep. Even though it's wrong.

Speaker 1:

Yeah. I mean, it goes back to the BitTorrent-Napster analogy, where the demand for things that just can't be done legally is really, really high. And you can't view that as a business opportunity, because, of course, you'll see really, really high demand for completely free music if no one's paying anything, right?

Speaker 2:

Yeah.

Speaker 1:

You'll see really high demand for generate-a-Mickey-Mouse-character. The business question is, can you actually do the deal, license it, make sure all the stakeholders and all the legal parties are represented and happy with the deal, and everyone's getting paid. If you're just throwing out free IP violation, you're going to see huge demand. But it's probably unsustainable.

Speaker 2:

Yeah. So they did launch a program called Higgsfield Earn, which was basically a clipping thing. You could make stuff, share it, and depending on how much engagement it got, you'd earn some money. That led to them getting banned from X, or their account shut down.

Speaker 1:

Woah. I didn't realize that.

Speaker 2:

So yeah, we'll see how this nets out. Alex, I'm sure, or I would hope, will adjust course. Yeah. They're clearly very talented at a lot of things. Yeah.

Speaker 2:

But it sounds like they've been blurring the line

Speaker 1:

I mean, for what it's worth, I don't know how fast you would get steamrolled by the labs, but I do think there's an interesting opportunity right now for a little bit of what CapCut's doing plus the Nano Banana and Sora stuff, where you have a lot of video generation, but then you also have the ability to lay over actual text, actual motion graphics. And you pair that with some of the more deterministic tools, basically tool use, like how ChatGPT can also do a math problem in Python and be 100% accurate. You could imagine, if you want to just draw a rectangle on top of the video that you've generated, or make it black and white, that's something that doesn't require a generative AI model. You can just call a tool that has the same functionality as Premiere Pro, and then it'll just do that. But you have to actually wire all that up and make it all work together.
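The rectangle example above is easy to make concrete. A minimal sketch, assuming frames are just grids of pixel values: unlike a generative edit, an overlay like this is exact and repeatable, the kind of operation an agent can hand off to a conventional editing tool rather than a model.

```python
def draw_rectangle(frame, top, left, bottom, right, value=255):
    """Burn a filled rectangle into a copy of the frame.

    frame: list of rows (lists of pixel values). The region covers
    rows top..bottom-1 and columns left..right-1, like array slicing.
    """
    out = [row[:] for row in frame]  # don't mutate the original frame
    for y in range(top, bottom):
        for x in range(left, right):
            out[y][x] = value
    return out

blank = [[0] * 6 for _ in range(4)]         # tiny 4x6 grayscale frame
edited = draw_rectangle(blank, 1, 2, 3, 5)  # white box: rows 1-2, cols 2-4
```

Run per frame, this is deterministic and 100% accurate every time, which is exactly why tool calls beat regeneration for edits like this.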

Speaker 1:

And maybe that's where this goes, and then I don't really know how defensible that is, but we'll see. Yeah. I was

Speaker 4:

just going to say, like, I mean, you're kind of just describing, like, a wrapper. Right?

Speaker 1:

Yeah. So at

Speaker 4:

least in the text space, it seems like wrappers, at least of late, have not been Yeah. You know, people are much less bullish on wrappers now than they were, like, a year and a half ago.

Speaker 1:

Yeah. Especially the the the huge problem I

Speaker 2:

don't know. SD kid. Pretty good.

Speaker 1:

Yeah. I like that. Let me tell you about Lambda. Lambda is the superintelligence cloud, building AI supercomputers for training and inference that scale from one GPU to hundreds of thousands. The other really hard thing about the video wrapper, the video app, is that many, many video producers start fresh every day.

Speaker 1:

Like, when we edit our thirty-minute cut-down of the show at TBPN, the project is started from basically a clean slate. And so actually shifting from one software to another is really, really easy. I mean, you still have to learn the new tool. It has to be better. It has to have the features.

Speaker 1:

But it's not like, okay, well, there's all these networks of documents, there's a system of record. It's different if you're editing a movie over two years. You're probably not gonna switch tools mid-movie. But if you're a Shorts editor, you can switch from CapCut to Instagram's Edits, their video editing tool, really quickly.

Speaker 1:

You can go over to After Effects or Yeah. Premiere. And you can just be like, for this video, we're using this. This happens to us all the time. We'll have a video editor on the team who's like, oh, yeah.

Speaker 1:

They use DaVinci Resolve. And it doesn't even matter to us, because they just use that, and then they plug in and deliver a video file at the end. Render it out. Anyway.

Speaker 2:

One more thing on Higgs Field. I would expect that Disney comes after them, too. Generating Disney characters right now.

Speaker 1:

Yeah.

Speaker 2:

Disney invested a billion dollars into OpenAI as part of this licensing deal, one year exclusive. Yep. I would expect that pressure to come soon.

Speaker 1:

Let's move over to Bill Ackman. But first, let me tell you about Shopify, the commerce platform that grows with your business and lets you sell in seconds online, in store, on mobile, on social, on marketplaces, and now with AI agents.

Speaker 2:

Bill Ackman makes a big bet on Meta. Pershing Square disclosed a roughly $2,000,000,000 position in Meta. Hedge fund manager Bill Ackman likes Mark Zuckerberg's chances in AI. Pershing Square revealed the stake at an annual meeting of one of its funds on Wednesday. Mhmm.

Speaker 2:

The position amounted to 10% of the firm's capital at the end of 2025, or roughly $2,000,000,000 based on past disclosures.

Speaker 7:

Mhmm.

Speaker 2:

Meta shares are down about 13% over the past six months, a decline that Pershing Square attributes to investor concerns about the sums the company is spending on AI. CapEx. Pershing's thesis revolves in part around AI boosting Meta's content recommendations and personalized ads Yep. And potentially unlocking new opportunities in wearables or AI digital assistants for businesses.

Speaker 1:

Ben Thompson.

Speaker 2:

Keep reading. I'm gonna pull up

Speaker 1:

Meta's business model is one of the clearest beneficiaries of AI integration, Pershing Square said in its presentation. Ackman tends to concentrate his stock portfolio in a small number of high-conviction bets. He only had 13 different positions at the end of 2025, including other big tech companies Alphabet and Amazon. What did you say?

Speaker 4:

He just liked me for real.

Speaker 1:

No. You quote-tweeted this. Right?

Speaker 4:

Yeah. I mean, I didn't really add anything. I just posted a Jeremy Giffon quote.

Speaker 1:

You created this post. Yeah. And you photoshopped Jeremy's name on there because it was your idea. It came from you. What did Jeremy say? Yeah.

Speaker 4:

Give him credit.

Speaker 1:

He said, the idea that it's hard to beat the market is mostly trotted out by money managers. What they really mean is that it's hard to do when you manage other people's money, because it's difficult to get paid for doing the obvious thing. I think it's substantially easier than most people think for the person solely running their own cash. So Ackman is charging 2 and 20 to own the Mag 7. You love to see it.
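For scale, here's what 2 and 20 works out to in a simplified sketch: a flat 2% management fee on assets plus 20% of positive gains, ignoring hurdles, high-water marks, and fee timing. The dollar amounts are illustrative, not Pershing Square's actual terms.

```python
def two_and_twenty(aum, gross_return):
    """Simplified hedge fund fees: 2% of assets + 20% of positive gains."""
    management_fee = 0.02 * aum
    gains = aum * gross_return
    performance_fee = 0.20 * gains if gains > 0 else 0.0
    return management_fee + performance_fee

# $1,000,000 invested, fund returns 10% gross
fees = two_and_twenty(1_000_000, 0.10)  # $40,000: $20k mgmt + $20k perf
```

So on a position tracking a handful of megacaps returning 10% gross, the manager keeps roughly 4 points of the investor's capital that year, which is the joke.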

Speaker 1:

But a lot of people don't have diamond hands. He will diamond-hand for you, by holding Alphabet, Amazon, and Meta, apparently. Let me tell you about Railway. Railway is the all-in-one intelligent cloud provider. Use your favorite agent to deploy web app servers, databases, and more while Railway automatically takes care of scaling and

Speaker 2:

modern security. Speaking of Meta, Ray Ban maker Luxottica says it more than tripled Meta AI glasses sales in 2025.

Speaker 1:

Take off.

Speaker 2:

The Franco-Italian eyewear brand said it sold over 7,000,000 AI glasses last year, up from the 2,000,000 that the company sold in 2023 and 2024 combined.

Speaker 1:

Wait. So so sorry. Say that number again. How many did they sell last year?

Speaker 2:

7,000,000 devices last year.

Speaker 1:

I think I'm starting to see them. I mean, Ashlee Vance came on the show and was wearing a pair, for fun, but also for good content. He's gonna actually include that content in the stuff that he makes. It's a video maker's tool, so it's a prosumer tool. He, of course, had a whole bunch of other nice cameras here, but that is another tool that you can't really replace.

Speaker 1:

He doesn't want a GoPro strapped to his head. Yeah. That doesn't make any sense. Then you have the extreme sports. I'm going skiing in a couple weeks with a bunch of friends.

Speaker 1:

I'm sure I'll see a pair of of the vanguards out there on the slopes. And then and then you just have the funny people on Instagram reels saying, computer, give this guy a great day. Like that that comedy is great. Did you see that that that guy ran into another creator that was doing this?

Speaker 2:

Yelling at him?

Speaker 1:

Yeah. Yeah. Yeah. Computer. Give this guy a bust down AP.

Speaker 1:

Something like that. It was just very, very funny, and every time I see one on a truly viral Instagram Reel, I'm like, okay, there's something here that's unlocking a different form of video, not unlike when we saw the first GoPro go out and we started seeing truly crazy first person video of surfing, which was just impossible before. Or, like, I would go

Speaker 2:

scuba diving. I went surfing with Rocco Basilico.

Speaker 1:

Rocco Basilico.

Speaker 2:

The Luxottica exec Yeah. and heir who led the project with Meta. Yeah. He took out a new pair of the Oakleys. Yeah.

Speaker 2:

And we did lose them briefly, but we recovered them.

Speaker 1:

They need Meta Croakies. Right? Isn't that what

Speaker 2:

they're called? Croakies. Didn't have Croakies.

Speaker 1:

Yeah. You need the Meta Croakies. Let's get on that, Zuck.

Speaker 2:

But anyways, Bloomberg reported that Meta and Luxottica were discussing doubling production to at least 20,000,000 by the end of this year to meet current demand.

Speaker 1:

I I I think they're gonna be

Speaker 2:

They're selling. They're selling. Yeah. I give them credit. Give give Zuck some credit.

Speaker 2:

And again, I saw somebody had effectively jailbroken their Meta Ray-Bans and hooked them up to a Mac Mini, and they were buying stuff. I've talked about this at least once on the show. But again, it's funny to watch somebody jailbreak a device to do something that the device will obviously do in the relatively near term. Yep. That's, like, part of the entire thesis behind the device.

Speaker 2:

But

Speaker 1:

Yeah. Well, I mean, I don't know. Will it actually tell you when you get an iMessage in the next three years? Is that coming? I don't know.

Speaker 1:

It might not. Those walled gardens might hold for a while. Do you think the Meta Ray-Bans will outsell sunglasses broadly? Or is the TAM sunglass buyers? Everyone has one pair of sunglasses, so you sell one pair of these to all those people?

Speaker 1:

Or is it and some people just don't like sunglasses, so they don't wear them? Or are they like TAM expanding in some way?

Speaker 2:

They've got to be that's a good question.

Speaker 1:

Yeah. And I will give you assume the next version, or the one beyond that, is, like, truly indistinguishable.

Speaker 2:

Yeah. So my framework has been that the immediate market of people, over a billion people that just wear glasses Mhmm. daily so that they can read, feels extremely in reach. Right? You're wearing this all day long.

Speaker 2:

Yeah. Why not add some functionality to it? Totally. Certainly, there'll be some Luddites that just want the analog magic of a pair of plastic frames with some glass in them. But then, yeah, I don't know.

Speaker 2:

It's funny. I love the way sunglasses look, but I've never been a sunglasses guy. Yeah. I just don't. Really?

Speaker 2:

I don't. I rarely rarely wear them. I still once or twice a year will buy a pair Yeah. And then I because they look cool Yeah. And then I don't actually end up wearing them.

Speaker 1:

I like sunglasses when it's bright out. I just like the functionality of them. But I do lose them constantly. So

Speaker 2:

Oh, you're one of those guys.

Speaker 1:

Yeah. I'm a sunglass loser. But the question is, if you comp to the Apple Watch, it feels like more people are wearing watches than before the Apple Watch, because watches were sort of a niche. You had to be into watches to wear one, because everyone had the phone in their pocket and they were like, why do I need a watch? And once the Apple Watch came out, a lot of people were like, well, I want a super watch on my wrist.

Speaker 1:

I want a computer on my wrist. I want the

Speaker 2:

You want a bust

Speaker 1:

down computer. Bust down Apple Watch, I think we're onto something. They're doing orange. They're doing Hermes orange. They're selling very well in China.

Speaker 1:

The bust down, they did the gold. It was $10,000. Do you remember this?

Speaker 2:

And it flopped. Right?

Speaker 1:

The Apple Watch Edition. It was $10,000. Gold plated, solid gold, something like that. I don't know if it flopped. They did discontinue it.

Speaker 1:

But it it

Speaker 2:

Getting roasted in the chat. I like sunglasses when it's bright.

Speaker 1:

Jordan doesn't like sunglasses when it's bright. That's a hot take.

Speaker 2:

That's a bold statement. Heavy hitting analysis.

Speaker 1:

But I like the idea of the bust down Apple Watch. Get it at a 100 k. The problem with the gold Apple Watch was that you'd buy this thing for $10,000, and it immediately loses all its value because it doesn't even run the latest software, and it just stops working after, like, five years, which is ridiculous. If you're spending $10,000 on a watch, you want it to be useful in a hundred years. You want it to be around for a long time.

Speaker 1:

Yep. But I know.

Speaker 2:

Should we read through Jeremy's actual Sure. Response?

Speaker 1:

Oh, did he post again? Because I read his oh, okay. So he did talk about this. Really quickly, Labelbox: RL environments, voice, robotics, evals, and expert human data. Labelbox is the data factory behind the world's leading AI teams.

Speaker 1:

And, yes,

Speaker 2:

Jeremy says, responding to Ackman charging two and twenty to own the Mag Seven he owns Google, Amazon, and Meta: this is good capital allocation, but it requires permanent capital. People don't actually want the best returns. They wanna feel clever and do some, like, weird little thing that makes them feel interesting or different or something.

Speaker 2:

But sometimes you just gotta do the obvious thing and just be really strong on it. I always admired those managers who have been paid two and twenty for, like, forty years to just own Berkshire. It's awesome because they were right, and they were providing a service. The service is that the LP does not have the personal confidence to own Berkshire on their own. That's crazy.

Speaker 2:

So they actually need to pay a huge premium for someone else to give them the confidence to own Berkshire. It's just a tax that they pay for not having the conviction to do it themselves. And I think that that's great and solves a real problem for a lot of people. I love it. Well said.

Speaker 1:

Let me tell you about CrowdStrike. Your business is AI. Their business is securing it. CrowdStrike secures AI and stops breaches.

Speaker 2:

We got one more post Yeah. Yeah. Before we bring in our first guest.

Speaker 1:

What you got?

Speaker 2:

We can pull this up. People are reacting to the men's luge doubles. They're saying it's one of the most baffling things I've ever watched. Another post says, how do you find out you're good at this? What's the conversation look like?

Speaker 2:

I really wanna know. I didn't know doubles existed, I guess. I may have seen this before. I remember as a kid watching singles.

Speaker 1:

Yeah. Yeah.

Speaker 2:

But this is baffling.

Speaker 1:

There's also something called skeleton.

Speaker 2:

Like, what are you what are you

Speaker 1:

think it's an evolution. I think it starts with, like, a toboggan, and you have a bunch of people essentially in, like, a rowboat you know, like if you're doing crew, you have four or five people. And then you're just, let's do one less. Let's do one less. Let's optimize and reduce weight.

Speaker 1:

And then they're finally down to just two people, and they're basically just on this, like, sled together. And, yeah, I guess you get here. But very, very funny. What else is going on in the Olympics? Personal injury attorney in his fifties is on the cusp of becoming the oldest American Winter Olympian in history.

Speaker 1:

All he needs is for one of his teammates to slip and fall. This guy is elite. Remember the last Olympics, where there was that great meme of the one person with all the gear and all this other stuff, and then the random guy, I think from Turkey, who just showed up and, I think, won the gold or something. I don't know. That might have been too poetic.

Speaker 1:

But modest proposal says it's an

Speaker 2:

incredible story. Being a personal injury attorney is probably a great gig if you're in one of these kind of fringe Olympic sports. You know, you wanna have some flexibility in your schedule. You wanna be able to make enough to reinvest into the sport. Yeah. This guy is clearly locked in.

Speaker 1:

This guy is amazing. Okay. So Rich Ruohonen is a 54 year old personal injury attorney who plays an indispensable role for the US curling team. He cooks the players omelettes before important matches and grills steaks after huge wins. He handles the early morning grocery shopping so they can sleep in.

Speaker 1:

What a team player. He chauffeurs them around in a rental minivan. He even pays for some of their flights and hotel rooms. But those aren't his only jobs. He is also the most unlikely curler at the Milan Cortina Games, where he could become the oldest American athlete in Winter Olympics history.

Speaker 1:

And just in case people don't believe that someone nearing retirement age could possibly be an Olympic athlete, Ruhonen answers the question on a homemade t shirt. I'm not the dad and I'm not the coach, he says. But he is an American curling legend. Ruhonen first appeared in the national champion tournament in 1998. He's been doing it for Overnight?

Speaker 1:

Twenty six years, maybe. Won it a decade later and has been a fixture in the sport since long before any of his teammates were born. What he had never done was curl under the sport's brightest lights. Ruohonen just missed qualifying for the Olympics several times. His most recent heartbreak came before the twenty twenty

Speaker 1:

2022. Sorry. When the US mixed doubles trials came down to the last shot, Ruohonen decided to focus on his law practice and the senior circuit. At 50, he finally gave up on his Olympic aspirations. I honestly thought it was over four years ago, he said.

Speaker 1:

Well, I probably thought it was over eight years before that. Then he got an unexpected call. He's back. The captain for one of the nation's up and coming teams was battling a rare autoimmune disease and needed a substitute. On paper, the Gen Xer wasn't the most natural fit.

Speaker 1:

The other curlers were all in their mid twenties, roughly the same age as Ruohonen's children. He even has a few decades on Team USA's coach.

Speaker 2:

Few decades.

Speaker 1:

But on the ice, he blended right in. So when skip Danny Casper recovered from his illness and returned, Ruohonen stuck around as the alternate. Team Casper then qualified for the Olympics and became Team USA. Ruohonen says throwing the rock at the Olympics would be the single greatest moment of his entire life.

Speaker 2:

Rock. My

Speaker 1:

kids know it and my wife knows it.

Speaker 2:

I'm gonna put on curling at home and just start using some of the hardcore terminology.

Speaker 1:

Yeah. Just be like, I

Speaker 7:

can't I can't

Speaker 1:

You know that happened a couple decades ago, because on CNBC, which all the hedge funds watch, after the market closed they would just roll over to NBC coverage when the two companies were together, and they would show curling just because, with the time change, that's what was on. And so all these Wall Street guys became obsessed with curling, got really into it, started placing bets and stuff. It's a fun story.

Speaker 2:

Yeah. I wonder how how big is the, like, gambling and sports betting around the Olympics?

Speaker 1:

I don't know. I saw some numbers for the Super Bowl. It was really, really big, but you have to imagine the Olympics is also huge. While you look that up, let me tell you about Turbopuffer: serverless vector and full text search, built from first principles on object storage. Fast, 10 x cheaper, and extremely scalable. And our next guest is here, so we can come back to Olympics coverage after.

Speaker 1:

But let's bring in Bryan Johnson to the TBPN Ultradome. I will tell you about Vibe.co while he walks in. It's where D2C companies, B2C and B2B startups, and AI companies advertise on streaming TV. Pick channels, target audiences, and measure sales just like Meta. And you might be doing some advertising soon because there's a huge launch. How are you doing?

Speaker 3:

Hey. I brought you guys a gift.

Speaker 2:

Thank

Speaker 1:

you. What'd you bring?

Speaker 2:

Are we doing shots?

Speaker 3:

We're doing shots.

Speaker 2:

Oh, amazing. Here we go. Amazing.

Speaker 1:

Yeah. Gotcha.

Speaker 3:

Have you done a shot of olive oil?

Speaker 2:

I haven't. Take out our

Speaker 1:

I've never done a shot of olive oil.

Speaker 3:

What are

Speaker 1:

the benefits? Break it down. Why would I do this? I like olive oil on a steak.

Speaker 3:

Yeah. I mean, have you is olive oil a part of your life?

Speaker 1:

Only in the cooking context. Yeah. But I should just be drinking it?

Speaker 2:

Shots yet.

Speaker 1:

No? What what yeah. What are the benefits? And what is this? Did you grow the olives?

Speaker 1:

Is it cold pressed?

Speaker 2:

Just this much? Should we not should we not do a little bit more?

Speaker 1:

No. No. We don't want. No. No.

Speaker 1:

No. No. No.

Speaker 3:

This is live TV. This is

Speaker 7:

live TV.

Speaker 3:

Yeah. This is the precise dose. Fifteen ml. One tablespoon.

Speaker 1:

Okay. And what's this gonna do

Speaker 3:

for you? I think it's the superfood of superfoods.

Speaker 6:

Okay.

Speaker 3:

I mean, just look down the list of things it does.

Speaker 1:

Okay.

Speaker 3:

It's just good for almost everything. Okay. So

Speaker 1:

Is it high calorie? How many calories?

Speaker 3:

120, 130 calories. Okay.

Speaker 2:

Calories don't scare you, though.

Speaker 3:

No. I mean, it's 15% of my daily caloric intake. I think I consume more olive oil than any food.

Speaker 2:

Yeah. How many how many, like, shots is that roughly?

Speaker 3:

So three tablespoons a day, forty five ml, three shots a day.

Speaker 1:

Okay. Okay.

Speaker 2:

Wow.

Speaker 3:

So one with every meal.
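As a sanity check, the dosage numbers quoted here line up. The 125-calorie figure below is an assumption, just the midpoint of the 120 to 130 range mentioned above.

```python
# Back-of-envelope check of the dosage quoted in the conversation:
# one 15 ml tablespoon per shot, three shots a day, and an assumed
# 125 calories per tablespoon (midpoint of the quoted 120-130 range).

ML_PER_SHOT = 15
SHOTS_PER_DAY = 3
CAL_PER_TBSP = 125  # assumption, not a figure from the episode

daily_ml = ML_PER_SHOT * SHOTS_PER_DAY    # 45 ml a day
daily_cal = CAL_PER_TBSP * SHOTS_PER_DAY  # about 375 calories
total_intake = daily_cal / 0.15           # if that's 15% of daily intake
```

On those assumptions, the three shots come to roughly 375 calories, implying a total daily intake of about 2,500 calories, which squares with the 15% figure.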

Speaker 1:

One with every meal. Okay. Well, cheers.

Speaker 2:

Cheers. Cheers.

Speaker 1:

To Brian Johnson.

Speaker 2:

Great day. Cheers.

Speaker 1:

Down the hatch. Delicious. That is good. Smooth. Very smooth.

Speaker 1:

No problem. I could do that.

Speaker 2:

No bite.

Speaker 1:

And so I just drank 15 milliliters of olive oil, and I'm going to live forever now. Right?

Speaker 3:

It's Basically.

Speaker 1:

So now I can go back to Diet Coke and nicotine? One shot offsets the steaks. Right?

Speaker 3:

Yeah. Do you feel the sting now?

Speaker 2:

A little bit. I do. I do.

Speaker 3:

Like a little cough bubbling up.

Speaker 1:

Yeah. Yeah. Perfect for broadcasting.

Speaker 3:

Yeah. That's a good sign of a good olive oil. So we source this in Okay. both hemispheres.

Speaker 3:

It's always fresh.

Speaker 1:

Oh, okay.

Speaker 3:

So it comes off. Yeah. Yeah.

Speaker 2:

Yeah. So where, more specifically?

Speaker 3:

This one's from Chile.

Speaker 1:

Chile. And the brand is called Snake Oil.

Speaker 3:

That's right.

Speaker 1:

Very funny.

Speaker 3:

Yeah.

Speaker 1:

How does this fit into the overall Blueprint, what do you call it, empire at this point?

Speaker 3:

I mean, the whole thing on Blueprint is we're trying to basically say, what does the evidence say about what you can do to be healthy? Yeah. And we just went after the best foods, the best therapies. And what I found is, basically, most things don't work. Mhmm.

Speaker 3:

Most things are bullshit. Mhmm. And so we've tried to narrow in and focus on the smallest set of things to do, and olive oil just stacks. It's, like, one of the very best things you can do Mhmm. in life.

Speaker 2:

Is part of why this isn't pushed more aggressively by the sort of health industry that it's just kind of hard to deliver? A lot of people don't wanna just be consuming oil. And that's kind of the bit, that some people are pushing snake oil.

Speaker 3:

Yeah. Yes. I mean, people, I think, have a natural aversion to fat still. There was a long period of time where fat was bad, and you saw marketing: 30% less fat, 40% less fat.

Speaker 3:

So people think olive oil fat, therefore bad.

Speaker 1:

Yeah.

Speaker 3:

Yep. So there's, like, a hangover.

Speaker 2:

Yeah. I had a buddy that had internalized that so much that he was on an almost entirely fat free diet. He got his labs done. Yeah. His test was in the low hundreds.

Speaker 2:

All he did was add like an extra avocado a day. That was the only thing he Yeah. His tests like went up like

Speaker 1:

That's crazy.

Speaker 2:

Like by a pretty meaningful multiple.

Speaker 1:

So walk me through. You know, the average American might spend, I don't know, $10,000 a year on food. What does it look like to get into the Blueprint ecosystem at that order of magnitude? Then what can people do at a $100,000 a year? And then I wanna hear about the million dollar a year Yeah. all in plan.

Speaker 3:

Well, we announced that today. Okay. So it's called Immortals. Immortals. A million dollars a year.

Speaker 3:

Okay. And it's my exact protocol.

Speaker 1:

It's your exact protocol.

Speaker 3:

So my doctors, my concierge team, all the testing infrastructure

Speaker 1:

The blood work too.

Speaker 3:

Everything. Every test, every therapy. And so it's I mean, this has been really hard to build Yeah. Because it hasn't existed.

Speaker 1:

So you've

Speaker 3:

just had to scour the evidence, build out the infrastructure globally. Mhmm. And this is basically when someone says

Speaker 1:

Oop. Getting some feedback. Let's turn that speaker off. Sorry. We're working on bringing the soundboard to life in the TBPN Ultradome.

Speaker 1:

Sorry. Continue.

Speaker 3:

Yeah. So, I guess, if you look at it through three different frames, one is we measure probably around 250 to 300 things that kill me.

Speaker 1:

Mhmm. That will kill

Speaker 2:

you. Yes. Actively.

Speaker 3:

Yeah. They they actively kill you.

Speaker 2:

Okay. Everything in the gas station.

Speaker 3:

Yeah. There yes. Basically. I mean, everything in modern the modern world.

Speaker 1:

Yes.

Speaker 3:

And so if you look at it from, like, don't die Mhmm. Okay. So what things actually cause you to die? And then if you look at it from the positive things of what makes you live Mhmm. That list is probably, 250 things that kill you and 250 things that make you live well.

Speaker 7:

Mhmm.

Speaker 3:

And if you take on that burden as an individual to say, I want to do all of these things that make me avoid the things that make me die and do the things that do make me live well, that's a like a Herculean task.

Speaker 9:

Mhmm.

Speaker 3:

So this program basically takes all of those things, makes it easy. You just show up. It's like, what do I do? Yeah. And the system tells you what to do.

Speaker 3:

Okay. Yeah.

Speaker 2:

What does the program look like more specifically? To start with, you're doing three people. Right?

Speaker 3:

Yeah. Three

Speaker 2:

spots. Three to start. What do their weeks, months, year look like?

Speaker 3:

So first, it depends on who the person is. Mhmm. We have an interview process, because the person needs to be willing to work. Yep. You can't just, like, show up and say, give me a pill.

Speaker 3:

You have to put in probably around like

Speaker 1:

I was promised immortality with a single shot.

Speaker 3:

Yeah. Yeah. Yeah. Yeah.

Speaker 1:

And you're telling me

Speaker 3:

That's why it's called snake oil.

Speaker 1:

Yeah. I got you got the best.

Speaker 3:

So we have an interview process. We'll make sure the person's willing to put in the work Mhmm. and that they've got a pain tolerance. You know, they have to be willing to do stuff. Yeah.

Speaker 3:

You know, sometimes therapies hurt. Yeah. Sometimes it hurts to be hungry. Sometimes like so there's some discomfort. Mhmm.

Speaker 3:

And then we'll choose the three people, and then we'll do a comprehensive baseline. So for me we were counting yesterday. We have a few billion data points on me over the past five years.

Speaker 8:

That's a lot.

Speaker 3:

And so then if you say Yeah. which of those few billion data points are usable? Probably a few hundred million.

Speaker 1:

Yeah.

Speaker 3:

But still, we have a few hundred million data points. And what we've done to date I guess, years ago, that existed in spreadsheets and files. We've now set up an AI system Mhmm. where we take all the context Mhmm. and now we have this inference engine to say, what relationships can you find in all this data?

Speaker 1:

Yeah.

Speaker 3:

And so what we're building is an AI, a Bryan AI, that basically watches after you twenty four seven. So you start, you do a whole bunch of comprehensive baselines of where you're at in the world.

Speaker 1:

Yeah.

Speaker 3:

And then we start doing targeted protocols, like how do you isolate cardiovascular health? And how do you look at, you know, so on and so forth? Then you're gonna find issues as well where the person may have a surprise finding. How do you address that? Mhmm.

Speaker 3:

They'll have some, you know, anomaly. So then we just get after it. We say, how do you take the baseline measurements and make all of them better?

Speaker 2:

Yeah. Do you want these people to be local? Can they be anywhere?

Speaker 3:

Anywhere.

Speaker 1:

Cool. Are they all over the world or US based? Any preferences?

Speaker 3:

We're open.

Speaker 1:

Open to everything. Okay. Interesting. How are you thinking about AI? When I think about a product like Snake Oil, I think not a lot of AI disruption risk. It should not be sold off in the SaaSpocalypse.

Speaker 1:

You're not gonna

Speaker 3:

Yeah.

Speaker 1:

Vibe code an olive tree. The land exists. The tree exists. You have to select it, press it. There's probably a lot of people involved, though there might be some automated ERP system at some point.

Speaker 1:

Yeah. But how are you thinking about the impacts of AI on your business, your life? I know that that's a big piece of the don't die philosophy is that we are gonna go through a change, but how are you processing the most up to date advances in AI?

Speaker 3:

Yeah. You're exactly right. Like, this olive oil, we source from farmers all around the world that go through extensive screening. Every single molecule that we manufacture, we third party test. So they're all very manual processes.

Speaker 3:

And then on the other side, we have AI where you take all the data. Yeah. You do the inference engine. You try to find new insights that's never been found before. But we're this really weird, mixture of, like, leading edge AI Mhmm.

Speaker 3:

And, like, duct taped together Mhmm. with all the manual things. Like, we'll go into someone's house, measure for mold and toxins, water toxins and air toxins, and look at the materials in the house. So it's still very manual and automated at once. So it's kind of a cool world where we're insulated to some extent in terms of software.

Speaker 3:

In software, you just stand it up, and you're very vulnerable Yep. to speed. Whereas here, you're dealing with physical world constraints. You have to bring together all these different disciplines.

Speaker 1:

Yeah. And there's also this interesting brake, in that if you wanna measure the change in someone over a year and you wanna do a year long study, even if you have a million Einsteins in a data center, you have to wait a year to get those results. I'm sure there's more you can do digging up historical data, all the data that you have. But there is just this natural brake on progress when you have to wait for results and then feed that back in.

Speaker 3:

100%.

Speaker 1:

Makes sense.

Speaker 2:

You talked about testing. What are you finding in organic products? If something's organic, do you just automatically assume that it's not gonna kill you?

Speaker 3:

I assume it's worse than non organic no, I'm serious.

Speaker 1:

It's junkyard dog theory. I'm vindicated. It's true.

Speaker 3:

Honestly, organic only looks at a certain subset of toxins. Sure. It's not all toxins. Yeah. And then it's a very limited screening protocol.

Speaker 1:

Okay.

Speaker 3:

So it's really a marketing tactic. Mhmm. When we test organic Mhmm. it typically performs worse than nonorganic on many variables. So, no, I think it's worthless, really, as a marketing protocol.

Speaker 3:

And I would say more broadly, I don't trust anyone. Like, literally, I don't trust marketing. I don't trust brands.

Speaker 2:

Trust no one.

Speaker 3:

I don't trust technicians. I don't trust practitioners. Honestly, that's why we've built this protocol in such a meticulous fashion. We don't trust anyone. We don't trust ourselves. Just trust the data.

Speaker 3:

Yeah. Do our protocols. See what it produces. But I when I go in the world, I'm just like, this is scary.

Speaker 2:

Where do you get things like produce? Things that aren't necessarily you know, you're not gonna add apples to the Blueprint website. Right? Yeah. If you wanna eat an apple, what do you do?

Speaker 3:

So there's no safe place. Exactly. Bryan Johnson.

Speaker 1:

Yeah. No safe place to eat an apple. No apple is safe.

Speaker 3:

Like, you know, people identify, like, plant based proteins as having high toxins. They isolate that because you can test it. Mhmm. But if you eat a carrot, it can have the same, if not more, toxins. Yep.

Speaker 3:

And so it's just that this fresh produce is not measured. Yeah. So people freak out about what they can measure. But we've been testing fresh foods and packaged foods, and, like, toxins are throughout. Yeah.

Speaker 3:

And so it's just a very skewed perception of where toxins are at. But, you know, I typically buy at a farmer's market. We'll do, you know, Erewhon. We'll do Whole Foods. Mhmm.

Speaker 3:

And we test these fresh foods. So we do wanna start growing our own. Mhmm. But even then, it's hard because you're in

Speaker 2:

those In LA, it's a little rough.

Speaker 3:

Yeah. And even those systems themselves like, someone will try to find an escape route and say, well, you know, I have a cow and a pasture, pasture fed, well irrigated, whatever. They can't escape the toxins within the environment. So Yeah. it's just very, very hard to create an isolated, nontoxic system.

Speaker 1:

So there's lots of toxins at risk out in the real world. Talk about other risks in the real world. Have you ever been frame mogged?

Speaker 2:

Sure. People have tried. You know what I mean?

Speaker 3:

That is so good. We should put that on the list. That that escaped our

Speaker 1:

For a million bucks, you gotta protect

Speaker 2:

me from

Speaker 1:

frame moggers. That's, like, the first thing

Speaker 3:

that I want from a protocol. Yeah. What a blind spot we had. We we should have picked

Speaker 1:

that one up. Yes. How are you processing the looksmaxxing virality that's going on? It feels very tangential in some ways. There's some overlap.

Speaker 1:

I've seen, you know, the looksmaxxers say, don't worry about the macros or the micronutrients or the toxins at all. All that matters is how much protein you're getting, because they have a very live fast, die young mentality. But there's still some overlap in that they're trying to look good. We've talked about this before.

Speaker 2:

Yeah. In many ways, I feel like you've been looksmaxxing. You wouldn't necessarily call it that. But you can just see every now and then a photo will go viral where it's, like, you every six months. Yeah.

Speaker 2:

And people are like, wait, it's working. Yeah. So in some ways it's kind of marketing. There's a lot of work kind of under the surface. But at the same time, it's very clear people want you to look better every month, or they're gonna say, hey, can I really trust this guy?

Speaker 3:

Yeah. I mean, there's two things. One is, I'm like you guys in your disposition towards advertising

Speaker 1:

Yeah.

Speaker 3:

You know, it's like, yes, and. Yeah. Like, if someone's out there doing their thing, you know, like, more power to them.

Speaker 6:

Yeah.

Speaker 3:

Yeah. And, like, whatever their jam is if they're doing looksmaxxing, like, you know, do your art

Speaker 1:

Yeah.

Speaker 3:

And be beautiful. That's great. And so, yeah, I'm very much a proponent of, like, let humans be beautiful in their manifestation. Sure. We're basically saying that, as we all realize, something's happening in the world right now.

Speaker 3:

Mhmm. It's big with AI.

Speaker 1:

Yeah.

Speaker 3:

In some form or fashion, it's evolving the world. And our primary hypothesis is as this evolves, the major shift is humans are going to want life.

Speaker 2:

Mhmm.

Speaker 3:

Like, you know, if AI takes our jobs and they take what we do and we wanna find new identity, there's gonna be a major shift towards don't die. And it's gonna become the probably the most dominant ideology in the world very quickly because we're gonna be pointing ourselves towards vibrancy.

Speaker 1:

That is an interesting encapsulation, because it does feel a little bit like the two paths meme, with the dark castle and the bright, you know, palace, in the sense that a lot of the looksmaxxers are saying they're blackpilled. They don't believe that there will be a future. So health actually takes a backseat Yeah. to looks, because that's what helps you get ahead this year, this month. And it's very short termist.

Speaker 3:

Yeah.

Speaker 1:

It's very short term thinking. You've always framed it from day one around very long term thinking. So even though I think people might say, oh, well, like, he's trying to look good. That guy's trying to look good. They're similar.

Speaker 1:

There's actually a huge divergence in the philosophy. At least it feels like that. I don't know, Jordy. What do you think?

Speaker 2:

I had another question.

Speaker 1:

Please. Take it down, Jordy.

Speaker 2:

So you have this you're starting with the ultra premium

Speaker 1:

Mhmm.

Speaker 2:

Product, the $1,000,000 a year. And I can see all the justification for that. I imagine you wanna go more and more kind of down market over time so that somebody, if they have $1,000,000 or $100,000 or $10,000 or $1,000 or $100, can kind of benefit from the program. It feels like there would be an insane amount of demand for something like, right today, a thousand-dollar program, something that's effectively a PDF that somebody could pay for, download. And because they paid for it, we talked about this yesterday because Tai Lopez was in the news because of his whole, like, RadioShack Yeah.

Speaker 2:

Debacle. And I was saying, when I was probably 18 or 19, I paid for a PDF of a workout plan. Yeah. And because I paid for it, I followed it really closely and I got a lot more value for my money.

Speaker 3:

Yeah.

Speaker 2:

Whereas I feel like for you, the second that you launch, like, a traditional info product, you will be attacked, even though I think that at a thousand dollars, you could put something together that would provide somebody potentially tens of thousands, hundreds of thousands. Right? Just because in the health game, if you just buy a PDF and then that gets you to sleep an extra hour a night for the rest of your life Yeah. Like, the economic value creation, or the money that you're not spending on hospital bills or things like that, would be tremendous. Like, how are you thinking about that?

Speaker 2:

Because, again Yeah. Like, I think that there would be value there for me in paying again for that thousand-dollar product that I could just kind of follow and learn something from.

Speaker 3:

Mhmm.

Speaker 2:

The value would be there. But how are you thinking about it?

Speaker 3:

Yeah. That's exactly what we're doing. And so the target audience we're after is those who want to say yes to health. Mhmm. And we say, do this.

Speaker 3:

Mhmm. Yeah. So if you wanna go out and source your

Speaker 1:

own

Speaker 3:

peptides, inject yourself, you know, and, like, do your own protocol, like, do that. We are going to deliver autonomous health. Mhmm. And so, like, there's self-driving cars, there's self-writing software. Mhmm.

Speaker 3:

This is autonomous health. Mhmm. So we're building AI that just says, do this. And so Yeah. Do you remember early on when Facebook did that study where, with 300 likes on the platform, they could predict you better than your partner could?

Speaker 3:

This was, like, the first time social media showed the power of prediction.

Speaker 6:

Okay.

Speaker 3:

And so there's, like, an equivalent question for us: how much data do we need on your body, like, of blood, of wearables, of whatever, to predict you Yeah. Better than you can? Mhmm. And how do we stitch this together? So I think the free version Yeah.

Speaker 3:

Is, like, some minimal viable amount of data. Yeah. That goes into our AI inference machine, and we then say, do this, and we can get a higher yield than anything they could ever do themselves.

Speaker 1:

Mhmm.

Speaker 3:

Yep. And once that happens, once you take a Waymo, you're like, I'm never getting in with a human driver ever again.

Speaker 1:

Yeah.

Speaker 3:

Right? It's a very clear so this will be the same thing, where once you deal with the Immortals product Yeah.

Speaker 1:

Mhmm.

Speaker 3:

You're like, I'm in. Like, definitely, this is a better control system than anything I could produce.

Speaker 2:

How are you thinking about wearables? Most of the people that come on the show will be rocking an Apple Watch, or maybe a Whoop, or an Oura. It's very, very common. At the same time, it's missing visual. It's not like pulling in Yeah. The health decisions that somebody makes every day can't always be picked up here.

Speaker 2:

It'd be great to be able to say, okay. What did this person eat? What time did they actually go to bed? All these different things. What do you think the future of health wearables is?

Speaker 3:

Yeah. So that's exactly, again, what we're building with these first three people in the Immortals. Like, my life over the past five years has generated a few billion data points. A few 100,000,000 are useful. Mhmm.

Speaker 3:

That has created a dimensional representation of a human that is higher fidelity than anything ever produced in history.

Speaker 7:

Mhmm.

Speaker 3:

And so we've taken that model and replicated it with others, and so we're gonna try to basically cast this gigantic net of data capture. Mhmm. We'll bring it all in and say, where's the highest signal inputs? Yeah. And then we'll scale those things.

Speaker 3:

And so, like you're saying, like, we may find that lighting at bedtime

Speaker 2:

Yeah.

Speaker 3:

Is an incredibly important factor. And, like, you know, right now, people don't really think about that or the lighting environment in the room, and you can't do, like, specific photon counts. Yep. And so we're gonna try to tease out, like, where are the little teeny tiny things that have this gigantic yield? Mhmm.

Speaker 3:

But it's all about data collection. It's about the inference engine, and it's about people who are willing to say, yes, I'll follow the protocol because it's inherently better than what I would do myself. Otherwise, they're just guessing. And then we'll have everything Did

Speaker 2:

you ever think about a program where an actual live human just follows around the person in the program and is effectively there twenty four seven? Mhmm. Kind of like a effectively a nanny for the human because it's like, hey, like, you say you're doing everything right, but like right before bed, you were you had a phone just blasting light in your face. It it feels like that's kind of what you can get to with wearables over Yes. Time.

Speaker 2:

And you can probably get there today. But I think a lot of people, even if they do a program, will still be kind of sneaking little things in throughout the day that are working against them.

Speaker 3:

And they even forget. They're like, I do everything you say. Like, I finished my final meal of the day four hours before bed, and blah blah blah.

Speaker 2:

But you are hiding bright lights before bedtime.

Speaker 3:

Yeah. Yeah, exactly. I can't sleep. Like, what do you do before bed? Well, I'm in my bed, like, you know, scrolling TikTok.

Speaker 3:

Yep. And, okay, well, that could be the source.

Speaker 1:

What's the current thinking on peptides? There's a variety of them, semaglutide all the way down to Reta. What's the most up-to-date, like, evidence on benefits and cost that you've seen?

Speaker 3:

I mean, generally, peptides are fantastic. They are drugs, drug equivalents.

Speaker 1:

They have

Speaker 3:

drug-like effect sizes. And some peptides have done the clinical work, like tirzepatide. But then there's a whole bunch of other classes of peptides that we have no human data on. Really? Yes. So the thing That people are taking a lot

Speaker 2:

of, right?

Speaker 3:

A lot of, yeah.

Speaker 2:

So we're going to get some human data.

Speaker 1:

And is that currently Reta? Is that the one that's, like, not

Speaker 2:

I don't know about Reta, but

Speaker 1:

Or there's, like, a number of ones, like, that I'm not even thinking of.

Speaker 3:

There's a lot.

Speaker 1:

Yeah. Okay. There's a lot. There's a long tail of, like, Yeah. All sorts of slightly different, one-molecule-change

Speaker 3:

or something. Okay. And currently, I mean, it's all the rage. Yeah. The

Speaker 1:

Chinese peptide thing is a whole meme in San Francisco.

Speaker 3:

It's a whole thing. Yeah. Yeah. And so my take on this is, if you do it, like, you know, be cautious, because we do not have them characterized. Like, when Mhmm.

Speaker 3:

When you hear about a drug that does blood pressure control

Speaker 2:

Mhmm.

Speaker 3:

You hear all the benefits, and then you hear, like, the commercial, you hear, like, this

Speaker 1:

may Very low cause status.

Speaker 3:

This may cause Yeah. Death or kidney damage or, you know, like, you hear all the side effects because it's been well characterized. So it's gone through clinical trials, you understand the pros and the cons, and you make that trade-off. With peptides, you don't know. Yeah.

Speaker 3:

It's like your bro buddy's like, it does this amazing thing, and it

Speaker 2:

fixes And the whole supply chain is incentivized to hold back the statement of, like, hey, by the way, this could do this terrible thing.

Speaker 3:

Yeah. So if you're doing it, proceed with caution because they're not characterized, we don't know. It's not to say they couldn't have benefits.

Speaker 2:

Mhmm.

Speaker 3:

It's that you don't know. And that's the worst thing in health, the blind spots. Yeah. People wanna chase the positives, but rarely attend to, like, you know, there could be some downside, so stay cautious.

Speaker 1:

Yeah. Wasn't that the name of Marty Makary's book, Blind Spots, the new FDA commissioner? I'm pretty sure that's what he called it. Do you have confidence in the FDA right now? Do you have confidence in the FDA's ability to actually collect all the data and make good recommendations?

Speaker 1:

Because there's some things where I totally agree with you. I'd much rather take the one that's gone through FDA trials. Been approved. It tells me the downsides. But at the same time, people are like, but I also got all this sugar in my cereal for years.

Speaker 1:

Like, were they asleep at the wheel back then? Like, what's going on?

Speaker 3:

Yeah. I'm guessing that, like, the FDA has a very viable purpose in society. They do

Speaker 5:

Of course.

Speaker 3:

They do good work. Yeah. I'm guessing

Speaker 1:

They stopped the literal snake oil. Famously.

Speaker 3:

Exactly. Yes. They did. Yeah. Like, they've had a positive role.

Speaker 3:

I think that we will exceed them in value in many core areas. Okay. Because they can take, you know, well-structured, well-funded clinical trials. Yeah. But then other things, like peptides, that are very promising Sure.

Speaker 3:

They're not going to have the clinical work. So what do you do? Mhmm. So if we could get, you know, a certain number of people on the platform who are doing peptides, great. Mhmm.

Speaker 3:

And if they can start adding dimensionality to their bio collection, we could have the most valuable dataset. Yeah. So Immortals is the product. That's the high end right now, but then there's also the free version we're building. Sure.

Speaker 3:

Yep. And our goal is to create the world's largest dataset Yeah. Across all these sectors. And then you basically can create an ecosystem where the person could be paid for their data. Mhmm.

Speaker 3:

So they collect the data. They're starting to do the experimentation. Interesting. Companies come in and buy the data they give out. So we're trying to create this Yeah.

Speaker 3:

Financially incentivized ecosystem, where, like, in the world right now, you can't go anywhere without somebody or something trying to take advantage of you. Mhmm. Like, it's just everywhere. Like, try to think of a company that genuinely acts in your best interest. Like, unequivocally, always shows up acting in your best interest.

Speaker 2:

Diet Coke, I would tell you. It's always there when John needs it. Except when the flight attendant says, how about Pepsi?

Speaker 1:

Yeah. It's true.

Speaker 3:

So we're trying to stand in that role where, like Yeah. Where, like, we're also jaded because everyone's trying to take advantage of us. We're trying to stand in that role so that you can unequivocally trust us. We will always act in your best interest.

Speaker 1:

Mhmm.

Speaker 3:

And if we do that, then, like, I'm willing to share my data. I'm willing to engage in the platform and do the exploration.

Speaker 2:

It's a high bar. I wanna run through a lightning round of questions. Mhmm. What have you learned from Ray Peat?

Speaker 3:

That there's a lot of repeaters on X. Yeah. And they really want you to follow their Yeah. They do.

Speaker 2:

They want you to peat out?

Speaker 3:

They do. Yeah. They're definitely evangelists. Ray Peat has a lot of evangelists.

Speaker 1:

Carrots Yeah. Underrated, overrated?

Speaker 3:

Yeah. Carrots are a great food. You know, they're not benign. Like, no food is benign. It has Yeah.

Speaker 3:

Pros and cons.

Speaker 1:

Think so.

Speaker 2:

What else you got, Jordy? Where would you live if you weren't a public figure? Oh, no. And you could just optimize for health.

Speaker 3:

Yeah. You know, Kate's here. Hey, Kate.

Speaker 2:

Hey, Kate.

Speaker 3:

Yeah. We were just having this conversation. She's from Australia. We were trying to figure that out. It's really an interesting question.

Speaker 3:

Yeah.

Speaker 2:

Yeah. Yeah. LA is certainly not the place if you're purely optimizing for health. Yeah.

Speaker 3:

Yeah. That's very true. Yeah. So, like, it has been top of mind for us, and we are trying to figure out, like, if we did something next level, where would we go and how would we

Speaker 2:

do it?

Speaker 3:

I don't have an answer yet.

Speaker 2:

Yeah. Yeah. I think it's an important question. There's data on golf courses. I don't know if you've seen this.

Speaker 3:

I have.

Speaker 2:

If you live near a golf course Yeah. It like reduces your life expectancy because they use such heavy pesticides Mhmm. Yeah. On the golf course because golfers How close to

Speaker 1:

the golf course? There was one in my town.

Speaker 2:

That's probably not great. But these studies were done on, like, people that live, like, on a golf

Speaker 1:

Oh, like right on

Speaker 2:

So, like, you're living on the golf course, and they're basically, you know, golfers just want it to look pretty. Yep. They're not asking, like, hey, what did you do to make it look so pretty?

Speaker 1:

Yeah. Real

Speaker 2:

experience sport golfing. What are your long-term kind of concerns around the health of us being in this studio eleven to two every

Speaker 1:

Rubber smell? Yeah. You smell it?

Speaker 2:

Yes. So that is I think it's these things.

Speaker 3:

Honestly, I love that question so much. I would love

Speaker 2:

And I've thought about it a lot. And I would love to work on this because, like Yeah. An LED light here.

Speaker 3:

It's so much in

Speaker 2:

our head. I don't know about these lights.

Speaker 1:

Maybe you can consult on our next video. We're moving soon.

Speaker 3:

So I would love to come in here and do a comprehensive analysis.

Speaker 1:

Oh my god. No.

Speaker 3:

It would be so good.

Speaker 2:

No. The air quality, I mean, we spend

Speaker 3:

a lot of time in here. Yeah.

Speaker 2:

And we're planning to spend a

Speaker 1:

lot Bryan's gonna come back and say, you guys have ten. We're gonna be like, ten years? He'll be like, nine, eight, seven.

Speaker 3:

I mean, first of all, your CO2 levels for sure are too high. Sure. I can feel the CO2. Right? Kate, do you feel the CO2?

Speaker 3:

Yeah. I saw the CO2.

Speaker 1:

A CO2

Speaker 2:

meter, for sure.

Speaker 3:

Yeah. So the space is too

Speaker 4:

small. Have those my

Speaker 2:

yeah. I have the meter in my house. Yeah. I'll move it around to the different rooms to kind of track it, and I've got a whole like, I've done a bunch of work to the HVAC system.

Speaker 3:

And see, over a thousand is gonna impair your thinking.
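
The rule of thumb in this exchange, CO2 above roughly 1,000 ppm impairing thinking, can be sketched as a quick check. The threshold and the two studio readings quoted later in the show (790 and 810 ppm) are the only figures taken from the conversation; the function name and labels are illustrative, not a health standard:

```python
def co2_flag(ppm: float, threshold: float = 1000.0) -> str:
    # Rule of thumb from the conversation: readings over ~1,000 ppm impair thinking.
    return "over threshold: may impair thinking" if ppm > threshold else "under threshold"

# The studio readings quoted later in the show were 790 and 810 ppm.
studio_readings = [790.0, 810.0]
flags = [co2_flag(r) for r in studio_readings]  # both come back under the line
```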

Speaker 2:

Totally. Yeah. So that's

Speaker 1:

obvious from the stream. You watch it. Like, you know, these guys take something like

Speaker 2:

I like sunglasses right out. How are you thinking about parasites? There's tons of health influencers that will make parasites their whole thing, and, like, the solution to every problem is, like, gotta fix the parasites.

Speaker 3:

Yeah.

Speaker 2:

How are you thinking about them?

Speaker 3:

I've not done a deep dive into that. I don't know. So, these are people who have parasites, and they're trying to get rid of them?

Speaker 2:

The argument is everyone has them. Yeah. Some amount are probably helpful

Speaker 3:

Yeah.

Speaker 2:

Because they'll, like, eat heavy metals. Mhmm. And then you'll eventually kind of Yeah. Release them, things like that. But you should do a deep dive

Speaker 3:

on Okay.

Speaker 2:

Noted. How do you cycle supplements? Right now, I'm taking L-theanine, glycine, and a number of other things before bed. It's been super effective at getting REM and deep sleep up. But I'm curious, when you find a supplement that is working, and it's, like, measured in my case, how should you cycle it? How do you think about cycling it to make sure that you continue to get the benefit?

Speaker 3:

Yeah. This is one of the questions we're asking the inference engine, because we've done a whole bunch of these different cycles. Yeah. And it's difficult because it's never truly a nonconfounded experiment. Right?

Speaker 3:

There's always some confound.

Speaker 1:

Mhmm.

Speaker 3:

And so when you're trying to tease out a little tiny signal, you do need to be robust about the analysis. So we don't know. Like, you know, we have, like, some short-term data on that, but it's not strong enough that I think I'd express an opinion. Yeah. So we're hopeful that the bigger datasets will give us insight.

Speaker 3:

Yep.

Speaker 2:

Same kind of question on fasting. I inverted the fasting schedule because we do the show. We'll have, like, a big breakfast at eight. I'll have a second meal at around 2:30, and then I won't eat until the next day at 8AM. So typically, somebody might start eating at twelve, have another meal around dinner, and then wait until the next day Yeah. So I flipped it.

Speaker 2:

Any learnings, personally?

Speaker 3:

The marker I would look for there probably is

Speaker 2:

And to be clear, part of the reason is there's actually a functional reason, which is that the way that we do the show, I need to be Mhmm. Like Powered up. Fueled up Yeah. To then podcast for three hours. Yeah.

Speaker 2:

But then I also don't like having a big dinner and going to bed.

Speaker 3:

Yeah. Yeah. You guys work out in the morning together. Is that right?

Speaker 1:

Yeah. 06:30?

Speaker 2:

So, yeah, at 06:30, huge meal.

Speaker 1:

Huge meal at eight.

Speaker 3:

Eight.

Speaker 1:

Yeah. Eight to nine.

Speaker 3:

Yeah.

Speaker 1:

Anyway, thank you so much for coming on

Speaker 3:

the show.

Speaker 1:

Yeah. This is fantastic.

Speaker 2:

Yeah. Yeah.

Speaker 1:

Can go all night, all day. Yeah. Let me tell you about Congrats

Speaker 2:

on this.

Speaker 1:

MongoDB. Choose a bit of

Speaker 2:

Build the end of the week.

Speaker 1:

Flexibility and scale with best-in-class embedding models and rerankers. MongoDB has what you need to build what's next. And without further ado, we have Matthew Zeitlin, the correspondent from Heatmap News, in the Restream waiting room. Let's give

Speaker 2:

it up for correspondents.

Speaker 1:

I'll tell you about Figma while we wait for him. Figma Make isn't your average vibe coding tool. It lives in Figma, so outputs look good, feel real, and stay connected to how teams build. Create code-backed prototypes Tyler

Speaker 2:

Tyler tested the CO2. We're at 790

Speaker 4:

We're at eight ten right now.

Speaker 1:

Eight ten

Speaker 2:

and rising.

Speaker 1:

He caught us on a lowdown.

Speaker 2:

It's going up.

Speaker 1:

He said a thousand. He said it would be good

Speaker 4:

to see what it was at the end of the show.

Speaker 1:

Oh, okay. Okay. Because we will add more CO2 as the show goes on.

Speaker 2:

Edge in the chat says, Jordy has three

Speaker 1:

days in one. He's figured out how to manipulate time. You are manipulating time. Anyway, we have our next guest in the Restream waiting room. Let's bring him into the TBPN Ultradome.

Speaker 1:

How are you doing? Sorry to keep you waiting.

Speaker 2:

What's happening?

Speaker 10:

I'm doing great, guys. Any opportunity to follow up Bryan Johnson is one I

Speaker 3:

will take.

Speaker 1:

It is a fun story. Yeah.

Speaker 2:

He was your he was your opener.

Speaker 1:

I mean, you've clearly been doing a lot of looksmaxxing. What's working for you?

Speaker 10:

You know, I do have friends. Actually, John and I play a poker game with a Ray Peat acolyte.

Speaker 1:

So Okay.

Speaker 2:

I've been

Speaker 10:

I've been at least hearing about Peating, you know, forever.

Speaker 2:

Early. I immediately resonated with Yeah. The work of Ray Peat. Yeah. I've followed a lot of it myself.

Speaker 2:

I think it would absolutely astonish and scare Bryan Johnson how much sugar I have. That's kind of

Speaker 1:

You do love

Speaker 2:

one sugar. One of the main takeaways from Ray Peat's work is that sugar is Mhmm. Not nearly as bad as many people would say, and may actually be great Mhmm. In a lot of ways. So I've been running that experiment.

Speaker 1:

We'll find out.

Speaker 2:

But

Speaker 10:

I have a two-year-old at home, so I can't eat cookies around him. You know, then he starts asking for them and Yeah. Yeah, it's getting in the way of my Peating, actually.

Speaker 1:

I had a friend who had a bunch of kids and wound up just obsessed with dark chocolate because it was the only thing that was at all sweet that he could eat that the kids wouldn't immediately go for and consume. Olives, too.

Speaker 2:

No. Explaining to my four-year-old Yeah. Why I can have cookies before bed but he can't is not good.

Speaker 1:

That's a tough one.

Speaker 10:

Yeah. I'm still waiting for that level of cognitive development from Yeah. From my son. We're not really at the explaining stage yet.

Speaker 1:

Yeah. Yeah.

Speaker 2:

Anyways, it's great to We've covered a bunch of your posts Yeah. Over the last six or twelve months

Speaker 1:

Yeah.

Speaker 2:

And always always enjoyed them. Why don't why don't you give a quick intro on yourself, and then, you know, we can talk about all the stuff that you talk about.

Speaker 10:

Yeah. Yeah. My name's Matt Zeitlin. I'm a reporter at Heatmap. We're a climate- and energy-focused newsroom.

Speaker 10:

We report on climate and energy every day, and I mostly write about, you know, energy policy, generation, a lot of data centers. You know? Yeah. We've kind of really reoriented what we're doing around data centers. That's where all the demand is coming from.

Speaker 1:

Yeah. When did that reorientation actually kick off? Because it felt like it became a story sort of mid last year, but have you been tracking it longer? Like, when was the inflection point?

Speaker 10:

So I think around, like, 2023

Speaker 2:

Mhmm.

Speaker 10:

I wrote a story about how load growth was coming. That we'd essentially had flat electricity demand since the start of the twenty-first century.

Speaker 1:

Yeah.

Speaker 10:

And when I wrote the story, it was more about electrification of cars and then home heating, you know, heat pumps, stuff like that. Yeah. Stuff that climate people are really interested in. And then data centers to some extent, and also factories, were something I was really looking into as a big electricity demand. Mhmm.

Speaker 10:

And then, yeah, I mean, since the launch of GPT-3, it became pretty clear that data centers were where it was at as far as the load growth story.

Speaker 1:

Yeah. And then, like, how widespread is the focus on AI data center energy consumption? Because it feels like if it's going up next to your house, you're gonna be calling your representative. But for a lot of people, it's abstract. They maybe see an Instagram reel, some slop, and they're like, I don't like it, and I don't want my rates to go up.

Speaker 1:

But as we

Speaker 2:

We've been kind of wrestling with this because, generally Yeah. I think AI is very good. Yeah. There's a lot to be excited about. I think that more intelligence, having, you know, people globally being able to access intelligence, even if they're accessing a little bit of slop, is genuinely good.

Speaker 2:

Yeah. But at the same time, I understand the people that are just like, what's the benefit of having a data center in your backyard? It's, like, actually probably close to zero because Yeah. Or negative, because I don't need the data center in my backyard. It's not exactly pretty.

Speaker 2:

We're not making them look like cathedrals yet. Oh, yeah. Maybe we should. And, like, if it's out somewhere that I've never even been to, I can still get all the benefit of it. And so, yeah.

Speaker 2:

I think these are things that the whole industry is wrestling with, politicians are wrestling with, and and for good reason.

Speaker 10:

Yeah. I mean, so the way I kind of like to look at it is electricity prices. That's kind of a nice

Speaker 1:

Sure.

Speaker 10:

Solid thing to focus on. And obviously, they've been up in a lot of parts of the country. You know, you guys are in California. I grew up there. I hope you don't have a pool at home, you know.

Speaker 10:

Anything around heating or something like that, a pool or something like that, is a disaster

Speaker 1:

Sure.

Speaker 10:

As far as prices go. But, no, where we've seen electric

Speaker 2:

prices We gotta replace the pools with data centers.

Speaker 10:

Exactly. Exactly. You need a Blackwell rack.

Speaker 1:

Yeah. Or just a stack of Mac minis. Yeah.

Speaker 10:

Of course. Yeah. John can send you over a Mac mini.

Speaker 1:

Yeah.

Speaker 10:

But no. Where are we seeing kind of electricity price increases? California has been a big one, and the Northeast has been big. Those are not areas with lots of data center development. I think it's in part because the prices

Speaker 1:

Sure.

Speaker 10:

Are so high. Yeah. Where we have seen both data center development and price increases has been kind of in the Mid Atlantic, Midwest areas. Mhmm. This is Virginia.

Speaker 1:

Yeah.

Speaker 10:

This is Ohio. This is Indiana. Mhmm. These are areas that are all in one electricity market called the PJM Interconnection.

Speaker 1:

Okay.

Speaker 10:

And the way this market works is that these utilities have to procure capacity in advance. Mhmm. And that market, that capacity market, and the ability to get new generation online there, has been completely messed up. And so we're seeing these auctions generate billions of dollars of revenue for the existing generators, and that's translating pretty directly into higher prices. New Jersey has probably seen their residential prices increase, you know, say, 20%.

Speaker 10:

That's because they're in a system that is not incorporating the data center demand that well. But there are other parts of the country where electricity prices, you know, haven't gone up

Speaker 1:

How have you

Speaker 10:

as much.

Speaker 1:

How have you been reacting to some of the big AI companies, some of the hyperscalers, just start to put out statements saying that they will pay above market rates, subsidize energy? Do you think that that's going to be effective? Can you just do that and then keep the prices stable? Also, a little bit of grid history here would be helpful because it feels very intuitive, but I believe, like, we sort of have a common carrier rule where even if if I'm doing, you know, research and you're watching Netflix, like, we still pay the same price. We don't put, like, a moral weight that I'm aware of, but, any of that would be helpful.

Speaker 10:

Yeah. So typically, you know, residential users pay the same kind of per-kilowatt-hour

Speaker 1:

Yeah.

Speaker 10:

Rates. Sometimes there's time-of-use differences, especially in a state like California. Mhmm. But to get to this point and then industrial users often pay as well. Those would be the data centers.

Speaker 10:

What they mean when data centers say they're gonna pay for their own electricity is not just, like, the literal electricity, the electrons they use. But when you bring on a gigawatt-sized data center, that's like putting a whole city.

Speaker 1:

Yeah.

Speaker 10:

You know, a gigawatt can power 800,000 homes.

Speaker 1:

Wow.

Speaker 10:

That's like putting a whole city on the grid. This requires a lot of upgrades to the whole system, a lot of new equipment you have to buy. And those system costs are typically spread out to everyone on the system. And this is how electric utility regulation and pricing has worked for about a hundred years. This guy Samuel Insull, who worked for Thomas Edison in Chicago in, I think, the twenties, basically came up with a version of this idea.
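
As a back-of-envelope check on the rule of thumb quoted here (a gigawatt powering 800,000 homes), the implied average draw per home is straightforward arithmetic; the 1.25 kW result is my own inference from those two figures, not a number from the conversation:

```python
gigawatt_watts = 1_000_000_000   # 1 GW expressed in watts
homes_per_gw = 800_000           # rule of thumb quoted here: a gigawatt powers 800,000 homes

# Implied average draw per home, converted from watts to kilowatts.
avg_kw_per_home = gigawatt_watts / homes_per_gw / 1_000  # 1.25 kW average per home
```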

Speaker 10:

And so in theory, having new sources of demand over time should lower prices for everyone, because you have a big electricity buyer that can deal with those kind of distributed system costs. But in the short run, you know, they're incurring lots of new costs and new infrastructure development for the system as a whole. And so I think Microsoft has made a commitment to this, and Anthropic did yesterday. You know, you can't do a utility rate case in a press release. But what they are committing to, I think, is that in the electricity rate that they pay, they wanna get all of those extra system costs that they're incurring Mhmm.

Speaker 10:

To just be on them.

Speaker 1:

Mhmm. What about the other stories and pushback to the data center build-out? How did you process the whole water risk hype cycle? It feels like there was a lot of fear, and then there were some more statements from Google, some studies. How have your community and you personally processed that narrative?

Speaker 10:

Yeah. So this water issue is, I think, kind of a surprise for a journalist covering this space. It is definitely something that people, especially people online, are, like, really interested in. You know, it's actually pretty easy to track how data centers can affect electricity prices.

Speaker 1:

Yeah.

Speaker 10:

How data centers can affect water usage, in any kind of systemic sense, doesn't really show up that much. The water usage isn't that great. But it's definitely something

Speaker 2:

But the numbers sound great. Yeah. It's like Yeah. I mean, we're using a 100,000,000 gallons.

Speaker 2:

Yeah. It sounds like a lot. The Yeah. But, you know, the typical golf

Speaker 1:

course. Yeah.

Speaker 10:

Yeah. Well, golf courses are But, you know, household usage of water is huge. You know? And individual households' usage of water over a year sounds really, really big.

Speaker 1:

Yeah.

Speaker 10:

I think there's some issues around construction and water usage, but those are obviously more temporary. Sure. Yeah. But it's definitely something that people are concerned about. And so I think these hyperscalers are really going out of their way to kinda talk about being water positive.

Speaker 10:

Yep. They're really kind of leaning in on the water aspect. But I think the electricity aspect is much more where the rubber meets the road, and they have to make some commitments. You know, a one gigawatt data center, people think, costs about $50,000,000,000, give or take. Obviously, the vast majority of that is the chips, but, you know, 10 to 15% of the lifetime cost of that is electricity, and you're paying all these extra grid costs. That means with, like, one of these really, really big data centers that all these companies are rolling out now, they're volunteering to incur what could be hundreds of millions, if not billions, of, quote, unquote, extra cost.
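The back-of-envelope math in that claim can be made explicit. A sketch using only the round numbers from the conversation ($50B per gigawatt-scale facility, electricity at 10 to 15% of lifetime cost):

```python
# Back-of-envelope lifetime electricity spend for a ~1 GW data center,
# using the round figures quoted in the conversation.
TOTAL_LIFETIME_COST = 50e9        # ~$50B total, mostly chips
ELECTRICITY_SHARE = (0.10, 0.15)  # electricity as 10-15% of lifetime cost

low, high = (TOTAL_LIFETIME_COST * s for s in ELECTRICITY_SHARE)
print(f"Lifetime electricity spend: ${low/1e9:.1f}B to ${high/1e9:.1f}B")
```

With a baseline electricity bill in the $5B to $7.5B range, even a few percentage points of "extra" grid cost layered on top lands in the hundreds of millions, which is the range being described.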

Speaker 10:

But that just might be the price they have to pay to get approval from the state governments to build these facilities.

Speaker 2:

Yeah. Who have been some of the craziest beneficiaries of the boom in sort of energy demand due to AI? Anybody that was just kind of, like, sitting around a few years ago and then, you know, just kind of tripped into making, like, a $100,000,000 or a billion dollars due to having Yeah.

Speaker 10:

I mean, there's two great examples of this that are kind of on the opposite ends of technological sophistication. One is Caterpillar, which we obviously associate with mining and construction. You know, anyone, I think, with boys at home probably has some Caterpillar toys.

Speaker 2:

Oh, yeah.

Speaker 10:

They can probably identify all the equipment they use. But they also have a business called Solar, which builds these kind of small gas turbines. You know, the big gas turbine in a power plant is hundreds of megawatts of capacity. These are a few dozen, maybe, at the biggest. And they're kind of used for, like, remote power, often in the oil and gas industry, or, like, you know, compression on a pipeline or something.

Speaker 10:

This business has gone huge, and they're now a huge supplier to data centers because they're so desperate for power. I mean, Elon Musk's xAI is the big innovator here, realizing that grid connection studies can last for years. But if you just kinda pile up these small, inefficient turbines, you can get the power going really, really quickly. Caterpillar's a big supplier to Meta in Ohio. Mhmm.

Speaker 10:

The Socrates project is, I think, a few hundred megawatts. And Williams, which is actually a pipeline company, is kind of assembling all these turbines and reciprocating engines, essentially, like, giant car engines, to power these data centers. And so that business has been doing incredibly well.

Speaker 1:

Car engines, what do they run on? Gasoline? Diesel?

Speaker 10:

Natural gas.

Speaker 1:

Natural gas. Okay. Yes.

Speaker 2:

What kind of environmental studies have been done? If a data center sets up, you know, in your backyard and they've got a bunch of these gas turbines running, is that, like, you know, if you can, you move away, or does it dissipate quickly? Has there been any kind of significant Yeah.

Speaker 10:

I mean, these turbines are less efficient and dirtier than, you know, a very large scale turbine system you would buy from a GE Vernova or Siemens. I know with xAI, they were kind of really monkeying around with these rules about how you could only have a turbine up for three hundred sixty-four days. By the time the EPA certified it, they were already moving to Mississippi. I mean, these systems are not really designed to function as power plants in the same way that, like, a power plant is, but people are so desperate for power that they are gonna be using them.

Speaker 10:

Yeah.

Speaker 1:

I mean, it feels like there's going to be a variety of backlashes, a variety of pendulums swinging. The hyperscalers were very focused on net zero clean energy. Now it feels like they're trying to go as fast as possible, so there's a lot of natural gas. But then they're also still firing up nuclear projects, more wind, more solar. How do you see the mix, like, shifting over the next couple of years?

Speaker 1:

Do you think there's

Speaker 10:

any Yeah. So the

Speaker 2:

sort of

Speaker 10:

white pill here? The hyperscalers, your Googles and your Microsofts especially, and Meta and Amazon, do have these sustainability commitments. These are 100% renewable power commitments. They still are procuring lots of clean energy.

Speaker 10:

Sure. Google bought a clean energy developer Mhmm. Pattern, for four and a half billion dollars at the end of last year. And what they do is, you know, in some cases, they get the natural gas turbines, and they also buy, say, solar projects in a similar area. That's what Meta is doing right now in Ohio.

Speaker 10:

But, yeah, I mean, they do these reports and say Microsoft's emissions have gone up a lot because they're consuming a lot more power. And they've gotten way more adventurous in the stuff they're willing to invest in because the scale of their power needs has really changed. So I think all four of the biggest hyperscalers have nuclear projects going on. Google's a big investor in enhanced geothermal. We just talked

Speaker 1:

to Jeff Lawson yesterday. He raised, what, $450,000,000 to do not nuclear fission, traditional nuclear, but fusion, which is

Speaker 2:

Yeah.

Speaker 10:

He's doing the he's doing the lasers.

Speaker 1:

Yeah. Lasers, which is, you know, he said there's no science risk, and I hope that there's not, but it feels like, you know, it hasn't really produced energy on the grid just yet. So it is, you know, very forward looking. But Google Ventures is one of the investors, and they also have a deal with Commonwealth Fusion Systems and a bunch of nuclear projects as well. That's fascinating.

Speaker 1:

What can you tell me about the nature of the pushback politically against data centers? I was looking at this Data Center Watch stat: 55% Republican, 45% Democrat in affected districts. So it feels like I don't want a data center in my backyard is not a left or right wing issue, but No. No.

Speaker 1:

When It's you see

Speaker 10:

fusion. It's one of these kind of fusion issues. Mhmm. We've done a lot of great reporting on this. We have a pro service called Heatmap Pro that really focuses on this stuff.

Speaker 1:

Okay.

Speaker 10:

And what we found is that, yeah, anyone who's kind of distrustful of institutions in general will tend to really not like a data center Sure. In their community. Yeah. It also brings together Democrats and Republicans, because people who oppose renewable energy projects, which at Heatmap is what we were kind of born to start tracking

Speaker 5:

Sure.

Speaker 10:

Was local opposition to renewables.

Speaker 1:

Interesting.

Speaker 10:

The same people are opposing the data centers, and they're using the same techniques. They're opposing, say, rezoning agricultural land into industrial land, something really common with solar panels.

Speaker 1:

Yeah.

Speaker 10:

Same thing's going on with data centers. Same type of people are opposing it. And it's interesting because at the highest levels of politics, you know, the administration right now is very pro the data center AI build out. You know, they have Sam Altman at the White House announcing, you know, multibillion dollar investments in this stuff. Yeah.

Speaker 10:

And yet a lot of data centers are built in Republican areas. That's typically where there's enough space. They tend to kinda be in exurban areas. And, yeah, they get a lot of local opposition. There's a lot of local opposition in Indiana, and a lot of local opposition in Kentucky.

Speaker 10:

Yep. Virginia, this is really kind of transforming the politics around there as well.

Speaker 1:

So OpenAI has a PAC now. Anthropic has a PAC now. I think, for the drama heads in the chat, we'd hope that they're just going to fight with each other endlessly and make content for us. But assuming that they want to bring about broad support for their data center build out efforts, what would you recommend? What's the playbook for an AI lab that wants to build a gigawatt data center somewhere in America?

Speaker 1:

Where are they going? What promises are they making? What promises can they keep? What would you recommend?

Speaker 10:

Yeah. I mean, we see this in the kind of renewable and nuclear space a lot is essentially you do a lot of kind of expensive, extensive, long lasting local engagement

Speaker 7:

Mhmm.

Speaker 10:

With the community. And, like, look, if you're a data center developer, you're gonna end up, you know, building a lot of high school gyms. You're gonna really have to get the local people there

Speaker 1:

Yes.

Speaker 10:

On board with your project. Because typically, at the state level, governors actually like data centers. You know? They provide tax revenue. Yes.

Speaker 10:

Unions like data centers. The IBEW, the International Brotherhood of Electrical Workers, very pro data center. And they're a pro data center force in the Democratic Party. But oftentimes, these decisions are made at these kind of community board meetings, zoning meetings. And for them, you really have to convince them that you're providing something on the ground, because, as Jordy was saying at the very, very beginning, there's no benefit in terms of what the data center produces Mhmm.

Speaker 10:

To having it near you.

Speaker 2:

Yeah. There's no local AI movement.

Speaker 1:

I want low latency inference. I want it in my backyard. I'm a YIMBY. Give me a massive data center. I was hearing bad stuff about living next to a golf course.

Speaker 1:

Tee up the one gigawatt data center right there. I'm good to go. I like my inference local. I like the idea of showing up to the local t-ball league and being like, why is the team called the Core Weavers or something like that?

Speaker 10:

So nuclear power plants famously do this. Yeah? Yeah. No way. I mean, The Simpsons parodied it, you know, you have the Isotopes or whatever.

Speaker 1:

Yeah. The isotopes.

Speaker 2:

And you

Speaker 10:

do see this in communities around nuclear power plants. Yeah. But the thing about data centers, they don't provide very many jobs Yeah. After they're constructed. Yeah.

Speaker 10:

So that part is very difficult to get over.

Speaker 2:

We sorry to interrupt. The solution is they have the data center, and then there's just a big room with a bunch of Pelotons, and you can just go in there and generate energy for the data center, and you get paid that way. So we get a gym, jobs, and a data center in one space.

Speaker 10:

I mean, I think Bryan Johnson, for one, might support that. Although, you know, as we know with the 49ers, they don't like being by an electrical substation when they practice because they're worried about injuries from the EMF. So Yeah. That might be a tough sell.

Speaker 1:

Yikes. Demystify New York for me. What's it like living there?

Speaker 10:

Oh, I love it. I've, you know, I've been here for over a decade. You know, it's back. It's too cold right now. There's too much snow, which is quite weird.

Speaker 10:

I cover climate change. Where is the so-called global warming right now? Mhmm. Who knows? But, no, it's great.

Speaker 10:

People are on the subway again. Really nothing to complain about, you know. Apparently, bonuses on Wall Street were great last year. Our projected budget deficit almost got cut in half magically in a few weeks.

Speaker 1:

So Wow.

Speaker 10:

What could you complain about?

Speaker 1:

There you go. What was your review of Davos from an energy perspective? From a tech perspective, it felt like, oh, like Davos is, a real place that needs to be taken seriously. Like, every company should be figuring out their Davos strategy for next year, whereas, like, previous years were sort of quiet.

Speaker 10:

Yeah. I mean, Davos used to be the place where you would hear the most bogus sustainability commitments. You know? And for better or worse, they kinda stopped doing that. I think as it's gotten more serious, they've stopped, you know, you're not hearing that you're gonna own nothing and be happy anymore.

Speaker 10:

So Yeah. It just seems more serious. And, you know, obviously, as a newsroom and a reporter that's focused on climate change, like, I have a point of view on this, but also, like, I just like seeing people deal in reality. Sure. And I think Davos is more of a reality based place now, which, as a reporter, sounds great to me.

Speaker 2:

That's good. Yeah. Jordy, you have something? You said something which is funny, because you cover climate change, like Yeah. Professionally.

Speaker 2:

But you said there's snow out, which is, like, we're supposed to have climate change. Doesn't feel like we have climate change. But, like, how is kind of the broader clean energy industry even processing that? Because that is kind of, like, more the average viewpoint of Americans.

Speaker 2:

Yeah. Like it's cold this winter. No climate change here.

Speaker 1:

Yep.

Speaker 2:

And that's obviously, like, you know, you can have, you know, different spikes Yeah. And your kind of lived experience is not necessarily entirely representative of kind of

Speaker 1:

But it might inform who you vote for or what you vote for. Right?

Speaker 10:

Yeah. I mean, one of the great ironies of climate policy in The United States is that some of the strictest decarbonization standards and rules and restrictions are in the Northeast. Maine, Massachusetts, New Hampshire, Vermont. Okay. They're really into decarbonization.

Speaker 10:

They're really into preventing climate change.

Speaker 2:

Yeah.

Speaker 10:

And then in Florida, they don't really seem to care that much at all. And, obviously, that's the place that's most at risk in The United States from high temperatures and rising sea levels. Yeah. So, yeah, I mean, it's just an issue where, I mean, this is something that people in the industry complain about a lot. Well, now they're complaining about it.

Speaker 10:

Mhmm. This stuff is super politicized.

Speaker 1:

Mhmm.

Speaker 10:

And, you know, I think a lot of people in the industry, when a Republican is president, really wanted to say, like, oh, this is not about politics. This is not about climate change. This is about clean, fast, cheap energy. You know, solar is the thing we can get on the grid the quickest. Solar and storage, we can get on the grid the quickest.

Speaker 10:

But, yeah, I mean, it's funny. The most successful green tech entrepreneur and the biggest solar maxi in human history we're talking about is Elon Musk. He's also the second or third biggest Republican donor of all time. The Republican party has tried to dismantle a lot of tax benefits and programs that benefit Tesla, and he doesn't seem to care that much.

Speaker 10:

Yeah. So, yeah, I mean, people's politics really kind of often override Yeah. Everything else. And obviously that's not something unique to climate change, but it's a huge challenge for these businesses.

Speaker 1:

Yeah. What are you tracking on the solar panel manufacturing supply chain? We talked to the CEO of T1 Energy. Seems like there's some green shoots, but even his optimistic case was, oh, maybe we get to, like, 1% of production relative to China.

Speaker 10:

I mean, yeah, China has solar and batteries.

Speaker 1:

Mhmm.

Speaker 10:

The amount of kind of time, energy, and money that has been devoted to that has been immense. And every time, it's a big balloon squeezing effort too. I'm sure T1, and First Solar, which is the other big American solar manufacturer, can talk about this. We try to, you know, put tariffs on Chinese made solar. These factories start popping up everywhere.

Speaker 10:

They pop up in Malaysia, Cambodia, Thailand, Indonesia. And then we start putting tariffs on those factories because they're owned by China. Yeah. But it's tough. I mean, they're the cheapest panels.

Speaker 10:

They get done fastest. And, yeah, these domestic solar manufacturers are really similar to the car companies, ex Tesla: they're very dependent on government policy. Mhmm. You know, the developers, if they could, would just buy Chinese solar panels. Yeah.

Speaker 10:

But, you know, there's a lot of political reasons why they're being encouraged. And now because of these tariffs

Speaker 1:

Yeah.

Speaker 10:

To buy American ones.

Speaker 1:

Yeah. We have a friend, Casey Handmer, who says, like, hey. Look. If they're gonna be subsidizing them, we need to buy, buy, buy every solar panel we can. And we're like, that's

Speaker 10:

I know. I talked to Casey for stories too. Oh, yeah.

Speaker 5:

He's great.

Speaker 10:

He is a mind expanding person

Speaker 1:

Appreciate it.

Speaker 10:

To talk to. I mean yeah. But it's it's true because it's just tough. You know? Yeah.

Speaker 10:

If you're trying to get everything onto solar and batteries

Speaker 1:

Yeah.

Speaker 10:

You don't wanna then disemploy an American supply chain, especially because with energy right now

Speaker 2:

Yeah.

Speaker 10:

American energy is largely domestically produced, which is a huge change from, you know, fifteen or so years ago. Yeah. And so if you're trying to have a climate policy that encourages solar Yeah. You don't wanna then also disemploy, you know

Speaker 1:

Yeah.

Speaker 10:

Lots of people.

Speaker 1:

Zoom out a little bit for me and tell me your prediction or feeling around, like, long term energy production capacity in America. It's been flat. It's growing at very low rates, nowhere near the rollout of phones or electric cars or compute. Like, everything else is growing exponentially. It feels like you're going to watch a sharply exponential line crash into a linear line.

Speaker 1:

What is your feeling around this idea that we might actually start seeing 10% growth rates in energy production nationally?

Speaker 10:

I mean, just look at how hard it is to build a data center

Speaker 1:

Yeah.

Speaker 10:

Which is just a warehouse

Speaker 1:

Yep.

Speaker 10:

With computers in it. And then think about building power plants

Speaker 2:

Yeah.

Speaker 10:

On that same scale. I mean, people have a lot of trouble building solar power, which is zero emissions, and it's fine. It just looks weird. Imagine trying to do that with gas power plants.

Speaker 2:

So But what if I had a billion AI lawyers in my pocket just suing everyone left and right?

Speaker 1:

I show up into town. I sue everyone to make friends.

Speaker 10:

Well, I mean, this is a serious concern. You know, litigation risk is a huge issue with energy projects.

Speaker 1:

Yeah.

Speaker 10:

And one can only imagine what Harvey could do with filings, with, you know, gumming up the NEPA process

Speaker 1:

Oh, yeah.

Speaker 10:

For infrastructure.

Speaker 1:

Yeah. Well Yeah. So I

Speaker 10:

think it's more like we're kind of growing at half a percent a year.

Speaker 1:

Yeah.

Speaker 10:

Maybe 2%, maybe 5% some years.

Speaker 1:

Okay.

Speaker 10:

10 would be tough.
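The gap between those growth rates compounds quickly. A quick illustration of how much total capacity each rate adds, assuming the rate is sustained for a decade:

```python
# Compound growth in electricity generation over a decade at the rates
# discussed: ~0.5% historical, 2% / 5% in better years, 10% aspirational.
YEARS = 10

for rate in (0.005, 0.02, 0.05, 0.10):
    growth = (1 + rate) ** YEARS - 1
    print(f"{rate:>5.1%}/yr for {YEARS} years -> +{growth:.0%} total capacity")
```

Half a percent a year adds only about 5% in a decade, while a sustained 10% would more than double and a half the system, which is why the exponential-demand-versus-linear-supply framing bites.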

Speaker 1:

Okay. Well, that's the next bottleneck after the chip bottleneck.

Speaker 2:

Well, I have something. So Yeah. Your friend John Palmer Yeah. Is also our friend. Oh.

Speaker 2:

We read his fantastic piece. To me, it was more impactful than something big is coming.

Speaker 1:

Yeah. Something little is It hit me like a ton of bricks.

Speaker 2:

It really hit us hard. But he just announced that Stripe is acquiring What? His company.

Speaker 1:

Oh, no way.

Speaker 2:

PartyDAO. I wanted to ask, can we hit the gong with you for our friend John? Oh, this would be an honor.

Speaker 10:

Congratulations. Congratulations. This means you have to start buying in bigger, more frequently.

Speaker 2:

Yeah. You have

Speaker 10:

to call my worst bets.

Speaker 1:

Excuse in this.

Speaker 2:

All in.

Speaker 1:

Poker game? Texas Hold'em? No

Speaker 10:

Limit Texas Hold'em.

Speaker 1:

No Limit Texas Hold'em.

Speaker 10:

We're actually thinking that maybe the guys could get together, and we could start a podcast. Here we go. I think we'd call it All In.

Speaker 1:

Yeah.

Speaker 2:

Poker themed, talk about a little business, a little politics, a little tech, energy, geopolitics. I think you guys would have something there.

Speaker 10:

Yeah. We'd wear expensive sweaters, and yeah, it'd be great.

Speaker 1:

I would be the first one to subscribe.

Speaker 2:

Well, it's so so great to finally have you on. When you

Speaker 10:

when you're Of course, guys.

Speaker 2:

Gonna publish your next story.

Speaker 1:

Yeah. But We'll talk to you soon.

Speaker 10:

Thanks so much.

Speaker 1:

Have a good one. Cheers. Goodbye. Let me tell you about Plaid. Plaid powers the apps you use to spend, save, borrow, and invest, securely connecting bank accounts to move money, fight fraud, and improve lending, now with AI.

Speaker 1:

Is there any other breaking news, Jordy? Anything we need to talk about before

Speaker 2:

We're super excited for John Palmer and the PartyDAO team. It's great. It's funny. John and I Yeah. Didn't meet until last year.

Speaker 2:

Yeah. But we used to kind of have this, like, tension Oh. Because I had a company called Party Round. He had PartyDAO. We were both in that 2021 era.

Speaker 2:

I was doing Sort

Speaker 1:

of team of rivals situation. Crypto

Speaker 2:

kind of stunts with Party Round. Yeah. He was he was getting quite a lot of attention. We both raised from Andreessen.

Speaker 1:

Woah. Okay.

Speaker 2:

But now we're former rivals Very happy for the whole team. Turned business partners. And very bullish for Stripe crypto.

Speaker 1:

For sure. And that's why he's probably on cloud nine writing long form jokes, which is great. Like, oh,

Speaker 2:

yeah. Shocking the Internet.

Speaker 1:

Shocking the Internet.

Speaker 2:

Well

Speaker 1:

Well, you know what time it is? It's time for the lightning round, the Lambda lightning round. That's right. Bring down the second gong mallet. Fire up the Lambda lightning round.

Speaker 2:

Let's bring in.

Speaker 1:

I'm gonna tell you about Vanta: automate compliance and security. Vanta is the leading AI trust management platform. And we will bring in Joon Park, the cofounder and CEO of Simile. Welcome to the show. Joon, how are you doing?

Speaker 2:

What up?

Speaker 8:

Hi, everyone. Great to be here.

Speaker 1:

Welcome to the show.

Speaker 2:

Dude, you know you mean serious business

Speaker 1:

because look at this whiteboard. I love the whiteboard.

Speaker 8:

We are big fans.

Speaker 1:

Detailed. Please, first time on the show, introduce yourself. Introduce the company.

Speaker 8:

Yeah. So really excited to be here. I'm Joon. I'm the CEO of Simile. Yeah.

Speaker 8:

Simile is a company that is building a foundational model of human behavior Mhmm. That can simulate our society from the ground up, at the level of individuals and populations, and predict its outcomes.

Speaker 1:

Interesting.

Speaker 8:

We use that technology to create products that enable people to talk to millions of similes Okay. Which we call agents Yeah. That represent real people.

Speaker 2:

Yeah. I'm super excited about this. Yeah. When when we were working to get you on the show, I was thinking about it. There's been so many times in my career where I I was doing something like launching a product or launching a campaign or talking with a founder that was doing one of those things.

Speaker 2:

And, like, oftentimes you have a really strong sense of how something will do after you've built up some intuition. Mhmm. But I've always thought it was crazy that, you know, think of, like, a physical product company: you have to spend a year and all this time and money and then just, like, launch it and be like, well, I hope people like it. And it feels like with enough data and

Speaker 1:

Yeah.

Speaker 2:

Maybe a model like the one you're building, you could get to the point where, like, you can get a much more accurate read on how society or a niche will respond to something without having to actually make the thing.

Speaker 1:

Yeah. What was the first thing you built for the company?

Speaker 8:

So for the company, we work with vendors that actually connect us with real human subjects. So we actually partner with people and get their data. And what we found was, if we have rich data about individuals in our community, we can actually create really high fidelity simulations of these people. This was on the basis of the research that we did at Stanford, where we led agent- and simulation-based research.

Speaker 1:

Mhmm.

Speaker 8:

And what we are now finding is that this technology can actually create insights for our decision makers.

Speaker 7:

Mhmm.

Speaker 8:

So our core product basically is one that allows our Fortune 500 customers to actually interact with and talk to these agents that represent these real people that we collected Mhmm. And basically provide insights. So that's the platform that we're creating.

Speaker 1:

What's your thought on game engines as a piece of simulation? I've seen some economic research that maybe used Unreal Engine, but is there a role for simulated worlds or deterministic worlds that are then puppeteered by agents in any of this, or is that just like a different fork of research entirely?

Speaker 8:

No. I think it's all connected. In fact, we actually started our research when we published the generative agents paper, which introduced the idea of creating simulations with agents. Mhmm. We actually demoed this in a game town.

Speaker 8:

One, because we thought it was so visceral and so fun. Yeah. But also, at the same time, if you want to create a simulation of our entire world and how our society might get shaped in the coming decades Mhmm. Then you really ought to be modeling not just people, but also the environment and the world as a whole. Mhmm.

Speaker 8:

So there's absolutely a place for these kind of game simulation like characteristics to this field. Basically, the idea here is down the line, can we actually create simulations of the entire market

Speaker 1:

Yeah.

Speaker 8:

Entire nation? That's where we're headed.

Speaker 1:

So, yeah, what are some of the case studies that you've seen with large companies? Is it something as granular as, should they charge $2 or $2.50 for a particular product? Or is it more broad, like, if we do a Super Bowl ad, will it convert? Or,

Speaker 2:

you know, Super Bowl ads are such a good example where Yeah. You're spending maybe 1,000,000 to 8,000,000 on the buy Yeah. Half a million on creative, all this stuff. Social. And then it'll either get people to hate you Yeah.

Speaker 2:

Or love you. Yeah. But we should be able to predict these things better. And obviously, historically, there's focus groups. You could try to get a reaction, but we're still at this point where sometimes you just need to do something to see.

Speaker 8:

Indeed. In fact, as somebody who's running this company, I actually simulated how today's conversation might go.

Speaker 1:

Really? Because some of you

Speaker 8:

are so much online that we could actually see how conversations like this one actually might pan out. Mhmm. So I could be better prepared to talk about some of the topics that we wanted to cover. So core use cases today actually do cover things like so we work with Fortune 500 companies that have earnings calls. Many of them have come to us and said, can you actually create simulations of, let's say, an earnings call that's going to happen?

Speaker 8:

And there, basically, the idea is, can we simulate each analyst who might actually ask questions during this earnings call, all the way to how the CEO and CFO might be able to respond to these questions? That's a common ask. So these are the kind of simulations that we create for that particular use case. And what we find is we can actually predict eight out of 10 questions on average for these earnings calls, like the actual questions that would be asked. And we can actually simulate how the conversation might flow from that point on.

Speaker 8:

So you can test out different strategies.

Speaker 1:

Yeah.

Speaker 8:

But beyond that, one of our sort of customers right now is CVS. CVS, of course, is a large retailer. Yeah. And there, CVS has been working with Simile to create simulations of hundreds of thousands of their customers. Mhmm.

Speaker 8:

What they want to know is, let's say, CVS has a new product. They have a new store layout. They have new concepts they want to test. These are the kind of things that CVS today might go to large human panels for Mhmm. But instead, they can now work with Simile to get insights that are much more granular and much faster to gain.

Speaker 8:

Mhmm. Now another example here: Gallup, sort of a long standing polling company, has been working with Simile to create digital panels of their human panel members. Sure. So if they want to ask questions to that panel, but for whatever reason they are not reachable Yeah. Then you can actually come to Simile to create simulations of that panel.

Speaker 9:

Mhmm.

Speaker 1:

What's the moat? How do you think about, like, in the earnings call example, a lot of earnings call transcripts are already in pretraining corpuses, and you could dump them in and sort of do a quick fine tune or even just throw it in the context window. And if you're a public company CEO, you could probably just ask any of the frontier models, what do you think the analysts are going to ask? And is it that that strategy only yields two accurate questions and you're at a higher benchmark rate?

Speaker 1:

Or do you think that there's a different piece of your business that the big labs won't go after?

Speaker 8:

So fundamentally, the model that we're trying to create is different than what big labs might be interested in today.

Speaker 1:

Okay.

Speaker 8:

It's that we are less interested in creating intelligence that's superhuman, that's super rational. But instead, what we are interested in creating are models that actually model the irrational half of people's brain. These are people's values, preferences, taste. And the way we go about doing this is a true combination of the modeling and product frontiers. So Simile is unique in that we as a team are actually composed of both of those talents.

Speaker 8:

Mhmm. So myself and my cofounders, Michael Bernstein and Percy Liang, are all researchers who have been making contributions at the frontier of AI, ranging from generative agents to generative AI and foundation models. Mhmm. And there, the core moat here is: what is the best kind of data that we can collect? And then using the data to create the most high fidelity agents that represent the people we have interviewed and collected data from.

Speaker 8:

Mhmm. And then we overlay that on top of an amazing product. So here, one of the amazing parts about models that actually map people's values and preferences is that a better model immediately means better insights for customers like CVS and Gallup. So there's an amazing alignment between the modeling frontier and the product frontier, which is not something that we see every day. Mhmm.

Speaker 8:

So it's a true combination of those two things coming together that makes Simile special.

Speaker 1:

Yeah. What do you see progress scaling with more? Are you compute constrained, or is it much more about data constraints?

Speaker 8:

So we are charging the frontier on both fronts. Mhmm. So down the line so today, we model hundreds of thousands of people Mhmm. Who signed up to participate in these studies. Mhmm.

Speaker 8:

And then our customers, of course, have appetite to, down the line, simulate millions. Mhmm. Our goal eventually is to ask ourselves, what would it mean if we can create simulations of eight billion people, the entire earth? That's the future we're headed toward. Yeah.

Speaker 8:

So there, certainly, there's a question around what is the data? Yeah. But also, how do we compute these kind of simulations efficiently?

Speaker 2:

Yeah. Right? Are you worried about getting so accurate that life just becomes boring because

Speaker 1:

everything Is is free is free will real? Do we have free will?

Speaker 2:

It's like you have

Speaker 1:

Are we living in a simulation?

Speaker 2:

It's like you went to the end of the book, and you read the last page, and you're just like, like

Speaker 8:

Yeah. There's something fun about, actually, this discussion. So I don't know if you all have watched The Matrix, or I don't know how many of the audience actually have watched The Matrix.

Speaker 1:

I love The Matrix.

Speaker 8:

Amazing. I

Speaker 2:

have seen that one.

Speaker 1:

You have seen that one. Yeah. Jordy hasn't seen a lot of movies, but he's seen The Matrix.

Speaker 8:

There's a quote that I actually like quite a bit that I think is quite profound. It's something that the Oracle says, which basically is, you're not necessarily here to make a decision. You've already made that decision. You're here to understand the decision and why you made it. And that sort of captures the essence of simulation.

Speaker 1:

Mhmm.

Speaker 8:

Right? The models that we're creating aren't merely trying to predict the future, although we are good at that and that is one of the core capabilities. But what to me is more important is understanding why we made the decisions that we made. And creating these kinds of bottom up simulations, where we can actually go back and trace through the audit logs of how society might unfold Mhmm. Is actually an amazing way to gain that interpretability layer over our reality.

Speaker 8:

Mhmm. So that is, to me, the best way to shape the future. Right? If we want to shape new policies, new products that would actually serve the people in our society. Mhmm.

Speaker 8:

The best way is actually to understand those people and then communicate with them in a scalable way Mhmm. To basically create those policies. So that's how we view things.

Speaker 2:

How about that? Before that, the chat wants to know, do you think if you worked with Clavicular, the live streamer, you would be able to predict whether or not he would get mugged if he went to different, you know, locations or maybe universities? Like, maybe you could have said, don't go to ASU. Just don't. Just don't.

Speaker 2:

We're just pretty into that.

Speaker 1:

How much did you raise?

Speaker 8:

So we raised $100 million Mhmm. From Index and other participants. We're also really excited to have angels such as Andrew McCarthy. They're very excited. Okay.

Speaker 2:

That's great. Big dogs.

Speaker 8:

But ultimately, what we are excited about is that there is a vision that I think is extremely compelling, a vision that this particular team has been working towards for the past five years Mhmm. That we can create, with the technology that we have today, really high fidelity simulations that can actually provide insights to create a better society. Mhmm.

Speaker 1:

Well, thank you so much for taking the time to come chat with us. Congratulations on the massive round and all the progress.

Speaker 2:

Great to meet you. Very exciting.

Speaker 1:

Keep tearing through the Fortune 500.

Speaker 2:

Don't be afraid to send us some alpha if you're predicting about us. Yeah. Yeah. You know.

Speaker 1:

Does any

Speaker 2:

I like this. I think you're starting a new meta. Somebody in the chat said it's kind of the Jensen, Steve Jobs combo effect with the outfit. I like it.

Speaker 1:

Looks good.

Speaker 2:

Yeah. It's it's great.

Speaker 1:

Well, have a great rest of your day. We'll talk to you soon.

Speaker 3:

Team.

Speaker 1:

Goodbye.

Speaker 2:

Cheers.

Speaker 1:

Let me tell you about Gemini 3 Pro. It's Google's most intelligent model yet. State of the art reasoning, next level vibe coding, and deep multimodal understanding. Up next, we have David Risher. He's the CEO of Lyft.

Speaker 1:

And we're gonna ask him about the Ferrari Luche. Ferrari Luche. Yes or no? Is it the ideal Lyft mobile? Yes?

Speaker 1:

You like the Jony Ive inspired interior? You're down?

Speaker 7:

I am down. I mean, I look. I like innovation. I like cool new stuff, so why not?

Speaker 2:

Give it

Speaker 9:

a try.

Speaker 7:

I've not been in one, though, to be clear.

Speaker 1:

Yeah. And I don't think anyone has. I mean, they did a PR, you know, day. They didn't show the exterior of the car, but they did let some journalists come and twist the knobs and hear the click clacks.

Speaker 2:

Well, I I I'm with you. I like that there's some fresh ideas in there. Yeah. And I hope the other manufacturers

Speaker 1:

Yeah.

Speaker 2:

Adapt as well.

Speaker 1:

What's the freshest idea you have for 2026 for Lyft? What's happening? What's happening in your world?

Speaker 7:

Oh, dude. A lot. Well, so we just announced earnings, which is always kind of an interesting time.

Speaker 1:

Yeah.

Speaker 7:

So record bookings Mhmm. Which is great and accelerating. Yeah. Record profits, which is nice as a business. Mhmm.

Speaker 7:

Over $1 billion in cash generated, which again is nice as a business. And the only reason I start there is because to me it says, like, this is customer obsession driving profitable growth. Right? Like, I took this job maybe almost three years ago now. Yeah.

Speaker 7:

Actually, fun fact, I was sort of convinced to do it on Valentine's Day three years ago. So it's coming up right on that magic moment. My wife and I had such a romantic evening

Speaker 2:

Mhmm.

Speaker 7:

Talking about Lyft that night. But anyway, so the point is

Speaker 2:

like That's amazing.

Speaker 7:

It was a thing of beauty. But anyway, so yeah. And so here we are, super well positioned for 2026. Why? Because international expansion, that's a fresh thing.

Speaker 7:

That's a big deal for us.

Speaker 1:

Yeah.

Speaker 7:

Autonomous vehicles, self driving vehicles, that's a super, super big deal for us. So Yeah. We've got a couple of big, what we call growth vectors for the business.

Speaker 1:

Yeah. What's the biggest what what's the market internationally that's still ripe for ride sharing to come?

Speaker 7:

Yeah. So it's really interesting. You know, Europe is an established rideshare market, of course, but there's still a huge taxi market in Europe. And it's really interesting. Like, if you look at total mobility, like, outside of people driving their own cars, still a ton of people every day pick up the phone and, super old school, you know, call a phone number and get a taxi to their house and stuff like that.

Speaker 7:

So that's actually a big transition that's happening that we're really kinda helping power, that kinda move from offline to online. And then just like all over the world, I mean, self driving cars are gonna be a big deal for all of us. And, you know, I look, you know, three, five, ten years out, you know, it'll be 5%, and then 10%, and then 20%, and then 50%, and that's gonna be huge. Oh, and bikes are always big in Europe too, even bigger than the US.

Speaker 2:

What are your conversations like with other public company CEOs right now? We're in this period of insane volatility. It seems like no matter what a management team says on an earnings call, you're gonna have, you know, some massive volatility. But yeah. What are those conversations like?

Speaker 2:

I'm curious. Do you guys have a little group chat where you're like, I'm going in. I'm going in.

Speaker 7:

Going in. You go wish me luck.

Speaker 1:

What's gonna happen here. You

Speaker 7:

know what? We don't actually have those conversations that much just because they're kind of boring. You know what I mean? It's sort of like everyone has the same level of frustration, of course, where it's sort of like, oh my god, you know, everything seems so volatile. And it's so weird.

Speaker 7:

I was actually just thinking about this yesterday. I mean, what happens right now, I don't know if you guys know this, but what happens is you come out with earnings. And then, like, for example, again, we had, you know, record bookings, record rides, record profits,

Speaker 3:

you know, good good good good.

Speaker 7:

Okay. So for the first thirty seconds, headlines say, you know, whatever. Lyft beats on profits or, you know, the right way. Lyft, you know, highest rides ever, something like this. And then there's some market reaction.

Speaker 7:

Now the market reaction might have something to do with some totally different thing, you know, some weird accrual thing and remember, this happens after hours. So some bot picks up some weird accrual thing that doesn't fit in the model, and then all of a sudden, you know, starts selling shares. And then, you know, the sentiment turns. And this is all after hours stuff, super thinly traded. Okay.

Speaker 7:

So then the headlines get rewritten in real time by other bots who are not reacting to the news of this earnings. They're reacting to the reaction of the news, and that

Speaker 2:

and then they're creating so they see the stock price move, and then they're creating a written narrative of why it's happening that then gets turned into an article that then others are reading.

Speaker 7:

100%. And remember, a lot of people don't read much beyond the first paragraph, and so they'll just see this headline, which used to say, you know, Lyft beats on profit or Lyft beats on whatever, and now, you know, Lyft shares down, you know, 10% or whatever it is. And so it's such an interesting thing. Anyway, it's a little bit of a meta point, but like right now, as society, it's like we're almost we're almost reacting to the reaction more than to the thing itself, which is which is a little bit mind bending.

Speaker 2:

Yeah. That's super fascinating. We're letting the bots react and then deciding how we feel based on that.

Speaker 7:

That's exactly right. And the funny thing is, the bots are reacting for their own algorithmic reasons that might not particularly be well plugged into any reality. Anyway, you could go on about this for hours, but it's a little, like, at the sort of, like, species level, we're kinda, like, letting the machines take over a little bit. You know? Yeah.

Speaker 7:

We're not forming our own opinions. We're letting them tell us what our opinion should be and then reinforcing that, which is a little bit counterintuitive. But anyway, like, of course, some version of this has gone on for a long time. It's just sort of exaggerated now. I remember this is something I worked for, as you guys know, Jeff Bezos for a long time.

Speaker 7:

And, you know, Amazon, for example, back in the day was, you know, a $6 stock or something like this. And people would ask him, like, how do you stay, you know, sort of focused? He's like, the good news is, like, I know the data. And this is how I feel too. Like, I was in a meeting yesterday about our self driving car initiative. It's amazing.

Speaker 7:

And I was in another meeting yesterday about our Super Bowl performance. It was amazing. We were up 13% year on year. We picked people up faster, and we were down 20% in surge pricing. In other words, our prices were 20% better than they would have been if we'd had our surge pricing algorithm from last year, which is great because it means that we're able to do this at a lower cost to customers.

Speaker 7:

So, like, I see the data and I see how well positioned we are right now. So my last thought on this is, this is now my new favorite CEO tool. Back to your point. You know what these are? Noise cancelling headphones.

Speaker 7:

Always cancelling them.

Speaker 1:

Can you unpack the Waymo partnership a little bit more? Talk about how that's progressing, the broad rollout, your plans. How fast do you see that becoming meaningful and growing?

Speaker 7:

For sure. So I'm gonna zoom out one click and, you know, just to sort of remind everyone. So I'm super bullish on AVs, autonomous vehicles, entering the rideshare space.

Speaker 1:

Mhmm.

Speaker 7:

And the reason I'm so bullish is because they're market expanding, and we see this in market after market. When AVs come in, people take more rides overall because it's cool. It's a cool new product. It's kind of fun. It's sort of futuristic.

Speaker 7:

It's reliable. So it expands beyond the current, you know, population of people who take rideshare today. And then the cost over time will become lower, you know, for example, insurance costs are lower. That'll be pretty obvious. So it's nice when the market expands and when your cost of doing business goes lower.

Speaker 7:

Okay. So then you ask about Waymo. Waymo, of course, is the leader here in The United States, no question, and we have a very specific partnership with them announced in Nashville. Yeah. That was literally one of the meetings I was in yesterday where we're talking about all of these different ways where we're integrating across the two platforms.

Speaker 7:

Everything from fleet management to demand generation, and, you know, if we get a ride, are we gonna, you know, ask Waymo to take it, or is a human driver gonna take it? Like, these different, you know, sort of details that you have to work out when you're doing a deep partnership, which we expect to go live later this year. The expectation of that partnership, certainly that we have, and I believe that Waymo has as well, is that this will scale, you know, beyond just Nashville, but Nashville is very much our focus right now.

Speaker 2:

Mhmm.

Speaker 7:

I'll tell you one last thing. Tekedra, who's the CEO of Waymo, was just on a show yesterday. And she was asked about this partnership too, and it was nice to see her also being very enthusiastic about it and saying, you know, the thing that I think should be obvious to many people, which is even if Waymo wants to go it alone in some markets like they're doing here in San Francisco, it's really helpful for them to have a demand gen and fleet management partner like us in a place like Nashville and hopefully elsewhere as well. Because it allows them to really focus on what they wanna do well, which is, you know, create the world's safest driver. That's the way they put it, and not have to worry about all the other details of running a rideshare network. So anyway, it's, you know, stay tuned.

Speaker 7:

We've actually, you know, broken ground on the big depot, and we're doing all this sort of work. And in the next couple months, we'll start to roll rides out in Nashville.

Speaker 2:

Talk about the I'm super curious to actually understand, like, what does the depot look like? Mhmm. Like, what is the process? What are all the things that go into it? What are the jobs?

Speaker 2:

That kind of thing.

Speaker 7:

Yeah. So first thing is it's big. It's big. It takes some space because you have to think. I mean, it's something that has to be able to clean, maintain, charge, you know, tune up all these different things, repair in some cases, you know, a couple hundred cars, hundreds and hundreds of cars.

Speaker 7:

And, you know, like, at two in the morning, you know, these cars are not necessarily gonna be out on the road, so you gotta have a place for, you know, hundreds of cars to sort of go through this cycle when the demand is low. There'll be real jobs there. We have something called FlexDrive, which is our own fleet management subsidiary, and that's kind of the basis of this thing that we're building it around, let's say. And so some of the folks working at FlexDrive, you know, will transition into doing this AV stuff. We'll also frankly end up hiring some drivers in the community who kind of want extra money and, you know, know a lot about cars and so forth. But, yeah, I mean, there are also some pit stops around the town that just do charging, but think of it as sort of charging and maintenance, but done for kind of, you know, 2026 and beyond instead of Yeah.

Speaker 2:

So, yeah, that was gonna be my next question. Yeah. For a city like Nashville, you have one major depot. You have some charging stations. Would you ever break out the depot? So basically having, like, multiple spaces for storage? Or for a Nashville sized city, is one facility gonna be enough to kind of operate?

Speaker 7:

I think, you know, one big central one and then a couple of satellites is probably right. It's sort of tricky to site these things because if you put them too far outside of town, the good news is it's cheap, but the bad news is you've got what we call deadhead miles, so just miles going back and forth with probably no rider in them. So, you know, you have to find a place that's kind of the right you know, ours is kind of near the airport, all this sort of stuff. And then you want a couple, and it's gotta have a lot of power too. Like, literally electricity.

Speaker 7:

Like, we work with the utility to figure out what's the place that can support a fairly big power load. And then so anyway, that becomes your central thing, and then you have a couple of satellites around that.

Speaker 1:

How do you think about the scaling and rollout of self driving vehicles just broadly in America? I think a lot of people lean back on what the AI systems can do, incidents per mile, and there's a lot of smooth curves there. But in a lot of the AI world, we're running into the real world. Is there enough energy for this data center? Are there enough chips for this new model or this new training run?

Speaker 1:

At a certain point, how are you feeling about the ability of the OEM market, the car production supply chain, even things like regulation, like Yeah. How do you see all that playing together? What's the most important thing to focus on?

Speaker 7:

I mean, you're literally laying it out. Like, think of this whole thing. First of all, you gotta have great tech that works at scale Yeah. Without a driver in it, that can go, you know, for a long time with no incidents. So that's a very important thing.

Speaker 7:

Then you have to have an OEM. Right? And that OEM has to be willing to make an investment. Right? Because it's not just like, cool, let's just take the same lines and just, you know, throw some AV stuff on at the end.

Speaker 7:

It's like, you want to design production lines that build that AV. And think about you can see this with Waymo. Right? Like, Waymo literally is receiving, and I'm talking about the Jaguar platform that they've used mostly so far, you know, they're literally, like, receiving these kind of half built cars and then kind of disassembling them and reassembling them.

Speaker 7:

It's very complex. So you wanna be able to build that in a very cost efficient way. So OEMs have to be willing to make a big investment there. Then you've got to have the regulations line up, and that's a city by city, state by state thing. Right?

Speaker 7:

So we just signed an agreement. This is outside The US. We just signed an agreement with Hamburg, Germany. That'll be the first city in Germany and one of the first in Europe to say, yeah, we welcome AVs onto our roads. Yeah.

Speaker 7:

Here's the kind of framework we have to use. Anyway, so that's got a lot. Then someone's gotta buy these things and finance these things. There's gotta be financing. By the way, these things might last three or four hundred thousand miles, but unlike a normal used car, they'll have zero value at the end.

Speaker 7:

Right? Zero value. No one's gonna buy a used AV that's been on the road for 100,000 miles and uses old tech. So, okay, that's the whole financing model that people have to figure out. Then there's gotta be all the stuff that we were just talking about, all the fleet management stuff, and that's physical world stuff.

Speaker 7:

Right? That's like who's gonna charge these things? Who's gonna reboot them when they need rebooting? Who's gonna clean them when someone, you know, throws up in the car? All these things.

Speaker 7:

And who's gonna maintain them? And then you have to have all the demand, you know, management and so forth that we're pretty good at. So I think your analogy is a good one, and it's why you see so many tiny pilots, which are all kind of hand built and artisanally done. But when you think about scale, it really is gonna be years and years and years until all of these things line up in market after market after market. Yeah.

Speaker 7:

I don't know if that exactly answers your question, but

Speaker 2:

Do you track the EV and AV rollout in China? Like Mhmm. What's the timeline for a manufacturer, you know, a BYD, to just plug in to a platform, a Chinese equivalent of a platform like Lyft, and just, like Yeah. Start earning autonomously?

Speaker 7:

I mean, I will say this is an area where it's just flat out China is ahead. Flat out, China is ahead. I mean, this may be slightly different from the question you're asking, but when I talk to AV so if you're in the business of producing AV technology, you know, it could be a Mobileye, it could be a Wayve, it could be any number of people who are producing this technology independent of the OEM. When you talk to them, their biggest frustration, and I'm actually gonna quote one of them right now, is, we talk to, quote, Western manufacturers, and it really doesn't matter whether you're talking about VW or Stellantis or Ford or whatever, just Western manufacturers. And then you talk to Chinese manufacturers.

Speaker 7:

And in China, they're talking one to two years. And then that same conversation with a Western manufacturer is a five year conversation. You know, it's a 2032 conversation instead of, like, a 2027, 2028 conversation. So the short answer is, you know, there's good AV tech from a lot of places, but frankly, the OEMs are in very, very different positions in terms of their ability to spin up quickly and, yeah, get a platform out that actually works.

Speaker 1:

Well Makes sense. Congratulations on all the progress. Thank you so much for taking the time to come chat with us.

Speaker 2:

And fantastic shirt, by the way.

Speaker 1:

We love the shirt.

Speaker 2:

Coming in strong.

Speaker 7:

Well, thank you, gentlemen. It means a lot coming from two fashionable contestants.

Speaker 1:

Yeah. In the plain white shirt. I'm really taking a risk today.

Speaker 7:

I mean, you're bold. It's it's on brand for you. You're you're bold.

Speaker 1:

Yes. In a world of, you know, the plain black t shirts of tech, the white collar shirt does kinda stand out. So thank you.

Speaker 7:

I appreciate it. Hey, congrats

Speaker 8:

to you

Speaker 7:

guys on all your progress and your Super Bowl ad. That was amazing.

Speaker 1:

Thank you. That was a lot of fun. It was a good

Speaker 7:

time. Yeah. Of fun.

Speaker 2:

Slightly smaller ad than yours. Yeah. We're working our way up.

Speaker 7:

No, dude. You gotta start, you know, step by step by step. I will tell you, I did zoom in on all the logos. I was looking for the Lyft logo. It must have been hidden. I don't think I saw it, but, anyway, it was a small

Speaker 2:

Somebody's getting fired.

Speaker 1:

Tyler is looking into it right now. We're gonna send you evidence if it's in there. But we might have made a mistake. It's possible. There were a few that were behind certain letters, and so they

Speaker 7:

got And that was it. I looked very closely.

Speaker 1:

Okay. Okay. Pretty sure

Speaker 7:

it was hidden. It's no problem at all. I'm not holding it

Speaker 6:

against it.

Speaker 1:

We'll figure it out.

Speaker 2:

Yeah. We take we're taking

Speaker 1:

The Super Bowl happens every year.

Speaker 7:

Yep. We go. Sixty one is just around the corner.

Speaker 2:

There you go. Great to see you, David.

Speaker 1:

Have a great rest of your day. We'll talk to you soon. Cheers. Thank you. Let me tell you about Cognition.

Speaker 1:

They're the makers of Devin, the AI software engineer. Crush your backlog with your personal AI engineering team. And without further ado, Todd McKinnon from Okta is in the Restream waiting room. Welcome to the TBPN Ultradome. Todd, how are you doing?

Speaker 2:

What's happening?

Speaker 6:

I am excellent today. Thank you so much for having me on.

Speaker 2:

Thanks for coming on. Thanks for joining.

Speaker 1:

First time on the show. Please kick us off with an introduction.

Speaker 6:

I'm Todd McKinnon. I'm the founder and CEO of Okta. Yeah. And, yeah, I'm excited to be here, coming from the office in San Francisco, and I feel like my hair is not nearly good enough to be on the show.

Speaker 1:

It looks fantastic.

Speaker 2:

What are you talking about? Come on.

Speaker 1:

I mean, since it's the first time, would you mind going back and telling us a little bit of the history, the story? I love these, like, founder journeys. So you can take as long as you want or as short as you want, whatever you haven't gotten sick of talking to people about.

Speaker 6:

Yeah. It's interesting. There's a lot of echoes in what's going on today with AI. Yeah. But seventeen years ago, I started Okta the same year Super

Speaker 1:

great success.

Speaker 6:

The same year Steph Curry was drafted.

Speaker 1:

No way.

Speaker 6:

I don't know if that makes it seem like a long time ago or not that much time ago, but seventeen years ago.

Speaker 2:

Okta really is the Steph Curry of the Enterprise.

Speaker 1:

It is. I've always said that. Yeah. I've always said that. Lots of people.

Speaker 1:

Yeah.

Speaker 6:

He needs better supporting cast around him maybe, but I'll take it.

Speaker 1:

But yeah. I mean, how did you even come up with the idea? I think a lot of people start a company. They think, like, I'm gonna build something for myself. I'll go consumer because that's hot and trendy.

Speaker 1:

Like, what was the inciting element? The, like, the problem that you identified?

Speaker 6:

Well, I was running engineering at Salesforce. Okay. And I was working with all these customers that were adopting Salesforce. And then I looked around and I said, hey, AWS is launching. Google Apps for domains

Speaker 6:

Mhmm.

Speaker 1:

Just what

Speaker 6:

which is what it was called back at the time, is launching. I said, hey. You know, there's really gonna be a cloud version of everything in IT, and it's probably a great time to go out and start something new and build something that would be needed in that world. And we settled on this platform for identity management, which seems maybe pedestrian or incremental. But at the time, people had these applications like Box and Salesforce that didn't really work well with their Windows login and Windows PC, and that was our wedge to get started on what now is a Yeah.

Speaker 6:

$3 billion a year company, 6,000 people, and a really important part of this whole ecosystem.

Speaker 1:

Was this in the sweet spot of venture capitalists during that time? I mean, Salesforce pedigree, it feels obvious in hindsight, but what was the pitching process like?

Speaker 6:

Well, it was a very different world because we were in the, you know, aftermath of the financial crash. Yeah. And venture capitalists were, you know, wondering, are our capital calls gonna be returned? Are we gonna make money? So people were very reticent.

Speaker 6:

We actually were the first company to raise money from Andreessen Horowitz. No way. So We marked

Speaker 1:

were the very first company?

Speaker 6:

The very first. Yeah.

Speaker 1:

Wow. And on the first investment, it's Okta. That's insane. I mean, congrats to you, but also Andreessen. A lot of people are like, yeah.

Speaker 1:

The first investment I made was not great. Public company. Wow. I mean, I'm sure they

Speaker 6:

They made plenty off that fund. It has Slack and

Speaker 1:

Yeah. It's a great fund.

Speaker 6:

Yeah. Nicira's in there.

Speaker 2:

And then Skype? Wasn't Skype in the first one?

Speaker 1:

Yeah. Skype's a weird one because it was, like, a take private and then an IPO or a sale or something.

Speaker 6:

Yeah. I remember, like, talking to my friends and they're like, oh, yeah. You raised from Andreessen Horowitz. That sounds great. The firm did that weird Skype thing where they bought it from eBay, and it worked out okay.

Speaker 2:

Yeah. Yeah. Yeah.

Speaker 6:

Yeah. But at the time, you know, people thought, oh, maybe it's not big enough. They said, oh, you're providing a single sign on for cloud apps to your, you know, Windows domain. Is it big enough? And it's like, for the last seventeen years, it's been, you know, is it a big enough market?

Speaker 6:

And every four or five years, more cloud, and then mobile, and then what happened with the real focus on security and how important identity management and passwords are to security. Yeah. And now, with AI, COVID, remote work, identity just becomes more and more important. And now with AI, we're on the precipice of a huge catalyst for identity. That's why we're so pumped up and excited about what's going on.

Speaker 2:

Yeah. So let let's get right into it. Please. You Okta powers identity for biggest companies in the world. Mhmm.

Speaker 2:

And right now we're at this inflection point where people are effectively adding agents to their workforce. And oftentimes, those agents are playing the role of humans in some way, functioning with different applications, working across the company. Like, how are all those conversations going? How are organizations adapting? And how are you guys fitting into it all?

Speaker 6:

Well, on a personal level, I've never seen anything like it, the level of interest. The uniformity of the interest from customers in every vertical industry and every size company, they wanna do something with AI and they wanna invest in agents, and it's quite a tidal wave of interest. Mhmm. I think now what really gets exciting is how we continue to turn that interest into actual revenue and actual usage of the new products. Yeah.

Speaker 6:

But the precondition is there. People are definitely interested. I think what's happening is not a huge shock. You know, boards of directors and CEOs, they all read X. Yeah.

Speaker 6:

They know how excited everyone is about this, and they're like, oh, give me some of that. Right? Come on.

Speaker 2:

I'll take a million agents, please.

Speaker 6:

Yeah. Exactly. Yeah. And then the poor, poor people put together these prototypes and these demos, and they were kinda, like, hacked together, and they had access to every piece of data in the whole company. And they're like, that's awesome.

Speaker 6:

Put it in production. And the developer's, like, wait a minute.

Speaker 1:

Yeah.

Speaker 6:

I don't care. Put it in production. I told my board I was gonna have an agent, and this is an agent. Go. Go.

Speaker 6:

Go. And so what we're trying to do is help them manage that, help them: hey, track these things. Make sure whatever these agents are connecting to, you know how it's connecting, and lock it down. Make it secure so when they go into production, it'll actually work.

Speaker 2:

Yeah. What kind of general horror stories have you heard at companies? Is it an agent that can effectively go and pull data somewhere it shouldn't be able to?

Speaker 1:

Prompt injection, like, give me a refund there's

Speaker 2:

different risk factors there. Personally, most of the time when this happens at a company, hopefully it's just internal and, like, you can correct it quickly. Mhmm. But it doesn't feel like we're that far off from, you know, a company not being ahead on this kind of stuff and having an agent that can just query their own internal data, and having some type of bigger leak.

Speaker 1:

But Mhmm.

Speaker 2:

Obviously, you're working to prevent that.

Speaker 6:

It's very simple. The agent was hooked up to multiple data sources, and to get access to more data than you were supposed to have, you just needed to ask the agent. I mean, prompt injection sounds, you know, like something cyber people say to scare software engineers, and it's maybe a hard thing to prevent. But we're talking about simple cases where the agent just has too much access.

Speaker 6:

And with a couple of prompts, you can get it to tell you the customer information of a different customer. And luckily, I think, in Silicon Valley we're so obsessed with the latest and greatest that we think everyone has fully agentic customer support and fully agentic everything. But the reality is most companies are very early. Yeah.

Speaker 6:

So it's a good time to layer in some controls and some visibility into what these agents can connect to and the data they can access. Yeah. So we can avoid some of the really bad mess ups going forward.
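The pattern described above, scoping each agent to the data it actually needs and logging every access attempt, can be sketched in a few lines. This is a hypothetical illustration, not Okta's product or API: `AgentPolicy`, `fetch_customer_record`, and the scope strings are all invented names, but they show the least-privilege-plus-audit-log idea.

```python
# Minimal sketch: instead of giving an agent blanket access to every data
# source, each tool call is checked against an explicit per-agent scope list
# and recorded for visibility. All names here are hypothetical.

class AccessDenied(Exception):
    pass

class AgentPolicy:
    def __init__(self, agent_id, allowed_scopes):
        self.agent_id = agent_id
        self.allowed_scopes = set(allowed_scopes)
        self.audit_log = []  # visibility: every access attempt is recorded

    def check(self, scope):
        allowed = scope in self.allowed_scopes
        self.audit_log.append((self.agent_id, scope, allowed))
        if not allowed:
            raise AccessDenied(f"{self.agent_id} lacks scope {scope!r}")

# A support agent scoped to a single customer's records.
policy = AgentPolicy("support-agent-1", {"crm:read:customer:42"})

def fetch_customer_record(policy, customer_id):
    policy.check(f"crm:read:customer:{customer_id}")
    return {"customer_id": customer_id}  # stand-in for a real CRM lookup

fetch_customer_record(policy, 42)      # in scope: allowed
try:
    fetch_customer_record(policy, 99)  # a different customer's data: denied
except AccessDenied:
    pass
```

The "couple of prompts" failure mode from the conversation is the second call: without an explicit scope check in front of the data source, the agent would happily return customer 99's record too.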

Speaker 1:

Do you think that the Claude Bot, now Open Claw, sort of Mac mini personal open source agent story is useful in concretizing what is going to happen in the enterprise over the next twelve months? Has it been a reference point in discussions with the leaders that you work with?

Speaker 6:

I think it is. It's another catalyzing event, because I think in some cases, people are a little bit behind on the power and the business value that could be delivered from these

Speaker 2:

Yeah.

Speaker 6:

Agentic systems. I was just at my kid's soccer game the other day, and they were talking about Clawbot. These weren't tech people; they're on the sideline. And they were talking about how powerful it was and how they're gonna get restaurant reservations with it. I think that catalyzes the one side of the equation, which is go, go, go.

Speaker 6:

Yeah. And I think, unfortunately, you're also gonna see more security issues, more connections exposing personal data on that Mac Mini. It wasn't a new Mac Mini. It was, like, your family Mac Mini, and now your family Mac Mini's details are all over the web.

Speaker 1:

You're gonna

Speaker 6:

see some of that. And so we have to get both right. We have to get the business value and get people headed in the right direction, and then also put the right controls in place.

Speaker 1:

How do you think business models evolve in a world of, you know, potentially millions of agents? Companies aren't really used to running payroll for millions of people, or identity for millions of people. Some are. But how do you think all of that changes going forward?

Speaker 6:

Well, I have a bunch of super strong thoughts on this. Some Yeah. Some of them maybe counterintuitive. First of all, I think in five years, there's gonna be way more software engineers than there are today.

Speaker 1:

Woah.

Speaker 6:

And it's a little bit of a trick answer, mainly because I think there's gonna be way more software.

Speaker 1:

Okay.

Speaker 6:

Yeah. And, you know, they'll be more productive, but you'll have more software engineers. I guarantee you all these companies Yeah. Meta, Amazon, Google, Anthropic, they'll all have more software engineers in five years than

Speaker 1:

they do today. And double clicking on that, do you think it's that we're going to see more computer science degrees become software engineers in the traditional sense, or that the designer, that PM, that business leader, maybe even your CFO, all of a sudden is a software engineer, or is creating software?

Speaker 6:

So I think about what a software engineer does. I'm not trying to cop out and say all people are gonna be vibe coding and that's my software engineer count. I mean people building and architecting software and knowing the innards of how it works. Yeah. They're not gonna be writing as much code, clearly, but I'll give you a secret: for the last thirty years, we've had integrated development environments that are helping us write code more efficiently.

Speaker 6:

This is a whole other level, but Yeah. It's not a new thing to get better tools to help us make software. So there's gonna be more software engineers. Yeah. But there's gonna be even more than the increase in software engineers.

Speaker 6:

There's gonna be way more software, which is super exciting and super daunting. Now, if you layer on, you know, are people gonna build all their own software or buy software from vendors? I think that, unequivocally, they're gonna buy more software from vendors, for sure.

Speaker 1:

Yeah.

Speaker 6:

Now, the trick Yeah.

Speaker 2:

People that are just bearish on enterprise software have to remember that there's been open source equivalents of, like, almost every major platform forever. Yeah. It's like you could just pick it off the shelf and Yeah. Run it and maintain it and

Speaker 1:

It's easy.

Speaker 6:

Right? Yeah. And this kinda goes back to the first thing I said, about more software engineers.

Speaker 2:

Yeah.

Speaker 6:

There's gonna be more software. So there's gonna be more packaged software, and there's gonna be more custom built software. Like, for example, I think there'll be a lot of companies using these software development tools to build the applications that have to span the multiple silos and the multiple systems. Mhmm.

Speaker 6:

Right now, these companies have a very hard time building something that goes from SAP to content management to Salesforce in an end to end process. Yeah. And maybe there'll be some more customization and more vibe coding and more, you know, people doing cloud coding to build that. Yeah. But the core vendors and all of the core data sources and the core business logic and processes, that's gonna get bigger and bigger.

Speaker 6:

And I'm 100% convinced of that, which is dangerous. You should never be 100% convinced of anything. I'm trying to make a point dramatically here.

Speaker 1:

No. I

Speaker 2:

love it. Last year, and continuing into this year, it was all about coding agents. Based on the conversations you have with Fortune 500 companies and things like that, maybe slightly outside of just San Francisco tech Mhmm. Like, what other forms of agents are you most bullish on for this year?

Speaker 2:

Orchestrators or orchestration?

Speaker 6:

So, back to what I was talking about before. Now, by the way, I don't think the vendor lineup of established software companies is gonna stay the same. Just because I think there's gonna be more and more vended software sold Mhmm. It doesn't mean that the people leading all the categories right now are gonna keep leading. I mean, this is an amazing time in our industry, and personally, it's super motivating for me and for our team.

Speaker 6:

But the most amazing thing is every leadership category is kinda up for grabs. Yeah. I mean, how crazy is this? Imagine five, ten years ago thinking, oh, AWS might not be the number one infrastructure provider.

Speaker 6:

Yeah. I mean, that's crazy. You thought that was locked in,

Speaker 2:

but you

Speaker 6:

see Google and Oracle and these companies, and that's up for grabs. And I think the same thing is true for all of the categories. So it's not like there's no potential for disruption. In our world, it's very possible that, if you look at cyber, in five or ten years the biggest category in cyber is identity.

Speaker 1:

Yeah.

Speaker 6:

I mean, that's pretty exciting. Right now it's, you know, firewalls and SOC and endpoint, but it could be identity. And everyone sees this. Palo Alto Networks sees this, and ServiceNow sees this. They're doing acquisitions, and that's exciting.

Speaker 6:

But, anyways, back to my point: it's exciting, and things could change. And the really exciting agentic, or new, type of software, I think, is the type that crosses all these silos. If you think about it, that's what people were uniquely good at doing. Mhmm. They were uniquely good at doing, okay.

Speaker 6:

I'm gonna look in this system, and I know about this little corner case in this other content management system, and I can apply some intelligence to that and go back and update. That was very hard for packaged application vendors to do, because they're inherently in their own silo, and they have to make things comply with their own business rules and database. I mean, I worked in enterprise software for all the years before I started Okta. It's very hard for an HR company to think about end to end, or a salesforce automation company to think about end to end. So I think these agentic systems that can cross these silos are a very, very exciting opportunity.

Speaker 1:

Last question for me. Is 2026 the best year to move to San Francisco in history?

Speaker 6:

Oh, now you're getting me really worked up. My strong conviction: I love San Francisco. And I think all of these people that said San Francisco is dead and it's a mess were totally overblown. San Francisco is an amazing, beautiful place.

Speaker 6:

And no matter, you know, is the government perfect? No. Is any government perfect? No. But it's an amazing place and all the people still wanna come here.

Speaker 6:

Yeah. I mean, how many people who moved to Miami are now coming back? And where are the great new companies being started?

Speaker 1:

I think

Speaker 6:

So I love San Francisco.

Speaker 2:

Tech is just Behind

Speaker 6:

me that says San Francisco. Yeah.

Speaker 2:

Yeah. You got a bridge. You got a bridge. Hey, we think

Speaker 6:

that the Golden Gate is not really gold. Yeah. But whatever.

Speaker 1:

But your gate is We

Speaker 2:

think you should pitch the city to sponsor the Golden Gate Bridge. Yeah. Let's put it

Speaker 1:

It is. It's a big metaphor. Hanging under it

Speaker 2:

For Okta connects

Speaker 1:

your identity as you go into the city. Yeah. This is great.

Speaker 6:

Connect Marin to to Fort Point. Yeah.

Speaker 1:

Yes. This is integration. This is fantastic. Well, thank you so much for taking the time.

Speaker 2:

Yeah. Great to have you come hang out.

Speaker 6:

Thanks, man.

Speaker 1:

We'll talk to you soon. Soon. Come back soon.

Speaker 2:

Thanks. Love it.

Speaker 1:

Let me tell you about Cisco. Critical infrastructure for the AI era. Unlock seamless real time experiences and new value with Cisco. And without further ado, we have doctor

Speaker 2:

We got it.

Speaker 1:

Alex Zandowski from the biological computing company finishing out our Lambda lightning round.

Speaker 2:

What's going on?

Speaker 1:

How you doing, doctor? What's

Speaker 3:

going on?

Speaker 1:

It's good. Nice to see you.

Speaker 2:

Lots of whiteboards today.

Speaker 1:

Yes. We'll ask you to explain that shortly. But first, give us an introduction to yourself, since this is your first time on the show.

Speaker 5:

Yeah. Absolutely. So, my background, I'm a neurosurgeon, a neuroscientist.

Speaker 1:

Mhmm.

Speaker 5:

I've been growing neurons on electrodes for about twenty years.

Speaker 1:

Okay.

Speaker 5:

I spent the bulk of my career studying how neurons and brains process information.

Speaker 6:

Okay.

Speaker 5:

First in humans, by implanting electrodes, and then in the laboratory, by growing neurons, as I mentioned, on electrodes.

Speaker 1:

Yeah.

Speaker 5:

And, yes,

Speaker 1:

When you say growing neurons on electrodes, does that look like a petri dish? Is that organic, is that meat space, or is this a digital representation of the neuron?

Speaker 5:

So there's no neurons on this. Okay. But this is the dish that we grow them on.

Speaker 1:

Okay.

Speaker 5:

You can see it here. Yeah. The little the little box in the middle has about 5,000 electrodes.

Speaker 2:

Wow.

Speaker 5:

And, yeah, and we grow them on there, and they're alive, and we use them to process information as computers.

Speaker 1:

They're biological. These are cells that are all connected to create the network. Fantastic. What can you do with it? I feel like a lot of people are pretty happy with their NVIDIA GPUs.

Speaker 1:

What do I need a biological computer for?

Speaker 5:

Yeah. Absolutely. So first of all, neurons and brains are far more energy efficient than silicon. It's well known. And our first application is to help solve this looming energy crisis that's happening.

Speaker 5:

Mhmm. You know, second of all, neurons can change the way they're connected to each other. As you all know, current hardware is rigid. It doesn't change. Yeah.

Speaker 5:

Whereas, you know, as you're learning about our product and our company today, your brain is changing the way its neurons are connected to each other. So we're leveraging all of that for compute. Mhmm.

Speaker 1:

How much of this is a science project? How long will you be in R and D? What is your timeline? I mean, we're not afraid of science projects here. We talk to folks all the time who give us timelines like, we're gonna be live in 2035, and we love that.

Speaker 1:

But it feels like you're making some pretty advanced progress. So talk to us about how the business shapes up when you see this going into production, what the milestones are.

Speaker 5:

Yeah. So I have, like, a big note here, sticky note on my monitor. This is not science fiction. This is not research. You know, if if I was gonna get anything across, that would be the thing.

Speaker 5:

Right? We're doing

Speaker 1:

this now. Okay. We're deploying it.

Speaker 5:

Yeah. So far, our products are twofold. One, we're using the biological network to create a software layer that plugs directly into ANNs, artificial neural networks, to make them better, faster, cheaper.

Speaker 1:

Okay.

Speaker 5:

So that's currently happening. In addition, and you asked about what this thing is, this is what we call our algorithm discovery platform. We're using real brain cells to identify what's coming after the transformer. Yeah. The way we do that is we build state of the art models in house, and we parameterize them.

Speaker 5:

We use the biological network to understand neuroscience principles, and to define the neuroscience principles that can be plugged into these state of the art models, and we do this in a loop.

Speaker 1:

Mhmm.

Speaker 5:

And in the end, these plugins improve them and make them better, faster, cheaper.
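The loop he's describing, build a model, derive a candidate principle from the biological network, plug it in, keep it if it helps, and repeat, can be caricatured in a few lines. Everything below is a hypothetical placeholder (the company's actual platform is not public): `measure_biological_network` stands in for recordings from the electrode array, and the numeric "score" stands in for a real benchmark.

```python
# A hedged caricature of the algorithm-discovery loop described above.
# Every function here is an invented stand-in, not the real system.

import random
random.seed(0)  # reproducible toy run

def evaluate(model):
    # Placeholder benchmark: in reality, quality/cost on a real task.
    return model["score"]

def measure_biological_network():
    # Stand-in for recordings from neurons grown on the electrode array,
    # distilled into a candidate "neuroscience principle".
    return {"name": "hypothetical-plasticity-rule", "strength": random.random()}

def apply_principle(model, principle):
    # Plug the candidate principle into the model as a modification;
    # it may help or hurt, which is why the loop keeps only improvements.
    return {"score": model["score"] + (principle["strength"] - 0.5)}

model = {"score": 1.0}
for _ in range(10):  # "we do this in a loop"
    candidate = apply_principle(model, measure_biological_network())
    if evaluate(candidate) > evaluate(model):
        model = candidate  # keep the plugin only if it improves the model
```

The point of the sketch is only the control flow: measure, plug in, evaluate, keep the improvements, and iterate.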

Speaker 1:

Do you have any reflections on the shortcomings of current AI models, why they differ? There's that sort of famous comparison: a teenager can learn to drive a car in a month, or, you know, a couple hours in the seat. And yes, we're able to train Waymos, but on an energy and time basis, apples to apples, you're effectively doing, like, a million hours of training, and you don't have to give a human a million hours of training to drive a car. So what is different about the current structure of AI systems versus the way humans actually learn?

Speaker 5:

Yeah, absolutely. So that's a bit of a history lesson. Going back to the 1950s, we had the first perceptron, built on the McCulloch-Pitts neuron, which was actually based on how we understood, at least at the time, how neurons function. Fast forward to the 1980s and backpropagation, which won the Nobel Prize a couple of years ago. That's when a divergence happened, because backpropagation is not a real neuroscience principle.

Speaker 5:

And ultimately, since then, software and hardware became very unbiological in the sense that, again, they're rigid in the way they process information. At the same time, though, interestingly, around the nineteen eighties, neuroscience flourished.

Speaker 1:

Mhmm.

Speaker 5:

We started

Speaker 1:

to be able to

Speaker 5:

grow brain cells in a dish. We started to understand what the neural signals coming from a brain actually mean. This is the basis for a lot of the brain computer interface companies that are currently out there. And so three years ago, we started the company and said, okay, now we're gonna take a 2026 approach to understanding how neurons process information, and we're gonna apply it toward novel AI systems.

Speaker 2:

Mhmm. How do you keep the neurons alive? What do you feed them?

Speaker 5:

Yeah. So they live in a medium that's got proteins. It's got sugar in it. It's actually relatively simple.

Speaker 1:

Peat, it's Peat. It's Ray Peat. It's on the Peat diet. It's got sugar. Jordy, you're

Speaker 2:

gonna approve of this. This is great. Peat-er superfood.

Speaker 5:

Yeah. These techniques have been kind of perfected over the years. Again, this

Speaker 2:

is one of

Speaker 5:

the things that allows us to do this. They can stay alive for over a year. Yeah. Yeah.

Speaker 1:

Sugar, protein, any olive oil? Any fats? We got some Bryan Johnson Snake Oil right here. You can put that in there.

Speaker 5:

Definitely some olive oil. We'll try it out.

Speaker 2:

What Yeah. Coming like

Speaker 1:

Yeah.

Speaker 2:

Fast forward to the end of this year Mhmm. What will you have done that allows you to feel comfortable taking, like, you know, Christmas off, or New Year's Eve, something like that?

Speaker 5:

Yeah. That's a great question. So customers, collaborations, design partnerships. You know, right now, the adapter product that we have plugs into video generation models. And so we are open to working with any foundation model lab that's compute constrained, so we can make their models more efficient.

Speaker 2:

Which is every lab?

Speaker 5:

That's right. I mean, you know, similarly, as I mentioned earlier with our algorithm discovery platform, we're looking for partnerships with the hyperscalers. I mean, you know, they have research arms that are interested in what's coming after transformers.

Speaker 1:

Yep.

Speaker 5:

Right? And so we'll take the holidays off once we're working with them as well. Mhmm.

Speaker 1:

Well Very cool. You raised some money. How much did you raise?

Speaker 5:

I'm very happy to say. So we raised a $25,000,000 seed.

Speaker 2:

Seed? I'm out. There we go. But what's after a mango seed? What's bigger than a mango seed?

Speaker 2:

Because I think of a mango seed as, like, eight. This is like a watermelon.

Speaker 1:

Yeah. No. Watermelon has small seeds. Watermelon: big fruit, small seeds. Mango is the biggest seed.

Speaker 1:

It's the end of the road. After that, you just go giant AI seed round. Well, congratulations, and thank you so much for coming on the show and breaking it down for us.

Speaker 2:

Yeah. Come back as you guys have breakthroughs and things like that. Come back and tell us more. You don't need to have fundraising news to come on and talk about biological computing.

Speaker 5:

You guys you guys come to the lab.

Speaker 1:

Yeah. That'd be great. San Francisco, Mission Bay.

Speaker 5:

You guys are always welcome to come.

Speaker 1:

Amazing. See some brains. Thank you so much. We'll see you soon. Have a good rest of your day. Let me tell you about Graphite.

Speaker 1:

It's code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. And without further ado, Andrew Huberman is the founder of Huberman Lab. He's here in the TBPN Ultradome. Welcome to the show.

Speaker 2:

Here we are.

Speaker 9:

Cheers, guys. Cheers.

Speaker 2:

In a can.

Speaker 9:

Can you hear me?

Speaker 1:

Yes. We can hear you loud and clear. Great. Give us the latest on Mateína Yerba Mate.

Speaker 9:

Yeah. Zero sugar cold brew Yerba Mate. I'm half Argentine, so I've been drinking mate since I was a kid.

Speaker 1:

Okay.

Speaker 9:

It's got caffeine, and it's a super smooth arc of caffeine. That's why I love it. Yeah. But the zero sugar cold brew, yeah, we made five different flavors. It's organic, in BPA-, BPS-, and PFAS-free cans, because a lot of people worry about that stuff.

Speaker 1:

Yes, sir.

Speaker 9:

Nobody wants a credit card's worth of microplastics in their brain, testicles, and ovaries. Yeah. And it's free of all that, and it's awesome. I have a very high caffeine tolerance, so I drink, no joke, like four to six of them a day. But most people do just fine with one or two.

Speaker 2:

I'm right with you. I don't know about that, it's crazy. So we have, like, such a regimented schedule that my caffeine intake during the week starts as I wake up. Yeah. I don't follow your protocol around waiting.

Speaker 2:

I get right into the Yerba Mate. I've actually A/B tested it. Yeah. I personally noticed no difference whether I wait or get started right away.

Speaker 1:

So you have one in the car before the gym?

Speaker 2:

I have one in the car on the way to the gym.

Speaker 1:

Caffeine before the gym is such a hack, and I I don't do it, and I need to get back into it.

Speaker 2:

I don't know.

Speaker 3:

Sometimes it's tough.

Speaker 1:

How underrated is caffeine as a performance enhancing drug in the gym? I feel like that's Oh. It's good.

Speaker 9:

Well, look, if you have anxiety issues, you should be careful with caffeine. But if you don't, caffeine's great. It's definitely a performance enhancer, so much so that, you know, the people that regulate this stuff in sport set an upper limit on the milligrams of caffeine that you can ingest. Yeah. But I cut off the caffeine intake at about 2 PM.

Speaker 9:

Mhmm. But listen. I also drink caffeine first thing in the morning if I'm going to work out in the first two hours of the day.

Speaker 2:

Sure. Yeah.

Speaker 9:

And that's about three, four days a week I do that. Other times, I, you know, I wait a little while. So it's really an option. But I love Yerba. I also drink coffee.

Speaker 9:

I'll have a couple double espressos, and I love it. But like you guys, it's get up, hydrate, electrolytes, caffeine, family, work, work, work, work out, work, work, work. And then by 08:00 or 09:00, if you cut the caffeine out earlier, you're gonna crash on a steep cliff, and it feels great because you fall asleep easily.

Speaker 1:

Yeah.

Speaker 2:

Yeah. I always stop basically when the show ends. That's when I'm having my Yep. Last caffeine of the day. I sleep amazing and then get it started again. Yeah.

Speaker 2:

The challenge is on the weekend, because in the week, our schedule is so regimented. And then on the weekend, it's a little more flexible, and I'll start feeling it if I haven't had my fourth yerba.

Speaker 1:

My fridge at home is filled with Mateína. Yeah. Because if I don't have it, I'm, like, not a good dad. I'm, like, sleeping.

Speaker 9:

I mean, in South America, you'll see people drinking yerba after meals because it helps with digestion. It's a mild appetite suppressant. I like to work out fasted, and it's not going to kill your appetite, which is good because, you know, we still need to eat. Everyone's trying to eat less these days, it seems, but you still need to get the nutrients.

Speaker 9:

And then it's high in antioxidants. It's got a lot of positive benefits, and it's not necessarily an alternative to coffee. You could do both. But it's not an energy drink, quote unquote, in the sense that it doesn't have alpha GPC, taurine, all those things that a lot of energy drinks have. Not to knock those things, but, you know, I think yerba, coffee, those Yeah.

Speaker 2:

The challenge with making traditional energy drinks a habit is you're supplementing a bunch of things. You have to understand you're signing yourself up to be constantly dosing supplements that a lot of people just aren't fully aware of, because they just look at how much caffeine it has and don't really pay attention to the other stuff.

Speaker 1:

How is the business? Oh, sorry.

Speaker 9:

No. Go ahead.

Speaker 1:

I was just curious about the state of the business, distribution. I mean, you have a massive audience, so the logical place to start is direct to consumer. But the final boss of almost every consumer product is retail. How are you thinking about the rollout of the product?

Speaker 9:

Yeah. It's going great. I mean, we partnered on Mateína. So I'm a partial owner with SciComm. You guys know Rob and Yeah.

Speaker 9:

The rest of the guys at SciComm, Andrew Wilkinson from Tiny. Oh, yeah. And we are available online through Mateína, through Amazon. We launched in Whole Foods, in Costco Canada, and in Sprouts Markets just recently. And pretty soon, it will likely be everywhere you look.

Speaker 9:

So and people love it. You know, I'm very happy when people say they love it. It's delicious. You know, 90% of the world's adults drink caffeine every single day. It's the most popular drug in the world.

Speaker 9:

And most people need it to function pretty well. But recent studies, which looked at coffee, showed that a couple of cups of caffeinated coffee per day seems like it can offset some of the dementia risk. Now, with a study like that, there are a bunch of, you know, issues; it's not necessarily causal. Right? People drinking caffeine are probably working more, focusing more, more cognitively engaged, etcetera, etcetera.

Speaker 9:

But as long as it doesn't send you into a spiral of anxiety, you know, caffeine is quite good for you. That's very clear.

Speaker 2:

Sort of random, but do you have any insight into what different groups of Olympic athletes are doing? I've heard some stories; there's such a range with these athletes, where you might have an Olympic snowboarder who's, like, hardly sleeping and just having some energy drinks and heading out there, and then you have the guy doing curling who's, like, the most regimented, or whatever. Obviously, everybody's different. But what do you think the best athletes in the world are doing this Olympics?

Speaker 2:

Like what's cutting edge that maybe normal people aren't thinking about?

Speaker 9:

Yeah. So, I mean, this also showed up in the NFL a little while ago. People are taking vasodilators. You know, Viagra, and tadalafil, which commonly goes by Cialis. Yeah.

Speaker 9:

You know, tadalafil is a vasodilator. It lowers blood pressure. And people know of it as Cialis, for erectile dysfunction, right? But it was originally developed as a drug to improve prostate health, because when you vasodilate, when you dilate the vasculature, you get more perfusion of the prostate, and you need perfusion of the prostate to avoid infections, among other things. The basic takeaway is that most every male 40 and older should probably be taking somewhere between two point five and five milligrams of tadalafil, not necessarily for erectile function, although it will augment that as well, but to lower blood pressure and to improve vasodilation for the brain, for the prostate.

Speaker 9:

And I'm not saying this as a biohacker or a podcaster. We had our head of male sexual health from Stanford. His name is Mike Eisenberg. He's an MD PhD. He is best in class in terms of male sexual health, endocrinology, etcetera.

Speaker 9:

And that's his recommendation, and it's one that most every male, maybe 35 and older, but at least 40 and older, should take just as a preventative. You know, strokes

Speaker 2:

So you think athletes are effectively taking it as a type of performance enhancer? To lower

Speaker 9:

Yep. To lower anxiety pregame, because it lowers your blood pressure a little bit. They think it might be a performance enhancer by delivering more blood to the muscles. You know, you have to look at what's on the banned list, and there's obviously lots of things on the banned list. And then there are going to be the things that people use because they're not on the banned list, and they'll be banned by the next Olympics.

Speaker 9:

That's often how it goes. I've paid a lot of attention to this over the years. If you looked in the nineties, you saw things like bromocriptine and apomorphine, things that, people who know those names will recognize, augment acetylcholine and dopamine in order to improve fast transmission of neurons, and alertness and focus, for getting quickest out of the blocks in sprinting, for instance. Those drugs are now banned. Okay?

Speaker 9:

In sports where staying calm is an advantage, any shooting sport, for instance, most of the banned drugs are things that lower heart rate. Imagine skiing up to a target, picking up your gun, and shooting: the less you're shaking, the better. So there's a bunch of banned drugs, and then there are all the things that people will take to lower their blood pressure. Some of those are not banned, and this is kind of how the cycle goes.

Speaker 9:

We heard about the skiers injecting their penises in order to expand their bulge size, as it were, in order to get a little more air resistance, in order to perhaps capture a little more airtime. That hit the press mostly because it's kind of humorous when we think about it.

Speaker 2:

That was real. But if you look at any sport,

Speaker 9:

you're going to see this. You're going to see peptides like BPC-157. I believe it's banned. But that's a naturally occurring gut peptide.

Speaker 9:

Very hard to test for, because you have tons of it circulating, and the synthetic versions may or may not differ from that version so much that you can detect it.

Speaker 2:

Yeah. Interesting. Do you

Speaker 1:

have a question about peptides broadly?

Speaker 2:

Yeah. I would love, like, an update since we last talked. Probably tens of millions of Americans have jumped into the fray.

Speaker 1:

It's taken over San Francisco. Chinese peptides is like a whole meme in tech. I think everyone's wondering, Faustian bargain, at what level should they be experimenting with these things, what type of research should they be doing. What is the current thinking around peptides broadly?

Speaker 9:

Okay. I'll hit the key bullet points here for the novices and for the aficionados. So a peptide is a short chain of amino acids. You know? So insulin's a peptide.

Speaker 9:

So when people say peptides, they're generally referring to things that people take by injection or orally in order to augment some effect, healing, growth hormone release, etcetera. Just as when we hear the word steroids, I mean, estrogen's a steroid hormone, but when we hear about steroids, we're typically thinking about androgenic steroids, right? Testosterone, etcetera. Okay. So when it comes to peptides, some are FDA approved, right?

Speaker 9:

Some are made by drug companies and have passed through the patent process. Many of the so-called growth hormone secretagogues, tesamorelin, ipamorelin. These are things that cause the pituitary to release more growth hormone. These have been tested, FDA approved. Most people are not getting them from drug companies.

Speaker 9:

They're getting them from either compounding pharmacies, which are a lower cost alternative where there isn't as much stringency in terms of purity, so it varies by compounding pharmacy, or from what we call gray or black market sources. So when you say Chinese peptides, what you're talking about is you can go online and buy peptides that are listed or labeled as for research purposes only. And that's what a lot of people are injecting. And that is a concern. And, you know, I'm not saying this because of my relationship to Stanford or anything like that, but I'll just say, I do not think anyone should inject peptides that are labeled as for research purposes only, because even if it says 99% purity, the 1% is likely to be something called LPS, lipopolysaccharide, which can cause inflammation.

Speaker 9:

It's not a good thing to be injecting. Maybe one injection doesn't do anything, but when you're injecting every day, you know, five days a week and for a month, you can start running into some issues, autoimmune effects and so on. That said, let's just acknowledge what people are doing. And there are some peptides that have very impressive results for which there's essentially no good human data, lots of animal data, and lots of anecdotal data out there. For instance, people will claim, although there's no really good clinical trial, that BPC-157, which is a synthetic version of a gut peptide, can accelerate their wound healing.

Speaker 9:

What's the problem? We don't have a control experiment. It acts systemically. So I can't inject it in one elbow and not the other if I have two elbow injuries and say, okay, it worked better for this one, not the other.

Speaker 9:

It could be placebo, but people claim that it helps them

Speaker 2:

You'd need a study where they break both of your arms. Just miserable. But the issue is that it goes through your whole body. Yeah.

Speaker 9:

Yeah. You know? And it increases vascular growth. It increases nerve growth. It increases cartilage regrowth.

Speaker 9:

From the animal studies, we know that. So the logical backbone is there. People are using it like crazy. The lethal dose is very, very high. I don't even know if anyone's achieved a lethal dose.

Speaker 9:

The one study that was done in humans looked at, believe it or not, BPC enemas at very, very high doses for a gut issue, colitis, I believe it was. Listen, I think you have to just decide what your margin for risk is. The other one that's very interesting, that not a lot of people are talking about, is pinealine. I've tried it. I don't take it any longer. I tried it for a little while, and it doubled my REM sleep.

Speaker 9:

I was getting close to three hours of REM sleep per night. It's thought to increase regeneration, or the health, of pinealocytes, the cells of the pineal gland, which make melatonin and so on and so forth. But the big one, the peptide that's gonna Yeah.

Speaker 2:

I think I tried that, the one you just mentioned, last week. I was hanging out with David Senra, and a friend of ours had it. And it was in, I think it was in a capsule form. I expected it not to work very well.

Speaker 2:

Friend said it works fine. I tried it. No effect on my data. How do you think about form factors?

Speaker 9:

Injectable only, and on an empty stomach, meaning not having eaten in the previous two hours, thirty minutes before bed.

Speaker 2:

Because that's an issue, because it's a gray market right now. People will just be like, yeah, I'll sell you a peptide in this capsule. Sure. Technically, it's what I'm saying it is, but I'm giving it in a form factor that doesn't actually work, and so it's all placebo.

Speaker 2:

But I just looked at the data and I was like, I tried this two nights in a row. It had zero effect. Mhmm. And I don't think a lot

Speaker 9:

Yeah. I doubt it was pinealine. It might have been, and sometimes they'll throw some melatonin in there so you feel an effect, at least some people do. You've got to get it from a compounding pharmacy or a reliable source. I'm not suggesting people do it.

Speaker 9:

I actually stopped taking it because, and this is not a promotion, but I take AGZ, which is their sleep formula. I helped design it. My sleep on AGZ is amazing. Amazing. I'm getting two and a half hours of REM sleep per night.

Speaker 9:

I get tons of deep sleep. I feel great. So I don't need pinealine. But the peptide that's going to change everything, and this is more up your guys' alley because this gets right into the business sphere, is Lilly has a patent. Eli Lilly has a patent on a peptide called retatrutide.

Speaker 9:

Retatrutide is sometimes referred to as GLP-3. It's a triple agonist of GLP-1, which we're all familiar with, glucagon, and something called GIP, I believe. If I'm wrong, someone will correct me, of course. The interesting thing here is that in a phase three clinical trial in humans, it caused up to one third loss of body weight. So a loss of one third of body weight in about six months' time.

Speaker 9:

And it seems like there's some degree of muscle sparing. It works phenomenally well for weight loss. The bodybuilding community has been onto this for a long time, right? Bodybuilders always get there first. Then what happens is, in Florida and the United States, doctors who work out in gyms with people who know how to gain muscle and lose fat quickly start experimenting.

Speaker 9:

Then it goes into their high level clients. Then it shows up in Hollywood. Everyone lies or avoids answering the question of how they got so jacked. They talk about eating chicken breasts, and they're actually taking growth hormone and Winstrol and retatrutide. Yeah.

Speaker 9:

And then everyone hears about it, and they start, you know, whatever, looksmaxxing with it. And then it's a big giant mess out there. Here's the deal. We are looking at a potential change in the laws around peptides such that buying peptides would become illegal. I actually think this is a terrible idea, but the motivation behind this is largely because Lilly owns the patent.

Speaker 9:

People are already injecting retatrutide. They can buy it out there. They're taking it at maybe a third of the dose that's recommended in the trial. And they are seeing phenomenal results at fat loss. Not a week goes by where I don't get a hundred questions about retatrutide to my phone, my personal phone, let alone email, etcetera.

Speaker 2:

Wow.

Speaker 9:

So the reason why peptides might soon become illegal to purchase through even compounding pharmacies, let alone gray market, is because Lilly would like to protect the domain over that patent. This is going to be a trillion dollar drug. Mhmm. Trillion dollar. But people are already taking it.

Speaker 9:

I've never tried it, but people I know who have taken it said it works phenomenally well at a fraction of the dose. So

Speaker 2:

What are some benefits outside of weight loss? Are there secondary effects?

Speaker 9:

Reduced appetite for alcohol, reduced impulsivity, perhaps. Remember that the receptors for these things are all over the brain and body. And, you know, the other GLP-1 agonists have been looked at for alcohol use disorder, for binge eating disorder, and for a lot of, you know, behavioral and impulse control disorders, because a lot of the receptors are in the hypothalamus, which controls a lot of impulsivity type behaviors and anxiety related behaviors. So, you know, my stance on things is, do as you wish, but know what you're doing. With the important caveat that if you are an adult and your body is fully formed, you are in a completely different landscape than if you're 14, 18, 20.

Speaker 9:

You know, I worry very much. We're hearing more and more about young people, especially women in the past, but now also young guys, who are thinking, oh, you know, they've got to take TRT. That's crazy. You know, I've talked about the fact that at 45, I'm 50 now, at 45 years old I started taking a microdose of testosterone cypionate, a microdose, but micro, so I maintain fertility.

Speaker 9:

I've checked that. I take HCG, etcetera. Never in a million years would I suggest anyone else do that in their twenties or thirties, unless they have a serious issue. With peptides, there's the risk of where you're getting it from, purity, right? But there's also the risk that if you're augmenting growth hormone in your teens or your twenties, you can really mess up your hypothalamic-pituitary axis, all the organs of your body, in major ways.

Speaker 9:

You can shut down fertility. If you come off these things, you may revert to a state of, you know, depression, etcetera. I just think the whole looksmaxxing phenomenon is really, really dangerous and foolish, especially in people younger than 40.

Speaker 1:

Yeah.

Speaker 9:

Right? It's just

Speaker 2:

Well, especially because there's a way to looksmaxx without hitting your face with a hammer.

Speaker 1:

Sleep, diet,

Speaker 2:

exercise. You know, taking

Speaker 1:

Get you 99% of the way there.

Speaker 9:

Also, see, you need to evaluate kind of where you are at the very end of puberty. Puberty is a very protracted thing. You know, I hit puberty when I was 14. I didn't shave till I was 20. I swear my head and face changed shape.

Speaker 9:

I mean, over time, you've got to let yourself develop. As people develop over a long period of time, if you introduce exogenous hormones, or you start blunting estrogen, because guys will say, oh, they have too much estrogen or something, or they start taking peptides, you can really throw off the trajectory of your physiology. And then you're the person in your forties and fifties, often, who looks much older. This is the thing that people don't realize: there's always a dynamic tension between the things that give you vitality and the things that extend your life. For instance, if you take growth hormone, you'll feel more vital.

Speaker 9:

You'll gain muscle more easily, recover from injury more easily. You'll lose visceral and body fat. Guess what? You'll also age more quickly. The reason bigger dogs die earlier than smaller dogs is they have higher levels of growth hormone and IGF-1.

Speaker 9:

It's well established.

Speaker 3:

So

Speaker 9:

you get vitality, but you lose time. With testosterone, you have to ask yourself, what are you gaining? What are you losing? If you can go from, you know, having no energy whatsoever to the ability to work and exercise, increased libido, etcetera, well, then you're probably gaining years if you're doing those things. But I'm very concerned about people just, you know, trying to get, whatever, an angular jawline, you know, abs, etcetera.

Speaker 9:

I mean, it's because of phones, right? Of people taking photos of themselves. I mean, I'm old school. So I feel like if you're a guy taking pictures of your abs and posting them online, like, you need to invest some money in a psychologist, not in a peptide. But that's just, you know, a generational thing.

Speaker 9:

Right? I think I'm more impressed by what people can do with their physicality, try and hit some numbers, some lifts, some running, some hiking, some rucking. But this overemphasis on appearance on social media, I think, is going to cause a lot of destruction for people that try and chase that.

Speaker 2:

Mhmm. Do you get questions from parents who have teenagers and are considering giving their teenagers HGH?

Speaker 9:

All the time. So people who have kids that are much shorter than their peers will nowadays often, and I'm not suggesting doing this, delay puberty in their kids. They'll sometimes give them growth hormone. We know that some of the prescription drugs that help with ADHD can blunt growth at high dosages. There's actually a very interesting drug that went through phase three, or is at the end of phase three now, which was designed for excessive daytime sleepiness, called Sunosi, which is kind of a dirtier hit of the dopamine and norepinephrine and a little bit of the serotonin receptors, that is promising for ADHD.

Speaker 9:

That's a non-amphetamine ADHD drug. That's very interesting. That might be a good alternative to some of the Adderall, Vyvanse, etcetera. Of course, you need a doctor's prescription for these things. I'll just throw out a name.

Speaker 9:

I have no formal affiliation to him, but there's a YouTube channel, a guy we had on the podcast, Doctor John Kruse, K-R-U-S-E, who has covered every aspect of adult and child ADHD, from behavioral tools to the different drugs, the amphetamine based drugs, Sunosi, and all of those, modafinil. Very, very high information content, short videos, ten to fifteen minutes, nothing like mine. You know, mine will cure insomnia. His will help people. So I send parents there.

Speaker 9:

Yeah. And then look, I can say, what are you doing giving your kid growth hormone? But like, I don't know. I was never the tallest in my class. I was never the shortest.

Speaker 9:

I'm, you know, six one if I stand up straight, five eleven if I slouch. And so I do understand why people feel like they need to do this for their kids, based on what you hear out there. But when you start playing at-home endocrinologist, and now it's very easy to find a physician that'll work with you on these things, you are, you know, you're playing a dice game. So you have to think real carefully about when and if to do these things. But ultimately, you know, parents decide and kids decide.

Speaker 2:

Switching gears, I wanna talk about cannabis.

Speaker 1:

I was about to ask.

Speaker 9:

Yeah. Are you guys fans? Be honest.

Speaker 1:

No. I'm not a fan.

Speaker 2:

No. Yeah.

Speaker 9:

Me either. It used

Speaker 2:

to be I mean, I grew up in Northern California, so it was always there. It's funny, wine country is like cannabis country. You know? It's less visible than, you know, the vineyards and things like that. But

Speaker 1:

to be expensive, difficult to obtain, laced with oregano. Now it's, like, industrialized, available at the push of a button. And, I mean, The New York Times is writing about it. You see these charts, like the number of hospitalizations. That was never a thing in the nineties that I remember. And it feels like we're on the cusp of really rediscussing this as a society: are we in an epidemic?

Speaker 1:

Do we need to have a more national conversation? But what are you seeing in the research and in your world about how big the problem is?

Speaker 9:

Yeah. Well, as a personal stance, I'm largely for legal but regulated. That's just my personal stance. I want to just throw that out. That's just me personally.

Speaker 9:

I've hosted two guests, one clinician, one scientist, about this, and I did a solo episode. And in addition to making some small errors in there about receptor affinities and things like that, I made a statement, which was based on the literature I was seeing, which is that there is a population of people, especially young males who have a predisposition to schizophrenia or bipolar, meaning they have a first-degree relative with those conditions, who are at very high risk for cannabis induced psychosis, some of which doesn't reverse. Yeah. Which is really scary.

Speaker 9:

I got accused of misinformation

Speaker 1:

Mhmm.

Speaker 9:

By traditional media. Not, you know, not just being off, misinformation. And I'm going to keep my hands under the table, because my response to that, I think they can understand.

Speaker 1:

Yeah. That's crazy.

Speaker 2:

Those are

Speaker 9:

the data. The Journal of the American Medical Association.

Speaker 2:

Yeah. I witnessed it. At this point,

Speaker 1:

yeah, everyone has the experience. The anecdote is there.

Speaker 2:

Me too.

Speaker 9:

Yeah. Well, now the pendulum has swung back. Yeah. And those same publications are saying, it can cause psychosis. Okay.

Speaker 9:

Look. So there's more data now. I get it. Right? And the media game is a whole discussion unto itself.

Speaker 9:

So I had one guest come on, a very thoughtful guest, a researcher from Canada, who explained that the psychosis is often caused by over-ingestion of THC. And in his words, just to be fair to him, because he's a serious researcher, a serious scientist: when people smoke, so if they inhale it by vapor or smoking, they tend to hit a plane of high that's not excessive, and the risk for a psychotic episode is lower. Then the clinician I brought on said that he's not observing that. He's a clinician, Keith Humphreys from Stanford. He's not observing that.

Speaker 9:

He's an expert in addiction of all kinds. And so what we do know for sure is that THC concentrations are very high. It's very easy to over-ingest through an edible versus through inhalation. Look, I think that the issue is how often, and how much, and on what genetic background somebody is doing this. Now, the problem is you don't want to be part of the experiment.

Speaker 9:

You don't want to be the person that discovers that you have a predisposition to psychosis by smoking high THC weed. We see the same thing with kratom. So with kratom, you've got kratom products that people will say, oh, you know, kratom helped them get off opioids. And I don't deny that, right? You've got activist groups that are pro kratom.

Speaker 9:

You also have people who will say they took so called kratom products, which are isolates of the molecule. This is what we tend to do in this country. We find an interesting plant that has some psychoactive properties. And then, I don't know, maybe pull the nicotine out of tobacco, concentrate it. Next thing you know, they're making money and you're putting in, you know, five pouches a day.

Speaker 9:

They'll take kratom the plant, which is chewed. The leaves are chewed. People say, oh, it's kind of a balance between alertness and calm. Great. They'll isolate kratom.

Speaker 9:

Then they'll do kratom isolates. And now you're getting people who are having opioid-like addiction to kratom isolates, to synthetics. Likewise with THC, you had a balance between THC and CBD in the plant, different strains, etcetera, etcetera. It was kind of a community for a long time. It even has religious uses and cultural relationships.

Speaker 9:

And then all of a sudden it's like, no, we've got really high THC concentrates that you can buy at the convenience store. And then people with a predisposition are having psychotic episodes. Here are a couple of things. I'll say one thing facetious and then a couple of things serious. One, I really love to compete with people who like cannabis because unless they're a martial artist or a creative, like a musician, you're going to blast past them.

Speaker 9:

So, you know, it's kind of like recommending great Netflix series to your competitors. Yeah. Right?

Speaker 2:

Yeah. Or remote work. I recommend

Speaker 9:

Or remote work, you're going to okay. Now that's the facetious side. The serious side is, you know, a couple of guys or gals, two friends, getting together and smoking some weed. And one person's like, oh, this really calms my anxiety. It makes them feel as if they can engage socially.

Speaker 9:

And the next thing you know, they're losing motivation for a bunch of other things, layered on top of social media and the challenges of life as they always existed, but also nowadays. And the next thing you know, you've got somebody with some serious mental health issues and a full blown addiction. I do know that my colleagues that work on addiction, Doctor Anna Lembke, who's an MD who wrote Dopamine Nation, Keith Humphreys at Stanford, they'll tell you that many, many more people are coming to them, of all ages, saying they struggle with cannabis. I will also say that you're not going to get REM sleep if you are ingesting cannabis, certainly within a few hours of sleep. Anyone that quits cannabis will tell you that their REM sleep, their dreams, just come back like crazy for many months, if not years.

Speaker 9:

And you need REM sleep to offload the emotional load of prior day experiences. So I get it. And then the last thing is that there's always the comparison to alcohol. I actually did a post about this on X. But it's crazy, right?

Speaker 9:

I mean, I've talked about alcohol before. Zero is better than any. Two per week, you're probably fine. If you're going to drink more, do more things to, you know, support your lifestyle. I'm not a teetotaler.

Speaker 9:

Do as you want, but know what you're doing. But people always say it's better than alcohol as if you had to pick one. And maybe you do. Maybe in order to socialize, people feel like they have to pick one. I don't know.

Speaker 9:

I've always been able to say whatever the heck is on my mind without any alcohol or drugs. So you don't want to be around me if I were drunk, because I'll tell everyone everything, you know? But no, I do it anyway. And I think it's just an issue of, I get it, in your teens and twenties, late teens and twenties, college, thirties. When you go to a bar, you don't necessarily want to sit around sober.

Speaker 9:

I get it. So what are you going to do? You're going to have a couple of drinks or some weed. Which one is better? It really depends on who you are.

Speaker 9:

And the comparison is just kind of a fool's errand. You're never going to solve what's better, but there are many, many more people smoking or ingesting high THC cannabis. And you have examples of people who are extremely successful, extremely ambitious, who use cannabis. But those people probably need it to not go over the peak of anxiety from their ambition. They're using it to stay in a healthy zone.

Speaker 9:

Whereas people that, you know, have more apathy, are less motivated, they should stay away from cannabis, in my opinion, if they want to make something of their life. Mhmm.

Speaker 1:

I completely agree. That's great. Yeah.

Speaker 9:

And so, yeah. Hard to fit into a post. But on

Speaker 2:

So there's been a lot of studies on the impact of social media on people's mental health.

Speaker 1:

Brain rot.

Speaker 2:

Right? Young people, teenagers. And right now, one of the most popular words, or phrases, in the world is brain rot. Mhmm. Right?

Speaker 2:

Are there any large scale studies being done on the impact of consuming an hour or two hours a day of short form video? Like, what are you tracking? Because if you just look at the data, I forget the exact number, but the average person in the world is watching, like, a meaningful amount of YouTube Shorts every day. Some people are watching hundreds and hundreds. Others are watching zero, or maybe a few.

Speaker 2:

Yeah. But there's a range.

Speaker 9:

Yeah. Well, I'll just offer a tool that I think is very useful. I mean, I don't have problems with behavioral restraint, but I do have a lockbox for my social media phone. So I took an old phone and put social media on that phone, so that if people text me social media links, I can't open them.

Speaker 9:

It also means that when I take out that phone, I'm posting or commenting. And I keep it in this lockbox, which is like a supermax prison. You can't even code out of it. It has a no-code-out setting. And I'll leave it in there about twenty plus hours a day, and it's just great.

Speaker 1:

Yeah.

Speaker 9:

It's just great. I mean, your productivity will go through the roof. I mean, it's getting very easy to blow past your peers now just by avoiding being on social media a little bit. Yeah. I mean, David Goggins said this. It's like, it's easier than ever now to be successful if you just don't do certain things.

Speaker 9:

Whereas in the past, it was like, what do I have to do to be successful? Like, the shortest productivity self help book would be, you know, lock your social media phone in a box for twenty three hours a day. And then, so I'm not aware of any studies looking at one to two hours per day. I do think that when you approach social media, or you're getting online, it's important to realize that the brain has an absolutely insatiable appetite for short form video, but its appetite for long form audio is also very high. This is why podcasts continue to grow in their reach.

Speaker 9:

People don't want to hear audio turning over quickly unless there's video associated with it. So I try to get on social media at least once a day, maybe a half hour, forty five minutes at the most. But I'm posting and commenting. I think the moment where you forget time, where it's just sucking you in, is the time where it needs to go back in the lockbox. But listen, no joke.

Speaker 9:

I listen to you guys. I listen to news on social media, and I think that's the world we're in. I think that some people also can be on social media and experience some distance from it. Other people are more kind of pulled in emotionally to what's going on. It depends if you're a creator and people are, you know, commenting about you, or you're actively engaged in the conversations, or if you're simply an observer.

Speaker 9:

But I think an hour and a half a day should be the upper limit for anyone that actually wants to accomplish something in life, and probably more like forty five minutes to an hour, unless you're watching news or you're learning. How do you know if you're learning? What we know from neuroplasticity studies is that if you reflect on something the next day, you're likely to not forget it. It sounds sort of trivial, but most of learning is anti-forgetting. In fact, there was a beautiful study done where they had people read some passage once, twice, three times, or four times, versus reading it just once and then self testing on it in their mind at some point later.

Speaker 9:

And the self testing in your mind, what do I remember? What don't I remember? Ah, I forgot that part. That extended the memory of what was in that passage out six months to a year in some cases. So the reflection on what you saw is the key.

Speaker 9:

So if you think, okay, what did you see yesterday that was of interest, where you actually learned something of value? Then you're actually learning from social media or a podcast. But if you're just getting inundated with sensory information, forget it.

Speaker 9:

That's called entertainment. You're just numbing out or they're rage baiting you. And, you know, I mean, between what happened between, like, the Minnesota stuff, right, then the Epstein thing Mhmm. And then, you know, the next political whatever, you know, the amygdala is just getting hammered. And we know when the amygdala is activated, you're getting many more sort of bits of memory stored.

Speaker 9:

But you're just getting the peak of that information. You're kind of getting a summary. So look, your brain real estate is your most valuable real estate. What are you going to put in there? What are you learning?

Speaker 9:

And so I would say, take a walk and ask yourself, what did I learn on social media today or yesterday? And there's actually some learning to be had. Like, I've reflected on what I've seen in this Epstein thing. There were a lot of scientists there. There were a lot of people that buried his offenses so that he could come out as this kind of, like, science-finance liaison. I mean, it was wild how this could happen, right, after a conviction.

Speaker 9:

And so there's some consideration to be had, but you need to get away from the content in order to think about that Mhmm. And then return to it in a way that's more, I think, nuanced and that can serve you, as opposed to just getting, you know, rage baited into it all.

Speaker 2:

Yeah. Never get rage baited. What excites you about AI? Anything?

Speaker 9:

I'll tell you where I'm using AI. I've wrapped my book. I'm doing edits on it right now. And I didn't use AI to write my book. But what I think AI is great for is designing little exams for yourself, for exactly what I just described.

Speaker 9:

I like Claude. So I'll just ask Claude to test me: give me a 20 question multiple choice test on the microbiome as it relates to the solo episode that I did, plus, you know, some recent work from my colleague Justin Sonnenberg at Stanford, perhaps the best researcher on the human microbiome in the world. And Claude will give me questions, and I'll go, oh, that's cool. I remember that. Oh, wait.

Speaker 9:

That's where my knowledge falls off. What percentage of the vagus nerve is motor, descending? Okay. Those things are important to me, maybe not you. And so I'm using it to consolidate information that I've learned prior, that I might otherwise forget, and update my knowledge.

Speaker 9:

And then I'll go to the papers, make sure the papers are actually what's linked there because, as we know, AI can hallucinate. I think that's fantastic. So it's a teacher. I'm using it to self test, which I know, based on a lot of literature, can help stamp down memories. So that's one way that I'm using it.
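[Editor's note: the self-quiz workflow described here, asking an LLM for a multiple-choice test and checking where your answers fall off, can be sketched in a few lines. This is a hypothetical illustration, not anything used on the show; the function names and the commented-out `anthropic` client call are assumptions.]

```python
# Sketch of the self-testing workflow: build a quiz request for an LLM,
# then score your own answers locally to see where knowledge falls off.

def build_quiz_prompt(topic: str, sources: list[str], n_questions: int = 20) -> str:
    """Compose a request for an n-question multiple-choice quiz on some material."""
    source_list = "; ".join(sources)
    return (
        f"Give me a {n_questions}-question multiple-choice test on {topic}, "
        f"drawing on: {source_list}. "
        "Label the choices A-D and put the answer key at the end."
    )

def score_answers(answer_key: dict[int, str], my_answers: dict[int, str]) -> float:
    """Fraction answered correctly; the misses are what to go review."""
    correct = sum(1 for q, a in answer_key.items() if my_answers.get(q) == a)
    return correct / len(answer_key)

prompt = build_quiz_prompt(
    "the human microbiome",
    ["a recent solo podcast episode", "recent papers from a Stanford lab"],
)
# The model call itself would go through an LLM client with an API key, e.g.:
# response = anthropic.Anthropic().messages.create(
#     model="claude-sonnet-4-5", max_tokens=2048,
#     messages=[{"role": "user", "content": prompt}])

# Score three answers against a (made-up) answer key: 2 of 3 correct.
print(score_answers({1: "A", 2: "C", 3: "B"}, {1: "A", 2: "B", 3: "B"}))
```

The scoring step is the point: the questions you miss, not the quiz itself, tell you what to reread and verify against the original papers.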

Speaker 9:

I love the AI that's woven into Eight Sleep, such that now I'm no longer adjusting the temperatures. It's doing it for me.

Speaker 3:

Yeah.

Speaker 9:

Love that. Awesome. I'm giving it my blood tests and it's helping me navigate that. I mean, I had a blood lipid issue that was very slight, but I wanted to keep it in range. And so I started taking nattokinase.

Speaker 9:

I have no relationship to any company that sells it. Boom, right back into range. And AI helped me understand what nattokinase does, what sort of shifts can occur. I love Claude, and I don't, you know, I mean, I'd love to work with them, but I don't have any relationship to them now. So love, love, love what they're doing.

Speaker 9:

And I just think it's super exciting. I'm not afraid of AI, but, listen, I'm born and raised in Palo Alto. So Yeah. Like, heretical to say you don't, you know, bright sight of the most recent 10.

Speaker 1:

Last question for me, and we'll let you get out of here. Tell us about the fish behind you.

Speaker 9:

Oh, yeah. So those are my discus fish. They're freshwater discus. Okay. And next to me on either side, there was an octopus on this side, and there'll be another octopus on that side.

Speaker 1:

Very cool.

Speaker 9:

The octopus died after two years. They lay eggs, they die. That's just what they do. It was very dramatic. She cannibalized her tentacles, everything.

Speaker 9:

They're a big drama.

Speaker 1:

That is

Speaker 9:

drama. Yeah. Yeah. They remind me of some online influencers who are always trying to call attention to themselves. But in the end, it self-destructs.

Speaker 1:

It self-destructs.

Speaker 7:

Is it relaxing

Speaker 1:

to you? Tell me more about why fish and Yeah.

Speaker 9:

So I've long been a fan of aquaria, since I was a kid. Mhmm. And it's just the goal of setting up an ecosystem that's really in balance. It's, yeah, it's made me very tranquil.

Speaker 9:

I've always had them in my office, in my lab. Yeah. And then some years ago, I discovered a guy named Takashi Amano, who unfortunately passed away some years ago, but he developed this thing called aquascaping, where it's really emphasizing the plants and the fish and the lighting. So we built these black box rooms. Where I'm living, so, I moved into an art gallery.

Speaker 9:

I converted a three-story art gallery into a living space. So I have a gym with my fish tanks, octopuses, yeah, a loft upstairs. And then in my basement is a no-phone, no-electronics zone where I draw, do illustrations, and prepare for podcasts. And I have it set up with these warm incandescent red lights.

Speaker 9:

My girlfriend and I like to go down there and listen to music and just hang, like, get totally away from the world. And I just love the feeling of kind of being underwater a little, and it relaxes me. And I've got a bulldog puppy that was born a week ago, and he's gonna be

Speaker 2:

here soon.

Speaker 9:

So more flora, more fauna is always good.

Speaker 1:

Let's bring the gong for the

Speaker 2:

bulldog puppy. Gong for the bulldog. Bulldogs are the best. Name? No.

Speaker 2:

Got a name yet?

Speaker 9:

Yeah. Strummer, after the great Joe Strummer from The Clash. I went to New York last weekend. My girlfriend and I went to go see Meet the Breeds, 140 purebred dogs, everything from the little teacup ones to the bullmastiffs that looked like Bert Kreischer and everything in between.

Speaker 9:

And after we left, we just looked at each other and we're like, mudded bulldog, which is what I had before. I love all the different breeds. They teach you a lot about nervous systems and temperaments, and the bullmastiff is the perfect dog for Bert. I think he's got a couple of them. But for me, it's always going to be the mudded bulldogs.

Speaker 9:

I don't think I'll ever deviate.

Speaker 1:

Very cool. Amazing. Thank you so much for taking

Speaker 2:

the time. Great to see you. Great to catch up.

Speaker 9:

Great to see you guys. Thanks. Congrats on the Super Bowl. Yeah. I hope to see you in the Olympics.

Speaker 1:

We will be. We are. We're gonna be in the Olympics. Amazing.

Speaker 1:

So Mateina's logo will be in there. Right? Yeah.

Speaker 9:

Awesome. Yeah. Maybe we gotta get Mateina into the Olympics. I gotta talk to Rob. Yeah.

Speaker 9:

I have an idea in mind, but Rob Blug

Speaker 2:

the whole team. Blug team

Speaker 1:

Blug the whole team has a Here

Speaker 2:

we go.

Speaker 1:

Here we go.

Speaker 7:

Love it.

Speaker 1:

Yeah. Yeah. We opened the show. You're watching TBPN with us every day. Thank you so much.

Speaker 1:

Yeah.

Speaker 9:

Thanks, guys. Appreciate it. We'll talk to you soon. Take care.

Speaker 1:

Goodbye. And with that, we do have to jump on with Tokyo. So are there any posts

Speaker 2:

Any other news, Tyler? Did we miss anything?

Speaker 1:

Andrew Reed said literally everyone's capitulated on the AI trade. There's definitely no bubble. Surely now is the perfect time to declare victory.

Speaker 4:

Claude Code is growing 100% month over month. Wow. $2,500,000,000 run rate.

Speaker 1:

100% month over month, is that good?

Speaker 4:

Yeah. Also, Codex Spark came out today. That's, like, the first Oh, yeah. OpenAI model running on Cerebras.

Speaker 1:

Yeah. I tried

Speaker 4:

it out. Oh, you did? Fast.

Speaker 1:

You got it working? So you're on the pro account? Is that the $200

Speaker 2:

a month?

Speaker 4:

Pro. Yeah. $200

Speaker 2:

a month. Okay.

Speaker 4:

It's 5.3 Spark. Okay. I was on extra high, which is, like, the highest of the, you know, tiers.

Speaker 1:

Okay. I think I need to hit the update button.

Speaker 4:

Yeah. It's super fast. It's, like, so much better than Really? Like, I tried

Speaker 1:

Codex Spark. Here it is.

Speaker 2:

Extra high.

Speaker 4:

I think I tried 5.3 in Codex like maybe a week ago. That was the last time I used it. It's, like, unbelievably, it's so much faster.

Speaker 1:

That's amazing. Okay.

Speaker 4:

So this is apparently what they had internally. I think Yeah. Rune posted about this a couple weeks ago.

Speaker 1:

Yeah. Yeah. Yeah. No. He was really baiting it because he was like, it's slow.

Speaker 1:

It's slow. Like, you know, this could take us years to figure out, and then it ships a week later. Classic. I love it. Well, Tyler, build me an app, an amazing app.

Speaker 1:

The best app ever.

Speaker 2:

Right now.

Speaker 1:

Win awards and make me a legend in Silicon Valley. That's my prompt, and I'm waiting for Codex 5.3 Spark extra high to one-shot it for me. We'll check in on that tomorrow. Anyway, thanks for watching. Leave us five stars on Apple Podcasts and Spotify.

Speaker 1:

Sign up for the TBPN newsletter at tbpn.com. And we'll be back tomorrow

Speaker 2:

for a Friday show. I gotta press it twice. The bomb's still going. We love you guys. Thanks for hanging out with us.

Speaker 2:

Can't wait for tomorrow.

Speaker 1:

Great

Speaker 2:

show. It's been an honor.

Speaker 1:

It has.

Speaker 2:

Have the best evening of your entire life.

Speaker 1:

Yes. And we'll

Speaker 2:

see you Friday.

Speaker 1:

Goodbye. This is weird.

Speaker 4:

Nice work, brothers. I'll see you on the next one.