TBPN

  • (04:18) - AGI in 2035
  • (21:28) - SoftBank’s OpenAI Windfall
  • (27:42) - Buffett Goes Quiet
  • (40:08) - 𝕏 Timeline Reactions
  • (01:25:49) - Casey Handmer is a physicist-engineer and entrepreneur who earned a PhD in theoretical physics from California Institute of Technology (Caltech) and worked at Jet Propulsion Laboratory before founding Terraform Industries to build large-scale systems that convert sunlight and air into synthetic hydrocarbons.
  • (02:08:05) - 𝕏 Timeline Reactions
  • (02:10:21) - Growing Daniel, a tech industry commentator, discusses the importance of moral discernment in technology development, emphasizing that creators should strive to build products beneficial to society. He critiques the trend of funding ventures like gambling apps, arguing that such investments prioritize profit over societal well-being. Daniel also highlights the need for self-reflection among tech leaders, urging them to consider the ethical implications of their innovations.
  • (02:32:11) - JD Ross, co-founder of Opendoor and WithCoverage, discusses his transition from real estate to the insurance industry, emphasizing the importance of aligning business incentives with customer needs. He highlights the value of offering return policies in home buying to build trust and ensure product quality, and shares insights on integrating AI into business operations to enhance efficiency and customer experience.
  • (02:46:58) - Scott Shapiro, a former product leader at Facebook and Google, joined Coinbase in 2019 to lead the core consumer app product team, aiming to make cryptocurrency more accessible. In the conversation, he discusses Coinbase's launch of a new platform for individual investors to purchase digital tokens before they are listed on its exchange, with Monad being the first to sell its digital coin on this platform. He emphasizes the platform's focus on retail investors, the improved regulatory environment, and the commitment to quality over quantity in token offerings.
  • (02:55:18) - Laurence Allen, CEO of Terranova and a recent UC Berkeley graduate, discusses his company's innovative approach to combating land subsidence and flooding by using terraforming robots to inject wood chip slurry underground, effectively raising land levels. This method offers a cost-effective solution for areas like San Rafael, California, which face significant flood risks due to subsidence. Allen also highlights Terranova's recent $7 million seed funding led by Congruent Ventures, aiming to expand their projects in flood prevention and wetland restoration.
  • (03:09:39) - 𝕏 Timeline Reactions

TBPN.com is made possible by: 
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com 
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai
Privy - https://www.privy.io
Cognition - https://cognition.ai
Gemini - https://gemini.google.com

Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM - 2 PM PST, Monday - Friday. Available on X, Apple, Spotify, and YouTube.

Speaker 1:

You're watching TBPN. Tuesday, 11/11/2025. We are live from the TBPN Ultradome. The temple of technology, the fortunes of finance, the capital of capital, ramp.com. Time is money.

Speaker 1:

Save both. Easy use corporate cards, bill pay, accounting, a whole lot more all in one place.

Speaker 2:

That is right. Okay. More importantly than Ramp, it is Veterans Day.

Speaker 1:

It is.

Speaker 2:

It is a holiday. Yes. It is an important holiday. Yes. And we would like to say thank you to everyone that served.

Speaker 1:

Of course. Yes. Thank you to everyone who served. Missed

Speaker 2:

opportunity to get the secretary general back on the show.

Speaker 1:

Yes. Yes. I'm sure that big events are happening in DC and all over the armed forces community. We inadvertently did some defense tech day yesterday with Neros and Anduril. That was fun, taking a little tour of both of those companies.

Speaker 1:

And defense tech seems to be just cooking, like, at a steady state. Yes. Yeah. Steady state. And, I don't know.

Speaker 1:

Just much less froth, much less, you know, calls for is this the top of defense tech? It's more just like

Speaker 2:

Possible we already passed the top.

Speaker 1:

Of defense tech?

Speaker 2:

Yeah. What do you mean? You know, well, I mean, there had been some rounds maybe earlier this year that felt a little bit frothy.

Speaker 1:

Oh, okay.

Speaker 2:

But, like, the Neros round yesterday, it feels like they're probably the best or one of the best teams Yeah. That's focused on this problem. DJI can make tens of millions of drones a year. Yeah. We cannot.

Speaker 2:

Yeah. Ideally, multiple companies will work on that problem. And so it feels like just a fair bet in the context of our national security.

Speaker 1:

Totally. Yeah. And, yeah, I mean, there are just, like, way fewer top signals in defense tech right now, it feels like. Like, there's no crazy SPAC going out with, like, frothy memes around this stuff. I mean, there's still, like, a long way to go to deliver all the value there and really scale those companies.

Speaker 1:

But, overall, I think, like, a lot of support. You know, I was kinda joking with Soren, like, what's it like working in, like, a noncontroversial industry? And you were, like, doing a double take. But it is, like, a less controversial industry, in my opinion, than AI right now. In terms of Than chatbots.

Speaker 1:

Just in terms of, like, in tech, AI is controversial because of the adult content and the IP rights and the leverage and the debt and all that stuff. And then outside of tech, you have energy prices and politicization. Is the AI woke, or is it too right wing or left wing? Is it gonna be used to manipulate the election? Like, I think if you actually went to, like, just a random person on the street, like, you know, did a poll of popularity, like, Palmer Luckey, who for a long time has been, like, a contrarian, a controversial figure.

Speaker 2:

Yeah.

Speaker 1:

Right? It feels like he would poll more popular than a Sam Altman or an AI lab director, an AI lab CEO, even though he's, like, building weapons, like, the most controversial thing.

Speaker 2:

Part of it is, like, how hard can you crush a Joe Rogan appearance?

Speaker 1:

He did do well. Yeah. Yeah. So maybe that's it. But just in general, it feels like the American populace is not frustrated directly over weapons production right now.

Speaker 1:

A little bit, but not as much as it used to be years ago. The idea of, like, working with the military is much less of a cornerstone issue, a hot debate, than it was six years ago, eight years ago. And the conversation has moved on to job displacement, to energy production, to, you know, Internet addiction, that type of stuff. Anyway, to restream.io. One livestream, 30 plus destinations. Multistream, reach your audience wherever they are. I was noticing a pattern that everyone has ten year AGI timelines right now, and it started when Sam Altman put out that post, like, superintelligence is just a few thousand days away.

Speaker 1:

Yeah. And at the time, it was kind of odd because, like, when he wrote that post last year, everyone was like, AGI is one year away. AGI is two years away. It was like fast takeoff time. Like, everyone was very excited.

Speaker 1:

And then he came out and was like, it's a few thousand days away. And if you do the numbers, I think Tyler has the clock. How long is a few thousand days? Three thousand six hundred and fifty days. How many years is that?

Speaker 1:

Ten years. Ten. It's a decade. Puts us squarely in 2035. Right?

Speaker 1:

And so Sam came out, and he was kind of, in his blog post on the superintelligence age, talking about, you know, maybe we're a decade away. Then Andrej Karpathy goes on Dwarkesh just a couple weeks ago and says AGI is a decade away. And then Dwarkesh posts this probability density of when AGI will be achieved. There's a chance that America does it. There's a chance that China does it.

Speaker 1:

And the median, the fiftieth percentile, is exactly 2035. Exactly 2035. It could have been 2030, could have been 2045, could have been 2050, but it was 2035. Then

Speaker 2:

you know that chart, that, like, very schizo chart that says periods when to make money? Have you seen this floating around on X? It was created, like, I guess, about a hundred years ago. People reference it anytime it actually aligns to events, because it basically has years in which panics have occurred; years of good times, high prices, and a good time to sell stocks; and years of hard times, low prices, and a good time to buy stocks. And so it's basically, like, astrology for stock picking.

Speaker 1:

Okay. And and what is it saying right now?

Speaker 2:

2035 is the year they're predicting a panic will occur.

Speaker 1:

Oh, interesting. Well, that certainly aligns with all these AGI timelines.

Speaker 2:

There you go.

Speaker 1:

The other one that we mentioned on the show yesterday was George Hotz, the founder of Comma, generally regarded as, like, one of the sharpest hackers, computer scientists, AI developers. He was looking at the Tesla FSD data and projected that, if the rate of improvement continues, you basically see a doubling in capability every year. You double it out. If your benchmark is how many interventions happen per thousand vehicle miles traveled, and you compare that to how many crashes happen when a human's at the wheel, he says Tesla's about eight years away, which puts him, like, squarely in the 2035 camp, 2033, I guess, in his case. And then I was looking at METR, and this one we'll have to debate a lot more. But METR has been tracking AI's ability to complete long tasks.

Speaker 1:

So they've been tracking it, and it's growing exponentially. It used to be, like, six seconds. Now it's two hours. And, you know, when you talk to anyone who's in the AI field, they'll tell you that the agents are getting more and more capable of handling longer and longer time horizon tasks. The question is, I feel like humans don't have a time horizon.

Speaker 1:

I feel like humans, they're just born, and the goal is, like, survive, be fruitful, multiply. Right? Yep. And so I feel like if you're tracking the METR data, you need to get out to, like, thirty years, like a full career. Right?

Speaker 1:

Like, the prompt needs to be, like, go make money, and then it just goes and becomes a lawyer and, you know, lives its full life and retires after a thirty five year run. And, of course, when you track out the doublings, in 2035 METR is projecting, based on that log graph, that AI will be able to have a time horizon in the decades. And so you would have something similar to what a human would have. And so my read on the METR data is, you know, AGI 2035 again. It's maybe the messiest, the least, like, definitive.
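The doubling extrapolation being described can be sketched in a few lines of Python. This is a hypothetical back-of-the-envelope: the roughly two-hour current horizon comes from the conversation above, while the seven-month doubling time and the 2025 start year are illustrative assumptions, not official METR figures.

```python
import math

# Back-of-the-envelope extrapolation of the task time horizon trend.
# Assumptions (not METR's official numbers): ~2-hour horizon today,
# one doubling every ~7 months, starting from 2025.
current_horizon_hours = 2.0
career_horizon_hours = 30 * 365 * 24    # a ~30-year "full career" task
doubling_time_years = 7 / 12            # assumed doubling cadence

doublings_needed = math.log2(career_horizon_hours / current_horizon_hours)
year_reached = 2025 + doublings_needed * doubling_time_years

print(f"{doublings_needed:.1f} doublings -> ~{year_reached:.0f}")
# prints: 17.0 doublings -> ~2035
```

Under those assumptions the career-length horizon lands around 2035, which is the reading described above; a slower doubling time pushes the date out roughly linearly.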

Speaker 1:

And all of these predictions have different methodologies. Some of them are extremely quantitative. Some of them I'm, like, really reading into. Other people are just saying, like, it's a decade away, specifically. But what's interesting is that it just feels like ten years is this consensus right now, and there's much less diversity of opinion.

Speaker 1:

There aren't that many people saying two years anymore. There aren't that many people saying fifty years anymore. Everyone's kinda saying ten years. And I just wonder like, let's put aside trying to accurately predict when this thing happens, and let's just analyze it from a psychological perspective. Like, what does it mean when the tech community all has a consensus that something is a decade away?

Speaker 1:

Like, a decade away could just be what people say when they don't know. Like Yeah. If you ask me when flying cars are gonna happen, I'm gonna say a decade. If you ask when quantum computing, oh, that'll be a decade. Oh, Mars, yeah, that's a decade.

Speaker 1:

It's easy. It's, like, optimistic. When I say a decade, I'm not saying it's not gonna happen, but I'm saying, like, you're not gonna be able to hold me to my bad take Yep. Because you're not gonna be able to clip this in one year and be like, he was wrong. Right?

Speaker 2:

Yeah. It is funny, the quote: most people overestimate what they can accomplish in one year Yes. And underestimate what they can do in ten years. And in this situation, we're just like, yeah, we're estimating that we can achieve AGI in ten years. So what?

Speaker 2:

So it's effectively, like, inverted.

Speaker 1:

Yeah. Yeah. And so but the other weird thing that I was grappling with was, like, how I personally feel about AGI. Like, I'm convinced by these.

Speaker 1:

Like, I feel like AGI is ten years away. And I don't feel like I'm, like, coping or doing some sort of, like, mental logic jumps or something. It's just, like, if you actually forced me to put a prediction down, I probably would say about a decade. And I feel like I've been around there, been on the Kurzweil timelines. Maybe I'm 2045, more like Kurzweil, but still, like, a decade, two decades.

Speaker 1:

Like, I feel like it's real. I feel like I do believe it's gonna happen. I don't feel like I'm AGI-never. I'm not in that camp. And so then I go to, well, like, how should I actually be changing my behavior?

Speaker 1:

Like, if something big is coming in a decade, it feels like you should not just be acting normally. But it feels like there's some sort of preference falsification going on. Like, everyone's saying a decade. How many people are actually acting like it's a decade? Like, what should you be doing in the intervening years if it's a decade away?

Speaker 1:

Like, are you just supposed to, like, build technologies that are fun little dopamine rewards? Are you trying to, like, accumulate as much capital as possible before?

Speaker 2:

I think well, yeah. I mean, who plans around ten years at all? Right? People tend to Yeah. Go for things that they want today Yeah.

Speaker 2:

In some ways. Right? So if somebody like, let's say, they're in their twenties. Yeah. They're like, I wanna own a home Yeah.

Speaker 2:

By 2035. Yeah. They want that thing today, but they might understand, okay, it's gonna take some time to get there. Yeah. But actually making real changes in your life for this, like, impending scenario that's hard to predict

Speaker 1:

Yeah.

Speaker 2:

Entirely. Yeah. It's very, very difficult. And I don't know. You know, it was maybe more popular earlier this year to joke about, you know, the golden retriever maxing. Right?

Speaker 1:

Oh, yeah. We would talk

Speaker 2:

about this. Right? But it feels like that dialogue has kind of changed.

Speaker 1:

Mhmm. How so?

Speaker 2:

Part of it goes back to the other, like, enduring meme, you know, you have one year to escape the permanent underclass. And then everyone woke up, like, a few months ago and said, okay, maybe we actually have, like

Speaker 3:

A decade. Longer.

Speaker 1:

At least a decade.

Speaker 2:

Now everyone's still just focused on escape, whatever that means, whatever their number is.

Speaker 1:

Yeah. It's an odd time. I'm gonna tell you about Privy, and then we'll get Tyler's reaction. Wallet infrastructure for every bank. Privy makes it easy to build on crypto rails, securely spin up white label wallets, sign transactions, integrate on chain infrastructure all through one simple API.

Speaker 1:

Tyler, what what do you think about my take? What do you think's what do you agree with? What do you disagree with?

Speaker 4:

Yeah. I think it's definitely interesting that people seem to kind of align around this ten-year thing. I think there's also some sort of bias. Right? If you think AGI is coming in three years

Speaker 1:

Mhmm.

Speaker 4:

You're probably not just gonna be like writing blogs. Like, you're maybe you're gonna start a macro hedge fund. Maybe you're gonna go work at one of the labs to like really try to influence how it's gonna happen if it's gonna happen super quick.

Speaker 1:

Yeah.

Speaker 4:

So I think there's some sense of that. And if you really don't believe AGI is coming at all, then like, you're also probably not gonna be writing blogs about AGI. You're just gonna be like

Speaker 1:

Yeah.

Speaker 4:

Doing your normal job or whatever. So I think there's some like confirmation bias there.

Speaker 1:

Ten years? Exactly the time of a venture capital fund life cycle. Coincidence?

Speaker 2:

Tyler's probably the most AGI-pilled person on the team. And so it's interesting that you kind of

Speaker 1:

I think you

Speaker 2:

day to day act like everyone else on the team.

Speaker 1:

Yeah. Yeah. What's the thing you've done? What's your biggest revealed preference?

Speaker 4:

That's hard to say.

Speaker 1:

Maybe just hanging out on the show.

Speaker 2:

Yeah. Just hanging out podcasting instead of going to universities. Just like, yeah, nothing matters.

Speaker 1:

I guess. I mean, there's a little bit of that. Yeah. Maybe AGI.

Speaker 1:

And it's just hard because, like, in the intervening years, there's clearly, like, incredibly high leverage work to be done. It's so easy to pull forward. Like, okay. Well, the robots are gonna be able to do x. So maybe you shouldn't spend the next ten years doing x.

Speaker 1:

But if in fact x is very valuable, then the next ten years, you can capture a ton of value as you leverage more and more technology right up into the point where you don't need to do it anymore because it's fully automated.

Speaker 4:

Yeah. I think there's some sense of, like, at least for me, it's like, okay. Maybe I would have started some wrapper company Mhmm. Or something like that. Yeah.

Speaker 4:

But I'm more bullish on AI than I think most people are

Speaker 1:

Mhmm.

Speaker 4:

To where I I think that stuff is generally not super kind of long term oriented. Mhmm. So maybe I I think something some sort of that kind of

Speaker 1:

Do you think do you think we'll be podcasting in 2045?

Speaker 4:

Yes. But I I

Speaker 2:

I say yes because I think the podcast is going to live.

Speaker 1:

I agree. I think it might be around way past AGI.

Speaker 2:

I I mean, I don't know.

Speaker 1:

I don't know. I think

Speaker 2:

Everybody wants to believe that. Like,

Speaker 4:

I think

Speaker 2:

But I don't know. I I have yet to

Speaker 1:

see. I don't know that mining is going to be around forever. I don't think people mine for the love of the game. I don't think people would go into the coal mine and mine if they didn't get paid. But I think people do play chess

Speaker 2:

Children yearn for the mine.

Speaker 1:

And I think people podcast for fun.

Speaker 2:

Children yearn for

Speaker 1:

I don't think they actually

Speaker 2:

a real life game of Minecraft.

Speaker 1:

Yeah. Maybe. Yeah. I mean, Minecraft for sure. Playing games for sure. That's actually sort of where my take

Speaker 2:

But imagine imagine, you have an army of robots

Speaker 1:

Mhmm.

Speaker 2:

And you just get to use a computer to control them to mine for minerals. I mean, that sounds like it could be pretty fun and fulfilling.

Speaker 1:

Maybe. Maybe. I don't know. That sounds like a loophole. I was talking about actually, like, swinging an axe in, you know, a mine shaft.

Speaker 1:

Yes. Maybe one day there will be an AI agent for mining. We've actually talked to some folks who are doing AI for mining. If you're mining value in your code base, get on Cognition, the makers of Devin, the AI software engineer. Crush your backlog with your personal AI engineering team.

Speaker 1:

Yeah. Where I landed at the end of this was talking about this clip of Alex Wang that's sort of going viral. People are dunking on it because he recommends that kids learn to vibe code. And I just disagree with the haters. Like, I think the haters are wrong on this one.

Speaker 1:

And I was thinking about, like, well, I have a kid. Like, would I teach him to vibe code? And I was playing with Legos last night. I was assembling Legos. And from what I've done vibe coding and what I've done with Legos, I'm like, this is going to be very fun, and this is going to be an activity that we do together.

Speaker 1:

And I'm, like, super in agreement that I think learning to vibe code is good. What do you think, Tyler?

Speaker 4:

I think, like, the point of view of, like, the people hating was that, like

Speaker 1:

Yeah.

Speaker 4:

Oh, your 13 year old should be, like, making B2B SaaS, which is not that's different than saying they should be vibe coding. Because vibe coding is just, like, basically playing a

Speaker 1:

video game where it's like Yeah.

Speaker 4:

Making a Minecraft mod or something. That's, like Yeah. Seems totally fine.

Speaker 2:

Yeah. Some of the criticism was, like, you should just be doing really hard things.

Speaker 1:

Okay.

Speaker 2:

But then you have to ask, like, okay, should 13 year olds only be hand coding? Like, if you wanna get them into doing engineering work, is it Yeah. Doing it with a pen and paper? Like, what is the alternative? Or should they just be doing math problems or studying physics?

Speaker 1:

I had some sort of advanced Lego robotics thing that you could actually write software for and kind of make it drive in circles, or you could program it to drive in a figure eight.

Speaker 2:

Yeah. And that was really fun as a kid. When I was 13, I was working on little iPhone apps.

Speaker 1:

Yeah.

Speaker 2:

And knowing the vibe coding tools that are available today Yeah. I would have been able to make way more progress. I would have had a lot more fun. Yeah. It would have been like having an expert, like, software engineer sitting next to me, kind of like pair programming.

Speaker 2:

Right?

Speaker 1:

Yeah. Yeah.

Speaker 2:

And so, yeah, I'm on your side. I think people generally just don't like Alex because he's been wildly successful. And people wanna try to

Speaker 1:

He's basically the youngest, most successful person in the world, which invites a lot of criticism. Yep. And, also, it's tough when you build a business that people don't interact with. I think when Evan Spiegel blew up with Snapchat, he was very much loved because people were like, I like Snapchat. I got a date on Snapchat. I met my girlfriend on Snapchat.

Speaker 2:

I use it all the time.

Speaker 1:

Or I use it with my friends or my boys or in my group chats on Snapchat. So it's very easy to be like, yeah, he's rich, but I like the thing he made. Whereas with Alex Wang, it's like, yeah, he's rich, and I have no idea what the thing he made is. I don't interact with it at all.

Speaker 1:

And so it's a lot harder. But, yeah, I don't know. I feel like a generic call to action to young people to vibe code, to be entrepreneurial, is good. I mean, I discovered entrepreneurship in some ways through Tim Ferriss, who had a similar sort of, like the four hour workweek was not some, like, deep understanding of how business and technology works. It was more just, like, hey.

Speaker 1:

You could set up a business. And, like, it could be a really interesting life, and it's very different than, like, becoming a lawyer or becoming a doctor. Like, when you're just, like, a blank slate kid, you're kind of offered a menu of options, which is like, do you wanna become a fireman? Doctor. Policeman.

Speaker 1:

Like,

Speaker 2:

Construction worker.

Speaker 1:

Construction worker. And Businessman. And, yeah, you hear businessman, but you don't really know what that means. Businessman. And it's very different to be like

Speaker 2:

Accountant.

Speaker 1:

A businessman is an investment banker or a consultant if you stay on the track, and you're just like, I always ask for permission. Right? I always go into the next thing. I always have a boss. Entrepreneurship is this, like, sort of vague thing that's, like, hidden, and it's nice when it's served up.

Speaker 2:

Yeah. And there's so many different flavors of it.

Speaker 1:

Yeah. Yeah. So I enjoy a little serving of entrepreneurial insight. What do you think, Tyler?

Speaker 4:

Fun fact. So in fifth grade, in the yearbook, they asked, like, what do you want to be? Yeah. I put investor.

Speaker 1:

Investor in fifth grade? Yeah. That's

Speaker 4:

I didn't even really know what that meant.

Speaker 1:

How I mean Are

Speaker 2:

you sure you didn't put capital allocator?

Speaker 1:

You put in

Speaker 4:

That's what

Speaker 2:

I meant. Yeah.

Speaker 1:

Yeah. I mean, in fifth grade, they had a it was like, bring in a mobile. Like, create a mobile for someone that you respect. And everyone in the class brought in, like, a picture of their dad. And I brought in a picture of Bill Gates because I was like, he made the computer.

Speaker 1:

Like, the computer's awesome. And everyone was like, what are you, like, doing, dude? Like, stop deifying business people. I was like, I will never stop deifying business people. It's great.

Speaker 2:

It's lovely. Ever since fifth grade, I knew I wanted to get into asset management.

Speaker 1:

Love it. I love it. Well, speaking of asset management, Masayoshi Son is back in the news. SoftBank's big profits jump this quarter came from OpenAI's increased valuation, which SoftBank lifted higher by buying shares from employees and selling them at a higher price.

Speaker 1:

And so there is a question about whether this is too circular. It's unclear if SoftBank was really the price setter on this deal, but Just Dario is certainly

Speaker 2:

Just Dario is saying SoftBank books a capital gain on an investment it hasn't paid for and recorded in its assets yet.

Speaker 5:

Mhmm.

Speaker 2:

So I guess the narrative here

Speaker 1:

is gains on OpenAI shares by designating Vision Fund 2 to underwrite the second investment tranche at a $500 billion valuation that SoftBank Group cannot pay, and created a forward derivative contract worth $8 billion, all gains out of thin air. How is this legal? I don't know. This doesn't seem that crazy.

Speaker 2:

There's a scenario where you would do this if you were a company that invested in a fund. Yes. And it has called maybe some capital, but not all of it yet. And so you can show that there's a gain, even though you haven't actually paid in the capital yet.

Speaker 1:

Yes.

Speaker 2:

But I think the way in which SoftBank is doing this it doesn't seem illegal, but I think it's nontraditional. Sophie over on X says SoftBank is selling its NVIDIA stake to fund companies whose main expense is buying from NVIDIA. Okay.

Speaker 1:

But don't VC firms run into this all the time? And there's always a question about, like, how they mark to market, and if you're marking to market an asset that you invested in that's why a lot of venture firms don't want to do it. It's a little bit iffy to lead back to back rounds, because at a certain point, your LPs are going to start having questions. But it does happen, and it's not the end of the world. As long as you get other buyers like, didn't Founders Fund do Ramp back to back or something like that? Right?

Speaker 1:

Yeah. But then there was, like, a secondary transaction that came in pretty quickly, and then eventually another firm came in. And so once the other firm comes in, that mark gets a lot stronger, and then Ramp actually, like, marked themselves to market multiple

Speaker 4:

times.

Speaker 2:

I mean, yeah. The the this is

Speaker 5:

I don't know.

Speaker 2:

The more important

Speaker 1:

God, dude. I tried to slide off the camera to sneeze at the PTZ company.

Speaker 2:

The more important thing here is that this is not the first time Masa has exited NVIDIA. He exited Okay. In 2019, before the run up. He was then NVIDIA's largest shareholder with a 5% stake in the company that he sold for $3.6 billion, and it would now be worth over $200 billion. So this was ultimately

Speaker 1:

So how much did he wind up investing in total? Because he must have made money overall. How much has

Speaker 2:

I don't know. I don't know exactly when he actually bought back in. Yeah.

Speaker 1:

Because if this was all from that initial slug, I feel like this $5 billion could be on a basis of, like, $500 million or something. It's still relatively small for the size of SoftBank. But, well, that's cooking. Let me tell you about Figma. Think bigger, build faster.

Speaker 1:

Figma's design and development teams build great products together. Let's see. SoftBank sells its entire stake in NVIDIA for $5.83 billion. Quantian says, no, they expect one of us in the wreckage, brother.

Speaker 1:

Have we started the fire? I don't understand how this links to what's happening here. Who's in the fire? Like, how is this I don't understand this, mate. Do you understand this?

Speaker 2:

I just like the line they expect one of us in the wreckage, brother.

Speaker 1:

Yeah. It's funny because it's from The Dark Knight Rises. Right? But, like, selling your stake in NVIDIA like, are people mad about this because they think NVIDIA is gonna rip further? Did

Speaker 2:

Selling your stake in NVIDIA Yeah. To actually buy shares in a company that you've already marked you're showing a gain on Yeah. In the last quarter. Yeah. So basically, SoftBank is booking profits.

Speaker 1:

Sure.

Speaker 2:

OpenAI profits. They are then selling NVIDIA shares to fund the original investment, of which they've already booked the profits. And the number one driving force behind NVIDIA's growth is OpenAI.

Speaker 1:

Yeah. Is that true?

Speaker 2:

I mean How big how

Speaker 1:

big is OpenAI as a customer of NVIDIA? I mean, yeah, it's probably the biggest. But the rest of the hyperscalers are still buying a ton of NVIDIA. Right? I don't know.

Speaker 4:

Yeah, I would say, like, indirectly, they're a big customer. But, obviously, it's not like they're funding the data centers.

Speaker 1:

Yeah. Okay. So James Thorne says, today we will get the "SoftBank completely sold out of NVIDIA" fear from bears on mainstream media and Wall Street.

Speaker 1:

Does anyone do objective research anymore? SoftBank initially bought its NVIDIA stake through the Vision Fund in 2017, then exited completely in January 2019. They missed it. They lost patience. Not the first time. And so, yeah.

Speaker 1:

When did they buy this latest slug? That's the question. SoftBank sold 32 million shares of NVIDIA in October, and also sold part of its stake in T-Mobile for $9 billion. It's not the first time they've cashed out of the chipmaker. That's a funny way to put it.

Speaker 1:

Despite the sale, SoftBank remains tied to NVIDIA through its other ventures. Yeah. That makes sense. Anyway, let me tell you about Vanta. Automate compliance, manage risk, improve trust with AI.

Speaker 1:

Vanta helps you get compliant fast, and they don't stop there. Their AI and automation powers everything from evidence collection to continuous monitoring, security reviews, and vendor risk. Warren Buffett says he's going quiet. The world's most famous investor warns against corporate greed as he prepares to hand over the reins of Berkshire Hathaway. There are a bunch of interesting posts, but there's also a story in the Financial Times.

Speaker 1:

I don't think it hit the paper edition yet. But we can read through some

Speaker 2:

of it. Paper edition, always lagging.

Speaker 1:

It lags. So Warren Buffett has told Berkshire Hathaway shareholders that he is "going quiet," as the world's most famous investor draws a line under a career that shaped corporate America and Wall Street over the past six decades. "As the British would say, I'm going quiet, sort of," Buffett wrote in a letter published Monday. The 95-year-old will step back from day-to-day responsibilities at Berkshire at the end of this year when he retires from his role as chief executive.

Speaker 1:

Greg Abel, of course, is coming in. Investors have long seen Buffett, often called the Oracle of Omaha, as a corporate folk hero, interspersing guidance on his portfolio companies' performance with life and business advice. His frank manner with shareholders, in both his annual letters and his marathon question-and-answer sessions during the annual meetings in Omaha, has been a hallmark of his tenure. Buffett's letter to shareholders on Monday took the same tone, with warnings about corporate greed interlaced with calls for kindness. He noted, for instance, that requirements for executive compensation disclosures backfired as business chiefs engaged in a race to earn more than rivals. What often bothers very wealthy CEOs, they are human after all, is that other CEOs are getting even richer.

Speaker 1:

Buffett said, envy and greed walk hand in hand. Interesting. Buffett added that Berkshire should try to avoid future CEOs who are looking to retire at 65, or who want to become "look at me" rich, or initiate a dynasty. "Look at me" rich. What an interesting turn of phrase.

Speaker 2:

Yeah. It's very notable that Buffett has never been photographed doing a money spread wearing Balenciaga or Rick Owens.

Speaker 1:

Does he have any Chrome Hearts?

Speaker 2:

Probably no Chrome Hearts. He's never been photographed

Speaker 1:

With Chrome Hearts.

Speaker 2:

With Chrome Hearts. It is cool. He's gonna continue to do his letter every Thanksgiving.

Speaker 1:

I'd love that.

Speaker 2:

I thought that was great. Thanksgiving, of course, one of the most underrated holidays.

Speaker 1:

Yeah. So he took the pledge to give away his wealth. And so it's gonna be a pretty big changing of the guard, and that's gonna have a pretty significant

Speaker 2:

The thing I'm looking forward to seeing is what the attendance will be like at the shareholder meeting next year.

Speaker 1:

Mhmm.

Speaker 2:

Like, is the is

Speaker 1:

Is it the Buffett draw, or is it its own thing? Yeah.

Speaker 4:

Exactly. Yeah.

Speaker 2:

My expectation is that they will retain a huge number of people, because it's people that have been going every single year for a long time. Totally. It's kind of a meetup for people that have spent, at least some of them, decades

Speaker 1:

Sure.

Speaker 2:

Visiting and being a part of it.

Speaker 1:

So... yeah. I also wonder to what degree Greg Abel can actually pick up the torch. Because there's that interesting stat that if you go back thirty years, when Buffett was 65, I don't think he was close to the richest man in the world. He compounded a ton in his last thirty-year run. From 65 to 95, he had a particularly good run that took him from pretty rich to one of the richest people in the world.

Speaker 1:

And so it'd be very interesting to see what happens if Greg Abel really goes on a run for thirty years and is just executing, executing, executing. It's very rare. We just don't see that many business executives, or that many people broadly, where the highlight of the career is 65 to 95.

Speaker 2:

You think

Speaker 1:

It's exciting.

Speaker 2:

Do you think that he can be fearful when others are greedy? They're kind of acting like it right now. Right? Yeah. Half a trillion.

Speaker 2:

Like, I don't think it's quite half a trillion. I think it's more like under $400 billion of cash. But yeah. They will be getting quite a lot of calls at some point.

Speaker 1:

For sure.

Speaker 2:

And we'll see how they play it.

Speaker 1:

Yeah. Before we react to some of the reactions on the timeline, let me tell you about Graphite.dev, code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. So Howard Anglin is taking shots at Warren Buffett, who's talking about his final letter.

Speaker 1:

And Howard says, Buffett has commissioned no plays, no poems, no symphonies, no operas, no ballets, funded no paintings or sculptures that will outlive him, endowed no theaters, choirs, or orchestras, built no monuments or monasteries. Is this true at all? I feel like Warren Buffett has to have given away enough money to endow at least one theater. Like, how do you not just get hit up by your local theater in Omaha once and say, sure, I'll send a check?

Speaker 1:

Like, I would be shocked if he's never done anything. And, also, he's committed to giving away half of his wealth. So, like, some of the money is gonna wind up with the opera, I imagine, just because it's gonna go out and get diffused amongst all the different charity efforts. It's not gonna all go into one thing. So, Howard agrees with him on this.

Speaker 1:

He says he's right that greatness does not come about through accumulating great amounts of money, which is what he is known for. But beyond that, it's fortune-cookie-level advice. It's not wrong, but that's about it. His partner Charlie Munger's contribution to architecture was to fund factory-like college dorms in which a majority of the apartments don't have windows.

Speaker 1:

Was that UCSB? Where is that? Yeah. Have you been to that dorm? Yeah.

Speaker 1:

Did you live there?

Speaker 2:

I'm trying to figure out which

Speaker 1:

We need to pull up a photo of the Charlie Munger

Speaker 2:

Yes, dorm.

Speaker 1:

The Charlie Munger dorm at UCSB. It's

Speaker 2:

Munger Hall.

Speaker 1:

Munger dorm UCSB. It is a wild picture. It's so funny. And it's like

Speaker 2:

Yeah. So they abandoned it. They

Speaker 1:

Oh, they didn't do it.

Speaker 2:

Made the donation.

Speaker 1:

Yeah. Yeah. Yeah.

Speaker 2:

There's some reference to it on the campus, but they abandoned the plans to build a windowless dorm.

Speaker 1:

So it's, like, so Soviet? It's extremely Soviet. I don't know why they had

Speaker 2:

the thought

Speaker 4:

There were fake windows, and they were, like, TVs essentially, but they would show, like, outdoor scenes.

Speaker 1:

It should be vertical TV windows that you can just put Sora on and just watch AI

Speaker 2:

solve all problems. Complaining about the cost of housing. I gave you housing, and now you're complaining it doesn't have windows.

Speaker 1:

Yeah. You give a mouse a cookie with these people. Okay. It is so funny. It's just so funny.

Speaker 1:

Like, why are you so involved in this project? I thought he was a bigger donor to Stanford, first of all. And then also, I feel like if I donate money somewhere, I'm like, okay, it's gone. You enjoy it, and it's on you.

Speaker 1:

Like, you are the one that takes care of it. To be in the weeds to the level of deciding where the windows go is a wild move. Like, I thought you had other stuff to do. Yep. I feel like if I was in this position, I would just be too

Speaker 2:

busy. There are other buildings on campus

Speaker 1:

Okay. With windows?

Speaker 2:

His name is on other residences

Speaker 1:

Okay.

Speaker 2:

That do have windows. So

Speaker 1:

I don't know. I mean, also, is it safe to take the other side of this and just say that he was about to create the greatest lock-in of

Speaker 2:

all time? He was basically saying, like, college students spend very little time in their actual dorms. Yep. Because they're out and about in the world. They're studying. Totally. At the library.

Speaker 2:

They're at events.

Speaker 1:

Totally.

Speaker 2:

They're really just using the dorm to sleep. Why don't we create common areas that have windows, yes, and outdoor areas, yes, but then you go in your pod. Yeah.

Speaker 2:

Just go in your pod.

Speaker 1:

Almost like a cave. Yeah. It's like a cave.

Speaker 2:

Because at UCSB, the complaint from a housing standpoint is that, like, nobody has their own room. Everyone shares a room almost all four years.

Speaker 1:

Oh, interesting. So you never move off campus.

Speaker 2:

Yeah. And it's like one square mile where people really live. And yeah, there's super limited housing. And so I think Munger was coming at this very pragmatically. And it's like, hey, you're giving up windows.

Speaker 2:

Yep. You're gonna get your pod. But in exchange, you get your own private space. Yeah. People called it Dormzilla, and they canceled it.

Speaker 1:

They called it Dormzilla. I forgot about that. Can we find a picture of Dormzilla?

Speaker 2:

I mean, the actual designs... I saw it for a second.

Speaker 1:

It's so funny. Why can't we do a real examination of the rules that state every bedroom must have a window? Eric Adams floated a similar idea to Munger's, calling for stripping legislation that promises each city resident window access this past March. "You don't need no window where you're sleeping. It should be dark," he said.

Speaker 1:

Wow. That's actually a direct quote.

Speaker 2:

That's from Munger.

Speaker 1:

No. This is from Eric Adams, the New York guy.

Speaker 2:

I was like, that sound

Speaker 4:

That doesn't sound like that.

Speaker 1:

It was a crazy

Speaker 2:

crazy. I put a picture of the floor plan in the chat, team, if you guys

Speaker 1:

can

Speaker 2:

pull this up. They should have called this the Munger Bug Den.

Speaker 1:

The cave, the Munger Cave. I don't know. It's pretty fun. It's also just... I mean, when I think of UCSB, I think of this beautiful, just amazing place near the beach, beautiful sunlight.

Speaker 1:

Like, you definitely don't wanna be in a cave. I kind of get the Soviet-bloc-era big buildings, because it's like, you don't wanna go outside. Dormzilla. Dormzilla is so good. I love it.

Speaker 1:

It was like, this is what I need to do. It's just, like, can't you just give a little bit more money and get the windows? Windows aren't that expensive. Right? It's such a wild thing.

Speaker 1:

Anyway, it's so funny. So, I don't know. Tyler, did you figure out if Buffett has ever commissioned one play, poem, symphony, opera, ballet, painting, sculpture, theater, choir, orchestra, monument, home, museum, church, monastery, or chantry?

Speaker 2:

To give him credit, he has the Charles Munger Physics Residence

Speaker 1:

Okay.

Speaker 2:

Which is a beautiful, stunning building.

Speaker 1:

With windows.

Speaker 2:

With many windows. There

Speaker 1:

we go.

Speaker 2:

And so, yeah, he decided to window max on this one.

Speaker 1:

Yeah. He's taking a barbell strategy. Some of the buildings have windows. But, obviously, that's

Speaker 2:

that's Munger. Yeah.

Speaker 1:

Yeah. This is Buffett. So, yeah, what did we learn about Buffett?

Speaker 4:

So I think, basically, he does do a lot of donations, but he basically just gives stock to people. Okay. So he's never saying, I want this museum to be built in this way. He just gives it to a, you know, endowment or institution or some charity.

Speaker 4:

Then they, like, do it. And

Speaker 2:

is that because he wants them to kinda try to 10x it

Speaker 1:

10x it again? Yeah. He wants bag holders.

Speaker 4:

Yeah. But, like, I don't think he has very strong opinions on, like, should kids have windows?

Speaker 1:

Or can you find an example of when he gave stock to an organization? Because if that organization went and commissioned an opera, I'm putting this in the truth zone. And I'm thinking that he did, in fact... oh, the Munger Research Center in Pasadena does have many windows. Credit where it's due. Thank you, Tyler.

Speaker 1:

If you can't commit to the Munger Cave, then you're not really committed to delivering.

Speaker 4:

He's given a lot to the Bill and Melinda Gates Foundation.

Speaker 1:

Okay. That's kind of a circular thing. It doesn't make any sense to me. Anyway, let's move on from Warren Buffett. Let me tell you about Julius, the AI data analyst.

Speaker 1:

Connect your data, ask questions in plain English, and get insights in seconds. No coding required. We have a clip from Scott Bessent on... what is this? Newsweek? ABC.

Speaker 1:

Newsweek.

Speaker 2:

ABC This Week.

Speaker 1:

ABC This Week. Okay. I'm learning. The $2,000 dividend could come in lots of forms. Let's play the clip.

Speaker 6:

You know, the $2,000 dividend could come in lots of forms and lots of ways, George. It could be just the tax decreases that we are seeing on the president's agenda. No tax on tips, no tax on overtime, no tax on social security, deductibility of auto loans. So those are substantial deductions that, you know, are being financed in the tax bill.

Speaker 1:

Tyler. So, do you feel like this

Speaker 2:

How is that gonna help the day trading community?

Speaker 1:

Yeah. I

Speaker 4:

was planning on doing rampant speculation with this $2,000. What am I gonna do now? Yeah.

Speaker 2:

You're basically fully committed. You've already committed the funds. Right?

Speaker 4:

Yeah. Yeah. I've already placed a ton of parlays. Yes. So, like, what am I supposed to do now?

Speaker 4:

I

Speaker 1:

can't pay my parlay bill with a tax decrease on auto loans.

Speaker 2:

Yeah. You're not looking for passive income. You're looking for massive income.

Speaker 4:

Bro, I'm looking for massive income.

Speaker 1:

I'm looking for generative media on a platform for developers. I'm looking for Fal, the world's best generative image, video, and audio models all in one place. Develop and fine-tune models with serverless GPUs and on-demand clusters. Doomer says they should put a button on the IRS website that allows you to either receive the $2,000 stimmy check, or, if you press the button, you have a fifty-fifty chance of getting either $4,000 or zero. I really think a lot of people would press that button. Absolutely wild.
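As a quick aside, the math on that button is simple: a fifty-fifty shot at $4,000 or zero has exactly the same expected value as the guaranteed $2,000, just with more variance. A minimal sketch (payout figures from the post; the simulation is purely illustrative):

```python
import random

GUARANTEED = 2_000   # take the check as-is
GAMBLE = (4_000, 0)  # press the button: 50/50 double-or-nothing

# Expected value of pressing the button
ev_button = 0.5 * GAMBLE[0] + 0.5 * GAMBLE[1]
print(ev_button == GUARANTEED)  # True: same EV, just more variance

# Standard deviation of the gamble: a full $2,000 swing either way
std = (0.5 * (GAMBLE[0] - ev_button) ** 2 + 0.5 * (GAMBLE[1] - ev_button) ** 2) ** 0.5
print(std)  # 2000.0

# Simulate a million button-pressers; the sample mean lands near $2,000
random.seed(0)
avg = sum(random.choice(GAMBLE) for _ in range(1_000_000)) / 1_000_000
print(abs(avg - GUARANTEED) < 50)
```

The point being: people wouldn't press it for the expected value, they'd press it for the variance.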

Speaker 1:

Certified bangers. X now has an account called @bangers, and it is an account that highlights the best bangers. They're testing something new: certified bangers. We wanna recognize the very best posts that move the timeline, ranked by authentic interactions. If your post is featured, you will get a certified banger badge on your profile for the month.

Speaker 1:

Here are the top five bangers for October. Very cool concept. Very similar to a project that Jordy sort of incubated about a year ago.

Speaker 2:

Yeah. It must have been January.

Speaker 1:

It was during the fires. I remember it ripping. It was called Banger Archive, and Jordy would go and screenshot a post that had done very, very well and resurface it, and the findings were shocking. Do you wanna break down what you learned from

Speaker 2:

that project? If you took any post that had gotten, like, a minimum of 3,000 likes and just screenshotted it and then posted the screenshot and said banger, it would usually immediately get more likes than the original post.

Speaker 1:

It was crazy.

Speaker 2:

There were some posts that I screenshotted that had, like, maybe 20,000 likes, and then I'd post them and they'd get, like, 150,000 likes. Yeah. It's, like, millions of views.

Speaker 1:

And some of it is, like, there's a timelessness to it, and high quality content is just high quality content. There are certain sentences that are just always funny. Some of them get funnier. Some of them did fall off.

Speaker 1:

Some of them would flop even though they were a banger previously. There were a bunch of interesting findings from the Banger Archive

Speaker 2:

Including at least one person that got very mad. They were like, you could have just reposted it.

Speaker 1:

Exactly. And

Speaker 2:

the whole... but the issue is

Speaker 4:

it doesn't work.

Speaker 1:

Archive it, then, because if they delete it, then it doesn't stay. But they want the right to delete. So I understand where they're coming from.

Speaker 2:

Donald Boat says, don't care about a badge. They should come to your house and tattoo it on your body. Yeah. Having a badge for a month is

Speaker 1:

Yeah. I don't know. It's cool. I think it's interesting because

Speaker 2:

It's also... I mean, looking at the posts that they have here, they're just kinda random.

Speaker 1:

Yes. So we were debating this. First off, I think what is interesting about this is that it's not new functionality in the X app. Like, there are plenty of ways to surface popular posts, mainly on the Explore page with the little magnifying glass. So you search there.

Speaker 1:

Before you search, you see the For You tab, you see trending topics, you see who to follow, business news. There's a whole bunch of things going on there, basically new UI, new projects. Like, they could have created a separate tab or something like that. They didn't do that. They just created an actual account on X that's native, and then they created an organization, and they just give you the organization tag if you post a banger. It's just a very interesting use of the X primitives, the code that's already there.

Speaker 1:

Like, clearly, this did not require software engineering to do, which is... yeah, just an interesting way to create some new content on

Speaker 2:

Zoomer says, did you guys seriously not include this banger? Zoomer is gonna be so excited about it. We honestly need to get Zoomer on the show. I invited him, because he's got some truly wild takes.

Speaker 1:

He has. He's

Speaker 2:

gotta put him into the format. He's been on a roller coaster.

Speaker 1:

He's been on a roller coaster for sure. But yes. So we were debating: is the Bangers account going to be tpot, tech Twitter, venture-capital-related stuff? Will you see Naval in there? Will you see Marc Andreessen, Growing Daniel, Rune posts?

Speaker 1:

These posts that go super viral but within tech? Or will it be more broad? Will we see something from basketball Twitter, or from all sorts of different subcommunities? And it was just an interesting way to check the pulse, because I feel like we are in a little corner of X, but obviously there are folks that use X and never really interact with tech Twitter. They're over in some other pocket. They're talking about basketball, or they're talking about football, or they're talking about the Oscars, or they're in film Twitter. There are so many different little sub-pockets. Well, the result was that they went broad.

Speaker 1:

A lot of the tech folks think it's slop, but some of these were kind of interesting. The first one that they highlight is from 0x45, and it says, can y'all please tell me the lore behind your profile pictures? And it got 285 million views. So people really enjoyed quote tweeting this.

Speaker 2:

The challenge there is that that in itself is not a banger. It just got a lot of reach. Yes. Because people were quoting it.

Speaker 1:

Like, it

Speaker 2:

has zero value as an individual post.

Speaker 1:

Well, it starts a conversation. It creates more content. It is sort of a viral seed of content in this weird way, in a way that there are special words in that post that lead it to generate more content than an average post. If you just posted, like, how'd you pick your profile picture? Not a banger.

Speaker 2:

Yeah.

Speaker 1:

The "lore" specifically worked, and it obviously did get a lot of attention. But I agree with you that this is not something that sparks thought, I suppose, or laughter. I don't know. Tyler, what's the lore behind your profile picture?

Speaker 4:

I think it was my Halloween costume. I was dressed up as Julius Caesar. Oh. It was last year. That's fine.

Speaker 4:

It's just

Speaker 2:

I thought that's just how you dressed back during your studies.

Speaker 1:

Yeah. That's not exactly like deep lore. But

Speaker 4:

Okay. What's the lore behind your picture? It's just a picture of you at... wasn't it, like, a stock exchange or something?

Speaker 3:

Yeah.

Speaker 1:

There's not really deep lore, but, I mean, I could tell a whole story about how it was at the New York Stock Exchange for the Klarna IPO. And it was also the day Charlie Kirk was assassinated, which was horrible. So there's a lot of lore there. I don't know. But it doesn't have the same deep lore as... I don't know what people expect.

Speaker 4:

Jordy, do you have any lore behind your

Speaker 1:

Now I'm curious. Who had the biggest quote tweet of this? Did anyone have really, really great post engagements? Quotes? No one... yeah.

Speaker 1:

I'm not seeing anything that was, like, wild. Oh, a lot of people did actually engage with this. Cold Healing says their profile picture is: their apartment in Peoria had a totally ivy-covered window that I really liked. One day in mid-spring, I took all the furniture out of my bedroom to take this photo.

Speaker 2:

I mean, the most common criticism has been: congratulations, you brought Reddit gold to X.

Speaker 1:

It does seem a little bit like Reddit gold. It is even gold colored. "How a mathematician sees the world." I saw this going viral. I didn't realize it was this viral: 200 million views.

Speaker 1:

What is this saying? Tyler, did you see this? Do you know why this went so viral? Is it funny specifically? Like, I

Speaker 4:

I don't. It's just, like, they drew a bunch of lines and random math equations.

Speaker 1:

Yeah. Like, is this actually how anyone sees

Speaker 2:

No. It was like everyone was quote tweeting it, like, "how a podcaster sees the world."

Speaker 1:

Yeah. I think people were just spinning off of it in different ways, so it just became a good meme template. Yeah. That makes sense. What was the other one?

Speaker 1:

When your AI girlfriend was on AWS us-east-1: 60 million views. That really shows you how big the tech community is, or, like, the AWS ball knowers.

Speaker 4:

I think this one is closer to what we would describe as, like, a normal banger.

Speaker 1:

Totally. I mean, NetCap Girl, one of the first posters we reacted to on this show. Great poster. Absolutely a banger.

Speaker 1:

I'm shocked that 60 million people have the context to enjoy a post about AWS us-east-1, but I guess it is a dominant technology that lots of people interact with. So they must know, or maybe there's enough context to enjoy it.

Speaker 2:

Well, in other news

Speaker 1:

Yes. What else?

Speaker 2:

There is a JPMorgan AI CapEx report, and Max Weinbach says it's a great way to put AI ROI into perspective: to drive a 10% return on their modeled AI investments through 2030 would require $650 billion of annual revenue in perpetuity, which equates to $34.72 a month from every current

Speaker 1:

iPhone user. In perspective.

Speaker 2:

Ross Hendricks says the AI math ain't mathing, and there are two ways to fix it: 20x the revenue in the next five years, or slash data center CapEx. I'll take the latter at 100-to-1 odds. So what does a CapEx pullback look like? There was another post here I thought was solid.

Speaker 2:

Hari Raghavan says annual US spend on software is $400 billion. Professional services is $2.8 trillion. Logistics labor is $1 trillion. Health care admin, $1 trillion. Education labor, $1 trillion.

Speaker 2:

That's $6 trillion. Globally, it's probably 3x. Every market is up for grabs, all of them. If you don't get that, you aren't ready for what's coming. $650 billion is a joke.
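The arithmetic behind both posts can be sanity-checked in a few lines. Note the implied user count is backed out from JPMorgan's own figures rather than stated anywhere, and the listed US categories actually sum to about $6.2 trillion, which the post rounds down to $6 trillion:

```python
# JPMorgan framing: $650B of annual revenue in perpetuity,
# restated as $34.72/month from "every current iPhone user"
annual_revenue = 650e9
per_user_per_month = 34.72
implied_users = annual_revenue / (per_user_per_month * 12)
print(round(implied_users / 1e9, 2))  # ~1.56 billion implied users

# Hari Raghavan's annual US spend categories (USD)
categories = {
    "software": 400e9,
    "professional services": 2.8e12,
    "logistics labor": 1e12,
    "healthcare admin": 1e12,
    "education labor": 1e12,
}
us_total = sum(categories.values())
print(round(us_total / 1e12, 1))      # 6.2 (the post says "$6 trillion")
print(round(3 * us_total / 1e12, 1))  # 18.6 globally at the claimed 3x
```

So the two posts are really arguing about the same number from opposite directions: whether roughly $650 billion a year is an absurd ask or a rounding error against the addressable spend.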

Speaker 2:

So again, I understand the concern. But when you put it in terms of basically every iPhone user needing to spend $34 a month

Speaker 1:

on Yeah. Well, obviously, not directly

Speaker 2:

Yeah. Yeah. No. I know. I know.

Speaker 2:

But it's like

Speaker 1:

If you're using Uber on your phone and Uber is spending a dollar per user on AI inference just for fraud detection, like, that counts. Right? It's like, how much are you spending on cloud computing? Cloud computing is a huge market, but you don't think about it as, like, oh, yeah.

Speaker 1:

Everyone spends

Speaker 2:

my household cloud budget.

Speaker 1:

Exactly. It's like, no. There are some companies that spend billions and billions of dollars on cloud hosting, and then you interact with services all over the place. Yeah. And the broad cloud computing market is just bigger than the AI market right now and will be for a long time.

Speaker 1:

When you look at the AI CapEx numbers, they're all smaller than the traditional CapEx numbers. The actual building of normal data centers is still higher than the building of AI data centers. And that's because they're actually monetizing cloud computing very effectively. Yeah. Because everyone needs a database for something or other, or a web hosting platform, something like that.

Speaker 1:

Everyone needs Turbopuffer. Search every byte. Serverless vector and full-text search built from first principles on object storage: fast, 10x cheaper, and extremely scalable. Where else... did you want to skip ahead a little bit, or do you wanna go back to the AI CapEx build-out? It is very odd to put it in those terms.

Speaker 1:

I remember someone putting Bitcoin in the same terms to me, when Bitcoin was at, like, 10k or something. And he was like, look, the math is, in order to have a stable price at the 100k price targets that people are talking about, you would have to have billions and billions flowing in every day. And somehow that just wound up happening. And I remember being very convinced by this, saying, like, yeah.

Speaker 1:

That doesn't make any sense. How could it possibly become an investment vehicle that is constantly seeing so much demand that it can justify such a huge price? And yet it just did, because it worked its way into every single person's pension fund, and every single person's ETFs have a little bit of Bitcoin in them.

Speaker 3:

And, Oh,

Speaker 2:

yeah. And you stayed long even if you thought that scenario was crazy, because you were like, well, I should have some exposure if it's at all within the realm of possibility.

Speaker 1:

Yeah. Exactly. There were more reasons to stay long than to actually dip out.

Speaker 1:

The most interesting point from the SemiAnalysis segment yesterday... did you pick up on when he mentioned... oh, Tyler's standing up. Tyler's rocking the standing desk today. Give us a review. How is it feeling? You haven't been standing the whole time.

Speaker 1:

You've been sitting down. You need

Speaker 4:

I've been alternating. Yeah. Like, stand up, sit down.

Speaker 1:

Yeah. Well, we need to get you one of the taller chairs. But in the SemiAnalysis interview yesterday, our guest was talking about this idea of putting an H100 index on the stock market. That was something you kind of alluded to. And I feel like that would be pretty crazy.

Speaker 1:

And I don't even know how mechanically that would work. But, like, if all of a sudden

Speaker 2:

Just treating it like a commodity.

Speaker 1:

Yeah. But currently it lives on the balance sheets of neoclouds and hyperscalers. Actually taking a product and making it a commodity that you can just invest in directly is not really... I don't think it's very simple. I think it actually is a complex process, but it certainly would potentially generate a ton more liquidity. There would just be a lot more investment dollars flowing in if there was a way to directly get in on the AI trade by just investing in the underlying

Speaker 2:

AI CapEx trade is betting on AI, and then we need other people betting on the AI CapEx.

Speaker 1:

Yeah. I don't know. I mean, like

Speaker 2:

Derivative bets.

Speaker 4:

John, so you're saying we're going higher.

Speaker 1:

I mean, that's certainly what SemiAnalysis was saying. And it just feels like it was foolish to sound the alarm bells around the idea of debt being used at all. Debt is fine. Debt is good. The question is just, are we actually overlevered? So when I look at, like, oh, Meta has $4 billion of debt for this one little side project thing.

Speaker 1:

I'm like, that's still a very stable business. That's not, like, oh, they're super indebted, oh, how are they gonna pay their debts? Like, there's just a wild continuum on, like, how much leverage is in the system and what that leads to in terms of, like, risk. Now we're at a turning point where we might be going really heavily levered.

Speaker 1:

We might be bringing in, you know, more ETF money. Guarantees. I don't even know if there's a neocloud in the S and P 500 yet. Like, that is an interesting break point. Because once you're in the S and P 500, there is way more liquidity, because there are funds, ETFs, that just buy the S and P 500.

Speaker 1:

And so once you're in the S and P 500, you sort of have, like, price support. You have extra investors that are, like, lining up, and they're like, I'm riding with you no matter what as long as you're in the S and P 500. And so

Speaker 2:

Yeah. I mean, it's notable that Oracle has fully round-tripped from before the OpenAI announcement. So leading up to the OpenAI announcement, they were at

Speaker 1:

Yes.

Speaker 2:

At $241 a share. They're now at $235 a share. Yeah. And so the market is no longer giving them credit for the OpenAI deal.

Speaker 1:

That is interesting. Yeah. Well

Speaker 2:

Nebius is down 10% in the last five days. CoreWeave is down 25% in the last five days. And IREN is down 17% in the last five days. So jitters, to say the least.

Speaker 1:

Microsoft can't power all of its AI chips, says CEO Satya Nadella. This is from ZeroHedge. I don't know if you wanna read through a little bit of this.

Speaker 7:

But

Speaker 2:

Yeah. The thing that was notable here is CoreWeave, on their earnings call yesterday.

Speaker 1:

You too.

Speaker 2:

CoreWeave was saying that they aren't power constrained, that that's not their problem.

Speaker 1:

What are they what are they constrained by?

Speaker 2:

I mean, they had a weaker forecast, which is part of why it's traded down dramatically today. But again, they grew revenue two x and they still traded down pretty dramatically. So

Speaker 1:

Well, there's more of these kinds of stories, from Joe Wiesenthal. But first, let me tell you about Google AI Studio. Create an AI-powered app faster than ever. Get started.

Speaker 1:

Gemini understands the capabilities you need and automatically wires up the right models and APIs for you. Get started at ai.studio/build. Joe Wiesenthal is excerpting a Bloomberg News article that says two of the world's biggest data center developers have projects in NVIDIA's hometown that may sit empty for years because the local utility isn't ready to supply electricity. In Santa Clara, California, where the world's biggest supplier of artificial intelligence chips is based, Digital Realty Trust applied in 2019 to build a data center. Roughly six years later, the development remains an empty shell awaiting energization.

Speaker 1:

Stack Infrastructure, which was acquired earlier this year by Blue Owl Capital, has a nearby 48-megawatt project that's also vacant. Meanwhile, the city-owned utility Silicon Valley Power struggles to upgrade its capacity. 48 megawatts. What are we doing here? That seems like that's child's

Speaker 2:

What is this, a data center for ants?

Speaker 1:

That should be child's play, but they're not able to do it. What's going on?

Speaker 2:

Byron Deeter was responding to Elon Musk's post saying

Speaker 1:

Oh, yeah.

Speaker 2:

Elon was agreeing that every AI application startup is likely to be crushed by rapid expansion of foundational model providers. Byron said, I don't buy it at all. No chance that four companies deliver all software value in the future, especially as the software TAM is growing 10 x to include a labor component. Mhmm. Yeah.

Speaker 2:

I think we'll have to see. Obviously, the labs are extremely ambitious, but I tend to side more with Byron here: as long as the labs are saying, like, you can access our best intelligence Yeah. via API, it feels like there's gonna be durable Yeah. competition, while you have open-source models, you know, lagging, but not lagging by very much at all.

Speaker 1:

Yeah. My current way to think about this is that the labs are the new hyperscalers, as you've put it before, which means that they will compete in a lot of different categories, and there will be matchups. And, ultimately, it will depend on the quality of the teams. So the way I think about, like, Browserbase versus AWS's competitor is, like, well, if Paul at Browserbase goes up against the AWS product manager and works harder and is more creative and marshals capital and does the right deals and partners with the 1Password team, for example, he can create a sustainable business. Sometimes that might be a duopoly.

Speaker 1:

Sometimes that might be total victory. And a lot of it depends on the nature of the market that they're going after. Like, is it possible that in five years, OpenAI just gets completely destroyed on AI vertical video, but AI vertical video as a category wins, and there's actually a different startup that cracks it and makes it great and grows it huge? Like, I think absolutely that's a possibility.

Speaker 1:

It just depends on, like... there's a matchup that happens in every single category, and I don't think it's fair to say that the foundation model labs will win every category by default, or that they'll lose every category by default. It's like Google happened to win in email and maps, but they lost in chat, and they lost in social networking and in RSS feeds. They won in some places. They lost in others. And that's just the nature of these competitions.

Speaker 1:

It's like, if there's a particular team in place at one company and then there's a startup that's, you know, better than that team, then the startup wins. If the team inside the company is better than the startup... you get Paul Buchheit at Google, and he's building Gmail. I'm sure there were a lot of email founders out there that were trying to compete with Paul Buchheit, but Paul Buchheit smoked them, and Gmail is what it is today. And so I still lean on, like, who are the actual folks working on the project? Are the best people inside the organization or outside the organization?

Speaker 1:

Yeah. And that's how it actually

Speaker 2:

Over on the App Store charts, ChatGPT is still number one. Threads is number two. Mhmm. Shout out Connor Hayes. Number three is Go Wish, your digital wish list.

Speaker 2:

Create wishes and get inspired. Number four is Google Gemini. Number five is Sora. So Sora is sliding down the charts, but still in the top five. We'll have to look if Fubo can overtake Sora.

Speaker 2:

That would be that would be a win for organic media.

Speaker 1:

I am a little lost. Let me tell you about ProFound. Get your brand mentioned in ChatGPT. Reach millions of consumers who are using AI to discover new products and brands. Do you wanna did you see

Speaker 2:

it go down?

Speaker 1:

Jay post.

Speaker 2:

Sorry. Jay Boltard One says, calm down people. Masa Son is a male Cathie Wood. Him selling NVIDIA today means NVIDIA has plenty of upside left.

Speaker 1:

It is. It is.

Speaker 2:

I mean, it just comes down to he's selling NVIDIA. He's gonna give that money to OpenAI, and then OpenAI is gonna give it to NVIDIA. Sold. We will see. Casey Handmer, who's coming on the show in twenty minutes, says, remember, we could automate 99.99% of air traffic control if it wasn't deliberately being used as a government funding canary.

Speaker 2:

So we will have to ask him. He had another post, earlier this week

Speaker 1:

Government funding canary, or funded canary? He said funding canary.

Speaker 8:

Casey said

Speaker 1:

in the coal mine? What is a what is a canary? We yeah. We have to ask him about this. I saw I saw his other post, which was interesting.

Speaker 1:

I realized the Doom graphics engine running on a pregnancy test CPU could handle all of North American air traffic control. I'm gonna sit down for a bit. This needs to be in the bangers. Get this man the bangers tag. This is such a good

Speaker 2:

There's an article. I was curious. There's an article here from The Verge that says digital pregnancy tests are almost as powerful as the original IBM PC.

Speaker 1:

That's insane.

Speaker 2:

That's a lot of computer to read an old-fashioned pee strip.

Speaker 1:

Yeah. That's what it does. You know that, right? Like, when you see those digital tests, it's literally just, like, basically pointing a camera at a test strip and being like, did it turn blue or not, or did it turn red?

Speaker 2:

No way.

Speaker 1:

Yeah. No. That's how it works. So, basically, they make a test strip, and then it has a chemical in there. And when the person, like, pees on it, the human sample interacts with the chemicals and turns a color.

Speaker 1:

Yeah. And so, normally, you just look at it, and you're like, oh, it turned the color. Like, it's pink. So that means one thing. Or it's blue.

Speaker 1:

That means another thing. But to make it digital, they just point a camera at it and say, like, what is it looking at? I'm pretty sure. I might be wrong. Who knows?

Speaker 1:

This is probably fake news and it'll go viral on Instagram as it always does.

Speaker 2:

In other news, Yann LeCun

Speaker 1:

He's out.

Speaker 2:

Says, see you.

Speaker 1:

He's out.

Speaker 2:

He's out. He's out. From Meta. He's gonna launch his own startup. Tyler, what's your reaction?

Speaker 4:

Yeah. I mean, this is pretty big news. I think people have kind of expected this for a while

Speaker 1:

Yeah.

Speaker 4:

Basically since Alexandr Wang came in. Mhmm. Because Alexandr Wang kind of, in some sense, like, took over as, like, the chief AI person at Meta, was

Speaker 1:

Not enough room in this town for two AI

Speaker 4:

Not enough room for

Speaker 1:

scientists. LeCuns. Yeah.

Speaker 2:

Yeah. Yeah.

Speaker 1:

Not enough room in this town for two chief AI scientists.

Speaker 4:

Yeah. And so Yann is, like, a very goated AI researcher. Yes. He's been in the game for, like, super long. Yes.

Speaker 4:

He did. I think probably his most famous paper is, like, LeNet, which is, like, the original convolutional

Speaker 1:

paper. LeNet? LeNet?

Speaker 4:

Well, I I I don't know

Speaker 1:

he named it after himself?

Speaker 4:

I think that's what

Speaker 7:

people call it.

Speaker 1:

I I don't

Speaker 4:

know what the actual paper is.

Speaker 1:

That's awesome.

Speaker 4:

It sounds like... it's like AlexNet.

Speaker 9:

Yeah. Yeah.

Speaker 3:

Yeah. It's like

Speaker 4:

not actually called Alex.

Speaker 1:

Yeah. Yeah.

Speaker 2:

Yeah.

Speaker 4:

But, yeah, it was, like, the OG convolutional neural network paper, and then that kind of basically led into deep learning. Yeah. Yeah. Yeah. And then the main thing is, like, today, he's very much seen as being bearish on LLMs.

Speaker 4:

Sure. He doesn't think that they can reason. He doesn't think they can, like, do novel tasks or whatever. So he's been, like, very outspoken about that. I think that's maybe part of the reason why he just has, like, kind of a different vision from Zuck and Alexandr Wang.

Speaker 4:

But yeah, I mean, it is pretty big news.

Speaker 1:

Do you think he will raise a private credit fund? Get into data centers?

Speaker 2:

I'm no longer interested in research and the science behind it all. I'm interested in finance.

Speaker 1:

I'm interested in finance.

Speaker 4:

I mean, I'm not sure exactly what it's called, but he has, like, another kind of path that he sees to AGI. Mhmm. He's still a professor at NYU, which I think is where he does most of his research these days.

Speaker 1:

Interesting.

Speaker 4:

So I think he'll probably just stay in academia.

Speaker 2:

Does he have any active classes going on this semester? Because if he does, we should send you on to the campus to study. Not just from the timeline, but in the classroom, and you can do a little on-the-ground reporting. Taylor in the chat says, LeCredit.

Speaker 1:

LeCredit. Has he taken a formal victory lap on scaling laws not holding? Because from your perspective, it would be a preemptive victory lap. Because I know Gary Marcus was sort of taking a victory lap with Richard Sutton after that Dwarkesh podcast. And then Andrej Karpathy sort of, like, confirmed that it might be a little bit harder to scale up the current paradigm.

Speaker 4:

Yeah. I mean, I I don't think anyone is saying, like, scaling up is gonna get easier. Right? You just need to you need bigger data centers. You need to just, like, spend way more money.

Speaker 4:

I don't think today you can just say that, like, pre-training is dead, though. That's, like, definitely not true.

Speaker 1:

Isn't that sort of Yann LeCun's take, though? Or Richard Sutton's take?

Speaker 4:

I think Yann LeCun is generally against all LLMs. He's not just saying that, like, LLMs were very promising and then they kind of fizzled out. Mhmm. I think he's kind of always been against LLMs as a way to AGI.

Speaker 1:

Yeah.

Speaker 4:

I think it would definitely be preemptive to say that he was, like, correct, though.

Speaker 2:

Yeah. Okay. Don't wanna cut you off, Tyler, but this is more important. OTP in the chat says, Buffett is low-key a huge boxing fan, and I know at least one gym around his state that he has at least had a hand in funding. Oh.

Speaker 2:

And I looked it up, and he's a huge fan of Terence Crawford, the boxer. And he will go see the fights, and he likes the cheap seats because he feels like he has a better angle.

Speaker 1:

That's interesting. I like that.

Speaker 2:

I did not know. So he's a patron of combat. We're not ready to say he's a patron of the arts, but in some way this is martial. Never. Somewhat of a martial

Speaker 1:

down on Warren Buffett. This guy is amazing. Loves McDonald's. Loves boxing, apparently. Loves allocating capital efficiently.

Speaker 1:

You know? He stays in his lane. It's great. Do we have an update from Brian Johnson? He came back from his trip.

Speaker 1:

He was on psychedelics. I feel like I'm expecting him to fall in love with at least one fast food restaurant. I want him to flip around on at least one and say, okay. Yeah. I'm an In-N-Out guy now.

Speaker 1:

He's been quiet. I do wonder if he'll update on anything, or if he's just, like, so powered through that it just will not affect him at all. Because, like, wasn't he kind of framing it as, like, this is a big deal? It seems like he got through it just super fine.

Speaker 2:

Saw a very viral post of somebody saying, I don't think it's a good thing that billionaires can talk about taking schedule one drugs publicly with no repercussions. This still seems like such a gray area, where you can get it prescribed by a doctor

Speaker 1:

Mhmm.

Speaker 2:

But it's still schedule one. He posted an update on his experience. He also talked more about why. He says it's potentially a longevity therapy: psilocybin expands lifespan in mice. It can reduce inflammation markers tied to aging.

Speaker 2:

It can increase brain entropy, break rigid patterns, and boost long-term cognition and flexibility. And

Speaker 1:

Yeah. I don't know. I kind of agree with that person who is saying, like, they shouldn't be able to talk about it publicly. I feel like he should have a disclaimer or something, because he has such a huge audience that there's gonna be some people that just look at what he did and just run

Speaker 2:

straight into it. In the podcast era Yeah. You had Tim Ferriss

Speaker 1:

Yeah.

Speaker 2:

Who, to his credit, was, like, doing work on himself Yeah. And would share what he was doing. Yeah. But he would talk about doing things like ibogaine

Speaker 1:

Yeah.

Speaker 2:

Which is, like, a, you know, super powerful psychedelic. Yeah. And he wasn't directly saying, hey, I think you should go take this. But when hundreds of thousands of people follow Tim for health advice, it's somewhat of an endorsement, even if it's not direct.

Speaker 2:

So.

Speaker 1:

No. No. I mean, I 100% think some people will see this and be like, oh, looks like he had a good experience. Let me jump straight to exactly his protocol, which is clearly not accessible for the average person. Like, he has been building up a tolerance to this particular chemical for probably a decade.

Speaker 1:

And so it's going to hit him very differently than it will someone who's not in the same place. We were joking yesterday about, like, oh, if he wanted to really challenge himself, he would have been at, like, a crowded concert or something. And there are gonna be people that see this post and wind up actually doing that and wind up in a very, very rough spot. I feel like there should be a few more disclaimers on this. Like, hey.

Speaker 1:

I'm running a crazy experiment on myself. Like, do not try this at home. He needs the... you know, you ever watch Jackass back in the day? Remember? Every episode opened with, like, do not try this; these are stunt professionals.

Speaker 1:

Like, they do this professionally. Like, yes, they just look like random guys, but, like, behind the camera, there's a

Speaker 2:

huge scene.

Speaker 1:

They are stuntmen. And even that, it was still like, okay. Kids were, like, you know, trying to jump their skateboards down too many stairs and getting hurt. But it wasn't, like, that bad. But there was still, like, a do not try this at home.

Speaker 2:

I did see some people responding and saying, I took a similar dose to this, and it basically ruined my year.

Speaker 1:

That was you know?

Speaker 2:

Terrible. It took me, you know, months and months to recover.

Speaker 1:

So I would say stay off the psychedelics. Stay on Linear, because Linear is a purpose-built tool for planning and building products. Meet the system for modern software development: streamline issues, projects, and product roadmaps. That's where you wanna be. You wanna be locked in, in Linear.

Speaker 2:

Yeah. I do I do

Speaker 1:

Knocked out.

Speaker 2:

I have joked in the past. It's like, are psychedelics gonna fix your life, or is just doing the tasks that you've been avoiding, is that gonna make you feel relief?

Speaker 1:

Yes.

Speaker 2:

You know, is that gonna make you feel better about your

Speaker 1:

life? It's like that meme of, like, the anime ninja. If you're tired, do it tired. If you're sober, do it sober. Right?

Speaker 1:

Just do it.

Speaker 2:

John Coogan.

Speaker 1:

If you're sober, do it sober.

Speaker 2:

Well, if you want to help make history, AngelList is launching a new product to fund the most creative, hardworking technical founders building the innovations that push humanity forward. AngelList is hiring a marketing lead to launch and scale this product. Eric over at AngelList, a friend of ours and the chief legal officer over there Mhmm. says you don't need prior experience running a marketing function, but you do need to be scrappy, data-driven, and comfortable in a startup environment. You'll own marketing strategy across a variety of channels to build awareness and drive growth.

Speaker 2:

I know what this product is. I'm very excited about it. And, unfortunately, I can't share. But if you reach out to Eric, if you DM him on X, and you're interested in the role, I'm sure he can share more. But, excited, excited for this one to come online in the next couple months.

Speaker 2:

I thought this was good. Somebody found a post from Sam Altman in 2016.

Speaker 1:

I didn't realize this was old. I thought he just posted this. Oh,

Speaker 2:

man. 2016, so almost ten years ago. Yeah. He says, digital addiction is going to be one of the great mental health crises of our time. And Blade says, Unc had this thought and immediately was like, how do I profit from this?

Speaker 2:

The greatest to ever do it. TBH.

Speaker 1:

Hilarious. Hilarious.

Speaker 2:

Of course, ChatGPT wouldn't launch for many, many years.

Speaker 1:

Yes.

Speaker 2:

But in that same era... when did he write The Merge?

Speaker 1:

That was, like... Around the same time. But also

Speaker 2:

2017.

Speaker 1:

I don't know. To be fair, like, there are lots of things that you could... like, if you're trying to layer up the OpenAI-is-bad case, you don't start with, like, the product's too addictive. Like, I don't think that's the claim. The claim is, like, it uses too much water, or it, like, one-shots people, or it does erotica, or it's, like, AI slop. But very few people are saying, like, oh, yeah.

Speaker 1:

People are on their phones too much, and it's because of OpenAI now. People are, like, people are on their phones too much, but it's because of Instagram, and it's because of, like, social media. Actually, not the

Speaker 2:

AI. Because of Sora.

Speaker 1:

Well, I think it's a little bit too early to say that. I don't think it's working that well.

Speaker 2:

Certainly short short form video.

Speaker 1:

Have you ever been in in in, like, the gym or in a coffee shop and seen someone scrolling Sora, like, over the shoulder?

Speaker 2:

I haven't seen it. I meant I meant vertical video broadly.

Speaker 1:

Totally. Totally. Which is, I guess, like, a good take here, which is, like, how do I profit from this? I gotta make an AI version of it. That's probably, like, more of what they were thinking.

Speaker 1:

But in terms of actually profiting on, like, digital addiction, it doesn't seem like it's going very well.

Speaker 2:

Yeah. Four o

Speaker 1:

seems to

Speaker 2:

be extremely addictive. Yeah. So much so that

Speaker 1:

For a very small number of people, like,

Speaker 2:

Yeah. For sure. Yeah. And we don't know how many people.

Speaker 1:

Yeah. You know what else is addictive? Doing your sales tax with Numeral. Put your sales tax on autopilot; spend less than five minutes per month on sales tax compliance. I'm also addicted to the Financial Times. The Financial Times has coverage of Sequoia Capital and what's going on with Roelof Botha out.

Speaker 2:

Both of these top lieutenants pushed Roelof out.

Speaker 1:

That's what they say. Leadership duo prepared to chart a new course at Sequoia after ousting of imperial Botha. Venture capital firm's limited partners hope Grady and Lin can boost strategy and bring super intense period to close. They wanna become less intense? That is very interesting.

Speaker 1:

I would not think that. Okay. Let's run through, let's try and understand the level of intensity at Sequoia now and potentially in the future. So, from this article, we love seeing Andrew Reed there. The Financial Times, let's just say, they had to find a picture where he was pushing.

Speaker 1:

It looked like he was pushing. They're like, you know, we're gonna tell a story about someone being pushed out. We need a picture of one of the lieutenants pushing, as close as we can get to that motion, because that's the story we're trying to tell. Let's figure out what's actually going on. Sequoia Capital's Roelof Botha was ousted by top lieutenants who lost confidence in his ability to keep Silicon Valley's most powerful venture firm ahead of its rivals. Botha stepped down as managing partner of the group last Tuesday following an intervention from Alfred Lin, Pat Grady, and Andrew Reed, said multiple people with knowledge of the matter.

Speaker 1:

The trio of senior partners had the blessing of the wider firm and Doug Leone, Sequoia's former managing partner, said three of the people. Their move came on the back of concerns about Botha's management style and questions about Sequoia's artificial intelligence investment strategy, and followed high-profile clashes between senior figures at the firm, the people said. The Financial Times spoke to 10 people close to the firm. It's a leaky bucket over there at Sequoia Capital. They got the Financial Times on speed dial, I guess.

Speaker 1:

Including those

Speaker 2:

At least one of them was an LP.

Speaker 1:

Yes. You know, also, it says people close to the firm, not necessarily people that are actually at the firm. It's hilarious to be like, oh, sorry, I can't be on our all-hands today. I have a call.

Speaker 1:

Oh, with who? I'm going to the dentist, actually. I'm going to the dentist. Yes. I'm going to the dentist.

Speaker 2:

You went to the dentist last week. Oh, yeah. I'm going again. Here you go. I have another checkup.

Speaker 2:

We just gotta give props for one second to Shaun Maguire Yes. for there being an article about Sequoia and him not being in it. I mean, round of applause.

Speaker 1:

It's amazing. He's like

Speaker 2:

This is the first Freedom. He's like

Speaker 1:

I'm not in the doghouse for once. Oh, thank goodness. I love it. Yeah. Props to Shaun for staying out

Speaker 2:

of the You did it. You did it. Said it was impossible.

Speaker 1:

Dodging the bullets. Okay. So, investors who have worked with Botha, and institutions that bankroll Sequoia, known as LPs. The LPs are snitching. The LPs are snitching.

Speaker 1:

His ousting was motivated by a belief that a new generation of leaders would better serve Sequoia's LPs, they said. Yeah, I mean, typically, if there's a change in leadership, even a change in the GP structure at all, you would expect an email to go out to all the LPs. That's probably what they're leaking, but then a lot of the LPs probably got on the phone with people, and then the Financial Times got on the phone with those LPs. One of those described the removal as a revolt against Botha's imperial style of leadership following a period of upheaval at one of Silicon Valley's most successful and enduring firms. On an IQ level, he is off the charts.

Speaker 1:

Off the charts. They didn't say which way he's off the charts. Are they calling him zero IQ? Negative one ten? Negative one IQ?

Speaker 1:

That would be terrible. Now, they clearly mean he's very intelligent. He's a 180, 200, 300 IQ. But the heart of the matter is that Roelof is one of these people who always needs to be seen as the smartest guy in the room, the person said. Interesting.

Speaker 1:

Adding that Botha's emotional intelligence did not match his intellect. Roelof is a legendary investor, leader, and human being, Sequoia's new leadership team told the FT.

Speaker 2:

I mean, it's possible he was using too much Claude. And Claude kept telling him after every meeting, he would dump in like the meeting notes and he'd be like, Claude.

Speaker 1:

Am I goated?

Speaker 2:

Am I goated?

Speaker 1:

Like, you're definitely in the conversation.

Speaker 2:

You're absolutely goated.

Speaker 1:

You're absolutely right. The reason Sequoia has stayed Sequoia for fifty-three years is they refused to cement themselves in hierarchy. Grady and Lin will now run the firm, while Reed and Grady will co-lead Sequoia's funds investing in more mature startups. So Reed is going up to the growth stage. Lin and another partner, Luciana Lixandru, will co-lead the firm's early-stage investment funds.

Speaker 1:

So Alfred Lin will be doing some seed-stage, early-stage investing. Botha, who's run its US and European businesses since 2017 and took over the whole firm in 2022, will remain as an adviser. The 52-year-old, grandson of Roelof "Pik" Botha, the last foreign minister under South Africa's apartheid regime and later a member of Nelson Mandela's first government, was hired to PayPal by Elon Musk early in his career. He's led investments in Instagram, YouTube, and MongoDB. Sequoia has returned more than 50,000,000,000 to its US and European investors.

Speaker 1:

Despite those successes, partners decided that Lin, who has backed Airbnb, DoorDash, and OpenAI, and Grady, who's behind investments in Snowflake, Zoom, and ServiceNow, were better placed to lead Sequoia. Under Botha, Sequoia has taken a more cautious approach to AI investment than some rivals. It invested a little more than 20,000,000 in OpenAI in 2021, when the ChatGPT maker was valued at about 20,000,000,000, and has boosted that stake in subsequent rounds. When OpenAI raised funds at a 260,000,000,000 valuation, Sequoia offered to invest 1,000,000,000 but ultimately was given a stake a fraction of that. Oh. So they didn't get their 1,000,000,000 position, which still would have been less than 1%.

Speaker 1:

Sequoia also holds a stake in Musk's xAI, but is focused on early investments in AI application companies such as Harvey, Sierra, and Glean, an approach also advocated by Grady. This is very interesting. Well, we have our first guest of the show, Casey Handmer, in the waiting room. Welcome to the show, Casey. How you doing?

Speaker 1:

Happy Veterans Day.

Speaker 3:

Thank you. It's good to be here.

Speaker 1:

Great to

Speaker 2:

be here. Is this a real background? Background? What's going on here?

Speaker 3:

I mean, it's a real photograph.

Speaker 2:

It's a real photograph.

Speaker 1:

What is it a photograph of?

Speaker 3:

Sorry. There's a weird lag there. That's a photograph of some sample containers made last year of pipeline grade synthetic natural gas made in America. Fossil carbon free, 100%.

Speaker 1:

Yeah. Give us the update on on Terraform. How how's it going? What's the latest?

Speaker 3:

Oh, Terraform's going great. Yeah. Yeah. Thanks. We're 23 strong now, and we're, like, literally putting finishing touches on our first full-scale synthetic fuel system.

Speaker 3:

So this is a machine that takes in sunlight and air and produces pipeline grade natural gas. Mhmm. We also kicked off work on our methanol process, which is kind of the other half or the flip side of the coin. So methanol is in the department of liquid fuel, and and natural gas is the department of gaseous fuel. Mhmm.

Speaker 3:

So between those two, we pretty much cover everything that humans need. And, yeah, it's just kind of a fairly mind-boggling thing to think that, you know, in a hundred years, in a thousand years, in ten thousand years, humans who need hydrocarbons for any purpose, whether that's flying rockets or jets or making paints and fertilizers or, you know, pigments or medicines or whatever, will do so using a solar synthetic process, and that we're building the first one.

Speaker 1:

Do you fit neatly into any of the current, the current

Speaker 2:

Market mass.

Speaker 1:

Political, I was gonna say, like, political trends or initiatives. Like, I mean, we had Isaiah Taylor on yesterday from Valar. He's part of this,

Speaker 3:

like Right.

Speaker 1:

Group. Yeah. Just raised, and there's four companies that it feels like the administration is really trying to accelerate in nuclear. It feels like solar, you know, Elon's a solar maxi. There's tons of energy around solar generally.

Speaker 1:

But are there specific programs or milestones that are or just conversations that are happening on Capitol Hill that you're, like, kind of drafting off of?

Speaker 3:

I mean, the genesis of Terraform, which was actually four years ago yesterday, was basically a very suggestive spreadsheet. And very early on, I said, like, I don't really want to make this dependent on, you know, various political whims. Yeah. And so it hasn't been. But, actually, across the spectrum, like, we deal routinely with, you know, crunchy coastal liberal elites and heartland Americans and everyone in between.

Speaker 3:

And I've never met anyone who says they'd like their fuel to be more expensive, please.

Speaker 5:

Mhmm.

Speaker 3:

So we're in the department of making more fuel, making it better, making it cheaper, making it locally, making it free of various geopolitical catastrophes. And Yeah. I think that, as I said, I've never met an American who disagrees.

Speaker 1:

Yeah. What about the AI narrative? Are you drafting off of that, or is it just, like, a separate industry that's growing? Or is there basically a way to lower your cost of capital by, you know, doing a deal with someone who wants to buy a bunch of synthetically produced natural gas?

Speaker 2:

Press release economy.

Speaker 1:

Yeah. Just kidding.

Speaker 3:

I should hire you guys as my PR firm.

Speaker 1:

I mean, like, there's a right way and a wrong way to do it. And at a certain point, it's like, if it's raining, make hay. Right? Like, there's just a little bit of, like, okay.

Speaker 3:

Yeah. For sure.

Speaker 1:

Maybe there's something. But I'm wondering how you're, like, tussling with it. Like, what's real? What would be over your skis to do? What's the actual application of the technology?

Speaker 1:

Like, take me through the various intersections.

Speaker 3:

Yeah. One of the really fun things I get to do as the CEO of this company is, you know, basically make spreadsheets and charts all day long and Mhmm. And try and figure out how this stuff will go. And I've got a blog post. You know?

Speaker 3:

I like to have ideas and write them down. Two years ago I wrote one saying that the way this is gonna go is we're gonna need solar power to power AI data centers, a significant leap beyond what we're able to do with natural gas. Mhmm. And, you know, people have paid attention to that. We've had significant inbound interest from, you know, most of the major players at this point saying, hey, can you help us figure out how to do solar plus battery, mostly completely off-grid AI training data centers and inference data centers?

Speaker 3:

And we said, sure. Happy to help. And we've done quite well by that. It's actually a very, very strong market signal that maybe I'm in the wrong job, and I should be an energy system design consultant rather than a CEO. But tough luck to my stars.

Speaker 3:

I am determined to figure out how to build a hardware company and manufacture stuff because Yeah. You can never have enough factories. You can always have another factory. More factories is more better.

Speaker 1:

More factories.

Speaker 3:

As far as natural gas goes, yeah. I mean, I think that the primary constraint the AI hyperscalers are seeing right now is in the department of methane destruction. Mhmm. That is to say, the turbines that produce power, as opposed to the department of methane creation, which is us and then every driller between here and maybe Charlotte or something like that. The United States at this point, at least, is, you know, once again the great beneficiary of God's bounty when it comes to synthetic fuel, when it comes to fuel in general, actually.

Speaker 3:

Synthetic fuel is a work in progress. Yeah.

Speaker 2:

Can you say more about the very suggestive spreadsheet?

Speaker 3:

Yeah. Sure. Essentially, you look at a chart, and you try and make a prediction for what oil will cost to produce in five years, ten years, fifteen years, twenty years, twenty-five years. I mean, peak oil. It's an idea that's been around for a long, long time.

Speaker 3:

And, of course, over time, oil drillers are very, very studious and hardworking people, and they figure out better ways of getting oil out, and we've seen the fracking revolution. And I think that's just a wonderful thing. I genuinely do think it's a wonderful thing. Yes. It has certain climate implications.

Speaker 3:

But at the end of the day, the human welfare benefits of cheap oil are, you know, 100 times better than the climate problems caused by it. It's just a matter of fact that we will run out sooner or later. And so oil is unlikely to get radically cheaper in the future. On the other hand, solar is getting something like 40% cheaper per doubling of cumulative production, and that takes about two years. So we're seeing, like, 20% cost reductions per year.

Speaker 3:

And when you see something like that, it's a little bit like, you know, Steve Wozniak and Steve Jobs, or Bill Gates, in the nineteen seventies looking at the arrival of integrated circuits and consumer-grade computing systems, very, very primitive computers by today's standards, and thinking, you know, this is a product that has already hit product-market fit. It's already in this cycle of cost improvement. There's gonna be a big wave here. You know? How do I ride that wave?

Speaker 3:

How do I commoditize that complement? How do I build value on top of this platform? And I'm thinking, what can I use these cheap solar panels for? You know? Like, there's already plenty of people out there building large utility-scale solar arrays and plugging them into the grid.

Speaker 3:

And the grid itself even then was kinda strained by that, in particular the grid development bureaucracy was strained by that. What else can you do with this? And I was like, well, off-grid stuff, obviously, behind the meter, build a giant solar array and then have a captive load attached to it for, you know, AI computing type stuff. That's an obvious one to do. And then, you know, fuel synthesis.

Speaker 3:

Fuel synthesis was an idea I'd been thinking about for probably thirty years at this point because we have to do it on Mars someday. And I was saying, well, on Mars, it's enormously expensive due to energy cost, but energy cost on Earth is getting cheaper. I wonder when these lines cross. You know? If I'm able to surf that wave of cheaper solar and I can pass most of that value on to my customers by keeping my costs under control, then at what point should it be cheaper to synthesize fuel rather than dig it out of the ground and ship it halfway around the world?
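
The back-of-envelope logic Casey describes, a roughly flat oil cost against solar falling about 20% a year, can be sketched in a few lines. Every specific number here, especially the 3x synfuel-to-solar cost premium and the base year, is an illustrative assumption for the sketch, not Terraform's actual model:

```python
# Back-of-envelope version of the "very suggestive spreadsheet": project a
# solar-driven synthetic fuel cost index against a roughly flat oil cost index.

def solar_cost(year, base_cost=1.0, annual_decline=0.20, start_year=2021):
    """Solar cost index, falling ~20%/year (~40% per ~2-year doubling)."""
    return base_cost * (1 - annual_decline) ** (year - start_year)

def crossover_year(synfuel_premium=3.0, start_year=2021, oil_cost=1.0):
    """First year synthetic fuel (modeled as a fixed multiple of the solar
    cost index) undercuts the flat oil cost index."""
    year = start_year
    while synfuel_premium * solar_cost(year) > oil_cost:
        year += 1
    return year

print(crossover_year())  # with these made-up inputs: 2026
```

With these invented inputs the lines cross mid-decade, which is consistent with the "sometime this decade, 2030 at the outside" answer in the conversation; the real spreadsheet would of course carry far more structure.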

Speaker 3:

And the answer then, and I think the answer is basically the same now, was sometime this decade. You know, 2030 at the outside. Okay. I wonder if anyone else has noticed this. A few people had.

Speaker 3:

What are they doing about it? Not what I would do about it. Okay. Well, I guess I should probably stop, you know, kicking myself in the shins at NASA, and go and start a company, and at least give it a go. Find out what it's like.

Speaker 3:

See if building a hardware company is as fun as everyone makes it sound, and here we are. So, you know, I'd say it's an infohazard to play around in spreadsheets too much.

Speaker 2:

What happens to the price of land as solar energy production costs decline?

Speaker 3:

That's a good question. Because ultimately, the value of land under solar is the value of what people are willing to pay for that power and the things they do with that power. And so what we've seen, for example, is that the value of land under solar is something like fifty to a hundred times higher on a per-acre revenue basis than your kind of average agriculture, even in a fairly developed agricultural state like The United States. And that's the same for synthetic fuel. The flip side, actually, and the thing that I worry about in terms of AI safety, is that it's pretty clear to me that the economic utility of land that's being used as a solar array to power an AI data center is over $100,000 an acre at present value.

Speaker 3:

And that's just higher than almost any kind of existing human use other than particularly dense and large cities, which makes me wonder, maybe not in ten or twenty years, but in thirty or forty years, whether the AIs will encroach upon our agricultural land and make it harder for us to eat.

Speaker 2:

Yeah. This runaway AI that just decides it's in my best interest to blanket as much of The US or the world as I can with solar to, you know, further myself.

Speaker 3:

Yeah. I mean, it seems to create a strong economic forcing function, and maybe we can slow it down a few years with the usual regulatory stuff. But I think that may be one of the reasons why Elon has been talking more about doing large-scale AI development in space. If you can convince the AIs they're better off doing it in space where they don't have to fight with humans, maybe they'll go there first instead. I mean, there's plenty of land on Earth.

Speaker 3:

There's more land on Earth that's not being farmed than there is land that is being farmed, but, you know, it's not inexhaustible. Yeah. Something like a hundred thousand terawatts lands on the oceans, and 50,000 terawatts lands on parts of the earth that no one lives on or uses. And humans consume about 10 terawatts of energy. So there's plenty to go around for now.
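
The terawatt figures quoted here imply a lot of headroom; a trivial sketch with the conversation's rough numbers (illustrative round figures, not precise measurements):

```python
# Rough solar budget, in terawatts, using the figures quoted in the conversation.
ocean_insolation_tw = 100_000   # sunlight landing on the oceans
unused_land_tw = 50_000         # sunlight on land no one lives on or uses
human_consumption_tw = 10       # approximate total human energy use

# Even ignoring the oceans, unlived-on land receives thousands of times
# more solar power than humanity currently consumes.
headroom = unused_land_tw / human_consumption_tw
print(f"Unused land alone receives ~{headroom:,.0f}x human consumption")
```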

Speaker 1:

Yep. For now.

Speaker 2:

That makes sense. I wanna get into NASA, government shutdown, bunch of stuff. But before that

Speaker 3:

Let's touch the third rail.

Speaker 2:

I'm ready. No. Let's go even more schizo. Tucker had a guest on the show yesterday that was talking about chemtrails.

Speaker 2:

What's your take on chemtrails?

Speaker 3:

I'm a pilot, and I've flown around a fair bit, and I've never seen one. I think that, like, you know, playing around in kerosene is not very good for you. And we do see, like, health impacts on people who live too close to freeways and things like that. You know, there are definite health impacts from burning the longer-chain hydrocarbons. As far as chemtrails go, I must confess, I'm not as deeply familiar with various conspiracy theories like that.

Speaker 2:

Yeah.

Speaker 3:

Although I do think that NASA's been covering up life on Mars. So we'll see about that.

Speaker 1:

Covering up life on Mars?

Speaker 2:

Yeah. Before we talk about life on Mars, the chemtrails thing is funny because you have to believe that, you know, tens of thousands of people are coordinating to do this large-scale operation. Oh, yeah. And not one of them has ever come out and said, hey, here's evidence that, you know, this massive operation is happening.

Speaker 3:

There are aircraft which do spray chemicals. Right? They're called crop dusters.

Speaker 1:

Yeah. Crop dusters.

Speaker 7:

And they

Speaker 3:

have a very particular look and feel.

Speaker 1:

Of course.

Speaker 3:

And then, of course, if you spend much time doing, like, structural calculations and you look closely at, say, a triple seven or something, it's pretty clear that... sorry. My son's at work with me today, and he just wanted to see if he could come in.

Speaker 2:

It's a family-friendly show. It's Veterans Day. Yeah. Yeah.

Speaker 3:

His face is not on the Internet, though. He's not a veteran, and hopefully never will be. Anyway, crop duster aircraft. You know, I fly to Australia every now and then, and you're like... you know, actually, the conspiracy theory that planes can't contain enough fuel makes more sense to me, because it seems unlikely that you could actually keep a plane that big up in the air for fourteen hours and cross an entire ocean, like, halfway around the world. Oh, yeah.

Speaker 3:

Like, where are you putting all the chemtrails you're spraying out? Like, the overhead bin? There's never any room in there. Yeah. So I don't understand.

Speaker 1:

Like, yeah. Yeah. We got plenty of space. The math doesn't work out. Let's bring the fuel and the toxic chemicals or whatever.

Speaker 1:

Yeah. I mean, there are levels to the conspiracy, because there is just the world of, like, what if it's just like leaded gasoline, where the chemicals are deranging people in a bad way because it's just pollution. Like, I don't think people Yeah. Really debate the possibility that pollution is happening. It's different when it's like, okay.

Speaker 1:

Yes. It's like a specific chemical that does a specific thing and triggers a specific reaction from the population.

Speaker 3:

I mean, there are chemicals that can make you crazy. Most of them taste very good.

Speaker 2:

Really? Like what? Stevia.

Speaker 3:

Diet Coke? No. I mean, quite literally, until the invention of GLP-1 agonists, something like a third to a half of Americans were gonna die quite a bit younger than they otherwise would because of, you know, weight-related heart disease and obesity. Why? Because sugar tastes good.

Speaker 1:

Yeah.

Speaker 3:

Yeah. Okay? That's true. That's a poison. It's poisoning you. It's actively degrading the quality of life.

Speaker 2:

Ray Peat would disagree. But why are we covering up life on Mars? Yeah.

Speaker 3:

So to be fair, no, you don't wanna be the boy who cried wolf and shout, it's aliens, it's aliens, it's aliens. Sure. But what tends to happen in these scientific fields is it becomes very fashionable to be quite skeptical

Speaker 1:

Mhmm.

Speaker 3:

Of things, as it should be. But at the same time, you know, evidence will accumulate over time, and so you'll have this kind of neat cottage industry of people who can explain away any evidence they see Mhmm. Without necessarily updating their prior assumptions about it. And I was trained at Caltech by the geologist... I've forgotten his name.

Speaker 3:

This is age. Okay.

Speaker 1:

It'll come to you.

Speaker 3:

It'll come back to me eventually. Kirschvink. Joseph. Joe Kirschvink. A legend among men. And he was the guy who, when he got his sample of ALH 84001, a Martian meteorite found in Antarctica, you know, a while ago now, identified these, like, hexa-octahedral magnetite crystals in the meteorite.

Speaker 3:

It's not controversial that they were found in the meteorite, and it's also never been demonstrated that they can be made by any known lab process other than culturing magnetotactic bacteria, which use them for navigation Mhmm. Essentially. So, you know, the meteorites are full of magnetite. Magnetite is a naturally occurring mineral. But Mhmm.

Speaker 3:

There's a particular configuration of its crystalline form that we only know can occur with biological precipitation. So, yeah, it's called a magnetofossil if you wanna Google it at home. And I think it's about as strong evidence as you could possibly hope to find without actually going there and kicking rocks yourself.

Speaker 1:

Are you familiar with this thing, the great unconformity in the Grand Canyon? Have you ever heard of

Speaker 3:

Yeah. I've been there.

Speaker 1:

Yeah. Can you explain this to me? Somebody sent this to us and was like, you gotta talk about this on the show. And I was just gonna read the Wikipedia, but now we have someone who actually is familiar with it.

Speaker 2:

No. No. Expert.

Speaker 1:

I don't wanna pop quiz you, but, like, it would be cool if

Speaker 3:

No. I've been there. I've been there several times. It's amazing. I've taken my kids there. Okay. Yeah.

Speaker 3:

So if you go to the South Rim of the Grand Canyon, there's a trail called the Bright Angel Trail. And if you're very, very fit and it's not too hot, you can walk down to it and back in a day. But I would recommend starting at midnight. Right? Mhmm.

Speaker 3:

You get down to it in the early morning, after daybreak, and then you get back up before it gets hot. Or take a couple of days. Enjoy it. But there's also an exposure just northeast of Vegas at a place called Frenchman Mountain, which you can just drive right up to. Although last time I was there, it was unfortunately covered in broken... Okay.

Speaker 3:

The Great Unconformity. So over time, you know, the oceans rise and fall due to climatic variations, and land also rises and falls due to geological stuff. And when land is above sea level, in general it's eroding. And when it's below sea level, it's in general accumulating sediment. And so what you see in the Grand Canyon is that most of it is a series of layers of carbonates, which means shallow oceans forming reefs, and sandstones, which means, you know, a riverine delta environment or something, where the land is gradually sinking relative to the ocean, creating accommodation space in which new dirt and soil and rocks and stuff can come down and be compacted and form these layers.

Speaker 3:

And then what happens is, if the ocean falls or the land gets lifted up, you see erosion. That newly formed rock gets eroded away for a while, then it sinks again, and then it starts to build up again. And so you get what are called little unconformities, which is where you've got a little period of missing time, a million years here, a hundred thousand years there, ten years over here, and maybe ten million years over there, where the land has been above sea level and so no new rock has been forming, or if it has, it's since been eroded away. Mhmm. Okay.
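
The deposit-and-erode cycle described here can be shown with a toy model: while land sits below sea level, layers accumulate; while it sits above, the youngest surviving layers erode, leaving gaps of missing time in the preserved column. This is a cartoon for illustration, not a real stratigraphic model:

```python
# Toy stratigraphy: below sea level, a layer is deposited each time step;
# above sea level, the youngest surviving layer erodes. The preserved column
# ends up with gaps of "missing time", i.e. unconformities.

def deposit_history(elevations):
    """elevations: land elevation relative to sea level, one value per time
    step. Returns the time steps whose sediment survives, oldest first."""
    column = []
    for t, elevation in enumerate(elevations):
        if elevation < 0:      # below sea level: deposit one layer
            column.append(t)
        elif column:           # above sea level: erode the youngest layer
            column.pop()
    return column

# Sink, rise long enough to erode two layers, sink again:
# layers 2-3 are destroyed, so times 2 through 5 vanish from the record.
print(deposit_history([-1, -1, -1, -1, 1, 1, -1, -1]))  # [0, 1, 6, 7]
```

Reading the output as a rock column, the jump from layer 1 to layer 6 is exactly the kind of unconformity Casey describes, scaled down from a billion years to a few time steps.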

Speaker 3:

So this occurs at all scales. And it turns out that at the bottom of the Grand Canyon, or very close to the bottom, is the thing called the Great Unconformity. And it's kind of mind-boggling, because the rocks on one side of this are the Vishnu Schist basement rock. It's this ancient rock that forms, you know, the core of our North American continent, at least in this place. It's ancient and crystalline.

Speaker 3:

It's 1,600,000,000 years old. It's like 13% of the age of the universe. And then literally, like, one thumb's width above that are the oldest sedimentary rocks in the Grand Canyon, which are, I don't know, 500,000,000 years old. I think they're just after the Cambrian explosion. Okay.

Speaker 3:

This is stretching my memory. It's a long time ago now. Yeah. 535,000,000 years ago. And it's just kinda crazy.

Speaker 3:

So why? It's a missing billion years. Yeah. It's a missing billion years. And in that time, rock formed.

Speaker 3:

Right?

Speaker 6:

Sure.

Speaker 3:

And probably entire mountain ranges formed. Like, it takes a hundred million years to form a mountain range and then erode it away to nothing.

Speaker 7:

Sure.

Speaker 3:

Right? So like the Appalachians. The Appalachian Mountains were a mountain range taller than the Himalayas a hundred and thirty million years ago, pretty much eroded away to nothing. So you could form and destroy a mountain range the size of the Appalachian Mountains

Speaker 1:

In that time?

Speaker 3:

Eight, nine, ten times in that time. Entire continents form and sink. Entire supercontinents form and sink. Probably no dinosaurs back then, as far as we know, but there was definitely life, and lots of life in the oceans, and some plants and stuff on land, and some insects and things. Actually, insects, I think, came later.

Speaker 3:

Look. Don't quote me on this. Check Wikipedia. Yeah. Ask Grok.

Speaker 3:

Grokipedia. But it's all gone. It's all gone.

Speaker 1:

It's all gone.

Speaker 3:

At least, you know? And I mean, when you look at one of those geological time scales, you see a lot of detail in recent rocks and not so much in the older rocks. And that's just because, as you go back in time, you get exponential destruction of the geological record. There is actually a corner of the Eastern Grand Canyon with what's called the Grand Canyon Supergroup, which is a series of layers, inclined, that kind of pokes up.

Speaker 3:

That's the only place on earth that we have rocks from that age. It's kind of an intermediate-age rock. It's Yeah. Just bizarre. So you can go there.

Speaker 3:

It's just outside Vegas. Like, skip the tables for half a day and drive out.

Speaker 2:

Maybe they could put a casino at the great unconformity.

Speaker 1:

On the Wikipedia page, the reason this is so interesting is because it says: there is currently no widely accepted explanation for the Great Unconformity among geoscientists. There are hypotheses that have been proposed. It is widely accepted that there was a combination of events which may have caused such an extreme phenomenon. One example is a large glaciation event which took place Yep. During the Neoproterozoic, around 720,000,000 years ago.

Speaker 1:

And so there's, like, a whole bunch of theories, but it's just, like, unresolved.

Speaker 3:

How do you grind away that much rock? Yeah. Because, essentially, where the rock is forming and being destroyed is very close to sea level, almost always. Right?

Speaker 3:

Yeah. And so at some point, the rock that is on the underside of the Great Unconformity has to have been lifted up near sea level. Mhmm. And then it has to have sunk down a mile or two to form all the rocks above it.

Speaker 1:

Mhmm.

Speaker 3:

Right? And now it's been lifted up again above sea level, because the bottom of the Grand Canyon, the Colorado River, is obviously above sea level. Otherwise, it wouldn't flow to the ocean. So it's kinda crazy that this goes up and down. Yeah.

Speaker 3:

I mean, that's geology for you. There's still work to be done. Maybe you'll figure it out one day.

Speaker 2:

Okay. Let's let's talk about

Speaker 3:

And it's forming a new Great Unconformity right now. Right? Like, the rocks below the Great Unconformity, the Vishnu Schist, are being eroded by the Grand Canyon as we speak. So if the Grand Canyon ever fills in and starts to, you know, build rocks again in the future, the Great Unconformity will be below where it is now Mhmm. In that part of the world.

Speaker 2:

Yeah. Okay. Well, let's talk about something we can all agree on, which is 3I/ATLAS. What's going on with that? What?

Speaker 1:

I don't know.

Speaker 2:

You know, this is a

Speaker 3:

comet. Yeah.

Speaker 2:

Some people think it's a... they're saying it's a comet. A lot of celebrities are saying it's an alien Okay. Okay. Ship. What's the Casey Handmer take?

Speaker 2:

Yeah. Give us a breakdown on it. We haven't been following it. There's been enough happening in technology. Likewise,

Speaker 3:

I've been focused mostly on the finer details of, like, fixing electrolyzer seals and things like that. I would say that these interstellar objects are super interesting Mhmm. And I would like to get a close look at them. And I really wish we had some, like, maybe Tom Mueller could build a series of spacecraft with, like, lots of onboard Isp Mhmm. That we park out, you know, near the moon or something.

Speaker 3:

And then when one of these comes through, we press the button, and it immediately, like Oh, okay. Adjusts orbit and shoots off in that direction, and we get a nice flyby. We get a few photos. That'd be super cool. We're seeing them about once a year at this point. That's just because our telescopes have gotten better. As far as 3I/ATLAS goes, I have a cousin who routinely texts me updates on, it's an alien ship for sure.

Speaker 3:

I don't think it is. I think it's a comet. I think that actually some of the earlier ones were better than it. I think it's a very large comet, a very old comet. Seems to be a comet to me.

Speaker 3:

But

Speaker 2:

If you were an advanced alien, you know, race, would you not wanna just attach yourself to a comet that was headed in the direction you wanted to go and just ride along with it?

Speaker 3:

No. No. I think

Speaker 2:

Too slow.

Speaker 3:

Well, yeah. Exactly. Who knows how long the aliens live? But I went to Caltech to study, kind of, warp drives, to try and understand how to do that, and I didn't succeed, obviously. But someone might one day.

Speaker 3:

But I think that if you've seen Avatar 2, the spaceships that Jim Cameron puts there for the humans to fly to Alpha Centauri are, like, kind of what I have in mind when I think, how would humans, or aliens that are comprehensible, legible to humans, travel? It'd be something like that. You know? Enormously energetic, relativistically accelerated, very, very bright, very flashy, very obvious. Or maybe something like a warp drive, if we can ever figure it out.

Speaker 3:

But I think, you know, a comet that takes a hundred thousand years to kind of wander its way from the nearest star to here? Very boring. Very boring.

Speaker 2:

Yeah. Boring. Yeah. Alright. Talk about that.

Speaker 1:

Moon or Mars? This was a debate that we were having earlier. Seems like, the Jared Isaacman nomination was kind of reinvigorating some discussion over whether The US should prioritize Mars over the moon. Elon has been pro Mars for a very long time. Moon is kind of incidental.

Speaker 1:

Other folks see the moon as, like, a key geopolitical race that we have to address right now and win, and then that gets you a ticket to the real competition, which is Mars, potentially. How are you thinking about the trade-offs between resources spent going to the moon and resources spent going to Mars?

Speaker 3:

So initially, I was quite pro Mars, and I wrote a couple of books about doing stuff on Mars, and realized that, you know, both destinations are pretty difficult in their own way. They both have their challenges. And, actually, I think what SpaceX correctly realized is that if you're gonna do anything meaningful in either place, you have to develop a rocket system like Starship Mhmm. That can basically fire-hose mass at pretty much any target in mind. You know, it's a not-a-fair-fight kind of situation.

Speaker 3:

And so once you have something like a Starship, you know, doing something pretty cool on the moon is very straightforward. But if you're constantly trying to put together some mission that's, like, just made of little bits and pieces and small, puny rockets and very expensive, you know, slow-going contracts and stuff, you really struggle to do anything meaningful on either of them, ever. So, yeah, I'm very much in favor of, you know, kind of incidentally on the way to Mars, setting up a couple of lunar research stations and putting a few thousand people there and running it like the space station is now, but for more than six people. Yeah. And I think we could totally do that for, like, an Antarctica-program-level budget with Starship.

Speaker 1:

Yeah. My theory on it, just as, you know, an outsider, has always been: fire-hose mass at low Earth orbit, get Starships going every day. I love those montages that are just showing Falcon 9s up and back every single day. Do that in low Earth orbit, then do that for the moon, and same thing. Like, dozens of flights to the moon and back every single day. And then Yep.

Speaker 1:

And then start, you know, working your way to Mars, because it feels like Mars is just a completely different set of trade-offs and cadences based on when we can actually launch. It's not every day. And so just getting the reps of, like, humans, even just a human, or, I don't know, maybe a robot at some point, landing a rocket on a celestial body and then getting out and jumping around. Like, if that's happening every single day, a lot of that has to transfer to Mars, I would imagine.

Speaker 2:

What's the most underrated planet?

Speaker 1:

Yep.

Speaker 3:

Underrated planet.

Speaker 2:

What's a planet that you think about a lot

Speaker 3:

I spent a lot of time thinking about Mars.

Speaker 1:

Mars? Like, still underrated? Mars. Most people would put it at number two

Speaker 3:

After Earth, yeah. I had a little side project where I built, like, the highest-resolution topo map of Mars, using a bunch of, like, NASA data that's kind of in the archives.

Speaker 1:

Interesting.

Speaker 3:

Not being directly funded. So I've got a six-meter-resolution global topography map of Mars. Wow. And I've been doing some, like, hydrology simulations and stuff with it to see what happens when we terraform. It's yeah.

Speaker 3:

So it's kind of hard to appreciate, but, like, planets, even small planets like Mars, are overwhelmingly big. Once you start dealing with these planetary-scale datasets, it's like 200 terabytes. You know, I start asking my wife, can I please buy another hard drive? And she's like, you've already spent $5,000 on hard drives this month. You should slow down.

Speaker 3:

I'm like, well, okay. But, like, planets are really, really big. And yet I have a, you know, almost photorealistic render of Mars at, you know, sub-100-meter resolution sitting on a spinning platter on my desk at home. It's pretty wild.

Speaker 1:

That's pretty cool.

Speaker 3:

And we get to do that. I could never have dreamed of that when I was a kid. It's just unimaginable.

Speaker 2:

Yeah. Air traffic control. Should it be run by a pregnancy test?

Speaker 3:

No. I don't think so. But it could be. So, you know, I mean, at the end of the day, the shoulds, the normative discussion, is kinda beyond my pay grade, but the descriptive discussion of whether it could is something that I think I can opine on. And, yeah, fundamentally, what a graphics system does on a computer is, you know, roughly a billion collision calculations a second.

Speaker 3:

And Mhmm. That's a much more sophisticated graphics engine than Doom, but still it's something that you can buy in any PlayStation today. And ATC is just not that complicated. Mhmm. Right?

Speaker 3:

It has a bunch of edge cases, but it's not that complicated. And it's also a system that, for justifiable reasons, is very, very conservative about adopting new technology. But at the same time, you know, the cost that we endure as a civilization from various air traffic control problems, and they're not only caused by government shutdowns Mhmm. is kind of undeniable. And there's also an efficiency aspect.

Speaker 3:

You know, we could probably shave 5% off our fuel usage if we were able to have basically weather-optimal, direct point-to-point flights for all aircraft that wanted them, rather than the usual ATC thing, which is, you know, you basically follow existing instrument routes between, in many cases, like, places on maps that were once radio navigation beacons but no longer exist.

Speaker 1:

Yeah. You mentioned giving the project to the Mag seven. I wonder, do you think that there's an opportunity for a startup to sort of do, like, the Anduril flipped model: build it with venture capital dollars, and then try and sell it to the government once you have a better system in place? And, also, is the actual software and hardware, like, actually the problem? There was that Nathan Fielder Yeah.

Speaker 1:

HBO special that was kind of like, it's maybe more about pilot communication and human training, human error when something goes wrong, and it's not necessarily like a computer system and, like, a bad line of code.

Speaker 3:

Yeah. I think you probably still wanna have have people sitting in in control towers looking at radars. Yeah. I'm not saying, like, the the solution to our problem is to fire every last air traffic controller. Mhmm.

Speaker 3:

Quite the opposite. But, like, think about what those air traffic controllers are doing a lot of the time. It's stuff that could be a few lines of code. Right? Like, it's one of these things where we've got, I don't know, a thousand corner cases, but 99.99% of it is just one case.

Speaker 3:

Mhmm. It's just the standard, like

Speaker 1:

Did these intersect?

Speaker 3:

In instrument routing, well, you basically get a clearance to fly a certain route at a certain time. And then, if all goes well, you'll never see another plane. But, yes, every now and then, you've gotta keep an eye on things.

Speaker 3:

You gotta control the airspace. Yeah. It's not a complicated puzzle. I mean, like, Microsoft Flight Simulator 95 had a pretty good simulation of this. So it's the sort of thing that I think is well within our technical capabilities.

Speaker 1:

Yeah. I guess that

Speaker 3:

we actually want to do.

Speaker 1:

Yeah. I guess to some degree, you can forecast the entire flight path before the plane takes off and make sure that it does not intersect with any other flight paths at any point in time.
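The pre-departure deconfliction idea described here can be sketched in a few lines: sample each planned 4D trajectory on a shared clock and verify that no two aircraft ever violate a minimum separation. This is a hypothetical toy model, not how any real ATC system works; every name, the flat coordinate system, and the separation thresholds below are illustrative assumptions.

```python
# Toy pre-departure flight-path deconfliction sketch (illustrative only).
from dataclasses import dataclass

@dataclass
class Waypoint:
    t: float    # seconds after a shared epoch
    x: float    # nautical miles east (flat-earth toy coordinates)
    y: float    # nautical miles north
    alt: float  # feet

def position_at(path: list, t: float):
    """Linearly interpolate an aircraft's position along its plan at time t."""
    if t <= path[0].t:
        p = path[0]
        return (p.x, p.y, p.alt)
    for a, b in zip(path, path[1:]):
        if a.t <= t <= b.t:
            f = (t - a.t) / (b.t - a.t)
            return (a.x + f * (b.x - a.x),
                    a.y + f * (b.y - a.y),
                    a.alt + f * (b.alt - a.alt))
    p = path[-1]
    return (p.x, p.y, p.alt)

def conflicts(p1, p2, lateral_nm=5.0, vertical_ft=1000.0, step_s=10.0):
    """True if the two planned paths ever violate both separation minima."""
    t = max(p1[0].t, p2[0].t)
    end = min(p1[-1].t, p2[-1].t)
    while t <= end:
        x1, y1, a1 = position_at(p1, t)
        x2, y2, a2 = position_at(p2, t)
        lateral = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        if lateral < lateral_nm and abs(a1 - a2) < vertical_ft:
            return True
        t += step_s
    return False

# Two crossing routes at the same altitude meet in the middle.
a = [Waypoint(0, 0, 0, 35000), Waypoint(600, 100, 100, 35000)]
b = [Waypoint(0, 100, 0, 35000), Waypoint(600, 0, 100, 35000)]
print(conflicts(a, b))  # True: same altitude, paths cross mid-route

# Lifting one flight 2,000 feet resolves the conflict before departure.
c = [Waypoint(0, 100, 0, 37000), Waypoint(600, 0, 100, 37000)]
print(conflicts(a, c))  # False
```

Running every pair of filed plans through a check like this before takeoff is the "forecast the entire flight path" idea; a production system would add uncertainty margins, weather re-plans, and the onboard see-and-avoid layer discussed later.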

Speaker 3:

Yeah. That's relatively straightforward. I think some people on X were like, oh, you know, sure, we should just hand another monopoly to Starlink to do all this stuff. I think no.

Speaker 3:

Actually, I don't think that's the case. I think that you could actually design a series of algorithms that would run centrally, but also autonomously on every aircraft. And so you have like, everything in aviation has multiple layers of backups. So there's always something checking the system that sorry, a coworker just brought in a new baby, because I started this company and then all my employees started having more children.

Speaker 3:

I think I'm doing something right.

Speaker 2:

That's that's a great sign.

Speaker 3:

Yeah. But yeah. So the idea there is, like, your onboard computer, onboard your aircraft, if you're flying a, you know, commercial 737 or something, should be, like, getting minute-by-minute updates on weather and tailwinds and headwinds and so on, and being like, well, you know, I can save 5% on fuel if I go up 1,500 feet, you know, route into the jet stream. Sorry about the bumps, everyone, and we get the

Speaker 1:

route

Speaker 3:

faster. And then it just kind of automatically does the flocking thing where it sees and avoids and doesn't intersect with other aircraft. But, yeah, the technology adoption cycle at the FAA, and, you know, at airlines and aircraft manufacturers, is understandably pretty conservative, but it's also very, very slow. And so we end up with a situation where, you know, delays and actual crashes and stuff are occurring because the system is not as good as it could be. So I think the way I mean, it's way beyond my pay grade.

Speaker 3:

But the way you would do this is you would basically commit to purchasing a family of improved products that are interoperable according to some pretty straightforward standard. You could base it on, you know, Australia and Europe have largely automated ATC at this point. And then you integrate Europe

Speaker 2:

automated automated

Speaker 3:

what is there already?

Speaker 2:

Europe automated air traffic control before?

Speaker 3:

Europe and Australia are terrible places to be a pilot. Right? Like, in The United States, you can fly VFR. Like, when I was a private pilot, I flew VFR. Visual flight rules.

Speaker 3:

Below 18,000 feet.

Speaker 1:

Not instrument. You can

Speaker 3:

you can do basically anything you want. And I don't know if the ATC systems could handle that quite yet. But when I was flying, we didn't have ADS-B rolled out very well. So, you know, you basically had to look out the window all the time and try and spot other aircraft. Yeah.

Speaker 1:

Yeah. Do you have a

Speaker 3:

It could get dicey in LA sometimes.

Speaker 1:

Do you have a strong take on the AI CapEx build-out? It feels like your company and a lot of the companies kind of of your vintage are very much aligned with, like, do something really ambitious that, as it scales, compounds. And all of a sudden, like, once it's working, you're marshaling. Like, when I think about the story of your business, I think about, you know, billions and billions of dollars flowing into more and more solar panels, and just this crazy flywheel once the technology gets working. We've talked about this in the past.

Speaker 1:

It feels like that's kind of happening in the context of, like, token factories right now. But is there anything that has given you jitters about the structure or how things are playing out? Or do you see it as, like, one narrative is just, it felt cool to watch big things get built. And I'm wondering how you're processing the AI CapEx build-out, all the crazy CapEx that's going on.

Speaker 3:

I think The United States has always done well when they're able to take a difficult problem and transform it into a form that can be solved by pouring money on it, and then pouring money on it. Yeah. Let's give it up for

Speaker 2:

Yeah. Pouring

Speaker 3:

And so if someone were to ask me, like, do you really think that every dollar being spent on hyperscaling right now is being spent the best possible way it can? No. Of course not.

Speaker 1:

Yeah.

Speaker 3:

Right. But is it nevertheless efficient, from a perspective of competition, to spend as quickly as they can to wrap this technology up right now? Yeah. Yeah. I mean, there's obviously a there there.

Speaker 3:

I use all the models every day. Yes. I'm a discerning customer, and I think that's quite transformational. I mean, they frustrate me as well. But, like, the direction of improvement is very clearly going in one direction.

Speaker 1:

Mhmm.

Speaker 3:

And this is not a race where The United States can afford to be anything but first. And I think it's a real question to wonder, like, well, what does the UN Security Council look like in fifteen, twenty years? When, you know, basically actually, I think all of the UN Security Council permanent members have their own nukes these days. And in the future, most of them will not have their own AI sovereignty.

Speaker 2:

Oh, do you think

Speaker 1:

AI sovereignty projects are not going to bear fruit because they're under-scoped or under-scaled? Because I feel like a lot of countries would say, like, yeah, you just answered the question. A lot of countries would

Speaker 3:

When you can fast forward that.

Speaker 1:

Yeah. We're doing a we're doing a one gigawatt data center. Like, we're good. We're checking The

Speaker 3:

United States is deploying hundreds of billions of dollars in this direction right now. Yep. And how much is I mean, China, I guess, is serious about it. Do you think Russia's serious about

Speaker 1:

Do you think The

Speaker 3:

UK and France are? It does not Yeah. What about Russia?

Speaker 2:

Russia's biggest clusters are, like, a thousand GPUs.

Speaker 1:

Australia, we did hear, is investing in quantum computing. Do you think that that's gonna be relevant?

Speaker 3:

No.

Speaker 1:

Can you say more?

Speaker 2:

Just sent the market down.

Speaker 3:

Oh, I can. I'll send it. Australia. Shambles.

Speaker 1:

No. No. No.

Speaker 3:

No. It's Australia, because Australia has, like, basically the best solar resource on Earth, and, you know, it's a modern Western liberal democracy with, like

Speaker 7:

Sure.

Speaker 3:

Rule of law and incredible natural resources and 25-plus million well-educated, wealthy people. Yeah. And every time there's a new wave of technology, they go, oh, never mind, mate. We'll catch the next one.

Speaker 1:

We'll this.

Speaker 3:

And drives me crazy. So so that's why I'm here. But

Speaker 2:

they have Kirra. Yeah. You ever surfed? You ever surfed there? It's one of the best point breaks of all time.

Speaker 3:

Well, yeah. Yeah. No. No beach is comparable.

Speaker 3:

And, like, I live in the San Gabriel Valley. Yeah. I come to Santa Monica about once every two years Yeah. under duress, I will add. And people say, oh, Santa Monica.

Speaker 3:

It's the most amazing beach. I'm like, yeah, maybe if you're from, I don't know, like, Canada or something. Yeah. But I grew up on Killcare Beach on the Central Coast of New South Wales.

Speaker 3:

I wouldn't swap all of Santa Monica for, like, a single bucket of sand from that beach.

Speaker 2:

It's the best sand.

Speaker 3:

Absolutely incredible.

Speaker 2:

Best sand on Earth.

Speaker 1:

You've been there? Yeah. You've surfed in Yes.

Speaker 2:

Australia? All over Australia. I

Speaker 1:

didn't know that. Oh, that's cool. Yeah.

Speaker 3:

TBPN down under. Yeah. We need to. Mark your calendars.

Speaker 1:

Down under. I I like I

Speaker 2:

like I love Australia. Australia feels like a just a very large

Speaker 1:

The Fosters California.

Speaker 2:

It feels like California is a continent.

Speaker 1:

A violet crumble. That'll get me going.

Speaker 2:

I love Violet Crumble. Last thing I was curious to get your take on: this company, T1 Energy, went super viral last week because they're just making a massive amount of solar panels. Everyone got really excited, including ourselves, saying, wait, we know how to build things.

Speaker 2:

And it turns out it was a Chinese company that was, like, a forced seller of their manufacturing facility. Do you think there's anything that we can learn from T1 Energy, and what the original kind of builders of this facility did in terms of, like, just scaling that out?

Speaker 3:

I I haven't actually heard the story, so I'm not the right person to comment.

Speaker 2:

Yeah. T1 Energy. I mean, you'd be interested in the story. It's an $8,800,000,000 stock. They're making one gigawatt worth of solar a year, and people got really excited.

Speaker 1:

That's tiny.

Speaker 3:

Yeah. I mean, like, globally, we'll make 1.4 terawatts of solar this year and deploy about 700 gigawatts. Yeah. So Yeah. A couple of years ago, I used to say it's about one megawatt per minute, but now it's closer to one megawatt every forty seconds.

Speaker 1:

Forty seconds.

Speaker 3:

So yeah.
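As a sanity check on those figures, a quick back-of-the-envelope calculation using the rounded numbers quoted above (so the results are only approximate):

```python
# Back-of-the-envelope check of the "one megawatt every N seconds" claim,
# using the figures quoted in the conversation (rounded, illustrative numbers).
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~31.5 million seconds

deployed_mw = 700_000    # ~700 GW deployed this year
produced_mw = 1_400_000  # ~1.4 TW manufactured this year

print(SECONDS_PER_YEAR / deployed_mw)  # ~45 seconds per deployed megawatt
print(SECONDS_PER_YEAR / produced_mw)  # ~22.5 seconds per manufactured megawatt
```

So "one megawatt every forty seconds" lines up roughly with the deployment figure; on manufacturing output alone it would be closer to one every twenty-odd seconds.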

Speaker 2:

We gotta get those Okay. Numbers

Speaker 1:

And and so is is are you, like, in

Speaker 3:

you Like, 2,000,000,000 modules this year.

Speaker 1:

Like, would you encourage one

Speaker 3:

for every family on Earth.

Speaker 1:

Would you encourage America or, like, the venture ecosystem or the government to, like, try and bring, you know, manufacturing capacity back to America? Because when you've talked about this before, you've always said, like, hey, if China's gonna be subsidizing it, just buy as much as possible. And that makes sense, like, economically. But over the long term, like, if you need to scale exponentially, there might be a case for reshoring.

Speaker 1:

Do you think that's a good idea?

Speaker 3:

Yeah. I think controlling the supply chain for solar technology is super important. And just because China currently leads doesn't mean The US can't lead in the future. Yeah. I mean, we've seen this in the past.

Speaker 3:

Right? Yeah. The industry is extremely dynamic. You wouldn't have to try all that hard to you know, if you had, like, a 5% edge over your competition, you could be the number one in five years, something like that. Right?

Speaker 3:

We just haven't seen factories in production in The United States.

Speaker 1:

Why haven't you looked into that, or why aren't you vertically integrating? Is that just something that might happen in the future?

Speaker 3:

I mean, it might be the last thing we vertically integrate. Sure. It's, well, it's not really a problem for venture or government. Right? It's a problem for banks.

Speaker 1:

Sure.

Speaker 3:

Right? It's like, buy the machine from Germany, put it in a giant building, and you start churning them out. So it's pretty well understood how to do this. Yeah. It's not really a technology development problem.

Speaker 3:

But that's my bread and butter. My bread and butter is, like, doing doing weird complicated stuff.

Speaker 1:

Is the reason that it's in China mostly labor cost, or environmental concerns, or regulation, or just the cost of capital? Do you have an idea of, like, I understand the whole story of why the iPhone landed there. I understand that story a little bit. I don't necessarily understand why solar panels went there specifically.

Speaker 3:

Yeah. I mean, Australia invested a bunch of money into developing this technology, and then Germany spent a bunch of money on manufacturing it at a steep loss. Mhmm. And both of them basically gave up at some point. And meanwhile, China looked around and said, we just had the Beijing Olympics.

Speaker 3:

We're kind of a big deal now. And, you know, we'd like to do something about Taiwan sooner or later. Mhmm. But the last time anyone got uppity in the, you know, East China Sea area, they did a lot of damage until The United States curb-stomped them by choking their supplies of energy. Mhmm.

Speaker 3:

And it turns out that neither Germany nor Japan had anywhere near sufficient supplies of domestic energy sources. And once the combined bomber offensive and the submarine war took out their ability to transport oil from where it was being produced to where it was being used, basically, both war efforts collapsed within months. So if you're China, you'd be like, where's my oil coming from? You know, twelve million barrels a day is coming from The Middle East. A long way away.

Speaker 3:

Via, like, the Strait of Hormuz and the Strait of Malacca, which we don't control, can't control, don't have a deepwater, blue-water navy to control, you know, via our historical and current geopolitical adversaries, who kind of allow the oil to pass by because they're playing nice today. Right? Like, maybe it's actually something we should spend on. Yeah. Because The US sorry.

Speaker 3:

The Chinese political economy is much more kind of expansionary in its spending. It's much more supply driven rather than demand driven. So if we're gonna spend a bunch of money on, like, you know, more or less inflationary projects, we may as well spend it on giant solar panel factories. And they got really good at it. And Yeah.

Speaker 2:

Know, all

Speaker 3:

credit to them. Yeah. Like, it's a bloody lot of hard work to do that, and they did it. And now, basically, we all benefit from that, you know, if we're willing to buy from them. But that doesn't mean it's something that's, like, intrinsically impossible for The United States to do.

Speaker 3:

You know, it's it's just a question of will. It's a question of who who actually wants to go and do this at scale.

Speaker 2:

Mhmm. Always a pleasure.

Speaker 1:

Always a pleasure.

Speaker 2:

Thank you for taking the time. After you Wow. After you create abundant energy, we're gonna make a bid for you to, you know, invest more of your time in podcasting. Yes.

Speaker 1:

Yes. We do enjoy this.

Speaker 3:

That'd be hard to do at this point. Yeah. Really sure.

Speaker 1:

Yeah. You have. You've been on a tear. Congratulations. You've been, you've been all over.

Speaker 3:

Well, I know your secret. That sitting around talking about stuff is is incredibly fun.

Speaker 1:

It's so much

Speaker 2:

It really is.

Speaker 1:

Well, thanks for doing it with us today.

Speaker 2:

Great to see you.

Speaker 1:

Come by.

Speaker 10:

You too.

Speaker 2:

Talk soon.

Speaker 1:

Happy Veterans Day. We'll talk to you soon. Let me tell you about Fin.ai, the number one AI agent for customer service: number one in performance benchmarks, number one in competitive bake-offs, number one ranking on G2. We have a surprise guest joining us in just a few minutes. We'll see if he can join in the Restream waiting room.

Speaker 1:

He says he's down to join. In the meantime, let's go through some more posts.

Speaker 2:

What else Back in the timeline.

Speaker 1:

Timeline. There's some news in the Wall Street Journal.

Speaker 2:

You see some of these images leaking from Nano Banana 2?

Speaker 1:

No. What is this?

Speaker 2:

They are a little too, photorealistic for for my Interesting. For my preference.

Speaker 1:

Okay. What's going on here? Oh, wow. Okay. So it's got Jeffrey Epstein and With Diddy.

Speaker 1:

Diddy. Together. These are very accurate images. They seem remarkable third term. Yeah.

Speaker 1:

So this is from Roberto Nickson, who came on our stream at Meta Connect. He says: wild times. Nano Banana 2 apparently leaked for a few hours today on some platforms, and the limited results by a handful of users have been pretty insane. The model was apparently from an uncensored, slightly older checkpoint. When it actually launches, it won't be this uncensored, of course, but crazy to think what a handful of engineers at the frontier labs have access to, and the chaos they could cause if they so wished.

Speaker 1:

I wonder yeah. It's so interesting. I would love to actually trace where these leaks came from, what the whole flow is, because it's possible that there's, like, some Photoshop in here as well. Like, if you're in the business of creating a leak, or a fake leak, you can do all sorts of funny things. There was a what was it? There was someone who what did they do?

Speaker 1:

They went and filmed something normally, and then they went into After Effects and made it look like they were on a green screen for it. And they made fake behind-the-scenes footage for their real footage. So they actually went surfing, and then they made a CGI version behind the scenes. Just like that, there's that Apple trailer now for the introduction to Apple TV. And they say, we made it all, practically.

Speaker 1:

We did it all practically. Let me see. We need an image for our next guest, because we are not doing video. We are just doing

Speaker 4:

He is semi

Speaker 1:

non But we have Growing Daniel in the Restream waiting room. Let's bring in Growing Daniel into the team.

Speaker 2:

There he is.

Speaker 1:

Growing Daniel, how are you doing?

Speaker 7:

Doing really great. How are you guys doing?

Speaker 1:

Fantastic.

Speaker 2:

You're doing well. It's great to finally have you on the show. It only only took

Speaker 1:

Very excited to have you.

Speaker 2:

Invites, but you've been busy, and you finally made it.

Speaker 7:

It's an honor to be here. I didn't I didn't have anything to say before.

Speaker 2:

I'm glad. Now you do. I'm glad. Yeah. Yeah.

Speaker 2:

You're launching the moral discernment company of San Francisco.

Speaker 1:

Getting into moral discernment. Finally. I would love to know what that means. Like, is this a vibe shift on your part, or do you feel like this is a continuation of your, you know, affinity for moral discernment? Walk me through how you evaluate, like, what is or is not a good startup idea or, like, a virtuous one.

Speaker 1:

Like, Trae Stephens has that good quests philosophy. Like, how do you think about judging the work of the technology industry that we see every day on display on X?

Speaker 7:

I guess it's not really complicated to me. I mean okay. So if you're selling, like, gambling, then, you're basically, selling drugs to people. Mhmm. Like and everyone knows that.

Speaker 7:

Right? I didn't think it'd be a big thing to say, but you should wanna build things that are good for people. And I think that's exactly what the pope was saying. If you are engaging in the act of creation, then you should try and create something that's actually good for people. And I think over the past ten years, we've had this, like, really weird, I guess, school of moral philosophy that we all collectively call woke.

Speaker 1:

Sure.

Speaker 7:

It's been really dominant, and it's had an extremely aggressive priest class that tries to coerce everyone into bending the knee to it. And nowadays, that's not a very popular perspective, thankfully. I think that fever kinda broke. But I don't think what replaces it is nothing. I don't think that's a good solution.

Speaker 7:

I think everyone should want to help.

Speaker 2:

Actually, you cannot criticize technology at all. If it's software, you cannot criticize it. We're banning you. We're making it illegal to identify negative externalities of any technology product.

Speaker 7:

Oh, and a lot of this stuff isn't even externalities. It's literally just, you're selling drugs to people. Oh, yes. Core product. Right?

Speaker 7:

Like, it's the core product. It's not like somebody got, you know, left fielded walking down the street by gambling apps.

Speaker 1:

It's Okay. Funny.

Speaker 2:

People people used to be criticized, you know, three years ago. Was like, stop building enterprise SaaS. Yes. And people were like, okay. I'll build gambling apps.

Speaker 1:

Okay. But on the gambling issue, walk me through how you think about the utilitarian calculus of, like, what is or is not gambling. Because when I think about, like, putting money in an S&P 500 ETF in an IRA that I hold for thirty years, I don't think of that as gambling. And if you're a professional fund manager and you're managing retirements, like, that's not gambling. There is risk. But then you go down to, like, zero-day options, and it starts to look a lot like gambling.

Speaker 1:

And so do you believe in this, like, moral relativism? Do you think that there's a clear line, a bright line, like, is or is not gambling these

Speaker 2:

days? Income.

Speaker 1:

I mean, yes. Sometimes they're just interfaces to it. But is it, is it "I'll know it when I see it"?

Speaker 7:

I don't even think gambling's wrong, by the way. Like, I think it's fine to get together with your friends and play poker on a Sunday or something. Like, that's totally fine. So as far as, like, gambling goes, I'm saying that if you're gonna dedicate your life to building something, which is what you do when you start a company, it's not like you're just, like, you know, doing something as a hobby, then what I'm saying is that you should reflect morally.

Speaker 1:

Yes.

Speaker 7:

Like, I'm not here to tell you what is right and wrong, but the pope's entire point was that you should think about that and try and do good things. And I didn't think anyone on Earth, I guess one person for sure, had a problem with that. And so I don't have the moral guidebook. Well, there is a guidebook. I own a copy.

Speaker 7:

It's called the Bible. Yes. But I cannot, you know, extrapolate that to every modern situation. But I do encourage everyone who's building anything to think about what they're building, what good it is, is it actually helping people. And by the way, B2B SaaS doesn't get me up in the morning, but, hey, that's honest software that actually, you know, solves people's problems.

Speaker 2:

Honest software.

Speaker 7:

To build, I think.

Speaker 1:

I like that. I like that. I mean, so let's get into steel-manning the other side of this because

Speaker 5:

Mhmm.

Speaker 1:

That's what's fun. It may make some good content. So it's okay to, you know, hit the casino, the tables, like, once every couple years, you know, on a vacation or something. Isn't that what Andreessen's doing? Like, by total, by net asset value, by the actual deployment of the capital, like, 99% of the dollars are going into honest B2B software, Databricks.

Speaker 1:

It's going into things that feel virtuous. And then, yes, they do have Speedrun, which is an incubator. And, yes, within Speedrun, there are some crazy, crazy companies, and I'll give you that. But as a percentage of the capital that Andreessen has allocated, we are in the, yeah. Once a year, he goes to Las Vegas.

Speaker 1:

We're not in the "he only funds gambling." So how do you react to that?

Speaker 7:

I think that if you're going to Vegas, then whether you're betting black or red is an extremely amoral situation.

Speaker 6:

Okay.

Speaker 7:

When you come out and you put $15,000,000 into Cluely, I think that is not an amoral situation. You made a decision there about what you wanna fund, what you wanna see in the world.

Speaker 1:

Okay.

Speaker 7:

And again, you know, obviously, a16z will do what it wants. Right? Like, they don't have to listen to me. But for a lot of people, I think the Cluely investment was a real moment, at least for a lot of my founder friends. Totally.

Speaker 7:

That was a moment where it was like, oh, okay, we're just literally gonna fund whatever retarded thing now. No matter, you know, whether or not it's good, it's just more vibes, more, you know, bigger tweets, come on. And, you know, just try to keep this constant sense of excitement going. And for what?

Speaker 7:

For for this cheating app? Like, this is stupid. Like, what are we doing? I didn't come out here to build cheating apps. I didn't come out here to build gambling apps that advertise that they can, you know, help people gamble under the age of 21.

Speaker 7:

Like, these things do not seem amoral to me. They seem actually immoral, and that's a decision they're making. And I would encourage them to not make that decision.

Speaker 1:

Well, the good news is that it does feel like Cluely has pivoted to just B2B SaaS, I guess. So maybe it's total victory SaaS victory for honest SaaS. I have one more Yeah. I guess I have one more steel man

Speaker 2:

that I'll Go go for it.

Speaker 1:

Daniel With helmet. Toy with the helmet. I should get the helmet out. Although, I don't know. Can you see us?

Speaker 1:

Or is or can I

Speaker 7:

can see you right now? Yes.

Speaker 2:

Okay. Good. So John's gonna throw on the the helmet. The helmet. Get locked in for the second helmet is the steel helmet.

Speaker 2:

Steel man.

Speaker 1:

So if I wanna steal I

Speaker 7:

was really hoping it'd be a crusader helmet. No.

Speaker 1:

Maybe. So this is the steel man helmet, which is which is Okay. For when you're steel manning a point. And so it's very hard to wear. But

Speaker 7:

As it should be.

Speaker 1:

Yes. Heavy is the head that wears the steel man helmet. So the second steel man is: while I agree with you that you should not mock the pope, you should not try and dunk on the pope, it's kind of rude, I would say that potentially it is okay to critique the pope for trying to set regulatory frameworks around business.

Speaker 1:

And I would cite the The Wall Street Journal's postmortem, like, the obituary on the previous pope and kind of look back on some of the stuff that the pope had done that was, maybe leaning too heavily into some of the environmentalist trends and maybe caused too much of a shift that actually sort of hurt the quality of life in Europe and America. And, and it was not it it wound up being incompatible with human flourishing. And so the pope is maybe not always correct, and so there does need to be some level of dialogue between even Catholics and the pope.

Speaker 7:

John, did you just say the pope isn't always correct?

Speaker 1:

I'm saying I have the steel man helmet on for a reason.

Speaker 5:

Yeah. Of course,

Speaker 7:

you could say that. Of course you don't even have to wear the helmet. That's a totally reasonable take. Of course the pope's gonna say things that are incorrect. And especially, I think, on moral issues, on political issues, the pope does not just reign supreme such that we can't disagree with him.

Speaker 7:

Yes. I do think that if the pope makes a very, like, you know, anodyne and earnest post about how we should care about the things that we're building and the effect they have on people, then you shouldn't mock him. Yes. Yes. And especially, I mean, if you're some, you know, lowbie, that's whatever.

Speaker 7:

But if you're Marc Andreessen, come on, man. So I guess, seeing it from him, I was like, that bothered me. But, no, absolutely, obviously I disagree with, you know, probably many things the pope believes on political issues.

Speaker 1:

Sure. We do have a question from the chat. Why is the pope above being mocked? I think we touched on this, but is it just is it just purely respect? Do you think all people should avoid mocking the pope or specifically Catholics, specifically religious people, specifically Americans?

Speaker 1:

Is there any more nuance to who gets to mock? It sounds like lowbies maybe could mock the pope if they're, you know, some sort of a non

Speaker 7:

Stop them.

Speaker 1:

Silly. Yeah. She is lower stakes.

Speaker 2:

Young growing Daniel. Young growing there's a young growing future growing Daniel.

Speaker 1:

Would would you have mocked the pope two years ago? Probably.

Speaker 2:

I think my view is just not to interrupt, but I'm guessing you agree. It's just, it should be okay to expect like, the technology industry broadly

Speaker 1:

Mhmm.

Speaker 2:

Has so much power. It impacts our lives in so many different ways that people within the industry should criticize it. People outside the industry should criticize it. We should we should expect that people understand the weight of their work even if it's not in defense tech or national security related, etcetera. Like, the work impacts our lives and it impacts people that are outside the industry and to basically I don't even know.

Speaker 2:

I mean, part of this is, like, this was such a new meme format. I was trying to clock, like, okay. What does he actually mean by this? This is, like, this is, like, a He was

Speaker 7:

so bad at applying it.

Speaker 1:

Did you watch the full interview, the GQ Sydney Sweeney interview? It was very interesting because it was an interview for an event that GQ holds that's called Men of the Year. And it was an interview between two women. And so I think that they have, like, genericized the term, but they stuck with giving an award for Men of the Year, and then they give it to whoever or something. It was very confusing branding for me.

Speaker 7:

It's a man.

Speaker 1:

But then also, like, there was a very weird it felt like a little bit of a Rorschach test actually watching the interview, because when I actually went and watched the YouTube video, it feels like they're having a ton of fun with each other for, like, ten minutes, and then they do get into that one question that's kind of sharply worded, and she kind of dismisses it and says, like, look. I'm not

Speaker 2:

Why would you try to understand the source material, John? It's just a meme.

Speaker 1:

I I I I get it. I don't know. What what

Speaker 2:

do No. That that was that was part of my, like, you know, Mark Mark applied it in a bunch of different ways in a very short period of time. Right? So

Speaker 1:

it's Yeah.

Speaker 2:

He was he was a little trigger happy with it.

Speaker 1:

I don't know. Oh, well.

Speaker 7:

I think I think Mark got excited that he thought there was a new image that you could quote tweet someone with, and it would just own them every time. And he wanted to go apply that in as many places as possible before the cool points ran out because of people like Mark applying it and and running down the cool clock. So he wanted to get as many of these out as possible, and that's what he was trying to do. And I don't think he understood the source material, but he paid the price. Yeah.

Speaker 2:

He had to delete it. It was definitely anybody out there that was thinking about mocking the pope, I think, is gonna at least think twice.

Speaker 1:

Yeah. The pope the pope is, like, level 9,000.

Speaker 2:

Have you been leveraging that prayer? We covered the prayer. Yeah. I forget which church it's in exactly, but the prayer

Speaker 7:

Star of the sea?

Speaker 1:

Yeah. Yeah. Have you actually

Speaker 2:

You leveraged that at all? That?

Speaker 7:

Oh, I leverage that every week. Absolutely.

Speaker 2:

That seems like some some serious alpha. That that should have become the current thing. There should have been lines out the door.

Speaker 7:

It's insane that it's available, for a 50¢ candle. You can go and pray for San Carlo to intercede for you, who is in heaven, by the way, hanging out with God. And he will intercede on your behalf for your technical problems. Incredible. Incredible.

Speaker 7:

Star of the Sea Parish. San Francisco, everyone.

Speaker 2:

I can see VCs, you know, actually putting that in term sheets going forward. You need to take 50¢

Speaker 1:

Who's the patron saint of capital allocation? If it's like, I think I'm going into a down round. I think my portfolio is about to tank. I need to pray.

Speaker 2:

Pray for us. What do you think tech is getting right right now? Anything?

Speaker 1:

Oh, yeah.

Speaker 7:

Oh, I think we're in a way better place. So I don't know. I came out here in 2018. From then till '21, most of tech Twitter was crypto Twitter,

Speaker 2:

and

Speaker 7:

that sucked. And so now we're actually building tools that are useful. Many AI tools are actually legitimately useful, which is a radical change from when I was out here in those three, four years. So I think a lot of things are going really well in tech right now. I think that, AI will continue to creep into every crevice of, our capital structures and our processes throughout society.

Speaker 7:

I think that's awesome. I think it's gonna save people a bunch of time. It's gonna make people's lives much better. So, yeah, I think tech's doing really great overall right now.

Speaker 2:

Why do you think there's so much darkness on the timeline? John and I were talking this morning. It feels like more infighting than ever.

Speaker 1:

Flows. It it feels like it ebbs and flows. But right now, yeah, it feels like there's a lot of infighting right now.

Speaker 7:

Infighting like what?

Speaker 1:

I mean, you and Mark going at it was pretty much the pinnacle of infighting. But it does feel like there's just generally been, like, a little bit more spice. And from my perspective, it feels like it's because there are actually questions about, like, are we in a bubble? Is this about to pop? People are getting, like, sort of defensive.

Speaker 1:

Like, oh, like, maybe I shouldn't be even if you're not, like, oh, I have my entire net worth in this one thing. There's even just, like, I don't wanna look stupid for overly promoting something that's about to sell off by 20% or 50% or something like that. And so it just feels like people are in a little bit more of a defensive posture online. And I was wondering, I don't know, do you feel that, or do you not feel that?

Speaker 7:

I don't feel that. The only thing that I feel is weird lately, in San Francisco and in tech, is that we attracted a lot of, like, kinda grifty, signal-y people.

Speaker 1:

And

Speaker 7:

there's, like, people messaging me about how to, like, you know, establish their image in San Francisco and stuff. I'm like, what are you talking about? Like, what are you doing? Like, literally, like, if you like building things, then come find some guys or whatever. So come find some people that like building and talk to them.

Speaker 7:

And and if you vibe, then you can work on something. Right? Like, I don't know. It's how it always was. And so there's a lot of people that are, like, now moving in and trying to, like, you know, get their headshots done.

Speaker 7:

Like, they're moving to LA. And you don't have to do that. You don't have to do that, man.

Speaker 1:

Yeah. Yeah. I I I I think

Speaker 2:

it won't be the top until you have founders pitching at demo day in Chrome Hearts and Rick Owens. When that happens, it's over. Sell everything.

Speaker 7:

I hope the top comes before the end.

Speaker 1:

What is, what is the correct way to launch a new startup today?

Speaker 2:

Is it a is it a is it a ninety second vibe reel?

Speaker 1:

I feel like a lot of stuff's played out, but, like, is there anything that you think is, like, actually the correct way to do it?

Speaker 7:

I think first you should have a Twitter account with 175,000 followers. Yeah. Step one, and for sure post constantly until you do that.

Speaker 1:

Yeah. Post constantly.

Speaker 7:

No. I, I don't know. It's different for everyone. Like, that's the thing. It's like, I don't know.

Speaker 7:

I always hate, like, startup advice because it's like, literally, it is different for everyone. Like, literally, if it were automatable, then it would be automated. Sure. And so it's like, what is your situation? Like, I don't know.

Speaker 7:

That's true. But you gotta figure out the way in which you're weird, you know, and work on that. That's usually your comparative advantage.

Speaker 1:

Yeah. I like that. I like that. Just just the advice is, like, avoid the templates, basically.

Speaker 2:

Do you get any inbound customer interest? Like, does anybody ever put it together?

Speaker 1:

Least monetized account with 179,000 followers.

Speaker 7:

I got one. One customer.

Speaker 1:

Let's ring

Speaker 7:

the gong. We want the gong. Hell, yeah.

Speaker 1:

Customer from X. Congratulations on your one customer.

Speaker 2:

I mean

Speaker 7:

I That's probably really

Speaker 1:

It's great.

Speaker 7:

It's one more than I thought I would get. We won't talk about my industry here, but I definitely work in an industry that is not a Twitter industry. Yeah. And so, I never expected it. I really should have built DevTools or something. But, yeah.

Speaker 7:

So my Twitter is not really related to my real life at all. But, yeah, we did get one out

Speaker 2:

of it. I think at some point, you could just tell all the people asking you for advice, you know, on personal branding, just say, fine. I made a course. It's $10,000.

Speaker 1:

I I the $10,000 course It's key.

Speaker 2:

The $10,000 course it's effectively non dilutive funding for the company.

Speaker 1:

Yeah. You can do it like a as a joke.

Speaker 7:

I don't think it would be good for the world. That's true. I'd have to actually make it good, or I feel like I'm just ripping people off.

Speaker 1:

I don't wanna be Moral discernment strikes again.

Speaker 7:

I am probably the least except, no, except Rune. I'm probably the least monetized person on Twitter. Yeah. Probably. Probably.

Speaker 7:

I see people at 20,000.

Speaker 1:

Hey. That's getting hard. Am the most

Speaker 7:

so inspiring, brother, out there. I will say we have an announcement from my real life. The council has met. We've decided we are hiring an engineer.

Speaker 2:

Woah.

Speaker 7:

Holy shit. So DM growing underscore go on x d m and see everything app and see which app app x is.

Speaker 1:

Yes. Yes.

Speaker 7:

Go and find growing underscore daniel, and you can send him your LinkedIn or your resume. If you wanna live in San Francisco and you can pass an FBI background check, both those things have to happen. And you are an

Speaker 2:

go get

Speaker 7:

engineer, then you can come work with Growing Daniel.

Speaker 2:

Every truly anon poster, you're qualified.

Speaker 1:

You are the most doxxable person in the world. No. Thank you for taking the time to come on.

Speaker 2:

No. We will we protect

Speaker 1:

the city. We protect the anons. We protect the anons.

Speaker 7:

Thank you, guys. I appreciate it so much.

Speaker 2:

Very exciting role. I'm gonna message some people and say Yeah. Hey. Go join the moral discernment company in San Francisco.

Speaker 1:

I respect anon so much. Like, I don't even save, like, full names on my phone or anything. The first time Growing Daniel sends me his email, for some reason, it's just his full name in his email. Like, first name, middle name, last name. Like, dude, you should

Speaker 7:

Mike, if

Speaker 3:

I could do it again. Do it again.

Speaker 1:

But I respect you.

Speaker 7:

Thank you so much for having me.

Speaker 1:

Yeah. It's

Speaker 2:

great to see you.

Speaker 1:

Great to see you. Great to talk

Speaker 2:

to I love watching you build and

Speaker 1:

Yeah.

Speaker 2:

And, have fun on the Internet.

Speaker 1:

And You know how to do it.

Speaker 2:

We both have come a long way. I think we met up we had a very LA meeting many years ago. Oh, yeah. Got together at Erewhon. There you go.

Speaker 2:

I don't remember what we got, but but, it's been it's been fun, to watch you you build and, entertain us all. So thank

Speaker 7:

See you guys.

Speaker 1:

See you. Goodbye. Let me tell you about some honest b to b software.

Speaker 2:

There we go.

Speaker 1:

Attio, customer relationship magic. Attio is the AI CRM that builds, scales, and grows your company to the next level. You can get started for free. We have JD Ross from WithCoverage in the Restream

Speaker 2:

Cofounder of Opendoor.

Speaker 1:

Cofounder of Opendoor. We're back to that. It's all Opendoor week here. How are you doing, JD?

Speaker 1:

Good to see you.

Speaker 10:

Great guy. What's happening? Oh, wow. I love the effect.

Speaker 1:

Sometimes it's a little bit of a jump scare getting you into the UltraDome from the Restream waiting room. You can see you're on display here. You got some news for us. What's up? I'd love to ring the gong.

Speaker 1:

First, why don't you introduce yourself? Why don't you give us a little bit of background?

Speaker 2:

He says he likes one sound effect. He's like, I

Speaker 1:

that sound effect. You got an you got the bald eagle. Let's hit the bald eagle. It's how

Speaker 2:

Bald eagle?

Speaker 10:

I'm in Texas.

Speaker 1:

He's in Texas. There we go. How are you doing? What are you working on these days?

Speaker 10:

Yeah. I I mean, I'm working on honest b to b software.

Speaker 1:

I'm on

Speaker 10:

extremely honest software.

Speaker 1:

Fantastic.

Speaker 10:

I think, you know, you had Kaz on yesterday, by the way, talking about the importance of having, like, the right leader in a business. You know? Like, he's really amazing. That is a hard business.

Speaker 1:

Yeah.

Speaker 10:

It's like, OpenDoor is I think we had a core value called we eat basis points for breakfast. Yeah. And a basis point is one hundredth of a percent, and so we're, like, fighting for each edge of a percent to cut off the cost to make the business good back in the day. He's obviously a better operator than I am. So I'm working in the opposite business now.
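(For reference, the basis-point arithmetic mentioned here is easy to sketch; the dollar figures below are made up for illustration.)

```python
# A basis point (bp) is one hundredth of a percent: 1 bp = 0.01% = 0.0001.
def bps_to_fraction(bps: float) -> float:
    """Convert basis points to a decimal fraction."""
    return bps / 10_000

# Shaving 25 bps of cost off a hypothetical $400,000 home transaction:
savings = 400_000 * bps_to_fraction(25)
print(savings)  # 1000.0
```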

Speaker 10:

It's an insurance business where you have, like, 99% retention and 99% gross margins. Wow. Because I'm lazy, and I don't wanna fight anymore. And it's WithCoverage. And, basically, the background was a couple years ago, you know, I was talking to my buddy Max, who's my cofounder, and I was like, look.

Speaker 10:

In every company I've started or been in, we've had great, like, legal counsel. I've liked my tax counsel. My insurance guy and everyone else I know's insurance guy is kind of an idiot. Like, they don't really know what we do. They don't really know what's right for our business.

Speaker 10:

Yeah. And so we're like, we could probably do this better. Let's just mess around for a bit and see what we can make happen. And it's now two years later. We have well over 500 clients, $10,000,000 in revenue, profitable the whole way.

Speaker 1:

There we go. That's great. What

Speaker 2:

Profitability. Not something you see. We're gonna hit the gong for profitability. Real quick, before we dive more into coverage, I wanted to ask you about Opendoor's new feature.

Speaker 2:

Kaz was talking about, you know, introducing, returns to, home buying. I I think you could make the case that it would just increase overall sales velocity if people knew, like, hey. I can buy this. And if I don't like it, I can return it. But when you talk about eating basis points for breakfast, like, you know, returns end up being, like, a huge, like, line item or expense for a lot of companies that sell physical goods.

Speaker 2:

But what's your take?

Speaker 10:

My my view is you always wanna have the right incentives in a business. Mhmm. And returns is a good incentive in a business. Like, if you know that you can be returned, you are gonna keep your quality bar high. Mhmm.

Speaker 10:

You're gonna make sure that you are selling with integrity, that there's true value in the product. And if it can't be returned, you're gonna sell as cheap as you possibly can.

Speaker 1:

Now this is like the lemon law you have in cars. Right? It's like Yeah. You don't wanna sell lemons because you will be forced to buy the car back if there is a problem with it.

Speaker 10:

Exactly. Yeah. And the same thing for a house. Right now, everyone's biggest fear when buying a house is did I buy a lemon? Because it's every dollar I probably have.

Speaker 10:

Yep. It's, you know, it's, like, eight x the largest purchase I'm going to make in my life versus, like, a car or anything afterwards. Yeah. It's scary, and you only do it a few times in your life. And so you're kinda trusting this person you met probably a month ago as your adviser who's very incentivized to get the deal done.

Speaker 3:

And so you wanna be able to

Speaker 10:

have that confidence point. And if you're like, hey. This is gonna affect the bottom line. Well, only if we're selling bad product. Sell the right product.

Speaker 1:

Right? Right? I have a friend who bought a house, gets into it, and, like, a year in discovers that, like, it has a bad foundation. And I was like, that sounds like a catastrophic loss of that house. Like, I Not the part

Speaker 10:

of the house you wanna have a bad issue with.

Speaker 1:

Exactly. It's like if the chimney on the top

Speaker 2:

protects Okay. Just pick up the whole house, we'll fix the foundation.

Speaker 1:

Fixing the roof feels way easier than the than the foundation. Yeah.

Speaker 10:

Just get under there.

Speaker 1:

Yeah. So what are your clients actually looking for on the insurance side? I mean, we have a variety of insurances here. Is it all one deal? Like, does it get crazier as the companies get bigger and bigger and bigger?

Speaker 1:

What are we talking about? We

Speaker 2:

had an experience getting insurance for TBPN Yeah. I couldn't tell you how many times they would come back for a new signature. And I was like, I've signed for this insurance, like, six times now. It's like, oh, there's another doc. You didn't do this one doc.

Speaker 2:

And I'm like, oh, well, didn't need to know this doc was coming.

Speaker 1:

In one doc. Put all the docs in one doc.

Speaker 2:

You didn't know about this doc or the last one or the last one or the last one? I gotta keep signing.

Speaker 1:

Yeah.

Speaker 2:

So Yeah.

Speaker 10:

So I think the way first of all, the hardest part of our business by far is just getting the first call Yeah. Because it's a thing no one wants to care about, unless you're talking about our insurance clients in construction, where it's a huge deal for them. Most clients, it's just like, just get the first call. Once we have that, they're like, oh, wait. This is much, much better.

Speaker 10:

It's like Ramp versus, you know, any expense management system you had in the past. No one was thinking about changing out the expense management system. But then once you have a much better risk management system, in our case, now your entire life as a CFO or, you know, GC gets a lot easier, because this whole thing that you know is very important and are very afraid of but don't understand is taken care of. And so our view is we replace the traditional broker with a tech platform. We pair them with an industry expert who's basically your fractional risk manager.

Speaker 10:

And instead of commissions, we try to align the incentives with a fee so that our job is to save the money instead of making them spend more money year over year. Mhmm. And those three things, the platform, the plan, and the team driving savings has generally worked really well. And the types of insurance depends on who we're talking to. Right?

Speaker 10:

Like, for a restaurant or a manufacturing company or, like, a next generation defense manufacturer, it's workers' compensation typically, and there's contractual risk. If you're talking about fintech, it's, like, cyber insurance and errors and omissions and things like that. For tech companies, usually D&O and then benefits. And so our job is just, like, build a point solution at each stage.

Speaker 10:

This software has gotten so much easier to build than years ago. Right? AI makes it so you can just you can launch new features when you have the right foundation. It's not cracked, and you don't have to return it a few years later. When you have the right foundation as a business, you can build these things much easier now than you could in the past.

Speaker 10:

And our clients really like, they get a lot of value from that.

Speaker 1:

Yeah.

Speaker 2:

That's Have you thought about buying existing insurance agencies? I've heard of at least one company, a portfolio company of a fund, doing this. It's just a hot

Speaker 1:

trend right now with the AI thing. It's like, pull it forward by rolling up some legacy thing.

Speaker 2:

JD, it'd be great if you could figure out a way for us to give you another $50,000,000. I do not like how profitable you are. This is

Speaker 1:

a big problem for us.

Speaker 10:

Yeah. Literally. I think that's part of it. I think people are chasing opportunities with dollars before understanding the opportunities.

Speaker 1:

Sure.

Speaker 10:

Like, insurance is a very, like, acquisitive space, these brokerages, because it's been the only way to grow. I think we might be the only one I know of who's doing purely organic growth where you're saying, hey. Mhmm. You know that guy you golf with every weekend? Forget him.

Speaker 10:

Come on over to us. Mhmm. And having that work. Yeah. And I think it's because the whole industry is broken.

Speaker 10:

Like, everyone kinda feels fleeced by their guy, but no one in that space actually wants to innovate because it's the most boring one. It's the most boring industry on earth. Like, no one wants to talk about this or think about it. And even if it's your job, you generally don't wanna think about it. And now with AI, it's finally, like, ripe for disruption on the platform side.

Speaker 10:

But if you're acquiring brokerages, you have to convince this 50 year old golfer, whatever, to change their operations. Like, no way.

Speaker 2:

Yeah. I noticed you don't have the words AI agents anywhere on your website. Is that intentional?

Speaker 10:

We didn't even have a website until, like, a month ago.

Speaker 2:

Yeah. But no. But it feels intentional, where, like, other people building this business would be selling this as, we're building AI agents for insurance and risk management. But it feels like businesses don't really care.

Speaker 2:

They're not used to using agents that are awesome yet. And so that's not really a selling point. I'm curious how you think about how software is evolving and how you think about, like, integrating AI for the user as well as kind of back office to actually deliver value.

Speaker 10:

Definitely. I think right now, the storyline for AI on the foundation model side is really obvious. Right? It's just increasing general capabilities. And I think the pure AI software play where you're just saying, hey.

Speaker 10:

We're selling AI into businesses to make them more efficient isn't working as well as people want. In certain areas, it's really going gangbusters, like software engineering and, you know, content generation, things like that. In others, it's just not performing as well. Our view is that it's much easier to build an AI native business than it is to build the software and sell it into people who don't know how to even adopt this stuff yet.

Speaker 10:

Yeah. Over time, I think maybe it'll get easier as the entry points make sense in each of these businesses, but it's very experimental right now. And so we're very, very focused on, from the ground up, how do we build a business that works in an AI native way. Right? Like, where our clients can just say, okay.

Speaker 10:

I'm buying something I already understand. And behind the scenes, yeah, when you request a certificate of insurance, an AI agent does that. Right? It reads through this 200 page PDF and says, are you qualified to get this data or not? Okay.

Speaker 10:

Great. We're gonna email your insurance carrier, blah blah blah, and it comes back and works. To the customer, it's just magic. They just get what they want. They don't need to know how.

Speaker 10:

Like, it's like you don't wanna know where the sausage meat comes from. You just wanna get your sausage.

Speaker 1:

Last question. Do you have any mattress companies that you're working with?

Speaker 10:

We work with this incredible business called Eight Sleep.

Speaker 1:

Let's go. That's my next ad read. Thank you so much for coming on the show. I'm gonna transcribe

Speaker 2:

setup. Well, you're you're our new, insurance correspondent. So anytime there's some hard hitting insurance news, just

Speaker 1:

Word it out.

Speaker 10:

Be the risk guy.

Speaker 1:

Risk guy.

Speaker 10:

Yeah. Whenever things go wrong, just call me in.

Speaker 1:

Yeah.

Speaker 2:

If something I mean, careful what you wish for.

Speaker 1:

I mean, the big question actually is, like who is it? Deutsche Bank is, like, creating insurance products around the AI data center leases to try and hedge the risk there. That's a fascinating story. But we'll have to jump into it another time because we're running late. But thank you so much for coming.

Speaker 2:

Great to finally have you on and come back on anytime.

Speaker 1:

Yeah. We'll talk to you soon. Have a good one. How did you sleep last night, Jordy? I got a 76.

Speaker 2:

I still am away

Speaker 7:

from my Eight

Speaker 1:

Sleep. Yes. Yes.

Speaker 2:

Gotta move it over.

Speaker 1:

I'm fighting against myself then. Let's get to Tyler's cam. Just check in with Tyler. Oh, what happened there?

Speaker 2:

Oh. Oh.

Speaker 1:

Oh. What happened there? Oh. You can't see. You can't see.

Speaker 1:

He backed down. He backed you can see that the microphone is moving upwards because he lost the strength to stand at his standing desk and put his standing desk down.

Speaker 4:

I didn't lose the strength.

Speaker 1:

You lost the strength. You

Speaker 4:

I like to move around

Speaker 1:

that room because sometimes

Speaker 4:

I like to sit down.

Speaker 1:

Yeah. Yeah. Okay. Well, you can sit down. That no problem.

Speaker 1:

No problem. Our next guest is Josie Zayner, who is, building a literal unicorn using DNA editing to

Speaker 2:

I will be right back.

Speaker 1:

To modify horses. Is this correct? I mean, welcome to the show. How are you doing? Good to see you.

Speaker 9:

Hey. What's up, John? How's it going? Yeah. It's great.

Speaker 9:

I mean, we're gene editing animals. I think animals are kinda like the, you know, software of reality.

Speaker 1:

Fantastic. And Software of reality. What a turn of phrase. I love it. What like, what does this actually mean?

Speaker 1:

Is it I know.

Speaker 9:

Especially nowadays with all these people

Speaker 1:

Sorry. There's a little bit of a delay. Yeah.

Speaker 9:

No. It's not theoretical. Here's the thing. I know. K.

Speaker 9:

Where am I? I'm in my lab in Austin, Texas.

Speaker 1:

Oh, fun.

Speaker 9:

Doing. Is the delay still there? Can we hear it?

Speaker 1:

It is. Why don't you give me an introduction on the company, kind of where you are, and I'd love to know what your actual timeline is for building this actual literal unicorn. Is that going through?

Speaker 9:

Yeah. So, you know, the company I run is called Embryo Core, and we're working on gene editing animals. I'm kinda just, like, I'm tired of people investing in AI note taking apps, and I look at universities and how science isn't going forward. So I'm like, fuck it. I'm going to build a startup company that people actually care about and like, and it's good science that I can invest my blood, sweat, and tears in.

Speaker 9:

And to me, that's gene editing the future of life. You know?

Speaker 1:

Amazing. I think we have some, is there anything we can do about the delay, or or should we come back to this? I I think we might need to have you back on the show another time. The the the delay is just a little crazy right now. But Yeah.

Speaker 1:

But I'm glad we got to check in a little bit and talk about the story somewhat. I'm sorry we couldn't get it figured out. Somehow we can edit animals' DNA, and yet we can't get the Wi-Fi to work. It happens. But the chat is loving you, and they're very excited about this.

Speaker 1:

But I'm sorry that we ran into technical difficulties, but we will check in with you soon. Have a good rest of your day. We'll talk to you soon. Let's bring in our next guest after I tell you about public.com. Investing for those that take it seriously.

Speaker 1:

They got multi asset investing, industry leading yields, and they're trusted by millions. Our next guest is Scott Shapiro from Coinbase. Let's bring him in from the Restream waiting room into the TBPN UltraDome. Scott, how are you doing?

Speaker 5:

Great. How are you guys?

Speaker 1:

We are doing fantastic. A little recovering from some technical difficulties, but hopefully there haven't been any technical difficulties in your life recently. Give us the update. What is the latest news in your world?

Speaker 5:

Probably our token sales announcement from yesterday morning.

Speaker 1:

I think that was in the Wall Street Journal. Correct? Can you give me can you give me the full story?

Speaker 2:

Guys got in the journal?

Speaker 1:

The sacred text? I think so. It is: Coinbase launches site for token sales. Blockchain startup Monad will be the first to sell its digital coin on the new platform. Coinbase Global is launching a new platform to allow individual investors to purchase digital tokens before they are listed on an exchange.

Speaker 1:

You tell me. Is this fake news, or is this the truth?

Speaker 5:

That is probably the realest news in the whole journal yesterday. Let's go. We are for the first time bringing token sales to the full US retail community since 2018.

Speaker 1:

That's fantastic.

Speaker 2:

Okay. So, yeah, break down the different levels because I know you guys just acquired Echo. That serves a different use case, but what what are the different types of offerings?

Speaker 5:

Yeah. You know, we're trying to go full life cycle when we think about asset issuers. Right? We just have a great exchange. It's been around for a long time for secondary trading.

Speaker 5:

What we announced yesterday introduces primary issuance right before it starts trading on the secondary market. Echo and a related company, Sonar, were acquired by Coinbase a couple weeks ago. Coincidentally, we were playing with this token sales product, working with Monad already. That is even further upstream what Echo is doing kind of presales before a token is even ready to be traded or used.

Speaker 2:

Awesome. So this new product you could compare to certain companies like Robinhood and Public will take an allocation in different IPOs and give an allocation to everyday investors, people that are not investment banks or sort of established investment firms. Is that, like, a good comp?

Speaker 5:

It's a fair comp. I'd say the two biggest differences: these are not IPOs. These are token sales. These are purely crypto. No stock exchange involved.

Speaker 2:

Yeah. Yeah. I'm just

Speaker 5:

The second difference is it is only distributed to retail investors on Coinbase. So there is not a little IPO allocation for retail and then a big allocation for institutions. This is 100% focused on a sale to the retail community. And then post sale, the real float will come from these folks who bought tokens on Coinbase in that primary sale.

Speaker 2:

That makes sense. Why was now the right time to bring something like this to market? If you look back through the history of crypto, there was an era where these kinds of offerings to retail were much more popular, and then there were a lot of problems and regulatory gray area from that time, which, I feel like, is starting to be clarified. But maybe you could talk about the timing.

Speaker 3:

Yeah. There's a couple of things here.

Speaker 5:

One is just that our ambitions have gotten much broader in terms of becoming what we call the everything exchange: being able to offer every asset class at every stage of its development. And what I focus on is really making that tailor-made for the retail community all around the world. The second thing is the regulatory environment is much better. Where a company like Coinbase was not so favorably looked upon by the SEC a couple years ago, that regulatory environment has improved to the degree where we feel much more comfortable offering this type of product, not just to the US community, but in dozens of other countries around the world. And that's a really new thing for crypto.

Speaker 2:

Yeah. It makes sense. What do you expect the volume of offerings to look like? Monad is gonna be the first. Is this something that's happening on a weekly basis, a monthly basis?

Speaker 2:

What are the plans there? What can you share?

Speaker 5:

Probably more like monthly than weekly. We're really not trying to maximize volume. This is definitely a quality-over-quantity play. So it really depends on the types of issuers, the depth of those teams' involvement, and their willingness to do this the way the Monad team has agreed to: they're not just going to turn around and sell all of the insider allocations right away. They're looking to develop long-term incentives and hold themselves to a standard of longer-term lockups and longer-term vesting schedules that really puts control in the hands of the retail investors. And, you know, in the past, a lot of these token sales on backwater types of markets have not done that.

Speaker 5:

And so we think that if issuers are willing to do that,

Speaker 3:

we'll engage with them.

Speaker 2:

How does pricing work? Is it a function of supply and demand, or is an issuer setting a valuation and a specific allocation amount? What can you say on that front?

Speaker 5:

It's exactly the latter you described. The issuer sets the token price, and they set the allocation that goes into this primary sale. We make sure that it's a team of high quality, that they're willing to be transparent about their unlocks and vesting schedules, and that they're really committed for the long term. And then it's really up to the retail audience. They can invest as little as $100 or as much as six figures, paid in USDC stablecoins, of course, to be involved in the sale.

Speaker 1:

If I'm starting to sell tokens through this process, are there, like, ad units around this? When we go to an IPO in New York, we'll often see taxi cabs wrapped in branding. We were at the Klarna IPO, for example, and there were pink taxi cabs everywhere. At Figma, they really went all out. There was, like, a DJ outside.

Speaker 1:

There was a lot of celebration. A big, massive canvas that gets hoisted. And there's a lot of attention that comes with it, and the New York Stock Exchange and the Nasdaq, to some extent, effectively offer marketing packages to celebrate the moment. Is that something that you'll have an analogy to in the digital realm at some point?

Speaker 5:

I love that idea. I worked in ads for a decade in Silicon Valley before coming to Coinbase, following my confessions.

Speaker 1:

Thank you. Honestly, ads are the modern American economy.

Speaker 5:

Yeah. Sheryl Sandberg used to say it's the lord's work.

Speaker 1:

It is the lord's work.

Speaker 5:

Back at Facebook. Anyway, we do not have that today. I think that's an interesting idea. This is something that, if you go to the Coinbase app, you will see pretty prominently, and Yeah.

Speaker 5:

So the sale actually starts on the seventeenth. Right now is the time when people can come learn about it and get reminders set so that we can notify them. But there's no paid media involved, and that's something that we might consider longer term.

Speaker 1:

I think that'd be fun.

Speaker 2:

What details have been released on pricing?

Speaker 5:

Yeah. The price is two and a half cents per token. Mhmm. The FDV, the fully diluted valuation, is two and a half billion, so you can do the math on the number of tokens out there. Yeah.

Speaker 5:

The allocation to Coinbase is $187.5 million worth of Monad tokens. And, again, that will represent the bulk of the float once it's tradable. So the actual market cap that's floating is gonna be a small fraction of the total long-term fully diluted valuation.
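
As a quick sanity check on the figures quoted above (taking the stated price, FDV, and allocation at face value; this is arithmetic, not official tokenomics):

```python
# Sanity check on the quoted Monad sale figures.
price_per_token = 0.025   # $0.025 ("2 and a half cents") per token
fdv = 2.5e9               # fully diluted valuation: $2.5 billion
sale_size = 187.5e6       # Coinbase retail allocation: $187.5 million

total_supply = fdv / price_per_token        # implied total token supply
sale_tokens = sale_size / price_per_token   # tokens in the retail sale
float_share = sale_size / fdv               # floating cap as a share of FDV

print(f"{total_supply:,.0f} tokens total")  # ~100,000,000,000 (100B)
print(f"{sale_tokens:,.0f} tokens in sale") # ~7,500,000,000 (7.5B)
print(f"{float_share:.1%} of FDV floats")   # 7.5%
```

So the initial floating market cap is roughly 7.5% of the fully diluted valuation, which is what Scott means by "a small fraction."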

Speaker 2:

Fascinating. That makes sense. Well, a very interesting new product, and hopefully we can get the Monad team on as we get closer to the sale to Yeah. Learn about the protocol and what they're working on.

Speaker 1:

Yeah. That'd be awesome. Thank you so much.

Speaker 5:

Thank you, guys.

Speaker 2:

Have a

Speaker 1:

great rest of

Speaker 2:

your day. Thanks, Scott. Cheers.

Speaker 1:

You know, if you do wanna take out an ad, you gotta go to adquick.com.

Speaker 2:

That's right.

Speaker 1:

Out-of-home advertising made easy and measurable. Say goodbye to the headaches of out-of-home advertising. Only AdQuick combines technology, out-of-home expertise, and data to enable efficient, seamless ad buying across the globe.

Speaker 2:

Some new billboards.

Speaker 1:

We're cooking. We're cooking.

Speaker 2:

We've heard your complaints. Not enough TBPN billboards.

Speaker 1:

Not enough.

Speaker 2:

Haven't seen one in a while.

Speaker 1:

Yeah. The funny thing is we actually have been getting that complaint. Very silly. We have Lawrence Allen from Terra Nova in the Restream waiting room. Let's bring him into the TBPN UltraDome.

Speaker 1:

Lawrence

Speaker 2:

Well, it's happening. Look at that. Look at that flag. Look at the suit.

Speaker 1:

Look at the flag. The suit. Wow.

Speaker 8:

Royal Flushing. Isaiah looked so dapper yesterday. I figured I had to put a suit on.

Speaker 1:

You look fantastic. Give us the news, and then I wanna go into what the

Speaker 2:

news is. Floor-to-ceiling flag.

Speaker 1:

Yeah. This is amazing. It's a great time.

Speaker 8:

Some of these flags are just too small. You know? It's hard to even tell if they're real now.

Speaker 1:

I'll let you in on a secret. If you wanna win the game for the biggest flag, you gotta put it outside. Go on the roof. Everyone's putting the flag inside the factory. Hang the flag outside the factory. Also, the world's biggest flag, they bring it out at, like, halftime of the Super Bowl every once in a while.

Speaker 1:

There's one of them. It's huge, and you can get it for, like, 500 k.

Speaker 2:

Only 500 k.

Speaker 1:

It's not that much. Right? If you got $7,000,000

Speaker 2:

I mean, it's a lot of money, but it's a lot of flag.

Speaker 1:

It's a lot of money. It's a lot of flag. Anyway, sorry.

Speaker 2:

Yeah. It's great to have you. Great to have you on the show.

Speaker 8:

Weirdest office ever, and the outside of the part that's my bedroom is painted rainbow, so I was thinking we just repaint it American flag.

Speaker 1:

Okay. There you go. Are you sleeping in the office too?

Speaker 8:

Absolutely. You know, I think it's becoming the standard.

Speaker 2:

Where are you based? Are you in the Gundo?

Speaker 8:

We are not. We're in Berkeley.

Speaker 2:

Berkeley. Cool.

Speaker 8:

The flag would be a little smaller in the Gundo because the price is, like, $3.20 a square foot there.

Speaker 1:

Oh, yeah.

Speaker 2:

Great. Bringing American dynamism to Berkeley.

Speaker 1:

Going on the offensive behind enemy lines.

Speaker 2:

Anyways, great to have you on the show. Congratulations on the launch yesterday. Break it down. Maybe give some background on yourself, and then let's talk about the company. Yeah.

Speaker 8:

Sure. Yeah. I'm a recent Berkeley grad; I dropped out of Stanford grad school. I worked at SpaceX before, and I am CEO of Terra Nova.

Speaker 5:

Mhmm.

Speaker 8:

So we just launched out of stealth. We probably had the longest stealth mode you've seen, and we How long? Wait. Wait.

Speaker 2:

How long?

Speaker 1:

There are companies that have been in stealth for, like, a decade.

Speaker 2:

Longer than you've been alive.

Speaker 8:

Maybe not the longest you've seen.

Speaker 1:

But Humane. Humane.

Speaker 2:

But Humane AI has been in

Speaker 1:

stealth for, like, seven years.

Speaker 2:

Yeah. How long? Like, couple years?

Speaker 8:

I've been working on this since I was at SpaceX, which was, like, late twenty twenty one. So I was in school until under a year ago at Berkeley.

Speaker 1:

I stand corrected. Serious stealth mode.

Speaker 8:

Serious stealth mode.

Speaker 1:

Serious stealth mode. But you're out there.

Speaker 8:

So we're officially out. Proud to announce our $7,000,000 seed round led by King Bruin.

Speaker 2:

Great hit. Thank you. Great hit.

Speaker 1:

Congratulations. I was excited for that

Speaker 8:

one. And Outlander as well.

Speaker 2:

Fantastic. Awesome.

Speaker 8:

Who else funded? Valor, and also Ponderosa and Gothams.

Speaker 2:

Very, very cool. Good So what are you working on?

Speaker 8:

Yeah. We're building terraforming robots.

Speaker 1:

Mhmm.

Speaker 8:

So usually, when you hear about terraforming, it's some in-the-sky stuff like what Augustus is working on at Rainmaker. Here, we're talking about very physical, on-Earth terraforming, where we are lifting land physically out of flood zones. I think this launch video captures it very well. We're basically injecting really, really large amounts of material underground to directly lift land. This would be really useful for you if you are currently in a coastal area.

Speaker 8:

I think lots of places all over LA, even parts of the Gundo, have this issue where you are just too low. Long Beach was called the sinking city for years, and the reason is that it sunk about 10 or 20 feet due to subsidence. And from that subsidence, you have flooding. So it's not just sea level rise. It's usually petroleum extraction or groundwater extraction or being built on fill.

Speaker 8:

So cities all over the world are facing this issue, and there's just really not much you can do about it. You already have the infrastructure in place. What do you do? There's no way to lift that infrastructure up. You can't put it on stilts like you can a small, single-story house.

Speaker 8:

You can put a giant seawall in; usually, you're looking at hundreds of millions of dollars for things like that. Sometimes levee systems will work. But usually, what's happening now is they're demolishing the infrastructure, putting a ton of fill on it, waiting a year, year and a half for it to compact, moving around dirt on the surface to make that happen, and then rebuilding that infrastructure. And, you know, maybe that's fine if you have some inexpensive property, but we're talking highways.

Speaker 8:

These are $8,000,000,000 projects for Highway 37, which is a few miles north of here. We're talking about mass displacements when you're doing this to whole cities. I'm from San Rafael. We are facing this problem. San Rafael has about 60,000 people, and they're facing a 500 to $900,000,000 bill for the seawall system that they need.

Speaker 8:

There's just no way they can afford that. And so we took a super first-principles approach: how could we save our hometown? And this is what we came up with.

Speaker 2:

So you're injecting dirt and rock into the ground in order to effectively move an area or a facility or a town higher? How much mass do you need to do that effectively?

Speaker 8:

Yeah. You wouldn't believe it, but dirt is actually way more expensive than what we're using. We're using wood. Wood chips are free, delivered, in whatever quantity you want. You can get this anywhere in the world.

Speaker 8:

It's abundant. It's usually free delivered because it's such a large waste product. And so we're taking this wood chip slurry and pumping it directly underground. It compacts in about two hours to pretty much full consolidation, and then you are just three feet higher, or whatever you wanted to be, at the end of the day.

Speaker 2:

It's hard to wrap my head around how much wood chips you need to raise a city in height.

Speaker 8:

Our shipping container is sized to process 20 semi truckloads of wood a day. Mhmm. So we're basically moving 20 semi truckloads of wood chips through our system. We have a bunch of robots that run around the site and deliver that final fluid underground, plus the processing unit that we call our ARC. The Prometheus system, that's the robot.

Speaker 8:

We call it that because it's bringing something fundamentally new to the world. It delivers that final injection pressure. And you're basically just moving as much material as possible underground, because really what you need is volume, not mass or something like you would expect.

Speaker 2:

These wood chips, what is the primary source? Is this a waste product from when people are getting trees removed from properties, or from manufacturing processes? Where does it actually come from?

Speaker 8:

Well, lots of sources. If we are gonna do the San Rafael project, for example, there's a place called Marin Sanitary right across the street from the place that floods the most. They ship eight semi truckloads a day of wood chips to be burnt in biomass energy plants, which are subsidized by the state because they're uneconomic. And they would way prefer, instead of polluting the air up in Stockton, to just pump that right underground, so you can get it locally. You also get it from sawmills.

Speaker 8:

You can get it from fire trimmings all over the state. Obviously, we have huge fire issues here in California, so they're doing massive clearing of forest, and then they really just don't have anything to do with that wood.

Speaker 2:

Okay. So the goal is to save San Rafael, or save them from this massive infrastructure bill. How do you start small and prove that the company can execute on this type of operation? Are there specific sites at a much smaller scale where you can start proving out the thesis?

Speaker 8:

Yeah. So I just got this text from our team. They are all at our pilot site right now doing some drilling, doing some pumping. You can see we use, like, auger drill bits and big injection machines to do our pumping, and we have our shipping containers. So we are deployed.

Speaker 8:

This works. We've been doing this for about a year, and we are looking for more commercial projects to do right now.

Speaker 2:

So what site are they on specifically? Is that, like, a testing ground?

Speaker 8:

They're at an undisclosed site near Sacramento that is primarily a testing ground, but it also has huge drainage and subsidence issues that we are fixing.

Speaker 1:

Fascinating.

Speaker 2:

Wild. So what's, like, the dream customer right now?

Speaker 8:

Yeah. We would love to do, strangely, some wetland restoration projects and also some new home development projects. Things that don't already have a lot of infrastructure in place are very ideal. We've seen a slightly higher customer willingness to pay there, because their risk appetite is a little bigger; there's not much to go wrong. We feel ready to do some roadway projects as well.

Speaker 8:

So

Speaker 1:

Break that first one down. It's basically like, I have a bunch of land that's undeveloped. I'm thinking about putting a bunch of houses on it, but there's a problem because there's, like, a marsh in the middle of it, and you can kind of get rid of the marsh. Is that roughly correct?

Speaker 8:

Not quite. So in the bay, it's usually that you've diked or leveed off an area, and then, because it's organic, it's subsided. So there's an example: an 80-acre parcel right on the water in San Rafael near the Target.

Speaker 8:

Yep. This has subsided two meters. And so because of that, if you were to break the dike, it'd be under sea level, of course. Okay. And so they wanna do two things.

Speaker 8:

They wanna put some wetland on that. To do that, you have to lift it so it can become tidal. And they also wanna put about $100,000,000 worth of homes on the more upland section of that property. You would also need to lift that, because that section is also too low. Interesting. Both problems are fixed by the same solution, which is ours: just directly lift it, make it developable, and make it into higher-quality wetland.

Speaker 1:

Yeah. Yeah, that makes a ton of sense.

Speaker 2:

How long until you're getting death threats from conspiracy theorists?

Speaker 1:

Yeah. Yeah. Modifying the world. Easy

Speaker 2:

Easy. Yeah. People, you know... Augustus has been on the show.

Speaker 1:

You may get blamed for earthquakes. That's what's gonna happen. Like, there's gonna be an earthquake.

Speaker 8:

We actually think we're gonna help with earthquakes as well. We haven't been publicly claiming this because we're still doing testing, but it performed dramatically better. This underground looks very much like particleboard.

Speaker 8:

It doesn't look like wood chips. It's compact. And so this is better than normal Earth. This is very controllable. It's very physical.

Speaker 1:

Get ready to go on Infowars, buddy, because no matter what the science says, you are going to get blamed for earthquakes. People are just gonna be like, they were doing something, and then the earthquake happened.

Speaker 2:

Yeah. Well, you have Augustus as a model. Augustus, you know, was in the hot seat.

Speaker 1:

But he's done a great job. And I think there is a path for communicating with all constituents, not just tech, not just Twitter. Augustus has done a great job getting to, you know, AM radio listeners and random shows that do not break through on tech Twitter. He's done a great job there. But, yeah, communication across all the constituents has to be really important for this business.

Speaker 1:

But

Speaker 8:

Yeah. I think it's important to say this is not, like, a blue state solution. This is very much red and blue. Totally. We see this all over Florida.

Speaker 8:

Of course, Miami faces some of the worst flooding issues. And this is also not just a US problem. This is something that's happening all over the world. Southeast Asia is particularly bad.

Speaker 1:

Yeah.

Speaker 8:

We've had ministers from Indonesia, from Cambodia come here. They're saying, like, I'm moving my entire capital city of Jakarta. Help me. And this is very much going to be the fundamental way that you solve large-scale flooding problems. And so I hope that conspiracy theorists don't come for us.

Speaker 8:

We definitely are willing to show them what's happening. You know, this kinda looks like a black box underground, but actually, we have systems where we can measure this to, like, a two-millimeter level of precision. And so you can have a very good idea of exactly what's happening.

Speaker 2:

That's amazing. One more question, just because I'm trying to wrap my head around the scale. How many, like, truckloads or shipping containers worth of wood chips would you need to, let's say, raise a square mile, call it, like, five feet?

Speaker 8:

No, I will do that math for you right now. So let's go from acres to a square mile. Basically, you need 20 truckloads per acre-foot, which is actually much better than fill, because with fill, you're

Speaker 2:

so much. An acre-foot is an entire one foot. Like, picture an acre, and then one

Speaker 1:

foot. Okay. So one foot deep, an acre wide and long. Yeah.

Speaker 8:

Yeah. So, you want

Speaker 2:

We're doing math on TV. Got it.

Speaker 1:

Well, while he's doing the math, let me tell you about Wander. Find your happy place. Book a Wander with inspiring views, hotel-grade amenities, dreamy beds, top-tier cleaning, and a 24/7 concierge service. It's a vacation home, but better.

Speaker 1:

And, you know, maybe in the future, your Wander will be on terraformed property. Might be increasing the number of vacation homes.

Speaker 8:

Absolutely. So a square mile lifted by five feet is a lot, of course. You know that. This would actually be 64,000 truckloads

Speaker 1:

of Let's go.

Speaker 2:

Let's go. However, as much as

Speaker 8:

it sounds, there is way, way, way more than that. Just in California, we could absolutely source that much.

Speaker 1:

Okay.

Speaker 8:

We looked, and it is very straightforward to source a thousand semi truckloads a day in the Bay Area. So that would maybe be a two-month project if you were really on top of your logistics, and maybe a two-year project if you didn't care quite as much.
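
Lawrence's numbers hold up as a back-of-the-envelope calculation. The 20-truckloads-per-acre-foot figure and the thousand-loads-a-day supply estimate are his; the rest is just unit conversion:

```python
# Unit-conversion check on the figures quoted in the conversation.
ACRES_PER_SQUARE_MILE = 640
TRUCKLOADS_PER_ACRE_FOOT = 20   # Lawrence's figure for wood chip slurry

lift_feet = 5
acre_feet = ACRES_PER_SQUARE_MILE * lift_feet        # 3,200 acre-feet of lift
truckloads = acre_feet * TRUCKLOADS_PER_ACRE_FOOT    # 64,000 truckloads

# At the ~1,000 truckloads/day he says the Bay Area can source:
days = truckloads / 1_000                            # 64 days

print(truckloads, days)  # 64000 64.0
```

Sixty-four days of deliveries at full supply is where the "maybe a two-month project" estimate comes from.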

Speaker 1:

Okay. Fantastic. Well, we're looking

Speaker 2:

forward to it. Yeah. You're taking something that people wanna get rid of and saying, we'll take it.

Speaker 1:

Yeah. But thank you so much for taking the time to stop by

Speaker 2:

the show.

Speaker 1:

This was fun. The chat was mentioning you: how are you guys not talking about this? And we did sort of miss the announcement, but we're really happy to have you join, and we really appreciate you taking the time.

Speaker 8:

Thanks for having me, John. I've been watching you since you were making YC videos instead of tech startup videos.

Speaker 1:

That's amazing.

Speaker 8:

Change.

Speaker 1:

Well, thanks so much for supporting.

Speaker 2:

Well, super exciting. Congrats to the whole team.

Speaker 1:

Good luck. We'll talk to you soon. Talk soon.

Speaker 8:

Bye, Jordi.

Speaker 1:

Before we move on, let me tell you about Bezel. Go to getbezel.com. Your Bezel concierge is available now to source you any watch on the planet. Seriously, any watch. And in tariff news

Speaker 2:

Yes. Switzerland has been making moves.

Speaker 1:

Oh, yes. The luxury watch market might be in turmoil again. We might need to get them on the show again.

Speaker 2:

The US and Switzerland are working on a deal to slash the 39% tariffs and try to get them down to 15%. Rolex got Trump a one-of-one desk clock that has been photographed on Trump's desk. Yeah. And that might win him over. And yeah.

Speaker 2:

A one-of-one Rolex would win a lot of people over. That's pretty cool. Kind of.

Speaker 1:

So, didn't Barack Obama get a watch as well, I believe? He got some sort of watch from his Secret Service agents.

Speaker 2:

Probably an Omega.

Speaker 1:

It might have been an Omega. Might have been a Speedmaster, but it became the watch that he wore. He wasn't a watch guy, and then one of his Secret Service guys was like, you gotta wear this watch, and he did.

Speaker 2:

Yeah. Very cool. Did you see this post from Jeff Davies in 2016? A lot of posts from 2016 these days. Jeff Davies in 2016 said, Apple should buy NVIDIA and lock up VR, AI, and autonomous car tech for the next decade.

Speaker 2:

Look back ten years from now, and it'll be a great use of cash. Apple would be a

Speaker 1:

A $10,000,000,000,000 company. Basically. Wow. Wild. Yeah.

Speaker 1:

Also looking back: Fireflies is this AI notetaker. Someone was talking trash about AI notetakers today on the stream; I don't even remember who at this point. But Sheila Monat says this is wild, about a screenshot of a LinkedIn post by Sam Udotong, the cofounder and CTO at fireflies.ai.

Speaker 1:

This is an AI note-taking startup. I think they're doing very well; I think they've raised money recently. And they said: came across this post on LinkedIn. Turns out the first version of fireflies.ai, the AI meeting assistant, doesn't even have AI.

Speaker 1:

It's just the founders joining the calls, taking notes manually, and sending a summary back. Sounds crazy. And that is exactly what Sam wrote on LinkedIn ten hours ago, when the screenshot was taken. He said they scaled Fireflies to a $1,000,000,000 valuation after six failures, starting from their original crypto food delivery idea. It's hilarious.

Speaker 1:

And If you're

Speaker 2:

if you're working on a crypto food idea pivot

Speaker 1:

Pivot to AI notetaker. Get to a billion-dollar valuation. And so, he says, and this is the key segment: when customers scheduled a meeting, we'd manually dial in as Fred from fireflies.ai. We'd sit there silently taking detailed notes and send them over ten minutes later.

Speaker 1:

After taking notes for 100-plus meetings and falling asleep in many, we were finally able to make enough money to pay the $750-per-month rent for a tiny SF living room. That was the point where we said, let's stop and automate everything. And so this is being framed as, like, a dunk, because they're saying, oh, this billion-dollar AI note-taking company doesn't use AI. It's like, yeah, this was in 2015 or 2016.

Speaker 1:

I actually knew Sam at the time, and I DM'd with him. I was a customer, and I was completely fine with this, because I was like, this is awesome. I get somebody to take my notes for me. It's basically an outsourced notetaker. But it's being sort of reframed as if this is how they would do stuff now.

Speaker 1:

And it's like, no. Obviously, it's been almost a decade since they did this.

Speaker 2:

I think if you're a customer and you didn't know, you have the right to be frustrated Sure. That you were just inviting Sure. A human and Yeah. Yeah. No no it's

Speaker 1:

Hilariously scrappy. Hilariously scrappy. But, yes, they were

Speaker 2:

And as long as they were being transparent with the VCs at the time that, yeah, we're just using a couple of guys to do this.

Speaker 1:

It's so crazy looking back on this conversation with Sam in my DMs. He just says, hey, like, I'm working on this. I needed notes from something, like, can you send it in and transcribe everything? I'd been using Rev, this manual transcription service that would take an audio file and just give you a text file back, but it was done by humans.

Speaker 1:

And so here he says, you can add Fred at Fireflies to the calendar invite, and Fred will join.

Speaker 2:

Give it up for Fred.

Speaker 1:

Let's give it up for Fred. I was like, hey, just got my first transcript. Pretty solid. And he's like, how comparable would you say it is to Rev?

Speaker 1:

I said, lots of ums and uhs captured. I knew the exact comparison that day because I had sent the recording to Rev earlier. So I was doing, like, A/B testing, actually seeing what was useful and what wasn't. It was just funny actually seeing this company grow so much, having been an early adopter, and then seeing people get the story wrong. So you know what?

Speaker 1:

If you're using Fireflies now, of course they're using Whisper. Why would they still be using humans? It makes no sense. The technology is good. Anyway

Speaker 2:

Tim Sweeney is landmaxing. Oh, you think? The man behind Fortnite has bought over 40,000 acres of US wilderness, not to build, but to keep it forever wild.

Speaker 1:

Forever wild. I love it. Where else are you in the in the timeline? Are you deep in there?

Speaker 2:

Mister Beast says 2% of human time is now spent on YouTube.

Speaker 1:

Well, thank you to everyone who's been spending at least

Speaker 2:

We spend way more than that.

Speaker 1:

Oh, yeah.

Speaker 2:

Because we're easily doing, like, fifteen to twenty hours a week

Speaker 1:

Oh, yeah.

Speaker 2:

Fifty weeks a year. And, you know, the goal should be to get into the double digits. Right?

Speaker 1:

Yeah. For sure.

Speaker 2:

Somewhere around seven or 8% now. Dean Ball had some support for your take yesterday. He said, don't think you should just throw "decel" at someone who's identifying a negative externality of a new technology. That's not necessarily decelerationist. Dean Ball says, it's heartening to see such eminently reasonable takes making their way into prominent pro-tech new media.

Speaker 2:

There's something very... well, we're actually neo-trad media, to be clear. But he says, there's something very old man about the terms of AI policy debates, and perhaps this is because it has been largely old people who have set them. When I hear certain arguments about how we need to let the innovators innovate and thus cannot bear the thought of resolving negative externalities from new technology, it just sounds retro to me, like an old man dimly remembering things he believes Milton Friedman once said, whereas anyone with enough neuroplasticity remaining to internalize a concept as alien as AGI can easily see that such simple frames do not work. Ron DeSantis says, we might have to see if we can get Pelosi to run Florida's pension fund. She's showing... this is pretty incredible.

Speaker 2:

She generated $133,000,000 of profits and a 16,930% return.

Speaker 1:

I don't believe that number. That's almost 17,000%. I mean, I guess this is since 1987. She was probably making money and adding to it consistently over that time.

Speaker 2:

What if she just invested once and then

Speaker 1:

I'm very skeptical of that second number.

Speaker 2:

You don't believe she could just 10 x it and just 10 x that again and then 10 x that again?

Speaker 1:

And then 16x it that last time. It's possible, but, I don't know. I would just imagine that you're adding to it every year, and you can't just take the current net worth and divide it by the amount of money she had when she started investing. Right? Like, what did that $133,000,000 start from?
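
The skepticism is easy to check with arithmetic. Taking the post's two numbers at face value and assuming a single lump sum invested in 1987 with no later contributions, which is exactly the assumption being questioned:

```python
# Back-of-the-envelope on the claimed $133M profit and 16,930% return.
total_return = 169.30          # 16,930% expressed as a gain multiple
profit = 133_000_000
years = 2025 - 1987            # roughly 38 years

initial = profit / total_return            # implied starting stake
multiple = 1 + total_return                # final value / initial value
cagr = multiple ** (1 / years) - 1         # implied annualized return

print(f"initial stake ~ ${initial:,.0f}")  # ~ $785,588
print(f"CAGR ~ {cagr:.1%}")                # ~ 14.5% per year
```

Roughly 14.5% a year is high but not absurd for a concentrated portfolio; the figure only becomes implausible if money added along the way gets counted as "return," which is the objection here.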

Speaker 2:

Anyways, that is a good place to end the show. We will not be getting into the fifth hour No. today. So sorry to disappoint everyone. But tomorrow, we have some fun guests.

Speaker 2:

We have Brian Halligan, CEO of HubSpot Mhmm. coming on. Also, he's a partner over at Sequoia. Mhmm. And then Fei-Fei Li from World Labs.

Speaker 2:

Oh, that's great. Very excited for all of that. And we hope you have a wonderful Veterans Day, wherever you are, and we'll see you tomorrow.

Speaker 1:

See you tomorrow.

Speaker 2:

Goodbye. Cheers. Thank you.