TBPN

  • (01:58) - The Current State of AI
  • (30:43) - Robbery at the Louvre
  • (36:50) - Timeline Reactions
  • (41:00) - Nvidia & China
  • (48:05) - Timeline Reactions
  • (51:58) - Ryan Cohen is a Canadian entrepreneur and activist investor, best known for founding Chewy, an online pet retailer, and serving as the CEO and Chairman of GameStop. In the conversation, Cohen recounts how he went from passive GameStop investor to 13D filer, board member, and eventually CEO and Chairman, and discusses the changes at the company: cost cutting, rationalization, and a focus on the basics of running a profitable business.
  • (01:47:54) - Zak Kukoff, Chair of the Tech and Venture Practice at Lewis-Burke Associates and former venture capitalist at General Catalyst, discusses the impact of government shutdowns on startups, highlighting issues such as delayed deal flow, frozen visa processing for workers, and halted regulatory approvals, which can be existential threats to startups dependent on timely government actions.
  • (02:21:28) - Timeline Reactions

TBPN.com is made possible by: 
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com 
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai
Privy - https://www.privy.io
Cognition - https://cognition.ai
Gemini - https://gemini.google.com

Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM to 2 PM PST, Monday through Friday. Available on X, Apple, Spotify, and YouTube.

Speaker 1:

You're watching TBPN. Today is Monday, 10/20/2025. We are live from the TBPN UltraDome, the temple of technology, the fortress of finance, the capital of capital.

Speaker 2:

You weren't in France, were you?

Speaker 1:

No. No. No. Why why would you think that was Okay.

Speaker 2:

Paris, France? Well, some people were wondering No. Around the office.

Speaker 1:

What what I don't even know what that is. What's the news?

Speaker 2:

Why? What's France?

Speaker 1:

Why would you be suspicious of it? What is No. Of course, the the crown jewels were stolen from the Louvre in dramatic fashion. We will be covering that story, later in the show. We also have the CEO of GameStop, Ryan Cohen, joining the show at noon.

Speaker 1:

So if you're new here and you're tuning in for the first time, please stay with us. There are also major technical difficulties across the Internet today because of a massive AWS outage. We are working through it. Some of our graphics will not be up to date today. Some of the prices that you see on our tickers might not be up up to date today, but we are doing our best to bring you the news and bring you exclusive interviews.

Speaker 2:

Want us to podcast today, but we didn't. Refused to listen. They were And

Speaker 1:

really, really working against us.

Speaker 2:

And in other news, we will be having both Brian Chesky on the show tomorrow Yes. As well as Palmer Lucky

Speaker 1:

That's gonna be great. Back to back. So please stay tuned. Listen to this show. Listen to tomorrow's show.

Speaker 1:

Do you think the AWS outage is is is it entirely because of us and and the GameStop shorts want us to stop podcasting

Speaker 3:

Yep.

Speaker 1:

So that we can't bring them the latest in the world of GameStop? Or do you think it's more that with the complete and utter popping of the AI bubble, AWS just threw in the towel? They said, look. The AI thing's not gonna work out. Let's just not even do the cloud computing thing anymore.

Speaker 2:

I think I think it was potentially Amazon Yeah. Just turning it off for a second Yeah. To show everyone that, hey. Cloud cloud's still pretty important. Yeah.

Speaker 2:

Forgotten about us.

Speaker 1:

That makes sense.

Speaker 2:

You're still dependent on us.

Speaker 1:

That makes sense. Well, if you've been living under a data center, Andrej Karpathy made waves this week after he dropped a massive bombshell of a podcast with none other than Dwarkesh Patel on the Dwarkesh Patel podcast. He said a lot of good things about AI. He said a lot of negative things about AI. He called AI slop.

Speaker 1:

He also said that the models are amazing. He said that, autocomplete is his sweet spot. He had a bunch of positive things to say. But overall, people did not react positively. Everyone says the AI bubble is over.

Speaker 1:

It's popped. You gotta rotate out of AI stocks. My take is that if you're looking for something safe, you gotta rotate into food, water, shelter, and guns because that's all you're gonna have. AI is fake. The Internet's fake.

Speaker 1:

Computers are fake. The steam engine's fake. Railroads are fake. Electricity's fake. All of industrialization is a bubble.

Speaker 1:

We're going back to monkey. We're returning to monkey. Yeah. We're going to be living in huts because technology is a bubble. Technology is fake, and all we have left is to homestead and live in a shelter.

Speaker 1:

So get ready to go back to sticks and stones. That's my take.

Speaker 2:

Good take.

Speaker 1:

But, of course, it's not that bad. But the general timeline is having whiplash from this,

Speaker 2:

for sure. Baez over on x says, so happy it's contrarian to believe in AI progress again.

Speaker 1:

Yes. Oh, that's true. Yeah. Yeah. It is it is a it is a good take.

Speaker 1:

And so, basically, Andrej Karpathy went on, and Andrej is in an interesting position because he was a cofounding member of OpenAI. He was there from day one. He also was at Tesla and built the self-driving car program there. He is one of the most respected AI practitioners and researchers. He's implemented nanochat and nanoGPT, these repos that are handcrafted by him to explain to anyone who can program at a high level how to build an advanced chatbot, an advanced AI model, an LLM, and train it from scratch.

Speaker 1:

And so he's always been extremely respected, but he doesn't have crazy bags right now because he's kind of out of Tesla, out of OpenAI. He was at OpenAI for a little bit. He went back and then left. He's working on an education product right now, which is very, very interesting, very cool, called Eureka Labs. I'm actually very excited for it and would recommend anyone who's getting into programming stay on top of that, because I feel like if you can power through the Andrej Karpathy pedagogy, all of his different coursework that he puts out, you will probably have a very, very rewarding job very quickly.

Speaker 1:

So people are bearish because he's basically saying that AGI is, like, ten years away, not one year, not two years. He's kind of debunking the idea, or at least he's just sharing the vibes, that he doesn't feel like the AI 2027 fast-takeoff scenario is probable. And what's interesting is that this aligns with what a bunch of people have been saying. At the same time, Sam Altman's been saying, you know, curing cancer and teaching people the

Speaker 2:

Free education.

Speaker 1:

Education. Right? But then also, going back to over a year ago, 09/23/2024, Sam Altman posted a blog post called The Intelligence Age. And this is what he said in there. He said, it is possible that we will have superintelligence in a few thousand days.

Speaker 1:

I have always read a few is three. Three or four. Right? What's three thousand days? What's four thousand days?

Speaker 2:

About ten years.

Speaker 1:

It's about ten years. And so there's there's two reads on that. One is that, like, let's say that he knows for some reason. I don't think he does. I don't think anyone does.

Speaker 1:

But let's say he knows. Well, he's still setting expectations to be like AGI is a decade away, which is exactly what Andrej said. But when Sam said it, everyone was like, let's go super long. And when Andrej says it, everyone's like, oh, it's bad, which is kind of an interesting tweak. But, also, there is the more kind of critical take, which is just that whenever you talk to a technologist and you say, like, okay.

Speaker 1:

You're working on getting us to Mars. You're working on fusion reactors. You're working on quantum computing. When will this actually change my life? When will this be when will this technology be ready for primetime?

Speaker 1:

Everyone says, it's just a decade away. Because a decade is so long that if you make a decade prediction, like but, like, if we were to say, you know, oh, where will the show be in a decade? Or what will we do in a decade? We can say crazy stuff because it's gonna be so long that everyone will have forgotten in a decade, basically.

Speaker 2:

It's somewhat straightforward to make one, two year, three year prediction. Totally. Beyond that, it gets

Speaker 1:

It just gets so fuzzy. There's so many other things that could happen. It's really complicated. And, I mean, people have been making long-term predictions about AI for a long time. When I got to Silicon Valley, I read a book called The Singularity Is Near by Ray Kurzweil.

Speaker 1:

I believe he published that in, like, the late nineties, maybe early two thousands. Tyler, have you read The Singularity is Near? It's a tome.

Speaker 4:

I actually have not. I need to read that. Oh, yeah. Yeah. Should.

Speaker 4:

That's like a foundational text.

Speaker 1:

And the crazy thing about that is that he makes a whole bunch of predictions about how scaling will happen in terms of, like, compute power, and he bounds everything in flops. He just says, like, let's sum up the total amount of computing power in the world right now and just comp that to how many flops the human brain has, or all human brains have. And based on that, he predicted that we'd pass the Turing test in, like, 2022 or 2023. He, like, nailed it, like, twenty years earlier. Completely nailed it.
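A rough sketch of the Kurzweil-style comparison described above, for anyone who wants to play with the numbers. The brain and world-compute figures below are ballpark assumptions for illustration only, not numbers from the book or from the show, and the growth rate is likewise assumed.

```python
# Back-of-the-envelope version of the Kurzweil-style comparison described above.
# All constants are rough assumptions for illustration: swap in whatever estimates you trust.

BRAIN_OPS_PER_SECOND = 1e16          # assumed ballpark for functional simulation of one human brain
WORLD_COMPUTE_OPS_PER_SECOND = 1e21  # assumed placeholder for total installed compute today
GROWTH_PER_YEAR = 1.5                # assumed annual growth factor in total compute

# How many "brain-equivalents" of compute exist today under these assumptions.
brains_equivalent = WORLD_COMPUTE_OPS_PER_SECOND / BRAIN_OPS_PER_SECOND
print(f"Installed compute ~= {brains_equivalent:,.0f} brain-equivalents")

# Kurzweil's method: project compute growth forward and find the crossover year
# with the combined compute of all ~8 billion human brains.
target = 8e9 * BRAIN_OPS_PER_SECOND
ops = WORLD_COMPUTE_OPS_PER_SECOND
year = 2025
while ops < target:
    ops *= GROWTH_PER_YEAR
    year += 1
print(f"Crossover with all human brains around {year} (under these assumptions)")
```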

Speaker 1:

Although, we do debate, like, are we actually past the Turing test? Because you can just ask a model, like, are you a model? And it'll say yes. But in general, it feels like we passed the Turing test, and it feels like we're hitting Kurzweil timelines. And so I've always kind of leaned back whenever somebody presses me on my AGI timelines.

Speaker 1:

I always just say, like, I kind of stick with Kurzweil. 2045 is what he says. And that sounds bearish now. Yeah. But when he published The Singularity Is Near, people were like, this is never going to happen.

Speaker 1:

Like, this is just insanity maybe in ten thousand years, bro. But, like, it's

Speaker 2:

not I thought it was interesting, you know, some people listened to the Karpathy interview and thought, great. I have ten years to escape the permanent underclass. Yeah. Totally. Which is exactly how OpenAI is also acting.

Speaker 2:

Right? Yes. Yes. A big tech company. They're launching a social media app.

Speaker 2:

Yes. They're working on ads.

Speaker 1:

Yes. Yes. Yes. Yes. Trying to Yeah.

Speaker 2:

Scale paid users. Yep. Right? They're just acting like a traditional hyperscaler. Yeah.

Speaker 1:

I I think that's the correct model. I I I think that's 100% the right move. Did you get the time?

Speaker 4:

Yeah. I was just gonna say, like, the fact that ten years AGI is like a bearish take.

Speaker 1:

It's crazy.

Speaker 4:

That's so bullish.

Speaker 1:

That's that's

Speaker 3:

extremely bullish.

Speaker 1:

Yeah. Yes. Yes. Yes. Yes.

Speaker 1:

Bearish takes are bullish. I agree. I agree. Yeah. No.

Speaker 1:

No. No. It is true. But, I mean, a bunch of people have been saying stuff like this. I mean, when we talked to Tyler Cowen, he said, you know, I think progress will be very incremental, that we won't just see some sort of fast takeoff, because there are so many different areas of the economy where it's not just, like, a single input-output, text-in, text-out problem.

Speaker 1:

That's true for small programming tasks for sure. That's true for customer support for sure, but it's not true for medicine. It's not true for so many other things. And Andrej Karpathy does a great job breaking down that prediction about radiologists, the idea that radiologists would be put out of a job by AI because AI is really good at looking at images. Well, it turns out that radiologists do, like, 25 other things that are really important, and it's not just a job of looking at a picture and

Speaker 2:

Did you see a lot of pushback from anyone at the big labs on the interview?

Speaker 1:

No. I didn't see anything.

Speaker 2:

I expected to see more people saying, Karpathy's been out of the game for a few years. He doesn't have the scale. He's not really at the frontier. Yeah. He's not seeing what we're seeing.

Speaker 2:

And I didn't see any of that. Just bounce off

Speaker 1:

Karpathy for sure.

Speaker 2:

I don't know. Is that bearish, Tyler?

Speaker 4:

Alright. Maybe that's a little bearish. I'm looking at the, you know, broad strokes here.

Speaker 1:

Yeah. Yeah. Yeah. But, I I mean, I think that the

Speaker 2:

I saw a lot of researchers being like, yeah. RL is pretty mid.

Speaker 1:

Yep. Yeah. It's like the best it's the best we have, I guess. But, I mean, the big question here is is what does this mean for big tech companies? What does this mean for the hyperscalers for Oracle?

Speaker 1:

Those big tech companies can't exactly turn on a dime on the basis of the latest

Speaker 2:

So funny. AWS up 1.3% today.

Speaker 1:

You mean Amazon

Speaker 2:

as a whole? Sorry. Amazon as a whole up 1.3% because everyone is like, oh, this is actually pretty important. We kind of forgot about AWS. Yep.

Speaker 2:

And, Oracle, of course, is down almost 5%.

Speaker 1:

Well, whatever you are thinking of doing in the public markets, go to public.com, investing for those that take it seriously. They got multi asset investing, industry leading yields. They're trusted by millions. And so, Oracle's now on the hook for something like $300,000,000,000 in infrastructure over the next five years. That's a lot very quickly.

Speaker 1:

They're hoping that OpenAI revenue will pay for that. And there's the hyperscaler narrative that you laid out where, you know, if the ads product really ramps, if the, you know, the Sora product ramps, if

Speaker 2:

My view, Oracle basically has to bet that OpenAI can get, you know, almost every Internet user to be an OpenAI user.

Speaker 1:

Yes. And I mean, I'm super fascinated by how fast the commerce ramp will happen. Just over the weekend, I mean, I'm on, like, the furthest edge of early adoption here, and I forget what I was searching for. I was searching for a product. Oh, I was searching for gray printer paper that would look like a newspaper.

Speaker 1:

So if we printed a tweet, we can make it look like newspaper for some reason. I just wanna know if that was a thing. It found me some options. And when I got through to the product, it took me to a landing page, and I wound up just sending it to one of our team members and saying, hey. Can you order this?

Speaker 1:

Because the one-click checkout, like, I wasn't at a point in ChatGPT where I could just say, okay. Buy it and ship it to the UltraDome. Like, that was not an option for me. So, like, they obviously need to improve features there. But then there's also, like, customer adoption and actually training the user.

Speaker 1:

Because if 700,000,000 of those 800,000,000 are still in the natural, like, muscle-memory workflow of, when I find a product on ChatGPT, I Google it, then I buy it there, well, that's not gonna monetize very well. They have to train people into thinking, yes. I trust that if I tell ChatGPT, order this and deliver it to my house, it's gonna buy it with the right credit card. It's gonna ship it to the right address. I'm gonna be able to get a refund if I need to.

Speaker 1:

Like, that's a whole new user experience workflow that they will have to train people on. Yeah. And so

Speaker 2:

I would say it is very notable

Speaker 1:

Yeah.

Speaker 2:

That OpenAI has announced partnerships with Etsy

Speaker 1:

Oh, yeah. This and Walmart Yep.

Speaker 2:

And not Amazon and eBay. Sort of

Speaker 1:

the Temu Amazon and the Temu eBay. You can think of it that way.

Speaker 2:

That's why

Speaker 1:

Walmart is kind of like Temu Amazon.

Speaker 2:

It really is. No. But it's true. I mean, obviously, OpenAI, I'm sure, would have liked to have already announced partnerships with Amazon, would have liked to have already announced partnerships with eBay. And certainly, conversations would have happened.

Speaker 2:

Neither management team is asleep at the wheel. But I would expect that Amazon's management team and eBay's management team aren't so quick to let Sam Yep. The fox into the henhouse. We

Speaker 1:

love we love when foxes get in henhouses. We also love ramp.com. Time is money. Save both. Easy to use corporate cards, bill pay, accounting, and a whole lot more all in one place.

Speaker 2:

Pedro Wait.

Speaker 1:

Hold on. Hold on. One thing on that. So the question about eBay and Amazon staying out of the OpenAI partnerships and OpenAI having to go to the tier twos, Walmart and Etsy. Right?

Speaker 1:

I think that's, like, the correct framing, that Amazon and eBay want to say, hey. I think we still have leverage here. But it can flip if OpenAI becomes a true aggregator. Amazon does have to run ads on Meta properties. They do have to run ads on Google. And these things can get out of control where, like, the iPhone got so dominant that Google had to pay Apple to be the default search. Like, there's just a certain amount of, like, if everyone winds up using one thing, then you do have leverage in that conversation.

Speaker 1:

And I wouldn't be surprised if in two, three, four, five years, OpenAI has enough leverage that Amazon and eBay capitulate and say, yep. We're partnering because we just have to. And so, I mean, the last thing we could talk about is, like, how Oracle is positioning their decision making. I had a little bit of a take on this. Do you want me to run through it?

Speaker 2:

Hit me.

Speaker 1:

So Oracle has two new CEOs. Safra Catz is out. There's Clay Magouyrk. He's the co-CEO, and he did an interview with CNBC. And he said of OpenAI, look at their financials, their growth, and what's being built with this technology.

Speaker 1:

This isn't a typical company trajectory. They've reached nearly a billion users faster than anyone. Everything about this is unprecedented but in a good way. And I don't like this framing. I think it's not the right framing to say, like, we've never seen anything like this.

Speaker 1:

It's breaking all of our patterns, so we're just completely operating off playbook. I think it's much better to just say that, like, like, this is a new hyperscaler. We all saw what happened with Google and Amazon. They were started before the .com bubble popped. Yep.

Speaker 1:

And they scaled very fast. OpenAI is scaling faster on users and revenue. And so we are investing more, but it's all proportionate to the growth. And so it's the exact same playbook as you wanna build Yeah.

Speaker 2:

The streaming Google is also

Speaker 1:

Google in 1999? You wanna build infrastructure for OpenAI in 2025? You underwrite it the same way. It's just a faster growth rate, so you're investing slightly more, and it's all proportional. And it's actually the same playbook.

Speaker 2:

Do you think he's trying to frame the growth as so unprecedented, we're just trying to meet it, so that people don't see Oracle directly as just this massive bet on OpenAI? Because if he just came out and said, we're betting that ChatGPT will need to serve billions of users Yeah. That feels like just a better read on the situation overall than, oh, we can't predict the demand at all. It's just running away from us.

Speaker 2:

We're just trying to trying to meet Yeah. Their capacity demand. So I don't know.

Speaker 1:

That I don't know.

Speaker 2:

That's good. Either way, Oracle is a bet on OpenAI.

Speaker 1:

Yeah. Yeah. I I I think I prefer the world where they are saying, like, sure. Even if we're in the .com bubble, we think we found the Amazon. We think we found the Google.

Speaker 1:

They're gonna make it through, and these are gonna be multitrillion-dollar companies. So we are going really hard on that to partner and draft off of that. The problem is that I'm sure Oracle has a million other partners that they can't completely frustrate by saying, hey. You know, we're not interested.

Speaker 2:

And we

Speaker 1:

move on, I wanna tell you about Restream: one livestream, 30-plus destinations. Multistream and reach your audience wherever they are. Sorry, Jordy. Continue.

Speaker 2:

At the same time as Oracle is obviously bullish on OpenAI, there was reporting in TechCrunch, based on a report by Apptopia, that was using app store data to look at ChatGPT's overall growth and user minutes and things like that. And this somewhat aligned with the reporting from Deutsche Bank that

Speaker 1:

Hey, guys. We got Nick on the camera somehow. Sorry, everyone.

Speaker 2:

Sorry. AWS goes down.

Speaker 1:

For our live

Speaker 2:

TV here. Doesn't go down entirely, but it's close.

Speaker 1:

But welcome, and please enjoy the beautiful view of one of our team members, Nick.

Speaker 2:

So, anyways, hopefully, the audio is still coming through. But we're back. Both of these reports showed that, yeah, OpenAI growth, ChatGPT growth, is seemingly slowing a little bit. And that makes sense if you base it entirely on the scale that they're at right now.

Speaker 1:

Yeah. I mean, you do saturate at some point. This is DAU. This is global DAU, and it's around 72,000,000. What's interesting here is that for a long time, people have been spinning the narrative that, like, oh, growth will plateau during the summer.

Speaker 1:

Right? Didn't we look at some data that showed that August was particularly flat for some other reason? Who was that? There was

Speaker 2:

some Growth did plateau over summer, at least

Speaker 1:

That's not this data. This data shows very consistent growth. Oh, that was Europe. That was Europe. Okay.

Speaker 1:

Yeah. Yeah. It's interesting. Europe plateaus during the summer. America grinds during the summer and then plateaus. Yeah, we burned out.

Speaker 1:

The great lock-in of September was like, okay. I'm not installing any new apps or something like that. I don't know. I don't know how much to read into this Apptopia data. Like, some of these app-tracking things, I don't know how accurate they are because they don't necessarily have a pixel in the exact app.

Speaker 1:

It might be more like Nielsen ratings where they're just polling people. Yep. But but, I mean, if you believe the the beginning of the curve, what happened what happened last year? Like because there was actually a plateau back in it looks like back in January of what was that? 2023 or 2024?

Speaker 1:

Because January oh, no. That was the beginning of this year. That was the beginning of this year. So the beginning of this year was pretty flat. And then once April 1 came around, everyone went on it for April Fools' Day, I guess, and then it started ripping upwards.

Speaker 1:

And then and then and then September and October have been sort of flat. I don't know. We'll see. There was a yeah. There's a lot more we could go through here.

Speaker 1:

But first, let me tell you about Privy, wallet infrastructure for every bank. Privy makes it easy to build on crypto rails: securely spin up white-label wallets, sign transactions, and integrate on-chain infrastructure, all through one simple API.

Speaker 2:

Vuco Capital was sharing another data point from Similarweb. It shows that Gemini has been gaining market share Gaining? Gaining Mhmm. On OpenAI. Obviously, the overall market is growing.

Speaker 2:

Mhmm. So OpenAI is still Yeah. Growing on a on a per user basis. Mhmm. But Gemini is growing.

Speaker 2:

He says Google already has nine products with over 1,000,000,000 users: Search, Android, Chrome, Drive, Photos, Play Store, YouTube, Maps, and Gmail. Yeah. So it says they know a thing or two about a thing or two. Yeah.

Speaker 2:

At

Speaker 1:

the same time, like, bootstrapping a new product flow, like actually funneling you from Gmail into something else, is really hard for an almost thirty-year-old company. It's always been through acquisition. They buy something.

Speaker 2:

What about Meta? I mean, Meta and Threads.

Speaker 1:

Meta did do it. Threads. So maybe

Speaker 2:

that When you own when you own one of the best advertising

Speaker 1:

Yeah.

Speaker 2:

Networks in the world

Speaker 1:

Yeah.

Speaker 2:

You can aggressively funnel people. The question is how much do they try to integrate the value of Gemini into the core search product?

Speaker 1:

Yeah. How would you do it? I think what Connor did over at Threads is a really, really smart integration where you actually see Threads content inserted into the Instagram feed. So while you're scrolling Instagram, you see a thread, and you're like, oh, I wanna see that.

Speaker 1:

And then it just takes you over to that app, and then you're a DAU of that app. And it was very easy to create a Threads account if you had an Instagram account. How should Google surface a Gemini flow where you're in Gmail or you're in Yeah. Google Search, and then all of a sudden, you're in Gemini?

Speaker 2:

I think there's a bunch of ways. You can do it through search. You can do it through Chrome.

Speaker 1:

Maybe through the AI overviews. Like, you know how you go into AI mode? It should be, like, one more click, and then you're just fully in Gemini, maybe.

Speaker 2:

Yeah. I mean, I would I mean, I think there's a reason that OpenAI wants its own browser. Perplexity wants its own browser. I don't think they've properly even leveraged Chrome yet in order to drive Gemini usage.

Speaker 1:

Yeah. Yeah. Yeah. Was thinking about I was thinking about the browser, like, agents in a browser.

Speaker 2:

Browser wars of 2025. Over before they started.

Speaker 1:

Yeah. With a whimper. Yeah. I was thinking about the browser because I used to enjoy scraping web content. So I'd go to a website, and I'd write some JavaScript to, like, pull it all out into a spreadsheet.

Speaker 1:

And I was thinking, like, oh, with an agent in my browser, I could do that really easily. I could have it write the JavaScript to scrape everything out. That'd be really easy. But then I realized, like, I could just copy-paste the whole thing into an LLM, or just give the LLM the URL. I was thinking about the Midas list. If I wanted to get the Midas list into a spreadsheet, I used to have to write JavaScript to go through the Midas list, save everything, and put it in a spreadsheet.

Speaker 1:

Now, you can just go to ChatGPT and say, reproduce the full Midas list, and it'll just do it. You don't even have to tell it the URL. So Yep. I'm still searching for use cases of AI in the browser.
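For the curious, the old-school scrape-to-spreadsheet workflow described above looks roughly like the sketch below. It is a generic illustration in Python rather than the browser JavaScript mentioned on the show, and the URL is a placeholder, not the actual Midas list page.

```python
# Minimal sketch of "pull a table off a page and dump it to a spreadsheet."
# Assumes the target page (placeholder URL) exposes a plain HTML <table>.
import csv
import urllib.request
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Very naive table extractor: collects text inside <td>/<th> cells, row by row."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.row = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self.in_cell = False
        elif tag == "tr" and self.row:
            self.rows.append(self.row)
            self.row = []

    def handle_data(self, data):
        # Note: cells with nested markup may split into multiple column entries.
        if self.in_cell and data.strip():
            self.row.append(data.strip())

html = urllib.request.urlopen("https://example.com/some-list").read().decode("utf-8")
parser = TableParser()
parser.feed(html)

with open("list.csv", "w", newline="") as f:
    csv.writer(f).writerows(parser.rows)
```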

Speaker 2:

This was a good highlight from the interview. He says, when I go on my Twitter timeline, I see all this stuff that makes no sense to me. Honestly, just fundraising. And then there was the next post from Chris Ofner Yeah. The unbridled joy of listening to someone smart who's not trying to sell you anything. This is why I think this interview hit so hard Yep.

Speaker 2:

Is he's not sitting there, like, trying to get to the next tender offer.

Speaker 1:

Yep.

Speaker 2:

Right? He's just Yep. Being honest about what he's seeing and some people aren't gonna like that. But

Speaker 1:

I mean, same with Sutton. Like Sutton doesn't have bags. Right? And he was able to come on and and make a pretty pretty clear argument to to put Tyler out of a job.

Speaker 4:

I mean, I do think, like, the most interesting quote out of this Yeah. I don't see a ton of people talking about it. It's that he basically thinks, like, even if we get AGI in ten years, GDP growth will actually just stay at 2% Mhmm. Which I think is, like, very contrarian. It seems to me that, like, most people are like, we won't get AGI.

Speaker 4:

Yeah. If they're saying that GDP growth is gonna stay Yeah. That means we didn't get AGI and it's just, like, better autocomplete, etcetera. Yeah. He basically loops, like, all, you know, innovation into the same 2% GDP growth that's been going on, because Yeah.

Speaker 4:

Yeah. Yeah. You can't if you look back, you can't really

Speaker 1:

see Yeah. That was a super profound idea. This this idea that he sees AI as an extension of the Internet. And he's like, yeah. When I hear the story about, like, recursive AI, like, oh, wow.

Speaker 1:

The foundation model lab AI researchers are using AI to improve the AI that they build. He was like, well, yeah. Like, the Google engineers would Google stuff. Like, yeah. And, like, you know, the Ford motor manufacturers would Cars.

Speaker 1:

Probably be like let's drive cars.

Speaker 2:

Make more cars.

Speaker 1:

Exactly. So, like, was like, he was kind of like every technological innovation since we've invented technology and started

Speaker 2:

look at OpenAI. And fundamentally, it it looks a lot like Google. Yeah. Right? It has Yeah.

Speaker 2:

Different vibes Yeah. Different I like it. A different leadership But

Speaker 1:

Yeah.

Speaker 2:

You know, they have their YouTube. They're building their AdSense. Yeah. So

Speaker 1:

Well, if you're writing code, you need to check out Cognition. They're the makers of Devin, the AI software engineer. Devin: crush your backlog with your personal AI engineering team. Everyone's from Canada, says Yacine. Basically, every famous computer scientist is Canadian: Andrej Karpathy, Hinton, AlexNet, Rob Pike, Yacine, Elon.

Speaker 1:

Canadian. All Canadian. I love it.

Speaker 2:

Everyone's Canadian? That's some fake news. Will Brown: thoughts on the Karpathy interview. Agree with him on pretty much everything. Yep.

Speaker 2:

Literal transformative AGI is not imminent. That's fine. Doesn't mean it's a bubble.

Speaker 1:

Doesn't mean it's a bubble. Let's go.

Speaker 2:

Hit that gong.

Speaker 1:

He's got

Speaker 2:

this. Normal technology that improves productivity and growth is very valuable. The market already reflects this.

Speaker 1:

That's great. Nothing ever happens and everything is fine and we're so back and it's already priced in. The exponential RSI take off.

Speaker 2:

Famous famous last words. Suspended Cap says the Karpathy interview is scaling. We aren't short compute.

Speaker 1:

Scathing.

Speaker 2:

Scathing. We aren't short compute. We are short breakthroughs that replicate things the human brain can do. No doubt we'll get there, but wouldn't be surprised to see markets sputter on Monday while this makes the rounds. Lots of really honest stuff.

Speaker 1:

Yeah. No one in the market is talking about this. Like, the private markets and the public markets are fully disconnected. The public markets the Nasdaq is up 1.46%. The vibes on X are way different than what happens on Wall Street.

Speaker 1:

I remember when Nat Friedman and Daniel Gross went on Stratechery, and they did a great interview series about AI back in 2022, 2023. There's a whole series of them. And in there, they talk about NVIDIA specifically and how important NVIDIA chips are gonna be. And NVIDIA was, like, a $300,000,000,000 company or something at the time. And it, like, almost, like, I don't know, three x or 10 x.

Speaker 1:

Maybe it was, like, maybe it's eight x.

Speaker 2:

To be fair, JPMorgan did downgrade Oracle's credit rating today.

Speaker 1:

On the Karpathy pod, you think that's what happened? They were like, oh, they're watching Dwarkesh. They're like, we gotta downgrade this. This is wild. Do you know who Connor Leahy is?

Speaker 4:

Yeah. So he's like a famous I think he's a big doomer.

Speaker 1:

He's a doomer. Right? Yeah. So he says, I wanna preregister the following opinion. I think it's plausible, but by no means guaranteed that we could see a massive financial crisis or bubble pop affecting AI in the next year.

Speaker 1:

I expect if this happens, it will mostly be for mundane economic reasons: overleveraged markets, financial policy of major nations, mistiming of bets, even by a small amount, and good old fraud, not because the technology isn't making rapid progress. I suspect such a crisis to have at most modest effects on timelines to existentially dangerous AI/ASI being developed, but it will be used by partisans to try and dismiss the risk. Sadly, a bunch of people making poorly thought through leveraged bets on the market tells you little about the underlying object-level reality of how powerful AI is or soon will be. Do not be fooled by narrative. So he's saying, yes.

Speaker 1:

We gotta go even harder. The bear market's bullish for doom, potentially. Underrated AI doom scenario: the bubble pops so hard that everyone goes bankrupt, there's mass famine and starvation, and humanity's population collapses, all because we went too levered long AI. What do you think about that?

Speaker 1:

Plausible? It just might be. Matt Palmer says, I've been saying this. Update your models and timelines. Foam sells.

Speaker 1:

Mankind is back. Mankind is back. And Figma is back. Think bigger, build faster. Figma helps design and development teams build great products together.

Speaker 1:

How it feels listening to this podcast, Han Wang.

Speaker 2:

Can we pull up the speaker?

Speaker 1:

Yep. It's balloon popping. We will see about that. Who knows how this will all play out?

Speaker 2:

François Chollet says the reason it is so important for everyone to keep pretending that AGI is definitely right around the corner is that there is now over $1,000,000,000,000 of investment riding on this belief, either already expended or committed. Current and recent past CapEx cannot be justified by current use cases and technology, currently spending 10 to $15 to make $1. To ever be in the black, you need dramatically better tech applications, and you need them fast, before current data centers depreciate, which is a three-to-five-year time scale. Now, Brown was calling him out, saying, didn't you just say two months ago that

Speaker 1:

you No. Just gonna open the air.

Speaker 2:

That you think AGI is about five years away? Of course.

Speaker 1:

Yes. And Chase Coleman, the founder of Tiger Global, came out of hiding on the same weekend as the Karpathy interview. He said AI labs are promoting slop as innovation to secure funding, and we're decades away from AGI. Sell it all today.

Speaker 2:

It was a good was a good panel. Yeah. Listen, Brad Brad Gerstner.

Speaker 1:

I have I didn't I didn't have a chance.

Speaker 2:

We're on it.

Speaker 1:

I should have thrown that one in the mix. I listened to the full Palmer Luckey on Joe Rogan episode. That was a lot of fun. Palmer will be on the show tomorrow. And I listened to the Karpathy episode, which was great.

Speaker 1:

Oh, another interesting Karpathy tidbit: he invented the term hallucination. He, like, hallucinated calling what LLMs do hallucination, which is very funny. I just thought that was really cool. Anyway, we should move to the Napoleon jewels that were stolen from the Louvre in a major robbery.

Speaker 1:

The thieves stole jewelry from display cases in the Galerie d'Apollon.

Speaker 2:

Finally, a brazen heist. Finally, a brazen heist. There have been little to no brazen heists.

Speaker 1:

If you're new here on this show, we are anti heist. We are anti crime. Do not do this. If you have information that could lead to the arrest of these thieves, let us know. We will send you a hat.

Speaker 1:

That's the best we can do. But Luke Metro says, is this the cure for male loneliness? I hope not because we don't want more brazen heists. We want less brazen heists. Leave the crown jewels alone.

Speaker 2:

They used a truck-mounted furniture elevator to reach the gallery housing the artifacts and then sped away on motorcycles, officials said. Tourists were streaming into the world's most visited museum on Sunday when a group of thieves burst in through a window of a gilded gallery on the second floor and made off with a set of priceless royal jewels. It was broad daylight, roughly 09:30 local time, when four individuals driving two powerful motorcycles and a truck with a portable furniture elevator parked outside the Louvre Museum.

Speaker 1:

I didn't know what a furniture elevator was. Have you ever heard of this before? It seems like actually a really awesome invention, because if you need to get a couch to the top of, like, your house and you can't take it up the stairs, you just put it on this. It looks like a fire truck, basically. The fire truck pulls the ladder up, and then the elevator slides up.

Speaker 1:

But they move the

Speaker 2:

And it is funny. When I saw the picture of the furniture elevator outside the museum, it just kinda makes sense. Right? Normally, when you see something like this, you're not thinking to yourself, that's out of place.

Speaker 1:

Yeah. No. No. No way.

Speaker 2:

There's maybe one person that works at the museum that would think, that doesn't seem scheduled. Yeah. No. Totally.

Speaker 1:

It happened during normal hours. In less than seven minutes.

Speaker 2:

So Yeah. Of the perpetrators, at least one was wearing a high-visibility yellow vest That looks amazing.

Speaker 1:

They blend in so much. It just looks like workers. Like, oh, they're gonna go work on the balcony. They need some equipment. Like, that's really the value of, like, the professional equipment.

Speaker 1:

But there was a there was

Speaker 2:

They used angle grinders to cut through the window and get inside.

Speaker 1:

Yep. Right. They attempted but failed to set fire to their truck. So they were trying to burn the truck on the way out to, like, get rid of all the fingerprints, I suppose. And they dropped the crown of Empress Eugénie, with nearly 1,400 diamonds, before they sped away.

Speaker 1:

It was found damaged. No one was injured. Eight pieces in total were stolen, including an emerald earring and an emerald necklace that belonged to Empress Marie Louise.

Speaker 2:

Such a brutal fumble.

Speaker 1:

That is. A fumble of 1,400 diamonds. Massive fumble. Absolutely brutal. An emerald ring and an emerald necklace. Some of these things are massive.

Speaker 1:

I didn't know this, but The Wall Street Journal takes us through some some history of burglaries. There's been a long string of

Speaker 2:

French history. This is a very

Speaker 1:

It's extremely French. Leonardo da Vinci's Mona Lisa was stolen from the Louvre in 1911 by an Italian carpenter, only to be found two years later. He got away with it for two years. In 2010, a thief broke into Paris's modern art museum and made off with more than $120,000,000 of artworks, including some from Picasso and Matisse. The thief, dubbed Spider-Man, was sent to prison for the crime and later appeared in a Netflix documentary.

Speaker 1:

What do you think they're gonna do with the stolen jewels? We were debating this at breakfast. It's gonna be hard to fence this. I think that's the term. You you can't just go on eBay and throw it up there.

Speaker 2:

Facebook Marketplace.

Speaker 1:

Facebook Marketplace was built No.

Speaker 2:

Yeah. It it's an interesting challenge because

Speaker 1:

Yeah.

Speaker 2:

Yeah. Obviously, they're you're they're never gonna be able to capture the same value necessarily as if you're able to do an open auction. Yep. Right? If you're able to do an open auction and you could buy this legally Yep.

Speaker 2:

And the person that bought it could then go flex it around and publicly be the owner of of these, you know, incredible historic jewels. Yep. That would get you sort of the maximum price. That being said, given the historical significance, think there's there's a small pool of buyers. Yeah.

Speaker 2:

There's probably a 100 people in the world Yep. That would be willing to pay tens of millions Yeah. And potentially more Yeah. Just just to know that they had it and just to be a family secret.

Speaker 1:

Do you think there are any buyers that would flex it? Because No. Obviously, if you're an ally of France, you're not gonna do that, because they'll immediately come and be like, those are stolen. Give it to us. Doesn't matter what you paid.

Speaker 1:

You gotta return it. If you're America or Germany or Britain. Right? But what if you're Vladimir Putin? What if you're Kim Jong Un?

Speaker 1:

Like, if you're in North Korea, like, oh, France is gonna invade North Korea for a couple of jewels? Kim Jong Un just rocking it himself. I wouldn't be surprised. I don't know. I feel like there are some countries that are actively not friendly with France that could just say, you know what?

Speaker 1:

We're gonna we're gonna buy this and we're gonna We're just gonna rock it. It's gonna be in our museum now. It's a stolen artifact. A lot of museums have stolen artifacts.

Speaker 2:

Yeah. The other the other play is to you know, if if you're a family that does you know, is is into long term planning Yep. You could say, alright. We're gonna store these for for literally a 150 years as a family.

Speaker 1:

Mhmm.

Speaker 2:

And once a hundred and fifty years pass, we're gonna be able to pop up and we're gonna be able to talk about our jewels publicly, because everybody that was around when they were stolen isn't here anymore. Nobody's really gonna care. The entire world population will have turned over, and we'll be able to suddenly come out and make up some story of Yep. Of how our family got them.

Speaker 1:

Yep. We need to up our security because I feel like someone's gonna steal that horse

Speaker 2:

Yeah.

Speaker 1:

If we're not careful.

Speaker 2:

They're gonna use a furniture elevator to get out of

Speaker 1:

here. If you wanna manage risk in your business, go to vanta.com. Automate compliance, manage risk, prove trust continuously. Vanta's trust management platform takes the manual work out of your security and compliance process and replaces it with continuous automation.

Speaker 2:

Okay. This story from Friday, Saturday Yes. I was cracking up.

Speaker 1:

Tell me.

Speaker 2:

So long story short, some OpenAI researchers

Speaker 1:

Mhmm.

Speaker 2:

Were able to use ChatGPT to find the results to a handful of problems

Speaker 1:

Yes.

Speaker 2:

That were previously marked as unsolved. Sure. What are they called? The Erdős problems. Erdős.

Speaker 2:

It's from Paul Erdős. Paul Erdős. He's a mathematician. And so they said, update: Mehtaab and I pushed further on this using thousands of GPT-5 queries.

Speaker 2:

We found solutions to 10 Erdős problems

Speaker 1:

Mhmm.

Speaker 2:

That were listed as open. And so they put this out, and it looked like ChatGPT had made this incredible novel discovery. Yep. And then it turns out they were just using ChatGPT effectively as web search.

Speaker 1:

Knowledge retrieval. Novel knowledge retrieval. Someone had put these up there. Yeah. Okay.

Speaker 2:

And they were kind of positioning it. If you went back and looked at their original post, you could see how they were saying, like, we just found the solutions. Yep. We didn't generate them.

Speaker 1:

Yep.

Speaker 2:

But the way that it was written was definitely like, a lot of people would have just read it as, oh, wow. GPT-5 is doing Yeah. Novel discovery.

Speaker 1:

I have a very funny story about this.

Speaker 2:

And so, of course Yep. Just just to close it up, Demis over at DeepMind just commented, this is embarrassing.

Speaker 1:

Absolutely roasted. And the OpenAI researcher said, I deleted the post. I didn't mean to mislead anyone, obviously. I thought the phrasing was clear. Sorry about that.

Speaker 1:

Only solutions in the literature were found. That's it. And I find this very accelerating because I know how hard it is to search the literature. And Yann LeCun is not happy.

Speaker 2:

I'm not gonna read that one.

Speaker 1:

He's making a play on hoisted by their own petard, which is where you are a victim of something that you did. It's like stepping on a rake that you put down on the ground. He is not making an allusion to the r-word or any sort of slur. So just to be clear there. Anyway, I had a funny story. When I was in Silicon Valley in 2012, there were these, like, extremely hard programming challenges. I could do the first couple of them, but I couldn't actually get that far because they were really, really difficult.

Speaker 1:

And I didn't really have a reason to, like, go and be, like, a competition programmer. I was just trying to build a company. But I wanted if you could get the service to, like, accept your result, you could get this badge that you could put on your website, and it would link into your profile, and it would be, like, very legit. It was like, you know, programmer cred. And so what I figured out was that there were a lot of answers out there that were, like, on GitHub and open source.

Speaker 1:

And you could go and find those, put those into the system, and get authenticated as, like, having solved these really hard problems. And so I got, like, the gold medal and was able to put it on my website, and it made it look very official, even though I could not do that level of programming. So I was doing the same thing, basically. So, you know, maybe I shouldn't be casting stones here. Anyway

Speaker 2:

Elon Oh, yeah. He's

Speaker 1:

he's He made it. Taking shots.

Speaker 2:

Dunked on well, Gabe over at OpenAI. Well, to start it off Yeah. Elon says, my estimate of the probability of Grok 5 achieving AGI is now at 10% and rising. Mhmm. Gabe says, 10% chance Elon declares he reached AGI a fourth time.

Speaker 5:

Elon says

Speaker 1:

Has Elon ever actually declared he reached AGI? I feel like he has not declared that. Like, I feel like Elon says a lot of crazy stuff about, like, I hope that we'll be able to do novel physics. I think the next version's gonna be really powerful, this and that. But I don't know that he's ever actually called a version of Grok AGI.

Speaker 1:

But, anyway, it clearly upset Elon because he replied and said, you call yourself a, quote, unquote, researcher. Hitting somebody with the double quotes, we know the pain that that inflicts. It's terrible. You never wanna be hit with the double quotes. And he says, pathetic.

Speaker 1:

And so, rough rough day on the timeline. Timeline in turmoil.

Speaker 2:

Jensen Huang was at a Citadel event Mhmm. Future of global markets

Speaker 1:

Mhmm.

Speaker 2:

And said we are a 100% out of China. We went from 95% market share to 0%. I can't imagine any policymaker thinking that's a good idea. And, yeah, there was some pushback here. Matt Palmer says, 100% out of China, but doing astonishingly healthy business in Singapore and Malaysia.

Speaker 2:

Totally out of the blue and not for any intelligible reason. Swinging around accusations that Singapore and Malaysia are fronts for Chinese chip demand.

Speaker 1:

Yeah. People have been going back and forth on Jensen's involvement in China. Shyam Sankar took to the pages of The Wall Street Journal. I'm sure we have a copy here, actually, talking about NVIDIA in China. We actually have some folks coming on the show later today, hopefully, to break down some of what's happening both in the rare earth world and also in the semiconductor world with regard to China.

Speaker 1:

If you are joining us and expecting an interview with Ryan Cohen, we are working through some technical difficulties. We are hopefully going to be getting him on the phone in just a few minutes. But first, let me tell you about graphite.dev, code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. Palantir CTO Shyam Sankar said, why the China doves are wrong.

Speaker 1:

American business leaders cozying up to Beijing refuse to see that the communist party wants us to fail. China's commerce ministry last week announced far reaching export controls on lithium batteries, products that use Chinese rare earth materials and related technologies. The export controls, which president Trump characterized as a rather sinister and hostile move, are the latest reminder that The US is funding its own destruction through economic dependence on a communist adversary. Many American business elites persist in denying this reality. Jensen Huang is just one of them.

Speaker 1:

He said in a recent interview that while some Americans wear the label China hawk as a badge of honor, it is really a badge of shame. The future, mister Huang says, doesn't have to be all us or them. It could be us and them. A nice sentiment. But the CCP leaders don't believe it.

Speaker 1:

They often speak soothingly of their country's peaceful rise, but the party's history and actions tell a different story. Influenced by the Chinese Civil War and the much earlier Warring States period, the party believes that stability comes from control. This belief explains its ruthless effort to consolidate power. The Communist Party believes China and The US are locked in a great struggle for mastery. In this worldview, it is not enough for China to rise.

Speaker 1:

The US must fall. It is a dark telling of the current state of affairs. It is a rough time in geopolitics. He closes by saying, the first step to ending our dependence on China is admitting we have a problem. We can continue as useful idiots, decrying China hawks who point out that we're funding our own demise, or we can wake up to the reality that we're already in an economic war in which every purchase and investment will help determine which system survives.

Speaker 1:

Very, very rough. I mean, a lot of this is all teeing up a major trade deal. And, I mean, we're starting to see the outlines of exactly how big the chip stacks are on either side of the poker table, if you wanna use that analogy. The rare earth equation is certainly a big one. America has a bunch of levers to pull, but the chip controls are one.

Speaker 1:

It feels like a smaller chip stack, honestly. Because even when NVIDIA says, hey. Okay. We're gonna sell, you know, H20s in China. China says, oh, we don't want them.

Speaker 1:

Like, we'll just stick to Huawei. We will wait. We will, you know, be a little bit later on the slop curve, on the sloppification of our economy. And we'll see you in ten years when we're actually fighting it out for AGI. And at that point, we will have our own full semiconductor stack on the frontier.

Speaker 1:

Stack. Yeah. I mean, that feels like what they're thinking of. I don't know what their interpretation of the Andrej Karpathy interview was, but it seems like, if they truly believed the next scaling run is what does it, they would be like, oh, let's take the NVIDIA chips as well. Instead, they're saying, okay.

Speaker 1:

Well, let's actually gear up for independence over a long period of time. Wall Street Engine has a story here. JPMorgan says NVIDIA CEO Jensen Huang's projection of AI CapEx growing from $600,000,000,000 today to $3,000,000,000,000 to $4,000,000,000,000 by 2030 is financially feasible, though ambitious. The bank expects the tech sector to fund it through operating cash flow, private equity and venture capital inflows, and new debt issuance, which we've been tracking. Even with a projected $1,200,000,000,000 to $1,600,000,000,000 annual funding gap, private markets could contribute about $500,000,000,000 per year by 2030, leaving the rest to be covered by leverage expansion.

Speaker 1:

JPMorgan estimates 40% of new debt, $430,000,000,000, would come from bank loans and 60% from bond issuance. So there's gonna be a new $1,000,000,000,000 of debt coming into the system now. Even with this increase, the tech industry's net debt to cash flow ratio would rise only from 0.7x to 1.2x, still below the global average. So that's the case for, like, you know, everyone's investing a ton, but we're not completely over our skis. It is a very aggressive build out.
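A quick arithmetic check on the figures as quoted above. The inputs are the JPMorgan numbers read on the show; the only thing added here is the back-of-the-envelope math.

```python
# Rough arithmetic check on the figures quoted above. All inputs come from the
# JPMorgan numbers as read on the show; nothing here is an independent estimate.

bank_loans = 430e9         # 40% of new debt, per the quote
bank_loan_share = 0.40
total_new_debt = bank_loans / bank_loan_share
bond_issuance = total_new_debt - bank_loans

print(f"Implied total new debt: ~${total_new_debt/1e12:.2f}T")  # ~ $1.08T, matching the
print(f"Implied bond issuance:  ~${bond_issuance/1e9:.0f}B")     # "new trillion of debt" line

# Funding gap vs. private-market contribution, again using the quoted ranges.
funding_gap_low, funding_gap_high = 1.2e12, 1.6e12
private_markets = 500e9
leverage_low = funding_gap_low - private_markets
leverage_high = funding_gap_high - private_markets
print(f"Leverage needed per year: ~${leverage_low/1e9:.0f}B to ${leverage_high/1e9:.0f}B")
```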

Speaker 1:

It might be it might take, you know, a hit on depreciation, or it might take time to really monetize all this. But Yeah.

Speaker 2:

And the key question is, you know, if you want AI CapEx to scale to $3,000,000,000,000 to $4,000,000,000,000, you're gonna need revenue to support that. You're gonna need a lot of revenue fast. We've seen, you know, tremendous growth on the agent side, you know, developer agents, whether that's Claude Code, GitHub Copilot, etcetera, etcetera. Yeah, it's not that it's just financially feasible. It's that it has to be backed up by something.

Speaker 2:

And thank you for bearing with us, everyone. We are getting Ryan on in any second. Yes. AWS being down is presenting some technical difficulties, but the interview is still happening.

Speaker 1:

Yes. We're working on it. Thank you for sticking with us. And thank you for checking out julius.ai, the AI data analyst for everyone. Chat with your data and get expert level insights in seconds.

Speaker 1:

And that's fin.ai, the number one AI agent for customer service, which is also a sponsor of TBPN. So thank you to them. Dean Ball, who is coming on the show today, says, I have been contacted by a person clearly undergoing LLM psychosis, reaching out because 4o told them to contact me specifically. I have heard other writers say the same thing. I don't know how widespread it is, but it's clearly a real thing.

Speaker 1:

Jordy, have you received any inbound from random people who seem to have GPT psychosis? Hard to say. I feel like my entire, like, unread DM request tab is probably, like, LLM psychos. But those folks don't break through. And honestly, like Yeah.

Speaker 1:

The bot slop has been so bad for so long that

Speaker 2:

Hard to tell what's

Speaker 1:

It's

Speaker 3:

hard to

Speaker 1:

tell what's actually

Speaker 2:

Emmett Cool. Emmett Shear last week was being accused of having LLM psychosis. He said if you're trying to understand AGI core concepts, you should familiarize yourself with Thermal time, quantum Darwinism, quantum Bayesianism. Free energy principle, the zeta function. Integral theory.

Speaker 2:

AGF, CFT. I think

Speaker 1:

that I

Speaker 2:

think he was I think he I think he was joking here.

Speaker 1:

Yes.

Speaker 2:

But but

Speaker 1:

We've talked to

Speaker 2:

him. Many people did not did not take it that way. Yes. Yes. It and it

Speaker 1:

kind of spread out. Yeah. Like, lots of people have LLM psychosis, and lots of people have LLM psychosis psychosis too, where anyone who's using an em dash is seen as someone who has, like, no brain left because they've been destroyed, and that's also not true. Yeah. Emmett went on a tear like this a couple months ago.

Speaker 1:

We had him on the show after that, and he seemed fine. He was, like, completely normal on on the on the chat and was, like, clearly just having fun. And I think he's just vague posting. I mean, look at his look at his fave icon. He clearly likes to, you know, post about weird and esoteric things and just, you have a good time.

Speaker 2:

Mike Isaac is highlighting there's a company called Aerie. It's the first big company that this woman, Rachel, has seen that formally states they won't use AI generated imagery. It's one of their most engaged posts with 33,000 likes. Mike Isaac says, I do think "none of what we do involves AI" increasingly becomes a marketing thing over the next few years as we hit saturation. Smart posturing, in my opinion.

Speaker 2:

So, yeah, Aerie's saying, in 2014, we stopped retouching. Today, we commit: no AI generated bodies or people, real people only. We believe transparency isn't a trend. It's our promise to you. No retouching, no AI, 100% real.

Speaker 1:

Interesting. I do do you know this brand, Aerie? Are you familiar with this brand before I mean, good way to go viral and stuff. But, yeah, I don't know. I I I wonder I wonder where where this goes because, like, in Hollywood, there's been a big pitch for, this movie doesn't have any CGI.

Speaker 1:

You know, we shot it all practically. And then what that really means is that, okay. They use miniatures, but they actually did a bunch of green screen stuff, and they did some set extensions. And, yeah, there were some three d models over here and there, but it was mostly in the background. And it's just not a crazy three, you know, CGI explosion.

Speaker 1:

Doesn't look like transformers. But if you just look through the credits on one of those movies like, Christopher Nolan's famous for this saying, like, you know, we shot everything practically, or we did or or we really steered away from from CGI. If you look through the credits, you'll see a bunch of three d artists because they're clearly doing some three d reconstruction of certain things. But it might just be, oh, there was, a boom pole in this shot, or, like, oh, there was a light stand in the back, or, like, oh, you could actually see, like, a taxicab, and we were shooting in in, you know, the eighteen hundreds. So, like, we gotta pull that out.

Speaker 1:

So just go in there with the with the the pen tool and, like, you know, replace it with AI. Anyway, we have our first guest of the show. We have Ryan Cohen, the CEO of GameStop. Can you hear me, Ryan? How are you doing?

Speaker 3:

I can hear you.

Speaker 1:

Okay. Sorry. We have we have serious, serious technical difficulties today. But I believe we have you here on the Thank you so much for joining. Thank you so much for taking the time.

Speaker 1:

How are you today?

Speaker 3:

I'm doing well. How about yourself?

Speaker 1:

We're doing great. We'd love to just get, the story of GameStop from your perspective, kind of how you tell it now. You're a couple years into this project. We'd love to kind of get the the high level just to set the stage.

Speaker 3:

Where do you wanna start?

Speaker 1:

Maybe the original idea. Like, when did you think you'd become involved with this company?

Speaker 3:

I originally invested as a passive investor.

Speaker 1:

Mhmm.

Speaker 3:

I had a few conversations with the management team and the board of directors, and I realized that I saw a lot of stuff that you read about that you don't really believe until you actually engage with them. And as we went through conversations, I went from passive to active and, filed the 13D, I believe, in, August 2020, and then I joined the board in January 2021. And the rest is history. The long story short is there's been a a lot of changes at the company, a lot of cost cutting Yep. A lot of rationalization focusing on the basics of running a profitable business.

Speaker 3:

And a lot of it was getting people out of the way. A board of directors that had perverse incentives, a management team that had perverse incentives. Ultimately, what you see is that when people don't have their own money on the line, they don't give a shit. Mhmm. And they're focused on all kinds of other stuff that ultimately doesn't matter to shareholders.

Speaker 3:

And a lot of it was getting that out of the way so that we could actually focus on the business. And, you you see it in the results of the business today versus where it was not just when I joined the board, but, you know, you look at the results even prior to that. It's a it's a tough business.

Speaker 1:

Yeah. Totally. Well, what do you think the mainstream media gets wrong about the GameStop story today or has gotten wrong throughout the whole saga?

Speaker 3:

That's not an easy question to answer because we're talking in generalities, and

Speaker 1:

Sure.

Speaker 3:

I don't wanna you know, I can't stereotype.

Speaker 1:

Sure.

Speaker 3:

In general, everyone thinks they know. The mainstream media thinks they know. I'm running the business, and they seem to have greater visibility into the business than I do. So everyone has an opinion as opposed to actually just focusing on the results. And in general, the news should just focus on the results instead of, comments from the peanut gallery.

Speaker 2:

Yeah. Let's let's let's talk about q two specifically. What what made that one of the biggest quarters in years?

Speaker 3:

It's a it's a focus on running a a business like an owner and cutting costs and making money and leaning into the areas of the business where there's margin potential, like trading cards, and then getting rid of the bloat. As an example, I was just looking at this earlier today before we spoke. In 2021, there were over 1,400 people in corporate. Wow. Today, there's, like, 400 people.

Speaker 1:

Wow. Yeah.

Speaker 3:

So much more productive today than we were in 2021. SG&A has come down by, like, 50%. So people build teams. They hire people. Ultimately, what that means is they delegate their work to someone else.

Speaker 3:

You end up taking on this crazy cost structure, and it works when the business is growing, but doesn't work well once the business stops growing. And that's where GameStop was, and, physical retail is tough.

Speaker 1:

Yep. How much of that corporate restructuring going from, I think you said, 1,400 people down to 400, how much of that is just reorienting around A players, aligned incentives, picking the best person to actually run a specific initiative versus using technology, using software, using AI? Is there any sort of narrative around that that you've found success with?

Speaker 3:

It's both. Mhmm. Even finding the right people, though. Yeah. You know, you don't you don't know.

Speaker 3:

I mean, I've been interviewing people for a long time, and you meet people, I meet them, and I get really excited because they're really good at the interview process, and they know how to say the right thing. So what does that mean? Means they've got really good public speaking skills, so they over index there. And then you actually when you look at their execution skills, they're not great.

Speaker 5:

Mhmm.

Speaker 3:

So I've been equally excited about people that I've hired that have worked out as I have been about people that I've hired that I wasn't excited about, and then they prove to me that they can actually execute. So, you know, the results speak for themselves. You you really only know once you put someone in the position and you see what they're capable of. They can tell a great story. They can put together a great PowerPoint presentation, but you don't you don't actually know until you see what they're what they can actually do.

Speaker 3:

So Yeah. There's no question that artificial intelligence just broadly speaking, technology has increased productivity. You know, that's that's been a big benefit and then doing more with less.

Speaker 2:

Yep. You take a $0 salary. You have billions of dollars of cash on hand. How do you plan to allocate it over the next two to three years? I don't wanna I don't wanna go too long term.

Speaker 3:

There's, we don't have a gun to our head. So it needs to be a situation where the downside is limited and the upside is really high. And, yeah, that's a different calculus than the world of private equity or or venture capital or any money managers where they're incentivized to deploy capital because they get management fees. In this case, a fancy way of saying is it it's risk adjusted. And I I don't want to lose money, and I I want a situation where there's a good chance of making money and a really low chance of losing money.

Speaker 3:

So it needs to be a pitch that is pretty much down the middle.

Speaker 2:

Does that mean you're waiting you're waiting for a a crash?

Speaker 3:

You never know what's gonna happen in the financial markets. They can go from from green to red and and they don't flash yellow. So Yeah. When that happens, we'll be in a position. But

Speaker 2:

Yeah. I I just feel like it's I know. I just feel like it's notable that you have all these, you know, digital asset treasury companies that are just market buying, you know, obscene amounts of of, you know, Bitcoin and and other tokens, and and you guys are, you know, taking a longer view.

Speaker 3:

Yeah.

Speaker 1:

What are you excited about in the core business over the next few years between retail online? You you mentioned trading cards. Can you unpack a little bit more about some of the the key initiatives, like the customer behaviors that you see as really big opportunities?

Speaker 3:

I tried a lot of different things.

Speaker 1:

Mhmm.

Speaker 3:

So I originally went in with the the Chewy playbook Yeah. Was we focused first on consumables. So we had a lot of success on pet food, treats, litter, things that you could put on auto ship. And then it went from focus on consumables, getting customers on autoship to we're gonna be the everything pet store, and we're gonna expand our catalog, and we're gonna add all these hard goods. So I had all these preconceived biases where I was going to copy the Chewy playbook at GameStop and basically be, like, the everything store Yep.

Speaker 3:

For gaming. And we, I hired a bunch of fancy people from both Amazon and Chewy, and we expanded the catalog. And we added a bunch of product, and most of that product didn't sell. And we ended up marking it down and taking a big hit, and it cost shareholders a lot of money. Because once you have the product trapped in stores, you gotta mark it down in order to move and get it out of the stores.

Speaker 3:

Whereas within the consumable space, if we ever bought it if we ever overbought inventory, we just waited and ultimately, you know, the cat food or whatever we bought too much of ended up selling. Yep. So physical retail is very, very, very different than e-commerce. And I spent a lot of money to figure that out. And so what I learned is that we went into all these categories, and a lot of, we took some significant hits on losing money.

Speaker 3:

We lost money. It's that simple. By trying to expand into all of these categories that were not core to the GameStop customer. And then along comes collectibles. And, obviously, after expanding into categories where there's very little success, your risk appetite at that point is pretty low.

Speaker 3:

All of a sudden, we see that they're, like, the GameStop customer. Yep. Really, when it comes to trading cards, there's a strong appetite for for trading cards. And that category has done very well, and we've gone from, like, 10% of our sales to close to a third probably for the full year is gonna come from collectibles. And

Speaker 1:

That's remarkable. That's Congratulations. Yeah. I I wanna I I wanna know more about the collectibles thing. Let's let's table that for a second.

Speaker 1:

I'd love to know more about where you see the traditional video gaming market going. I'm I'm experiencing kind of whiplash because I see mobile games and and free to play growing and micro payments happening. EA is getting taken private. You have a lot of stuff going on on that end. But then you also have Palmer Luckey kind of bringing back the Game Boy and the N64 with Chromatic, and I know you're you're you're partnering on that.

Speaker 1:

What what are you excited about? What does the shape of the traditional sort of, like, video game market look like over the next couple years?

Speaker 3:

The video game market is definitely going from physical to digital.

Speaker 1:

Mhmm.

Speaker 3:

So our ability to play in the digital world is limited. There's a lot of money that's being spent, and, we've taken a CapEx light approach, like a a a pretty risk light approach to the digital world. If there's not a a clear path to us being able to to make money and a a payback period that's that's pretty attractive, then, this isn't a story of, like, moonshots. Sure. So there's no moonshots that are being done.

Speaker 3:

Like, we're

Speaker 1:

Execute.

Speaker 3:

We're focused on real returns.

Speaker 1:

Yep. What about wearables, virtual reality? We heard a story about the Meta Quest or the the Meta Ray Ban displays, and a friend of ours went and had to test them at Best Buy and didn't have a great experience. And it feels like, if we do enter sort of like a wearable era, there's a renewed demand for in store experiences. Is that on your radar at all, or is that more of like a, you know, futuristic thing that you'll deal with it when it comes down the pipe?

Speaker 3:

I mean, the when I think about wearables, the Apple Watch is a good product. When I think about the Meta Quest, I mean, it's a joke. Okay. The all of the virtual reality stuff doesn't feel like like, who's gonna walk around with these retarded glasses on their head? Like, it just it's it doesn't it does not seem like that's the future.

Speaker 3:

But if we could sell a product and we can make really high margins, then, obviously, we're gonna sell the product, but we're not investing in in in virtual reality or in in the metaverse unless there's a clear path to being able to deliver results for shareholders.

Speaker 1:

Yeah. Yeah. I meant more just as, like, Apple does have the Apple Store network. Other companies that are trying to get into wearables and might need in store demos don't have one of those. But, yeah.

Speaker 1:

I mean, your your your rationale makes a ton of sense.

Speaker 2:

It's so interesting to think about the you know, when when I hear you talk, Ryan, it's you're you're everything you say aligns with thoughtful capital allocation

Speaker 1:

Yes.

Speaker 2:

And yet, the the broader, like, world seems to believe that this is this is just about being, you know, kind of this is, you know, it it doesn't

Speaker 1:

It's moonshots.

Speaker 2:

The other there there's there's CEOs out there that have effectively meme stocks, and they just act and talk a lot differently Yeah. Than you do.

Speaker 1:

No. This is refreshing.

Speaker 2:

What do what do you think about some of the the conversations? There's been, I guess, rumors that the admin is interested in companies moving to biannual reporting instead of quarterly reporting. Do you think that's smart? What what kind of moves from the admin this year have you been particularly interested in?

Speaker 3:

I like reducing costs. So if it costs us less money to report biannually, then by all means. I think it's important for shareholders to have visibility into how the company is doing and then making a determination whether they wanna stay invested or not. But if we have an opportunity to reduce our costs, I mean yeah. So it it costs a lot of money to be a public company.

Speaker 3:

Yeah. We spend a lot of money on audit fees. So that hurt me at Chewy. That hurts me at GameStop. They they charge us a lot of money.

Speaker 3:

So if it if it ultimately means we spend less money on being a public company, and it makes us more efficient, then that's that's fantastic.

Speaker 1:

I I've I did did I didn't mention it, but thank you for Chewy. I've been a subscriber for probably a decade. You know, it's it's been the backbone of my household and my dogs thank you as well. I'm interested to hear your take on kind of evolution of e commerce. There's a lot of chatter about agentic commerce and people buying products through their chat apps.

Speaker 1:

Have you looked into that? Do you have a a a current, framework for thinking about the the adoption rates of people buying stuff through an LLM?

Speaker 3:

Amazon started selling pet products in the late nineties. Mhmm. And they were they were doing okay. Mhmm. Chewy comes along.

Speaker 3:

We originally wanted to start off selling jewelry. We actually bought hundreds of thousands of dollars worth of jewelry. We went I went to a trade show, and I bought hundreds of thousands of dollars worth of jewelry. We set up the website and everything. And then I was shopping in a neighborhood pet store, and I had a, at the time, like, a five or six pound teacup poodle.

Speaker 3:

And I kept on going to this neighborhood pet store every few weeks. And I ran into the office. By the way, I don't wear jewelry, so I was not passionate about the category. I didn't know anything. That's important.

Speaker 1:

But I

Speaker 2:

went to trash. I felt like, you're just kind of it was picking, like, this seems like a good Yeah. Business. Business. Little And it's light.

Speaker 2:

You can ship it.

Speaker 1:

Yeah. It was a little too mercenary. Like, you gotta get into the you know, build something that you want so you can give yourself feedback.

Speaker 3:

Yeah. But you think, like, intuition is, like, there's margins in jewelry, and so, like, you can make money. There was there was a Blue Nile at the time that had really high gross margins.

Speaker 2:

And I

Speaker 3:

was like, we can do we can do well in online jewelry. Anyway, I went to this jewelry trade show. I bought hundreds of thousands of dollars worth of jewelry. And then I was shopping kind of at the same time every few weeks at this neighborhood pet store for my poodle, and I was way more interested in the pet category than I was in jewelry. I understood the customer, and so I ended up selling the jewelry.

Speaker 3:

I ended up getting, like, 80 or 90¢ on the dollar. Mhmm. And I went into the pet category, and I liked that well, I understood it. I I like the predictability of the industry. I like the fact that, like, once you're a pet owner, you're buying pet food pretty much for the rest of your life.

Speaker 3:

And and and we shifted. And so Chewy comes along. It's 2011, and I took the playbook from Amazon. So the focus on fast shipping, having a great selection, being able to get onto the website, add an item to the cart, and check out. Our average checkout time.

Speaker 3:

It was, like, less than two minutes.

Speaker 1:

Wow.

Speaker 3:

So, you know, you think about, like, GoDaddy, and I don't know if you've ever bought a domain name, but you go through the checkout process. They're trying to upsell you on, like, a gazillion

Speaker 1:

stuff. Yeah.

Speaker 3:

And then all of a sudden, Chewy is like average checkout time less than two minutes. We're not trying to upsell you. It's like we're gonna get you your pet food as fast as possible at the best price. So Chewy comes along in 2011 and completely, completely disrupts the industry. Yeah.

Speaker 3:

And we disrupted the independents. We disrupted Petco and PetSmart. And we were delivering your consumables at a better price, faster, with an easier experience backed up by great customer experience. And so, you don't necessarily have to be first to the game in order to be successful. Amazon was first to the game, but we focused on the category, and we were successful.

Speaker 3:

And so it's kind of what's interesting in technology in general is, like, you have these technology companies that are trying to do everything. Right? Amazon's trying to do streaming. They're trying to do ecommerce. They're trying to do everything.

Speaker 3:

And then all of a sudden, you have Netflix that is really successful in streaming. You have Chewy that's really successful in pet food. Yeah. So if you focus on a category, you could be very successful.

Speaker 1:

How did you process the d to c ecommerce era? It was, like, really hot in Silicon Valley. Every MBA was raising venture capital, slap a $250,000 brand on a white labeled product and raise some money, and then it kind of fizzled out. But, like, how were you processing that at the time? What's your postmortem?

Speaker 1:

Like, how should people think about building brands going forward? Is there still opportunity?

Speaker 3:

For us, building brand was acquiring one customer at a time and making sure that they got their pet food or whatever the hell they ordered from us at the best price, really quickly. That's how you build a brand. It's not spending a bunch of money on, you know, pets.com, the Super Bowl commercial. It is focusing on the the best marketing is word-of-mouth. So you acquire a customer through word-of-mouth.

Speaker 3:

You have a great customer experience. You deliver the product really quickly at a great price, and you have a happy customer. And that was the way to build Chewy was getting big market leadership and making sure that customers were really happy.

Speaker 2:

What are the three, like, key lessons from Chewy that you feel like you and the GameStop team are applying today?

Speaker 3:

Running efficiently. There's no question that, we ran very a a focus on Cost. On you know, Chewy was they're so different. The thing is, you think, like, ecommerce and and and physical retail are the same, and that's where I I made a lot of mistakes because I showed up like a wise guy at GameStop. I thought I had all the answers, but ecommerce and physical retail are very, very different.

Speaker 1:

Yeah.

Speaker 3:

So in general, within physical retail, what I've learned is, like, you wanna you have to run lean, and you're better off having less inventory than more inventory. So I cost shareholders a lot of money by taking the Chewy playbook at GameStop. And then I learned physical retail, which was a muscle that I had zero memory of, and it was about making sure that when you get the product, you sell it very, very quickly. Because if you don't, the product depreciates very quickly. Whereas with Chewy, if I overbought and didn't sell the product very quickly, it didn't matter.

Speaker 3:

So Yeah. They're not the same. They're, they're not the same. The reason why, in general, both concepts have worked out is because, you know, you you you keep I have my own money on the line, so I'm not gonna stop until I figure it out. And

Speaker 2:

You're him.

Speaker 3:

But it but it costs a lot of money to figure it out.

Speaker 1:

Skin in the game. The man's in the arena.

Speaker 2:

Jordy, out the you're Damn. You're too

Speaker 1:

Yeah.

Speaker 2:

People have said you're too bold as a CEO. Do you think the traditional concept of a of a buttoned up CEO, a CEO that maybe shies away from controversy, is obsolete or should be forgotten?

Speaker 3:

I don't even know what a CEO means. I want at the end of the day, I want whoever's in charge. If it ends up successful, they end up doing really well. If it ends up not being successful, they lose a lot of money. And call them call me the janitor for all I care.

Speaker 3:

At the end of the day, their incentives should be aligned with common shareholders. So, that's the most important thing is that the incentives are are generally aligned.

Speaker 2:

Yeah. You're the loudest shareholder. Yeah. How do you decide how do you decide what battles to fight? Because you got a lot of stuff coming your way at any given point, a lot of opportunity.

Speaker 2:

What's your framework?

Speaker 3:

My framework is what's gonna move the needle. Like, if if we can if if we can make 80% margins, but the upside is limited and so not gonna move the needle, then I don't care. If if it's able to translate into billions of dollars of shareholder value and scalable, then we're talking. So there's a lot of things that are, like, small time. And, yeah, we can make a lot of margin, but there's there's not a lot of upside in terms of shareholders, then who cares?

Speaker 3:

But if it translates into into something meaningful and scalable, then then I'm interested. Do you think

Speaker 2:

do you think power packs could be that?

Speaker 3:

What do you think?

Speaker 2:

Well, the chat is going crazy.

Speaker 1:

It seems like

Speaker 2:

that. They wanna know they they want the the update from your side.

Speaker 3:

Powerpacks is interesting. Power packs, physical and digital, is very interesting. We we we can't get enough inventory. So

Speaker 2:

That's always a good story.

Speaker 3:

I don't I don't wanna say anything because, I mean, users can

Speaker 1:

Do you see it as the

Speaker 3:

same figure out for themselves.

Speaker 1:

Yeah. Do you see it as the same sort of, like, it's more consumption focused, so it fits within the Chewy model? Or is is that the right way to think about this as opposed to, you know, you you you're you're there's less risk of getting stuck with a bunch of inventory?

Speaker 3:

Are we talking physical or digital?

Speaker 1:

Physical, I would imagine, would be like the place where there's inventory risk in some regard. But, if if you view it as more of a consumption product, then like the Chewy story, there's presumably less risk in, holding a bunch of inventory.

Speaker 3:

I like the trading card space. Okay. So we have, we GameStop has a lot of assets. We have the community and the brand that allows us to if you look at digital power packs, we've kept, we've kept it pretty limited in terms of being able to get into the digital category. And and, frankly, it's not a marketing thing.

Speaker 3:

We just we can't get enough inventory. And it's the same thing on the physical thing. But digital is more scalable. Mhmm. So if we had a choice between physical and digital, because digital is more scalable, we're gonna go towards digital.

Speaker 3:

Can't get Yeah. Enough inventory at at least at a price where we can I mean, you you can buy inventory at 110 or 120% of market value, but we're not gonna we're not gonna start doing stupid stuff like that.

Speaker 2:

How do how do you see the collectibles landscape evolving over the next five years? It feels like different players have picked a focus, whether that's live, traditional auctions, something, you know, like like you just discussed, as well. But, like, what what is what is the shape of the market going forward, in your view?

Speaker 3:

They're all connected. Like, if you look at the overall collectible space, it's looked at as a store of value, and it's been that way for decades. If you look at trading cards, you it's, like, nostalgic to, you know, I I I grew up. I didn't I collected trading cards a little bit, but there's definitely a comeback right now on trading cards. So it's looked at as a store of value.

Speaker 3:

Whether that continues or not, who the who knows? I mean, everyone is like, well, it's gonna continue. Everyone thinks crypto is gonna continue. Nobody knows. Yep.

Speaker 3:

But but we're having a lot of success when you look at our assets right now. Like, the the way it stands right now is we're selling the product very quickly. We can't get enough. The more inventory we have, the more product we could sell. So we're gonna run efficiently. And if we can sell the product, great.

Speaker 3:

And if it happens where we can't sell the product, then we're gonna adjust. And, you know, we're gonna lower our costs and gonna focus on on on the things that make sense.

Speaker 1:

Do you have a do you have a take on LaBubu? I feel like if I find out about it, I'm ultra late, and I might have top ticked it when I finally learned about that. But it seems like it's somewhat important to the collectible boom, the story there. Do you have any idea what's going on?

Speaker 3:

We I don't do we sell the boo boo right now? I don't think we sell the boo boo.

Speaker 1:

Yeah. Yeah. I'm just wondering. Like, yeah, it's, like, kind of an odd strategy. It has some of the unboxing characteristics.

Speaker 1:

Very popular. Seemed like it just kind of emerged out of China out of nowhere. I was wondering if you had tracked the market at all.

Speaker 3:

No. Should we sell it?

Speaker 1:

I don't know. It might be too late. I have

Speaker 2:

no idea. They're very demonic in our view.

Speaker 1:

We we we we think they kinda just don't have the right vibe, and, I feel like there there's plenty of other collectibles that would be more on brand, personally.

Speaker 2:

How do how do you

Speaker 1:

What was that?

Speaker 3:

Is it females or males?

Speaker 1:

I don't know.

Speaker 2:

I think it's all sorts of people buying them.

Speaker 1:

I saw Tim Cook had one. So How are you

Speaker 2:

thinking about how do you think about M&A in the collectible space? I'm sure Mhmm. People come to you all the time with sort of platforms that maybe have some scale, but not quite, the scale that you have that that would love to sell to you. But, obviously, companies are bought, not sold. So I'm curious, if you've if if it's something that you would explore in the future.

Speaker 3:

They don't come to us as often as you think because they know well, they probably go to private equity or venture capital or these fancy hedge funds before they come to us because I wanna make sure that I'm I'm I care about cash flow and the price that I pay. So Yeah. We don't we see some deals, but it's hard to compete against guys that are, or girls, whatever, that are getting management fees.

Speaker 1:

You think, do do you think AI, plays into the collectibles world at all? Just this idea that, like, if you have a piece of IP, you can instantiate it maybe much quicker across a whole host of images and videos and kinda build out an intellectual property world faster. Is that actually an accelerant to the collectible trade?

Speaker 3:

I in general, I have been I'm the person that's very cynical when it comes to emerging technologies.

Speaker 1:

Sure.

Speaker 3:

So, like, autonomous driving is an example. Like, everyone with, like, GM, Ford, all of the big OEMs, they're finished because there's gonna be autonomous driving. Oh, it's it's, when it comes to AI, it's a big problem. At some point, the computers are going after the humans, and I don't think it's that far away. I think that the sci fi movies, when it comes to AI, I feel it.

Speaker 3:

I feel like there's gonna be a big problem when it comes to artificial intelligence. And at some point, it's gonna be the computers against the humans. China versus The US, who's gonna be the winner? Who the fuck knows? But we got big problems with AI.

Speaker 3:

And it's interesting because you can't stop human innovation, and we've got this insatiable appetite to go into these technologies like artificial intelligence that are very disruptive. There's lots of money that's that is being poured into it. But what the future looks like, we have to be very, very, very careful. So artificial intelligence, it scares me. I mean, I I like the productivity benefits, but AI, once the robots come after us, scares me.

Speaker 2:

What's your timeline there?

Speaker 3:

It's faster. It's faster than I would have thought. When I look at what's happening I don't buy into emerging technologies. But when I look at the vast advancements in AI, this is no joke. And we have to be very, very, very careful about what's gonna happen to artificial intelligence.

Speaker 2:

Where do you think we are in the market cycle? Do you think it's, do you think it's 1999, February 2000? Does that is that even worth comping to, or do you comp to something else?

Speaker 3:

In AI, it feels early. Doesn't feel like we're at the end. It feels like we're, like, the second or third inning, but who the hell knows? But at a high level, I think that it's I think we have to be very careful. When you think about the future of humanity and whether AI is gonna benefit the future of humanity, I don't know.

Speaker 3:

I would, if I was running a dictatorship and someone made me king and you told me, should we move forward with this technology? It's not clear to me whether moving forward with AI is gonna benefit everyone. It's definitely gonna benefit the few that are invested in the industry, but there's gonna be a lot of people that are not gonna benefit from artificial intelligence. So

Speaker 2:

I don't know I mean, ChatGPT ChatGPT is depending on paid users right now. They can't you know, if they killed off humanity, that'd be kinda bad for business. Isn't there a way to solve that alignment issue?

Speaker 3:

You know, we've if you look at the tractor trailer, The US in the eighteen hundreds, and the tractor trailer I think it was, like, 80% of the population was working on the farm. And then all of a sudden, tractor trailer comes along. And a few centuries later, it's like two to 3% of the population is working in farming. And you would have said, how the hell can The US economy adapt to something that's so disruptive? And we did.

Speaker 3:

But when it comes to artificial intelligence, I feel like it's different. Maybe I'm biased from some of these sci fi movies, but there is gonna be a lot of wealth inequality that's created. I don't I don't like it. I mean, there's there's there's opportunities to be had. There's no question about it.

Speaker 3:

But is it better for humanity in aggregate, AI? What do you think?

Speaker 2:

Well, I yeah. I I think for me, I think it's very straightforward to imagine the dark sci fi timeline, but it feels farther away. I mean, at least in our corner of the Internet, people have been reacting to Karpathy's interview with Dwarkesh that was I think it went live Thursday night. And it feels like the debate right now is is is AI frontier lab progress slowing down? Is it just auto complete?

Speaker 2:

Or and if it's auto complete and we don't have and if, basically, if the rate of progress is slowing down, is there massive overinvestment right now? And what I'm hearing from you is simultaneously, generally kind of the the doomer point of view, which I think is fair, but at the same time, it doesn't sound like, if you were running a a hyperscaler, you'd be ramping up CapEx right now.

Speaker 3:

Are we smart enough as a society to understand what the benefits are and what are the downsides? And everyone has perverse incentives. So someone who's in the AI industry isn't gonna tell us how AI is the best thing since sliced bread. But in general, as a society, if you think across, like, human evolution over centuries, do I wanna take this emerging technology, and is this gonna benefit the human population in aggregate over the long term? It's TBD.

Speaker 2:

Do you the

Speaker 3:

has What what happen what happens when AI becomes smarter than us?

Speaker 1:

Yeah. It's pretty crazy. Do you think the solution's government intervention, just good stewardship by the leaders of the foundation model labs? Like, who actually who actually has the responsibility of stewarding the new technology most effectively?

Speaker 3:

It's governments, and it's ones who are who have a long tenure. If you look at America, you know, the presidential cycle is four years. So I don't know if that's necessarily long term incentives.

Speaker 1:

Yeah.

Speaker 3:

But who cares about where humanity is gonna be, not in four years from now, not when they check out, but in a hundred years from now, who's got a long term focus on this emerging technology, and who cares about humanity over centuries. But I we have to be very, very careful about this technology.

Speaker 2:

What what was your view on social media a decade ago? It was quite popular for a period to say that social media was destroying humanity, and maybe it is, maybe it isn't. We seem to have found a way through. But do you is your view that that AI is as bad as people once maybe thought social media would be? Like, what what specifically when you think about you know, one one of the things we laugh about internally is, just how easy it is to clock when somebody uses AI to generate like a cover letter or job application.

Speaker 2:

Right? It's like a really good way to just get your cover letter application ignored is just to generate it with ChatGPT. It's just beyond obvious and I'm sure your your team has seen a lot of this too. But I'm curious, like, what do you think the before we get to the, you know, sci fi doomer scenario where the computers rise up and destroy us all, what what what's kind of, like, the immediate impact, that that, you're worried about?

Speaker 3:

Social media is one of the worst things to happen to humanity. If you look at Instagram, people are so easily manipulated. They filter videos. They filter pictures. You know, you look at these young people, their expectations, their lack of work ethic.

Speaker 3:

China has censored all of this stuff for good reason because it's so easy to manipulate the layman. And so, in general, when I look at social media, I say, well, has it benefited humanity, or has it been toxic? It's so there there's no question that social media is is toxic. AI clearly will increase productivity, but at what cost? At what cost?

Speaker 3:

I think we have to be very, very careful.

Speaker 2:

When when public company CEOs talk about how much efficiency they're getting out of AI, do you think that they're actually getting efficiency out of AI, or do you think they're just pushing their teams harder to be more efficient and they wanna blame blame the impact on on AI?

Speaker 3:

It's both. It's both. But AI, without question, increases productivity. But, you know, again, it's short term versus long term. So if you're running Johnson and Johnson or pro or Procter and Gamble, and you can use artificial intelligence, and you could reduce your cost structure because you've got all these humans that are doing these mundane tasks, and all of a sudden, you realize the computers are gonna do it better.

Speaker 3:

Well, duh, you're gonna reduce your cost structure. But all of a sudden, when you when you have all of these people that are unemployed and the demand for your product goes away, what's better? You wanna keep people employed and have some kind of self worth and working hard and making money, or do you wanna replace them and give them universal basic income? And then what does that mean? Does your consumption and demand for your product in aggregate go up or go down as a result of artificial intelligence?

Speaker 3:

I don't know. But I one thing I know for certain is that the CEOs, if they can reduce their cost structure in the short term, they're gonna replace they're gonna take computers over humans. But is that better for humanity over the long term? You tell me. Everyone on UBI over the long term?

Speaker 3:

How does that make them feel? People need a purpose.

Speaker 2:

You don't think we'll create new jobs? I mean, we created email jobs. A lot of them could go away, and the world wouldn't be too much different.

Speaker 3:

Artificial intelligence feels different.

Speaker 1:

Yeah. I just wonder if

Speaker 3:

we are

Speaker 1:

on other things.

Speaker 3:

It feels different.

Speaker 1:

We could still have we could still have hierarchies and competitions on things that only humans can do. I mean, we already do this with sports and all sorts of things. There there there's probably still some sort of, like, reproductive battle to try and get to the top of the stack. Even if you don't need to go and work to make money, there are still other things that you do with your time to raise your status in society. But I don't know.

Speaker 1:

It is it is a bizarre future to think about. I just wonder if it's five years, ten years, fifty years away.

Speaker 3:

We're in an era of instant gratification. That's the American system is that now now now now now.

Speaker 1:

Mhmm.

Speaker 3:

Innovation, making money today. But when you think about ten, twenty, thirty years from now and artificial intelligence, what what does the world look like? It's it's not about job displacement. It's not about control. But is artificial intelligence gonna control us, or will we control artificial intelligence?

Speaker 2:

And you would argue that social media already controls how we feel day to day. We open our phone and decide Yeah. Our mood based on what's happening in faraway places that we have nothing to do with?

Speaker 3:

Social media is a big problem. What's what's a big problem? It's so e if you look at the divisiveness Yeah. In this country, social media, whether you're a conservative you you go on Instagram and you're you're conservative, and then all of a sudden you get, you know, these algorithms serve you all kinds of things that are gonna make you mad. And if you're a liberal, you all of a sudden go on social media, and you see all kinds of media that's gonna make you mad.

Speaker 3:

Why is it that ultimately humanity like, it it comes down to human why do we have to lose something in order to appreciate it? Well, it feels like the only neutralizer is death and war. Doesn't have to be that way. But it feels like the only way we can ultimately appreciate something is if we we actually lose it. It's it's very, very, very sad, but that's what it comes down to.

Speaker 3:

There is so much divisiveness in America. If we can just all come together and figure out things that we both agree on, but instead, we figure out the reasons why we're gonna be divided and the politicians divide us. And

Speaker 2:

I think a lot of people have agreed a lot of people on social media have agreed that they like you a lot. So there's there's there's there's one white pill in there.

Speaker 1:

Yeah. There's some

Speaker 3:

Until they don't. Until they don't.

Speaker 1:

Well, hopefully, it doesn't

Speaker 3:

Do they like me over the long term? I don't know. We'll find out.

Speaker 1:

Yeah.

Speaker 5:

We

Speaker 3:

true true leadership is not about dividing people. It's figuring out how do we bring people together over the long term to benefit humanity.

Speaker 1:

Yeah. How do you apply that that that sort of thinking to video games? Because there was a lot of fear mongering about violent video games causing kids to become violent. The government did step in and regulate video games with the ESRB. Every game is given a rating, and young kids, you know, they they they can figure out a way to get access to some violent video games, but it's you know, parents are more in control now.

Speaker 1:

And I feel like we more or less got the good outcome, and and people can enjoy video games responsibly. And, of course, there's some negative scenarios. But in general, I feel like video games have been just like a cool cool medium for artists to tell stories. There's wide variety of experiences. It feels like we as humanity got through that test.

Speaker 1:

And whereas maybe we're still in the middle of the of the fight for positive social media and maybe just at the beginning of the fight for positive AI outcomes, what lessons should we take from, like, how humanity, dealt with video games?

Speaker 3:

I think the Chinese have restricted their children from playing video games. I let my kids play video games. Doesn't ban unless they're playing, like, Mario Kart.

Speaker 1:

Yeah.

Speaker 3:

But you look at these, like, Call of Duty, they're killing each other. I want positive influences. I want things that are healthy. I want things where people are gonna learn. Mhmm.

Speaker 3:

Blowing someone's head off, exposing it to young people, governments you know, you look at America. It's the land of the free. It's great. It works well for immigrants because they come from places where it's shit. And they come to America.

Speaker 3:

They have all this freedom and their gratitude. They're they're grateful, but then you look at people where they don't necessarily have that level of gratitude. And they come to America. You give them all this freedom, and they destroy themselves. So do I want my kids going and playing Call of Duty blowing each other's heads off?

Speaker 3:

And then you look at the Chinese, and they're restricting the ability to play video games, and you say, well, they're censoring versus we're free? Well, what's better for humanity? Having boundaries and having rules or just letting people do whatever the hell they want, destroy their lives?

Speaker 2:

You ever thought about getting have you ever thought about getting into politics?

Speaker 3:

I was born in Canada.

Speaker 1:

There are some positions you could still run for.

Speaker 3:

Which ones? Senator?

Speaker 1:

Yeah. I think you'd be I don't know if you could be senator, but you could be maybe mayor. Right? Isn't the mayor of New York not born in America or going to

Speaker 3:

There there was this saying my, my herd.

Speaker 1:

You could be city councilman, maybe.

Speaker 3:

Yeah. Because they'd kill me.

Speaker 1:

Well, you could go

Speaker 5:

to Canada.

Speaker 1:

You could be prime minister of Canada.

Speaker 3:

Too honest. I'm too honest.

Speaker 1:

Too too honest.

Speaker 3:

To be full of shit.

Speaker 1:

I'd have

Speaker 3:

to be sure they couldn't handle me. They couldn't handle me. Politicians, they're like diapers. They start to stink. They get they start to stink very quickly.

Speaker 3:

I couldn't play the bullshit.

Speaker 1:

No.

Speaker 3:

So you you wanna tell people what they wanna hear. Yeah. And I'm not playing that game of what they wanna hear. But when it comes to artificial intelligence and social media in general, is it beneficial to society and humanity as a whole? I don't like 'em.

Speaker 3:

I don't like 'em.

Speaker 1:

That's extremely

Speaker 2:

honest. Doomer confirmed. Yeah. That's fair.

Speaker 1:

Well, thank you so much. This has been a really great interview. Thanks so much for calling.

Speaker 2:

Yeah. Anything else anything else that we miss?

Speaker 1:

You know, we'd love to anything else that you're working on that we didn't touch on, we'd love to talk about.

Speaker 3:

You guys got anything else?

Speaker 1:

I think we're good.

Speaker 2:

Last question. What what's your what's your relationship like with Roaring Kitty? You guys talk much?

Speaker 3:

Ask him.

Speaker 1:

Okay. Yeah. We'd love to have him on the show. Maybe we'll maybe we'll Get

Speaker 3:

him on the show.

Speaker 1:

That'd be a lot of fun. He's Yeah. You know, we're we're we're pretty new to the live streaming thing. I always enjoy talking to people who operate in the same medium. And so Yeah.

Speaker 1:

We don't conversation.

Speaker 3:

Yeah. We don't want to be traders. We want Yeah. I'll have a conversation with him if he's focused on decades and centuries, not on making a buck.

Speaker 1:

Yeah. That makes a ton of sense.

Speaker 2:

I like it.

Speaker 1:

Well, yeah. Well, we're rooting for you for the next decade, for the next century, for the next millennia. Thank you so much for coming on the show. This was fantastic.

Speaker 2:

Yeah. Great chatting, Ryan. Cheers.

Speaker 3:

Oh, yeah. Bye.

Speaker 2:

Fun.

Speaker 1:

What a wild day on TVPN. Thank you if you're tuning in for the first time. AWS has an outage, so there are pieces of the show that aren't working. We're cobbling stuff together, but we appreciate you checking out our show. Every day, we go live at 11AM Pacific.

Speaker 1:

We talk about technology and business. We interview CEOs just like Ryan Cohen, the CEO of GameStop. We interview folks in the private markets, the public markets. We've interviewed Mark Zuckerberg, Brian Armstrong. We have Brian Chesky on the show tomorrow from Airbnb.

Speaker 1:

We have Palmer Luckey on the show tomorrow from Anduril. And we go all over the place. We talk to early stage companies, public companies, everyone in between. But we always love focusing on technology and business. Don't do a lot of politics here.

Speaker 1:

There are plenty of other shows for that.

Speaker 2:

We don't do a lot of AI doomerism here though, but there's a place for

Speaker 1:

it. There's a place for that too. It's important. So we'd love for you to subscribe to the channel. You can follow us on Spotify.

Speaker 1:

This show is released as a podcast as well. We have a twenty minute version. We also have a newsletter, tbpn.com. You can go and subscribe and get a daily update from us on what's happening in the world of technology and business news. We have another guest coming in to the studio, a live guest.

Speaker 1:

But first, we need to tell you about fal, the generative media platform for developers, the world's best generative image, video, and audio models all in one place. Develop and fine tune models with serverless GPUs and on demand clusters. And are we ready for our next guest? I think we have Zach

Speaker 2:

Let's do it. Zach Kukoff. Live from the TVPN UltraDome. Coming in well.

Speaker 2:

The government is turned off.

Speaker 1:

About Turbo Puffer. Search every byte. Serverless vector search and full text search built from first principles on object storage. Fast, 10 x cheaper, and extremely scalable. Welcome to the show, Zach.

Speaker 1:

Thanks so much.

Speaker 5:

To Live and in person.

Speaker 1:

Hello. We have you here.

Speaker 2:

What's happening?

Speaker 1:

The government shut down, but you were still able to attend. What? Yeah. Maybe, like, set the table for us. What does the government shutdown mean?

Speaker 1:

This this feels like this happens all the time. I remember a decade ago being in college at some pool party and someone saying, oh, the government's shutting down, and and nothing happened then. Is something gonna happen now? Is this even a story that people care about? Like, what's going on?

Speaker 5:

Nobody cares. This is the big problem. Nobody cares. It's taking forever for them to do anything. And this is there's a couple of reasons why.

Speaker 5:

Okay. One, government shuts down any time they can't pass a budget. Yeah. We don't pass budgets all the time.

Speaker 1:

Yeah. And that's why we

Speaker 3:

do a lot

Speaker 1:

of those continuing resolutions. That's exactly right.

Speaker 5:

We gave up on budgets. We shifted to continuing resolutions, which basically say copy paste whatever the last budget was, plus minus x percent on this thing, plus

Speaker 1:

minus. So you can sneak stuff into the CRs.

Speaker 5:

Oh, you can definitely sneak stuff in, people do that all the time.

Speaker 1:

And that's why lobbyists make money, and that's

Speaker 5:

I would I would know. That's why other that's why other lobbyists do bad, shady things. We only do good, positive things.

Speaker 1:

Yeah. Yeah. But but then also that that that's why we've seen a lot of founders, visiting DC.

Speaker 5:

That's right.

Speaker 1:

They because if they're gonna get money for their program, they need it to be provisioned. At some point, it's gonna be provisioned in a new budget, hopefully, but if not, a CR.

Speaker 5:

That's right. There's basically think about it this way. Twenty years ago, Congress used to pass all of these small little bills throughout the year, thirty, fifty bills throughout the year. Each one had money attached to them. Interesting.

Speaker 5:

They were gonna give money to this, money to that, whatever piecemeal. Piecemeal. Yeah. Exactly right. And there was an overall budget that governed things.

Speaker 5:

Mhmm. The congress all the time would pass laws, bills, whatever.

Speaker 1:

Yeah.

Speaker 5:

Now we pass two, maybe three bills every year.

Speaker 1:

Yeah.

Speaker 5:

These are called omnibus bills. Okay. Right? CRs are one of them.

Speaker 1:

Okay.

Speaker 5:

NDAA, which authorizes the military, is another one, the Defense Authorization Act. Right? The big problem we have right now is there is a big tension between the Democrats who do not want to pass a clean CR for a variety of reasons. It's not pejorative, by the way. It's just descriptive.

Speaker 1:

Sure.

Speaker 5:

And Republicans who feel it's in their political best interest to pass a clean CR today. Mhmm. If you're Chuck Schumer and you're sitting there remember, in March, Chuck Schumer worked with the Senate GOP

Speaker 4:

Mhmm.

Speaker 5:

To get a budget passed. Mhmm. If you're Chuck and you're sitting there in March, you have a problem now with your left flank.

Speaker 1:

Yeah.

Speaker 5:

Yeah. Who are looking at you and saying, why are you giving in? Why are you rolling over for the Trump ad ban?

Speaker 1:

Yeah.

Speaker 5:

Yeah. If you're in the GOP, you don't wanna have a Christmas tree bill where you have this ornament attached for this amount of money and this ornament attached for that amount of money. Sure. Sure. And everybody gets their favorite thing.

Speaker 5:

All you want is to say, keep it going exactly how it is. Let's put off any of the bigger discussions for a later time.

Speaker 1:

Yeah. I feel like it just is like a fan of American democracy. I like the Yeah. Big fan. I I after democracy.

Speaker 1:

Yeah. Let's give it up for American democracy.

Speaker 5:

Big ups for democracy.

Speaker 1:

But but I feel like I feel like I like the the the piecemeal Yeah.

Speaker 5:

Of course.

Speaker 1:

Piecewise bills. Like, I like the idea like, we're gonna go to the moon. Here's the moon bill. Yes. And it's like that's and we all agreed to that.

Speaker 1:

We all got fired up. Yeah. And everyone kinda got excited and we've That's right. And we voted for that and it kinda happened. Like the classic of, we wanna build a bridge, so we pass a bill to build that bridge.

Speaker 5:

Like, That's right.

Speaker 1:

That's kind of what I was taught in, like, grade school.

Speaker 5:

That's what you imagine when you go to civics class Yes. In grade school. You're thinking to yourself, oh, I want did you ever watch Schoolhouse Rock? Yes. Yes.

Speaker 1:

I'll jump to

Speaker 5:

a bill. Yeah. I'm just a bill. Here on Capitol Hill. Oh, yeah.

Speaker 5:

Thinking to yourself

Speaker 2:

how I want it to be when there when there's, like, a systemic failure in the financial ecosystem. I think it should be one where everybody gets a vote.

Speaker 1:

Oh, you want direct democracy? Direct. Direct democracy. California. Yeah.

Speaker 1:

It's worth checking.

Speaker 2:

Should we bail out the banks?

Speaker 1:

No. No. Should we just let weird thing because I I I feel like on on one level Yeah. As a fan of American democracy, I I I do want the direct democracy. I do want the piecemeal.

Speaker 1:

Sure. But then at the same time, I live in California Yep. And I've had the direct democracy thing. That's terrible. And I still don't have a train that goes It's horrible.

Speaker 1:

LA to

Speaker 5:

And there's these weird, like, feudal landowners now who pass down their homes in California with no property tax increase.

Speaker 1:

All these odd so it feels like maybe both sides have their own warts and edges.

Speaker 5:

You want, like I don't know. Far be it from me to say what the optimal setup is. The reason I like the older setup is every one of those small bills gets airtime, gets debate, gets discussion. And then ultimately, at some point, you step back and you say, look, I don't want to spend, and most people don't want to spend all morning waking up thinking about politics. I mean, your audience doesn't.

Speaker 5:

Right? I come on here enough as it is. Yeah. They don't want to wake up and think about that a bunch. Yeah.

Speaker 5:

So they elect somebody to think about it for them, but they still discuss at least what all the things are that happened. Yeah. The omnibus bills instead, they're so slammed every time. Mhmm. You're just jamming a 100 things through.

Speaker 1:

Yeah.

Speaker 5:

So if you're a Dem look, Schumer's stated concern

Speaker 1:

Yeah.

Speaker 5:

Is that there were a bunch of Obamacare subsidies, ACA subsidies that were built in during COVID.

Speaker 1:

Yeah.

Speaker 5:

That were all set to expire this year.

Speaker 2:

Yeah.

Speaker 5:

They were meant to be temporary, but as we know, oftentimes, you pass something in government, it stays around forever, and it goes on for rest of time.

Speaker 1:

Yeah.

Speaker 5:

Dems wanna codify this in law. They want it to keep going. Republicans don't wanna take up any major topic. Clean CR, nothing attached

Speaker 1:

to it. Mhmm.

Speaker 5:

That's the crux of this. And Schumer is winning points with his left flank, right, the AOCs of the world, who's not a senator, but is considering primarying Schumer. He's winning points with his left flank because he doesn't now have to give in or be perceived as giving in Mhmm. To a hostile admin. And the admin thinks they're winning because they look at this and they go, great.

Speaker 5:

Now I'm gonna RIF people. I'm gonna lay off. I'm gonna shrink the government.

Speaker 1:

Sure.

Speaker 5:

All of the things I want to do already Yeah. This I mean, a judge blocked it, but in theory, allows me to do.

Speaker 2:

Since this is a technology and business show Sure. What kind of groups are currently impacted? I talked to a defense tech founder Yep. On Friday, and he was saying, like, it's obviously massively disruptive because you're trying to get contracts done and nothing can really happen for you, so basically, like, add sixty days to whatever timeline you thought. And that prevents new hiring because they can't hire against specific contracts, etcetera.

Speaker 2:

But what are the kind of immediate impacts that you're seeing or hearing about in the private markets?

Speaker 5:

I mean, the biggest thing is this. We work with a lot of founders, and I have a lot of founders who are clients who are here on O-1 visas. And we think the O-1 visa is phenomenal. Mhmm. If you wanna get an O-1 visa today, your processing time is so much longer than it ever was before.

Speaker 5:

Even though, by the way, consular offices aren't shut down, because they're fee based. They have their own revenue stream.

Speaker 1:

Oh, that makes sense. Yeah. You pay the O-1 fee

Speaker 5:

That's exactly for

Speaker 1:

the funding, so they don't need funding from the government

Speaker 5:

or the taxpayer. Anything that either has multi year funding Yep. Or has its own independent set of revenue is able to stay open. Still in business. It's only the things that require year over year funding from the government that close.

Speaker 5:

And by the way, there's some nuance on that because the admin has kept stuff open. For example, normally, the army doesn't get paid during government shutdown. The admin said, look, we're gonna take the $6,000,000,000 left over from r and d spend in defense. So this is impacting the defense folks. Right?

Speaker 5:

Any r and d

Speaker 2:

that would have potentially gone to That's right. Start

Speaker 1:

up That's right.

Speaker 5:

It's not SBIR so much, it's more sciency. But yeah.

Speaker 1:

Exactly. Spending. Exactly. Go to start ups.

Speaker 5:

Yes. That could go to startups that do R and D stuff. And instead, we're gonna use that money that was already allocated but hasn't yet been disbursed Sure. To pay the military.

Speaker 1:

So it's little

Speaker 5:

bit of robbing Peter's

Speaker 1:

the the couch cushions.

Speaker 5:

Yeah. That's exactly right. And the problem you run into is, look, they can do that now, but eventually, you run out of that money too.

Speaker 1:

Yeah.

Speaker 5:

And so at some point, somebody's gonna have to not get paid. So if you're an O-1 visa holder, you're in a tough spot because you now have a much longer lag time, much longer processing time to get into this country. Even if you were already here and just happened to be gone and coming back for your renewal. That's tough.

Speaker 1:

Yep. Yeah. That makes sense. So, like, how long do we expect this to last? Like, where should we be, watching for, like, updates?

Speaker 1:

Is everyone kind of pricing in, oh, six weeks is standard? But then it's the question of, like, is it six or twelve? Or is it really wide?

Speaker 5:

Who knows? The real honest answer is the longest partial shutdown in history Yeah. Was under Trump one

Speaker 1:

Yeah.

Speaker 5:

And it was thirty five days. Okay. We're twenty days today.

Speaker 1:

Okay. Yeah.

Speaker 5:

So So that's the, like, tail end risk. The problem is, the difference between then and now is a couple things. Yeah. One, it's not a big press issue like we talked about. Yeah.

Speaker 5:

Like, I can imagine I'm watching people tune out of this interview when they're hearing about the government shutdown. People don't care.

Speaker 1:

No. No.

Speaker 5:

And by the way, both sides think they're winning. The admin says, look. There's great stuff happening. We used to have peace in The Middle East, plus or minus a little bit.

Speaker 1:

Sure.

Speaker 5:

Right? Dems are saying, we just had huge protests. We galvanized the base. Oh, yeah. No one's motivated to get this close.

Speaker 5:

Couple leading indicators you wanna look for.

Speaker 1:

That's it.

Speaker 5:

One, military payday, like I said. If that happens and they don't have another way of getting dollars attached to it, that's gonna be a problem. Two, majority leader John Thune, who is the senator from South Dakota

Speaker 1:

Yeah.

Speaker 5:

In the senate Republican guy. All of the farm state senators are looking at the expiration of loan programs that help keep farmers solvent every year.

Speaker 1:

Mhmm.

Speaker 5:

If you're a farm state senator, this matters to you a great, great deal. Mhmm. So the admin is looking today to figure out where can I get my dollars to go and continue to fund these things? But in the absence of funding this, it's really, really hard for a shutdown to continue without political pressure.

Speaker 1:

Mhmm.

Speaker 5:

Third thing you wanna look at, and this is the big one, is political consequences. Because the truth is, as much as Congress today has said, fine, I'm happy to abdicate my role and let the White House run the shutdown response, in theory, Congress controls the power of the purse. They are able to, at any time Mhmm. Right, come together and do something. If the Dems lose the governor's race in Virginia, which is disproportionately affected by the shutdowns, there's so many federal employees who live in Northern Virginia

Speaker 2:

Mhmm.

Speaker 5:

Or if a year from now, which I don't expect we'll still be shut down, but if in some number of months, you start to see negative impact as we lead up to the midterms, senators are gonna wake up and say, gosh. My political future right now doesn't have to be tied to the outcome of this shutdown. I can be an independent actor. I don't need to rely on the White House or anyone else. The Dems can say this too.

Speaker 5:

Don't need to rely on Chuck Schumer Mhmm. To guide where my vote goes. That's the real one when rubber meets the road.

Speaker 1:

Is the government shutdown going to increase the risk of thieves stealing the Constitution, robbing the Smithsonian, snatching jewels, the book of secrets in the Library of Congress? We saw the Louvre robbery. Do you

Speaker 5:

saw this story? And they, like, left it in the gutter,

Speaker 1:

I heard. Well, they well, they dropped a crown. Yeah. They did get away with a lot of jewels. Yeah.

Speaker 1:

And I and I hope that the folks at the Smithsonian who are protecting our moon rocks are not getting furloughed because we gotta protect our crowns. Be the

Speaker 5:

worst National Treasure sequel.

Speaker 1:

It would be. By the way, it would be.

Speaker 2:

What's the current dialogue around AI regulation and policy in Washington? And then what are you seeing across the

Speaker 4:

state level?

Speaker 1:

How long does it take to go from Karpathy on Dwarkesh to Capitol Hill? Are they watching, or are they listening to shows that are talking about that? Or They're watching. Talking about the next narrative? How many links in

Speaker 2:

the Like, wait. It's autocomplete? Yeah. Always has been.

Speaker 5:

It's always There are a swath of think tanks that do a really good job of translating

Speaker 1:

Yep.

Speaker 5:

What happens in DC. Sorry. What happens in SF. Yeah. Exactly.

Speaker 5:

Yeah. Big computer. That's right. Everything is computer. You can build a really good think tank.

Speaker 5:

The guest who's supposed to be on before me, Dean Ball, who's an amazing guy, was the AI advisor at OSTP at the White House Office of Science and Tech Policy. He's done a really good job being that communicator. Bridging the gap. Exactly. And his think tank, FAI, which I'm involved with too Love that.

Speaker 5:

Does a really, really good job bridging the gap there. And by the way, things like the progress conference just happened at SF. There's a lot of things now institutionally that are built to translate this on like a relatively quick time scale. The truth is, look, if you're watching right now and you care about AI policy, the White House has an RFI request for information out today where they're asking people, founders in particular, to write in and say, gosh, what are the things that are impacting? What are the regulatory burdens that are impacting my ability to do AI business in America?

Speaker 5:

Yeah. So they want they want the answer, but it's hard to get ground truth because when you're in DC, it's little bit of a bubble. You hear a little bit

Speaker 1:

of an echo chamber. Sure. Is the core DC AI narrative just state by state regulation versus not, or are there actually higher level discussions around, okay, if a trillion dollars of CapEx is gonna happen, like, there might need to be some fundamental changes in the way we regulate data centers Yeah. Power.

Speaker 1:

Even just even even if you're just the ultimate AI bull, you might need to step in and say, we're gonna help speed this up. Because at a certain point, you're building 10 nuclear reactors, you just can't do that without the help of the federal government.

Speaker 5:

Right? Alright. So so part of the problem is we have a federalist system.

Speaker 1:

Yep.

Speaker 5:

And so power by necessity is distributed across state, federal, local. Yeah. Right? So if you're talking about It's government power, not Sorry.

Speaker 1:

Literal power.

Speaker 5:

Although, yeah, literal power is also distributed. Electricity. Yeah. Electricity is also distributed across these states.

Speaker 1:

Yeah. Yeah. Yeah. As you can see with Elon, he built his Colossus two data center right at the intersection of three different

Speaker 5:

That's right.

Speaker 1:

States because he wanted to be able to go over here for a little bit.

Speaker 5:

Yeah. Part of the problem is this. If you wanna build a new reactor or you wanna build an energy transmission line

Speaker 1:

Yep.

Speaker 5:

You need so many different people to buy in. Sure. And the last thirty years of American governance, basically at every level

Speaker 1:

Yep.

Speaker 5:

Has been built around keeping things the same. Status quo for people who already have done fairly well.

Speaker 1:

Mhmm.

Speaker 5:

The other thing I would think about, by the way

Speaker 2:

It's so funny because I don't read a lot about politics, but I have like such tangible experience in politics through trying to get things done with my HOA.

Speaker 5:

That's exactly right.

Speaker 2:

Literally like the boomers control the HOA board, and I I tried to suggest changes when I bought my house, and I basically got death threats.

Speaker 5:

So that sounds about right.

Speaker 2:

That's my framework for

Speaker 5:

California governance is the largest HOA in the country. Think of it that way. It is just completely biased against Alright. Here's the other problem.

Speaker 1:

Yeah.

Speaker 5:

You have a thousand people in Congress who all have their own political aspirations. Look at, like, a Marsha Blackburn. Marsha Blackburn wants to be governor of Tennessee. Mhmm. What's one of the biggest industries in Tennessee?

Speaker 5:

Nashville. Nashville music. Right? Yeah. So Nashville Hold on.

Speaker 5:

Wants say

Speaker 1:

it home of John Fios, home of drone. Yes.

Speaker 5:

Yes. Yes.

Speaker 1:

Yeah. I'm that's really swinging

Speaker 5:

the huge. Huge. Blackburn

Speaker 2:

we saw about Tennessee? Like, something about how Nashville specifically is pushing heavy on AI regulation?

Speaker 5:

So I think that's right. Exactly. They have the all the Music. No.

Speaker 1:

This is music. Okay. That's right. Yeah.

Speaker 2:

So Alright.

Speaker 5:

You guys know Suno, the AI Yeah. Audio app. Okay. They hate Suno. They hate Oh, they hate it.

Speaker 5:

It's like public enemy number one Okay. When you're in Nashville. Right? So Marsha Blackburn.

Speaker 2:

Isn't it an open carry state out there? Do not if you're if at

Speaker 1:

Lightspeed and you funded Suno, do not go to Tennessee.

Speaker 5:

Don't walk around Tennessee. I wouldn't personally. Nashville? Yeah. Marsha Blackburn wants a political future in Tennessee.

Speaker 5:

She wants to get out of the senate, which is a miserable place to be, and get into the executive chair in Tennessee.

Speaker 1:

Why is it so miserable?

Speaker 5:

Because nothing gets done. You know, you think the House works? No. This is real. I mean, look, it's frustrating if you actually are a motivated person.

Speaker 1:

So you say that the House works, but the Senate doesn't?

Speaker 5:

No. The opposite. The House works even less. Since, like, July 4, I think the House has worked something like twenty days in total. Wow.

Speaker 5:

Okay? That's insane. Like, think about if you guys Did they

Speaker 1:

not get the memo with the great lock in? No. They are

Speaker 5:

they are doomed. The permanent underclass. It's terrible. It's awful. This is the problem.

Speaker 5:

Alright. So Marsha Blackburn wants a political future in Tennessee.

Speaker 1:

Yeah. Yeah.

Speaker 5:

She says, okay. If I pass the ELVIS Act today or if I allow the ELVIS Act to

Speaker 2:

pass They should have called it the trough act. Don't put slop in my trough. Keep troughs organic.

Speaker 3:

I want

Speaker 2:

organic food for all my piggies.

Speaker 1:

Yeah. Make the trough great again.

Speaker 5:

Organic, only organic AI for me. Thank you very much.

Speaker 1:

Okay. So if wants If

Speaker 5:

she lets the ELVIS Act pass with no federal preemption, she wins herself a lot of friends

Speaker 1:

Sure. Tennessee.

Speaker 5:

Right? She'd be governor. Exactly.

Speaker 2:

Yeah. If you wanna I imagine having all the country music fans and the and big country on your side Yeah. For for governorship in Tennessee, that's gotta be pretty helpful.

Speaker 5:

It's capital and it's social capital too. It's a big outcome. Now think about that same dynamic writ large over and over and over. A thousand people Sure. All of whom want their own thing.

Speaker 5:

Look, the truth is at some point, there will have to be a federal standard that happens. Mhmm. It's gonna have to happen. And some of this stuff, Ted Cruz put out a bill around AI sandboxes, which is in the OSTP, the White House AI action plan, basically saying, look, we're gonna set up special economic experimentation zones. It's one way to think about them.

Speaker 5:

So people are tackling it from different respects. But unless you're thinking completely nationally and you wanna be governor or sorry, you wanna be president, if you're thinking about anything at the state level, you are right now not going to act federally when it comes to AI policy. That's

Speaker 1:

Wait. Wait. Really quickly. You said a thousand people. Help me break that down.

Speaker 1:

It's a it's a 100 senators, five thirty eight Yeah. House members, but where's the other,

Speaker 5:

like I mean, I was being a little hyperbolic. Okay. It's a gang. It was a phrase.

Speaker 5:

Gang of 500.

Speaker 1:

Okay.

Speaker 5:

Okay? Gang of 500 refers to everyone in the House, everyone in the Senate Yep. White House staffers, agency staffers.

Speaker 1:

Got it.

Speaker 5:

And then people like me, the lobbyists, the reporters,

Speaker 1:

the outsiders. You add all those folks and those people have Correct. They might wanna be the governor or something.

Speaker 5:

Sure. They might wanna have an aspiration or even they wanna do business with somebody. Sure. Sure. Sure.

Speaker 5:

Sure. Have some aspiration to do whatever.

Speaker 1:

There's a thousand live players.

Speaker 5:

That's right. The thousand live players, really 500 live players.

Speaker 1:

Sure.

Speaker 5:

And that's who determines quote unquote conventional wisdom in

Speaker 2:

Mercy. Okay.

Speaker 5:

Got it. So if you're, you know, Blake, who you guys had on from Boom Supersonic a while ago, whatever it is, the number one thing you wanted to do to convince people that you wanna have a speed sorry, a sound law and not a speed law is convince those 500 people over and over again to shift their perspective. Once you do that, conventional

Speaker 1:

wisdom works. It's so difficult. I mean, I love Blake, obviously, and I'm rooting for him.

Speaker 3:

But Yeah.

Speaker 1:

That seems like such a hard challenge in the face of, like, you can't just go and do the Supersonic Act of 2025 by itself, and everyone's like, yeah, this one makes sense Yeah. Because you have to puzzle piece it with 25 other things on the omnibus bill.

Speaker 5:

It's like when we were all in enterprise software, it's marketing. You know what I mean? Account based marketing, you have one person, you have all the influencers who sit around that person,

Speaker 1:

and your job is

Speaker 5:

to win over each influencer over and over again. It's the same thing here. You have one person, maybe five people, right, who can actually do it, the majority leader, the president, speaker, so on and so forth. You wanna flip over all the people, the think tanks, the reporters, the junior members, their delegation, all the people who have influence on them. If you do that, you win.

Speaker 1:

Yep. Sorry, Jordy. Cut

Speaker 2:

you off. Do you cover nuclear at all? Yeah. Is that because every time we see these, you know, one gigawatt, all these gigawatt data centers being announced Yeah. It just seems like nuclear is gonna have to play a huge part in that.

Speaker 2:

And yet, we need to relearn how to make reactors here in America.

Speaker 1:

Yeah. It seems like a lot of people are underwriting, like, oh, yeah, you know, it's gonna be a lot of We'll sign on this line, and then the money will come. Done. Wire the money.

Speaker 1:

And then, yeah, we'll just turn on a new reactor. Signed contracts and new active reactors are two very different things.

Speaker 5:

So easy to do.

Speaker 1:

Sorry,

Speaker 2:

Drew. Yeah. One one new reactor, please.

Speaker 1:

Please. Yeah. Yeah. Just just add that to the term.

Speaker 2:

Thank you very much.

Speaker 5:

I'll take one reactor and 50 transmission lines. Thank you very much. I'm all

Speaker 1:

done. You really can, like, wire a $100,000,000,000 in a day. Yes. But you cannot just wire a 100 gigawatts in a day.

Speaker 5:

This is part of the problem. So, full disclosure, we do lobby for a nuclear energy company. It's a great company. I won't name them, but phenomenal business. Look, if you look at the federal government

Speaker 2:

By 30, if you're not heavily conflicted, you're doing something wrong

Speaker 5:

in life. No conflict, no interest is the name of the game, unfortunately. This is the thing. Alright?

Speaker 5:

If you are looking at the federal government Mhmm. And you're looking at this federalist system and the challenges that are inherent to it, you're thinking, who's cutting across this? Credit to the admin. They are the first ones who I've heard of who at least thought about, hey, we ought to have somebody whose job it is to expedite these long term projects. It's Michael Grimes, right, who runs

Speaker 1:

Oh, yeah.

Speaker 5:

The US Investment Accelerator, former tech banker for many, many years, Cal Berkeley, Go Bears. Yeah. There you go.

Speaker 1:

Polytechnical. Yeah.

Speaker 5:

And, anyway, he is running basically what turns out to be the federal government's investment bank.

Speaker 1:

Sure.

Speaker 5:

He's running that across any time there's a system they want to accelerate in deployment of an investment Yep. That's who they can turn to. But even they're That's the problem. So nuclear is disproportionately affected by this.

Speaker 1:

Yeah. No. It makes a lot of sense. Do you have a take on, like Saagar Enjeti has been saying, like, oh, the tech industry doesn't know what's about to come. Yeah.

Speaker 1:

The narrative I mean, we see these things debunked on X, in the tech part of X, every day, where there will be a mainstream news headline about, you know, Sora uses 25 gallons of water every time. And it gets sort of debunked in some, like, research paper from Google.

Speaker 5:

Yep. Yep.

Speaker 1:

But that doesn't really make it back. No.

Speaker 5:

Of course not.

Speaker 1:

And so, you know, I think Saagar is, you know, identifying a potential wave of anti-tech sentiment across both sides of the aisle. What's the mood around that generally?

Speaker 5:

I mean, look. Unfortunately, it's great business and great politics to be anti tech. That's the problem.

Speaker 1:

Because it's a narrow community that

Speaker 5:

you can community. Right. It's disproportionately wealthy Yep. Disproportionately influential.

Speaker 1:

Yep.

Speaker 5:

Very easy to be a punching bag.

Speaker 1:

Very sloppy. Very, very sloppy. You know, it really is, like, very visual, you know. To just show a picture and be like, this is bad. Yeah.

Speaker 2:

And that's that's why I was I was shocked that OpenAI came out and announced that they'd be supporting erotica because that just feels like, you know, as as these debates come around with power and infrastructure, it's very easy to be like, you said you were trying to cure cancer Yep. Give free education, but like, clearly a lot of your users are using this for for adult entertainment. Yep. It it it takes you down a path that, it just puts a target on your back.

Speaker 5:

This is the big divide on the right. Right? If you're looking if you're on the right wing, there's two camps. One camp says, let the slop flow free. Right?

Speaker 5:

Tech companies, let them do their work. Whatever it is, great, let them do it all. Yep. And the goal is to be pro tech, generally. Right?

Speaker 5:

The other camp says, we're social conservatives. I don't want my kid getting porn from OpenAI. I don't want my six year old being exposed to this shit. Pardon the language. And so they want to reel it back in.

Speaker 5:

That's the big problem. Look, if you're OpenAI, culture is often downstream of politics. Trump won this election, huge wave of things that feel pro freedom of speech, and you can quibble with the definition and the boundaries all you like. Mhmm. But in theory, the messaging is around being pro free speech.

Speaker 1:

Mhmm.

Speaker 5:

And so a lot of what they're doing aligns to that, but I think there's a backlash brewing, and it cuts to, John, to your point, it cuts across left and right. Right? Josh Hawley and Elizabeth Warren Yeah. Have common cause on very few things, but one of them is tech.

Speaker 1:

I made a YouTube video about that, like Yeah. Years ago Yeah. About how it was they just came from completely different realms, but they were saying the exact same thing about tech.

Speaker 5:

That's right.

Speaker 1:

And and then that that was in the context of, you know, Google and monopolism and Yeah. And and, like, large corporations. But, you know, it just keeps ringing true again and again and again.

Speaker 5:

Because the truth is the neo-Brandeisians who didn't like tech, who were opposed to what in their view were monopolistic practices, a lot of those folks have recloaked themselves. Now they talk about tech's power in other ways. Right? And it's very sympathetic. If you watch these videos

Speaker 2:

What about Swiss watches? I

Speaker 1:

heard that too. I was wondering if you would call me.

Speaker 3:

Yeah. Can we do

Speaker 5:

a little wrist view? A little wrist zoom in, please? Yeah. I don't want to do that because you guys mock me every single time. I'm just a glutton for punishment.

Speaker 5:

The bigger thing is alright. So look, I get these videos on X from this group. You know this group, More Perfect Union? You guys see these videos?

Speaker 1:

That's awesome.

Speaker 5:

They're the big water sort of power pusher people.

Speaker 3:

For sure.

Speaker 5:

They make these very slick videos basically saying, if you allow a data center to open up in your neighborhood, you're gonna be in a drought, and your crops are gonna dry up, and God's gonna smite you with locusts, and so on and so forth. Like, horrible outcome. Right? And then, you know, people fighting the good fight. You know, IFP, Alec Stapp?

Speaker 5:

You guys know them? I know.

Speaker 4:

Yeah. Yeah.

Speaker 1:

Yeah. Okay. I IFP, Institute for Progress.

Speaker 5:

Right? Exactly. Institute for

Speaker 2:

Progress. Right?

Speaker 1:

I don't know if Alec has been on, but

Speaker 5:

Alright. So you have Alec or Caleb from IFP, Institute for Progress Yep. Who do God's work on Twitter. Yep. Every time that comes out, they're there posting saying it's not true and so

Speaker 1:

on. So a data center that runs on saltwater.

Speaker 5:

Yeah. There you go. Would be

Speaker 1:

really easy. That's an easy win. You could just figure Exactly.

Speaker 5:

Give me, like, sewer water in there. Yeah. You know, just disgusting gross shit. You're You're slapping that.

Speaker 2:

It would be such

Speaker 1:

a it would be such a good retort if somebody's like, you're using all this water. You're like, yeah, use seawater.

Speaker 5:

Yeah. Correct. There's an infinite amount. In fact, we're purifying it at the same time.

Speaker 2:

Because it's it's so No. It wouldn't work. It wouldn't it wouldn't be like a dolphin was gonna drink that.

Speaker 5:

A dolphin was gonna fish was gonna drink that. Or otherwise And and right off the

Speaker 2:

shore, there's an endangered species that that is going to go extinct because you took the water from them.

Speaker 1:

The sea turtles gonna get stuck in the NVIDIA rack and

Speaker 2:

it's gonna

Speaker 1:

be a

Speaker 2:

I I have a I have a pitch for you. Alright. Let's hear it. So we talk about like presidential libraries. Right?

Speaker 2:

Could we be moving into a future where each president gets one SPAC during

Speaker 5:

Each president gets one free shot of insider trading and that's all good from there.

Speaker 2:

Okay. Good enough. No. But what what's going on with

Speaker 1:

Obama's presidential library. No. No. No. No.

Speaker 1:

Clearly not built by humans.

Speaker 5:

Oh my

Speaker 1:

god. One, the architecture. Two, an energy source. But three, because of the AI narrative, this could be what sets up Michelle for her run, because energy is gonna be so expensive. The Obama presidential library, if that's what it is, built by aliens, a power source, lowers the rate of energy for everyday Chicagoans.

Speaker 5:

It's so true.

Speaker 1:

Huge, huge So true. That that that's a launch launch You guy that's

Speaker 2:

gotta get the get the tin

Speaker 5:

foil hat for you. But

Speaker 2:

no. What I just saw something in the We didn't even get to it in the show today. I just saw that there's a Cayman Islands entity that's gonna be Yes. Participating in this SPAC. What Yeah.

Speaker 2:

What's the do you know what the story is there?

Speaker 5:

I don't have the full story.

Speaker 1:

And do

Speaker 2:

and what do you think What's your sort of nonpolitical view on, you know, we will undoubtedly have another Democratic Sure. Democrat, you know, president. Does this set a new norm where

Speaker 1:

Wasn't Gavin Newsom saying like, oh, maybe we're gonna do a coin? We'll do our own coin. Like, we'll fire back with our version

Speaker 5:

Peaked way too early. Oh, yeah. Like, you don't wanna be peaking this early.

Speaker 1:

Sure. Sure.

Speaker 5:

Sure. Like a delayed peak a little bit. Yeah. You wanna have some time.

Speaker 1:

You know, you're way ahead. I do wonder, you know, people are obviously latching on to all the Trump projects, whether it's the coin or Truth Social or this new SPAC or a variety of projects. But I wonder, if we look at the number of companies he builds per decade, if he's actually at a low period in his career. Because he was originally doing, like, Trump Steaks, Trump University, Trump he had a vodka.

Speaker 5:

He had the vodka. He had the ties. Had a casino. Yeah.

Speaker 1:

And so That's right. He he used to be in 10 different industries. Maybe he's actually more focused than ever.

Speaker 5:

That's right. If you really think about it, in some ways, this is the double down lock in period

Speaker 1:

Yeah. For Trump as a business person. He's as a business person. Yes.

Speaker 5:

Yes. He's avoiding the underclasses.

Speaker 1:

Yes. But, you know, we gotta give credit to Jimmy Carter who was the most locked in president. He had a peanut farm. He sold it.

Speaker 2:

No, he put it in a blind trust.

Speaker 1:

Oh, he did? You know, you looked it up. Out of the game.

Speaker 5:

I'm locked in. This is my thing with the Dems. Right? Yeah. Like, if you're looking at the next four years and you're thinking to yourself, okay.

Speaker 5:

Public opinion is often thermostatic. It's gonna swing back at some point.

Speaker 1:

Oh, that's a good phrase for that. I

Speaker 5:

like that. And this is the problem, which is like, look, both parties, when they're in power, they love to overreach.

Speaker 1:

Yeah. Yeah.

Speaker 5:

Right? And so if you're doing if you think to yourself, I have four to eight years to execute every goal I've ever had in my life before public opinion is thermostatic again.

Speaker 1:

Sure.

Speaker 5:

It's no wonder people call it the MAGA-fication of the Dem Party. Sure. Right? You look at them and you're like, gosh, AOC, Zohran Yeah. Being stars in this party is so unusual relative to even where Clinton was in 2008.

Speaker 5:

Hillary Clinton was the designate, let alone 2016.

Speaker 1:

Yep.

Speaker 5:

So if you're trying to, you know, put your finger up to see where the wind is blowing Mhmm. You think to yourself, if a Dem benefits from massive tailwinds coming out of this admin, and they elect or they nominate somebody who is a MAGA-fied version of the Dems, right, an AOC or a Zohran-ish part of the wing, that makes me concerned, right? Because then you just see the pendulum swinging back and forth from extreme to extreme

Speaker 1:

Sure, sure.

Speaker 5:

For the next foreseeable future. And there are many, many times

Speaker 2:

Are you saying that America could be more divided than ever?

Speaker 5:

Yeah. America more divided than ever? First I'm hearing of this. Yeah. And this has been true historically, many times.

Speaker 1:

Yeah. Yeah. What else are you monitoring in DC right now? What didn't we cover? What's kind of, like, an under the radar story that maybe people aren't focusing on yet but should be?

Speaker 1:

Is there anything that's, like, at the bottom of your list that maybe is bubbling up?

Speaker 5:

Yeah. I mean, the big thing that I'm thinking about a lot is a lot of these questions around education. So you guys you guys see the higher ed compact that the Trump admin put out? I didn't. Alright.

Speaker 5:

This is a big thing. Trump admin's basically saying, look, we want a set of standards Mhmm. That universities will sign Okay. In order to be eligible to receive future funding.

Speaker 1:

Sure. Sure.

Speaker 5:

Sure. A lot of the things when you read them, by the way, they don't sound

Speaker 3:

You

Speaker 5:

read them, you're like, gosh, I thought universities always operated this way.

Speaker 1:

Yep.

Speaker 5:

Some of them, by the way, are impossible for universities to do.

Speaker 1:

Okay.

Speaker 5:

Some of them are super reasonable, there's a huge spectrum

Speaker 1:

of Sure. Sure. Sure.

Speaker 5:

Generally, I think a lot of people are sympathetic. I understand the sympathy for it. Yeah. The fear is, look, universities used to be I'm not saying they ever were this in practice.

Speaker 5:

Obviously, they've had a political bias all their own. I mean, academia has had that for a long time. But as an institution Yep. Universities used to be beyond politics. Right?

Speaker 5:

You didn't have somebody who ran Yeah.

Speaker 1:

You got tenure, then you can say whatever you wanted politically

Speaker 5:

because you were you were good. Everybody knew there were some whack job professors,

Speaker 1:

but it

Speaker 5:

was like your nutty uncle. It wasn't like the threatening evil communist.

Speaker 1:

Totally. Totally. It was like don't platform Yeah. Chomsky Yeah. That's exactly right.

Speaker 5:

Yeah. Nobody was looking at Chomsky saying this guy's, you know, deport him, send him back

Speaker 1:

to Poland or whatever. You know

Speaker 5:

what mean?

Speaker 3:

He's got

Speaker 1:

his theories.

Speaker 5:

So here's the problem. Now you look at this compact, you go, okay. Maybe you say to yourself, 70%, whatever is reasonable.

Speaker 2:

Sure.

Speaker 5:

What's gonna happen the next time there's a Dem in office who now says, great. Our compact for universities to get federal funding is going to say there has to be DEI.

Speaker 1:

Sure, sure.

Speaker 5:

Or there has to be

Speaker 1:

this or Yeah,

Speaker 5:

exactly. That's the thing I'm tracking for the future, which is, look, universities are pushing back. I understand why. I think it's very reasonable for them to do so. And I get both sides of it.

Speaker 5:

I see the value for the admin, too. I'm concerned about the precedent of it swinging back and forth over and over again.

Speaker 1:

Yeah. It's so interesting because, like, maybe it's just the Thiel influence, but I've sort of, like, stopped paying attention to universities

Speaker 5:

entirely Yeah.

Speaker 1:

Sure. Because every day on this show, we talk to some 16 year old that isn't planning to go to college and is already building a company. And then when we do talk to people about the education space, it's usually because they're building an alternative to homeschooling Yes. That's right. Or what Joe Liemandt's doing at Alpha School or Andrej Karpathy is doing with Eureka.

Speaker 1:

Like, there's so many different initiatives that just live entirely outside of the original

Speaker 5:

Yes.

Speaker 1:

The traditional education system. But it does feel like there's still pockets of the economy that are really tied to this education. We talked a Yeah. Few minutes ago about this in, like, basic science research.

Speaker 5:

Yes. That's

Speaker 1:

exactly right. I'm super blind to it because all of the AI research that was previously done at, you know, these AI labs on university campuses is now just completely paid for by big tech.

Speaker 5:

That's right. Right?

Speaker 1:

And so I'm not worried like, oh, we're not gonna get the next AI innovation because there's not enough funding or whatever. It's like, no. There's more funding than ever. Yeah. These PhDs are making a $100,000,000 now.

Speaker 5:

Well, my problem isn't AI. Right? AI, I'm with you

Speaker 1:

on. Yeah. Yeah. It's like, you can't apply that to everything.

Speaker 5:

Correct. You look at something like, you know, ocean oceanographic research.

Speaker 1:

Yeah. Totally.

Speaker 5:

And you have like one place, Woods Hole

Speaker 1:

Yeah.

Speaker 5:

In Massachusetts, which is, like, the best oceanographic research institute in the country. Yeah. There is no industry that is funding hundreds of millions of dollars for oceanographic PhDs Yeah. To go do weather research.

Speaker 1:

Yeah. We need, like, Saronic, Anduril Yeah. That's right. That's right. Amazon, and there's a few others that are, like, in that ocean tech space.

Speaker 1:

We need them to be worth trillions.

Speaker 5:

That's right. And the NSF and basic research and universities Yeah. Are hugely important for that stuff still.

Speaker 1:

Yep.

Speaker 5:

Yep. The thing that I get, concerned about

Speaker 1:

Yeah.

Speaker 5:

Is when they start to be the political football moved back and forth. And frankly, if you look at look, this is a very Nixonian admin in a lot of ways.

Speaker 1:

Like, if

Speaker 5:

you ask young Republicans in DC I love Nixon, I'll just put that out there. If you ask young Republicans in DC who they look up to, they will tell you it's Nixon staffers.

Speaker 1:

Okay.

Speaker 5:

They'll tell you to go read the Pat Buchanan books, and Pat Buchanan's got his own fair share of mishegoss. But if you read the books, which are well written, you'll see the big theme the Republicans have about Nixon

Speaker 1:

Yeah.

Speaker 5:

Is he apologized too much. Pat Buchanan will tell you, Nixon should have brazened out Watergate. Oh. Never should have apologized. Yeah.

Speaker 5:

Should have stuck it out. Should have broken the power of the Ford Foundation, the large institutional foundations. That's the Nixonian view.

Speaker 1:

Right? What's the view on Kissinger then?

Speaker 5:

I think these people are more domestic policy focused than Kissinger, but they're more America first than the Kissinger and

Speaker 1:

non because that seems like very different.

Speaker 5:

It's like

Speaker 1:

open up a relationship with China. Let's go over there and, like, do deals.

Speaker 5:

This is domestic policy.

Speaker 1:

Like, so we we we like we like Nixon, but just the

Speaker 5:

Yeah. Just the stuff we like. Just the stuff we like.

Speaker 1:

That's how people always remember everyone.

Speaker 5:

Right? And the truth is people like Buchanan become Jobs

Speaker 1:

was just a designer. He wasn't a ruthless manager. No.

Speaker 5:

I've never heard of man. A passionate fan of design. Exactly. That's That's

Speaker 1:

That's it.

Speaker 5:

If you're like a Pat Buchanan fan, you're looking at this moment and you're saying, gosh, we should break the backs of foundations or institutional life, you know, higher ed. Right? And that's where some of the desire is coming from, is this reading of history where you said, look, the Nixon admin didn't go far enough in defeating its enemies. Yep. And they allowed the academy, they allowed capture by x number of people.

Speaker 1:

Yep.

Speaker 5:

And that's why they're motivated now to go and try to do this even though I think it is not great for universities.

Speaker 1:

Yep. No. That makes a lot of sense. Jordy, anything else?

Speaker 2:

No. Always We can chat all day.

Speaker 5:

Thanks, Jeff. I mean,

Speaker 2:

if you wanna keep hanging out, I can tell you about Replacement AI. They're running a billboard right now that says, our AI does your daughter's homework, reads her bedtime stories, romances her, deepfakes her, don't worry. Totally legal.

Speaker 4:

Who needs parenting?

Speaker 2:

And then it and then on

Speaker 1:

the Well, if you wanna run a billboard, go to adquick.com, out of home advertising made easy and measurable. Say goodbye to the headaches of out of home advertising. Only AdQuick combines technology, out of home expertise, and data to enable efficient ad buying.

Speaker 2:

He's right.

Speaker 1:

Sorry. He's like, I'm not here.

Speaker 5:

Thanks so

Speaker 1:

much for coming on. Great to see you, Zak, as always. Take care, guys. But, yes, this billboard was bizarre.

Speaker 3:

Whole site

Speaker 2:

is wild. So if you go to their website, it says the only honest AI company, and they say human flourishing

Speaker 1:

bad business. Tyler? This is replacement.ai? Are they doomers or are they

Speaker 2:

pro AI? So they're massive doomers.

Speaker 1:

But are they pro AI?

Speaker 2:

They're they're positioning the they're positioning their their movement as a startup. So it's replacement.ai as a website. The title says humans no longer necessary. So we're getting rid of them. Replacement AI can do anything a human can do, but better, faster, and much much cheaper.

Speaker 4:

I I don't think they're doomers in the sense that they think AI is gonna like kill everyone now. Mhmm. It's just that like they're it's just gonna make everyone's life lives worse.

Speaker 2:

Yeah. Okay. So they highlight some quotes here. This quote from Sam Altman, AI will probably most likely lead to the end of the world, but in the meantime, there'll be great companies.

Speaker 1:

That's a

Speaker 2:

wild quote. They have one.

Speaker 1:

When did he say this? This feels like very taken out of context.

Speaker 2:

Yeah. That this that was in, like, twenties

Speaker 1:

They're probably asking him, like, what's the worst possible thing that could happen? And he's like, this is what the worst thing could happen.

Speaker 2:

I have a quote here from Dario. I think there's a 25% chance that things go really, really badly.

Speaker 1:

Very odd. Well, on the other side of the spectrum, you got the Starbucks CEO, Brian Niccol, says the coffee giant is all in on AI, reveals real time artificial intelligence systems designed to assist baristas and transform store operations. I got a tip for you, Brian. Get on Profound. Get your brand mentioned on ChatGPT.

Speaker 4:

Honestly

Speaker 1:

Reach millions of consumers who are using AI to discover new products and brands. Seriously, people are gonna be saying, I want coffee. Where should I go near me? You gotta be on ProFound. Get your brand mentioned.

Speaker 2:

Something, we gotta put on the skits I had.

Speaker 1:

Okay.

Speaker 2:

Please. Starbucks stock is up 6.66% past five days. Coincidence?

Speaker 1:

Very weird.

Speaker 2:

Are they are they using AI to summon the demon?

Speaker 1:

I don't know. Well, if they build something, they should do it on Google AI Studio, the fastest way from prompt to production with Gemini. Chat with models, vibe code, and monitor usage. Also, we have to try that frame to frame Veo 3.1 thing. We'll talk about this later, Tyler.

Speaker 1:

But, did you see these demos? So, basically, you've always been able to upload a single image and then say, like, animate this and turn this into something. But now you can upload two images, a starting image and an ending image, and have Veo 3.1, like, interpolate between them. And so I saw this really cool video that someone made where it was basically a tour of ancient Rome. And so it's flying around through the Colosseum, and the Colosseum gets built up, and then it fills with water.

Speaker 1:

And then there's boats, and you go under the water, and there's sharks, and then you go into some tunnel. You come out. You're at the Acropolis. It was really awesome, and I feel like we could do something really creative with this. It really got the creative juices flowing.

Speaker 1:

So excited to build something around that. Maybe I'll plan out the whole project with Linear, because Linear is a purpose built tool for planning and building products. Meet the system for modern software development, streamline issues, projects, and product

Speaker 2:

People really were not excited about Starbucks getting into AI. They had other suggestions. Like, I saw Ryan Petersen just saying, make the Wi Fi Make the

Speaker 1:

Wi Fi work.

Speaker 2:

Go back, to the previous era.

Speaker 1:

Also, our president Dylan Abruscato went viral on X, quote posting Bearly AI, who is breaking some news about Uber. Uber is going to give its drivers in The US an option to make money by doing digital tasks. These short minute long tasks can be done anytime, including while idling for passengers. So your Uber pulls up, starts waiting, starts doing some tasks to help train the next generation of AI. You can do the data labeling, uploading restaurant menus, recording audio samples of themselves, narrating scenarios in different languages.

Speaker 1:

The list is quite long.

Speaker 2:

Also absolutely hated this.

Speaker 1:

Yes. They did stuff like this as

Speaker 2:

I hope they put some type of restriction in where if they detect that you're, like, actually moving, they say you cannot do data labeling right now. Because somebody's gonna be on FSD, somebody in traffic, just data labeling.

Speaker 1:

You're you're you're delegating the driving task to the AI while training the AI on the next thing. And you get a task, and it's like it's like, is this car about to crash? Yes or no? And you look up, and it's a picture of exactly what you're looking at. You're like, oh, no.

Speaker 1:

I should have been labeling this with the pedals instead of the buttons. I don't know. There's a lot of Black Mirror scenarios, but it does seem like a big market. Scale AI, obviously, Surge AI. We've talked to Mercor.

Speaker 1:

We've talked to a few others of these data labeling companies Labelbox. Some of these are more focused on the real complex reinforcement learning environments with verifiable rewards. It's more expert driven, but there's still clearly a need for general AI data labeling. Tyler, do you have a take on this?

Speaker 4:

Yeah. I mean, was just interesting because I I feel like, generally, the playbook of of data labeling has, like, definitely moved, like, up the the scale ladder. Right? Like Totally. Scale.

Speaker 4:

I I don't know what their, like, revenues are now from the same kind of, like, you know, the the Filipino, like

Speaker 1:

Yeah. It felt like the job was finished with the general base level RLHF, but I think there's still niche areas where, you know, uploading every restaurant menu, like, Scale didn't necessarily do that. They're probably still competing with that. I mean, when we had the president or the new CEO of Scale on, he was saying they just got a new huge contract for a $100,000,000 from the DOD. What are they data labeling?

Speaker 1:

Probably some stuff that hasn't been labeled before. You know? How much how much how many how many rations do they have in in stock over time or something?

Speaker 2:

Tyler, you're pretty AGI pilled. Right? Yeah. Okay. Well, why don't you earn a $100 doing data labeling?

Speaker 1:

Yes.

Speaker 2:

That's your challenge for that's your challenge for the next twenty four hours.

Speaker 1:

You if you believe in, well, Roko's Basilisk. Roko's Basilisk demands that you do Yeah. I feel like you do data labeling.

Speaker 2:

Sign up sign up right now.

Speaker 1:

Help summon the shoggoth with data labeling tasks $1 at a time. I don't know.

Speaker 2:

Matthew Prince said, it was weird being in Vegas recently, so quiet. So many fewer people gambling, drinking, partying. My pet theory: Ozempic is killing Vegas just like it's killing snack food brands, liquor producers, and Napa. Yes. I didn't know it was impacting Napa.

Speaker 1:

Yeah. Didn't know that either.

Speaker 2:

Obviously, this is just kind of him riffing.

Speaker 1:

Yes. But do you agree with him or do you agree with the rebuttal from Near which you can read?

Speaker 2:

Near says, I would go the opposite angle. With the rise of Robinhood, prediction markets, and sports betting, there is little reason to go to Vegas. It is more expensive, the house edge is higher, and it's hotter outside. Less drinking contributes too, though.

Speaker 1:

Okay. So do you think it's the rise of of online, outlets

Speaker 2:

for Vegas Or isn't it, like, a recession indicator? Right? Like That's another option. It's just not doing well

Speaker 3:

right now.

Speaker 1:

Yeah.

Speaker 2:

Right? There's no I would wanna know like It's Yeah. It's not like, you know, the average person that's going to Vegas is saying

Speaker 1:

Yeah.

Speaker 2:

I need or the average person in America is saying, I gotta be in Vegas this weekend.

Speaker 3:

They're

Speaker 2:

scaling AI CapEx like crazy. I gotta be there.

Speaker 1:

Yeah. Like Tyler, can you look up the rough percentage of Americans who are on Ozempic? Like, I wanna know, is it, like, 1%, 10%, 50%? Because if it's, like, two percent of Americans are on peptides, I would expect that wouldn't show up in the Vegas data. But if it's, like, 50%, I might see an effect there.

Speaker 1:

Right? I don't know.

Speaker 2:

Or think that What what do you have roughly? Kind of people that go to Vegas are, like, you know

Speaker 1:

The most likely to be on Ozempic, Oh, what but let's just get the number. What

Speaker 4:

do you think? Like around twelve percent have used it once.

Speaker 1:

Okay.

Speaker 4:

So we're not a type of character

Speaker 1:

in the air. Insignificant. That could take a couple points off the off the house edge. But the question is Yeah. The other thing clearly under pressure.

Speaker 2:

Yeah. People I I saw another quote that just said, like, the nightlife scene there has been hyper financialized to a degree that it's just not even fun anymore.

Speaker 1:

Oh, sure. Sure. Sure. Yeah. Well, if you're running a casino in Vegas, you have sales tax, so you gotta get on numeralhq.com, sales tax on autopilot.

Speaker 1:

Spend less than five minutes per month on sales tax compliance. But here's my theory for how you bring back Vegas. People aren't gambling for one reason or another. Maybe it's the Ozempic. You need to make Vegas high end.

Speaker 1:

So we Vegas needs to bring in an opera house, a symphony, an art gallery. The Louvre should relocate. Maybe they're behind it. Maybe we'll see, you know, fine art come to Las Vegas. This is the only way that they can sustain.

Speaker 2:

Yeah. What if they set up a bunker in there that that has a bunch of stolen goods from all over the world?

Speaker 1:

They already have an F1 race. They need to turn it into Monaco. They need to part I

Speaker 2:

think somebody's gonna look this up and be like, they already have an opera. They already have a symphony. They already have art museums.

Speaker 1:

But they should close all the nightclubs and only have opera and only have symphony. I think that would be the true solution. We talked a little bit about the water issue. Where else should we go? There's

Speaker 2:

made a pretty egregious error when trying to calculate the unit economics of neo clouds and hyperscalers.

Speaker 1:

Yeah. What happened here?

Speaker 2:

I don't know if they've issued an official correction yet, but it came out, it was super bearish, and then a bunch of people pointed out that they had gotten the math wrong by Well, if you

Speaker 1:

want to get the correct data, I would defer to ClusterMax, the project from SemiAnalysis. They are the most reliable analysts in the space.

Speaker 2:

You said, last week, the new media traditional media divide is far too simplistic. If you wanna understand things today, you have to know the difference between legacy media, traditional media, new media, legacy new media, neo media, post neo media, alt media, and neo alt media.

Speaker 1:

Yeah. You yeah. You really have to study this stuff. It's it's more complex than punk bands. Like, you can't just say, oh, it's a rock band.

Speaker 1:

There's so many different layers. This is actually a riff on what you said, because you used the phrase legacy new media around me, and I thought it was so funny. But, obviously, this is a comment on the colossal debate over Colossus Magazine.

Speaker 2:

And I think the reason we were talking off air this morning about this, I think the reason that it sparked a debate is that it it's new media that looks and feels like like legacy media.

Speaker 1:

Unpack this. This is a good take. But first, before you do, let me tell you about Attio. Customer relationship magic. Attio is the AI native CRM that builds, scales, and grows your company to the next level.

Speaker 2:

Continue. My point was that, Colossus looks like Fortune Magazine. Yeah. It doesn't, like It's the Joker. Of course, it's

Speaker 1:

No one cared until I put on the makeup. Right? Yeah. Isn't that isn't that from the Joker?

Speaker 4:

What's the what's the line? Or the mask.

Speaker 1:

Yeah. Oh, the mask. Right? Yeah. He says no one cared until I put on the mask.

Speaker 3:

It's Batman's

Speaker 1:

Oh, it's Batman. Yeah. Wait. What? Batman says that?

Speaker 4:

Yeah. He puts on the mask.

Speaker 1:

And he says no one cared until I put on the mask?

Speaker 4:

Let me fact check.

Speaker 1:

I think this might be the Joker or something. I don't know. Anyway, the idea is, like, people have been writing profiles about tech companies in a positive way. Like, Packy McCormick has had a really successful yeah.

Speaker 4:

Okay. It's Bane.

Speaker 1:

It's Bane.

Speaker 4:

Yeah. Yeah. This makes me

Speaker 1:

That's right. Bane says no one cared till I put on the mask. Right? And so it's like no one cared about, oh, it's just a blog. Oh, it's just a Substack.

Speaker 1:

Oh, it doesn't matter if it gets a lot of views or a lot of attention in tech. What matters now is that Colossus put the profile in terms that the other magazines can understand, because it looks like theirs. It's the same glossy magazine. And we're seeing this with Arena Mag. We're seeing

Speaker 2:

this It it's like legacy media didn't feel threatened until

Speaker 1:

It took on the aesthetics.

Speaker 2:

Took on the aesthetics

Speaker 1:

Yes. Of aesthetics. And we really need a new term for this because there's there's traditional media, there's new media, and then there's new new media. It's

Speaker 4:

Neo legacy media.

Speaker 1:

It's neo trad media. It's neo we're the neo trads. We're the neo trads. So this show is new media, but it looks like trad media. You can see the overlays and the stock tickers.

Speaker 1:

Neotrad. This looks like TV, but it's not. And Colossus similarly looks like a magazine, but it's really not. I mean, it is a magazine, but it's also but most people consume the Thrive profile on the web. It's not just a blog.

Speaker 1:

It's not just a Substack. So Yep. Fascinating. Well, Tyler threw out an alternative term, adding to the fray of media terms. What do you wanna call it?

Speaker 4:

Dark media.

Speaker 1:

Dark media? Are we dark media? You're dark media. Look how dark your background is. You're dark media.

Speaker 4:

I think the the doomers if if the doomers released, some new publication, I think that would be dark media.

Speaker 2:

Yeah. Well, if you were

Speaker 1:

speaking of dark, it's dark when you go to bed, sleep on an Eight Sleep, 8sleep.com, pod five, five year warranty, thirty day risk free trial, free returns, free shipping.

Speaker 2:

It is so funny that Eight Sleep runs on AWS. And so Oh, no. It was it was

Speaker 1:

Oh, no.

Speaker 2:

My my reporting last night is all off. I'm seeing some other, but it's good.

Speaker 1:

I took a fast fantastic nap on my eight sleep yesterday, so I'm feeling very refreshed.

Speaker 2:

In other news Yes. Stitch Fix is up 7% today after Bill Gurley posted

Speaker 1:

Go away.

Speaker 2:

Pictures of himself using Stitch Fix's new AI product, Stitch Fix Vision. Is this I think Stitch Fix is just popping because they announced a version of, like this Yeah. Yeah. You upload some images and then it'll generate you in different outfits. Thrive did a company like this.

Speaker 1:

Well, that's super interesting.

Speaker 2:

Like Doji. Doji.

Speaker 1:

I don't know how I don't know

Speaker 2:

They basically created like a Doji.

Speaker 1:

I don't know the I don't know the recent history of Stitch Fix, but I remember at one point, they were I think they were employing a lot of human beings to do the, the collection assembly. So you would send in some photos. You would say your some of your opinions, and then they would put together a box of clothes for you and kind of act as like a virtual stylist. And and these stylists obviously had real costs to them. And so if that's something that AI can do, like, that is a material change to their business and their fundamental economics.

Speaker 1:

So who knows if people will like it? Maybe they say, hey, this particular outfit is slop. But if they have a lot of training data and also the whole pipeline of logistics, fulfillment, and ecommerce set up, that could be significant for the business. It's exciting.

Speaker 1:

I'd love to have the founder on and talk more.

Speaker 2:

Kermeux says over the last fifteen years, Reddit's relationship advice has shifted towards recommending breakups, boundaries, and therapy. Yeah. And against compromise, communication, and letting people have their space.

Speaker 1:

Mhmm.

Speaker 2:

So, well, there's actually, I guess, a Reddit user that generated all this. Sure. This is concerning for a number of reasons, one being that your favorite AI models are all trained on it. Yeah.

Speaker 1:

This is what Pat Gelsinger has been sounding the alarm about when he evaluates the LLMs. He put it in different terms, but he kind of noticed a similar trend. What's your best advice for a relationship? I say start a podcast together. That works great.

Speaker 1:

Andrew Ross Sorkin's new book, 1929, has wild details of Winston Churchill YOLO trading US stocks.

Speaker 2:

We gotta expand on that, because if you're having relationship problems, whether friend or romantic, and you start a podcast together... That sounds

Speaker 1:

like terrible. So here are the options from Reddit: end relationship, communicate, give space/time, set/respect boundaries, seek therapy/counseling, compromise, other. So you're saying

Speaker 2:

other is podcast?

Speaker 1:

The ring. How about getting in the octagon?

Speaker 2:

Duke it out.

Speaker 1:

Duke it out. Break a sweat. Break some knuckles, you know?

Speaker 2:

Trunk

Speaker 1:

Throw down. Throw down. That's the correct answer. Throw down. If you're having relationship problems, throw down.

Speaker 1:

Why is that not on Reddit at all? Redditors haven't even thought of that as advice for a relationship.

Speaker 2:

True.

Speaker 1:

Throw down.

Speaker 2:

You gotta get on there. Trung Phan was sharing some highlights from Andrew Ross Sorkin's

Speaker 1:

Wait. I have one more piece of advice before we go to 1929. If you're having a relationship problem with someone, buy them a luxury watch on getbezel.com, because your Bezel concierge is available now to source you any watch on the planet. Seriously, any watch. You're like, you know, Tyler seems frustrated with me lately.

Speaker 1:

What should I do? It's like, get him a GMT.

Speaker 2:

There you go.

Speaker 1:

Get him get him a Get

Speaker 2:

him a hitter.

Speaker 1:

He's gonna be like, yeah, it's water on the bridge, John.

Speaker 2:

Metal solves So let's go

Speaker 1:

to Tyler Cosgrove with his official review of 1929. Did you really read the whole book this weekend?

Speaker 4:

No. I was planning on it, and I

Speaker 1:

I didn't.

Speaker 4:

No. I I yeah. I'm not done yet.

Speaker 1:

Did you watch any TikToks about the book? No. Did you generate a Sora summary of the book?

Speaker 4:

That would have been much more efficient.

Speaker 1:

Okay. Did you actually read any of the book? Yeah. Yeah.

Speaker 4:

I'm, like, close to, like, halfway through. Cool. Yeah. I told you guys I was gonna be done, but yeah.

Speaker 3:

I know

Speaker 4:

there was a little white lie. But, I mean, it's really interesting. Okay. It's, like, kind of crazy how many parallels there are between, like, the current day and 1929. Break it down.

Speaker 4:

So I think we asked Sorkin a couple of these. Yeah. But okay. So one example is this guy, Durant. He's famous for just being like this. This is Kevin Durant, to be clear.

Speaker 4:

Kevin. He's, like, thinking about starting this company that is essentially the same thing as a company we've had on, Basic Capital.

Speaker 1:

Okay. Where Oh, yeah.

Speaker 4:

It's basically a mortgage for buying, like, ETFs.

Speaker 1:

Yeah. Leverage.

Speaker 4:

Yeah. There's so many parallels where it's like, oh, that's just like that thing that I saw in the news the other week. So, okay, I'm a little bit bearish reading this.

Speaker 1:

Sure.

Speaker 4:

But I think also, I mean, the main thing in the book is you see, like, the real reason for all the speculation is basically just that there's so much new money coming in. Sure. Like, so many of the big banks are just focusing on trying to get, essentially, like, retail

Speaker 1:

Yeah.

Speaker 4:

To take leverage or to just, like, get people to buy stocks

Speaker 1:

in general? You know, Sorkin was mentioning, like, the AI equivalent in 1929 was RCA, like, radio waves. Like, does that come through, or was the bubble broader?

Speaker 4:

Yeah. I mean, he talks about RCA a fair amount. I think that's mostly just a specific example of a stock that is being pumped.

Speaker 1:

Sure.

Speaker 4:

But the general fervor is certainly not like, oh, everyone has to get in on this specific company. Yeah. It's that everyone needs to get in on the market.

Speaker 1:

On the market.

Speaker 4:

The market is just gonna keep

Speaker 1:

I mean, there's a little bit of that happening.

Speaker 2:

Here's some extra context. So Churchill was day trading. Love it. He was trading £400,000 a week, but adjusted for inflation, that's $36,000,000 that he was just slinging around while... Great.

Speaker 2:

While he was... was he actually in power in '29? Let me see.

Speaker 4:

No. So he's kind of in some kind of, like, ambassador role at this point. Okay.

Speaker 1:

But

Speaker 4:

yeah. Basically, he has this very, like, luxurious lifestyle that he has to keep up. Yep. So he goes on this roadshow throughout the country. Mhmm. Where he just finds these

Speaker 2:

Churchill didn't actually come to power until 1940.

Speaker 1:

Yeah. So

Speaker 4:

Yeah. So he's going on this roadshow basically to find rich guys who will just give him money to keep traveling, but also just so that they can speculate. Mhmm. And another thing is

Speaker 1:

So is he kinda like the Leopold Aschenbrenner of 1929?

Speaker 4:

I think Leopold has much more of a differentiated view.

Speaker 1:

Much more alpha. Yeah. Winston was pure beta.

Speaker 4:

Yeah. Well, I mean, it's still alpha in the sense that the amount of, like, insider trading going on is kind of absurd. Like, basically, there's a deal that's about to happen, and then the banks just deal out like, oh, here's a couple friends of mine, you know. Here's like Calvin Coolidge. I'll just give him like $400,000, basically.

Speaker 4:

Wow. A lot of current presidents are, like, getting deals, you know. So, yeah, maybe it's not that different,

Speaker 1:

you know. Yeah. Who knows? Yeah. That's fun.

Speaker 1:

That's fun. Anything else to take away from the book?

Speaker 4:

I think once I fully finish it, I'll write something out maybe, then I'll give a final review.

Speaker 1:

That'd be cool. But yeah.

Speaker 4:

Very good so far.

Speaker 1:

Overall, if you haven't had a chance to pick up the book yet, you can get it on Audible or wherever books are sold. We're very thankful to Andrew Ross Sorkin for coming on the show, sharing his insight, and hanging out with us for about half an hour last week.

Speaker 2:

It's worth noting that Churchill made it all back with a book deal.

Speaker 1:

He did?

Speaker 2:

On his

Speaker 1:

No way.

Speaker 2:

World War two memoirs.

Speaker 1:

Okay. Here's a potential recession indicator from Jacob Silverman. He says the number of people who registered to take the LSAT last year, in 2024, was 18,000. But in September 2025, a year later, the number of people who registered to take the LSAT is now 32,000. So a massive, massive uptick.

Speaker 2:

Yeah. And the idea is that these are people that were trying to enter the job market, struggling, and are basically trying to buy time

Speaker 1:

Maybe.

Speaker 2:

Or or

Speaker 1:

Oh, yeah. New grads.

Speaker 2:

Not finding yeah. Just not finding opportunities that are

Speaker 1:

College grads who say, hey.

Speaker 2:

Their expectations

Speaker 1:

Maybe, alternatively, these are people who watched Andrej Karpathy on Dwarkesh and said, well, AGI's ten years away. I can go be a lawyer. I'll take the LSAT now, get through law school, become a partner, and then I'll be able to cash out before AGI hits. So, you know, LLMs aren't gonna be able to do law anytime soon. It's a layup.

Speaker 1:

Heard it there first, and so they all register for the LSAT. Could happen. Could happen. Anyway, let me tell you about Wander. Find your happy place.

Speaker 1:

Book a Wander with inspiring views, hotel grade amenities, dreamy beds, top tier cleaning, and twenty four seven concierge service. It's a vacation home, but better, folks.

Speaker 2:

Did you see the Senate Republicans posted a video of Chuck Schumer? Yes. It was a real quote, but it was fully AI generated.

Speaker 1:

They made an AI video and audio of the

Speaker 2:

Can we pull the video up to see how sloppy it is?

Speaker 1:

Yeah. They made an AI video and audio of the actual quote to kind of illustrate it to the audience. Andrew Curran says, first time I've seen this: this is a real quote, but it wasn't said on camera, so they generated a video of Chuck Schumer saying it. Now, I would love to know what model they used, because a lot of models have guardrails here.

Speaker 1:

Let's play the video right now. Chuck Schumer thinks playing with Americans... Every day gets better for us. Chuck Schumer... Low income women, infants... Every day gets better for us.

Speaker 2:

Okay. K. The mouth movement is a

Speaker 1:

lot. Chuck Schumer thinks... Every day gets better for us... Every day gets better... It's bad. I mean, like, it's clearly based on some real image that they took, and so the lighting looks very real. The skin texture looks really real because the AI model doesn't really have to do that much to regenerate it from scratch. But, of course, we've seen that, you know, all the models are basically indistinguishable if you don't have full context at this point.

Speaker 1:

Awesome says we're gonna see some AI laws come to fruition as fast as they pushed the COVID bill through back in March 2020: the Democrats introduce the bill, negotiations happen, Trump signs it on condition of ending the government shutdown. That's kind of a bold take.

Speaker 1:

I don't know if that will for sure happen, but you could definitely see, before the next election, both sides wanting some clarity on where the line is in terms of what you can use. Now, the NRSC, the Senate Republican group that created this video, did put an AI-generated tag in the bottom right-hand corner, so they did disclose that they used AI. Will that be required? Will there be certain, you know, definitions of how aggressive the watermark needs to be?

Speaker 1:

That's all up for debate, but we will continue to check it out.

Speaker 2:

More insight here Yes. From Barbarian Capital.

Speaker 1:

Mhmm.

Speaker 2:

Thought we were out of noun-plus-Capital names on X, but we always find more.

Speaker 3:

And more.

Speaker 2:

They pulled up a chart here from Schroders. It's looking at average year-to-date performance of companies in each category: Nasdaq with no revenues. Mhmm. The Mag Seven, unprofitable Nasdaq, unprofitable US small and mid caps, Nasdaq with revenues, the S&P and profitable Nasdaq, and then profitable small and mid caps.

Speaker 2:

What group do you think performed the best year to date, John?

Speaker 1:

I have no idea.

Speaker 2:

NASDAQ, no revenues.

Speaker 1:

No revenues.

Speaker 2:

Yeah. Wow. I was gonna say, the Nasdaq has much looser listing

Speaker 1:

Okay.

Speaker 2:

Sure. Requirements than the NYSE. And, anyway, so yeah.

Speaker 1:

And, you know, in Silicon Valley, the HBO show, like, you don't want revenue because then people will comp you. But now we're doing it in the public markets. That is crazy. And, I mean, that goes back to the 1929 thing that Tyler was talking about.

Speaker 1:

Like, once you get to the level of fervor in the market where it's like, you just gotta be in the market, and you see froth in stuff that's not even tied to the core innovation, it's like, wait. AI is clearly, you know, a transformative technology. There is growing revenue. Now, there's debate about the level of CapEx that's correct, but that's wildly different than looking at some company that's saying, like, oh, we're gonna colonize the moon or build a quantum computer or do something else that's completely off the current tech tree, and they're mooning. Not to be too punny about it, but that's when you get into real, real bubble territory.

Speaker 1:

Well, I have one final post. You have anything else? One final post. Congratulations to Everett Randle, former partner at Kleiner Perkins. Now he's a general partner at Benchmark.

Speaker 1:

He has announced his trade deal, potentially the trade deal of the year. Ev Randle is out of KP. We should get him on this week. That would be amazing to

Speaker 2:

catch up with him. What's the plan?

Speaker 1:

And of course, Delian has already taken shots, putting him in an AI-generated image without a watermark. And I think everyone can tell that this is an AI-generated image.

Speaker 2:

There's one person out there that doesn't. I don't know who it is.

Speaker 1:

Maybe. Maybe. Well, Everett's playing along and laughing at Delian's post because they're both having fun on the timeline. But congratulations to Ev, of course, and good luck with the next era of his investing career.

Speaker 2:

One thing I missed about the MAGA SPAC Mhmm. Is that Chamath will be in this one too

Speaker 1:

No way.

Speaker 2:

With Donald Trump junior and Laura Ingraham.

Speaker 1:

I wonder... so this is just a SPAC. They haven't identified a target. We need to dig in and understand, like, have they even defined an area that they would want to take a company public in?

Speaker 2:

Well, they found an area where they wanna incorporate the Columbia Acquisition Corp. Mhmm. And that is the Cayman Islands.

Speaker 1:

Okay. Yeah. So they're in the Cayman Islands. I think that's standard practice, but I don't know.

Speaker 2:

Yeah.

Speaker 1:

Anyway, thank you for tuning in. Thank you for watching. If this was the first show you've seen because we have the CEO of GameStop on today, please subscribe. Follow us on X, YouTube, Instagram, LinkedIn, wherever you get your live streams.

Speaker 2:

Shout out to our whole production team.

Speaker 3:

They were

Speaker 2:

fighting for their lives. We were under

Speaker 1:

attack probably by a state actor. We don't know yet. We will get to the bottom of it. We will bring you the news

Speaker 2:

They thought they could take down our show by taking down AWS. Nice try.

Speaker 1:

Nice try. Try again. Try again. Don't actually try again, please. And you can also subscribe to our newsletter at tbpn.com.

Speaker 1:

Thank you so much. Leave us five stars on Apple Podcasts and Spotify. We'll see you tomorrow.

Speaker 2:

Massive day.

Speaker 1:

Massive day.

Speaker 2:

Palmer Luckey.

Speaker 1:

Palmer Luckey and Brian. And

Speaker 2:

probably a lot more. See you then.

Speaker 1:

Goodbye.

Speaker 2:

Cheers.