TBPN

Diet TBPN delivers the best of today’s TBPN episode in 30 minutes. TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays 11–2 PT on X and YouTube, with each episode posted to podcast platforms right after.

Described by The New York Times as “Silicon Valley’s newest obsession,” the show has recently featured Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella.

TBPN is made possible by:

Ramp - https://Ramp.com

AppLovin - https://axon.ai

Cisco - https://www.cisco.com

Cognition - https://cognition.ai

Console - https://console.com

CrowdStrike - https://crowdstrike.com

ElevenLabs - https://elevenlabs.io

Figma - https://figma.com

Fin - https://fin.ai

Gemini - https://gemini.google.com

Graphite - https://graphite.com

Gusto - https://gusto.com/tbpn

Kalshi - https://kalshi.com

Labelbox - https://labelbox.com

Lambda - https://lambda.ai

Linear - https://linear.app

MongoDB - https://mongodb.com

NYSE - https://nyse.com

Okta - https://www.okta.com

Phantom - https://phantom.com/cash

Plaid - https://plaid.com

Public - https://public.com

Railway - https://railway.com

Restream - https://restream.io

Sentry - https://sentry.io

Shopify - https://shopify.com/tbpn

Turbopuffer - https://turbopuffer.com

Vanta - https://vanta.com

Vibe - https://vibe.co


Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

TBPN is a live tech talk show hosted by John Coogan and Jordi Hays, streaming weekdays from 11–2 PT on X and YouTube, with full episodes posted to Spotify immediately after airing.

Described by The New York Times as “Silicon Valley’s newest obsession,” TBPN has interviewed Mark Zuckerberg, Sam Altman, Mark Cuban, and Satya Nadella. Diet TBPN delivers the best moments from each episode in under 30 minutes.

Speaker 1:

Well, I've been addicted to social media lawsuits. I cannot get enough of these lawsuits. I keep reading about them.

Speaker 2:

Losing sleep. You're potentially filing your own lawsuit against the lawyers

Speaker 1:

Yes.

Speaker 2:

That were coming after these social media.

Speaker 1:

Yeah. Yeah. So there's actually a profile in The Wall Street Journal, in The Exchange this weekend, of the lawyer who beat Meta and Google. And it goes into some of his addictive techniques that are driving jurors crazy across the country. Attorney Mark Lanier, he uses props.

Speaker 1:

Come on.

Speaker 2:

Come on.

Speaker 1:

What's more, to go with the props, he also uses parables. Okay? What? Parables.

Speaker 1:

Metaphors, axioms, all of the above. He moonlights as a preacher, and it shows when he's taking on the world's most powerful companies. The 65-year-old came to court in Downtown Los Angeles for closing arguments this month in one of the biggest trials of his career, armed with a parable of leavened bread. That feels like something that is designed to make it hard to rip yourself away. Exactly. So he knew he needed a simple way to show a jury that Meta's Instagram and Google's YouTube were designed to be addictive and were harmful to young people.

Speaker 1:

So the veteran plaintiff's lawyer

Speaker 2:

We were just saying he looks fantastic for 65.

Speaker 1:

He does look fantastic. And as much as I'm joking, I do think he's doing important work, and I do think there's a potentially really good outcome here that we'll go into. But we're still having some fun. So the veteran plaintiff's lawyer from Texas showed them two grocery items, cupcakes and tortillas. Social media, he told the courtroom, was like the baking powder that makes a cake rise, exacerbating the struggles of already vulnerable teens.

Speaker 1:

We have an interactor, an amplifier, something that blows it up. We have here social media that takes the vulnerable and goes after them in destructive ways. It's as easy as ABC. He's making the argument that social media is more like cupcakes than tortillas. Both contain flour.

Speaker 1:

Both are carbohydrate loaded. But one is bigger than the other, or puffier, I suppose. The simple image, delivered with Lanier's slight drawl, helped convince a majority of jurors. On Wednesday, the ninth day of deliberation, the jury found that Meta and YouTube were negligent in a case that accused the companies of designing their apps to be addictive and harmful to teens. And there's some interesting images, both of him walking into the courthouse with a large box of papers. Clearly a very anti-tech movement there.

Speaker 1:

He's saying, I reject technology. This cannot be stored digitally. I'm using paper.

Speaker 2:

Which, I don't know, this seems a little bit risky because we've been addicted to the printed word

Speaker 1:

So in the

Speaker 2:

So much so that we faced criticism from people that said, hey, printing is unnecessary.

Speaker 1:

They did.

Speaker 2:

He's very print-friendly, but maybe he's forced to adjust.

Speaker 1:

Maybe he can flip over to be our defense attorney when we are attacked. There's a courtroom sketch showing Lanier questioning former TBPN guest Adam Mosseri, the head of Meta's Instagram. A jury ordered the company to pay $3,000,000 each in compensatory damages and $3,000,000 in punitive damages. So I think it's $6,000,000 across both firms, but it's split between compensatory and punitive damages. A now 20-year-old woman named Kaylee, whose last name was redacted in the case, had testified that social media use that started when she was a child dominated her life for years and contributed to mental health issues, including anxiety, depression, and body dysmorphia.

Speaker 1:

Very, very sad situation, very unfortunate for her, of course. In a statement, Meta said it disagrees with the verdict and plans to pursue an appeal: reducing something as complex as teen mental health to a single cause risks leaving the many broader issues teens face today unaddressed. Not mutually exclusive, but of course, that is a reasonable position for Meta to take. Google also put out a statement. What do you think?

Speaker 2:

They're like, we're not even a social media company.

Speaker 1:

We're a VR company.

Speaker 2:

No. No. No. Google said the verdict misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.

Speaker 1:

That's true.

Speaker 2:

Got the wrong guy.

Speaker 1:

Yeah. I think of YouTube very much as in the same world as social media, since anyone can post, but it is severely lacking in some of the greatest features of social media sites. When you actually become a YouTuber, you start putting out content, and there is sort of, I don't know, a group of made men on YouTube, people that have ascended. They're now making content professionally, and they are in conversation with each other, and they might be reacting to each other's content. Of course, there are different communities.

Speaker 1:

There's the car YouTuber community, and then there's the game show community, and there's the business community. And pretty quickly, everyone sort of gets to know each other, but there's no DM feature. So even if I make a video

Speaker 2:

Which is a good argument for

Speaker 1:

it not being social media.

Speaker 2:

For it not being social media.

Speaker 1:

Yeah. Yeah. So, like, you know, we at this point have done the Colin and Samir show. We don't really have a way to reach them. We can go on to the Colin and Samir YouTube channel and leave them a comment, and they might see it if it's from the TBPN account.

Speaker 1:

But we can't like just DM them and be surfaced to the top of the inbox. People have always wanted an inbox on YouTube. Yeah. That's a huge feature request. It's insane.

Speaker 1:

It would be so cool to be able to see, okay, I got a DM from someone who has 100,000 followers, and I can click on their profile and see, oh, they're in the same niche, maybe we'd wanna work together, as opposed to everyone basically needing to flow over to Twitter or X and then DM there

Speaker 2:

The other thing Google has going for it in this kind of position is that so much of the watch time on YouTube is happening on television. Something like 50%.

Speaker 1:

Yep. Very different.

Speaker 2:

And so they can make the argument that this is just modern

Speaker 1:

Yep. Television. So let's go through Lanier's career, because The Wall Street Journal has some interesting backstory here. It says Lanier has built a career and fortune representing plaintiffs against corporate giants. He won one of the first major wrongful death trials against pharma company Merck over claims that the prescription anti-inflammatory drug Vioxx caused heart problems.

Speaker 1:

He also won a $4,690,000,000 verdict in 2018 for women and their families who said asbestos-tainted talcum powder caused ovarian cancer. Over his career, it seems like he's done some very, very good work and has won some massive, massive settlements against big companies with broadly damaging products. A lot to admire about his career here. The social media trial drew more scrutiny than he predicted before he joined the plaintiffs' team last fall and was brought face to face with Meta Chief Executive Mark Zuckerberg. Suddenly, Lanier was at the epicenter

Speaker 2:

I believe that Zuck is actually mewing in this picture, if we can pull up this image.

Speaker 1:

It does appear to be something along those lines. Suddenly, Lanier was at the epicenter

Speaker 2:

You agree, Tyler. Right?

Speaker 3:

You can tell his cortisol is not spiking here.

Speaker 1:

That's true. He definitely seems calm, collected. But this is not his first time putting on a suit. This is not the first time he's been in court. Suddenly, Lanier was at the epicenter of a broad public debate about social media and how people stay connected or are disconnected on platforms offering nearly endless content curated by algorithms.

Speaker 1:

Quote, nothing compared to this, Lanier said, reflecting on the attention to the trial over oatmeal toast and a Coke Zero in Downtown Los Angeles. Social media companies have largely been shielded from being held liable for third-party content on their platforms by Section 230 of the 1996 Communications Decency Act. At trial, Lanier had to focus on the platforms' features, not the content, to make his case. And that's something that I want to talk about today and that I wrote about in the newsletter. The trial was the first among thousands of consolidated lawsuits filed by teenagers, school districts, and state attorneys general against Meta, YouTube, TikTok, and Snap.

Speaker 1:

More are scheduled for this year. TikTok and Snap settled the first case. He's known for showing jurors hand-drawn road maps and illustrations on an overhead projector to guide them through his legal reasoning and evidence, including signposts and human figures that could have been sketched by a child. To visualize microscopic asbestos fibers in talcum powder, he brought a bale of hay into a courtroom and dropped a needle into the blades. Into the blades?

Speaker 1:

The blades of grass. Oh, the blades of hay. Got it. Okay. Wow.

Speaker 1:

Very, very interesting. He likes props. That's it. He held up a jar of 415 M&M's to show how a $1,000,000,000 fine would be a fraction of Alphabet's $415,000,000,000 in shareholder equity. He needs a bigger jar.

Speaker 1:

He says he tries to avoid being flashy himself. He wears the same two unremarkable suits on rotation during a trial, quote, and then I go burn them. What? He burns his suits after

Speaker 2:

My work here is done?

Speaker 1:

I guess. He began gaining renown as a lawyer in an era when asbestos cases were swamping US courts. He won a jury verdict of about $115,000,000 in 1998 for 21 steelworkers who fell ill after using machinery that contained asbestos. Lanier and his wife, Becky, met in high school debate class. They have five children and 12 grandchildren.

Speaker 1:

Wow. Overnight success. They were known for years for their child-friendly Christmas parties at their estate of more than 35 acres near Houston, which has a model railroad that can seat 120 people. Okay. This guy's gotta win them all. I have completely changed my position here.

Speaker 3:

I need a mansion section article.

Speaker 1:

I think I think we

Speaker 2:

have a direct line to him, by the way. Okay. We want him on the show. Maybe we should go do a show from

Speaker 3:

Yes. The

Speaker 2:

from the model train.

Speaker 1:

Yes. I'm so ready to be convinced of his position. I wrote a whole piece about how I disagree with the result, but he's winning me

Speaker 2:

over. You disagree with his entire argument, but you're agreeing with his approach to life.

Speaker 1:

Yes. Yeah. 100%. 100%. The theme of his cases against major corporations is responsibility and integrity or lack of it.

Speaker 1:

Tech billionaires don't need his help, Lanier said, but Kaylee would not have anybody else. Faith is much the same way. God's there to try to help people who need the help. Last week, Brandon Guerrell summarized the ruling this way. He said, in the case, the plaintiffs' lawyer, Mark Lanier, argued that Meta and YouTube built digital casinos that used neurobiological techniques similar to those employed by slot machines.

Speaker 1:

The jury found that specific features of Meta and YouTube are designed to be addictive. And I want you to really hone in on these features. So infinite scroll creates an environment where there are no natural stopping points. Algorithmic recommendations feed users highly engaging content. Autoplay removes users' agency in choosing whether to watch the next video.

Speaker 1:

Notifications pull users back in by exploiting their need for validation. IG beauty filters contributed to the plaintiff's body dysmorphia. And features like the Like button exploit users' biological need for social approval. Okay. So you got a bunch of features.

Speaker 1:

You know this stuff. Everyone uses social media. We all know about this stuff. The question is, are the features addictive or is the content addictive? Because social media platforms are, of course, protected from liability for the content that is posted on

Speaker 2:

them. Lanier's entire argument is predicated on it being the features.

Speaker 1:

So we talked to Eric Goldman from Santa Clara University School of Law. And he was saying that, yes, it's a $6,000,000 verdict right now, but this could be huge. The direct quote was about whether we will even have social media in the future. This could be existential.

Speaker 2:

Yeah. There's thousands of other cases like this kind of percolating.

Speaker 1:

And they could turn into a class action. He's gotten $6,000,000,000 before. He could get $50,000,000,000. I don't know. He could get a lot. Like, $6,000,000? He's not a $6,000,000 guy.

Speaker 1:

He's a $6,000,000,000 guy. And so this is the precursor, and it's going further. And whether it's a ton of different cases or one big one, it's a big problem for big tech. I thought it was an odd coincidence that we sort of had what I called the placebo-controlled trial for these exact features last week when OpenAI's nascent social network Sora shut down. The reaction to the news was funny to watch, because a lot of people were like, yeah, told you it was always bad.

Speaker 1:

But when it launched, it was exactly the opposite. Everyone was like, it's too good. We won't be able to look away. It's too good. Roon summarized this pretty well, I think, yesterday. Sora was peak moral panic.

Speaker 1:

All of these breathless takes about making videos that are going to addict humanity and waste everyone's time. Meanwhile, we made some funny videos that were less funny as time went on, and AI slop is just one category among many on Instagram Reels. Don't worry so much about making videos that are going to blow up people's brains; worry about making anything good at all. The best Soras were up there with the best Reels, and the humor relied significantly on the voice of the creator. I completely agree.

Speaker 1:

The funny Soras that I

Speaker 2:

Yeah, even the video we played last week of the cat on the porch

Speaker 1:

Yes. That wasn't one. Yeah. The prompt was not, make something that will retain users.

Speaker 2:

And it wouldn't have been funny if the person hadn't been escalating the scene with every new prompt. Sora

Speaker 1:

absolutely used all of the social media best practices, or addictive and harmful neurobiological techniques, if you want to use the coarse language. The Sora app was basically the same as TikTok, Instagram Reels, YouTube Shorts, and Snap in terms of UI and UX design. It had infinite scroll. It had algorithmic recommendations. It had notifications.

Speaker 1:

It had a Like button. And it didn't have IG beauty filters, but the whole thing is a filter, because I could go in there and say, make me look like a bodybuilder, and it did a good job and I looked great in the videos. It gave me crippling body dysmorphia, obviously. I dream of the day when I will look like my Sora avatar, my, what do they call it? Cameo?

Speaker 1:

My Cameo? No. But they really did use all the normal tools. That was for familiarity, but also because they're moving quickly. And the key innovation was not the UI design or the fact that it's vertical or algorithmic feeds.

Speaker 1:

Like, we are in 2026. We're not in 2014 launching Vine. So the key insight was purely AI-generated content. And it didn't work. The features were not addictive, because the people that downloaded Sora did not become addicted, because the content was a little bit too sloppy.

Speaker 2:

Yeah, well, it was just one type of content. Exactly. And it turns out people like selection and they like variability.

Speaker 1:

Yes.

Speaker 2:

They might want to see a video of someone skiing and then some slop and then something their friend made and then some health content. And it's really the collection of that. The other thing I think that seems very obvious is if it was the product itself and the features that were addicting, there would be so many social media apps that were effectively thriving. There would be like a bunch of Instagrams.

Speaker 1:

And this is where I get to the cigarette comparison. So there's a bunch of comparisons to the cigarette industry. And I think it's really worth revisiting, like, what is addictive about cigarettes? Because there are some people that say, like, it's an oral fixation. Like, you just want to put, like, a stick in your mouth, so you should, like, switch to carrots.

Speaker 1:

That is maybe, like, 1% of it.

Speaker 2:

Some could argue it's an addiction to looking cool.

Speaker 1:

There you go. But it is the nicotine. It is the nicotine. And that's why you do have a long tail of, like, 50 different cigarette brands and a thousand different e-cigarette brands. And nicotine gum is addictive.

Speaker 1:

Nicotine patches are addictive. Nicotine pouches are addictive, because they all contain the nicotine. If the court is asking us to believe that the Like button and the algorithmic feed are addictive, then we should see addiction-like results from any app that implements them, because that is the case for all nicotine-containing products. They all addict people. I mean, there are less addictive formats in general.

Speaker 2:

How many apps or TestFlights have you tried over the years that had any of these features, that you used for thirty seconds

Speaker 1:

Exactly. Because what actually keeps you coming back is the content, which is created by the users. And so you

Speaker 2:

want Lanier to go after every single person that has ever posted anything on Instagram and jail them, correct?

Speaker 1:

No. I think that some creators do create very compelling content. Some of

Speaker 2:

that. You want to jail the

Speaker 1:

best creator? No. Some of that content is amazing. Some of that content is great. Some of that content is bad.

Speaker 1:

There's a very, very wide range. So, yeah. But it is true. I think the court is correct, and Lanier is correct, that some people go on social media and make horrible content that depresses people that land on it. And it goes without saying that social media companies do have an enormous responsibility to manage recommendation feeds responsibly and route people in tough situations to helpful resources.

Speaker 1:

Google already does this very, very well. If you type in specific keywords that suggest you're in a mental health crisis, it will not give you search results. It will give you a phone number for someone to call. And they know when to route the right people to that. And I do believe that all the tech platforms are thinking about this and implementing this.

Speaker 1:

Maybe they need to be more aggressive. I think that the big thing that most people can agree on here is parental controls. And I think that that's a much easier middle ground. And in general, one other nice meet-in-the-middle option is potentially just getting tech companies to give users, and parents in particular, more control over their experience. My mental health as a social media creator was at an all-time high before I understood the metrics.

Speaker 1:

Because I was just like, oh, 300 views? I'm famous. This is amazing. 300 people sat down and watched my ten minute video essay about a dying VR technology or something like that. It was like, I've done it.

Speaker 1:

But then eventually, you get it, and you're like, wait, the last video got 400,000 views. Why does this one have 375,000 views? I'm a failure. It is possible for a platform to incentivize addictive content through metrics like the retention curve.

Speaker 1:

So retention editing makes it more addictive. You become addicted to the content, but it's because of the features.

Speaker 3:

Yeah. So I think that's, like, broadly

Speaker 1:

Pretty good argument.

Speaker 3:

The steelman that you can make for, yeah, like, Lanier's position. Yeah. And then, I mean, there's other stuff. I think, like, just the nicotine analogy, we were talking about this, like Mhmm.

Speaker 3:

Okay, so you have nicotine like broadly, and then below nicotine you have like smoking, which is like definitely very bad for you. And then you have like, you know, pouches or stuff like this, which is like probably less bad. Like, it's just nicotine. There's no tobacco. So like, maybe this is like less bad.

Speaker 3:

And so maybe Yeah. The equivalent is like, you know, the the cool snowboarding videos on Instagram Yep. Are like the, you know, the the cleaner like nicotine stuff and then the like

Speaker 1:

Still addictive but not harmful.

Speaker 3:

Yes. And then there's the question. Jordan is like, you're

Speaker 1:

gonna try and do a double cork twelve sixty and eat it. Get smoked. Get smoked into the ground. You would imagine, even in the most strict ruling where every new social media platform needs to be approved, you could potentially use all of those addictive features as long as the content inside that app was not carcinogenic. Yeah. And that would be like a new nicotine gum, basically.

Speaker 3:

Yeah. Like, basically, I'm saying, right now, if you're 18, you can still, like, there's parental controls. You can't be under 13 or whatever. But it's poorly enforced. You can actually see a lot of the bad stuff if you're 18 on Instagram or whatever.

Speaker 2:

So I think I have a potential solution. Let's pull up this image of a cigarette package in Europe. Okay. So this is typical cigarette packaging in Europe. John, you probably wouldn't know this because you're very American and you're very loyal.

Speaker 1:

You

Speaker 2:

avoid overseas trips as much as possible. So on any given cigarette pack in Europe, you're going to see a really terrible image. This woman Yes. Apparently is coughing up blood. Yes.

Speaker 2:

And so I think a potential solution Meta could implement is, as soon as you open Instagram, it makes an AI-generated image based on the last picture of you that you posted on social media, and it just makes you look terrible.

Speaker 1:

Oh. And

Speaker 2:

it says like warning, social media will destroy you.

Speaker 1:

You could show you

Speaker 2:

And then you can scroll past

Speaker 1:

We could potentially show you with tech neck.

Speaker 2:

Are you familiar with tech neck?

Speaker 1:

Yes.

Speaker 2:

Yes. It's just a crazy image of you with tech neck. It shows you on your phone with tech neck. Then you can scroll past it. But every time you open that app, it's a new image.

Speaker 2:

It's a reminder. It's a new image of you looking your worst, wasting your life

Speaker 1:

Are the AI labs lobbying to get

Speaker 2:

this removed? Put this away.

Speaker 1:

Are the AI labs lobbying to get that removed? Because I think most of their timelines suggest that lung cancer will be cured by AI any day now. So potentially, could start smoking again. Has anyone come out as pro smoking?

Speaker 2:

I don't think Anthropic's come out as anti-smoking. Anthropic has the joke that they make with journalists. They kind of got caught on that. But if

Speaker 1:

AI is going to cure liver cancer, it's game on. It's game on. You can drink as much as you want, because you get liver disease if you drink too much. If I'm gonna be able to vibe code an mRNA vaccine to cure my liver cancer, I'm gonna be boozing for sure.

Speaker 1:

It's the only rational thing to do.

Speaker 3:

Yeah. This is also kind of like when Sort

Speaker 1:

of irrational.

Speaker 3:

When companies like are saying like, oh, yeah, work life balance is super important. Yeah. So then their competitors will, you

Speaker 1:

know Yes.

Speaker 3:

Yeah. Because it's like, people at Anthropic should tell people to start drinking a lot, because AGI is going to cure liver disease.

Speaker 1:

Yes. Yes. Yes. Yes. This is good.

Speaker 1:

This is good.

Speaker 2:

Okay. Let's revisit the Jetsons.

Speaker 1:

Okay. Revisit the Jetsons. I'm sure you've seen the Jetsons.

Speaker 2:

Where is my flying car and three-hour workday? So I'm gonna be learning about the Jetsons.

Speaker 1:

Okay.

Speaker 2:

John is gonna be revisiting. Its version of the future is way more fun than our reality. But when it comes to innovations, we're catching up. Interesting. Let's see.

Speaker 2:

Nicole says, I recently spent a weekend doing deep investigative research into future technologies. I binged the Jetsons in my sweatpants. For the uninitiated or the forgetful, this space-age family sitcom features George and Jane Jetson living the American dream in an apartment in the sky with their two children, dog Astro, and robot maid Rosie. The show is set in 2062, a century ahead of its original 1962 air date. It's full of fantastical inventions such as flying cars, dinner-generating machines, and canine treadmills complete with fire hydrants.

Speaker 2:

The upbeat vibe is markedly different from the apocalyptic, at times murderous sci fi of today. The 1960s were full of optimism about what the twenty first century would bring. And some of it actually has come true. While we've still got a few decades before the Jetsons family is meant to arrive, I dug into some of the show's technological hallmarks and determined how close we already are. Video calling.

Speaker 2:

She says, absolutely. In lieu of a home phone, the Jetsons had a videophone. The show's creators couldn't fathom mobile devices, but they were spot on about video calling.

Speaker 1:

Now, to be clear, we are still working, with one of our business associates, on a video call that doesn't stop halfway through

Speaker 3:

Yeah.

Speaker 1:

And just cancel. But

Speaker 2:

So the Jetsons didn't predict the free tier

Speaker 1:

of Zoom? Yeah. The free tier of Zoom was not considered in the Jetsons, where

Speaker 2:

you're talking to someone. Couldn't fathom it.

Speaker 1:

You're you're clearly gonna go long on the meeting and Zoom's just like, goodbye.

Speaker 2:

It's over.

Speaker 1:

It's over. It kicks everyone out with no notice.

Speaker 2:

Is that a new thing? I feel like it used to do a countdown.

Speaker 1:

I think it did a countdown too, but

Speaker 2:

Now they're just like, we want to embarrass the host.

Speaker 1:

I'm just So in

Speaker 2:

the Jetsons, they could even create deep fakes to stand in for them on camera. Woah. That's cool.

Speaker 1:

That's cool. I didn't realize that.

Speaker 2:

FaceTime's gotta get on there.

Speaker 1:

This is good. Read this next line. When George secretly attended a robot football game, his simulacrum told Jane he had to work late. He's, like, using a deep fake to lie to his wife. Great.

Speaker 1:

This is so sixties.

Speaker 2:

Do not do this.

Speaker 1:

Do not do this. This is dystopian. It's not all optimism.

Speaker 2:

Flying cars and travel tubes? Sort of. There isn't much walking in Orbit City. A conveyor belt brings George from bed to the bathroom. To get to and from his classroom, Elroy jets through a series of air tubes called the school homing network. When the wrong child shows up at the Jetsons' home, Jane sends him back with a push of a button.

Speaker 1:

Push-button jobs? Almost. George works as a digital index operator at Spacely Space Sprockets for approximately three hours a day, three days a week. As a button pusher, he makes enough to support a family of four, even though the majority of his day is spent with his feet up on his desk.

Speaker 2:

Okay. They basically nailed this.

Speaker 1:

There's some

Speaker 2:

people out there that are basically button pushers right now, vibe coding. TBD on the revenue side.

Speaker 1:

True. Working three hours a day, three days a week. We work three hours a day, five days a week. And maybe the future is just Monday, Wednesday, Friday streams. We can live the Jetsons future.

Speaker 2:

That'd be devastating for us.

Speaker 1:

Yeah. Until then, we'll be working

Speaker 2:

Space colonization? Nope.

Speaker 1:

Yeah. They live above Earth with houses built on tall stilts. I like that. Thirty six years and counting. We may not be living as exceptional a future as the Jetsons, but we've still got three and a half decades to catch up.

Speaker 1:

By then, I will be twice as old as I am now. I've already witnessed the dawn of high-speed Internet, the iPhone, and generative AI. How many tech revolutions will we experience in another thirty-six years? By the time we hit the show's 2062 deadline, maybe we will finally live in space, or make our current planet more habitable, and make a comfortable living on a nine-hour workweek. Tyler, what do you think?

Speaker 1:

Will we get space colonization?

Speaker 3:

How do you define space colonization?

Speaker 1:

Living not on the Earth, above the Kármán line, as your primary residence, like, more than half the year.

Speaker 3:

How many people do

Speaker 1:

it? Anyone. Like, if you can afford an apartment for a few thousand dollars, or a house that's above $500,000 in America, you can choose to live in space.

Speaker 3:

It probably depends on, like, the industry that chiefly, you know, benefits from people living there.

Speaker 2:

Andrew Reed says, the faster technology progresses, the harder it gets to print something in the office. We

Speaker 1:

have experienced this.

Speaker 2:

It's very true.

Speaker 1:

The brothers.

Speaker 2:

Aaron Aaron from Box says, Reed's Law. I know you may have wanted a better law, but I don't make the rules.

Speaker 1:

The year is 2027. Garry Tan has just crossed 1,000,000,000 lines of code per day. Water to three year old Californian towns was diverted in order to cool his locally run LLMs. Riots erupt, and protesters demand answers to one single question: What is he building?

Speaker 1:

People are joking about this because what was the latest stat? It was something like eighty

Speaker 3:

Seventy eight thousand

Speaker 1:

Seventy-eight thousand lines of code.

Speaker 3:

Per day. I think on Garry's list.

Speaker 1:

On Garry's list.

Speaker 3:

Which is his blog.

Speaker 1:

It's a blog.

Speaker 2:

Sam says, remember when this was announced but didn't fully appreciate the size. That's a hell of a cluster. The Department of Energy will basically be a frontier AI company. NVIDIA is collaborating with Oracle and the Department of Energy to build the US Department of Energy's largest AI supercomputer for scientific discovery. The Solstice system will feature a record-breaking 100,000 Blackwells and support the DOE's mission of developing AI capabilities to drive technological leadership across US security, science, and energy applications.

Speaker 2:

Another system, Equinox, will include 10,000 NVIDIA Blackwell GPUs, expected to be available in 2026. Both systems will be located at Argonne and will be interconnected by NVIDIA networking, delivering a combined 2,200 exaflops of AI performance.
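A quick back-of-envelope check of the figures above, as stated on the show (this is an editor's illustration of the arithmetic, not an official spec):

```python
# 100,000 Blackwells (Solstice) + 10,000 (Equinox), combined 2,200 exaflops.
total_gpus = 100_000 + 10_000
combined_exaflops = 2_200

# Implied per-GPU AI throughput in petaflops (1 exaflop = 1,000 petaflops).
per_gpu_petaflops = combined_exaflops * 1_000 / total_gpus
print(per_gpu_petaflops)  # 20.0 petaflops per GPU
```

Twenty petaflops per GPU is in the ballpark of Blackwell's advertised low-precision AI throughput, so the combined figure is internally consistent.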

Speaker 1:

We've talked about nationalization before. We haven't talked about privatization. We could potentially spin this out, take it public. There's an option here.

Speaker 3:

So I was interested in this. I looked, this is gonna be, like, somewhere around a quarter of a gigawatt Okay. Equivalent Okay. Of 100,000 Blackwells.
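The quarter-of-a-gigawatt remark can be sanity-checked with rough arithmetic. The all-in power-per-GPU figure below is an assumed round number (my estimate, not from the show):

```python
# Assumption: ~2.5 kW of facility power per GPU all-in,
# including networking, cooling, and other overhead.
gpus = 100_000
kw_per_gpu_all_in = 2.5

total_megawatts = gpus * kw_per_gpu_all_in / 1_000  # kW -> MW
print(total_megawatts)  # 250.0 MW, i.e. a quarter of a gigawatt
```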

Speaker 1:

Half a Meta campus, I think. In fundraising news, Physical Intelligence is in talks to raise $1,000,000,000 at an $11,000,000,000 valuation.

Speaker 2:

I need to know. Why is Jeff Bezos here besides the fact that he looks fantastic in the talks?

Speaker 1:

He might put in some money. Oh, no. No. The company has previously raised more than $1,000,000,000 in capital from investors, including Jeff Bezos and Alphabet's independent growth fund, CapitalG. So you could have put Peter Thiel, because Founders Fund's in.

Speaker 1:

You could have Lightspeed. Is it Danny Ryan

Speaker 2:

Reynolds? Or Yeah. Lockheed. Yeah. Or But Bezos Or any of the actual team, but

Speaker 1:

seems to get the viral attention. You know, they're not a noisy firm. They're not a noisy company that's, like, posting vibe reels and going and picking fights all the time. So there isn't that much coverage of Physical Intelligence. But if you just look at the traction, look at the open source contributions, the data, the fundraising, like, clearly something is happening there.

Speaker 1:

And so I think it's worth digging in and paying attention to.

Speaker 2:

Last night, Bill Ackman hit the timeline. Woah. He said some of the highest quality businesses in the world are trading at extremely cheap prices. Ignore the mainstream media. One of the most one-sided wars in history that will end well for The US and the world.

Speaker 2:

And we have a potential for a large peace dividend. One of the best times in a long time to buy quality. Ignore the bears. And he says, and Fannie Mae and Freddie are stupidly cheap. Asymmetry at its best.

Speaker 2:

They could be a 10x. It could happen soon. And of course, Jira Tickets comes in and says, x.com, the market manipulation app, that Fannie Mae and Freddie Mac are up 404237% as of this morning. I think they've actually dipped back down a little bit. But Justin says, posting your opinion on a public website is not market manipulation. JT says, don't ruin the tweet.

Speaker 1:

Somebody asked Grok, like, hey, break it down. Like, what is actually going on here with Fannie and Freddie? They generate $25,000,000,000 in stable annual net income from guarantee fees, low credit losses outside crises. They're still in 2008 conservatorship, and the stock trades for a total market cap of $10,000,000,000. So there's a world where you're sort of buying, maybe, I don't know exactly how aggregated this is, but maybe it's like $25,000,000,000 of cash at some point for $10,000,000,000. That feels like a very good deal.
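The payback arithmetic behind that deal, using the figures as stated on the show (the hosts' numbers, not verified financials):

```python
# ~$25B of stable annual net income vs. a ~$10B total market cap.
annual_net_income = 25_000_000_000
market_cap = 10_000_000_000

# At that earnings rate, income would cover the market cap in:
months_to_cover = 12 * market_cap / annual_net_income
print(months_to_cover)  # 4.8 months
```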

Speaker 1:

Get paid back in four months, five months. But of course, there are a whole bunch of other

Speaker 2:

political And of course, he does. He does own them. Fannie Mae and Freddie Mac are in his Pershing Square portfolio. Again, not illegal to share your opinion.

Speaker 1:

Yeah. Artemis two is launching and Kalshi has it at 64% chance before April 2. We're going to the moon. Four people are going to the moon. Everyday Astronaut says, I'm honestly shocked at how the general public has no idea Artemis two is taking humans out to the moon and will be the furthest humans have ever flown.

Speaker 1:

Every non space nerd I've talked to has no idea. We gotta get people stoked. This is what I'm gonna be writing about tomorrow. NASA is set to launch four astronauts around the moon, the deepest human spaceflight since the final Apollo lunar landing in 1972.

Speaker 2:

It's been an honor.

Speaker 1:

It's been an honor.

Speaker 2:

It's been an honor to to be here.

Speaker 1:

It was a rough couple days. I'm glad.

Speaker 2:

It's gonna be a great week. Have a wonderful evening.

Speaker 1:

We will see you tomorrow. Goodbye.

Speaker 2:

Throwing smoke.

Speaker 1:

Throwing smoke.