TBPN

Our favorite moments from today's show, in under 30 minutes. 

TBPN.com is made possible by: 
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com 
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai
Privy - https://privy.io
Cognition - https://cognition.ai
Gemini - https://gemini.google.com

Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM - 2 PM PST, Monday - Friday. Available on X, Apple, Spotify, and YouTube.

Speaker 1:

Everyone has ten year AGI timelines right now. It started when Sam Altman put out that post, like, superintelligence is just a few thousand days away. And at the time, it was kind of odd because, like, when he wrote that post last year, everyone was like, AGI is one year away. AGI is two years away. It was, like, fast takeoff time.

Speaker 1:

Like, everyone was very excited. And then he came out and was like, it's a few thousand days away. And so Sam came out and, you know, he was kind of, in his blog post on the superintelligence age, talking about, you know, maybe we're a decade away. Then Andrej Karpathy goes on Dwarkesh just a couple months ago, couple weeks ago, says AGI is a decade away. And then Dwarkesh posts this probability density of when AGI will be achieved.

Speaker 1:

There's a chance that America does it. There's a chance that China does it. And the median, the fiftieth percentile, was exactly 2035.

Speaker 2:

You know that, that chart, that, like, very schizo chart that says periods when to make money? Have you seen this floating around on X? It was created, like, I guess, about a 100 years ago. People reference it anytime it, like, actually aligns to events, because it basically has years in which panics have occurred, years of good times, high prices, and a time to sell stocks, and years of hard times, low prices, and a good time to buy stocks. And so it's basically, like, astrology for stock picking.

Speaker 2:

Okay.

Speaker 1:

And and what is it saying right now?

Speaker 2:

2035 is the year they're predicting a panic will occur.

Speaker 1:

Oh, interesting. Well, that certainly aligns with all these AGI timelines.

Speaker 2:

There you go.

Speaker 1:

And then I was looking at METR, and this one we'll have to debate a lot more. But METR has been tracking AI's ability to complete long tasks, and it's growing exponentially. It used to be, like, six seconds. Now it's, like, two hours. And, you know, when you talk to anyone who's in the AI field, they'll tell you that the agents are getting more and more capable of handling longer and longer time horizon tasks.

Speaker 1:

The question is, I feel like humans don't have a time horizon. I feel like humans, they're just born, and the goal is, like, survive, be fruitful, multiply. Right? Yep. And so I feel like if you're if you're tracking the meter data, you need to get out to, like, thirty years, like a full career.

Speaker 1:

Right? Like, the prompt needs to be, like, go make money, and then it just goes and becomes a lawyer and, you know, lives its full life and retires after a thirty five year run. And, of course, when you track out the doublings, in 2035, METR is projecting, based on that log graph, that AI will be able to have a time horizon in the decades. My read on the METR data is that, you know, it's AGI 2035 again. It's maybe the messiest, the least, like, definitive.
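As a rough sanity check on that extrapolation, here's a minimal sketch. The inputs are assumptions, not figures from the show: a current task horizon of about two hours, and the roughly seven-month doubling time METR has reported.

```python
import math

# Hedged sketch of the METR-style extrapolation discussed above.
# Assumed inputs: current task horizon ~2 hours, doubling time ~7 months.
current_horizon_hours = 2.0
target_horizon_hours = 30 * 365 * 24   # a ~30-year "full career" in hours
doubling_time_months = 7.0

# Number of doublings needed, then convert to calendar years.
doublings = math.log2(target_horizon_hours / current_horizon_hours)
years_out = doublings * doubling_time_months / 12

print(f"~{doublings:.0f} doublings, ~{years_out:.0f} years out")
```

Under those assumptions it works out to roughly seventeen doublings and about a decade, which is why extrapolating the log graph lands right around 2035.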

Speaker 1:

But what's interesting is that it just feels like ten years is this consensus right now, and there's much less diversity of opinion. There aren't that many people saying two years anymore. There aren't that many people saying fifty years anymore. Everyone's kinda saying ten years. And I just wonder like, let's put aside like, let's try and accurately predict when this thing happens, and let's just analyze it from a psychological perspective.

Speaker 1:

And, like, what does it mean when the tech community all has a consensus of something that's a decade? Like, a decade away could just be what people say when they don't know. Like Yeah. If you ask me when flying cars are gonna happen, I'm gonna say a decade. If you're gonna say when quantum computing, oh, that'll be a decade.

Speaker 1:

Oh, Mars? Yeah. That's a decade.

Speaker 2:

Most people overestimate what they can accomplish in one year Yes. And underestimate what they can do in ten years. And in this situation, we're just like, yeah, we're estimating that we can just achieve AGI in ten years.

Speaker 1:

How I personally feel about AGI? Like, I'm convinced by these. Like, I feel like AGI is ten years away. And I don't feel like I'm, like, coping or doing some sort of, like, mental logic jumps or something. It's just, like, if you actually forced me to put a prediction down, I probably would say about a decade. Then I go to, well, like, how should I actually be changing my behavior?

Speaker 1:

Like, if something big is coming in a decade, it feels like you actually should not just be acting normally. It feels like there's some sort of preference falsification going on. Like, everyone's saying a decade. How many people are actually acting like it's a decade? Like, what should you be doing in the intervening years if it's a decade away?

Speaker 1:

Like, are you just supposed to, like, build technologies that are fun little dopamine rewards? Are you trying to, like, accumulate as much capital as possible before?

Speaker 2:

Who plans around ten years at all? Right? People tend to Yeah. Go for things that they want today Yeah. In some ways.

Speaker 2:

Right? So if somebody, like, let's say, they're in their twenties Yeah. And, like, I wanna own a home Yeah. By 2035. Yeah.

Speaker 2:

They want that thing today, but they might understand, okay, it's gonna take some time to get there. Yeah. But actually making real changes in your life for this, like, impending scenario that's hard to predict Yeah. Entirely Yeah. Is very, very difficult.

Speaker 2:

And I don't know that you know, it was maybe more popular earlier this year to joke about, you know, what you know, the golden retriever maxing. Right?

Speaker 3:

Oh, yeah. We would we would

Speaker 2:

talk about this. Right? But it feels like that dialogue has kind of changed.

Speaker 1:

Tyler, what do you think about my take? What do you agree with? What do you disagree with?

Speaker 3:

Yeah. I think it's definitely interesting that people seem to kind of align around this ten year thing. I think there's also some sort of bias. Right? If you think AGI is coming in three years Mhmm.

Speaker 3:

You're probably not just gonna be, like, writing blogs. Like, you're maybe you're gonna start a macro hedge fund. Maybe you're gonna go work at one of the labs to, like, really try to influence how it's gonna happen if it's gonna happen super quick.

Speaker 1:

Yeah.

Speaker 3:

So I think there's some sense of that. And if you really don't believe AGI is coming at all, then, like, you're also probably not gonna be writing blogs about AGI. You're just gonna be, like

Speaker 1:

Yeah.

Speaker 3:

Doing your normal job or whatever. So I think there's some, like, confirmation bias there.

Speaker 2:

Tyler's probably the most AGI-pilled person on the team. And so it's interesting that you kind of

Speaker 1:

I think you

Speaker 2:

day to day act like everyone else on the team.

Speaker 1:

Yeah. What what what's your biggest revealed preference?

Speaker 3:

That's hard to say.

Speaker 1:

Maybe just hanging out on the show.

Speaker 2:

Yeah. Just hanging out podcasting instead of going to universities. Just like, yeah. Nothing matters.

Speaker 1:

There's this clip of Alexandr Wang that's sort of going viral. People are dunking on it because he recommends that kids learn to vibe code. And I just disagree with the haters. Like, I think the haters are wrong on this one. And I was thinking about, like, well, I have a kid.

Speaker 1:

Like, would I teach him to vibe code? And I was playing with Legos last night. I was assembling Legos. And from what I've done vibe coding and what I've done with Legos, I'm like, this is going to be very fun, and this is going to be an activity that we do together. And I'm super I I'm, like, super in agreement that I think learning to vibe code is good.

Speaker 1:

What do you think, Tyler?

Speaker 3:

I think, like, the point of view of, like, the people hating is that, like

Speaker 1:

Yeah.

Speaker 3:

Oh, your 13 year old should be, like, making B2B SaaS, which is, like, no that's different than saying they should be vibe coding. Because vibe coding is just, it's, like, basically playing a video game where it's, like, making a Minecraft mod or something. That's, like Yeah. Seems totally fine.

Speaker 2:

Yeah. Some of the criticism was, like, you should just be doing really hard things. But you have to ask, like, okay, should 13 year olds only be hand coding? Like, if you wanna get them into doing engineering work, is it Yeah. Doing it with a pen and paper?

Speaker 2:

Like, what what is the alternative? When I was 13, I was working on little iPhone apps. Yeah. And knowing the vibe coding tools that are available today

Speaker 1:

Yeah.

Speaker 2:

I would have been able to make way more progress. I would have had a lot more fun.

Speaker 1:

Yeah.

Speaker 2:

It would have been like having an expert, like, software engineer sitting next to me, kind of like pair programming. Right?

Speaker 1:

Yeah. Yeah.

Speaker 2:

And so I'm on your side. I think I think people generally just just don't like Alex because he's been wildly successful.

Speaker 1:

He's basically the youngest, most successful person in the world. With Alex Wang, people are like, yeah, he's rich, and I have no idea what the thing he makes is. I don't interact with it at all.

Speaker 3:

Fun fact. So in fifth grade, in the yearbook, they asked, like, what you want to be. Yeah. I put investor.

Speaker 1:

Investor in fifth grade?

Speaker 3:

Yeah. That's I don't even really know what that meant.

Speaker 1:

SoftBank's big profits jump this quarter came from OpenAI's increased valuation, a valuation that SoftBank lifted higher by buying shares from employees and selling them at a higher price. And so there is a question about whether this is too circular. It's unclear if SoftBank was really the price setter on this deal, but Just Dario is certainly

Speaker 2:

Just Dario is saying SoftBank books a capital gain on an investment it hasn't paid for and recorded in its assets. You would do this if you were a company that invested in a fund Yes. And it's called maybe some of the capital, but not all of it yet. And so you can show that there's a gain even though you haven't actually paid in the capital yet.

Speaker 1:

Yes.

Speaker 2:

But the way in which SoftBank is doing this it doesn't seem illegal, but I think it's nontraditional. Sophie, over on X, says SoftBank is selling its NVIDIA stake to fund companies whose main expense is buying from NVIDIA. The more important thing here is that this is not the first time Masa has exited NVIDIA. He exited Okay. In 2019, before the run up.

Speaker 2:

He was then NVIDIA's largest shareholder, and he had a 5% stake in the company that he sold for $3.6 billion, and it would now be worth over $200 billion.

Speaker 1:

Are people mad about this because they think NVIDIA is gonna rip further?

Speaker 2:

Selling your stake in NVIDIA? Yeah. Yeah. So basically, SoftBank is booking profits.

Speaker 1:

Sure.

Speaker 2:

OpenAI profits. They are then selling NVIDIA shares to fund the original investment of which they've already booked the profits on. And the number one driving force behind NVIDIA's growth is OpenAI.

Speaker 1:

Today, we will get the SoftBank completely sold out of NVIDIA fear from bears on mainstream media and Wall Street. Does anyone do objective research anymore? SoftBank initially bought its NVIDIA stake through the Vision Fund in 2017, then exited completely in January 2019. They missed it. They lost patience.

Speaker 1:

Not the first time. Yeah. When did they buy this latest slug? That's the question. SoftBank sold 32 million shares of NVIDIA in October, and also sold part of its stake in T-Mobile for $9 billion.

Speaker 1:

It's not the first time they've cashed out of the chipmaker. That's a funny way to put it. Despite the sale, SoftBank remains tied to Nvidia through its other ventures. Yeah. That makes sense.

Speaker 1:

Warren Buffett says he's going quiet. The world's most famous investor warns against corporate greed as he prepares to hand over the reins of Berkshire Hathaway. The 95 year old will step back from day to day responsibilities at Berkshire at the end of this year when he retires from his role as chief executive. Greg Abel, of course, is coming in. Investors have long seen Buffett, often called the Oracle of Omaha, as a corporate folk hero, interspersing guidance on his portfolio companies' performance with life and business advice.

Speaker 1:

He noted, for instance, that requirements for executive compensation disclosures backfired as business chiefs engaged in a race to earn more than rivals. What often bothers very wealthy CEOs, they are human after all, is that other CEOs are getting even richer. Buffett said, envy and greed walk hand in hand. Buffett added that Berkshire should try to avoid future CEOs who are looking to retire at 65 or who want to become look at me rich or initiate a dynasty. Look at me rich.

Speaker 2:

Buffett has never been photographed doing a money spread wearing Balenciaga, Rick Owens.

Speaker 1:

Yeah. So he took the pledge to give away his wealth. And so it's gonna be a pretty big changing of the guard. There's that interesting stat that, like, if you go back thirty years ago, when Buffett was 65, I don't think he was close to the richest man in the world. He compounded a ton in his thirty year run.

Speaker 1:

Like, from 65 to 95, he had a particularly good run that took him from pretty rich to one of the richest people in the world. It's very rare. We just don't see that many business executives, or that many people broadly, where the highlight of the career is 65 to 95. Buffett has commissioned no plays, no poems, no symphonies, no operas, no ballets, funded no paintings or sculptures that will outlive him, endowed no theaters, choirs, orchestras, built no monuments, monasteries. Is this true at all?

Speaker 1:

He's committed to giving away half of his wealth. So, like, some of the money is gonna wind up with the opera, I imagine, just because it's gonna go out and get diffused amongst all the different charity efforts. It's not gonna all go into one thing. Howard agrees with him on this. He says he's right.

Speaker 1:

Greatness does not come about through accumulating great amounts of money, which is what he is known for. But beyond that, it's fortune cookie level advice. It's not wrong, but that's about it. His partner Charlie Munger's contribution to architecture was to fund factory like college dorms in which a majority of the apartments don't have windows. Was that UCSB?

Speaker 1:

Where is that? Yeah. Have you been to that dorm?

Speaker 2:

Dude, we made the donation.

Speaker 1:

Yeah. Yeah.

Speaker 2:

Yeah. There's some reference to it on the campus, but they abandoned the plans to build a windowless dorm.

Speaker 1:

So it's like it's so Soviet? It's extremely Soviet.

Speaker 3:

I don't know why they wanted that. I thought there were fake windows, and they were, like, TVs essentially. But they would show, like, outdoor scenes.

Speaker 1:

It should be vertical TV windows that you can just put Sora on. Is it safe to take the other side of this and just say that like he was about to create the greatest locket of all

Speaker 2:

I mean, what if he was basically saying, like, college students spend very little time in their actual dorms? Yep. Because they're out and about in the world. They're studying.

Speaker 1:

They're at

Speaker 2:

the library. They're at events. Totally. They're really just using the dorm to sleep. Why don't we create common areas that have windows Yes.

Speaker 2:

And outdoor areas, and then you go in your pod?

Speaker 1:

This is Eric Adams floated a similar idea to Munger, calling for stripping legislation that promises each city resident window access this past March. You don't need no window where you're sleeping. It should be dark, he said. The $2,000 dividend could come in lots of forms. Let's play the clip.

Speaker 4:

You know, the $2,000 dividend could come in lots of forms and lots of ways, George. You know, it could be just the tax decreases that we are seeing on the president's agenda. You know, no tax on tips, no tax on overtime, no tax on Social Security, deductibility of auto loans. So, you know, those are substantial deductions that, you know, are being financed in the tax bill.

Speaker 1:

Tyler. So do you feel like this How how

Speaker 2:

is that gonna help the day trading community?

Speaker 3:

Yeah. I was

Speaker 2:

planning on

Speaker 3:

doing rampant speculation with this $2,000. What am I gonna do now?

Speaker 2:

Yeah. You're actually you're basically fully you've already committed the funds. Right?

Speaker 3:

Yeah. Yeah. I've already placed a ton of parlays. Yes. So, like, what am I supposed to do now?

Speaker 1:

I can't pay my parlay bill with a tax decrease on auto loans.

Speaker 2:

Yeah. You're not looking for passive income. You're looking for massive

Speaker 3:

Bro, I'm looking for massive income.

Speaker 1:

They should put a button on the IRS website that allows you to either receive the $2,000 stimmy check, or if you press the button, you have a 50/50 chance of getting either $4,000 or zero. I really think a lot of people would press that button. Two of the world's biggest data center developers have projects in NVIDIA's hometown that may sit empty for years because the local utility isn't ready to supply electricity. 48 megawatts. What are we doing?
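For what it's worth, the hypothetical IRS button is expected-value neutral. A quick sketch, taking the amounts as the $2,000 dividend versus a 50/50 shot at $4,000 or nothing:

```python
# Expected value of the hypothetical IRS button vs. the guaranteed check.
p = 0.5
button_ev = p * 4000 + (1 - p) * 0   # 50/50 shot at $4,000 or $0
check = 2000                          # the guaranteed dividend

# Same expected value; pressing the button just adds variance.
print(button_ev, check)
```

Same number either way, which is exactly why the button would be fun: pure variance, no edge.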

Speaker 1:

That's child's play.

Speaker 2:

A data center for ants. Calm down, people. Masa Son is a male Cathie Wood. Him selling NVIDIA today means NVIDIA has plenty of upside left. It is.

Speaker 2:

It is. I mean, it just comes down to he will take he's selling NVIDIA. He's gonna give that money to OpenAI, and then OpenAI is gonna give it to NVIDIA.

Speaker 1:

So

Speaker 2:

Sold. Yann LeCun

Speaker 3:

He's out. Says, see you. He's out.

Speaker 2:

He's out. He's out from Meta, and he's gonna launch his own startup. Tyler, what's your reaction?

Speaker 3:

Yeah. I mean, this is pretty big news. I I think people have kind of expected this for a while

Speaker 1:

Yeah.

Speaker 3:

Basically since Alexandr Wang came in. Mhmm. Because Alexandr Wang kind of, in some sense, like, took over as, like, the chief AI person at Meta, which was

Speaker 1:

Not enough room in this town for two AI

Speaker 3:

Not enough room for

Speaker 1:

Chief AI scientists. LeCun.

Speaker 2:

Yeah. Yeah. Yeah.

Speaker 1:

Not enough room in this town for two chief AI scientists.

Speaker 3:

Yeah. And so Yann is, like, a very goated AI researcher. Yes. He's been in the game for, like, super long. Today, he's, like, very much seen as being bearish on LLMs.

Speaker 1:

Sure.

Speaker 3:

He doesn't think that they can reason. He doesn't think they can, like, do novel tasks or whatever. So he's been, like, very outspoken about that. I think that's maybe part of the reason why he just has, like, kind of a different vision from Zuck and Alexandr Wang. I haven't heard exactly what it's called, but he has another kind of path that he sees to AGI.

Speaker 3:

He's Mhmm. He's he's still a professor at NYU, which I think is where he does most of his research these days.

Speaker 1:

Interesting.

Speaker 3:

So I think he'll probably just stay in academia.

Speaker 2:

Does he have any active classes going on this semester? Because if he does, we should send you onto the campus to study.

Speaker 1:

Do we have an update from Bryan Johnson? He came back from his trip. He was on psychedelics. I'm expecting him to fall in love with at least one fast food restaurant. I want him to flip around on at least one and say, okay.

Speaker 1:

Yeah. I'm an In-N-Out guy now. He's been quiet. I do wonder if he'll update on anything, or if he's just, like, so powered through that it just will not affect him at all. Because, like, wasn't he kinda framing it as, like, this is, like, a big deal?

Speaker 1:

It seems like he got through it just super fine.

Speaker 2:

Saw a very viral post of somebody saying, I don't think it's a good thing that billionaires can talk about taking schedule one drugs publicly with no repercussions. It did seem that he was doing it like, this still seems like such a gray area where you can get it prescribed by a doctor. Mhmm. But it's still schedule one. He posted an update on his experience.

Speaker 2:

He also talked more about why he says it's potentially a longevity therapy: psilocybin expands lifespan in mice. It can reduce inflammation markers tied to aging. It can increase brain entropy, break rigid patterns, and boost long term cognition and flexibility. And

Speaker 1:

Yeah. I don't know. I kind of agree with that person who is saying, like, they shouldn't be able to talk about it publicly. I feel like he should have a disclaimer or something, because he has such a huge audience that there's gonna be some people that just listen

Speaker 3:

to what he says and run straight into

Speaker 2:

it in five minutes. In the podcast era,

Speaker 1:

Yeah.

Speaker 2:

You had Tim Ferriss Yeah. Who, to his credit, was, like, doing work on himself Yeah. And would share what he was doing.

Speaker 1:

Yeah.

Speaker 2:

But he would talk about doing things like ibogaine

Speaker 1:

Yeah. Yeah.

Speaker 2:

Which is like a, you know, super powerful psychedelic. Yeah. And he wasn't directly saying, hey. I think you should go take this. But when hundreds of thousands of people, like, follow Tim for health advice, it sort of is like an indirect.

Speaker 2:

It's somewhat of an endorsement even if it's not direct. So

Speaker 1:

No. No. I mean, I 100% think some people will see this and be like, oh, looks like he had a good experience. Let me jump straight to exactly his protocol, which is clearly not accessible for the average person. Like, he has been building up a tolerance to this particular chemical for probably a decade.

Speaker 1:

And so it's going to hit him very differently than it will someone who's not in the same place. We were joking yesterday about, like, oh, if he wanted to, like, really challenge himself, he would have been at, like, a crowded concert or something. And there are gonna be people that see this post and wind up actually doing that, and wind up in a very, very rough spot.

Speaker 2:

I did see some people responding and saying, I took a similar dose to this and it basically ruined my year. I have joked in the past, it's like, are psychedelics gonna fix your life? Or is just doing the tasks that you've been avoiding, is that gonna make you feel relief? Yes.

Speaker 2:

You know? Is that gonna make you feel better about your

Speaker 1:

life? It's like that meme of, like, the anime ninja. If you're tired, do it tired. If you're sober, do it sober. Right?

Speaker 1:

If you're sober, do it sober.

Speaker 2:

In 2016, so almost ten years ago, he says digital addiction is going to be one of the great mental health crises of our time. And Blade says, Unc had this thought and immediately was like, how do I profit from this? The greatest to ever do it, TBH.

Speaker 1:

Like, if you're trying to, like, layer up the OpenAI is bad case, like, you don't start with the product's too addictive. Like, I don't think that's the claim. The claim is, like, it uses too much water, or, like, it one shots people, or it does erotica, or it's, like, AI slop. But very few people are saying, like, oh, yeah, people are on their phones too much, and it's because of OpenAI now.

Speaker 1:

People are on their phones too much, but it's because of Instagram, and it's because of, like, social media, not the AI. I think it's a little bit too early to say that.

Speaker 2:

I don't think it's working that well. Certainly short form video.

Speaker 1:

Have you ever been in in in, like, the gym or in a coffee shop and seen someone scrolling Sora, like, over your shoulder? I haven't seen it.

Speaker 2:

I meant I meant vertical video broadly.

Speaker 1:

Totally. Totally. Which is, I guess, like, a good take here, which is, how do I profit from this? I gotta make an AI version of it. That's probably, like, more of what they were thinking.

Speaker 1:

In terms of actually profiting on, like, digital addiction, it doesn't seem like it's going very well.

Speaker 2:

4o seems to be extremely addictive

Speaker 1:

Yeah.

Speaker 2:

So much so that

Speaker 1:

For a very small number of people, like, absolutely.

Speaker 2:

Yeah. For sure. Yeah. We don't know how many people. Yeah.

Speaker 2:

And we hope you have a wonderful Veterans Day, and we'll see you tomorrow.

Speaker 1:

See you tomorrow. Goodbye. Cheers.