TBPN

Our favorite moments from today's show, in under 30 minutes. 

TBPN.com is made possible by: 
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com 
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev
Restream - https://restream.io
Profound - https://tryprofound.com
Julius AI - https://julius.ai
turbopuffer - https://turbopuffer.com
fal - https://fal.ai
Privy - https://privy.io
Cognition - https://cognition.ai
Gemini - https://gemini.google.com

Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM to 2 PM PST, Monday through Friday. Available on X, Apple, Spotify, and YouTube.

Speaker 1:

The timeline is in turmoil over... Sam Altman again. Backstop-gate continues unabated. The main point of debate is Sarah Friar's comments, that she used the word backstop. She backtracked on her backstop comment and said, I wasn't asking for a backstop of OpenAI equity.

Speaker 1:

I was advocating for this American manufacturing plan. Here's an OpenAI document submitted one week ago where they advocate for including data center spend within the American manufacturing umbrella. They specifically advocate for federal loan guarantees, and Simfer Satoshi says Sam lied to everyone.

Speaker 2:

Let's read the specifics.

Speaker 1:

Yes. The specifics are about production and AI data centers. Broadening coverage of the AMIC, which is the Advanced Manufacturing Investment Credit, will lower the effective cost of capital, derisk early investment, and unlock private capital to help alleviate bottlenecks and accelerate the AI build in the United States. Counter the PRC by derisking US manufacturing expansion. To provide manufacturers with the certainty and capital they need to scale production quickly, the federal government should also deploy grants, cost-sharing agreements, loans, or loan guarantees to expand industrial base capacity and resilience.

Speaker 2:

So what's happening here is OpenAI, a week ago, and everyone can go and read this letter, which is publicly available on their website, was making the recommendation that the government should effectively treat data centers, or token factories, put them in the manufacturing bucket, which would qualify them for similar incentives to traditional manufacturing, defense tech, etcetera.

Speaker 1:

I don't have a problem with asking the government for a handout. I think that that's actually, like, best practice. It's actually your responsibility to your shareholders. Like, you have a fiduciary duty to ask the government for as much help as possible. You have every incentive to ask your representative in Congress and your senator, the people in Washington, to do everything they can to support your mission.

Speaker 1:

This has worked out in the past with Elon and Tesla. It didn't work out in the case of Solyndra. The game on the field is: if there are taxpayer dollars moving around the board, you wanna get those into places that are aligned with you.

Speaker 2:

And so the thing that people are taking issue with

Speaker 1:

Yes.

Speaker 2:

Is that in the opening of his message yesterday, he said, we do not want we do not have or want government guarantees for OpenAI data centers.

Speaker 1:

Yes.

Speaker 2:

And that seems to conflict with the message, the letter that they wrote a week ago that is still up on their website.

Speaker 1:

You can always advocate for, like, we should change the game. Like, we shouldn't be doing this. Like, I would prefer a more of a free market economy. But in the world where we're not in a free market economy, you want to have your company win. Right?

Speaker 1:

That's just rational. Now, it is weird optics to talk about the game on the field, and that's very odd. When you say, oh yeah, this is a one-hand-washes-the-other situation, or, oh yeah, this is a situation where, you know, a backstop will allow us to be more aggressive.

Speaker 1:

That feels like the banker saying, oh yeah, I knew that the government was gonna bail us out in '08, so I was intentionally underwriting loans where it was somebody's fifth house, and I knew that they couldn't pay it. I wasn't asking about their job. I wasn't asking about their income. I wasn't asking about their assets.

Speaker 2:

Which is why I kind of expect a lot more of the narrative to shift towards subsidizing and incentivizing and bringing on new energy.

Speaker 1:

You were talking about that yesterday.

Speaker 2:

Which directly benefits the labs and anybody building a data center. But it also feels very much in America's interests broadly. It theoretically would benefit the average American too.

Speaker 1:

OpenAI needs to be crystal clear about the position that they're in, which is that they are the hottest company in the world. There is unlimited demand for their shares. They could be a public company. They could go raise more private capital. They need to be on the opposite end of the risk curve from the stuff that's like, no one really wants to invest in an American fab that might lose money for a decade.

Speaker 1:

Like, when you think about the things that make money in the short term, it's SaaS. Right? AI is for SaaS. Yeah.

Speaker 2:

People's concern, yeah, for government-backed data center lending

Speaker 1:

Yeah.

Speaker 2:

Yeah. Is that you're lending against chips, which have a really fast depreciation schedule. Yep. We don't know. And if you look at, right now, CoreWeave's credit default swaps are sitting around 500 basis points.

Speaker 2:

Jumped up dramatically. This is one of the leading neoclouds.

Speaker 1:

They are

Speaker 2:

still in the platinum tier, the neocloud platinum tier. And people are worried about them. Right? Yep.

Speaker 2:

Potentially, you know, having some bankruptcy risk. And so if you start doing government-guaranteed data center lending, you could get in a situation where there's a bunch of new data centers that come online that really don't have a clear pathway to ROI. Yeah. And it just incentivizes the entire stack to get really exuberant. Going back to Sarah Friar's interview on Wednesday, she felt the market was not exuberant enough.

Speaker 2:

There's been a lot of insanity this year, silliness. Maybe we don't need more. The director of the Federal Housing Finance Agency, Bill Pulte, says Fannie and Freddie are eyeing stakes in tech firms. Pulte said that Fannie Mae and Freddie Mac are looking at ways to take equity stakes in technology companies. We have some of the biggest technology and public companies offering equity to Fannie and Freddie in exchange for Fannie and Freddie partnering with them in our business.

Speaker 2:

We're looking at taking stakes in companies that are willing to give it to us because of how much power Fannie and Freddie have over the whole ecosystem. So Yeah.

Speaker 1:

This Wall Street Journal event, so many articles came out of it. Did you get a whole new soundboard? What's going on with the soundboard?

Speaker 2:

I did. I still got all the classics.

Speaker 1:

Got all the classics.

Speaker 2:

Working with some new material. Let's give it up for global businessmen.

Speaker 1:

Let's give it up. You have a new neighbor. So Tom Petty apparently lived in your neighborhood in Malibu, California. He was living in Malibu in an $11,200,000 house: 8,744 square feet, seven bedrooms, has a music studio. The buyer is Steven Slade Thien, a psychoanalyst and author.

Speaker 1:

Thien didn't respond to a request for comment. I wonder if he'd wanna, you know, get a beer sometime, hang out. We should reach out to him for comment.

Speaker 2:

I was at the first ever Outside Lands.

Speaker 1:

Really? How old are you?

Speaker 2:

Tom Petty headlined on Saturday. This was in August 2008, and that was my first time smelling cannabis. And I kept asking, like, my friend's parents, what is that stinky smell? It's so stinky. We can't get away from it. Yeah. That's hilarious.

Speaker 1:

Elon's trillion dollar pay package is done. It's signed. It's approved. I'm sure it will be contested in the courts. It's always contested in the courts.

Speaker 1:

Elon could get 1,000,000,000,000 in Tesla stock if he hits all these different tranches. He's worth half a trillion now, but he also owns 414,000,000 Tesla shares outright, got another award in 2018 of 300,000,000 shares, and this next award is 424,000,000 shares across 12 tranches. Basically, what he already had, they're giving him the same amount again. And there's a bunch of things that he has to do. He has to get the market cap really, really high, and then there's also these, like, qualitative operational goals, or I guess they're quantitative.

Speaker 1:

50,000,000,000 in EBITDA, 20,000,000 cars delivered, 1,000,000 robots sold, 1,000,000 robotaxis in operation, 10,000,000 full self-driving subscriptions. Now some of those are obviously more gameable than others. What's the definition of a robot? If he comes out with a really cheap robot and he sells a bunch of those because it's more of a toy, does that really fulfill the goal? What's the definition of a robotaxi?

Speaker 2:

What qualifies? It's a Tesla that is enabled for...

Speaker 1:

And now you turn on FSD and my friend rides in it.

Speaker 2:

Is there anything for actual rides?

Speaker 1:

Some of these are more gameable than others, but the market cap really isn't.

Speaker 2:

How many self driving subscriptions are there today?

Speaker 1:

I looked that up. It's somewhere between, like, one and three million right now. So he definitely has to, like, triple the size at least. He hasn't sold any robots, so a million would be entirely new robots. He's obviously delivering a lot of cars.

Speaker 1:

And on the EBITDA front, 50,000,000,000 in EBITDA. The company did, like, 13 last year, so that's a huge increase in EBITDA. I mean, 50,000,000,000 in EBITDA is a lot of money. He only has to take the market cap to 8,500,000,000,000, and Tesla's already worth a trillion. So it's gonna be weird to live in the world of the trillionaire.

Speaker 1:

But we are getting close. Like, that's going to happen not just within our lifetime, but definitely within the next decade. I wonder how that's gonna reshape our culture, the world, America. Because when billionaires became so prevalent and prominent, there was a lot of heat that was taken off the millionaire.

Speaker 2:

Billionaires are the heat shield.

Speaker 1:

Yeah. So the millionaire became more acceptable, and the billionaire became the thing that society scapegoats

Speaker 3:

for all the problems.

Speaker 2:

Approximately 9.4 to 9.5% of American adults are millionaires.

Speaker 1:

Yeah. Like, what happens to the billionaire when trillionaires come in? Like, what happens when you have to say, well, trillionaires are the real policy failure, but billionaires are also a policy failure, and millionaires we're, like, kind of okay with, but it's not great. It becomes much more complicated.

Speaker 1:

What does this mean for other companies? What does it mean for Sam Altman at OpenAI? Can he run a similar playbook? Could he go to the OpenAI for-profit board and say, hey, if OpenAI IPOs at 2,000,000,000,000, I want 20%?

Speaker 1:

We know that Sam runs a bit of an Elon playbook. They were in business together. They cofounded OpenAI together. And then I also wonder what will happen at the garden-variety unicorn. If you're just the CEO of a $5,000,000,000 company and you're just kinda hanging out there and you say, like, yeah.

Speaker 1:

I had 30% of the company when I started. I've been diluted down to five or 10%. But I like this company, and I wanna get it up. Could I go to the board and say, okay, we're at 5,000,000,000 now. If I get us to 50, will you double my equity position?

Speaker 1:

How would the growth stage venture companies Yeah. Feel about that?

Speaker 2:

I think the right way for founders to think about that is, like, no one's taking your shares unless you decide to sell them. Your job is just to make the share price go up, and there's gonna be more shares issued over time.

Speaker 2:

But if you just make the share price go up forever, it doesn't really matter. And you can also buy back shares.

Speaker 2:

Almost no other CEO would take a deal like this, because it's so ambitious. And so I think it's healthy.

Speaker 1:

Our good friend Tyler Cosgrove has put together a slide deck for us that tries to help map the Mag Seven. Really, I call it the TBPN Top 10: the 10 most important companies in AI, loosely. The Mag Seven plus a few bonus ones.

Speaker 1:

We're gonna try

Speaker 2:

and Watch your take head time.

Speaker 1:

Through. And look, you're free to hit the horse. We're gonna try and go through the various companies and rank them based on how AGI-pilled they are and how much they need AGI. Is that right?

Speaker 3:

So, basically, on the horizontal, we have how AGI-pilled they are.

Speaker 1:

Sure.

Speaker 3:

So I feel like that's fairly self-explanatory. You kind of believe that AI will become something that can produce the median economic output of a person.

Speaker 3:

So then on the vertical axis, we have how much they need AGI. Okay. So I think this is maybe a little harder, so I wanna qualify this.

Speaker 3:

So, I mean, this doesn't necessarily mean that you believe you'll have this kind of sentient, you know, AI that's as good as a person. But I think in this context it more so just means that AI will continue to become more and more economically valuable. Let's start with Sam Altman. He believes in AGI.

Speaker 3:

Right? He he runs kind of the biggest AI company.

Speaker 2:

I think that more and more, at least in the short term, OpenAI looks like a hyperscaler. They're kind of a junior hyperscaler. A lot of people want to say that they're bearish on OpenAI at current levels. But ultimately, when you look at how their business is evolving, they seem to me like they'd be fine if the models plateaued.

Speaker 2:

Next, we have Dario.

Speaker 3:

Okay. He's extremely AGI-pilled. This is kind of the reasoning behind why he's so anti-China. Yeah. Right?

Speaker 3:

Because he sees it as an actual race. This is going to superintelligence. It's like nuclear weapons. It is a national security, you know, problem if China gets there first. And you need a lot of continued growth for Anthropic to keep kind of making sense economically, I think.

Speaker 1:

Okay. Yeah.

Speaker 3:

Yeah. Next is Larry.

Speaker 3:

Larry's in kind of an interesting spot here. So this is kind of a weird place to be, where you don't believe in AGI, but you need it.

Speaker 1:

Okay. How did he wind up there?

Speaker 3:

He doesn't seem the type that believes in some kind of superintelligent god that is gonna come, that's gonna, you know, birth this new thing, and humanity will rise.

Speaker 1:

Satya. Okay.

Speaker 2:

Here we go.

Speaker 3:

I think this is a fairly reasonable spot. Obviously, there's some sense where he is slightly AGI-pilled, or maybe more than slightly.

Speaker 1:

Believes in the power of the technology. Yeah.

Speaker 3:

I mean, he was very early on OpenAI. He thinks that AI in general will become very useful, but maybe it won't become, you know, superintelligent. Maybe it's not gonna replace every person. It's just a useful tool.

Speaker 3:

It's the next...

Speaker 1:

The quote I always go back to is him saying, like, my definition of AGI is just greater economic growth. So show me the economic numbers. It's a very practical...

Speaker 2:

Yeah.

Speaker 3:

Definition. I think people see him as very reasonable. He's not gonna get out over his skis.

Speaker 1:

Yeah. I like him in the center of the grid somewhere. That seems like a real...

Speaker 3:

He's also... if OpenAI works out, he'll do very well. Sure. If they don't work out, I think he's also

Speaker 1:

He's hedged.

Speaker 3:

He's doing quite well. Now, Jensen. If Jensen was very AGI-pilled... I mean, he's the rock on which this all is built.

Speaker 1:

Yes.

Speaker 3:

He has the chips.

Speaker 1:

Yes. Yes.

Speaker 3:

If he was AGI-pilled, he would not be doling out those chips. He would keep them all to himself, and he would be training his own model.

Speaker 1:

Okay.

Speaker 3:

So that's why I think he's more on the doesn't-believe-in-AGI side. There was a new blog post yesterday. You know AI 2027?

Speaker 3:

There was a new one. It was AI 2032. Same team? A different team, but the team behind AI 2027 was promoting it. Next, Sundar.

Speaker 1:

He believes in AGI more than Satya, you think.

Speaker 3:

Yeah. Well, I think you can see this in the fact that they were even earlier in some sense than Satya, right, with DeepMind.

Speaker 1:

Yeah. There is a little bit of, like, if you really believe in AGI, the actions that we see are you, like, squirming and being like, I gotta get in. It doesn't matter if I'm 1% behind or 10% behind or 80% behind. I gotta get in.

Speaker 3:

I think Sundar is also definitely below this line because, you know, Google has been doing very well. At first, I mean, people thought of AI as, like, oh, this is gonna destroy Google. This is bad for Google. So if AI doesn't work, then Google is just in the spot they were before, which is doing very well.

Speaker 3:

If AI does work, then, I mean, Gemini is one of the best models. They'll do very well. So Zuck is also kind of in an interesting spot. Yes.

Speaker 3:

I think I think Zuck is actually someone who has shifted rightward.

Speaker 1:

It's fascinating. It felt like Zuck was sort of like, oh, yeah. AI. Like, it's this cool thing. I'm gonna check the box.

Speaker 1:

I got my team. We did this fun little side project. It's this, open source model. We kinda found our own, like, little lane, but we're not, like, competing in the big cosmic battle.

Speaker 2:

Do you think that was just a counter-positioned way to try to win the AI war? To say, hey, we're just gonna try to commoditize this market and, like, take a Chinese approach?

Speaker 3:

So Elon, I mean, he's been AGI-pilled, I think, for a very long time.

Speaker 1:

Super AGI-pilled. I agree with that.

Speaker 3:

OpenAI cofounder. Even before that, he I think he was fairly big in the in the safety space.

Speaker 1:

Totally.

Speaker 3:

You see him even on Joe Rogan, he was talking about AI safety. He still believes it.

Speaker 1:

Yeah. He hasn't backed off.

Speaker 3:

And AI safety is important because it's gonna become superintelligent. It's gonna take over the world. Totally.

Speaker 2:

So he talked yesterday about humanoids being sort of an infinite money glitch. And I feel like you kind of need AGI in order to unlock the infinite money glitch.

Speaker 1:

Yes. But at the same time, very strong core business. The cars don't need AGI. The rockets don't need AGI. Starlink doesn't need AGI.

Speaker 1:

So, yeah, he's not entirely indexed on it in the way the foundation labs are.

Speaker 2:

Chat says, where's Tyler on the chart? Feels like Tyler isn't AGI-pilled anymore.

Speaker 1:

Let's pick him up.

Speaker 3:

I actually am on the chart here.

Speaker 1:

Let's go to Tyler.

Speaker 3:

Yeah, I think I'm very AGI-pilled. Right? I know I'm ready for the Dyson sphere.

Speaker 1:

Yes.

Speaker 3:

I think it's, you know, only a matter of years. A handful.

Speaker 1:

Only a matter of years.

Speaker 3:

If AI does not work, the macro economy

Speaker 1:

Yeah.

Speaker 3:

Is looking not good. So I feel pretty bad about my job outlook without AGI.

Speaker 1:

Sure. Sure. Sure. Even though you're already employed.

Speaker 2:

Breaking news.

Speaker 1:

Walk us through the timeline. What's the breaking news?

Speaker 1:

This was the problem with your Ford Raptor. It wasn't the r.

Speaker 2:

Zero to 60 in 3.93

Speaker 1:

720 horsepower. 5.2-liter supercharged V8.

Speaker 2:

Altman today: we're looking at selling compute, but we need as much as possible. Zuck, last week: we could sell compute. Are we in a compute shortage or not? Because both are saying they're buying as much of it as they can and thinking about selling it.

Speaker 1:

I like that story about Jensen where, like, he understands the dynamic here, but still, you get kinda crushed if there's a sell-off in terms of demand.

Speaker 2:

It it's not the worst idea to buy as much as you can so that you have preferred access to it and then resell some if you have more than you need.

Speaker 1:

I mean, we just talked to David Baszucki from Roblox, and he was saying that, look, we had our own on-prem, but then we had spikes of demand. And so we went to the hyperscalers for that, because they can load-balance across workloads: people are playing Roblox here, then maybe they watch some Netflix over there, and then they are storing, you know, all sorts of different data. And there's different workloads that happen at different times. Jumping straight to selling compute, I think that the timelines are a little bit funky on this one.

Speaker 1:

It seems odd. It seems rushed. It seems rushed. Elon apparently confirmed that Tesla is going to build a semiconductor fab.

Speaker 2:

It's still not enough. So I think we may have to do a Tesla TeraFab. Woah. Great name. Oh.

Speaker 1:

That is a bombshell.

Speaker 2:

It's like giga, but way bigger. Terabyte. He's feeling good, right? I can't see any other way to get to the volume of chips that we're looking for. So I think we're probably gonna have to build a gigantic chip fab.

Speaker 1:

Meanwhile, giant TSMC is like, no, actually, I'm fine. I will supply you. Samsung is like, I'm good. I will do it.

Speaker 2:

Pull up this other pull up this other video of Optimus.

Speaker 1:

There's something like

Speaker 2:

This thing this thing has motion.

Speaker 1:

Yeah. It's not bad.

Speaker 2:

Imagine you're working late at the office one night and this thing just walks out onto the floor and starts looking at you and doing these moves.

Speaker 1:

Yeah. Most people that bought Teslas already had cars. Right? And so it's just like a one-for-one slot swap. Who are you replacing with this?

Speaker 3:

Well, with this, you can replace, like, a an exotic dancer.

Speaker 1:

Yes. That's true. Elon Musk now has $1,000,000,000,000 in his bank account. That's a thousand times $1,000,000,000. He could give every single human on Earth 1,000,000,000 and still be left with 992,000,000,000.

Speaker 1:

Let that sink in. People love this funny math whenever it drops. China has a similar humanoid robotics project, although it's way, way scarier, because they went full Terminator mode on this. It's in spooky, spooky territory. This is pretty crazy.

Speaker 1:

In the last ten months, three very talented friends have joined separate hot early-stage startups in senior roles and quit after realizing that the company's actual revenue was significantly less than what the founder had told them during the interview process and shared online.

Speaker 2:

Yeah. Again, this is going back to the spring. Right? It just felt like every founder was feeling this insane pressure to show, like, a one-to-ten-million ramp that was just insane.

Speaker 1:

And the weird dynamic is that, like, as that pressure ramps up, you just get more and more incentive to fake it with community-adjusted ARR and contracts that don't actually stick. And...

Speaker 2:

Kazakhstan has signed an MOU to buy up to 2,000,000,000 of advanced chips from NVIDIA. Let's go. Let's hit the gong for Kazakhstan. Warm it up. Warm it up.

Speaker 2:

Warm it up. Boom. Great hit. Great hit for our friends in Kazakhstan. Good to see them getting into the game.

Speaker 1:

Maybe they should do Borat three about data centers. Breaking: NVIDIA's losses accelerated to negative 5% on the day, now down 16% since Monday's high. That marks a drop of $8,800,000,000,000 since Monday. Wow. That is a wild sell-off.

Speaker 1:

Didn't Tyler quote this and say, is this bullish or something like that?

Speaker 3:

No. That was on DoorDash. They went down 20%.

Speaker 1:

Everyone's down 10 or 20%.

Speaker 2:

Whether you beat or miss, you're going down. Heisenberg is sharing a little bit of red here. Microsoft down 10% in the last eight days. What the hell is he doing?

Speaker 2:

He's shopping.

Speaker 3:

Microsoft went down three points.

Speaker 2:

That's real good. Meta down 18%. Losses have accelerated. My inbox is full of people breathlessly trying to interpret this, and it is a one-year look at SPY.

Speaker 1:

It's over. It's just completely over. The bubble popped. Now it's ready to start rebuilding.

Speaker 2:

It was a good run. We can go out from here.

Speaker 2:

Oh, the stocks have already stopped trading. So we're safe now.

Speaker 1:

We're safe. They can't hurt us anymore.

Speaker 2:

The economy can't hurt you on the weekend. Bank of America, yes, as of yesterday, has fully recovered from the global financial crisis. Wow. Took nineteen years.

Speaker 1:

To the day. And then it's, like, back in another crisis. Can you imagine? Famously backstopped by none other than Warren Buffett.

Speaker 2:

There's some more information on the Snapchat deal.

Speaker 2:

Snap gets 400,000,000, which is greater than Perplexity's total revenue. Mhmm. Snap gives nothing except access to

Speaker 1:

Wait. Wait. Woah. It's more? What?

Speaker 1:

They're paying more than the revenue? The deal looks incredibly tilted in Snap's favor. Snap gets 400,000,000, greater than Perplexity's total revenue. Now Signal is saying this is most likely 399,000,000 of Perplexity equity to Snap, not cash. So there is a question about how much of it is cash, but it is possible that they're just giving $400,000,000 of equity, like a stock grant.

Speaker 2:

I was surprised that Snapchat has not figured out a way to just monetize all the capital that a lot of these consumer AI companies have, given their massive, massive user base.

Speaker 1:

Snap will be integrating Perplexity directly into Snapchat's chat interface. Perplexity will pay Snap $400,000,000 over a year in a mix of cash and stock as part of the deal and gets access to Snap's 900,000,000 monthly active users. Snapchat's gonna accidentally build AGI just trying to

Speaker 2:

make the dog filter blink realistically.

Speaker 1:

I love it.

Speaker 2:

Deutsche Bank is exploring ways to hedge its exposure to AI data centers. It's looking at options including shorting a basket of AI-related stocks and buying default protection via synthetic risk transfers.

Speaker 1:

Everyone is getting in on the action. Like, a few years ago, it was, like, if you have anything that's related to AI, like, go, go, go, raise money, grow it. Spin it. Flip it. Turn it around. Pivot it.

Speaker 1:

Whatever you wanna do.

Speaker 2:

Have a great weekend, everyone. We love you. Goodbye. See you Monday.