TBPN

  • (00:03) - The Soham Parekh Drama
  • (05:39) - Soham Parekh Interview
  • (27:20) - Flo Crivello (Lindy)
  • (38:13) - Emily Sundberg (Feed Me)
  • (01:09:33) - Senate Passes Trump's Megabill
  • (01:13:46) - Zak Kukoff (Lewis-Burke Associates)
  • (01:28:23) - Matthew Prince (Cloudflare)
  • (02:00:03) - Meta Poaches OpenAI Engineers
  • (02:39:51) - Will Brown (Prime Intellect)
  • (03:10:05) - Sam Altman Slams Meta

TBPN.com is made possible by: 
Ramp - https://ramp.com
Figma - https://figma.com
Vanta - https://vanta.com
Linear - https://linear.app
Eight Sleep - https://eightsleep.com/tbpn
Wander - https://wander.com/tbpn
Public - https://public.com
AdQuick - https://adquick.com
Bezel - https://getbezel.com 
Numeral - https://www.numeralhq.com
Polymarket - https://polymarket.com
Attio - https://attio.com/tbpn
Fin - https://fin.ai/tbpn
Graphite - https://graphite.dev

Follow TBPN: 
https://TBPN.com
https://x.com/tbpn
https://open.spotify.com/show/2L6WMqY3GUPCGBD0dX6p00?si=674252d53acf4231
https://podcasts.apple.com/us/podcast/technology-brothers/id1772360235
https://www.youtube.com/@TBPNLive

What is TBPN?

Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM - 2 PM PST, Monday - Friday. Available on X, Apple, Spotify, and YouTube.

Speaker 1:

You're watching TBPN.

Speaker 2:

We have some breaking news from TBPN. Let's go to the printer cam, which I believe

Speaker 1:

The timeline.

Speaker 2:

King is Here it goes. Printer cam's ready.

Speaker 1:

Terminal.

Speaker 2:

I see it spinning up. We got some breaking news. We got some breaking news. What is this?

Speaker 1:

Got He's to.

Speaker 2:

He's wanted to.

Speaker 1:

The timeline is in turmoil. Special segment starting now.

Speaker 2:

I love this because

Speaker 1:

it's Soham Parekh, who just became famous in

Speaker 2:

It's funny because

Speaker 1:

the last few hours, not for the reasons that he would necessarily want to be famous.

Speaker 2:

No. No. Is this a real picture of him? Do we know? Or is it just a picture?

Speaker 2:

But this is from Var Epsilon. It says: breaking, Meta, OpenAI, Anthropic, and Google have signed Soham Parekh as fractional chief AI officer. Obviously a joke, but he's been on the timeline all day because apparently he's been working at multiple YC companies, multiple companies at the same time. What's funny is that during the show, we were working on a post. We couldn't decide how to frame the trade deal of Soham, and

Speaker 1:

But

Speaker 2:

some fan just did it for us.

Speaker 1:

This guy nailed it and

Speaker 2:

got a thousand likes.

Speaker 1:

Fully sent it. Matt Parker from Antimetal said Soham was actually their first engineering hire in 2022. Mhmm. Really smart and likable. Enjoyed working with him.

Speaker 1:

We realized pretty quickly that he was working at multiple companies and we let him go. I can't imagine the amount of equity he's left on the table. Matt says, hiring Soham is a new rite of passage, TBH. Any great company should go through it. So Suhail was the first person to call this out.

Speaker 1:

He said this morning, or actually it was last night. So this has been building. PSA, there's a guy named Soham Parekh in India who works at three or four startups at the same time. He's been preying on YC companies and more. Gary is not gonna be happy about this. You do not... you don't want to put up the bat.

Speaker 1:

You know, you don't want people to put up the Gary signal.

Speaker 2:

Nope. You don't want to go up against him. But apparently, he's

Speaker 1:

preying on YC companies. I fired this guy in his first week and told him to stop lying and scamming people.

Speaker 2:

Oh, this is fascinating. Did you see this from Pratika Mehta? Tyler put this in. You wanna read this one first?

Speaker 1:

He's been all over the place. Roy said I interviewed this guy yesterday.

Speaker 2:

No way. That's amazing. Tyler, break it down for us. What does Pratika have to say?

Speaker 3:

Okay. Apparently, mean, this is still unconfirmed. I've actually I've sent him an email.

Speaker 2:

Okay.

Speaker 3:

I've I've I've obtained his email. I'm not sure if it's the real one. We'll see. But she says, the dude clears interviews. He's good with that.

Speaker 3:

After clearing, he has junior people doing all the work.

Speaker 2:

Yep.

Speaker 1:

And

Speaker 3:

he has around 20 employees and interns. Basically, small dev shop.

Speaker 2:

Wow. So he's just figured out the economics of like, hey, if I'm making 200 k or 300 k

Speaker 1:

He's basically signing people to a 20 k a month retainer. He knows he's gonna have some churn. Yep. But it doesn't matter. He's picking up some equity.

Speaker 2:

This is fascinating.

Speaker 1:

Equity comp as well. Yes. I wonder if he's vest. Wonder if

Speaker 2:

he's vested.

Speaker 1:

Wonder if he's

Speaker 2:

There's some chum shares on the balance sheet.

Speaker 1:

At any of these companies. This is crazy. Watana says, Soham Parekh is the Bonnie Blue of YC. Very vulgar.

Speaker 2:

Oh, I get it.

Speaker 1:

Sar says, all recruiting must stop until we get Soham on TBPN to figure out what

Speaker 2:

The open invite is going out to Soham.

Speaker 1:

Santiago says, Soham on TBPN when? Growing Daniel says, Microsoft just laid off 9,000 workers. All of them Soham.

Speaker 2:

If you see

Speaker 4:

I wonder

Speaker 2:

if it doesn't have a Soham Parekh email in it, it's time to start polishing your resume. Because if he hasn't reached out, it means he's not bullish on you.

Speaker 1:

Austin Allred says, must suck being an unemployed software engineer and realizing that Soham Parekh has been hired 79 times in the past four years.

Speaker 2:

This guy

Speaker 1:

is... And then Matthew Berman says, you think AI is killing software jobs? Tell that to Soham Parekh. Signal says, not now honey, there is a Sohamgate happening on X. It is really funny.

Speaker 1:

It's basically him saying, I'm gonna build a dev shop, but it's gonna be under my name. And, you know, he probably... no, he probably could have had a just very successful dev shop.

Speaker 2:

Yeah. Yeah. But but there's a stigma around hiring dev shops. You wanna hire founding engineers. Like, you know, the title matters.

Speaker 2:

And the idea of, like, okay. We have a full time employee. We got somebody who's delivering. They're a high performer. Our team is cracked.

Speaker 2:

We don't have to outsource. The whole thing with Uber. Remember Uber's initial build out was all outsourced, and it was like it was like a mark on the company. Obviously, it didn't affect them. They went on a generational run.

Speaker 2:

It was fantastic. But there's this idea of, like, if you're building a startup, the founders need to be coding. The employees need to be coding. It needs to be handcrafted. You can't outsource.

Speaker 2:

True. You know, he's proven us all wrong, I guess. True.

Speaker 1:

So we're working on getting him live. We got another post here from Growing Daniel.

Speaker 2:

Has he gotten back to you, Tyler, on email? It seems like he might not come on today, but maybe we can pitch him and try to get him on.

Speaker 3:

Hasn't got back. I sent it twenty minutes ago.

Speaker 2:

Okay.

Speaker 4:

Yeah. No.

Speaker 1:

He just needs to come out and say he's sorry. Yeah. And he needs to say this is the right time to launch

Speaker 2:

Yeah. Dev

Speaker 1:

Yeah. And it should be called the Dev Shop of Soham Parekh.

Speaker 2:

Yes. I I don't know. Maybe he's delivering.

Speaker 1:

Growing Daniel has a post here. Chief of staff brings list of applicants. Really likes this one Indian guy. Asks if it's a Roon or a Soham. She doesn't understand.

Speaker 1:

Pulls out illustrated diagram explaining the difference between Roons and Sohams. She laughs and says, it's a good engineer, sir. It's a Soham. Oh,

Speaker 2:

that's a wild post. That's great. We believe he's here in the studio with us. So welcome to the stream. Soham, good to meet you.

Speaker 2:

How are you doing?

Speaker 1:

There he is. Welcome to Thanks so

Speaker 2:

much for joining. Why don't you introduce yourself and just take a few minutes to give us what's going on in your world. What's the last twenty-four hours been like? How would you describe yourself? What do you have to say to people?

Speaker 2:

And then I'm sure

Speaker 1:

Let's go back to the very beginning. Who are you? Where do you come from?

Speaker 5:

I mean, I'm, you know, I'm originally from Mumbai. I kind of grew up, you know, here all my life. Obviously, dove into computer science, like, you know, coding and everything, kind of without a choice, kind of a process of elimination. Fell in love with it, you know. As you're aware, at this point, I'm, like, everyone's favorite founding engineer pretty much, or worked for 17, or whatever you might call it, like an AI bot, or Soham as a service.

Speaker 5:

I don't know. Yeah. But yeah. Don't have much to my story, essentially. You know, I've just been always passionate about, you know, building

Speaker 2:

What was your first job?

Speaker 5:

My first job was actually at a company called Alan. It was, like, a voice assistant, like, startup. You know, this is back in, like, 2020, 2021.

Speaker 2:

2020.

Speaker 1:

And when did you come to The States? Because I know I was talking to Matt, and he said you would go to the Antimetal offsites, and you were very much a part of the team.

Speaker 5:

So to be honest, you know, I have family on the East Coast. The first time I ever visited the US was in 2018. And then I was supposed to do grad school there, but, you know, different financial circumstances, couldn't do that. Had to kind of, like, you know, not go ahead with that plan. But, obviously, you know, I have visited, you know, America, mostly for, like, offsites and stuff, just to catch up with team members and things like that.

Speaker 2:

Okay. So the main claim, that was levied against you on X yesterday, kinda started with Suhail over at Playground. He accused you of working at multiple jobs simultaneously. Is that true?

Speaker 5:

It is true. And yeah. I I would, you know, love to add color to it, but, you know, that is true.

Speaker 2:

Yeah. I guess the question is, do you believe that you were in violation of your own employment contracts, or do you believe that there was some sort of legal loophole that allowed you to do this without committing any sort of legal violation?

Speaker 5:

I mean, honestly, I think going back to, like, how it even, like, started to happen and what the motivations were

Speaker 2:

Yeah. Please.

Speaker 5:

Would probably help, you know. Obviously, I would wanna, like, preface with saying, like, I'm not proud of what I've done. You know, that's not something that I endorse either. But, you know, financial circumstances, essentially. Like, no one really likes to work one forty hours a week. Right? But I had to do this kind of out of necessity.

Speaker 5:

Like, I was in extremely, like, dire financial circumstances and somehow, like, you know, don't I'm not a very people person. I I don't share much in terms of, like, what's going on with my life and

Speaker 2:

Sure.

Speaker 5:

Kind of in my internal thought process, you know, I was kind of, like, getting more stressed with, like, hey, you know, I wanna, like, come out of this situation. What should I do? So it was not so much kind of out of greed, but essentially, like, necessity. And Yeah. Just, like, thought that if I work, like, multiple places, you know, like, I can basically, like, help myself, like, alleviate the situation I was in much faster.

Speaker 2:

No. It's very economically rational. What about the conjecture that you had a team of junior developers underneath you helping you actually accomplish the tasks? Is there any truth to that?

Speaker 5:

I wish I had the money, and I wish that was true, but that is not true. Really? Any of the founders that, you know, I worked with can vouch for that. I have, on multiple occasions, like, pair programmed with people. Yeah.

Speaker 5:

You know, I've written every single inch of the code. You know, Antimetal is one of the companies I worked at. Yeah. You know, they have pretty high bar for design as you might know. They did a launch recently.

Speaker 5:

You know, there were two occasions where, you know, just before the launch, like, you know, the the contractor who was going to do the front end for us essentially had bailed out and, know, we as a team had to figure out and do everything, which was a team of only three engineers. Right?

Speaker 1:

So how did it originally happen, though? You started with one job in 2020, then you realized, hey, I could add another, and then you realized you could just continuously add them. Did you create any type of financial model to kind of model out potential churn as you would get, you know... I'm curious if you were operating this, looking at it as basically a SaaS business, the SaaS of Soham.

Speaker 5:

This was not a business to me. Every company that I've worked with, I have deeply cared about, like, you know. And again, people who spent, you know, a lot of time with me can, like, back that up on multiple occasions. And, again, I do not have any financial model. Like I said, the only motivation for me was, hey.

Speaker 5:

You know, I'm in a financial jeopardy. Mhmm. You know? And so I didn't do this until 2022. 2022 is actually when I was, you know, running into issues.

Speaker 5:

I had deferred my, you know, grad school admit and, you know, did an online degree, but basically, you know, did not have, like, enough, you know, essentially, like, you know, just to get out of the situation I was in.

Speaker 2:

Sure.

Speaker 1:

I don't

Speaker 5:

wanna talk, like, too much deeply about it. But No. No. That's fine. So yeah.

Speaker 5:

So that's...

Speaker 1:

Talk about your own abilities. It seemed universal that the teams you worked on said that you were extremely talented, that you interviewed incredibly well. It's easy for new CS grads to be frustrated because you were probably running laps around them in the interview process. But yeah, it's very impressive if you actually didn't have a junior team and you were just juggling. You know, estimates seem to put it at maybe five companies at once.

Speaker 2:

Yeah. Are you just, like, productive? Are you leveraging AI tools and coding assistants? Are you just able to move really quickly in a startup environment? Or are you also, like, talented at, like, systems engineering and algorithms and, like, solving really, really hard computational problems?

Speaker 5:

So I'll preface with this. You know, some of these companies I worked at, this was before the Copilot boom. Right? There was no AI-assisted programming. At least three startups have gone on the record, like, saying this.

Speaker 5:

People who have pair programmed with me can also vouch for this. But the real truth is, you know, there's this funny line in some of my interviews, like, yeah, I don't do anything outside coding. That is very true. And if you're, like, spending hours, like, you know, sitting in front of the computer, you will eventually, like, at least I hope so, like, get good at it. There's obviously a lot for me to learn in general.

Speaker 5:

You know, I wouldn't say, like, I'm, you know, like, I don't know what they call it, like, you know, a platform engineer or something, like, you know, top principal or whatever. But I would like to believe that I was a decent enough, a good enough engineer to essentially be able to work at two places, because that's the only thing I did the entire day.

Speaker 2:

So an average day for you, an average week for you, it feels like, basically, maybe you sleep for six to eight hours and then you're basically programming for twelve to fourteen hours every single day, seven days a week.

Speaker 1:

But as some of these, you know, new coding, code gen tools came out, Claude Code, Cursor, Devin, etcetera, did you feel like you could add maybe a few more jobs? Were you getting, you know, more efficient? Or do you still handcraft code, farm to table, organic, you know, handmade?

Speaker 5:

I mean, obviously, like, you know, having more, like, you know, AI-assisted tooling, like, definitely helped, but it did not amount to, like, me working more jobs. Mhmm. Because, like, I would still love to spend, you know, a justifiable amount of hours, like, working on something, and it doesn't have, like, a measure. It's not like, hey, you know, I'm gonna split time, like, four hours for this company, four hours for that company.

Speaker 5:

Like, I was essentially, like, working until I got something done for that day, and I think people around me will probably say this, that I am notoriously known for not sleeping. I am, you know, I'm a serial, like, non-sleeper at this point is what I would say. Insomniac. But yeah. And as far as the interviews go, right, like, there's also been a lot of things around, like, hey.

Speaker 5:

You know, I probably used, you know, some sort of, like, tool like Cluely, which, you know, I would love for the founder to essentially go on record and say if I'm a paying customer. I'm not. Or some sort of, like, you know, hack to game the system. The truth is a lot of these companies did not have LeetCode style problems. And if they did, I would have bombed the interviews, you know, because I don't practice LeetCode as much. I know my data structures well, I know what to use when, but, you know, like, typically...

Speaker 1:

you're also great at cold email. I saw some of these cold emails. You got a great format. It pulls people in. It it it shows your personality.

Speaker 1:

It shows that you care about, you know, coding. And I think that helps Totally. Lot as

Speaker 5:

Yeah. I mean, I do care about coding a lot. But again, like, you know, most of these companies essentially had a take home assessment or a work trial. Right? So it's very difficult to, like, game a work trial or game a take home assessment.

Speaker 5:

Like, you have to do it and then deliver it. And I I did everything, you know, on my own. Like, I was also doing the work trials on my own. And Yeah. Yeah.

Speaker 5:

Like, I guess I'll leave it at that. If they did have LeetCode, I would have bombed the interviews, but luckily, they didn't. Like, startups usually just, like, care more about, like, working

Speaker 2:

with... Talk a little bit about the companies that you chose to work for, that you applied to. Do you think you could have pulled this off with Microsoft, Google, Amazon, and split it three ways? Or do they have defenses in place that would make this difficult? There's a variety of memes around maybe big tech companies. It's easier work.

Speaker 2:

Maybe startups are harder. I'm just kind of curious about it seems like you went for a lot of startups. What was the reasoning behind that?

Speaker 5:

Yeah. So it feels counterintuitive. Like, if I was in dire need of money, like, why would I work for, you know, startups as opposed to big tech? Because they clearly pay well. They have a nine to five schedule.

Speaker 5:

For the most part, like, I would say, you know, people don't really care about, like, what you are essentially, like, doing. The thing with me is, like, if you're spending, you know, multiple hours a week, like, working on something, you would have to at least decently be passionate about it because otherwise, you'll just, like, burn out. Like I said, you know, each of these companies that I've worked for and again, you know, founders some of the founders have spent meaningful time with me and vouch, you know, would would say that I actually cared about these companies. So it wasn't like cold email without context. Like, I deeply read into what a company was doing, what the business model was, you know, here who their customers were.

Speaker 5:

I had great ideas about, like, you know, what to build for them, what not to build for them, you know, what the platform is like. You know, I did deliver, like, beyond engineering for a lot of these companies. And yeah. Like, it was more kind of, like, hey. If I'm spending, one forty hours, like, I wanna do something that I actually care about.

Speaker 5:

Like, I don't wanna, like, you know, do nine to five and kind of, like, center a dev in like six hours or something. Yeah. You know, that's that's just not me because I do like what I do and and and that's kind of

Speaker 1:

Cold email hack. Send the cold emails from the heart. Put some real

Speaker 2:

Good point.

Speaker 1:

Thought into them.

Speaker 2:

Talk to me about other alternatives that you maybe considered or maybe didn't. Did you ever think about going to a startup and saying, like, hey, I'm doing a great job. I'm working forty hours a week. I need to make more money.

Speaker 2:

I'd like to take on a second job. Would would that be amenable to you? Do you think that founders might be actually open to that?

Speaker 1:

Or say I'm working eighteen hours a day. Can you pay

Speaker 2:

me... Can you pay me three times as much? Yeah. I'm doing the work of three devs. Because there is a world where you could have allocated all that time and one startup would have gotten three times as much value, but they probably should have paid you three times as much, maybe, to make that balance out. And so did you ever have those types of decisions? Like, we're seeing this with the AI talent wars, the 10x... you can think of yourself as, like, a three to five x engineer. So no offense, you know, the 100x engineers are getting

Speaker 1:

the 100,000,000. Maybe you'd be 10x if you really focused.

Speaker 2:

Yeah. Yeah. Yeah. And so and so it it it's almost a question of, like, why couldn't you get the money that you wanted from a single job? What was the what was the what was the stumbling block there?

Speaker 5:

Yeah. I think the part where, you know, I should have basically done is, like, come clean with my situation. You know, to be very frank with you, like I mentioned, like, I'm not a good, you know, person with sharing what my internal conflicts are. I do like to keep boundaries between my personal and private, you know, like, whatever is, like, going with me mentally. And that's true with any person I'm with as well, like, and that's outside of work.

Speaker 5:

So one, there was an embarrassment to essentially, like, admitting that, hey, you know, I have these struggles mentally, you know, like Mhmm. And talking in the open about it. And then the second thing is, you know, I really did not, like, think through this, because like I said, it was an action that was done more out of, like, desperation to get out

Speaker 1:

of the

Speaker 5:

situation I was in rather than, like

Speaker 2:

I mean, the flip side here is that you you don't necessarily need to have something going on in your family to justify a $10,000,000 salary or a $1,000,000 salary. Like, LeBron James doesn't need to say, oh, like, you know, I have a sob story, and that's why I need $50,000,000 He's like, I'm delivering this value to you, and you should pay me the value that I'm creating. And so I'm wondering if you ever considered just coming to a founder and saying, like, I can do the job of three engineers, but you gotta pay me three times as much.

Speaker 5:

Honestly, again, it feels counterintuitive. Like, I don't really care much about the money. I was really into it for building. So, like, greed was not, like, an incentive for me. Again, regardless of my financial circumstances, in most of these companies, I always took the lower pay, higher equity offer.

Speaker 5:

Again, like, equity for me was not even, like, vested until I would make the move to US, which, like I said, you know, I was based out of India. So I had to go through a whole visa process in order to make that happen. So I wouldn't even be getting equity as long as my immigration was uncertain, which it was. Yeah. So all of the decisions seem counterintuitive.

Speaker 5:

Basically, for me, it's like, hey, you know, I have to do something to kind of get out of the situation I'm in. I also wanna do spend time, like, building something meaningful. Mhmm. You know, as long as I like the startup and as long as it's not, like, hand to mouth Yeah. It did not matter too much, like, you know

Speaker 1:

So what are you gonna do now? You have the attention of the entire tech ecosystem. And, you know, I appreciate how candid you've been Yeah. with us, and

Speaker 2:

I wanna talk about dev shop.

Speaker 1:

Specifically. Yeah. I mean, there's a few angles here. You could do the Dev Shop of Soham Parekh. I think you would be able to get a handful of clients immediately.

Speaker 1:

Yeah. I think you could sign a maxed-out contract with Cluely. They would probably pay you a pretty penny to join the team over there. And also, it sounds like you're interested in joining as a founding engineer of a new AI startup. So how are you thinking about your opportunities right now, and how can you turn this into a win?

Speaker 5:

Yeah. Look, you know, regardless of the situation, I've been privileged to, like, work with some of these companies, like, you know, Antimetal, you know, and the founders, like Sync Labs, you know, Pradi, you know, Rutaba. They're some of the most, you know, cracked places I've been at in general, partly because they had me. Maybe not. I don't know. I can stand by that.

Speaker 5:

But I think the main thing is, like, I am really excited about what I'm gonna be a part of next. So, you know, I'm working with a company called Darwin. They are essentially, like, building a new, like, AI driven, like, data plat like, you know, video platform essentially for Mhmm. More kind of, like, you know, UGC style media. Mhmm.

Speaker 5:

And we're gonna release to the market very shortly. This is the only thing I'm going to focus on. I think, you know, they've put a bet on me. I have a lot to prove, and there's there's not a lot for me to say. So yeah.

Speaker 2:

How do you rebuild trust? How do you give them the confidence that this isn't just you're focused on it for a couple weeks, and then the second job creeps in, then the third job creeps in? Is there something that you see happening down the line around, you know, worker monitoring, or checking when the GitHub commits are coming in, or doing something to understand if someone is working multiple places? Somebody was saying, like, tech needs some sort of service to see, hey, are we employing the same person?

Speaker 2:

How do you how do you rebuild trust and and guarantee that you won't be working multiple jobs against the will of the current employer?

Speaker 5:

So I think, like, you know, all of this has taught me, like, you know, just coming clean kind of, like, probably makes people understand your situation. After all, like, we are all humans. I don't anticipate, like, working on multiple gigs again, but, you know, my financial circumstances have not changed. So even if it did come to that, which it won't, like, right now it's very clear that this is my sole focus. Mhmm.

Speaker 5:

You know, I would probably, like, approach with, like, being very candid with the founders and just letting them know. And if that's not a possibility, you know, that's fine. You know, maybe I can figure a way out where, you know, there is, like, a higher pay or whatever. Kinda like your point. Right?

Speaker 5:

Mhmm. But in in in another spectrum of the world, I think, like, I've really realized is that I just, you know, I just wanna be a part of, like, building something that I can, like, focus on a longer term. And that's always been the goal. You know, it is always a goal with any of these companies I worked with. So, you know, I obviously, you know, like, I I I believe in actions more than words.

Speaker 5:

I guess what we'll build and launch in a month would probably be a testament to, you know, what we've been able to build.

Speaker 1:

Well, you're certainly gonna have a lot of attention around that launch. Now is probably not the right time to renegotiate your offer letter for the new company. You should probably, you know, stick with it. But I think you're gonna bring a lot of marketing value to Darwin, the new company you're joining. What's the reaction been like at home?

Speaker 1:

You're getting a lot of messages from from friends?

Speaker 2:

Has this broken tech x and broken containment into kind of your your world beyond just people asking you about this?

Speaker 5:

Yeah. I mean, obviously, you know, reaction is natural. Like, a ton of people have been asking about this. What's also funny is, like, you know, some of the memes, like I I am very new to Twitter. I joined Twitter yesterday.

Speaker 5:

So this was a lesson for me in social media in general. Like, I am not a social media person. Like, I was, you know, very out of touch with it in general. Obviously, you know, feeling shitty about it as well. Like I said, you know, I'm not very proud of what I did.

Speaker 5:

I think there was a way where I could have course corrected in a much different manner. So there's a lot of, like, reflection in general, but I think friends and family have been strong. You know, like I said, I was lucky to work at some of these places where I had really close relationships with the founders. A lot of them reached out, you know, essentially to, like, offer help or kind of give advice on what I should be doing next. Even, like, you know, sometimes, like, staying up with me, like, you know, just to make sure that I'm, like, alright.

Speaker 5:

You know, like

Speaker 1:

No. I think it says a lot that a number of the people that you worked for, that fired you Yeah. because you were working multiple jobs, still care about you. It shows that when you were working for them, at least, you know, it does seem like you really did care.

Speaker 2:

Are there any startups that you've worked for in the past that you were working multiple jobs for that you haven't gotten a chance to apologize to yet that you plan to?

Speaker 5:

I mean, in general, like, I plan to apologize to everyone. Like, that goes without saying.

Speaker 1:

Any companies that you still need to resign from?

Speaker 5:

None whatsoever. Darwin is the only focus I have.

Speaker 2:

That's great. Okay. Okay. He's on the

Speaker 1:

record. You're on the right path.

Speaker 2:

On the right path. Well, I'm

Speaker 1:

rooting for

Speaker 2:

you. I I I hope this is a story of redemption. I believe in forgiveness. I

Speaker 1:

believe in

Speaker 2:

mistakes and I'm I'm I'm optimistic that this can have a good ending.

Speaker 1:

Me too. So to be clear, your X account is at real sohamparik. Right? Yeah. That's the real one?

Speaker 2:

That's the real one. Okay. There's lot

Speaker 1:

of fake ones floating out there trying to take, trying to ride the hype wave.

Speaker 5:

Mhmm. Yeah. I mean, a 100%. There there's been so many fake accounts and a lot of fake narratives as well. Yeah, that is my real account.

Speaker 5:

There's one person who asked me to post a picture of a banana on my head with a fork in my hand. Mhmm. And I did not have a banana at my home at that point in time, but I was trying to do that just to show

Speaker 6:

Turns out freak.

Speaker 5:

Okay. Yeah. But it's been interesting, like, to notice how people are, like, taking this in different ways. But, yeah, that kind of helps, like, just feel a little bit less shitty about it.

Speaker 5:

But yeah.

Speaker 2:

Yeah. What what what what are the the questions here? Remote versus in office, US versus international? Yeah.

Speaker 1:

I mean, I think we had a post from Ahmad at Mercury. I think the reason this struck such a chord is, you know, again, it's this remote angle, the international angle, the ideas around a 10x engineer, the potential of new AI tools. But this has been really helpful. It's been great, you know. I think you provided entertainment for a lot of people the last twenty-four hours. So I'm glad to hear that you're going to apologize to everybody that you misled.

Speaker 1:

And, I am excited to follow, what you do next.

Speaker 2:

Yeah. Thanks for stopping by.

Speaker 5:

Thank you so much for having me.

Speaker 2:

Cheers. We'll talk to you soon.

Speaker 1:

We'll talk soon.

Speaker 2:

Bye. How are you doing? Good to have you back on the show.

Speaker 7:

Yeah. Thanks for having me. No. Redemption isn't possible. When people show you who they are, believe them.

Speaker 7:

No. There's plenty of very honest people. Like, what are the odds that someone who's been lying compulsively and being being confronted about it for three years, like, two or three years? And, like, look. A it's a special kind of person too who can do this thing to people who trust them.

Speaker 7:

Right? And you have a relationship with an employer. You you know? Mhmm. And and I my understanding is that he basically has been averaging, like, getting fired once a month for the last three years.

Speaker 7:

It's a special kind of person who can

Speaker 1:

do that. So he could have learned his lesson, like, two years, three years ago

Speaker 2:

Yes.

Speaker 1:

Yes. Like, the first time he was called out and and Totally. Sure.

Speaker 7:

Totally. I I wasn't I wasn't tuning in. Did I didn't hear what he said.

Speaker 1:

To me, the argument... it sounds like he has some situation that's bad, but it's not a justification to go steal money from a bunch of Silicon Valley startups for years and years and years and use them as a piggy bank to solve his own problem.

Speaker 2:

So really quickly, how long did he work with you? Two weeks. Two weeks. And what was the result of that? I mean, I imagine you paid him for those two weeks of work.

Speaker 2:

Did you get any good results?

Speaker 7:

I literally got out of a meeting, like, an hour ago, where the engineer described his impact as negative.

Speaker 2:

Negative. Okay.

Speaker 7:

Well, because we we had to we had to onboard him.

Speaker 2:

Yeah. You had to onboard him. Yeah.

Speaker 7:

Yeah. It's Okay. Wasted our time and money.

Speaker 2:

Has he apologized to you?

Speaker 7:

Not yet.

Speaker 2:

Okay. He said he would apologize to everyone else. He would go on an apology tour. We'll see if that actually pans out.

Speaker 1:

My big question is, somebody like this, you could have them working in your office as a full-time employee, and there's a good chance they would still keep doing this thing with people on LinkedIn that didn't see the whole Sohamgate.

Speaker 7:

Never we never compromise on ethics. That's just a thing I don't fuck with. Yep. He's gonna apologize. Is he gonna give the money back?

Speaker 2:

Yeah. It's a good question. I don't know.

Speaker 7:

Because talk is cheap.

Speaker 2:

Yeah. That's a higher bar. If he wants to make a statement, that's what he would do.

Speaker 7:

A 100%.

Speaker 2:

Yeah. Makes sense. How do you prevent this in the future? How does Silicon Valley prevent this in the future? Obviously, The Valley is built on trust.

Speaker 2:

This is a violation of trust. There's been discussions about, like, maybe we need a Reddit for are we dating the same man? Are we employing the same software engineer? Maybe we need an AI tool to, you know, check the GitHub commits and see that, oh, they're coming in at the same time every

Speaker 1:

week. Is this not available at the HRIS level? Yep. It's like, this employee

Speaker 2:

has accepted two offers. In two instances of of Rippling. Of rippling? That doesn't make any sense. Yeah.

Speaker 2:

How how are we solving this?

Speaker 7:

I I don't wanna over rotate on it. I think a large part of what makes Silicon Valley tick is its culture of of trust. Mhmm. You know, there's this whole game theory around, like, hawks and doves. Yep.

Speaker 7:

You know? And it turns out there is no stable equilibrium. You're, like, in a constant state of shifting. Because if everyone runs dove, then the returns to being a hawk are very high, and vice versa. If everyone runs hawk, then you can have these cliques of doves that start building.

Speaker 7:

So I'm really careful not to tip Silicon Valley towards this attractor state of, like, the hawk. And, like, a lot of industries end up in this state of, like, low trust, and I really don't want that. Like Totally. Look. You're gonna get screwed over every so often.

Speaker 7:

We had a good laugh. You know, the memes are amazing. Like Yeah.
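For readers who want the game theory being referenced here, a minimal hawk-dove sketch follows. The payoffs are the standard textbook ones, with a contested prize V and a fight cost C > V; these symbols are illustrative assumptions, not numbers from the conversation.

\[
\text{Row player's payoff:}\quad
\begin{array}{c|cc}
 & \text{Hawk} & \text{Dove}\\\hline
\text{Hawk} & \tfrac{V-C}{2} & V\\
\text{Dove} & 0 & \tfrac{V}{2}
\end{array}
\qquad
p^{*}_{\text{Hawk}} = \frac{V}{C}.
\]

Neither pure population is stable, which is the "constant state of shifting" described above: among all doves, a lone hawk earns V > V/2, and among all hawks, a lone dove earns 0 > (V - C)/2 when C > V. The population only settles at the mixed state where hawks appear with probability V/C.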

Speaker 1:

Yeah. I hope your viral posts at least drove a bunch of new sign-ups, and maybe Soham paid you back a little bit with some

Speaker 2:

some customer acquisition. I I I completely agree. Like, yes, the handshake deal protocol is built on trust. Do some VCs violate it? Yes.

Speaker 2:

Every once in a while, a term sheet falls through. It's not the best thing that ever happens, but the alternative is so much overhead on everyone that it's probably worth taking the risk of, yeah, occasionally the handshake deal is gonna fall through.

Speaker 7:

You're gonna get screwed over every so often, and it's not worth over-pivoting on it and, like, looking over your shoulder your entire life.

Speaker 2:

So it sounds like he's announced that he's taken a new role as a founding engineer at a single startup. He says he's not gonna work at multiple companies. It seems like the jig is up. Even if he is a pathological liar and continues to lie, like, his SEO is terrible now. Like, it will forever be very easy to find when you Google search the person that you're hiring.

Speaker 1:

He's like, oh, that's a different Soham Parekh. There's a lot of people out there.

Speaker 2:

His name, and it's gonna be really complicated for him to get away with this any longer. My question is, like, we were kinda saying the Soham Parekh dev shop makes much more sense here. Does that make sense to you? Like, if he was saying, hey, I've been working at all these different startups.

Speaker 2:

I'm starting a developer shop. And yes, you can hire me,

Speaker 1:

but I'm

Speaker 2:

a contractor. Dev shops. Why why are dev shops bad for startups? Why would this not work for him?

Speaker 7:

I actually think for him, the real move is to start a course on how to crack coding interviews because he's really good.

Speaker 8:

Interesting.

Speaker 7:

Credible. He's he's actually credible. Right? So he can actually make lemonade here. Like, he should do that and he can make a lot of money.

Speaker 7:

It's very

Speaker 2:

high leverage.

Speaker 1:

And he showed that in the cold email strategy. Yeah. Yeah. He had a very dialed-in cold email approach where he would, you know, say something nice and really clearly. I think

Speaker 2:

a product guy.

Speaker 1:

I think he's a product guy. Building. The funny thing is, in his message, he said, I'm pissed. And I was like, about what? What?

Speaker 9:

Why are

Speaker 1:

you pissed? You're pissed you got busted? And he kind of gave himself a

Speaker 2:

little bit of... He was saying, like, oh, yeah, I'm gonna, you know, there's gonna be a great reversal here. And he came on and was like, yeah, the narrative is basically correct about me, but maybe I'm sorry.

Speaker 1:

The the redemption of being like, I'm gonna prove everyone wrong Yeah. By working at one company. It's like, you don't get you don't get you

Speaker 7:

don't get a medal.

Speaker 1:

Yeah. Yeah. We don't give

Speaker 2:

a Yeah.

Speaker 1:

Congrats. Second the fourth place trophy.

Speaker 2:

You worked at one company. Yeah. That's funny.

Speaker 7:

Yeah. I mean, and look, he's not sorry. He just got caught. Like, that's it. It's really simple.

Speaker 7:

Mhmm. And why do dev shops not work for startups? I mean... yeah, I'm a nerd when it comes to the theory of the firm because of my experience at Uber and so forth. Sure.

Speaker 8:

There's a

Speaker 7:

that's called Managerial Dilemmas that explains why dev shops don't work.

Speaker 2:

Okay.

Speaker 7:

It's it's fascinating. Yeah.

Speaker 2:

But can you unpack it a little bit more? Because Uber is interesting because didn't wasn't the very first version of Uber done by a dev shop, and wasn't that part of the early early Uber lore that, like, the code base was written in Spanish, and so everyone at Uber had, like, the Spanish to English dictionary there, which is hilarious because I'm pretty sure Google Translate existed at the time, but they still had the dictionary for some reason. It's like it's like amazing lore regardless of how real it is.

Speaker 7:

Yeah. I think people add a little bit to it. This is the first time I hear about that. But, you know, like, next time I hear about it, there's gonna be a guy with, like, a sombrero in it or something.

Speaker 2:

Yeah. Exactly. Yeah. Travis was was in Tijuana pulling random devs off the street. Yeah.

Speaker 2:

They really, really amp it up. The apocryphalness of the story just needs to grow and grow and grow. It's a tall tale at this point.

Speaker 7:

The reason why dev shops don't work is because there's a complicated game-theoretic argument, which goes, basically: the reason why you employ people is because they know things you don't know. And so there is an information asymmetry, which always makes it possible for someone to cheat, because they can basically bullshit you. And you see it with engineering all the time. It's like, oh, this is so complicated. You thought it'd be a day, but actually it's gonna be a month. And you have no clue.

Speaker 7:

You have no way to verify. And this is inevitable. Like, the reason you hire them is because there is this information asymmetry gap, because otherwise you might as well just do it yourself. And so basically, in a way, it is always possible to cheat, which is why we are not fucking around with hiring people who do cheat. Because there is no way you can always prevent them from cheating.

Speaker 7:

If he doesn't cheat by doing this, he'll cheat by doing something else. So there is always a way to cheat. So you could say that to some extent, not cheating is irrational. You can cheat. You can get away with it.

Speaker 7:

He's made millions. He's made so much money. Right? Like, you can cheat. You can get away with it.

Speaker 7:

And so in a way, you've gotta solve for the equation where it's like, I can cheat, I can make a lot of money, I can get away with it, nothing is ever gonna happen to me, but I'm not gonna do it because of x. And you have to solve for x.

Speaker 7:

That's your job as a founder. You're like, why would you act irrationally? And the way you do that, thankfully, is we have that bit that you can flip in our brains as humans. We have these tribal, prosocial tendencies that are like, I don't wanna cheat because I like these people, or because I'm bought into the mission, which is why they always say every startup is like a religion. Like, the mission, the culture, the camaraderie really matter.

Speaker 7:

They really matter at a startup. And that's what you lose when you have dev shops.
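A minimal formalization of the "solve for x" framing above, with illustrative symbols that are not from the conversation: let B be the expected material benefit of cheating, q the probability of getting caught, P the penalty if caught, and x the intrinsic cost of cheating (mission, culture, camaraderie) that a founder tries to create.

\[
\text{His premise: } B - qP > 0 \quad (\text{cheating pays; you can get away with it}),
\]
\[
\text{so honest behavior requires } x > B - qP .
\]

On this sketch, solving for x is exactly the founder's job as described: culture has to supply a prosocial cost of cheating large enough to outweigh the material gain, because monitoring alone cannot close the information-asymmetry gap.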

Speaker 1:

Well, yeah. The the to to be even more specific, the dev shop gives you a scope. They say it's gonna take me x y z time to do roughly this or maybe they're on a retainer. And then if they come back to you the next day and they say, well, actually it's gonna take more time and it's gonna be more money. Well, an employee is gonna say, well, it's gonna take more time maybe, but I'm not going to necessarily charge you astronomically more.

Speaker 1:

Right? I'm just a salaried team member and because they have that equity compensation as well, they're bought and and hopefully bought into the mission if it ends up working out. Thank you for jumping on. Give us the update on Lindy. I know you guys moved quickly.

Speaker 1:

It's been a couple months maybe. What's the latest?

Speaker 7:

It's going well. We are currently grinding. To my point about dev shops, we were at the office last night until, like, 10 PM. Like, you don't get that from the dev shops. So we're currently grinding.

Speaker 7:

We're growing fast. We are making a big launch on August 4.

Speaker 2:

You gotta come back on the show then.

Speaker 7:

We've got some really, really exciting stuff coming around. I'm I I couldn't be more excited.

Speaker 2:

Amazing. Incredible. Last last question on Soham. What should he do? What what should he do next?

Speaker 7:

He should totally start a course, and he should raise around 26 and

Speaker 2:

Start a course. Okay. We'll see.

Speaker 1:

Maybe maybe he'll No. That that this is weird.

Speaker 2:

Yeah. That's a

Speaker 1:

laugh track. Yeah. We've got laugh tracks around here. But, yeah, part of why the whole thing was so controversial is, like, the average new CS grad, even from elite universities, is not having the easiest time in the job market. Mhmm.

Speaker 1:

So it's such it's such a, you know, it's gotta be disappointing for them. But but clearly Soham has cracked the the entire flow.

Speaker 2:

The interview code.

Speaker 1:

So

Speaker 7:

He's doing pretty well. He's figured it out, so he should he should monetize that. It is it is a monetizable skill.

Speaker 2:

Interesting. Well, thanks for hopping on. This is fantastic. Good to get the other side.

Speaker 1:

Thanks, everyone. Enjoy the Talk to you soon. Cheers.

Speaker 2:

Bye. We have to sing her a song.

Speaker 1:

Birthday to you. Happy birthday to you. Happy birthday dear Emily. We're hitting the size

Speaker 2:

gong for you. Congratulations.

Speaker 1:

It was yesterday.

Speaker 2:

It was yesterday. But

Speaker 1:

it's July 1.

Speaker 2:

You just didn't come on the show yesterday. So welcome.

Speaker 9:

I would have come on yesterday if you asked.

Speaker 2:

We should We should have.

Speaker 1:

We could have had a cake. We could have had a cake.

Speaker 2:

What did you do for your birthday? I

Speaker 9:

had a few meetings. I went to the YMCA. It was empty which was nice. Mhmm. Then my husband took me to dinner.

Speaker 1:

Oh, that's nice. Yeah. YMCA, underrated. You're you're My

Speaker 8:

is nice.

Speaker 9:

In this New York weather, it's one of the only places blasting that air conditioning.

Speaker 2:

So Oh, interesting.

Speaker 1:

There you

Speaker 2:

What what is the gym tier list look like in New York today? You know, Equinox at a moment? Lifetime's making moves.

Speaker 9:

Lifetime's making moves. So I kind of do a high low thing. I live in Park Slope, so the the YMCA makes a lot of sense for me, but I'm also a member at a club called Casa Cipriani with Oh, okay.

Speaker 2:

Very small members club. Yep. Yeah. It's fun.

Speaker 1:

The high low. The high low. John's in favor of that too. We just started going to a new gym. But John, like, actively wants to be kicked out so he can go back to Gold's Gym.

Speaker 1:

I'm like, no, we can't do this.

Speaker 9:

There don't seem to be as many YMCAs on the West Coast.

Speaker 2:

No. No. Not that many. Lots of 24 Hour Fitness.

Speaker 9:

Lots of Gold's

Speaker 2:

Gyms. I mean, Gold's Gym, it's the staple of Venice. It's where everyone works out. You know, RFK's out there.

Speaker 2:

It's a good crew.

Speaker 1:

John John thought Equinox was like a $75 a month gym while we were actively going there for a brief period. Yeah. And I just loved how out of touch you are. You were.

Speaker 2:

It felt like premium, you know. Yeah. Ultra premium. It's the besties all in tequila level.

Speaker 1:

Yeah. The location matters a lot.

Speaker 2:

Yeah. What else is new in your world?

Speaker 1:

Yeah. Give us a give us a a Hamptons update. People were

Speaker 2:

I assume you're outside the city right now.

Speaker 9:

Well, look, yeah. I got out here this morning. I don't wanna brag. I recently inherited a car from my late grandfather, a 2006 Camry.

Speaker 2:

Okay. There

Speaker 1:

you go. Iconic era of really great vintage. Very reliable.

Speaker 9:

Yeah. Awesome. So I drove out here today. It's crowded. It's really crowded.

Speaker 9:

I'm hoping that this sort of, level of crowds is due to the holiday weekend, but it might be a mistake that I I rented a house out here, but we'll see. What else is new since two weeks ago?

Speaker 2:

Is there

Speaker 1:

Well, so is there craziness broadly? Yeah. People a month ago were saying that rentals were down massively year over year. Does it feel that way?

Speaker 9:

Yeah. I I don't know how much of that was because of weather or or or people's finances. I think some people were less motivated to, like, make the move on booking a house because we had really bad weather the past few months in New York. It was really rainy and gross, and it was sort of a late start to summer. Mhmm.

Speaker 9:

But I'm sure people's financial situation

Speaker 1:

There was also a trade war that's still ongoing. Right. And if your portfolio's down tremendously, you're not exactly saying, yeah, I'd love to spend, a 100 k for two weeks.

Speaker 2:

Yeah. I had a friend who was dating a Wall Street, like hedge fund guy and, she was like, yeah, it's going so great. Like it's amazing. And then like a couple weeks later, she's like, yeah, he's just like really weird. Like I don't think it's gonna work out.

Speaker 2:

And I was like showing her the stock market chart and being like, so his emotions perfectly map to the 10% sell off that we just saw. You're telling me that this guy his entire his entire personality is derived from The

Speaker 1:

state of the market.

Speaker 2:

The state of the market and she was like, oh, this makes so much more sense now. Okay.

Speaker 9:

Yeah. I'm sure there's a lot of women who went through similar situations as

Speaker 2:

her. The kangaroo market's not good for the dating scene. Is that crazy white party going on? That's a Hamptons thing. Is that a July 4 thing, or is that canceled?

Speaker 2:

I thought Michael Rubin was, like, pulling back from that because it was, like, maybe too flashy. Is there an update there?

Speaker 9:

I I I will see. Is that supposed to be July 4? I think that's

Speaker 1:

wearing she's like, white party, like What are

Speaker 2:

you talking

Speaker 9:

about? I was like,

Speaker 1:

you did. Can't comment.

Speaker 2:

No. Not the diddy one. Oh, no. Oh, yeah. Yeah.

Speaker 2:

I think that might have been a factor in in in pulling back for that. But but but I always see, like, everyone's, like, spying the watches, spying, like, you know, what cars people showed up in, all

Speaker 9:

this stuff. There's a lot of paparazzi. There's a lot of TikTokers.

Speaker 2:

So we

Speaker 9:

see everything happening in real time.

Speaker 2:

Interesting.

Speaker 9:

Nobody can really, like, leave with another man's wife anymore. Like, there's so much coverage of these events now. Really takes away

Speaker 2:

The mystique. The mystique. Somebody might might

Speaker 1:

have... What's up with this place, Drugstore? You were covering it. Let's try

Speaker 9:

Oh, yeah. This is awful. Bloomberg broke news yesterday that there's a new smoothie pop-up chain from, like, a Jay-Z-invested celebrity chef.

Speaker 2:

Okay.

Speaker 9:

And he's doing these, like, very Instagrammable smoothies, which obviously Erewhon started, but now there's a lot of places that are kind of hopping on the bandwagon of treating, like, smoothies as a billboard. Like, brands can collab with them, and then whoever's serving these smoothies is, like, making extra money from an advertising business, which is crazy. Like, I think to get an Erewhon smoothie for your brand, it's, like, 200 k for a month or something.

Speaker 2:

We have to do this. Yeah. We're we we we did this. You say that's crazy.

Speaker 9:

Don't have to do

Speaker 2:

worth every penny for the TBPN Feed Me Erewhon smoothie.

Speaker 1:

You're telling me we could get that out.

Speaker 9:

No. You should do a drink at, like, the US Open or something. Like, get don't do the smoothie.

Speaker 2:

Don't do the smoothie. It's played out?

Speaker 1:

Yeah. It's played it's played out. Okay.

Speaker 2:

Didn't Erewhon have a collab with, like, Volkswagen or something, or some really funny, like, car company?

Speaker 9:

Yeah. You're right. They did do a car collab. I think they also do, like, beauty brands, like sunscreen and stuff. Like, it's not

Speaker 2:

Yeah. Chevrolet. They did a Chevrolet drink collab. And the worst part is that I wouldn't have had a problem if they did the ZR1, like, the really crazy Chevy sports car, but they did the 2025 Equinox EV. You're

Speaker 9:

doing smoothie.

Speaker 2:

According to the manufacturer, the collaboration combines Chevy's commitment to emission-free vehicles and Erewhon's mission to promote sustainability as a merchant of organic foods and wellness products. The drink, called the Electric Juice, consists of... chuchu? Chocho? I don't even know what this is, which is the most protein-rich plant source, and blue spirulina, the superfood that matches the Equinox EV's blue color. The drink's ingredients are designed to energize and recharge consumers, according to several

Speaker 7:

of the

Speaker 1:

The winner here was the ad agency that got paid For sure. Half a million dollars

Speaker 9:

to stuff.

Speaker 2:

I don't know. I feel like I feel like everyone's getting the Hailey Bieber smoothie. I'm about to rock the Equinox EV smoothie. I think it has a

Speaker 9:

certain cachet. Like, they're, like, 70 grams of sugar. It's like four Cokes.

Speaker 1:

I'm long I'm long sugar though. Four Cokes is a lot.

Speaker 2:

That is a lot. That is

Speaker 1:

a... But I'm going long sugar. I did an experiment. My wife was very afraid of

Speaker 2:

the... It's the worst experiment you've ever... Continue.

Speaker 1:

Yeah. But anyways, people are very afraid of sugar. To prove a point, I drank a soda Yeah. every single day. Yeah.

Speaker 1:

I drank a Coke every single day for six months straight. I know that's You're

Speaker 2:

young and you work out every day and like Yeah. There's like

Speaker 9:

a million ways

Speaker 2:

to offset

Speaker 9:

that. It wouldn't be good.

Speaker 1:

No. A lot of a lot of people think that.

Speaker 2:

No. I agree. It is overdone with sugar. People think, like, a single Coca-Cola is terrible. Yeah.

Speaker 2:

Of Of course. Everything's in the the dose is the poison with all of this stuff.

Speaker 1:

Yeah. You could drain a full gallon of milk. So this week on our side, we've been covering the AI talent wars, how basically AI researchers are being comped almost like pro athletes, with these sort of 9 figure massive offers. The reason I bring it up is because you obviously cover a lot of the media landscape, and we had Derek Thompson on, who left The Atlantic. I imagine that he's already, like, run-rating well beyond.

Speaker 1:

I I don't know what his metrics are. But I would imagine he's already

Speaker 9:

on Substack. He's catching up to me.

Speaker 2:

Oh, really?

Speaker 9:

Yeah. It's a problem.

Speaker 1:

Go subscribe to Feed Me right now and help win the race. But... Well, he's winning the... I was curious, it feels like if legacy media companies, I won't call anyone out, want real attention, not just the sort of attention that comes from their logo to some degree and the prestige, but, like, actually quality content, at some point they'll have to, like, kind of reevaluate their comp structures.

Speaker 1:

I think the Atlantic was like 300 k was being reported as like the kind of range. And when you can go on Substack as a superstar writer and immediately be making somewhere in the 7 figure range at some point or another, it just becomes really difficult.

Speaker 9:

I see traditional media companies as more likely to start doing, like, handshakes with Substack writers than to completely restructure how they're paying their staff.

Speaker 1:

I don't know you're seeing So, like, syndic syndication, basically.

Speaker 9:

Or, like I mean, I've talked to so many editors from traditional magazines and papers who are, like, trying to figure out how they can work with Feed Me. Is it, like, a copublishing thing? Is it, like, creating a podcast together? And it's it's been very interesting to experience firsthand to, like, talk to these people because I don't I don't really need that. Like, Derek doesn't need the Atlantic because the people who are reading him are reading him for him, not the whole surrounding.

Speaker 2:

Yeah. I mean, just to set the table for the folks who might be listening, when Emily comes to the show, we pay her a $100,000 for her appearance. So just just to say, kinda set the bar just so

Speaker 1:

And it ramps up

Speaker 2:

over Yeah. Yeah. It obviously ramps up. But but, yes, I am interested in, the dynamic of, like, you can still go and publish in in traditional media. Is that just like freelance?

Speaker 2:

What is the difference between

Speaker 9:

What do you mean like when I write for GQ?

Speaker 2:

Yeah. Exactly. Are are you a columnist?

Speaker 9:

I'm doing that because I adore my editor there, and I think it's just, like, working a different part of my brain. Like, it's a gift to get to be edited by GQ's editors.

Speaker 2:

That's cool. Yeah. Do you, so are are you a columnist over there, or are you a contributor? Like, how does with the actual shape, do you get assigned things? Like, I've had reporters reach out to me and be like, I was assigned a story on nicotine, so I need to talk to you as an expert or whatever.

Speaker 9:

So with GQ specifically, my editor Dan, when I wrote the Zyn story and when I wrote the story about members clubs last year, both were because I've written about both of those topics in my newsletter. And Got it. He was like, I think it would be interesting for you to expand upon this in, like, a larger feature in a print issue. And that's how that's happened. But, like, I'm at the point now where if I have an idea for a great story that I wanna write, like, it doesn't make sense for me to pitch it to the Times or to New York Magazine, because I can get it done faster on Feed Me.

Speaker 9:

Sure. I can outsource an editor if I need that. I can outsource legal services if I need that, and it it's better for me to get to be, like, breaking those sorts of stories. You know?

Speaker 2:

Yeah.

Speaker 9:

Yeah. I just have, like, a good thing going in It's kind of fun.

Speaker 2:

That's cool. I wanna bounce an idea off you. There is this narrative that, like, when you go direct, when you're on Substack, like, it has a different texture, a different, different vibe, different maybe more objective, maybe more pro tech or pro, you know, creator or pro whatever the topic is being covered. I think that it's less about the personalities or the views of the people and maybe just more about the economics that essentially when you're in legacy media, you basically have a salary cap. And when you're outside of it, you have no salary cap.

Speaker 2:

And that actually defines the economic terms shape this the type of content more than than anything else. Do you think that's reasonable? Do you disagree? How how would you wrestle with that?

Speaker 9:

Can you can you keep, like, expanding on that? I I want it

Speaker 2:

So I think that there's something where if you're working at a particular outlet and there's some sort of, like, salary cap, it's like, imagine if there were two NBA teams playing against each other, and one had a salary cap of $300,000 per player, and the other had no salary cap. Like, who would you expect to produce better basketball? Who would you expect to produce better content when you have one where a person can make 10,000,000 or Joe Rogan can make a 100,000,000? Like, you're just going to attract the absolute top because that's where

Speaker 1:

Yeah. The

Speaker 9:

That's right.

Speaker 2:

They're yeah. And they're extremely incentivized to just work extra hard because if they work if they work a little bit harder and they compound a little bit more and get that extra guest and write that extra piece, it could not it it doesn't mean, oh, here, you got a $5,000 Christmas bonus. It means Yeah. Spotify signed you to a $100,000,000 contract. Yeah.

Speaker 2:

And it's the difference between, like, when LeBron James shows up at midnight to shoot more free throws, like, that could be the difference between, like, a $50,000,000 contract and a $100,000,000 contract.

Speaker 9:

So I

Speaker 2:

think going the extra mile means, like, better content.

Speaker 9:

Yeah. It's really interesting to see so many traditional journalists, you know, give Substack a shot, and, like, they just plateau.

Speaker 2:

Yes.

Speaker 9:

But I'm like I'm an animal. Like, when my eyes are open, I'm working. Like, this letter goes out every day. It's I'm marketing it. I'm editing it.

Speaker 9:

I'm pitching stories. But I've always had a really high tolerance for being told no. Like, when I was getting paid 60 k at a job and when I'm making, like, 10 times that now. Right? Like so I think that Substack and these sorts of platforms, even what you guys are doing, also attract personality types that are had less boundaries and rules in the ways that they work.

Speaker 9:

Yeah. And then when people who who don't think like that think that they can find the same success, like, get disappointed or confused or think that it's rigged or something like that.

Speaker 2:

When really they're

Speaker 5:

just gonna occupy

Speaker 9:

I mean, going above and beyond for a job where the cap might be, like, what did that New York Magazine story say about The Atlantic? 300K? That's what they're paying their journalists? Like, okay. So you're gonna go as hard as you can to make 300K and get those benefits and whatever.

Speaker 9:

But the other thing that I'll say, like, is you have to be willing to not have insurance. You have to be willing to Mhmm. You know, have have months where you're you're not having, like, the support of a team. And it's just, like, a very specific type of personality. But I think you're totally onto something.

Speaker 9:

Like, I think I'm playing a different game than my peers who are working in an office at a at a magazine or a newspaper.

Speaker 1:

Yeah. Yeah. Also, the insurance thing is real, but it's also a problem that is very quickly overcome. You can quit your job and say, I'm going to pay out of pocket for insurance for two years, and if this doesn't work out, then I'll get a role back. I'm curious, you were commenting on Vogue partnering with Nutrigrain bars, and I'm curious if you think, as people are so used to influencer- and creator-led advertising, like, if you were to partner with a brand and people are like, wait, like, what?

Speaker 1:

Like, why did she partner? Like that makes no sense. Like that's clearly a cash grab.

Speaker 2:

Yeah.

Speaker 1:

That that feels like it seems like legacy media brands could get away with that kind of thing for a long time. Even though we're saying, you know, why is

Speaker 2:

Nutrigrain needs better marketing. Because the first thing I think about Nutrigrain

Speaker 9:

think we can shut down.

Speaker 2:

Yeah. I always think about that Nature Valley meme where it's like effing crumbs everywhere. I don't know if you've seen that, but that's not Nutrigrain. And so my biggest association with

Speaker 9:

Nutrigrain Fig Newton.

Speaker 2:

They gotta work on that. Yeah. It's like it's a is it a Fig Newton or is it the crumbs one? Like, I just know it's not good, so they gotta step it up. But, yeah, the Vogue audience I mean, why why do you think that there was ever a pitch where it's like, okay, we're gonna go after the moms that read, Vogue and get them to buy it for the kids?

Speaker 2:

And so it's some sort of like bank

Speaker 9:

shot strategy? For me, what stopped me in my tracks is that it wasn't a banner ad in a newsletter. It wasn't, like, somebody writing about their wellness routine sponsored by Nutrigrain. It was a produced video of a stylist sort of, like, gallivanting through the streets of New York City, and I really like this girl, Michelle.

Speaker 2:

Yeah.

Speaker 9:

When I saw her post it, and I saw that it was between Vogue and Nutrigrain, I was like, who was in the room? Who sold this ad, how much was it for, how much did Michelle get, and how much did Vogue get? Like

Speaker 2:

should be big.

Speaker 9:

Very confusing to me because there were there are other ways that that could have been executed and made a lot more sense. And they probably didn't think that somebody would screenshot it and start that conversation.

Speaker 1:

But, unfortunately, it was started. Yes.

Speaker 9:

Yeah. There's so many media reporters. If I didn't pick up on it, somebody else would've. But I think that Conde Nast overall right now is on life support. Like, we had this conversation when Vanity Fair is looking for their next editor.

Speaker 9:

Unfortunately, like, it's happening immediately right after that with this new Vogue editor. Mhmm. It's a lot of pressure for, like, I don't know what. Like, a few more years of running this magazine? It just doesn't really seem like I don't know what I would look to Vogue for right now.

Speaker 2:

Yeah. Yeah. It kinda goes back to that idea of, like, the independent media creator versus the legacy media creator. Like, we just released an ad for Wander. They barely even asked us to make it, and we just wrote it, shot it, directed it, edited it.

Speaker 2:

We just did everything. We sent them the final thing and they were like, yeah. Cool. And so The brand carried through

Speaker 1:

There was like a second that they asked for a tweak.

Speaker 2:

Yeah. They were like, actually, our font is this one instead of that one. And that was it. And so, like, that is the type of, you know, entrepreneurial energy that I'm sure the particular partner who worked on it, who actually starred in the commercial, would bring to a partnership, and select and coordinate with a brand that was actually aligned. But there were just, like, too many spreadsheet monkeys in the office that day, and so it just went

Speaker 5:

on what

Speaker 9:

I was like, why wouldn't you just hand these to the hungover girls at Barstool and, like, have them talk about how this is, like, the hangover snack. Like, that,

Speaker 4:

I think,

Speaker 9:

would have moved the needle more

Speaker 7:

than Totally.

Speaker 9:

I don't know. But

Speaker 2:

When I think of logical Vogue partnerships, I think of, like, luxury brands. Are luxury brands pulling back from that type of partnership? Like, have you seen anything there? Like, what are the innovative things that you're seeing from, like, the really top-tier brands out there? Have you seen anything

Speaker 4:

cool?

Speaker 2:

What I

Speaker 9:

saw today? Do you know those David bars, like the metallic?

Speaker 2:

Yeah. We had them on the show.

Speaker 9:

I love them. Yeah. I saw that Balenciaga, at their show this week, somebody posted a photo of, like, Balenciaga bars. Like, sort of metallic

Speaker 2:

Okay.

Speaker 9:

Metallic label. I took a photo of it. I'll send it to you guys after this.

Speaker 5:

That's very cool.

Speaker 9:

So, mean, like, is that going to move their their hoodies? I don't know. But, like, that's those sort of fun, weird moments are more exciting to me, if we're talking about nutrition bars.

Speaker 2:

Yeah. I mean, it certainly shows that, like, Balenciaga is still, you know, hip or just aware, because, like, Peter Rahal, he's kind of, you know, known in the consumer packaged goods world and a little bit in tech. And some people know the story of RXBAR and then David with the revenue ramp. Like, the business is doing well, but it's still, like, a complete insider industry story. Like, the average person on the street doesn't know that.

Speaker 2:

So Balenciaga is kinda signaling a little bit to that.

Speaker 9:

Yeah. Hey. We're cool with the new business community. And I'm sure that that took them, like, a couple thousand bucks, like, put together. Like, you know, it's not difficult.

Speaker 9:

As far as what what luxury brands are doing, I think it's a lot of events now. Like, they're putting a lot of money into events. Like, I was just at a Chanel event. They do a lot of programming with Tribeca Film Festival, which is cool because everybody's in the city anyway. So they they throw events and make sure that people show up to that.

Speaker 9:

Mhmm. I think that a lot of my social calendar in New York right now is dominated by brand events. It just seems to be what's moving the needle because people go and they get photos of themselves and they post it, and it's like Mhmm. The brand is the background. So you end up being like, New York just feels like a billboard every every time you go out now because everything is a brand event.

Speaker 9:

It's, like, dominating how people are organizing their social calendars.

Speaker 2:

I don't go to events. I don't go outside. I just stream. What makes for a good brand event? Like, is this just a happy hour?

Speaker 2:

Is that what we're talking about? Is there dancing?

Speaker 9:

Bad brand event.

Speaker 1:

It's actually interesting. The thing I'll say, we were talking about SVB. They famously threw dinners. Sure. Like, there's an entire founder dinner industrial complex, just, like, financial companies that throw these dinners. The funny thing is that within tech, the companies have historically failed to make them social moments.

Speaker 1:

So it'll like, it'll say the name, it'll say SVB Yeah, on the

Speaker 9:

and people

Speaker 1:

aren't like taking a picture of the menu as much as they're just like taking pictures. So I think what the luxury brands do well is you're saying, hey,

Speaker 2:

we're gonna invite Step and repeat.

Speaker 1:

200 people into the space with a professional photographer. And if you post any photos at all, we're

Speaker 9:

gonna have a moment. Yeah. I threw a brand dinner, or a Feed Me dinner, with my friend Paul from The Infatuation last year, and Andrew Ross Sorkin was sitting next to me. I gave him a Feed Me keychain. That was my brand moment. That's a great moment.

Speaker 9:

So that's another thing. Like, at the those dinners are

Speaker 2:

off Yeah. How are brands being creative? Are we seeing, like, ice sculptures or step and repeats? Like, like, how how can brands stand out with one of those?

Speaker 9:

Like, the worst possible brand event is a DJ and free drinks. Like, just people standing in front of, like, a 22-year-old DJ with, like, free drinks. And then more innovative things are, like I think when you give people an activity to do, like, if you are doing, like, bowling and making it, like, glamorous. Or, like, you know, now, if brands are doing a collab with, like, an athletic brand, they'll take a bunch of women to, like, tennis courts or a golf club in Connecticut, and, like, that's a photo op for everybody involved. So, like, these mini expeditions or Yeah. Like, J.Crew last year took a bunch of people on a gorgeous sailboat in the Hudson River off of, like, the seaport.

Speaker 1:

It's basically like take your customers on a date. Right.

Speaker 6:

Come on

Speaker 9:

an adventurous date. Give them a photo. Yeah. Yeah. Exactly.

Speaker 9:

And then if you level up, then you're you're gonna start talking about, like, brand trips. Yeah. Like, Hermes took people to Aspen last summer and yeah.

Speaker 2:

I mean, a ton of VC firms do that all the time. Come with us and we'll do a track day or movie night, or we'll go out to, you know, some place, some ranch, and, like, shoot shotguns and have dinners, and it's a whole weekend.

Speaker 9:

Where's my call? I want to do that.

Speaker 2:

They should get you on the calls. Introduce

Speaker 1:

you to A postmortem on the Democratic primary, but just on the marketing, right. There was a lot of coverage of Zoran's, basically, vertical video making abilities that seem to have played a key role in the campaign. I'm curious.

Speaker 2:

So are you talking to any, like, TikTokers or content creators that are like, I want to build a business around this for politicians? That would be interesting.

Speaker 9:

Yeah. I had a friend who I used actually used to work with at Conde Nast who worked in the White House for Biden, and I think that she helped out with Zoran's campaign. Like, there are people who know I think like you have to be excellent at that job if you're working

Speaker 2:

Yep.

Speaker 9:

For a political camp political campaign, especially in the White House because the turnaround times are insane and the levels of approvals that they need to go through are crazy as they should be. So I think there's like a mini network. I mean, there's there are a lot of people that worked in the White House under Biden who used to work at Instagram. Like, they were recruited straight from Meta. So that's definitely a thing.

Speaker 9:

But I think that organizing is a tremendous amount of work and like part of that is digitally now. Like, and and Zoran did that really well.

Speaker 1:

Yeah. Yeah. It's interesting to think about how the presidential election felt like the podcast election. Yep. And then the the this primary felt like the short form election, and that sort of it makes sense especially as you're trying to just appeal to younger and younger voters against

Speaker 9:

I mean, his team invited me to start covering his campaign back in February. Like, I was at a party, I think, in March for him. But he also went on, like, Throwing Fits and a few other podcasts, and he did a Q&A with me for Feed Me. So his team knew where to act.

Speaker 2:

Where to go? That's good. We've been talking a ton about, the AI talent wars. There's a lot of people in Silicon Valley making a $100,000,000 for in signing bonuses. Silicon Valley notoriously bad at spending money.

Speaker 2:

New York, famously good at spending money. What are some recommendations that you can give to these PhD researchers that have a $100,000,000 burning a hole in their pocket? They wanna get in the game. They wanna step up their fashion, their cars, their houses. What would you recommend?

Speaker 9:

It's funny that you go to fashion and cars, because I think there's a real Luigi Mangione situation on our hands here with this kind of job. So I feel like you need a security guard. Like, an awesome security guard.

Speaker 2:

For sure.

Speaker 9:

Yep. Plain clothes security guard, which means like those tight Lacoste polos. You know?

Speaker 2:

Okay. Classic. Maybe they should be carrying too. You know, you've gotta be everyday carry now if you're making a 100,000,000

Speaker 1:

it's gotten more lenient.

Speaker 2:

Yeah. Yeah.

Speaker 9:

So I I feel like you have to staff up in a in a position like this. Like Yep. You want a driver who's trustworthy and isn't gonna, like, leak your information, a personal trainer

Speaker 2:

Okay.

Speaker 9:

Who's trustworthy, a Feed Me subscription.

Speaker 2:

Of course. There we go.

Speaker 9:

And then I think it's, like, removing friction from areas of your life with, like, staffing up like that. Mhmm. I recently learned that this neighborhood in the Hamptons called Sagaponack is a microclimate, and I feel like that's a good place to invest your money. Wait.

Speaker 1:

Break that down a little bit better a little bit more. It's just it's just on an average day, it's nicer there?

Speaker 9:

I think it's From a water standpoint? Nicer. I think it might be, like, wetter and more humid than other areas. Like, I am in a different neighborhood than that. And if you were there right now, it would feel different than here even though it's not that far away.

Speaker 2:

Yeah. Yeah. Like, you could be like, Long Island as a whole is hot today, but Sagaponack could have a different weather report because it's this micro subclimate

Speaker 5:

Because of the dunes

Speaker 9:

and the tides and the trees and

Speaker 2:

There's different parts of LA that have the same kind of dynamic as

Speaker 9:

well. I mean, isn't San Francisco like that too?

Speaker 2:

Yes. Yeah. Famously. That's why in the summer, it'll still be it'll still be foggy and stuff. But then it's really warm later in the year And

Speaker 9:

so Right. Yeah. So I feel like investing in a microclimate is a good way to spend your that just sounds wise.

Speaker 2:

Yep. They're not making any more of them.

Speaker 1:

They're not

Speaker 9:

And then I think that you need to use your money to start taking some experts to dinner, because you can't just start buying watches and clothes. There's a really high chance that you'll miss and go for, like, trendy items. So I'd try to hang out with, like do you guys know the watch dealer Mike Nouveau? Like, somebody like him, or, you know, like, take him

Speaker 1:

out to dinner and talk

Speaker 9:

to him and and get some advice. I bought a watch from him last year, and he seems like he knows his stuff.

Speaker 2:

What'd you pick up?

Speaker 9:

I got my husband a nice watch.

Speaker 2:

Okay. Let's let's go. There we for the

Speaker 1:

Two size gongs, one interview. That's how you know it's a good one.

Speaker 2:

And a gift.

Speaker 9:

And maybe like befriend Chris Black or like another stylish guy and get him to advise, consult on the closet.

Speaker 1:

Can you get us Chris Black on the show? Because the only podcast merch I've ever bought was their merch that they released Yeah. Where it was a cease and desist from The New York Times. Got it. Because they, like, copied the cover art, printed it, made a shirt, sold it.

Speaker 1:

That's great. It was a it was a work of art.

Speaker 9:

Chris would definitely come on. I'll connect you guys. That'd be awesome. Yeah. I think that the biggest mistake that people who come into a lot of money quickly make is just, you know, doing that automatic Apple Pay while they're shopping, and then you end up with stuff that doesn't fit you or looks horrible or something.

Speaker 1:

Yeah. So stylists, tech.

Speaker 9:

Stylists. But more of just, like, paid friends.

Speaker 2:

You know? Friends.

Speaker 9:

Take take people out to dinner, host parties. Like, keep them around. You know?

Speaker 2:

Do you do you think it's good to surround yourself with Yes Men?

Speaker 9:

What do you what's how do you

Speaker 2:

So if you surround yourself with yes men, then whatever ideas you have, all your yes men will just tell you, yes. That's the best idea ever. A lot of people say, don't do it. But maybe

Speaker 9:

it I would not do that.

Speaker 2:

Everyone around me has been saying that it's a good idea.

Speaker 9:

No. I would not. That's not good.

Speaker 2:

Okay. So stay away from the yes men.

Speaker 9:

Yeah. You

Speaker 1:

want yes people on your team where you say, wanna do the impossible and they just Yes. Say yes. We're gonna do it. But, there's there's a line, you know, you don't you don't wanna cross it.

Speaker 2:

Well, thank you so much for stopping by. This was fantastic.

Speaker 9:

Thanks, guys.

Speaker 2:

See you soon. So send us the invoice for the 100 k appearance fee, and, we'll look

Speaker 9:

forward Yeah. To doing it

Speaker 1:

Send the next view to

Speaker 9:

Happy fourth of July. Yeah.

Speaker 1:

Happy Fourth. Fun. We'll talk soon. Bye. Other news.

Speaker 1:

The Senate passes Trump's megabill, the big beautiful bill, after an all-night session. They went all night.

Speaker 2:

I like that.

Speaker 1:

And it's gotta get through the house still, but Elon is fuming.

Speaker 2:

Play the Ashton Hall sound effect for

Speaker 1:

He is absolutely fuming.

Speaker 2:

For the bill. It's got It's absolute war on the timeline. The timeline's in turmoil because Elon Musk said, anyone who campaigned on the promise of reducing spending but continues to vote on the biggest debt ceiling increase in history will see their face on this poster in the primary next year.

Speaker 1:

Pinocchio mode.

Speaker 2:

Liar. Voted to increase America's

Speaker 1:

debt. Easier to make your enemies look like Pinocchio.

Speaker 2:

Yeah. We but but I feel like if you put, like I don't know. I I don't even know who voted for it. Like, let's just say, like, Ted Cruz or something. If you put, like, Ted Cruz's face on here, like, are you going to lose the Pinocchio ness?

Speaker 2:

I

Speaker 1:

feel We'll find a way. I feel like it

Speaker 2:

could be a little bit a little bit rough. Anyway, Elon's upset. He's trying to start a new party.

Speaker 1:

The America Party?

Speaker 2:

He said Vox Populi, Vox Dei. He wants to start the America Party.

Speaker 1:

And he is pissed because Polymarket Yes. Is expecting the reconciliation bill, the big beautiful bill, to pass

Speaker 2:

by July 4. And 62% oh, yeah. 62% by July 4, and almost certainly by the end of the month.

Speaker 1:

So this will be the one to watch. They're voting I guess the first vote is tomorrow. Yeah. We'll be following that.

Speaker 2:

Yeah. Be interesting. Lots of pork. But if you're AGI pilled, it doesn't matter. Yeah.

Speaker 2:

We are gonna be printing 10% GDP growth, folks, in just a few thousand days when superintelligence arrives. These debts

Speaker 1:

are Where weeks matter. Not days away. Many peep many people are saying And so if

Speaker 2:

you're if you're against this bill, you're just telling on yourself about your AGI timelines. You're like, oh, I don't really think superintelligence is coming.

Speaker 1:

Yeah. It it is, I think Thiel was pushing, generally pushing on Elon a bit in the interview he did last week around like, okay, so are are humanoids are going to be important? You're obviously making them. Are they going to impact the economy? Are they gonna accelerate growth?

Speaker 2:

Yeah.

Speaker 1:

You know?

Speaker 2:

This is the post nineteen seventies stagnation. You know, we've been growing at 2%. Satya Nadella is asking the question of will AI accelerate GDP growth? Because if if GDP growth accelerates a lot, it changes the dynamic around everything. Like, not just this debt, which we've been kind of lightly joking about, but also,

Speaker 1:

like, redistribution. To his credit, he's also said, when he initially exited DOGE Yeah. And just stopped being a special government employee, he did say the only way out is growth. So he knows this too. Yep.

Speaker 1:

But I think he can hold both of these ideas, which is we need to grow, but he's also Yeah. Frustrated.

Speaker 2:

The Hegelian dialectic. Of course. Yeah. I mean, if GDP is growing really, really fast, you could even keep tax rates the same or lower them and still pay down the debt and do more redistribution. Like, you could have, like, something that feels like a massive social safety net, something that feels like UBI or socialism, but actually tax people less because the growth is so significant.

Speaker 2:

So it all changes once superintelligence arrives.

Speaker 1:

Maybe maybe Zoran is just so AGI pilled himself That's true. That he's like, yeah. We're gonna have plenty.

Speaker 2:

Yeah. Plenty to go around.

Speaker 1:

We're gonna have such high levels of production. Yeah. Sure. The state can just seize a little bit.

Speaker 2:

Alternatively, maybe he's a hardcore Bitcoiner. And so if he wants if he wants there to be no billionaires, no. It is impossible mathematically to be a billionaire in Bitcoin because there's only 21,000,000 of them. So the richest you could possibly be is a millionaire if you owned over 5% of the Bitcoin supply. Right?

Speaker 2:

So there will never be a Bitcoin billionaire because there aren't a billion of them.

Speaker 1:

Well

Speaker 2:

Whereas there are billions of dollars.

Speaker 1:

Maybe he doesn't want

Speaker 2:

people in the network

Speaker 1:

to vote to expand the supply.

Speaker 2:

Yes. So he's probably against the expansion of the Bitcoin supply, but maybe he's a hardcore Bitcoiner.

Speaker 1:

He's a hardcore Bitcoiner who's against inflation.

Speaker 2:

Yes. Yes. Exactly. Zach Kukoff. Thanks for having me.

Speaker 1:

Live from Washington.

Speaker 2:

Washington correspondent.

Speaker 6:

Washington. Even set up I don't know if you can see. That's

Speaker 2:

That's great.

Speaker 6:

Oh, that's the, monument right there.

Speaker 2:

Wow. No way.

Speaker 1:

There we go. Beautiful. There we go.

Speaker 2:

That's great.

Speaker 1:

We'll just have our production team zoom in zoom in just on the monument so you won't be on camera.

Speaker 6:

Perfect. Just I'll get out of the way, but you can see the good stuff behind me. That's the important fact.

Speaker 2:

So, yeah, break it down for us. The Wall Street Journal has a story: Senate passes Trump's megabill after all-night session. Tax cuts and Medicaid spending reductions head to the House with a July 4 deadline reachable. What's interesting in the bill?

Speaker 2:

How likely do you think it is to pass? What are the major reads on this? And then we wanna get your your takeaway from, like, where the dividing lines are around the bill.

Speaker 6:

Yeah. So there's a couple trip lines. The first trip line, I would say, you guys have been tracking the moratorium, the ten-year moratorium on AI development, on AI regulation. First it was in.

Speaker 6:

Ted Cruz put it in, then it was out, then it was back in. There was a compromise. There was a five year agreement instead of a ten year agreement, and Blackburn has been sort of the key driver of getting it back out again. It's out. Now it was done 99 to one in the senate, almost unanimous in the senate to get this out.

Speaker 6:

This was a big win for Dario and Anthropic, probably the number one beneficiary. And by the way, somebody who certainly used a fair bit of political capital pushing for this. That's fault line number one. I would say fault line number two,

Speaker 2:

Medicaid Wait. On that. It's a potential ban on AI regulation that was removed. And so the current bill does not have a moratorium on regulation. So it's kind of a double

Speaker 6:

negative. A federal moratorium on state regulation. That's right. So you can have California Yeah. States, whatever, who want to Marsha Blackburn, her home state, Tennessee, passed last year something called the Elvis Act, which was cutely named.

Speaker 6:

But it basically said, look, if you are building, like, you know, an AI audio business, you cannot use any likeness of a real performer's sound. So if you wanna have John singing the blues or Jordi singing the blues, you cannot do that without a prior deal. That's because in Tennessee, Nashville is a dominant industry. The music industry is a dominant industry.

Speaker 6:

And so part of Blackburn's opposition to the moratorium, which would have prevented bills like that from taking effect, is that it ended up superseding the work that her home state has done to protect its dominant industries. By the way, net net, this moratorium, the time that they got to a compromise was so watered down, it really wouldn't have impacted Tennessee or California anyway. But it was a very dramatic swing from in, out, in, and now out once again for good.

Speaker 2:

Mhmm. And and

Speaker 1:

and What do

Speaker 7:

you what

Speaker 1:

do you expect at the state level around AI safety going forward? Is the fact that this was removed gonna inspire certain governors to, you know, make this a part of their platform?

Speaker 2:

And I guess, like, I wanna know more about, like, if I'm Eleven Labs and Yep. I'm training a voice agent and I can't be sure that some clip from, you know, Elvis or, you know, some Justin Bieber worked its way in there. Does that mean that I just turn off Tennessee at, like, the DNS level or do some geofencing? Or is it riskier than that, where I have to comply kind of at the national level and they're kind of forcing me to comply everywhere, or can I just pull back out of that state? Because we've seen this with Apple Intelligence not being available in China or in Europe, for example.

Speaker 2:

But, yeah, what what would the impact on something like that be?

Speaker 6:

Well, think about states in two categories. Right? There are some states that are so large. And by the way, the analog that I would think about here is what happens in the textbook market. So if you're a textbook publisher and you wanna write some textbook about biology or whatever, if one of the largest states in the country, that's category one, has a law mandating something needs to be in a textbook, suddenly you're gonna comply nationally, because they are so large, they warp the market. Exactly.

Speaker 6:

You have to. That's right. And so if California tomorrow says, look. We're passing a European style bill to severely curtail, which I think would be a disaster, to severely curtail AI development, that would, in effect, be a national bill. Mhmm.

Speaker 6:

And part of the, in my view, mistake that's happening tactically right now is this self imposed July 4 deadline is so restrictive that it forces all of these more nuanced negotiating points to fall by the wayside. Mhmm. Right? It becomes blanket. Either it's in or it's out.

Speaker 6:

There's just not time to get to a more nuanced middle ground. So eleven Labs of Tennessee had the Elvis Act probably says, look. We just screwed around Tennessee. I don't I haven't spoken to eleven Labs. But if California passes, then suddenly you have the whole country warped by this.

Speaker 6:

Gavin Newsom and the California legislature are certainly ambitious enough to try as they did last session, to try to pass something that would impact the whole country.

Speaker 1:

Yeah. Wasn't wasn't there news out of California this week or or last week around some of this stuff?

Speaker 6:

Yeah. Yeah. There was a huge fight, which had an unlikely set of bedfellows, where you saw Andreessen and others leading the anti-legislative, anti-regulatory brigade against some other folks who are much more pro-curtailing. By the way, I will just say the other interesting takeaway for tech, not to totally dovetail away from your question, but the other interesting takeaway for us, I think, in this is that the deciding vote in the Senate on the overall package, the reconciliation bill, was J. D.

Speaker 6:

Vance. And so if you're a Vance head looking at 2028, right, you already see that Ocasio-Cortez and others, AOC and others, are gonna use his deciding vote on this bill, which includes huge amounts of Medicaid cuts that Tom Tillis from North Carolina, in particular, thought were politically toxic, as a wedge against him when he comes and runs for president in four years. So sorry, that didn't answer it. You asked a question about California, and I dovetailed it back to the federal level.

Speaker 6:

No. It's super interesting. Yeah. Yeah. Yeah.

Speaker 6:

That's the big the big shock. So you asked about tripwires. The AI tripwire was one. Medicaid cuts, which are politically really toxic. Right?

Speaker 6:

Because who wants the idea that you voted for the government to take away your health care? That's another huge tripwire. And then there's a package of things that maybe we care about that are nuanced, like, for example, IRA tax incentives for clean energy, right, or new taxes on wind and solar from China. These are all things that might enrage the very conservative Republican faction in the House, which still has to pass the bill, and it now complicates the negotiating posture, because the Senate preserved the IRA tax incentives and watered down or got rid of some of the taxes on Chinese imports, which means, basically, you have a bill that's actually financially harder to pass because it's less balanced.

Speaker 6:

And at the same time, you've now alienated a huge faction of the GOP, the super conservative GOP side of the party who who still have to vote for the bill to happen. So it's up in the air if it passes or not.

Speaker 2:

Can you tell me about the EV tax credits? I know Americans are struggling more than ever to buy naturally aspirated V12s. I'm hoping that there's some move there to subsidize things like the Ferrari 12Cilindri, the Dodici Cilindri. It's a multi-hundred-thousand-dollar car. Even if we can't subsidize that, maybe we can just take away the subsidies from electric cars to tilt the playing field in favor of the larger engines, the supercharged V8s, the twin turbos, the G63s, these types of cars that Americans, you know, deserve to be driving.

Speaker 2:

So what's the news on the EV credit front?

Speaker 6:

I agree with you that the backbone of the American economy is the Ferrari. Right? It's impossible for me to imagine a world Thank you for the laugh track. I appreciate that.

Speaker 6:

Where a blue collar factory worker cannot own a sports car is beyond me. This bill doesn't address that. I wish it did. Great if it did. It's not gonna solve that huge unsolved problem.

Speaker 6:

Yeah. What it will do, by the way, is continue to drive a wedge between Elon and Trump. Right? You saw Elon take to Twitter. I imagine you guys probably mentioned it earlier.

Speaker 6:

You saw Elon take to Twitter and start to really crusade against people voting for this. That is in part because at least the House version of the bill was much worse on a lot of the subsidies for electric vehicles than the Senate version. The big challenge going into the conference committee now is gonna be how they reconcile these two versions so that they can hopefully tilt the market back in favor, of course, of Ferraris and supercars, while also not alienating one of the two power centers of the GOP, at least today, in Elon, who could easily primary somebody who votes against it.

Speaker 2:

Yeah. That makes a lot of sense. What what about, how how are the Democrats feeling about EVs these days? Because it used to I used to feel like it was a democratic cause to be pro EV incentive. Let's save the environment.

Speaker 2:

Let's get people out of the gas cars. I'm joking about the V12s, obviously. But now it's become maybe let's punish Elon. But is this a cut-off-your-nose-to-spite-your-face situation for the Democrats?

Speaker 6:

I think it's a rough spot. I mean, you guys remember when Elon wasn't invited to the big EV summit under the Biden administration. Right? That's part of what kicked off his political awakening. Totally.

Speaker 6:

He doesn't use unionized labor, and so he was excluded from this EV summit. There's been a huge polarization, a negative polarization against EVs, among not only Dems in office but the Democratic base too Mhmm. Which is, to all the points you brought up, quite ironic. You're not gonna find many Dems willing to cross the aisle in support of electric vehicles to save this bill. This is a bill that will live or die by Republican whip count and Republican unity.

Speaker 6:

If the whip count isn't there, it's gonna be really hard to get it done before July 4.

Speaker 1:

Mhmm. What about Elon's America Party? You think that's real or just exciting?

Speaker 6:

Those are my only two options. Real or exciting. There's no door there's no door number three.

Speaker 1:

Yeah. I mean, it's obvious like, you know, we were talking earlier, third parties haven't haven't historically done well.

Speaker 2:

Well, he has the America PAC. So when you say

Speaker 6:

That's right.

Speaker 2:

Elon could primary someone. He could potentially primary a Republican on the Republican ticket with the America PAC, and he could almost, like, be kind of like a head fake. Like, oh, this is part of the America Party, but it's not really a third party. Is that more what we're thinking here, how this plays out?

Speaker 6:

Maybe maybe one historical analog. It's not perfect, but think about, like, a Theodore Roosevelt, like, Teddy Roosevelt with the Bull Moose party. Right? You think about it's a faction within a larger established party, to your point, that can serve as an engine to primary people who are in disagreements.

Speaker 2:

Like the

Speaker 6:

tea party. Exactly. Like the tea party or, frankly, what's happening now. Look at Zoran

Speaker 2:

in New York. MAGA is also like a wing of the Republican Party.

Speaker 6:

That's right. That's right. MAGA, the Tea Party, and now on the left, the polarization from the Zoran and democratic socialist world. All of these folks are new factions in a way that, by the way, we haven't seen before. The same way that fifty years ago, you basically watched MASH on television.

Speaker 6:

That was your only option. And now I tune into TBPN while I play Subway Surfers and have 30 other things hitting me at the same time. That's awesome. The political version of that. Right?

Speaker 6:

Where you say, okay. It used to be my options were a platform that I accepted wholesale on the Republican side or the Dem side. Now I can be a MAGA guy. I can be a tech right guy. I can be a populist guy.

Speaker 6:

I'm a new right guy. And on the left, I can be a democratic socialist. I can be an abundance bro. I have an infinite menu of things. The America party is more likely, if it sticks, one more menu on the side of the right.

Speaker 2:

Sure. Sure. That makes sense. A lot of

Speaker 1:

sense. Last question. What do you expect out of the next twenty-four hours?

Speaker 6:

So tomorrow morning at nine, house rules committee votes. That's gonna be a key vote to see how this thing does. What I would say is if this gets done by July 4, the thing that you're gonna have to look to is that 9AM eastern vote to see how much unity is there in the Republican party. They can barely lose any votes if this thing goes through the house in its current form. But I think it's unlikely it's going to happen immediately.

Speaker 6:

It'll happen. This will get passed. But Mhmm. On their timeline, I don't know. In the next twenty four hours, if you see more senators like Murkowski from Alaska who are saying things like the bill is flawed.

Speaker 6:

Please send it back to the senate so we can work on it again, That's a key indicator this thing may not have the legs it needs the next couple days.

Speaker 2:

Yeah. We heard about I mean, everyone's saying, like, it's the biggest bill. There's all this pork. Like, are there any, like, green shoots where there's some random startup out there that's just super excited because maybe it's not a headline grabbing, you know, $10,000,000,000 thing, but it's like, that's a $100,000,000 that I can go after in energy or in defense or in shipbuilding or anything. Have you seen any of these, like they're called riders.

Speaker 2:

Right?

Speaker 6:

Yeah. Yeah. It's a good question. Biden was famous for stacking a lot of these. Like, the inflation reduction act had a million ways you could go participate.

Speaker 6:

This one's got fewer. I'll say one funny piece of pork, and I'll say one actual opportunity. The real opportunity, I would say, is for fintech folks who are looking at the Trump accounts, right, which is effectively giving huge numbers of new Americans who are underbanked or weren't banked before access to capital for the first time. If I'm Robinhood, I am loving that. That's a big change for me.

Speaker 6:

Funniest piece of pork in the bill: Murkowski, who was the key holdout in the Senate from Alaska, got a tax exemption for native Alaskan whalers. She said, basically, native Alaskan whalers need to be yeah. That was honestly a big delivery for her constituency of people who kill whales

Speaker 2:

in Alaska.

Speaker 1:

Let's hear it for the whalers. Yeah. They don't get any attention in Washington.

Speaker 2:

I mean, we've we've talked about the, you know, energy is really important for the AI data center race. Whales, they produce

Speaker 1:

oil that you can burn. So Wasn't it Chase who said he would he would burn

Speaker 2:

whale oil? He didn't say he would do it. He said that the hyperscalers are so desperate for

Speaker 1:

energy right now

Speaker 2:

that they would burn whale oil. It's more

Speaker 1:

it's more fun to think about Chase

Speaker 2:

He's being,

Speaker 1:

like personally.

Speaker 2:

Launching Crusoe for whales.

Speaker 1:

Yeah. They're so blubber.

Speaker 6:

VC's are the new whalers.

Speaker 2:

Yes. Yes. They really are. The old whalers.

Speaker 1:

And the

Speaker 6:

old true. Whalers.

Speaker 1:

Zach, this was so much fun. This is great. We'll have to have you call in tomorrow or Thursday. When there's more news.

Speaker 6:

Please. I'll be around it's my first wedding anniversary, so I'll be around. Oh, congratulations. Plenty of time to call in. Thank you very

Speaker 1:

much. Incredible. Incredible.

Speaker 2:

We'll talk

Speaker 1:

to you soon.

Speaker 2:

Thanks, Zach.

Speaker 6:

Thanks for having me, boys.

Speaker 7:

See you soon.

Speaker 1:

See you.

Speaker 2:

Bye. Next up, we have Matthew Prince from Cloudflare coming in to TBPN. Welcome to the stream. Some huge news from Cloudflare. We'll let him tell us the story.

Speaker 2:

How are you doing? Good to meet you. Nice to meet you. Welcome to the stream. I would love to start with, the news today, and then I wanna go into some of the the history of the company and and, when you faced similar, junction points and kind of, turning points in the past.

Speaker 2:

So why don't you, kick us off with, the big announcement, the big news today from Cloudflare?

Speaker 8:

Sure. So about eighteen months ago, we started getting approached by a number of content creators saying that they faced a new new threat online. We spend most of our days, most of our business trying to fight cyber hackers, Iranians, Russians, North Koreans.

Speaker 2:

Yeah.

Speaker 8:

And so I was surprised when the new threat was AI companies. And frankly, I kind of rolled my eyes at publishers. I was like, publishers are always complaining about, you know, the next new technology. And and and and they said, just pull the data. Look at what is is going on.

Speaker 8:

And it was really striking that over time, you could watch as the sort of deal the publishers had made with Google, you know, starting thirty years ago, which was you can copy all of my content and in exchange you send us send us traffic. That deal just didn't hold up anymore. Mhmm. Where they were copying the content, but because they were providing answer boxes, AI overviews, they just weren't sending nearly as much traffic. And and that was the good news.

Speaker 8:

With OpenAI, it was about 750 times harder than with the Google of old to get traffic. With something like Anthropic, it's 30,000 times harder. And the reason why is because people are reading the derivatives. They're not reading the content. It'd be effectively like, if I asked an AI system, you know, summarize what happened on the Technology Brothers show today, I have less incentive to actually go and watch your actual show.

Speaker 8:

And Yeah. That means that if you're making money through advertising, if you're making money through subscriptions, if you're just, you know, getting value out of the ego of knowing that people are watching your show, That's going away in an increasingly AI driven web. And and we worried about that quite a bit. And more and more as we talk to publishers, we realized there was something that we could do about it because Cloudflare runs one of the world's largest networks. A huge amount of the Internet already sits behind us.

Speaker 8:

We could partner with publishers, literally everyone from Adweek to Ziff Davis and everything in between, to say let's change the way that the Internet works, and let's change the model so that if you're going to take content, you should actually compensate the content creators for it. And so starting on July 1, we flipped the script. We made it a default that if you're using Cloudflare, you can, by default, for free, block any AI crawlers from taking your content without compensating you. And I think now the hard work begins, where we have to figure out the business model around how we get compensation back to these content creators and make sure that folks like you who are creating real original content can continue to be compensated for the hard work that you're doing.
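
For readers who want to picture the mechanics being described, here is a minimal sketch of user-agent-based AI-crawler blocking at an origin server, in Python. It is not Cloudflare's actual implementation (theirs is a managed, network-level default); the bot names below are the crawler user agents that OpenAI, Anthropic, and Common Crawl document publicly, and the choice to answer with a 403 is an assumption for the sketch.

```python
# Minimal sketch of user-agent-based AI-crawler blocking at an origin server.
# NOT Cloudflare's implementation; it only illustrates the idea of refusing
# known AI crawlers by default. The bot names are publicly documented crawler
# user agents; the 403 response is an assumption for this sketch.

from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_AI_CRAWLERS = {"GPTBot", "ClaudeBot", "CCBot"}

class BlockingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if any(bot.lower() in ua.lower() for bot in BLOCKED_AI_CRAWLERS):
            # Refuse the crawl instead of serving the page.
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"AI crawling is not permitted without a license.\n")
            return
        # Everything else (human browsers, ordinary search crawlers) is served.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Original content for human readers.</body></html>")

if __name__ == "__main__":
    HTTPServer(("", 8080), BlockingHandler).serve_forever()
```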

Speaker 1:

Yeah. Well, we used to joke saying that the the labs are welcome to scrape our content because we want to influence the the outputs of the of the model. But, no, it's it's a serious issue. We we were covering one of Ben Thompson's pieces about this whole debacle a few weeks ago. It was funny because at the time we were like, there's so many different stakeholders here.

Speaker 1:

Who could possibly try to figure out a solution to this problem? Obviously you guys are a key player in this and you have the scale in order to kind of make changes happen. Yeah. Can yeah, can you, I guess, like break down more kind of like the market structure between and and how you see this actually

Speaker 2:

Yeah. I mean, I guess what deals have been done in over the past year? Because I know, didn't News Corp do a bespoke deal with OpenAI for that exact thing? It sounds like you are more equipped to kind of handle, like, the longer tail, but also some of the bigger publishers. So talk to me about kind of the state of the art of what those deals look like when they're one to one and then what things might look like going forward.

Speaker 8:

Yeah. So I think that, first of all, I I think for some time to come, large publishers will do deals with large AI companies Mhmm. Where there might be places that people need help or either small publishers or small AI companies, you know, and and figuring out what has to happen there. But I think even in the case of places where deals have been struck, what the challenge is is, you know, someone like OpenAI is paying a lot of publishers for their content and doesn't actually object to the the basic idea that they should pay for the content that they're using. But but they do object to the fact that they're paying where all of their competitors are getting content for free.

Speaker 8:

Sure. Any market that exists has to have scarcity. And the problem, the reason why there hasn't been a market in this place space is that there isn't scarcity. And so what changed yesterday is scarcity was introduced really for the first time. And I I think the closest analogy is if you think back to, you know, before iTunes launched, we were all downloading music on Napster and, you know, all the the free pirate services.

Speaker 8:

That's effectively how the AI companies are taking data from any content creators today. And what you need is an iTunes. You need someone to come in and say, okay, let's provide a way where people can pay for that content and they can go back. Now now that's evolved over time. I think our first version in the marketplace will probably be relatively naive and we'll figure it out.

Speaker 8:

Now no one's paying 99¢ for songs anymore. We've moved to something that's closer to a subscription model, like a Spotify. And I think that that's what we will evolve to have over time. And if we do it right, you know, I think that we can have a large number of sellers of content, obviously, but also a large number of buyers of content, both the big sort of general-purpose models, but then also very specific models. And we've gotta make sure that new entrants can come into this market and have a vibrant way of participating and competing.

Speaker 8:

And so I think that for now, most of the deals will be one on one bespoke deals between publishers and AI companies. You'll need to have some way of enforcing those deals and making sure that if you're charging someone, they can get access, but their competitors can't. And then over time, I think we'll develop something that looks much more like a robust marketplace where content providers can set sort of what their price is. There can be a bid for maybe getting that that content exclusively for some period of time or or or whatever else. And that's gonna develop over over the next over the coming months.

Speaker 2:

Is that the correct I I is that the correct model to think that the the content producer would set the price? I mean, that's certainly what happens on Substack and, you know, the the The New York Times, The Wall Street Journal set their price. But I've always thought that this might evolve a little bit more like YouTube or like Spotify where

Speaker 5:

Yeah.

Speaker 2:

There is a pool of dollars that are going into ChatGPT subscriptions. I personally pay $200 a month. And occasionally, I fire off queries that probably land on The Wall Street Journal or The Financial Times. And I would be happy for them to send 10¢ or a dollar, or split off 50% of what I'm paying. So you have a $100 to kind of, you know, parcel out in the pool, and then you see how much inference pulled what data from what sources, and kind of have it all be driven by, like, the same volume algorithm that you see on YouTube or Spotify.

Speaker 2:

Does that seem like kind of what the long-term outcome might look like, or do we need to also get to some sort of, like, ad-driven marketplace? Because that's ultimately how YouTube prices these things.

Speaker 8:

Yeah. You know what? I think it's gonna be, I I think that we there's a there's a lot that we still have to figure out. I I I, last week, traveled up to Stockholm to meet with Daniel Ek, the the Nice. Founder of of Spotify.

Speaker 8:

And it was, you know, really interesting. He said, you know, that the ultimate model that Spotify settled on, I don't wanna put words in his mouth, but it was basically sort of the most naive of all of the different options that he had come up with. And the way it works is it's a pool of funds, and then they divide it up based on basically how many minutes of content you consume. And there are different rates for music versus podcasts versus audiobooks. And it was fascinating to see.

Speaker 8:

So you could imagine a version where Cloudflare effectively negotiates on behalf of all of the content that sits behind us with the big AI companies. People can opt in or opt out of being a part of that pool. The more content that's part of that, the more valuable it is to the AI companies. And then we would look at how we could split that up across how much crawling is actually actually done across that. I think one of the real keys will be how can we give hints to the AI companies on what is the content which is the most valuable to them.

Speaker 8:

So for instance, if Taylor Swift releases a brand new song and maybe does an interview about it, that's incredibly valuable content if you're a chatbot for lonely teen girls. Right? On the other hand, it's probably not so valuable if you're a chatbot trying to be the world's best doctor. Maybe there's some value in bedside manner, but largely that's not gonna be a big deal in your care. On the other hand, if a research study comes out on acetaminophen's effects on different types of chemotherapy, that's of very little value to the chatbot for teen girls, but incredibly valuable for the doctor.

Speaker 8:

And so what we have to do, I think on an AI-model-by-AI-model basis, is be able to give them signals back to say: this is the sort of thing that you should be paying attention to, that you should be indexing, that you should actually be scraping and paying for. In the same way that in the music industry we've got reviews, we've got different types of music, you might be into country or you might be into classical, and that kind of allows you to compartmentalize it. I think that whole ecosystem has to develop. But once it does, something very much like Spotify could be the right answer long term. And again, I think we're in a unique position to pull it together.
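
To picture the per-model signaling being described, here is a hypothetical sketch in which each model declares topic interests and new content is scored against them, so a crawler knows what is worth indexing and paying for. The model names, topics, and weights are invented for illustration and are not an actual Cloudflare or model-provider API.

```python
# Hypothetical relevance scoring: hint to each AI model which new
# content is worth crawling and paying for. Topic weights per model
# are invented for illustration.

MODEL_INTERESTS = {
    "teen-chatbot": {"pop-music": 0.9, "celebrity": 0.8, "oncology": 0.05},
    "doctor-chatbot": {"oncology": 0.95, "pharmacology": 0.9, "pop-music": 0.05},
}

def score_for_model(content_topics, model):
    """Sum the model's interest weights over the content's topic tags."""
    interests = MODEL_INTERESTS[model]
    return sum(interests.get(topic, 0.0) for topic in content_topics)

new_items = [
    {"title": "Taylor Swift interview on her new single", "topics": ["pop-music", "celebrity"]},
    {"title": "Acetaminophen interactions in chemotherapy", "topics": ["oncology", "pharmacology"]},
]

for item in new_items:
    for model in MODEL_INTERESTS:
        print(f"{model} <- {item['title']}: {score_for_model(item['topics'], model):.2f}")
```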

Speaker 1:

Talk briefly about stablecoins. I remember in this Ben Thompson piece, people have been suggesting this idea that stablecoins could be some part of the solution, or a backbone for a new business model for the Internet. At the same time, while I'm bullish on stablecoins, someone like a Cloudflare can set up the infrastructure to just use traditional financial rails to process payments. So I'm guessing that it's not part of what you're

Speaker 2:

This was Ben's reaction to the idea that the original sin of the Internet was advertising, and that Marc Andreessen, when he built the browser, didn't build in payments because it would have required a multiparty discussion to figure out how to actually integrate payments into HTTP, and it didn't happen. The first MCP spec, the model context protocol spec for AI agents to interact with each other, doesn't include payments either. He was kind of saying that it might be on the horizon, but would love your take.

Speaker 8:

Yeah. You know, I think the answer doesn't have to be a hundred percent one or the other. And so I would guess that there are gonna be very traditional financial rails that will be ways to pay people out. But there will also likely be something that is blockchain-based or stablecoin-based. And, again, we're looking at all of the different providers that are in this space, including potentially creating our own stablecoin that would be part of how these transactions take place.

Speaker 8:

And I think you're exactly right that there's a natural extension from what we're doing here to figuring out what the payment rails look like for MCPs. Because as your agent is going around interacting with things, that's also going to be something that is either charged for or that you actually get paid to do. And there's gotta be a way for those transactions to take place. And to the extent that you can reduce the transaction costs for those transactions as far as possible, that's a pretty interesting way of being able to process very high volume, low dollar amount transactions. So I think that's right.
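
On the payment-rails point, the pay-per-crawl idea announced here is built around the long-dormant HTTP 402 Payment Required status code. The sketch below shows the general shape of a crawler honoring a 402 and retrying with payment; the header names and the settle_payment helper are placeholders I've invented for illustration, not a real specification.

```python
# Sketch of a crawler handling an HTTP 402 "Payment Required" response.
# Header names ("x-crawl-price", "x-crawl-payment") and settle_payment()
# are placeholders, not a real protocol. Requires: pip install requests
import requests

def settle_payment(price_usd: float) -> str:
    """Placeholder: pay via whatever rail (card, stablecoin, ...) and
    return a proof-of-payment token."""
    return f"proof-of-payment-for-{price_usd}"

def fetch_with_payment(url: str, max_price_usd: float):
    resp = requests.get(url, headers={"User-Agent": "ExampleBot/1.0"})
    if resp.status_code != 402:
        return resp  # free to crawl (or blocked for some other reason)

    price = float(resp.headers.get("x-crawl-price", "inf"))
    if price > max_price_usd:
        return None  # too expensive for this crawler's budget

    token = settle_payment(price)
    # Retry, attaching proof of payment so the origin can grant access.
    return requests.get(
        url,
        headers={"User-Agent": "ExampleBot/1.0", "x-crawl-payment": token},
    )
```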

Speaker 8:

My only quibble is that I think the original sin of the Internet is not that it didn't include payments. It's that there was no privacy built into it, and your IP address reveals far too much about you and who you are. We don't even have to get to cookies, but that's a

Speaker 2:

No. No. We totally can.

Speaker 1:

We've talked about content creators and publishers. What about the apps of the world, the Ubers, the Instacarts, that actually have really meaningful advertising businesses built on top of them? Do you expect any type of this functionality to be rolled out to them over time? Or is that something you think they're thinking about quite a bit? Because if I can go into ChatGPT, and that's my default daily assistant a few years from now, and I'm like, hey, order me groceries and order me this Uber, I'm not in the app seeing ads, potentially being upsold, and that could be a problem for those types of companies.

Speaker 8:

Yeah. Totally. And, I mean, robots don't click on ads. And so that part of the revenue stream to support these services has to come from somewhere. And if today Uber is able to monetize the time where I'm waiting for a car to come by showing me an advertisement, that means they can effectively charge me less for the ride.

Speaker 8:

And so it may be that if that ride is booked through an agent instead, they don't get that advertising experience. And, again, I don't know enough about Uber's business model to know how important that is to them. But if they don't get that, then maybe there's some sort of premium that an agent-booked ride receives versus a human-booked ride. And, again, we've just gotta figure out what that looks like. I think that's gonna be less up to Cloudflare.

Speaker 8:

I think our job will be: how do we provide those payment rails? How do we provide the systems? How do we make it so we can have guardrails as agents interact with the applications that are there, so that you as the application provider can set those rules and controls, and then agents can interact within that system? And MCP is sort of the leading candidate for providing that. But you're right that it's a draft version of the specification, and there are a lot of things that have to mature in order for it to be the real agent-to-agent system that is likely to run what will be an increasingly AI-driven web.

Speaker 2:

Yeah. Can you talk about any previous crucible moments for either Cloudflare or Internet history broadly? This feels like a moment where you're potentially trying to deal with a shift in the overall Internet structure or the incentive structure. It's a reaction, but it's also an action that should reshape the relationship between different parties on the Internet. Are there different moments in the history of Cloudflare or the history of the Internet that you think we could draw on as historical analogies, to learn from and see how this might play out?

Speaker 8:

You know, I think Google, really, thirty years ago, defined what was the business model for the web, which was: you create content, search drives traffic, and then you monetize that content through advertising or subscriptions or ego. And ego and monetization aren't quite the right pairing, but you drive value from it in one of those three ways. And that's held up for the last thirty years. I think this is the first time that I've seen something where I thought, wow, there are really going to be very, very fundamental changes to the structure of how the Internet works if we move from a search-driven model to an AI-driven model.

Speaker 8:

And what I hope is that we can actually make this something where what gets rewarded is not what gets rewarded today, which largely is who can write the headline that produces the most inflammatory response

Speaker 2:

Yep.

Speaker 8:

In all of us, who basically drives the most cortisol, gets the most clicks, and therefore the most ad revenue or the most subscriptions. That's not a healthy way of looking at the world. I think the alternative is, if you imagine that all of the AI models together are a kind of assemblage of an approximation of human knowledge, if we can identify where the holes are and then create incentives for content creators to actually fill in those holes, that actually is gonna move humanity forward. It's gonna make the AIs better.

Speaker 8:

It's gonna be a much more interesting thing, and that's, I think, how we're gonna be creating it. I can't think of a moment in the last thirty years that's as seminal. I think there are some things along the way. I think 2016 was a really big turning point year for a lot of reasons. It was sort of the year that technology went from being able to do no wrong to being able to do no right.

Speaker 8:

And the rise in populism around the world. You have Brexit in the EU. You have the first Donald Trump election. You've got Xi and Modi consolidating power in Asia, the Filipino election. And I think that was a point in time, also driven by Snowden shortly before that, where a lot of the world lost faith in kind of the US model of Internet governance.

Speaker 8:

And it wasn't quite ready to adopt the Chinese model, but that's definitely been a shift, and a substantial one. I think we're still watching that play out. That, I think, probably still remains the biggest threat to the overall Internet. But this shift to AI as well, again, I think is a real challenge, but it's also an incredible opportunity. If, again, we can make the reward incentives for content creators not be who can produce the most cortisol, but who can produce the biggest incremental improvement in human knowledge, that'd be an incredible win.

Speaker 8:

And that's something that, you know, I don't quite know how to get to yet. But I think today, July 1, we took a real step in moving in that direction and making sure that content creators can be compensated for really producing information which is valuable to advance human knowledge.

Speaker 2:

What do you think about the future of investigative journalism? There's been a big shift in content creation toward independent creators, but a lot of that is actually downstream of, or separate from, the type of work that investigative journalists do. I'm thinking of, like, Seymour Hersh hanging out at a local army bar for a year before he gets one scoop that he can write a book about. And these investigative journalists, they tend to, you know, John Carreyrou was on the payroll of The Wall Street Journal for years. He was writing stuff, but then he blows the Theranos case wide open, eventually a book and then a movie.

Speaker 2:

And I imagine that with artificial intelligence, the value of a scoop could actually go up, because you could use an LLM to read through the clickbait repurposed version of the article and find: wait, they're sourcing this person who actually posted it on the Internet earlier. They're the one that actually did the hard work to get the scoop. They should get 90% of the value, and 10% should go to the repurposed rage-bait version, as opposed to the economic model now, where maybe you get 90% of the clicks on the viral repurposed content and the scooper actually gets 10%. Do you think that's possible, or how do you think investigative journalism will evolve?

Speaker 8:

Yeah. I don't know if this is an answer to your question, but I thought it was interesting. As I said, I met with Daniel Ek up in Stockholm.

Speaker 8:

And he told me, we were talking about this, and he said one of the things we do at Spotify is we actually surface all of the different queries that are sent to Spotify that we don't feel like we have a really good response for. So if somebody searches for, you know, I want a song with a reggae kind of beat that's all about how much it sucks when your sister steals your car and your dog. Like

Speaker 1:

Not to get too specific. No.

Speaker 4:

I'm not I'm again, I don't

Speaker 8:

know what what what these things are.

Speaker 2:

But they're Yeah.

Speaker 8:

There are people, musicians, who take this list and they actually write songs based on what people are looking for.

Speaker 2:

No way.

Speaker 8:

And they make tens of millions of dollars a year doing exactly that. And there's something about that that I think is really beautiful, as opposed to the alternative of just trying to do a little more me-too of what everybody else is doing. Like, taking a different twist, using signals from the market in order to go create. So, again, that's not investigative journalism by any means, but I think it is actually advancing the good of humanity, because it's taking a place where there's a missing need and filling that missing need. And so I am hopeful, and, I mean, I'm a relatively optimistic person.

Speaker 8:

And I'm hopeful that if we get this right, we might be entering a golden age of content creation, where people actually are rewarded for doing things that advance human knowledge as opposed to just doing things that produce as much cortisol as possible.

Speaker 1:

Yeah. The idea of a knowledge bounty. Right? It's like, hey, there are groups of people, or a single entity online, that will just pay you to produce really high quality information around this specific topic. Yep.

Speaker 1:

And the internet always did this sort of organically, but it would just be somebody spinning up a blog or maybe an Instagram account or something like that. And they would just start blogging about something that was niche, and then an entire network would form around it and other people would contribute to it. So I think there's quite a lot of precedent. But it would be exciting to put that reward out. Mhmm.

Speaker 1:

And not saying, hey, you have to do this and then maybe you'll get some ads associated with it later. But, basically, the pool of capital is here, and you just have to deliver the knowledge that people already want, and we know they want it because they're searching for it on LLMs and things like that.

Speaker 2:

Can I throw you oh, sorry? I think

Speaker 8:

I'll tell you the sort of Black Mirror, dystopian outcome if we don't get this right. Which is, I don't think journalism goes away. I don't think research goes away. But you could imagine a world in which we go back to something kinda like the Medicis, where there are five big AI companies and they basically each employ all of the journalists, all the researchers, all the academics to do the research to feed their systems. Because you're still gonna have to have original content.

Speaker 8:

Yep. People are still gonna have to be able to produce that in order for the AI systems to fill in the holes in the effective block of Swiss cheese that is their model. But I think that's a pretty bad world, because you then probably have, like, the conservative AI and the liberal AI, and all these things that the Internet kind of broke apart. You could imagine AI as a system that is just this massive consolidator in the future. And I think that's a bad outcome.

Speaker 8:

And I think a better outcome is if we can figure out how we can reward people for, again, driving knowledge, but not get to a place where all of them have to be employed by one of the five, you know, big

Speaker 2:

Yeah. I actually talked to somebody very high up at one of the labs who was saying that if you take the total salaries of all the journalists in America, it would be a fraction of what the AI labs are spending right now just acquiring data. And so That's right. It's not a financial question. It is a political, cultural question.

Speaker 2:

It's interesting. Yeah.

Speaker 1:

I I have to make

Speaker 8:

that argument. And I've been really positively surprised, because we've talked with all the big tech companies, all the AI companies about this. And they all, with maybe one or two very small exceptions, say, yes, we understand that over the long term we have to pay for content.

Speaker 8:

Mhmm. And so they get that. What's key, though, is it has to be a fair playing field. Yeah. And that fair playing field, everyone sees slightly differently.

Speaker 8:

So if you're a new entrant, you wanna say, well, Google can't be advantaged. If you're Google, you say, well, we have to be respected for the ecosystem that we've helped build and flourish, and we should be able to, you see. And so everyone's got a slightly different take on this, but I'm actually encouraged that the dollars are there, that there is an acceptance that this is something that needs to happen. And if you ask them, everyone says, yes, we need to do this, as long as you can create a level playing field.

Speaker 8:

Mhmm. And, again, that's where I think our role comes in: how can we create that level playing field? How can we create scarcity? And how can we make sure that content creators are being compensated for the hard work that they're doing?

Speaker 1:

Yeah. Zooming out just for a second, what is the state of things, how's the Internet been for the last few weeks? It's been a busy month in geopolitics, and I know that makes your job a bit harder, or at least I would imagine things get a little bit busier.

Speaker 8:

Keeps it interesting. I mean, whatever you see on the kinetic side, the physical side of a conflict, is usually replicated on the cyber Mhmm. Side of the conflict. And so as the hostilities ramped up between Israel and Iran, first of all, we saw very clear signals that something was coming ahead of time. It's interesting how there's sort of a digital residue which shows when there's going to be kinetic action taken. And so we were watching that pretty carefully.

Speaker 8:

Big attacks targeting Iran: one of Iran's largest banks was hacked by what looked like a group of Israeli hacktivists. The largest cryptocurrency exchange in Iran was hacked and all its funds were drained, again looking like it was either the Israelis or Israeli-sympathetic attackers. Iran has tried to do the same thing against Israel.

Speaker 8:

Israel's cyber defenses are much stronger. But we're starting to see that Iran is targeting other, largely US, financial services firms and critical infrastructure firms, trying to strike back against The US in a way that is harmful but doesn't start World War Three. And I think that, increasingly and unfortunately, cyber will be a big piece of that. And so we're doing everything we can to stay in front of it and make sure that anyone who needs it has protection, and that we can keep the lights on everywhere in the world.

Speaker 2:

Last question for me. Go for it. There's a number of larger overarching tech narratives. I'm interested in whether you're losing sleep over any, or tracking any with particular focus. The AI talent wars and kind of pay inflation for top performers.

Speaker 2:

Is that an issue or something you've been tracking? Or the difficulty of building larger data centers and infrastructure, the GPU shortage, the chip wars? Any of those topics particularly relevant to you or your business, or kinda keeping you up at night?

Speaker 8:

Yeah. You know, we're not trying to build a foundational model LLM. So we're not as much competing for the bleeding-edge AI researchers. I will say that I've been really amazed, both from an academic side and also just a pure talent side, at how many people are interested in working on a problem like how you score content in order to be able to create a marketplace.

Speaker 8:

And so that's something that's been pretty exciting. We're very excited about the people who are coming to apply to work for Cloudflare to work on problems like that. And, like, today has been a record applicant day for us.

Speaker 6:

It's awesome.

Speaker 1:

On the on the

Speaker 2:

Congratulations. Records. Sorry.

Speaker 1:

I hit the air horns for you. Record applicant day.

Speaker 2:

But it's good. It's good news.

Speaker 8:

And then, you know, we tend to put a little bit of equipment in a lot of places, and so we tend not to have kind of massive multi Yeah.

Speaker 2:

You're not running into, like, power constraints or rare earth element constraints or water constraints.

Speaker 8:

We do, but in kind of a micro way, where, you know, we put GPUs at the edge of our network. We have to live with a power envelope that we're given by an ISP. So Sure. We're doing things to try and figure out how to just get the best power efficiency out of the GPUs there.

Speaker 8:

The answer to more and more need for GPUs can't be that everyone has to turn up their own nuclear power station. So the place where we're pushing suppliers, and AMD is doing some really interesting things in this space, Qualcomm is doing some interesting things in this space, the fruit company down in Cupertino is doing some interesting things in this space, is how do you focus on, yes, performance, but performance per watt?

Speaker 8:

And we've seen this game before, with Intel saying, oh, that doesn't matter, just get the most performance possible, you can water cool things, all kinds of things. That tends not to be the winning strategy over the long term. And so we do think there will be a renewed focus on energy efficiency around GPUs, and we're really pushing that, because, again, we can't build our own data centers in the business that we're in.

Speaker 8:

And then the chip shortage. Chip shortages are funny. There's never been a chip shortage at any time in the history of silicon that didn't turn into a glut. Because it's Hey.

Speaker 1:

But this time it's different.

Speaker 8:

Yeah. This time is totally different. So I think you're gonna see both. You're gonna see shifting demand Sure. That's there, and a lot more.

Speaker 8:

NVIDIA is not gonna be the only chip supplier of GPUs. There are some really great other providers that are out there.

Speaker 2:

Yeah. Well, thank you so much

Speaker 1:

for stopping by. The conversation. Yeah. Congrats on the launch and come back on again soon.

Speaker 2:

Yeah. Yeah. Thanks for Congratulations. Good luck filtering through all those job applicants. We'll talk to you soon.

Speaker 1:

See you. Bye. There's no decency anymore. We can't even take a summer weekend.

Speaker 2:

Or send an email to a 3,000 person organization without it leaking to Wired. Mark Chen, the chief research officer at OpenAI, sent a forceful memo to staff on Saturday, promising to go head to head with the social giant in the war for top research talent. This memo, which was sent to OpenAI employees in Slack and obtained by Wired, came days after Meta CEO Mark Zuckerberg successfully recruited four senior researchers from the company to join Meta's superintelligence lab. I have a visceral feeling right now, as if someone has broken into our home and stolen something, Chen wrote. Please trust that we haven't been sitting idly by.

Speaker 2:

I mean, this makes sense. You need to tell the troops that, like, we're prepared to go on the counterattack. We got attacked, but we're ready to hit back. What's interesting is, I don't think that Meta is necessarily gonna try and go eat ChatGPT's lunch. I was revisiting that conversation we had with Jeff Huber, and he was kind of reiterating, like, never bet against Zuck.

Speaker 2:

Like, he's working on an open source model, but a lot of it is more of this b2b applied AI than necessarily trying to go and disrupt ChatGPT, because Google's already running that playbook. Mark can see what happens when you're a trillion dollar company and you roll out basically a direct clone of the ChatGPT app. It's like, yeah, you can get a decent amount of users in notional terms. I'm pretty sure the Gemini app has, like, a 100,000,000 users and lots of five star reviews, but it just doesn't have anywhere near the penetration of ChatGPT.

Speaker 2:

And so when you pull someone off the street and you say, what do you use for AI? They say chat. They don't even say ChatGPT anymore. Yeah. So dominant.

Speaker 2:

And so I think Mark knows that he shouldn't necessarily try and go after that, but he should be going after the next thing and the next next thing and have a model that he can pipe into all sorts of different features within the meta ecosystem. But Yeah. The crazy

Speaker 1:

thing is, it doesn't seem like there's clear precedent for a talent raid of this magnitude. Right? We covered Ken Griffin raiding Enron But it was after. The collapse. Yep.

Speaker 1:

Right? Yep. Apple heavily recruited out of Xerox PARC in 1979 and the early eighties. Yeah. But like It was kind

Speaker 2:

of a downswing.

Speaker 1:

Right? Yeah. It wasn't anything of this magnitude, nor was it, hey, we're gonna come in and just give you these, like, nine figure packages and things like that.

Speaker 2:

Yeah. And it's very common for a startup to be able to pull people from the big legacy company that's kind of sclerotic.

Speaker 1:

And Yeah.

Speaker 2:

You know, we've seen a lot of people that were at Apple or Microsoft or Xerox, and then they went to Google, and then they went to Meta, and then they went to OpenAI. I mean, Bret Taylor, who's the chairman of the board at OpenAI, was the CTO of Meta. Right? And it's like, okay, he clearly just wants to be in the hot thing, and he's like a startup guy almost.

Speaker 2:

And I believe he was at Google before, because he redid Google Maps. Yeah. And so the story of Bret Taylor is really, like, the default, I think, for most startup people.

Speaker 2:

It's like, he's early at Google. He works with Paul Buchheit on Gmail. He works on Google Maps. Then he goes over to Meta, CTO there. Then he goes to OpenAI as chairman of the board.

Speaker 2:

And it's like, he's a startup guy. And so he's always riding the new wave. You usually can't poach them back into the older company. That's rare.

Speaker 1:

Yeah. Some other precedent here: in 2010 and 2011, Zuck and Facebook went super aggressive poaching from Google. Yep. Facebook had crossed half a billion users. They were in that period of transitioning from a super scrappy startup.

Speaker 1:

Yep. And they went heavy into Google. They went and got high performers from ads, search, mobile on the Android side, and some of the social product people from Google Plus. And Zuck did something similar, not to the same scale, but just went in with these heavy, heavy comp packages and was very successful. Google had to respond and increase comp in a number of different

Speaker 2:

ways. Did they poach any designers? Any people that use figma.com? Any people that think bigger, build faster? Anyone that, you know, because Figma helps design and development teams build great products together.

Speaker 2:

It would be a different place. Back then. But fortunately, you can go to figma.com.

Speaker 1:

You can think

Speaker 7:

of You

Speaker 2:

can get started

Speaker 1:

for free. Faster.

Speaker 2:

Anyway, what do you think of the buzzword superintelligence? They've really rebranded it. Is it goalpost shifting, do you

Speaker 1:

think he's smart. I think he poached the word superintelligence.

Speaker 2:

It's like it's mine. Mine now. It's mine now. Because

Speaker 1:

And he's not saying the safe superintelligence team. He's just saying superintelligence Implies reckless

Speaker 2:

Safe superintelligence? Implies reckless superintelligence. I like that. But it's funny because everyone was focused on AI as a buzzword, then it was AGI, and now it's superintelligence. But, you know, within our group, we all think superintelligence is kinda the old thing.

Speaker 2:

Like, everyone like, me and you, like, we we know the secrets to this stuff. Like like, it it's kind of it's kind of already priced in. The real killers, they're focused on the next thing. You know what I'm talking about. Right?

Speaker 2:

Giga intelligence. That's the one that people are getting. Right? Really? The the real killers are like, yeah.

Speaker 2:

Well, like, yeah, AGI is basically here. Superintelligence is basically here. But giga intelligence, that's the one we're working on. Because we're gonna need a new buzzword, because we're burning through buzzwords at an increasing rate.

Speaker 2:

And so we just steamrolled the Turing test, completely blasted through AGI. No one talks about AGI anymore. AGI is here.

Speaker 1:

Yeah.

Speaker 2:

But, you you know, superintelligence, we're gonna solve this buzzword. We're gonna churn through this in a year or two. We gotta start working on giga intelligence now.

Speaker 7:

Yeah.

Speaker 2:

This is the way.

Speaker 1:

Anyway Start adopting it.

Speaker 2:

Chen promised that he was working with Sam Altman, the CEO of OpenAI, and other leaders of the company around the clock to talk to those with offers, adding, we've been more proactive than ever before. We're recalibrating comp, and we're scoping out creative ways to recognize and reward top talent. Still, even as OpenAI leadership appears desperate to retain its staff, Chen said that he has high personal standards of fairness and wants to retain top talent with that in mind. While I'll fight to keep every one of you, I won't do it at the price of fairness to others, he wrote.

Speaker 1:

It's such a brutal dynamic if you're Mark Chen. Yeah. If you're Sam. Because not everybody at OpenAI is getting poached. But presumably, Zuck is working with all the people he's poached already to identify who are the best people at OpenAI, who do we really want.

Speaker 1:

Yep. And then going to them and making the maxed-out offers. Meanwhile, think about the team dynamic. If you have a few researchers on a team and somebody's potentially getting poached by Meta, and to retain them you say, okay, we're gonna give you this sort of massive incremental grant, the other people on their team are going to be like, well, is that my market value? I didn't go talk to Facebook, I turned it down, I never took the meeting. So you should pay me the same thing too.

Speaker 2:

Yeah. Right? Yeah.

Speaker 1:

Yeah. I'm loyal. Shouldn't you wanna pay the

Speaker 2:

I'm a missionary.

Speaker 1:

Yeah. I'm a missionary.

Speaker 2:

Why are you paying the mercenaries more? You should pay the missionaries the same price just because they're missionaries. Rough. For sure.

Speaker 2:

Anyway. Let me talk about

Speaker 1:

No, the other thing is it creates this really nasty incentive Yeah. To basically be like, oh, yeah, you know, maybe you got hit up, I had met a recruiter many months ago.

Speaker 1:

And you're gonna be like, yeah, I mean, reached out to me and suddenly they're, like, hovering around you, like, being like, alright, what's it going to take? What's it going take? We really don't want to lose you. It is a

Speaker 2:

Yeah. I mean, it's crazy game theory. Like, every single conversation in a company is, like, this constant game theory.

Speaker 1:

And then timing that timing that with with this, like, summer break that I guess OpenAI was saying like, hey, everybody should take the week of the fourth off or not the week of the fourth, but anyways, you wanted to say something more important.

Speaker 2:

Well, I mean, you know, if you wanna see who your high performers are, just look in Linear. Right? Linear is a purpose-built tool for planning and building products. Meet the system for modern software development: streamline issues, projects, and product roadmaps.

Speaker 1:

If Zuck were able to gain access to OpenAI's Linear, it would be illegal, but

Speaker 2:

Yeah.

Speaker 1:

It would also tell him tell him a

Speaker 7:

lot.

Speaker 2:

Yeah. You don't wanna do anything illegal, Zuck. You wanna be on Vanta. Automate compliance, manage risk, prove trust continuously. Vanta's trust management platform takes the manual work out of your security and compliance process and replaces it with continuous automation, whether you're pursuing your first framework or managing a complex program.

Speaker 2:

So the news comes as competition for top AI researchers is heating up in Silicon Valley. Zuckerberg has been particularly aggressive in his approach, offering $100,000,000 signing bonuses to some OpenAI staffers, according to comments Altman made on a podcast with his brother, Jack Altman. So this $100,000,000 signing bonus just got baked into the lexicon. But I was listening to Dylan Patel on Jordan Schneider's ChinaTalk podcast. It's actually Transistor Radio, but it goes out on ChinaTalk.

Speaker 2:

Anyway, Dylan Patel was saying that might be a bit of a game of telephone, in the sense that it's possible you could maybe be making $100,000,000 over a number of years in stock based on appreciation, but he was very skeptical that anyone was getting 100,000,000 on day one without any sort of strings attached, like a true signing bonus. So I don't know how real

Speaker 1:

that Yeah. Remember that the

Speaker 2:

Zurich team

Speaker 1:

came out and said the Zurich team explicitly said we

Speaker 2:

Said they didn't

Speaker 1:

get it.

Speaker 2:

But I know. I know. Yeah.

Speaker 1:

The way that he phrased it,

Speaker 5:

it Could

Speaker 2:

have been more. Clearly forgetting No. No. No. You know, something.

Speaker 2:

No. I do think that the round numbers are easy to latch on to. Like, why is Mamdani talking about billionaires specifically? He doesn't have a problem with 999 millionaires because, like, that's not as easy to grasp. People latch on to round numbers.

Speaker 2:

A 100,000,000 is so much. A billion is so much, and it's easy to quantify. And so the 100,000,000, if it was 80,000,000, it wouldn't be going nearly as viral. So I don't know how real that is. Maybe that was thrown out as a total package, and then it got kind of telephoned into, well, it's basically a signing bonus because they got the deal as soon as they signed, but they have some sort of earn-out or the shares are locked up in some way.

Speaker 2:

But there's a bunch of creative accounting that can go into basically delivering someone what feels like $100,000,000 of value. Which is not far off from, okay, what would it take? You know, Jony, I've got, you know, a huge offer for you to come to OpenAI. And it was structured in an interesting way with, like, an acquisition, and, you know, but he got well,

Speaker 1:

we He's got a few points.

Speaker 2:

Yeah. It was a few percentage points. And when you think about the scale of OpenAI and the scale of Jony Ive's legacy, like, what would it take to get someone like that joining? He's a founding designer, basically, so it's a couple percentage points. So, multiple sources at OpenAI with direct knowledge of the offers confirmed the number to Wired. So I don't know, because it kind of benefits both sides to have that number out there.

Speaker 2:

It's kind of unclear who would who would really wanna confirm that and then what the nature of

Speaker 1:

these things were. The other thing, though, which is interesting, is if you're working at OpenAI, and let's say their top researchers have on average $100,000,000 in OpenAI shares that they're Yeah. Yeah. Like the truly top people.

Speaker 1:

Yeah. And then you get an offer that's, like, a $20,000,000 signing bonus, but it's kind of paid out over time. Mhmm. Or they can claw it back if you don't stay past a certain date. And you're kind of looking at it like, okay, I'm already a centi, and I'm getting an offer to go to a new team that I don't know if I'm gonna like, and it's unclear.

Speaker 1:

The vision is not quite as clear at Meta. Mhmm. Right? I'm joining the super intelligence team. It's gonna be an incredible group of people with a ton of resources.

Speaker 1:

That's exciting. But it is a big switch if you're already happily vesting out tens of millions of dollars. So it would make sense: if it's $20,000,000, yeah, I can see that. If it's $100,000,000 for the top, top, top people, it makes total sense.

Speaker 2:

Yeah. I mean, the deal to bring on Alex Wang was probably way over $100,000,000 to him personally, based on his ownership in that company, which was acquired. And so that number doesn't feel impossible. I would just be surprised if there's not some structure around it, based on what Dylan was talking about.

Speaker 1:

There's just enough precedent here that, again, going back to Yeah. Zuck raiding Google

Speaker 2:

Yeah.

Speaker 1:

In in twenty ten, twenty eleven, he was offering tens of millions of dollars in stock packages to mid level engineers just because he was like, I need a really good ad product. Yeah. I wanna make my ad product great.

Speaker 2:

Yeah. I mean, have you seen the, why am I even asking this, have you seen the BlackBerry movie? Of course not. But in the BlackBerry movie, there's this whole story about how they go and try to poach someone, and then it was controversial because they backdated some stock options to basically give them more cash, but without the initial impact of that cash.

Speaker 2:

It's kinda interesting. But, yeah, I mean, poaching top talent is expensive. It kinda benefits both, maybe. I don't know. Like, on the Meta side, does Zuck want the idea out there that he's willing to pay up to $100,000,000?

Speaker 2:

Maybe, because that certainly leads to more people taking the phone call, like, oh, wow, Mark is really taking this seriously. I should talk to Meta's recruiting team. I should actually hear them out. And then if they wind up coming back to me with a $50,000,000 offer or a $10,000,000 offer, you know, we can discuss that, and they can evaluate how much I'm worth.

Speaker 2:

They can make an offer, but at least it got the conversation started. But I don't know. There was also the theory that Sam Altman put that $100,000,000 number out in order to kind of poison the well. Because if Mark comes to you and says, hey, you're a top AI researcher.

Speaker 2:

Come over here. And then the AI researcher says, yeah, great, I hear $100,000,000 is the going rate. And Mark's like, no.

Speaker 2:

No. No. That's, like, fake news that that Sam was spreading. Like, we're actually paying 15. Then you're like, well, no.

Speaker 2:

Like, I was I was hoping for a 100. But it's one of

Speaker 1:

us both. Machiavellian people and

Speaker 2:

Yeah. Could be 4D chess on both sides. I don't know. Yeah. Over the past month, Meta has been aggressively building out their new AI effort and has repeatedly and mostly unsuccessfully tried to recruit some of our strongest talent with comp-focused packages, Chen wrote on Slack.

Speaker 2:

A source close to the efforts at Meta confirmed the company has been significantly ramping up its research recruiting, with a particular eye toward talent from OpenAI and Google. Anthropic, while also a top rival, is thought to be less of a culture fit at Meta, one source tells Wired. They haven't necessarily expanded the band, but for top talent, the sky's the limit. Which is funny because, yeah, of course, Anthropic's so missionary driven.

Speaker 2:

Like, it's gonna be very hard to go to Anthropic research scientists and be like, ads. Better ads. They're like, we're building God here.

Speaker 7:

Yeah.

Speaker 1:

Anyway. $100,000,000 is a $100,000,000.

Speaker 2:

Not for the Anthropic guys. Anthropic guys are true believers. They don't care. What's $100,000,000 in a post-scarcity world? Doesn't matter.

Speaker 2:

You wanna be on the post-scarcity train. Chen's no

Speaker 1:

It is so telling, though, how you can sort of pinpoint Zuck's views on AI. He's like, I'm not even gonna bother with those people. Anthropic. Not a culture fit. Too AGI-pilled.

Speaker 2:

Yeah. That's great.

Speaker 1:

Senator, we sell ads.

Speaker 2:

Anyway, if you're looking for sales tax superintelligence, go to numeralhq.com. Spend less than five minutes per month on sales tax compliance with superintelligence for sales tax.

Speaker 1:

The official sales tax Superintelligence. Automation provider for Lucy.

Speaker 2:

Yes. That's true. Chen's note included messages from seven other research leaders at the company, where they wrote notes to staffers in an apparent effort to encourage them to stay. One leader on the research team encouraged staff to reach out if they received an offer from Meta. If they pressure you or make ridiculous exploding offers, just tell them to back off.

Speaker 2:

It's not nice to pressure people in potentially the most important decision. Wired is not naming the leader, as they are not a C-suite executive. This is weird. Where's the end of this quote? That's odd.

Speaker 2:

I'd like to be able to talk you through it, and I know all about their offers. The remarks come as OpenAI staff grapple with an intense workload that has many staffers grinding eighty hours a week. OpenAI is largely shutting down next week as the company tries to give employees time to recharge. Let's give it up for OpenAI's grind set. It's really fantastic.

Speaker 2:

Also, extremely American to give a full week off right around July 4. Yeah. Like, I'm normally pretty anti taking time off, but if you're gonna go really hard on a particular holiday, make it July 4. You know? Yeah.

Speaker 2:

Give a week off around July 4. I love it. I was posting, you know, I need a Polymarket on whether or not we get a Zuck American flag video on July 4. Do you think it'll happen?

Speaker 1:

I actually think there 100% should be a Polymarket on that.

Speaker 2:

Right? It's kind of fifty fifty. Right? Like, he is definitely locked in, very busy, probably not a lot of time off. He sees that he knows from this that OpenAI is down this week.

Speaker 2:

So that's probably, like, really encouraging to be like, I wanna go harder this week. They're off. Yeah. My my At at the same time

Speaker 1:

biggest like,

Speaker 2:

pieces of content he

Speaker 1:

puts out. Think about how hard Zuck is going right now. Go harder. It's July 4. And just don't get outworked by him. Yes.

Speaker 1:

Just don't get outworked by Zuck this week. It's gonna be tough.

Speaker 2:

He's in year 20. What excuses do you have? You're you're three months into your startup.

Speaker 1:

Yeah. Come on. Go harder. Go harder. I love it.

Speaker 1:

Yeah. Meta knows we're taking this week to recharge and will take advantage of it to try and pressure you to make decisions fast and in isolation, another leader at the company wrote. If you're feeling that pressure, don't be afraid to reach out. I and Mark are around and wanna support you.

Speaker 2:

While OpenAI's leadership is taking Meta's efforts seriously, Chen also said that the company is getting too caught up in the cadence of regular product launches and in short-term comparison with the competition. The sentiment is backed by a former OpenAI staffer who worked closely with Altman and said the CEO wanted to see buzzy announcements every few months. This was really a key to their strategy, it was like,

Speaker 1:

oh, cool. Buzzy announcement ready if the competition has anything. Just drop a

Speaker 2:

Oh, Google I/O is happening. Wouldn't it be crazy if we acquired a company called io the day before, or something like that? Or, like, oh, Google's announcing the new Gemini. Let's just steamroll them in the timeline. Oh, DeepSeek.

Speaker 1:

I'd love to introduce you deep

Speaker 2:

research. Like, I mean, it's it's masterful. And like but that's the game you gotta play. That's the game on the field. Like, I I I think No.

Speaker 2:

He really is the best. But he's

Speaker 1:

not doing anything

Speaker 7:

wrong. He

Speaker 1:

is up there with Elon in terms of creating and riding hype and constantly delivering enough

Speaker 2:

Yes.

Speaker 1:

Real product value Yes. To justify the hype that that was being sold Yep. You know, six months ago, basically. So, like, staying on the bleeding edge.

Speaker 2:

And this is why the anti tech people hate hate him and Elon so much because they're like they're like, it's overhyped. It's not useful. And then, like, a year later, it's like, there's Tesla cars everywhere. The rockets are landing and, like, 90% of people are eating every day. Tens of billions of vehicles monthly.

Speaker 2:

Okay. So it was real, but also he was overpromising, and he delivered at a higher level than anyone in human history. And those two things are in conflict. Someone said this: Elon is both Thomas Edison and Barnum and Bailey, or whatever, P.T. Barnum. Yeah.

Speaker 2:

He's a circus man who can put on a show and put a human in a suit acting like a robot, but then he can actually go and deliver the thing, which just breaks people's brains. Anyway: we need to remain focused on the real prize of finding ways to turn compute, and a lot more supercomputers are coming online later this year, into intelligence, Chen wrote. This is the main quest, and it's important to remember that skirmishes with Meta are the side quest. Last but not least, I'll be around this week, recharged and ready to go pound for pound.

Speaker 2:

DM me anytime. I love it. He's fired up. Get him in the ring.

Speaker 1:

I love talking with Mark. Yeah. That was a that was a fun conversation.

Speaker 2:

Also, I mean, the guy's in shape. Let's get him in the octagon with Zuck. Mark-on-Mark violence. Let's see it.

Speaker 1:

I mean, if there was a pound for pound list for the top AI researchers in the world, he's in the conversation.

Speaker 2:

He's in the conversation. It's amazing. It's really been amazing to watch Mark's leadership and integrity through this process, especially when he has had to make tough decisions, Altman wrote on Slack in response to Chen's message. Very grateful we have him as our leader.

Speaker 1:

I mean, it can't be overstated how hard making these decisions is. Right? Yeah. One of your best people comes to you and says, hey, I'm really happy here at OpenAI.

Speaker 2:

Yeah.

Speaker 1:

I have $80,000,000 of stock.

Speaker 2:

Yeah.

Speaker 1:

And I was just offered 150 over the next few years, and it's highly liquid. Yep. Meaning that I can market sell it quarterly forever.

Speaker 7:

Yeah.

Speaker 1:

And meanwhile, you're going through this for profit conversion.

Speaker 2:

Do you think the liquidity thing is real at all?

Speaker 1:

I mean, obviously What

Speaker 2:

I mean, let's say you want to buy a $50,000,000 house and you have $100,000,000 in, like, illiquid OpenAI. Is that gonna be hard at all? I feel like Goldman Sachs and Morgan Stanley private wealth management are gonna be beating down your door to give you a loan that's backed by the shares. Sure. You're only gonna have to put down 10%.

Speaker 1:

Sure. But but meanwhile, the company's failing to go through conversion.

Speaker 2:

Yeah. But I mean,

Speaker 1:

Satya's taken 20% off the top.

Speaker 2:

I mean, that applies.

Speaker 1:

No. I'm just I'm just

Speaker 2:

I'm saying it applies like a small haircut, I would say.

Speaker 1:

I'm just saying there there is always

Speaker 2:

Yeah.

Speaker 1:

There is always going to be a tension of somebody having massive wealth but not having greenbacks. Yeah. There's always going to be that tension. Yeah. And people seeing that Meta is one of the most liquid stocks in the world.

Speaker 1:

Yeah. Can constantly trade in and out of it. And it's ultimately just it's stable. Yeah. So I mean, you're

Speaker 2:

interested in liquid stocks, go to public.com. Investing for those who take it seriously. Multi-asset investing. Industry-leading yields.

Speaker 1:

Fashion stocks.

Speaker 2:

Trusted by millions. Who knows? Maybe they'll have OpenAI soon. Anything's apparently

Speaker 1:

Anything's possible.

Speaker 2:

Yeah. We should, you know, Robinhood has the tokens around private companies. Maybe we should tokenize nonprofits. I wanna be able to trade, you know, PETA. You know?

Speaker 2:

Let me go long PETA. Let me get some leverage on that. I want some Leverage please. Levered options on PETA. Look at Tyler right now.

Speaker 1:

Get on the Tyler cam.

Speaker 2:

How how are we doing

Speaker 1:

on Tyler? I

Speaker 3:

think I'm in the right mission.

Speaker 2:

Oh, you did it? No way. Fifteen minutes left. Let's see it. Okay.

Speaker 2:

I can't believe you got it set up so fast. That's amazing. Okay.

Speaker 1:

That's honestly

Speaker 2:

He's sitting at twenty-six minutes so far.

Speaker 1:

Alright. Well, have fun in there, Tyler.

Speaker 2:

I just looked over, and I didn't realize he was on. It's amazing. I think he's actually playing it. This is crazy. Because my number one complaint with the Quest was that it just takes a long time to set up.

Speaker 1:

He intentionally did this challenge and made it seem like, oh, I'm going to give away my new VR Xbox. Yeah. Thinking that there's no way he could finish it in

Speaker 2:

five nights.

Speaker 1:

But now now he's

Speaker 2:

I think he's going to do speedrunning. Speedrunning. Pillar of Autumn. It's happening. Anyway, Swyx has the story.

Speaker 2:

He's breaking it down.

Speaker 1:

Yeah. Some good highlights here. The Mark v. Mark war has resulted in going from, oh, the good people don't leave, to, someone has broken into our home and stolen something. One-week company shutdown, question mark, question mark.

Speaker 1:

I mean, I was going back and forth here. I I wouldn't be surprised if the shutdown was somewhat strategic. Yeah. And that, hey, people already need a break. We shouldn't have them walking around the office.

Speaker 2:

With the way they were writing it, it felt like this had been on the table for months, because the company has been working really hard launching a ton of stuff. Yeah. And it just seems like an odd choice, if you're in the middle of this Mark v. Mark war, that a one-week company shutdown would be what you choose to do. But I don't know. Maybe. I'm just not 100% sure that the AI talent war has resulted in the shutdown.

Speaker 2:

It's more just that the shutdown is happening at a weird time. Anyway, this third point is very interesting: pivoting the product launch calendar to the AGI race, especially when Stargate comes online. Dylan Patel says December is when Stargate's coming online, and that should be the biggest cluster ever and unlock maybe even more pretraining scale, but there's a question about how important that is, and maybe you can do more reinforcement learning in a more distributed way. So I don't know.

Speaker 2:

Also, like, it's not exactly like Zuck is GPU poor. So I don't know I don't know how big of a deal it is, the Stargate thing. We'll see. But number four, for some reason, Meta is ignoring Anthropic. I don't care about safety, if this is true, of course.

Speaker 2:

The Anthropic point was very funny. Anyway, there's other posts in here that we should cover. Satwik Singh says, what Sam never understood was that Zuckerberg torched 20,000,000,000 over VR headsets with Nintendo Wii graphics and didn't even flinch.

Speaker 1:

Didn't even flinch.

Speaker 2:

You know what's so funny? The Nintendo Wii graphics thing, that was, like, three months. And then after that, he was doing that Lex interview in the metaverse. Did you ever see that? Where it's, like, photoreal immediately.

Speaker 1:

We got a wild demo at MetaHQ Yeah. Of all the new products, and we were blown away. I'm not even sure. I don't think we can

Speaker 2:

say much about it, like, in the abstract. But, yeah, I mean, none of that was, like, Nintendo Wii level graphics. But Yeah. But that one picture was, like, iconic, and it felt like, oh, this is kind of a step back in terms of

Speaker 1:

Well, many more people saw that picture than actually used the product. Yeah. And so

Speaker 2:

Yeah. Because there were plenty of VR games that didn't have Nintendo Wii graphics. Like, the hero game for the Quest when it came out was Robo Recall, which looked super sci-fi and just looked awesome. And it was great. And there were a bunch of other games.

Speaker 2:

But, yes, I mean, the point is that Mark is totally ready to spend $20,000,000,000 on a big project if he thinks it's valuable. And here he clearly does, and so he's going after it. Anyway, similar founder-led company that's building amazing AI stuff: Fin, the number one AI agent for customer service. We've had Owen on the show before.

Speaker 2:

He's been on

Speaker 1:

Number one in performance benchmarks. Number one in Bake Offs. Number one

Speaker 2:

on G2. Let's go. So if you're looking to do customer service, head over to Fin.

Speaker 1:

I'm saying this. They're not saying this, but I'm gonna say this is the official AI CX tool of both AI safety and being AGI-pilled. Okay. Because Anthropic uses

Speaker 2:

Oh, really? No way. There we go. Okay. Awesome.

Speaker 1:

So

Speaker 2:

Yeah. I'm sure Anthropic will love that. They're like, we said you could put our logo on your website. We didn't say these the the technology's data is

Speaker 1:

too fast. Could say it's

Speaker 2:

the official They're getting over their skis on this endorsement.

Speaker 1:

The official CX agent.

Speaker 2:

Anyway, Rune had an interesting take here, diving into why these not-really trade deals, these poachings, can be so damaging, and he's trying to put it in broader consumer terms. We're gonna debate whether or not his point is well taken. He, of course, works at OpenAI and has for a number of years. Rune says, intellectual property by default is a market failure. Single, well-informed, talented defectors can walk away from organizations with billions of dollars of tacit value of knowing what works and what doesn't.

Speaker 2:

The athlete metaphor is somewhat wrong. In industries where patents and such aren't viable, market participants will drastically underinvest in R&D efforts due to IP seepage. Just like the "I drink your milkshake" effect of oil seepage and land rights, there is a compelling pro-consumer case for non-competes. Interesting. Very rare that you hear a tech person argue in favor of non-competes. Interesting. I mean, the immediate impact of non-competes in this context would be dramatically lower wages for AI researchers like Rune, which is interesting.

Speaker 2:

The other question is

Speaker 1:

Yeah. It's interesting. You'd have to move AI out of California.

Speaker 2:

Oh, people would then. May maybe. Or they or they

Speaker 1:

I'm just saying, if you really want to use noncompetes Yeah. Yeah. Yeah. Go where they can be enforced. Right?

Speaker 2:

Yes. But then as a

Speaker 1:

historically

Speaker 2:

impossible. As an AI researcher, you'd say, I definitely don't wanna leave California because I'm gonna get a much better

Speaker 1:

school deal here.

Speaker 2:

Yeah. My question is, like, I've always been interested in why, you know, the consumer packaged goods industry can, like, patent a tab on a can for twenty-five years, and Walt Disney can own, like, the shape of mouse ears for, like, eighty years, but, like, Google can't own the transformer for, like, even a minute. Like, I get that it's, like, some sort of fundamental discovery, but I've always been interested in how little IP works in technology. So that's

Speaker 1:

is a byproduct of hacker culture

Speaker 2:

The culture?

Speaker 1:

Sort of like the positive-sum culture of Silicon Valley.

Speaker 2:

Yeah. But you think that would change if there were tens of billions of dollars on the table. Yeah. Right?

Speaker 1:

Yeah. I mean, there's a culture of, you know

Speaker 2:

So I agree with you, and I'll give you an example. So pull to refresh on the iPhone was created by a Twitter employee. Like, a Twitter UX/UI developer built that user interaction, and then Twitter had a patent on it.

Speaker 1:

Loren Brichter. Yes. American software developer Yeah. Best known for creating

Speaker 2:

pull to refresh.

Speaker 1:

Yeah. The pull to refresh interaction.

Speaker 2:

So he created that design pattern, and then there was a patent on it. Twitter owned the patent, and Jack Dorsey and the Twitter crew were libertarian and said, you know, we're not gonna enforce this. And then it got baked into Instagram, and it's now natively in iOS, and it's in all of, you know, it's basically like one

Speaker 1:

button to, like, improve. If you're truly terminally online, you're using it hundreds of times a day. True.

Speaker 2:

Like, little sloppy. Now we've actually moved past that. We've moved into endless scroll, where you actually don't need it. Like, you're on TikTok, you're never pulling to refresh. You just keep scrolling to the bottom. So we're actually post pull to refresh, and now we're in the endless scroll.

Speaker 2:

But endless scroll, I believe, was developed by Pinterest, and they had a patent on it for a while, I think. I might be wrong here. But endless scroll was also something that was not able to be owned, which is just odd to me. And then there's the other question of, like, Coca-Cola has been able to keep the formula as a trade secret for, like, a hundred years or something.

Speaker 2:

And it's just odd that tech companies can't seem to do either. They can't keep the secrets.

Speaker 1:

You can have trade secrets. Right? It's not like an OpenAI researcher can go over and say, here's a bunch of code. Let's roll it out. Right?

Speaker 1:

Yep. It's basically the sort of way of doing things that they've internalized from being at a company

Speaker 2:

Yeah.

Speaker 1:

For a certain amount

Speaker 2:

of value of knowing what works and what doesn't. And so my, like, naive read on this is that if you had had a top-tier OpenAI researcher at Meta during the Llama 4 buildout, that researcher would have said, hey, don't focus on pretraining scale as much. We need to get a reasoning model out, and we need to focus on RL and post-training much more. Yeah.

Speaker 2:

Much more. And when you talked to Dylan Patel, I was asking him about DeepSeek and how DeepSeek was doing all these innovations, like moving to FP8, basically smaller numbers for storing the weights. And he was like, oh, yeah, but OpenAI was doing that, like, two years ago. DeepSeek just open sourced it.
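To make the lower-precision point a bit more concrete, here's a rough, purely illustrative sketch (hypothetical parameter count, and not DeepSeek's or OpenAI's actual setup) of why moving weights from FP32 or FP16 toward FP8 matters so much for memory:

```python
import torch

# Back-of-the-envelope illustration: how much memory the weights alone take
# at different precisions. The parameter count is hypothetical.
n_params = 7_000_000_000  # imagine a 7B-parameter model

for name, dtype in [("fp32", torch.float32), ("fp16", torch.float16)]:
    gb = n_params * torch.finfo(dtype).bits / 8 / 1e9
    print(f"{name}: ~{gb:.0f} GB just to hold the weights")

# FP8 formats (e.g. e4m3) halve the fp16 footprint again, to roughly 7 GB here,
# which is one reason labs push training and inference toward lower precision
# when the hardware supports it.
```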

Speaker 2:

And so it's like the world found out. If you found out the best practices from DeepSeek because they open sourced it, like, you were not on the cutting edge, and you should have been poaching AI researchers from the top labs earlier.

Speaker 1:

So two things could be happening here. Yeah. One, Zuck is right to just go and overpay

Speaker 2:

Yeah.

Speaker 1:

And try to recreate some of the magic and get, you know, a leading model or set of models out of it that he can use in a bunch of different ways that we've discussed. Or none of this matters, and Rune is baiting Zuck into just overpaying. The four-d chess is like, oh, yeah, they're getting everything. They're taking everything.

Speaker 2:

Yeah. Yeah. Yeah. No. No.

Speaker 2:

I don't necessarily believe that, because I did talk to Rune, like, years ago, and I was like, but is this gonna commoditize? Like, is the foundation model layer gonna commoditize? And he was like, the secret to OpenAI in many ways is, like, the Thielian secrets. Like, they understand things at a lower level that other people just don't, and so that will compound and stay an advantage. And he hadn't even predicted the dominance of the consumer and, like, the aggregation theory that happens when you're the front-end consumer AI.

Speaker 2:

That's incredibly valuable. But just on the research side, like, they can stay ahead just by understanding, like, the secrets of what works and what doesn't. But it's interesting. Anyway, let's tell you about Attio, customer relationship magic. Attio is the AI-native CRM that builds, scales, and grows your company to the next level.

Speaker 2:

Next level. Next level. I mean, we have a light guest booking today, but, you know, we're spending an hour talking about one story because it's fun stuff. We can continue. We can go over to the Wall Street Journal, because in the Exchange section, there's a fantastic article related to this.

Speaker 2:

It says, it's known as The List, and it's a secret file of geniuses. And I'm looking at this. It says Misha Belenko, Yu Zhong, Mark Zuckerberg's been reviewing this list. Lucas Beyer is on this, the new guy. It says Jordy Hayes, John Coogan, Tyler Cosgrove, Ben Kohler, a couple other folks.

Speaker 2:

But, you know, it's just the usual people that we know. Anyway, it's known as The List and it's a secret file of geniuses. Only select AI researchers have the skills for the hottest area in tech. Mark Zuckerberg and his rivals wanna hire them even if it costs ungodly sums. We love ungodly sums.

Speaker 2:

We should be ringing the gong more this episode. All over Silicon Valley, the brightest minds in AI are buzzing about The List, a compilation of the most talented researchers and engineers in artificial intelligence that Mark Zuckerberg has spent months putting together. Lucas Beyer works in multimodal vision research, which Kyle broke

Speaker 1:

down for us. Can you imagine if Zuck spun up, like, basically, like, a Hot or Not style tool that just pits two researchers against each other? Yeah. Like, makes other researchers, like, just do the test. And then just puts out a list of, like, these are the top 50 people.

Speaker 2:

Yep. Yeah. This is so good. So Lucas Beyer describes himself as a scientist dedicated to the creation of awesomeness. Yu Zhong's specialist Reddit-coded.

Speaker 2:

But he's a poster on X as well.

Speaker 1:

Yeah. Of

Speaker 2:

course. Yujang specializes in automatic speech recognition and barely has an online presence besides his influential papers. Misha Belenko is an expert in large-scale machine learning who also enjoys hiking and skiing, or, as he puts it on his website, applying hill climbing search and gradient descent algorithms to real-world domains. That's very cute. The recruits on the list typically have PhDs from elite schools like Berkeley and Carnegie Mellon.

Speaker 2:

They also have experience at places like OpenAI in San Francisco or Google DeepMind in London. They're usually in their twenties and thirties, and they all know each other. They spend their days staring at screens to solve the kinds of inscrutable problems that require spectacular amounts of computing power, and their previously obscure talents have never been so highly valued. The chief executives of tech goliaths and heavyweight venture capitalists are cozying up with a few dozen nerdy researchers because their specialized knowledge is the key to cashing in on the

Speaker 1:

artificial intelligence. It is funny. Some of the losers out of this talent war are just VCs, because how do you compete? You know, historically, you could have gone to some of these people and really overpaid to invest. You could have come to them with what would be a very ridiculous offer for a VC, which is, I will spin up a new company. Not ridiculous, it happens, but it is kind of ridiculous in some ways. Spin up a new company, I'll give you 200 on a billion Mhmm.

Speaker 1:

And you can take 10,000,000 personally in secondary

Speaker 2:

Yeah.

Speaker 1:

Immediately. Yep. And that is no longer even that compelling of an offer. No. If the odds of actually winning are low and you just want the most access to compute and be training.

Speaker 2:

Will is in the studio. Welcome back to TBPN. How are you doing?

Speaker 1:

How's it going?

Speaker 2:

Congratulations on the trade deal.

Speaker 1:

So coming back. Yeah.

Speaker 2:

What supercar did you buy? Did you go Bugatti Chiron? Or did you go F80? I mean, we hear that the numbers are huge right now for folks like you.

Speaker 4:

I mean, I'm taking a plane more.

Speaker 2:

So, like Okay. That was I mean, yeah. Yeah. Lot of luck. Yeah.

Speaker 2:

Yeah. Yeah. People say, you know, fly you know, we wanted flying cars. We got

Speaker 1:

The fact that you would imply that Will drives himself

Speaker 2:

Yes. Is offensive. Defensive. I mean Yeah.

Speaker 4:

I took a Waymo for the first time recently. Like, I'd been to SF, but I hadn't been around SF proper that much since the Waymo craze took off. And so, like, that's been the car experience, really. Just like

Speaker 2:

Sure.

Speaker 4:

Self driving car. Stand.

Speaker 2:

Yeah. So are you in SF full time now?

Speaker 4:

Still back and forth New York. I'm, like, kind of slowly starting to spend more time in SF.

Speaker 2:

It's on brand. You're decentralized. You're decentralized with AI. It's it's it's a perfect fit. Perfect fit.

Speaker 2:

Anyway.

Speaker 1:

Did you touch grass at all this weekend, or were you locked into the timeline?

Speaker 4:

A lot of stuff was happening on the timeline. I did, I mean, I'm in New York right now, I did some New York stuff on Saturday. But, yeah. I mean, it's been good.

Speaker 4:

Lots of drama, intrigue as usual.

Speaker 2:

It's a drama.

Speaker 4:

The trade deals. I know, I've been tuning in earlier. I know you guys have gone over all the major roster updates. But, yeah, it's exciting times. OpenAI, open model soon, hopefully, but not this week.

Speaker 4:

Yeah.

Speaker 2:

So, yeah. I mean, the first place I wanna start is the actual trade deals. Did any of these people jump out to you as particularly interesting? Have they been on your radar? Have you met any of these folks?

Speaker 2:

Can you give us more context on how academic they are? How product-oriented are they? It feels like this is purely research, you know, a beefing up of the research team, and Meta kind of has product solved. Is that the right model to think about this? How would you characterize

Speaker 2:

Or even shape of the team that Zuck's building?

Speaker 1:

Even then, there's thousands of really great software engineers that work in AI.

Speaker 8:

Yep. Sure. Sure.

Speaker 1:

But, you know, even kind of getting a sense of, like, how many people was Zuck really going after? Was it 250? Was it a 100? The list that came out today is, I don't know, there's, like, 20 people on it, something like that. Right.

Speaker 2:

Right.

Speaker 4:

Yeah. I mean, it's a great list. It's a lot of very senior researchers who have been through some real product cycles. I think the goal is people who can bridge the gap from research to product. Because, like, one thing Meta has a strength in, but I think doesn't translate in, like, this era, is they have a lot of really good, like, academic researchers. Like, Yann LeCun's whole org has a lot of very talented, very capable researchers who write important papers, but it's the kind of stuff that doesn't really turn into exciting products.

Speaker 4:

It's more betting five or ten years out on how we wanna do things eventually, versus, like, going through a pretraining cycle for a Gemini or a Claude or an OpenAI model. Like, that has a different set of considerations. Sure. And so this to me looks like they want, like, ten, fifteen great people who have been through those research-to-product cycles, in terms of the model being the product. Yeah.

Speaker 4:

And it seems like they probably, like, they got Alex. Yep. And then supposedly Nat and/or Daniel to kind of handle a little more of the, not the business side, but the product-y business stuff.

Speaker 2:

Yeah. It's a management role.

Speaker 4:

Yeah. Yeah.

Speaker 2:

Yeah. What is your take on Yann LeCun right now? How should we think about his position in the AI world? You know, I see these memes of, like, he's a non-believer. He doesn't believe in, like, superintelligence and, like, AI god.

Speaker 2:

He's a little more practical. It seems like he's been right about some things, in the sense that, like, we're not feeling a fast takeoff right now. So I think you gotta give him some credit for that. At the same time, Ben Thompson's been criticizing LeCun a little bit for not driving the product side of the business. It's like, it's one thing to say that, hey.

Speaker 2:

We're not gonna go straight shot to AGI god, but then the downstream implication of that is that if you believe that we're gonna be living in, like, b to b SaaS world of AI implementation and productization, then, like, go do that and, like, make sure that the models work really well for business. And so I don't know. What is your take? How would you describe Yann LeCun and his history, and kind of where he sits in the organization, to, like, a layman?

Speaker 4:

Sure. Yeah. So I think people over-index on thinking of Yann as the Meta guy too much. Sure. I think it's a shame that they didn't necessarily have someone in the Llama org who was as visible as him, because he was not involved with Llama at all.

Speaker 4:

Like, FAIR, Fundamental AI Research, is, like, a whole other group that mostly writes papers and does very academic work. Yann LeCun's work is very academic. And I think, on one hand, he is right in the sense that we don't have the full picture yet. We don't have full-on AGI that can do a human's job forever. And I think that's kind of the drum he's beating, like, there's more work to do.

Speaker 4:

There's more science to do. It's not just productize the current set of techniques Mhmm. in terms of, like, the end of AI forever. But his goal is not to do the productization. Like, the org he's in at Meta is really isolated from product.

Speaker 4:

He's there to kind of boost the reputation of the org academically, as well as to potentially advise on other stuff that's more product-y. But his job is not to, like, go train a business model, to train a model that is going to be used by businesses. And, like, the friends I have who are at that org, FAIR, are there because they essentially wanted to be a professor, but they want tech company resources and not to teach. And that is essentially what that org is.

Speaker 2:

Yeah. Does that lead to better recruiting on campus, essentially? Like, you can go get researchers because Yann LeCun can come do a really powerful lecture.

Speaker 4:

Maybe. But I think of it more as, like, the same way that Zuck did metaverse stuff. Zuck is very willing to make ten year bets.

Speaker 2:

Yeah.

Speaker 4:

And so that org for Meta is not about what products are released next quarter. It's about what their ten-year bets are.

Speaker 2:

Yeah. Okay.

Speaker 1:

I wanna What about, sorry. Yeah. You can go. Meta shares just hit an all-time high. Really? So the stock's performing.

Speaker 1:

I'm curious, from your time at Morgan Stanley. Thank you for that, John.

Speaker 2:

It's incredible. Yeah. That is that is incredible.

Speaker 1:

But generally up 5% over the last week. Wow. How did Wall Street generally sort of process the Meta AI story? Because now, obviously, at least the market seems generally excited. It may get to the point we got to with VR, where people are like, hey.

Speaker 1:

Woah. Woah. Cool it on the metaverse. You know?

Speaker 2:

Well, it's funny because, like, you add all this up. Even if the $100,000,000 offer number is real, it's like, okay, so now we're at, like, $1,000,000,000 in spend. It's 5% of the metaverse hole that was burning. And, you know, from a Wall Street perspective, you're like, I don't care about $1,000,000,000.

Speaker 2:

I care about $20,000,000,000. Because that was what was weighing on the stock during the metaverse build out. But this is like AI is so much more productizable. You can make so much more money from it upfront. And it seems like even though it's a big number, a $100,000,000 offer, it feels like the spend is maybe less.

Speaker 2:

But I don't know. What do you think, Will?

Speaker 4:

Right. Yeah. I mean, I don't know the context fully of the 100,000,000. I know people have all these theories, but the Scale AI numbers we do know.

Speaker 2:

Yeah.

Speaker 4:

And, like, that that's a big chunk of their, like I don't know. They don't do those every day. Yeah. But they've done a handful in the past at that scale. Like WhatsApp was 20,000,000,000 or something like that.

Speaker 4:

And I mean, it's hard to read into like the the charts. Like,

Speaker 6:

I Yeah.

Speaker 4:

I think people are generally expecting that Meta will take AI seriously and are kind of happy to see change. Whether that is justified or not, I mean, we'll see.

Speaker 1:

Yeah. It was interesting George.

Speaker 4:

Product side is gonna be interesting. Because like Yeah. They're not a b to b SaaS company.

Speaker 1:

Yeah. So They're an... company. Yeah. I don't think that people fully understand the potential of Gen AI around entertainment. Like, it gets talked about a lot around, oh, you're gonna be able to generate, you know, an entire movie or generate video games or things like that.

Speaker 1:

And I think we've seen some fantastic examples so far, but nothing that is I think, like, George Hotz was on our show, and he was talking about how, like, basically AI is gonna be like having five CIA agents follow you around all the time convincing you to buy products. And, like, that is, like, one kind of dark bull case for

Speaker 2:

Dark bull case.

Speaker 1:

Dark bull case for Meta in the context of AI, because it's possible that Facebook is already the best advertising product in human history. Yep. Like, period, hands down, there's nothing And then could you make it, like, two, three times Yeah. Better? It's very possible.

Speaker 2:

Yeah. Yeah. I wanna push back on the b to b thing because, yes, they don't sell b to b software, but I was thinking about it in these terms. Like, if you were running a company where you only had one client, and that client was Meta, and you sold them CRM and infrastructure and LLMs to improve the back office and do, you know, censorship and, you know, reality checking and looking for bad words and looking for improper posts and recommendation algorithms. Like, how much would that company be worth, and how much revenue would they be making just from that one client?

Speaker 2:

And I think it's in the billions, because it's just such a large organization that, potentially, just the b to b applications of AI inside of Meta are, like, a multibillion-dollar cost center or revenue driver or something like that. I don't know.

Speaker 4:

I think a company at Meta's scale has already rewritten all that stuff themselves for the most part. Like, they probably have some services Yeah. that they're using, but, like, could they rebuild their own cloud? Like, they used AWS for a while.

Speaker 2:

Yeah. Yeah.

Speaker 4:

Yeah. But they're not a compute company, though. And in terms of, like, platform stuff

Speaker 2:

Yeah.

Speaker 4:

I feel like those companies, once you hit a certain size and you just have so many engineers, you want everything Like, Google rewrites all their own stuff. Yeah. Microsoft rewrites all their own stuff. Mhmm. And is there another, like, tail end of that that has not been done yet, that they can squeeze some more money out of?

Speaker 4:

Entertainment will be interesting. So one thing that I was just reminded of, have you seen the Italian brain rot videos?

Speaker 2:

Oh, yeah. Love them.

Speaker 4:

Yeah. Yeah. So, like, I know it's, like, silly. It's stupid. But there is something there in terms of this, like, communal narrative storytelling with a level of vibrancy that we haven't seen so far.

Speaker 4:

It's like Looney Tunes. But people just, like, come up with these, where you could imagine Meta kind of eating a good chunk of the, like, video gen market, if they have an answer to, like, Veo 3, where that becomes part of the platform. It ties into the whole metaverse thing of, like, creating stories and sharing these stories with your friends. Mhmm. But instead of, like, Instagram stories, it's like, okay, what's the evolution of stories?

Speaker 1:

Yeah. Have you seen Higgsfield? No. Higgsfield AI, I think it's an ex-Snapchat team.

Speaker 5:

Yeah.

Speaker 1:

Their new image model is basically already at a point where it's creating images that

Speaker 2:

Oh, yeah. I've seen this.

Speaker 1:

Photorealistic. But in the sent photorealistic

Speaker 2:

It looks like Instagram photos. Not like

Speaker 1:

A full-time influencer generated it after spending hours trying to create it. And you can't tell that it's not real. So it really feels like Meta integrating that kind of thing, where it's like Yep. It'll be interesting to see how this actually plays out, because everybody will be able to be a super tasteful creator or, like, generate these sort of unique styles. And I'm sure at that point it'll switch. Everybody will be like, well, I only follow organic farm-to-table influencers.

Speaker 1:

Who knows? Yeah.

Speaker 4:

I feel like their AI integrations so far have all been kinda weird. Like, there's these Instagram accounts you can chat with, and then there was the whole Meta AI homepage disaster where old people were leaking personal info without realizing it. But Yeah. I guess Facebook's always had that. Yeah.

Speaker 4:

I think they'll have to thread the needle on product in a way that they'll have to get creative, because they haven't. It's been a while since they had a homegrown product innovation that really stuck. Like

Speaker 1:

Yeah. It is interesting that they haven't tried to create a Studio Ghibli moment by, like, going

Speaker 2:

I was saying this. They should just pre-render everyone's profile picture as a Ghibli. And when you open the Instagram app, it's just like, here, we did this for you. It would be incredibly expensive from an inference cost standpoint, but then you could share it, and there's just more virality around it. But I don't know.

Speaker 2:

And, like, even baking that down into a filter. No. It can't

Speaker 4:

It's so, like, OpenAI branded now. They have

Speaker 1:

to go, like, they're not saying Yeah. I'm saying, like, an entirely new Sure.

Speaker 2:

But yeah. Yeah. Basically, like, the filters in the Instagram app should be, like, style transfer or, like, fully generative instead of just, like, color grades. That's clearly the next thing, and they should just do that. But what do you think of Sam Altman's ability to get through this?

Speaker 2:

Like, he's had a series of exoduses and seemed to continue to march on: the Anthropic departures, then SSI and Thinking Machines, xAI. Like, this is not the first time. First rodeo. It's not his first rodeo. And so there's this whole narrative, like, never bet against Zuck.

Speaker 2:

He's been down on the metaverse, came back. He was down on mobile with the HTML5 thing, and he came back from that, went native, and bought Instagram and WhatsApp, super dominant in mobile. And so never bet against Zuck, but then also maybe never bet against Sam. Maybe they both win, and maybe, like, the real loser is, I don't know, some other company or something. I don't know.

Speaker 4:

I think as long as OpenAI doesn't totally drop the ball on product or research, they have the center of gravity for the AI world. Like, in the same way that no matter what Android ships, people are not gonna switch from their iPhones. Like, something would have to go very wrong for people to not see OpenAI as the winner, I think. They're kind of the default.

Speaker 1:

Yeah. I mean, the other thing is, the average ChatGPT user is not waking up today being like, oh, I can't believe OpenAI lost a handful of its top researchers. Right?

Speaker 8:

They're Right.

Speaker 1:

They're just still using ChatGPT as a Google replacement, or they're talking with it as a companion. So, yeah, I think the lead is still very, very real. And again, it's gonna create opportunities for people to say, hey, I wanna go work on this product that hundreds of millions, and soon billions, of people use every single day.

Speaker 4:

Right. And I do think it'll get to a point, especially with, like, 2025 into '26, where there's a lot of product stuff. Like, I think we're early on products, and we'll keep getting better versions of the same things in a way that's, like, kind of predictable. Like, the image models will keep getting better. The agents will mess up less. But, like, we can already use those things, and we can start building proof of concepts around them.

Speaker 4:

And the question is, like, okay, what are the winning apps? Like, I imagine that, like, the real-time Cluely kind of thing, OpenAI is gonna do their version of that at some point. Other people probably will too. Like, does that become a modality that people really want, like, a live overview on their screen, on their phone?

Speaker 4:

Like, what's the way that people are gonna be interacting with AI generally? Because for a lot of those use cases, like, the models are already smart enough. It's not about making the models, like, way smarter or whatever. It's, like, how do you have the model be useful or fun, engaging?

Speaker 1:

There's a bunch of paths here, going back to the Cluely thing, where one, Roy is, like, right about the UX and, like, wins the market. And the other option is, like, he's right about the UX but doesn't win the market. And then there's, like, you know, it's just not the right form factor or whatever. But at least the first two outcomes are hilarious, in that Roy Lee ushers in the new paradigm for engaging with language models. I'm curious about all the stuff around every b to b SaaS player descending on the sort of, like, single interface, like a chat interface that generates software.

Speaker 1:

Was that predictable to you? Do you think that's do you think that that's like, part of a multiyear trend, or is that just FOMO?

Speaker 4:

I mean, like, Copilot was early. It was, like, 2020, 2021. Like, the first GPT three Copilot came out, and, like, that was already one of the early LLM applications that people were interested in at all. And then it took until Cursor for it to really, like, I think Cursor plus, like, Claude 3.5 Sonnet was when it became a thing that was good enough that people were excited about it. And it really ushered in the trend, because people were starting to find it more useful than a toy, like, a thing that they actually wanna use day to day.

Speaker 4:

And so I think that, like, that's one path. And then the more background-agent kind of things are, like, starting to take off now, which I imagine will get reliable enough that they're, like, useful for cranking stuff out. They already are, kind of, depending on what you're doing.

Speaker 1:

Yeah.

Speaker 4:

Like, I think yeah.

Speaker 1:

We're about halfway through the year. What are kind of, like, the big moments that you're tracking? The OpenAI open source model could be one. Whatever Meta launches next. Right?

Speaker 1:

I'm assuming they're gonna be dark until this new team can really cook and bring something great. Maybe they don't do anything this year, but I would assume they come with something. What else are you tracking?

Speaker 4:

Yeah. I mean, for me, the o three release in ChatGPT was, like, a pretty game-changer kind of thing. We saw with, like, Deep Research that, okay, they kinda figured out how to make agents work, but it was also just, like, this one version of an agent. With o three, you can kinda get it to be a pretty general agent

Speaker 7:

Mhmm.

Speaker 4:

where it can, like, do some pretty complex stuff that was kind of new to see. Like, the GeoGuessr thing was crazy. Yeah. And having that as a vision of what AGI starts to look like, I think, is pretty cool. Of course, from the research and open source world, there was DeepSeek and the RL craze taking off. But I mean, I work on RL, so I obsess over it and, like, think about it a lot.

Speaker 4:

But I do think we're really starting to see these recipes, at least in the broad strokes, of, okay, here's how the LLM thing can go. We figure out what we want it to do. We give it some tools. We set up these environments.

Speaker 4:

We figure out how to evaluate it, and then we can just kinda, like, let it go, and these things get better at doing those things via trial and error. And so, like, I think that is one way to kind of forecast where things are going. It's just, like, what are the plausible use cases that people wanna use LLMs for? They want an agent to do x, y, z. And then how do you make this a thing that you can hill climb?
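A minimal, purely illustrative sketch of the recipe being described here: pick a task, grade it with a verifiable reward, roll the model out, and hill climb. The functions `sample_completion` and `update_policy` are hypothetical stand-ins, not any lab's actual API.

```python
import random

def verifiable_reward(prompt: str, completion: str) -> float:
    """Toy grader: 1.0 if the completion contains the expected answer."""
    expected = {"2+2=": "4", "3*3=": "9"}
    return 1.0 if expected[prompt] in completion else 0.0

def sample_completion(prompt: str) -> str:
    # Stand-in for rolling out the current policy (in practice, heavy LLM inference).
    return random.choice(["4", "9", "I don't know"])

def update_policy(rollouts) -> None:
    # Stand-in for the RL step (e.g. a PPO/GRPO-style update on the graded rollouts).
    mean_reward = sum(r for _, _, r in rollouts) / len(rollouts)
    print(f"update on {len(rollouts)} rollouts, mean reward {mean_reward:.2f}")

prompts = ["2+2=", "3*3="]
for step in range(3):
    rollouts = []
    for p in prompts:
        completion = sample_completion(p)
        rollouts.append((p, completion, verifiable_reward(p, completion)))
    update_policy(rollouts)
```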

Speaker 4:

And I think, like, you see a lot, like, there's been a lot of news about OpenAI trying to, like, sell their RFT service. There's these startups popping up. There was a What's the RFT service? Reinforcement fine-tuning.

Speaker 2:

Okay.

Speaker 4:

Yeah. So, like, they're pushing this pretty

Speaker 2:

This is the one where, if you're spending over $10,000,000, like Palantir, OpenAI will also customize a model for you?

Speaker 4:

So they have, like, a What

Speaker 2:

was the headline?

Speaker 4:

There's a serious, heavy-paying-customer one, but they also have, like, a more, like, in-the-thousands-of-dollars service

Speaker 2:

Sure.

Speaker 4:

Where you are getting to essentially do fine tuning on O four Mini.

Speaker 2:

Yep.

Speaker 4:

And that is a little more self-serve. But they're also, like, doing consulting around that, forward-deployed engineering around that.

Speaker 2:

Is that an important strategy for OpenAI to kind of stay in the game against open source models like Llama?

Speaker 4:

Yeah. But I think also against others. Like, Thinking Machines, there was some report that

Speaker 2:

Yep.

Speaker 4:

this is a version of the strategy they're getting at. It's like, go to enterprises

Speaker 2:

Yep.

Speaker 4:

Talk to them about their problems, turn these problems into things where you can create really good customized agent models.

Speaker 2:

Mhmm.

Speaker 4:

And, like, that's one potential road map towards, like, having more AI everywhere. It's just, like, having services whose job is to, like, come in, whether it's fine-tuning or not, and just, like, make things into agent-shaped tasks where you can then optimize the model and have someone craft the model experience, like, for you. Because, like, I think a lot of enterprises are still in the spot where, as you see with the talent war, there's just not that many people in the world, or in the market, who really understand this stuff at a level where they can go make it happen. Mhmm.

Speaker 4:

And so, like, the open source thing, on one hand, it's cool that you, in theory, can go do it, but it's also like there's not that many hands who are equipped to, like, go make the thing happen.

Speaker 2:

Actually fine-tune Llama 4 Yeah. Or whatever. Exactly. Yeah. Yeah.

Speaker 2:

We actually talked to a startup that's doing basically that, like, small models for specific business use cases. Like, okay, you just have a ton of CSVs that need to be turned into JSON, and we build you an LLM that just does that, or whatever. Yeah. Interesting.
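A toy illustration of the task shape mentioned above, with made-up data: deterministic code already handles clean CSVs, and the pitch for a small fine-tuned model is the messy cases (inconsistent headers, free-text fields) that a rigid version like this can't.

```python
import csv
import io
import json

# Hypothetical input; in practice the value is in handling dirty, inconsistent files.
raw_csv = "company,arr_usd\nAcme,120000\nGlobex,95000\n"

rows = list(csv.DictReader(io.StringIO(raw_csv)))
print(json.dumps(rows, indent=2))
```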

Speaker 1:

How do you see it after the Scale news? There's a bunch of players popping up, you know, everyone from Mercor to Labelbox to Handshake.

Speaker 2:

It's a lot of

Speaker 1:

a lot of different people competing for that market. How do you see that market evolving over the next one to two years? There are certain people saying Scale got out at the perfect time Yep. in many ways. But clearly, there's a lot of at least gross revenue up for grabs right now in the near term.

Speaker 4:

Yeah. I mean, I think, like, the broader sphere of creating stuff to train models on, with humans in the loop to do that curation, is gonna be important for these domain-specific applications. I think it's gonna be less, like, oh, we just need more tokens, and more about we need curation of goals and objectives. Mhmm. Because, like, with tokens, you kinda hit diminishing returns pretty quick in terms of just, like, more Internet text or more, like, human-written math solution examples.

Speaker 4:

It doesn't scale super well. But the nice thing about the task specification is you can kind of, like, pour in more compute without necessarily needing more data. Mhmm. It scales much better with compute. Whereas we don't really have, like, a great way of spending a 100x more compute per pretraining token other than making a giant model.
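Illustrative only: with a fixed task specification (a prompt set plus a grader), you can spend more compute simply by sampling more rollouts, without any new data. The task and numbers here are made up.

```python
import random

def grader(answer: int) -> float:          # the fixed "specification"
    return 1.0 if answer == 42 else 0.0

def rollout() -> int:                       # stand-in for one expensive model sample
    return random.randint(0, 100)

for n in (1, 10, 100, 1000):                # scale compute, hold the data constant
    successes = sum(grader(rollout()) for _ in range(n))
    print(f"{n:>4} rollouts -> {successes:.0f} graded successes")
```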

Speaker 2:

Yep. Do you think Llama will stay open source for the foreseeable future?

Speaker 4:

We'll see. It wouldn't surprise me if they go kind of the Google route, where they still do open source. They still have the Llama brand.

Speaker 2:

Yeah.

Speaker 4:

But it isn't, like, their flagship thing, where they start, whether it's for internal products or because they really wanna show that they're the winner, releasing closed models via API or other

Speaker 8:

services where

Speaker 2:

Yeah. Do they have the infrastructure to serve a closed model? I mean, you mentioned that they used AWS, so it'd be kind of awkward. Right?

Speaker 4:

Yeah. I mean, I don't know that they really want to that much. I think it'd be something that comes in a product.

Speaker 2:

Sure. Yeah. Makes sense.

Speaker 4:

Or maybe they partner with like, you could see them partnering with someone like a CoreWeave

Speaker 2:

Yeah.

Speaker 4:

Or Nvidia directly. Nvidia's hoping to get into the inference serving game, it sounds like.

Speaker 2:

Yeah. So one of Swyx's takeaways from the Wired article was that, like, one big development that's coming is when Stargate comes online, OpenAI will have the largest single cluster for pretraining. But it feels like we might be at the end of that game, and we might be doing more compute-intensive work in a distributed fashion. And so maybe having it all in one place is a little bit less relevant, and it's not just, you know, oh, trump card, Stargate, boom, I win, I have the best model by far.

Speaker 2:

How do you think about the impact of Stargate on, like, the AGI race?

Speaker 4:

I mean, I think the biggest experiment that they're definitely gonna do is take a pretty big model, I don't know if it'll be, like, quite as big as, like, a 4.5, but, like, a big model, and do way more RL on it than anyone has done up to that point. Like, what does an o five level of RL look like?

Speaker 2:

Okay.

Speaker 4:

See how good that is.

Speaker 8:

See what

Speaker 2:

Do you need Stargate to do that, though, or could you do that just across a bunch of data centers? Because I've heard, like, a lot of RL is, like, you're doing verifiable rewards, you're generating in a bunch of different data centers. You could do it completely distributed, and you don't necessarily need a Stargate to do that.

Speaker 4:

You can do it distributed. Like, it is very inference heavy. There is still, like, a lot of weight updating. You have to sync the model across. You have to keep sending the training model to your inference workers.

Speaker 2:

Okay.

Speaker 4:

And so having it colocated certainly makes it easier. It's easier to do RL in a distributed way, if you want to, versus pretraining. Mhmm. But it's not, like, trivial. Mhmm.
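A purely illustrative sketch (not any real framework) of the sync pattern being described: rollouts parallelize easily across machines, but every training step has to ship fresh weights back to the inference workers, which is what makes fully distributed RL non-trivial.

```python
class RolloutWorker:
    def __init__(self) -> None:
        self.weights_version = 0

    def generate(self, prompt: str) -> str:
        # Stand-in for heavy LLM inference against the currently synced weights.
        return f"completion(v{self.weights_version}) for {prompt!r}"

class Trainer:
    def __init__(self, workers) -> None:
        self.workers = workers
        self.version = 0

    def step(self, rollouts) -> None:
        self.version += 1               # stand-in for a gradient update
        for w in self.workers:          # the expensive part at scale: broadcasting weights
            w.weights_version = self.version

workers = [RolloutWorker() for _ in range(4)]   # imagine these spread across data centers
trainer = Trainer(workers)
for step in range(2):
    rollouts = [w.generate("2+2=") for w in workers]
    trainer.step(rollouts)
    print(f"step {step}: synced weights v{trainer.version} to {len(workers)} workers")
```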

Speaker 4:

But I think, like, a lot of this is just, people are building the data centers without necessarily knowing exactly what experiments they're gonna run. They just know more stuff is coming. But also a lot of this is gonna be inference, where, like, Meta has a ton of GPUs that they just use for Instagram ads. Yep. Recommendations.

Speaker 4:

A lot of Stargate will probably be serving Ghibli. Like

Speaker 2:

I mean, that makes a ton of sense. Like, I'm running into rate limits on, like, literally everything. Like Yeah. Google. It's like, how do they run on GPUs?

Speaker 2:

They spend 60,000,000,000 on CapEx every year at least. And o three pro too. I mean, I get timeouts on this stuff.

Speaker 1:

Last question. I'm guessing you weren't surprised to see OpenAI leveraging the TPU for inference, or reportedly starting to use it. Was that kind of

Speaker 4:

I mean, it makes sense that they are considering all options given the friction with Microsoft. Like, they don't have a core cloud that they're super close with. And they need to try to lower their cost for inference as well. Like, that maybe was a factor, because o three had a big price drop that might have been TPU related. I don't know.

Speaker 4:

Pure speculation. But, like, Google has TPUs and they can serve Gemini really cheap and really well, really fast. If OpenAI wants to compete at that level, they kinda have to consider all options.

Speaker 1:

Yep. Makes sense. Well, great day to have you pop on.

Speaker 2:

Yeah. Thanks so much for helping out.

Speaker 1:

Knowing how things are unfolding, we'll probably ping you to jump on again later this week. So stay ready. Good. Great to

Speaker 7:

have you, Will.

Speaker 1:

Hot Topic. Cheers.

Speaker 7:

Yeah. Bye.

Speaker 2:

Talent wars continue to rage on. Sam Altman slams Meta AI's talent poaching spree. Missionaries will beat mercenaries, he says. Quote, what Meta is doing will, in my opinion, lead to very deep cultural problems, said Sam Altman in a leaked memo sent to OpenAI researchers. So big, big question.

Speaker 2:

Paula says, with all the AI labs in SF being in the Mission, who called them members of technical staff and not missionaries? It's funny. Oh, wow. This is an old post, from over six months ago. And there's a bunch of other new details emerging from the new Meta Superintelligence team.

Speaker 2:

Deedy, who's coming on the show, wait, in, like, three minutes? Okay. I was just thinking, wow. Yeah. We've highlighted a lot of his posts.

Speaker 2:

Every single one of the 11 Meta Superintelligence hires is an immigrant who did their undergrad abroad. Seven from China, one from India, one from Australia, one from the UK, one from South Africa. Eight are PhDs or PhD dropouts in the US. Immigration is key to US AI innovation. Beff Jezos has a hilarious post here.

Speaker 2:

The foreshadowing here was insane. And it's Sam Altman interviewing Mark Zuckerberg. And Mark Zuckerberg says, the thing that I think Facebook has done exceptionally well is hiring. This is an incredible series, the Startup School series that Sam did while he was at YC. He interviewed Zuck, Elon, a ton of really interesting folks, and got these really definitive interviews from the top tech leaders of the next generation, like, right as they were at kind of the biggest moment in their come-up.

Speaker 2:

Why are you laughing?

Speaker 1:

It's just I love I love this business.

Speaker 2:

You're, like, tearing up looking at how happy how happy

Speaker 7:

you are.

Speaker 1:

No. It's just, watching this great power conflict play out in real time is wildly entertaining.

Speaker 2:

It's entertaining.

Speaker 1:

And the consumer is going to be the net winner.

Speaker 2:

I was about to say that. And also America. Yeah. This is a pressure cooker of the highest order. It's so competitive.

Speaker 2:

But the good thing is that, at the end, we aren't playing this game where someone goes to jail or, you know, goes to the gulag if they don't perform, or gets locked up. Like, this is, you know, the difference between having $10,000,000,000 or a $100,000,000,000. It's kind of all fake, but it's super high stakes. And it's still capitalism, and, like, the consolation prize is still being able to build something cool even if you get beaten. But the end result is that we have this hypercompetitive race to build the best thing with, like, kind of a safety cushion, so that it's not like the government's putting

Speaker 1:

a gun to your head. It's global. Right? It's the greatest minds from all over the world coming to compete in this market. Yep.