Technology's daily show (formerly the Technology Brothers Podcast). Streaming live on X and YouTube from 11 AM to 2 PM PST, Monday through Friday. Available on X, Apple, Spotify, and YouTube.
You're watching TBPN. Today is Tuesday, 10/28/2025. We are live from GitHub Universe at Fort Mason in San Francisco, and we have a ton of exciting interviews today. We are interviewing the CEO of Microsoft, Satya Nadella.
Speaker 1:We're very excited. We've been on a quest to interview Mag seven CEOs, and we are very excited to sit down with him today. And it's a huge day, because Microsoft just today announced that they have entered the next phase of the partnership between Microsoft and OpenAI. There are, of course, dueling blog posts, one on OpenAI's website, one on Microsoft's website. We will go through some of the Microsoft update to give a little bit of background before we go into our interview with Satya Nadella.
Speaker 1:But first, let me tell you about ramp.com. Time is money, save both. Easy-to-use corporate cards, bill payments, accounting, and a whole lot more, all in one place. Let's go. So this all started back in 2019. Microsoft and OpenAI, it says here, have a shared vision to advance artificial intelligence responsibly and make its benefits broadly accessible. What began as an investment in a research organization has grown into one of the most successful partnerships in our industry, and I think that it might be one of the greatest deals of all time in business history.
Speaker 1:It is a remarkable, remarkable deal. You're right. I was digging through some of the other deals where big tech companies worked with each other or bought stakes in each other, and there are some wild ones that people might not know about. I think it might be worthwhile to go through them. Before we do, let me tell you about Restream.
Speaker 1:One livestream, 30 plus destinations. Multistream, reach your audience. Yeah. There they are. Alright.
Speaker 1:So back in, what was it, 1997, Microsoft bought $150,000,000 of nonvoting Apple stock, which settled some litigation, committed to putting Microsoft Office on the Mac for five years, and made Internet Explorer the default browser on the Mac. And so
Speaker 2:yeah. Slightly before my time. Before I was born, I think.
Speaker 3:But
Speaker 1:They did this deal, but by 2001, Microsoft had converted all of the shares into common stock, netting the company approximately 18,000,000 shares of Apple. And then by 2003, they'd exited the position, which, I don't know if that's a good deal. Maybe they should have diamond-handed it, but it's still a wild, wild moment.
Speaker 2:Yeah. I think, you know, something I'm excited to talk to Satya about is just how much foresight he had Yeah. Whether he was expecting a base hit or he really felt like it'd be a home run. Yeah.
Speaker 1:Yeah. Yeah. It is a very fascinating thing. It's like you're doing a deal with this nonprofit. Sam Altman was obviously a big character; even in 2019, Sam Altman was an important figure in tech, of course.
Speaker 1:But at the same time, I was running the numbers, and I was like, at least today, Microsoft makes, like, a billion dollars in revenue every single day. And so if you think about it, I don't know if you actually think about it this way, but it's your job as the CEO to steward capital.
Speaker 4:Yeah.
Speaker 2:And a billion dollars sounds like a lot.
Speaker 1:Yeah. But you're making a billion dollars every business day. Yeah. Like, there's five business days a week, fifty-two weeks a year. Revenue for Microsoft right now is about a billion dollars a day.
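The back-of-envelope arithmetic here can be sanity-checked; the figures below are illustrative round numbers from the hosts, not Microsoft's reported financials:

```python
# Sanity check on "a billion dollars every business day" (illustrative figures)
daily_revenue = 1_000_000_000          # hosts' round number: ~$1B per business day
business_days_per_year = 5 * 52        # five business days a week, fifty-two weeks a year
implied_annual = daily_revenue * business_days_per_year
print(f"Implied annual revenue: ${implied_annual / 1e9:.0f}B")
```

For scale, Microsoft's reported fiscal 2024 revenue was roughly $245B, so the hosts' round number is in the right ballpark.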
Speaker 1:And so do you treat that deal like it's just another day at Yeah. Microsoft? Or is it something where there's weeks of negotiation? Because you do have a sense that this is gonna be one of the more important deals.
Speaker 1:When you look
Speaker 2:at the check size Yeah. Relative to the capital available, it looks like a flyer Yeah. If you were a VC fund. Right?
Speaker 4:Yeah. Yeah. Yeah.
Speaker 2:And, you know, I think it's fascinating because there are so many, you know, scaled platform VCs that have to look at this announcement Yeah. And see that Microsoft owns 27% of potentially the most consequential company to come out of the 2010s.
Speaker 4:Yeah. Right?
Speaker 1:Yeah. Yeah.
Speaker 2:Yeah. And they just have to look at that announcement.
Speaker 1:Yeah. And I haven't dug in, but, you know, I've seen there's a little bit of sour grapes from the venture capital community. Saying we didn't get enough
Speaker 4:of Yeah.
Speaker 2:If you look at the risk-reward, the amount of capital that the seed investors deployed into the company relative to their ownership today, certainly they made a great return Yeah. On paper. But did they actually make a great return relative to the risk of investing in a company that went against every YC practice there is? Right? Really? Like, there are so many videos of YC partners saying, don't reinvent the wheel.
Speaker 2:Let me do
Speaker 1:reinvent the wheel. No.
Speaker 2:And ultimately, I mean, this has led to, like, so much of the, you know, as ChatGPT Yeah. Has exploded, the chaos around the company has almost entirely centered around the corporate structure. Yeah.
Speaker 2:I wonder, you know, this may be, you know, the final company for Sam, right, but I wonder what he would do next time around.
Speaker 1:I mean, we have run the experiment. Right? Because he has a bunch more companies. Most of them, I think, are pretty clean C corps. But then again, maybe not. Worldcoin has a token and stuff.
Speaker 1:Like, there are multiple things going on. But Yeah. I think, probably, if we dug into his BCI company, we would see a cleaner C corp.
Speaker 2:Yeah. And I'm excited, in the fullness of time, when we get the Yeah. When we get the books and the documentaries on both OpenAI and this investment Yeah. I can't wait to see and try to understand what OpenAI was really facing at that moment when they did this series of deals with Microsoft. Right?
Speaker 2:Because the ultimate deal of, you know, selling such a large amount of the company with a rev share attached, which The rev share is wild. Ask, you know, a YC partner: if one of the companies in their group came to them and said, yeah, I have this investor, it's a large tech company, they wanna buy, like, a lot of the company, and they want a 20% rev share for, like, the next ten years.
Speaker 2:Yeah. They'd be like, you need to walk away from that deal immediately. Totally. But Sam and Satya did it, and here they are.
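To make the rev-share point concrete, here is a minimal sketch of what a 20% revenue share over ten years could cost; the revenue trajectory is entirely made up for illustration and is not OpenAI's actual revenue:

```python
# Hypothetical: a 20% revenue share on a fast-growing company's revenue
rev_share_rate = 0.20
annual_revenue = [1, 2, 4, 10, 20, 35, 55, 80, 110, 150]  # ten made-up years, in $B
payments = [rev_share_rate * r for r in annual_revenue]
print(f"Total revenue over ten years: ${sum(annual_revenue):.0f}B")
print(f"Total paid to the investor:   ${sum(payments):.1f}B")
```

The faster the company grows, the more that fixed percentage costs in absolute terms, which is exactly why a seed-stage advisor would balk at it.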
Speaker 1:How about two on 20? Safe note.
Speaker 4:Yeah. The good old-fashioned way.
Speaker 1:Good old-fashioned. Why are we reinventing the wheel? Yeah. There are a couple other interesting cross-industry deals. Jeff Bezos famously had a big stake in Google.
Speaker 1:Yep. Right? Was he an angel investor? There's also the time that Intel, TSMC, and Samsung came together to invest in ASML, which, of course, makes the lithography machines, kind of pulling forward the initial, like, weird, you know, circular deal that people point to.
Speaker 2:But it
Speaker 1:worked out. But that one worked out for sure. And, yeah, it's interesting seeing the evolution of this deal in particular. Some quick history.
Speaker 1:07/22/2019, Microsoft invests $1,000,000,000 in OpenAI. Azure is named the exclusive cloud provider. Microsoft is named the preferred commercialization partner. Yep. In 2020, Microsoft receives an exclusive GPT-3 license for its products and services.
Speaker 2:And the foresight the the foresight here
Speaker 5:is
Speaker 2:just from Satya is incredible. Like, at the time, only a couple years prior, Elon was basically walking away. Sure, there was material progress internally, but to have that level of Yeah. Of understanding, and that much conviction to invest a billion dollars when you're years out
Speaker 1:it's more complicated than that, I think. Because there are plenty of big tech CEOs who have taken $1,000,000,000 flyers on crazy ideas. We see that all the time with, you know, oh, you wanna build some new hardware thing. How much did Apple spend on the car? They probably spent a billion dollars working on that car, and, like, the risk-adjusted bet made sense, but then ultimately they pulled back from it. And that's happened probably all over the place, and Satya himself probably has other times when he's put down a big investment for something that was risky and it didn't pan out. Yeah.
Speaker 1:What is interesting about the OpenAI deal is that I know investors personally who were looking at the deal before that
Speaker 2:Yeah.
Speaker 1:And they couldn't get over the complicated structure. Yeah. And so they dipped out because of that. And so it's not like, oh, wow, we need to give a round of applause to someone helming a trillion-dollar company for writing a billion-dollar check.
Speaker 1:Like, that's not that crazy. That happens all the time. Yeah. What is crazy is to get over all the lawyers being like, you're doing what? And how is it structured?
Speaker 1:What are you talking about? There's a nonprofit involved? Why are we doing this? Exactly. Exactly.
Speaker 1:And so, but knowing that we do live in a society where, if value is created, if a new platform emerges, everyone
Speaker 2:can overcome any chaos and all the craziness. Complexity. Yeah.
Speaker 1:And so in 2021, Microsoft followed on with more investment, and then the Azure OpenAI Service went generally available in 2023. Before we move on, let me tell you about Privy, wallet infrastructure for every bank. Privy makes it easy to build on crypto rails, so whether you're a fintech or a bank looking to integrate on-chain and in crypto, it's all behind
Speaker 2:one simple API. And the reason that I say this is so notable, that this conviction really matters, is: think of how slow other hyperscalers were, even years after the sort of ChatGPT moment. Granted, everything we've talked about so far predates ChatGPT.
Speaker 2:Yeah. Right? And years after this ChatGPT moment, we still have people that are only now coming around and saying, like, we're ramping up CapEx. Right? And so I just think that Satya was incredibly ahead of the curve here.
Speaker 1:Yeah. It's easy to look at the deal through the lens of, oh, well, Copilot was the glimpse of value creation out of what is effectively a nonprofit academic lab. But it's important to remember, Copilot happened three years after. Yeah.
Speaker 1:They had Copilot launch two or three years
Speaker 5:2022.
Speaker 1:Yeah. Two or three years fully after that initial $1,000,000,000 investment. Yeah. So, yeah, remarkable, remarkable progress. Let's go back to the Microsoft announcement today.
Speaker 2:We can go through some of the key points.
Speaker 1:Details on what's evolved.
Speaker 4:Do you wanna read
Speaker 2:some? Yeah. So what has evolved? Once AGI is declared by OpenAI, that declaration will now be verified by an independent expert panel. So I'm assuming they're gonna get Joe Rogan Yes. Andrew Huberman
Speaker 5:Yes.
Speaker 2:Lex Fridman I would love to, and get, you know, a panel of podcasters to decide this. Great question. This is a question that I wanna dig into with Satya in just a few minutes. Like, you know, few folks can agree on the definition of AGI; it's very much in flux.
Speaker 2:Right? Tyler Cowen was on our show a few months ago saying that he felt AGI had already been achieved Yes. That we keep moving the goalposts. Yes. Others believe that we're in this era of spiky intelligence, and we need, sort of, you know, broader intelligence before we can get to true general intelligence.
Speaker 2:But going forward, Microsoft's IP rights for both models and products are extended through 2032 Mhmm. And now include models post-AGI, with appropriate safety guardrails. That feels significant. Microsoft's IP rights to research, defined as the confidential methods used in the development of models and systems, will remain until either the expert panel verifies AGI or through 2030, whichever is first. Research IP includes, for example, models intended for internal deployment or research only.
Speaker 2:Beyond that, research IP does not include model architecture, model weights, inference code, fine-tuning code, or any IP related to data center hardware or software, and Microsoft retains these non-research rights. Microsoft's IP rights now exclude OpenAI's consumer hardware. Okay. That's notable. They need to start figuring out carve-outs.
Speaker 2:Yes. And OpenAI can now jointly develop some products with third parties. API products developed with third parties will be exclusive to Azure. Non-API products may be served on any cloud provider. So, again, Satya,
Speaker 1:If you're just joining us, Satya Nadella will be joining us in five minutes to break all of this down live on TBPN. Right now, we are setting the table with some analysis and looking through the details of the story that emerged today.
Speaker 2:Yep. So Microsoft can now independently pursue AGI alone or in partnership with third parties. If Microsoft uses OpenAI's IP to develop AGI prior to AGI being declared, the models will be subject to compute thresholds. Those thresholds are significantly larger than the size of systems used to train leading models today. The revenue share agreement remains until the expert panel verifies AGI.
Speaker 2:The payments will be made over a longer period of time. OpenAI has contracted to purchase an incremental $250,000,000,000 of Azure services, and Microsoft will no longer have a right of first refusal to be OpenAI's compute partner. Again, that $250,000,000,000 number is, you know, certainly not the biggest number we've heard. Yeah. But a quarter of a trillion is nothing to scoff at.
Speaker 2:Yep. And OpenAI can now provide API access to US government national security customers regardless of the cloud provider. And finally, OpenAI is now able to release open-weight models that meet requisite capability criteria. Yeah. So, again, I feel like on a number of these points, they are kicking the can down the road a little bit Sure.
Speaker 2:Again. Obviously, this was important to complete the conversion from the LLC to the public benefit corporation, which presumably can go public. Yep. But again, my question and my immediate thought is how many of these things are gonna be critical to iron out before the IPO. Is there gonna be enough Yeah.
Speaker 2:Demand that it doesn't matter, in the same way that certain investors, you know, our friend Josh over at Thrive and others, had incredible conviction to be, you know, deploying again and again and again into OpenAI's for-profit subsidiary, even when there was so much uncertainty around the structure. Right?
Speaker 1:Yeah. Big open question is how Microsoft's internal AI research efforts evolve now that this is a little bit more concrete. Will be very interesting to see.
Speaker 2:Yeah. This I mean If Microsoft uses OpenAI's IP to develop
Speaker 1:Oh, AGI.
Speaker 2:And then they're Prior to AGI being declared.
Speaker 1:He'd love that.
Speaker 2:OpenAI's dribbling towards the basket. Yeah. Satya comes in with,
Speaker 1:who knows? Who knows? If you're just joining, we'll be live with Satya Nadella in one minute and seventeen seconds There we go. According to our timer. In the meantime, let me tell you about Cognition.
Speaker 1:They're the makers of Devin, the AI software engineer. Crush your backlog with your personal AI engineering team.
Speaker 2:Again, there are so many of these points that leave kind of open Totally. Open questions. They'll need to be effectively renegotiated again and again down the road. But at least this provides a pathway, and it's no longer the elephant in the room.
Speaker 1:Yeah. It does feel like the cap table is getting slightly cleaner. Yeah. And moving towards something where, I mean, if you look at the history of the Microsoft deal with Apple, they had a position. They eventually rotated out of that, sold out of that, because Yeah.
Speaker 1:There's this question of, you know, if you're the CEO of Microsoft, you're Satya Nadella, should you be a venture capitalist as well? Like, oftentimes, big tech companies do make minority investments. Yeah. Sometimes they make whole-co acquisitions. But is that the primary business?
Speaker 2:Yeah. I mean, ultimately, this comes down to feeling like potentially one of the greatest corporate venture investments of all time. And I'm not coming up with any that are
Speaker 1:No.
Speaker 2:That are better. It's pretty good. Of course. In terms of not just owning a massive piece Yeah. Of a generational company Yeah.
Speaker 2:And potentially what looks like a future, you know, hyperscaler Yeah. But also giving your business just this incredible strategic advantage in the race, broadly. So
Speaker 1:Yeah. To get any more impactful, you need to move over into the whole-co acquisition world. You have to talk about Instagram. But even that is tough. And it's a very different deal structure.
Speaker 1:Yeah. And something that is just down the fairway for the entire company, the entire product, as opposed to, make this bizarro minority investment and then Yeah. Grow from there.
Speaker 2:Yeah. It's worth noting too that OpenAI and Microsoft Office are, you know, already on a collision course. Right? Like, you can imagine that over time, these products, which, you know, overlap today Yeah.
Speaker 2:You can use Copilot for a lot of things that you can use ChatGPT for. That's only gonna become more pronounced. There's still gonna be this, like, massive tension there.
Speaker 4:Yep.
Speaker 2:And we'll be covering it a lot.
Speaker 1:Well, let me tell you about figma.com. Think bigger, build faster. Figma helps design and development teams build great products together. And I believe we're ready for our first guest of the show, Satya Nadella, CEO of Microsoft. Welcome to the show, Satya.
Speaker 1:Great to see you. How are you? Great to see you. Thank you so much for doing this. There were a ton of bullet points in the announcement today. Can you just zoom out and explain it to me like I'm five?
Speaker 1:What actually happened? What actually changed? Because you've been in partnership with OpenAI for six years now, but this feels like an important moment. What happened?
Speaker 6:Yeah. Look. First of all, you know, you said it right. It's an important moment, and the story continues.
Speaker 7:Sure. Yes.
Speaker 6:But the story actually got started earlier, even the OpenAI one. I've known Sam for a long time, since his first company, Loopt, in the pre-YC days. All the way back then. Wow.
Speaker 6:That's right. That's right.
Speaker 1:I remember him being at WWDC presenting it in the double polo. It's iconic. But I didn't realize that you were in business with him back then.
Speaker 6:And it started, I think, in 2016. In fact, we were. Azure was the first cloud provider
Speaker 1:That's right.
Speaker 6:When OpenAI got started. Yeah. In fact, I think Elon sent me the mail asking for Azure credits. So that's how it got started.
Speaker 2:Hey, I have this nonprofit.
Speaker 6:Yeah. Yeah. Come on. So that's That's right. And they were obviously into reinforcement learning.
Speaker 6:They were doing DOTA and all of that stuff. And then at some point, I think they went off to other clouds. And so I lost touch for a while. Oh, yeah.
Speaker 6:And then, I think in 2019, Sam came and talked about, hey, we really think this scaling stuff works. I forget now, it's a little hazy, when I read the paper. In fact, the paper, the scaling laws paper, was written by Dario, Ilya.
Speaker 6:And the thing that Microsoft has been obsessed with, since Bill started Microsoft Research in '95, is natural language. Yeah. It's just, you know, been the thing. We are an office company. We are a knowledge work company.
Speaker 6:And so we always thought about text, and AI as applied to text and natural language. So you could say it was the prepared mind when Sam said, hey, we're gonna go take a run at this, do you want to be in on it? That's sort of what led to really coming together on this.
Speaker 6:Yeah.
Speaker 7:It was a
Speaker 6:research lab. It is a nonprofit.
Speaker 1:And as opposed to if they had stayed on the previous tech tree path, where they were doing some
Speaker 5:How are
Speaker 1:good old robotics and some video game stuff, DOTA 2. That doesn't jump out to you as immediately relevant or interesting?
Speaker 6:you bring that up because, obviously, RL has come back in a big way in relation to these large language models. But, yeah, this is the funky, path-dependent way things happen. Right? Because I don't think I would have gone in full on to say, hey, let's go partner with these guys and build a computer that scales, if it's not natural language. Yeah.
Speaker 6:I'm glad we started there, and now RL is improving the quality of these models. For sure.
Speaker 1:How big of a deal was writing a $1,000,000,000 check back then? I mean, it's a big company, Microsoft. We were saying its revenue is around a billion dollars a business day. Was it one day of work for you? Or was it, you know, weeks of serious negotiation?
Speaker 1:Did you build memos? Did you build Excel sheets? What were you thinking?
Speaker 6:Even at Microsoft, you kinda have to get board approval Oh god. To just go throw a billion dollars out there. But, you know, I must say it was not that hard to convince Yeah. Anyone that this is an important area, and it's gonna be risky.
Speaker 6:Like, I mean, in retrospect, who would have thought? I didn't put in that, you know, billion dollars saying, oh, yeah, this is gonna be a, what, 100-bagger. Yeah. I mean, that's not what was going through our head, because this is a partnership that, I mean, by the way, remember, this is a nonprofit.
Speaker 6:Right? And Yeah. I think, you know, Bill even said, yeah, you're gonna burn this billion dollars. Yeah.
Speaker 6:That's right. And, yeah, we had a little bit of high risk tolerance. Don't worry. Yeah. And we said we wanna go give this a shot.
Speaker 6:Yeah. And then, of course, subsequently, you know, here we are at GitHub Universe. Yeah. In fact, this is probably the place where that billion became 10,000,000,000, because Yeah. In '21 Mhmm.
Speaker 6:Is when I first saw GitHub Copilot. Sure. And I said, man, this is what?
Speaker 2:This is a year before the release.
Speaker 6:No. Actually, in fact, I was fact-checking my thing. I think GitHub Copilot launched in '21, ChatGPT in '22.
Speaker 8:Sure.
Speaker 6:And if I remember right, '23 is the blip, the November blip with OpenAI, and then everything has been smooth since then. The blip.
Speaker 4:It's a blip.
Speaker 1:The blip. Nicest way you could put that. That's fantastic. Yeah. So that makes a ton of sense.
Speaker 1:Obviously, it's been a wild ride up and down. You started with just natural language, let's predict the next word. Now it's, let's rewrite the entire global economy. How do you think about the territory that you at Microsoft have kind of claimed, and what do you wanna hold on to?
Speaker 1:What's most important to Yeah. Map out where the edges of Microsoft's territory are, and then where founders and other business people can build in partnership with you.
Speaker 6:Yeah. So, look, I always say Microsoft's, you know, a platform company and a partner company. We define platforms as where the value captured around the platform is higher
Speaker 9:Yeah.
Speaker 6:than the value captured by the platform. That's kinda who we are, and, you know, GitHub is a great example. So if you think about even GitHub Universe today Yeah. It's interesting. Right?
Speaker 6:As you said, we first started by saying, hey, code completions. Yeah. Then we said, let's chat. Right?
Speaker 6:Instead of getting distracted, stay in the flow of coding, you bring, the information to the flow, and that chat became the thing. Then we said agent mode. Yeah. Then we said, hey, let's have autonomous agents.
Speaker 1:Yeah. Yep.
Speaker 6:And now we have multiple autonomous agents working across all these different branches, bringing the PRs to me. And so this entire conference is about what we call Agent HQ Sure. And mission control, where you have Codex, you have Claude, you have Grok, every model you want Yeah. Each working across their own branches. Then you have the IDE, so you can bring up VS Code, where you can do the diff on each branch's output, and so the story goes on.
Speaker 6:So, therefore, to me, building a system Yeah. That really brings the innovation across the ecosystem into some kind of an organizing layer is what platform companies do well.
Speaker 1:Have you seen anyone here that you think might be working on AGI? Do you have a personal definition for AGI?
Speaker 2:Yeah. If you look at pretty much all the deal points, it keeps coming back to this moment when AGI will be declared. Right? There'll be a panel of experts. Maybe that panel is still being decided, but a lot of experts today have differing definitions.
Speaker 2:So I wanted to get a better sense of how you imagine that kind of decision-making process will go when the time comes.
Speaker 6:We'll put you guys
Speaker 7:on the panel. Yeah.
Speaker 1:We've been doing evaluations on AGI, specifically around comedy. Yeah.
Speaker 6:Fun. We'll let you go do that. A comedy eval. To me, I think, first of all, one of the reasons why, quite frankly, both Sam and I agree on this is that it's become a bit of a nonsensical word. I mean, it's just changing, and everybody defines it differently. Yes.
Speaker 6:And we now know what the issue is. Right? We know everybody describes even the intelligence we have, which has been exceptional Yeah. As jagged. Yep.
Speaker 6:Yeah. Spiky. Yeah. Or spiky intelligence. Alright.
Speaker 6:And so if you say, well, we have spiky intelligence. And in fact, I think Andrej Karpathy's point in one of the podcasts recently, which is a good one, is that even if you're having, let's call it, exponential growth in one of the spikes Sure. It's not as if the jaggedness is getting worked out. Yeah. That's the nines problem.
Speaker 6:Yeah. Right? That is, each nine is maybe a linear or even sublinear problem Yep. Right, in terms of rate of progress. So the first step to me is even to get to broad intelligence.
Speaker 6:Forget general intelligence. Yeah. We've gotta get rid of these jagged problems, and that, I think, is the first place to start. So if you ask me, I think what may happen is we will achieve more robustness, let's call it that Yeah. For different systems.
Speaker 6:Right? So coding is a good one. Yep. I think the entire goal with GitHub, Mission Control, and Agent HQ is: just like how I use compilers Yeah. Can I use agents to generate better coding artifacts?
Speaker 6:Yeah. Right? I mean, I sometimes think vibe coding is a sort of slightly unfortunate term, because it does lead to a lot of slop. Yep. Right?
Speaker 6:I mean, it's kinda like, I'm sure you code away, and then you lose control of the project, and then you gotta put everything back into a markdown. And, you know, so Yeah.
Speaker 2:And even traditional knowledge work that's happening in the Office suite, it's not like you want the biggest Excel model. Right? You want the one that
Speaker 6:But even Excel is a classic one. In fact, one of the other things that's happened, even with M365, Microsoft 365 Copilot: man, just like the number of repos on GitHub is exploding right now Yeah. Everybody's generating PowerPoint slide decks and Excel models. Yeah.
Speaker 6:The problem with Excel models is, you know, knowing when intelligence has created a good one. A good Excel model is like a thing of beauty: the assumptions are clear, the formulas are there, even the formatting and all this stuff. And you can iterate on it.
Speaker 1:You can change it. It tells a story.
Speaker 6:It's not like a one-shot. If I wanna change something, I can't go back and zero-shot the entire thing.
Speaker 4:So in
Speaker 6:fact, the agent mode in Excel, which I like, understands Office.js. It Yeah. Puts in the formulas, and I can then iterate like I iterate with GitHub Copilot. So if you ask me how you first get rid of this jagged intelligence problem: you build a great knowledge work system that is multi-agent, multi-model, multi-form-factor, and get to a great benchmark and eval where you can trust it at two nines, three nines, four nines.
Speaker 6:And until you achieve that, you're not gonna be able to say, hey, we have anything like general intelligence.
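Satya's "nines" point can be illustrated with a toy reliability calculation: if each step of a multi-step agent workflow succeeds independently with some probability, errors compound multiplicatively. The step count and probabilities below are made-up assumptions for illustration.

```python
# Toy model: why per-step reliability ("nines") dominates multi-step agent tasks.
# Assumes independent steps; all numbers are illustrative.
def chain_success(per_step_reliability: float, steps: int) -> float:
    """Probability that every one of `steps` independent steps succeeds."""
    return per_step_reliability ** steps

for label, p in [("one nine (90%)", 0.90), ("two nines (99%)", 0.99), ("three nines (99.9%)", 0.999)]:
    print(f"{label}: a 20-step task succeeds {chain_success(p, 20):.1%} of the time")
```

At 90% per step, a 20-step task completes only about 12% of the time; even at 99% it fails roughly one run in five, which is why each additional nine matters so much.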
Speaker 1:Yeah. It feels like there was a lot of uncertainty in the tech community around, is all this superintelligence gonna come out of the lab tomorrow, is there gonna be this fast takeoff. Now it feels like there's more opportunity, both for Microsoft to add those nines to products that you have, and also for entrepreneurs who are building products, maybe on top of Microsoft. How are you thinking about the entrepreneurial opportunity in the future?
Speaker 6:good point. Because at some level, if you buy the argument I made, there is a lot more invention to happen. Yeah. By the way, the other thing that we should also talk about, I always say this too. Right?
Speaker 6:Right? Today, all the conventional wisdom is, oh, intelligence is just a simple, straightforward log of compute. So throw more compute Yeah. And
Speaker 2:More energy.
Speaker 6:Who the heck knows, man? One of the researchers comes out of here and says, you know what? I got it. It requires a fraction of the compute.
Speaker 1:Yeah. Yeah.
Speaker 6:In a way none of us have thought about. Like, that's a game changer. Yep. And, oh, by the way, we're all, like, excited about reinforcement learning. Yeah.
Speaker 6:Guess what? Pretraining is a more efficient form of training Yeah. Because you can amortize it. So I think pretraining will have new breakthroughs. Midtraining will have new breakthroughs.
Speaker 6:RL will continue to improve. We will then have to add more innovation to it. And by the way, this is another part of this partnership, which is I'm glad, you know, OpenAI is continuing to do great work. Jakub, Mark, others are great, and we'll partner with them, and we'll continue to do so. And Mustafa has built a world class team.
Speaker 6:Right? You know, Karén, Amar, Nando, these are I mean, we have now three cool Yeah. Models, whether it's speech or image or text, and we're gonna continually have it. So we'll ride our own horses as well.
Speaker 1:Yeah. How are you thinking about the interplay between OpenAI and what you do internally at Microsoft? Are there certain things you can take your foot off the gas on because you're like, actually, OpenAI's got that handled? Or do you want a duopoly? Like, actually, we're gonna fight it out on everything.
Speaker 6:I'm much more like, again, my my mindset is all platform, man. Like, okay, on Azure, do you run Windows? Yeah. Do you run Linux? Yeah.
Speaker 6:Yeah. You run SQL Server? Yeah. Do you love Postgres? Absolutely.
Speaker 6:Sure. .Net Java. Yeah. Yeah. Hey.
Speaker 6:I'm happy with OpenAI. I would love to have Anthropic, MAI, anyone. If Google wants to put Gemini on Azure, please do so.
Speaker 1:What is that like culturally? Like, what does it mean for the next Satya Nadella, somebody who's working their way up at Microsoft? Do they need to be okay with, I'm building something internally, but my company isn't gonna favor me; I need to fight it out with all my competitors across
Speaker 6:We all grew up in that culture Yeah. It doesn't mean we won't bring our pieces together.
Speaker 4:Sure.
Speaker 6:We are going to innovate across these seams. Yeah. But as a platform company, you kinda wanna support everything. Sure. And, like, most people don't know Office was born on the Mac Mhmm.
Speaker 6:Before Windows was
Speaker 2:even Yeah. If you don't if you don't give people choice, developers here, like, will churn. Right? They'll find other platforms. Right?
Speaker 6:The idea Bill had when he started Microsoft was, hey, we're a software factory. We love all software categories, and we're just gonna go create software. And so to me, we definitely wanna sort of have that same attitude to innovation. We definitely need to stitch our stuff together so that it comes together to solve bigger and greater problems, but that doesn't mean we can't create opportunities.
Speaker 6:And the other thing I grew up with, like, for example, you know, was building SQL Server with SAP. And so we've always partnered. Or Intel and Microsoft. Right? I mean, we owned the PC industry together, but, you know, it's called the Grove-Gates model. Yeah.
Speaker 6:That's a good model to create value.
Speaker 1:Do you think that there's increasing returns? This is gonna sound like a loaded question, but I I promise you it's not. There's increasing returns right now to being a deals guy or or innovating on the on the deal structuring side. And what I mean by that is, there's there's all these difficult problems to solve with energy and data centers, and it feels like we there's innovation in tech that we normally think of as, like, the code or the algorithm or the design of the system. But then there's also this difficulty sometimes to just marshal the resources.
Speaker 1:And is that, like, a new phenomenon? Has that always been true? If somebody's pursuing a career in tech, is becoming a great deals guy or deal maker, like, an important path now?
Speaker 6:Yeah. I was just thinking about it, man. Like, yeah, you have this great investment, and it has great return, and no carry. All the value goes to my shareholders. That's awesome. Like, we should start a venture for
Speaker 2:You might you might be well there.
Speaker 6:Yeah. Well, see, I think the thing you're touching on is something that platform companies should actually think about, which is, what's the ecosystem upstream. And downstream. Yeah. Right?
Speaker 6:To your point, right now, we have to solve this as an industry. Like, I mean, the reality is, let's take power. Right? If you sort of say intelligence is about tokens per dollar per watt, we gotta get efficient on all of it. Yeah.
Speaker 6:In order to get more efficient on it, you gotta really think about, even in our own industry, the token factory itself getting better by an order of magnitude. This is, like, again, a renaissance time for systems Yeah. Architecture. And so, obviously, NVIDIA is doing great work. AMD is doing stuff.
Speaker 6:Broadcom's doing stuff. All of us are doing great work to just push that. Yep. Then the next barrier is gonna be, man, can we generate energy faster? Can we build faster?
Speaker 6:Can we build the cooling? I mean, I now know more about campus cooling
Speaker 1:Yeah.
Speaker 6:Systems than I ever thought I'd know. Right? I mean and these are all choke points.
Speaker 1:How much of that do you want to live within Microsoft, versus you just wanna be a buyer, and all the different power players are out there building nuclear, wind, solar, and you're dealing with it at a higher level of infrastructure?
Speaker 6:The vast majority of this infrastructure now Yeah. You know, if you think back on it, right, our data center builds: mostly we built, and we leased some. Mhmm. Because no one was in the business of building at the scale at which we were building. But now, I think there's gonna be opportunities for us to lease.
Speaker 6:Yeah. And there's going to be significant competition amongst builders, so therefore, the lease prices also
Speaker 1:Yeah.
Speaker 2:Do you think you're more ROI focused than others that are throwing around big numbers?
Speaker 6:I mean, I'm I'm always focused on long term return.
Speaker 2:Well well and and and we're at a time right now where there's people that have come out and effectively said, I I don't actually care about ROI. I just care about winning. Right? And it seems from your
Speaker 1:view A couple years ago,
Speaker 4:I was there
Speaker 1:was the mood of, like, if you this might be the last invention. Yeah. No. No.
Speaker 6:If you always have someone else willing to give you the billion dollars or the $10,000,000,000, you can always be about winning and Sure. Yeah. Not the return. But at some point, that party ends, and everybody needs to sort of have a plan. In that context, in these platform shifts, to be short term oriented
Speaker 2:Yeah.
Speaker 6:Doesn't help at all. Right? Because you gotta, you know, I always say, go long before it's conventional wisdom.
Speaker 4:I mean, if
Speaker 6:you look back, you asked how we put in the billion, and the reality is we put in the 10,000,000,000.
Speaker 1:That's right.
Speaker 6:Yeah. Or the 13 and a half was fully committed Yeah. Before it became a thing. Right? And remember, that was all done before ChatGPT became a thing.
Speaker 6:And so it goes back to that.
Speaker 2:How do you I think there's a general consensus now that it feels very possible to predict, like, a year out, two years out, and then ten years out is extremely fuzzy. What's your view on that? Because going back to the original OpenAI investment and the original partnership, it seems like you've had really good, like, six year foresight abilities, to sort of invest against, like, a six year time horizon. But how are you thinking about managing over, you know, the next decade?
Speaker 6:Yeah. I mean, I think, you know, to me, one of the things about tech is, as a percentage of GDP, it's right around four or 5%.
Speaker 2:Yeah.
Speaker 6:And if you ask me five years from now, ten years from now, is that percentage gonna be higher or lower? I think the answer is pretty straightforward. It's going to be higher. It's just a question, is it gonna be 10 or 15? So why is that?
Speaker 6:Because the rest of the pie, the rest of the GDP would have grown faster. So that's why I always go back to it. At the end of the day, the only rate limiter here is the overall economic growth Yeah. And the factors of sort of input to it. So tech as an input, I think AI and everything that it entails Yeah.
Speaker 6:Is gonna be a core driver. And some of it will come from just this intelligence and its sort of continual march of capability, but it'll also come from, I'll just call it, great engineering and product making
Speaker 2:Yeah.
Speaker 6:Around it. Like, when I look at GitHub Copilot today with Agent HQ and what have you, that's great precisely because right now, I'm inundated with multiple models. And everything is slightly different, except I have one repo, and I want all of these agents to come work on my repo in different branches. So you need great product making to bring more coherence to the chaos, and that, I think, is gonna be the big difference maker.
Speaker 1:I was talking to Eric Glyman at Ramp, who makes the show possible, of course. You too, friend. He had a question about what advice you would give to someone running a decacorn with a thousand plus employees in this age of spiky intelligence, where there is the possibility that tools are gonna get better very rapidly, and maybe you don't wanna scale up too fast and then have to do layoffs or retraining. Like, you run a huge organization. How do you think about managing human capital in what feels like an uncertain time?
Speaker 1:Does it feel more uncertain to you now than it did ten years ago? For it's
Speaker 6:a great point. I mean, actually, Eric's a great founder. I know him well, and they're doing some unbelievable work. Yeah. And so, in fact, whenever I've talked to him, I learned from him even how he's rapidly changing
Speaker 1:Yeah.
Speaker 6:The agents they have built. So to some degree, whether it's Microsoft or whether it's Ramp, I think the key is learning the new production function. And when I look back at Microsoft, I feel like, hey. Look. Platform shifts, we've navigated.
Speaker 6:I joined Microsoft when our existential competitor was Novell. Yeah. Right? And so, you know, in the nineties, and here we are. The bottom line is we've, over the years, navigated many platform shifts.
Speaker 6:Yeah. We've also navigated tough business model shifts. Right? When you suddenly have, you know, a 98, 99% gross margin server business, and you have to move to the cloud, and you don't even know, man, is there a margin here? And yet, you have to make the shift and
Speaker 2:Yeah.
Speaker 6:Figure it out. This one is, interestingly enough, both a tech shift
Speaker 1:Yeah.
Speaker 6:A business model shift, because this is the first time you have marginal cost of software. It's not like the COGS of the SaaS world, but true marginal cost. And three, the way you produce your artifact, your software, is changing. Yep. So the product development process is completely getting ripped and replaced.
Speaker 6:And that applies whether it's for Ramp
Speaker 4:Yeah.
Speaker 1:In fact But even
Speaker 2:the competitive dynamic too because you have people that can say, hey, we can build this product in two months. Previously, it would've taken us twelve months. Why don't we enter that Exactly. Category.
Speaker 6:And it's kinda like rewiring yourself. Right? Unlearning is the hardest part. Learning is easy
Speaker 2:Yeah.
Speaker 6:At times. If you have to unlearn and learn, it's much harder. Yeah. And so to me, that I think is what all of us have to I mean, it's funny. I met a bunch of student developers right here.
Speaker 6:Oh, sure. It is the first cohort of developers who grew up with GitHub Copilot as standard issue when
Speaker 1:they It's crazy. Completely different environment.
Speaker 6:They say, oh, there was a world before GitHub Copilot.
Speaker 2:Crazy though. Like, I don't wanna live in that.
Speaker 1:It creates a completely different abstraction layer. On the topic of changing business models, it seems like the console wars are over. Take me through the journey.
Speaker 2:You're a peacetime CEO now.
Speaker 1:We're a peacetime CEO. The war is over. But but but take me through the evolution of the of the business model shift on the gaming side of the business. It's one of the most interesting pieces of of Microsoft. Yeah.
Speaker 6:I think you gotta remember, Flight Simulator, I think, was one of the first products Microsoft built. Actually, I think our dev tools were first. Yeah. Flight Simulator was second.
Speaker 1:That says so much about the culture. Yeah. It is. This is like as soon as you gave the developers the ability to write code, they
Speaker 2:were like, let's make a game. They got so it's amazing.
Speaker 6:And so to me, remember, the biggest gaming business Yeah. Is the Windows business. Yeah. To us, gaming on Windows. Yeah.
Speaker 6:And, of course, Steam has built a massive marketplace on top of it and done a very successful job of it. So the way we are thinking about gaming is, first of all, now we're the largest publisher Yeah. After the Activision acquisition. So therefore, we wanna be a fantastic publisher. Similar approach to what we did with Office.
Speaker 2:Yeah.
Speaker 6:We wanna be everywhere, on every platform. So we wanna make sure, whether it's consoles Yeah. Whether it's the PC, whether it's mobile, whether it's cloud gaming Yeah. Or the TV, we just wanna make sure the games are being enjoyed by gamers everywhere.
Speaker 1:Yeah.
Speaker 6:Second, we also wanna do innovative work on the system side, on the console and on the PC. Yeah. And, you know, it's kind of funny that people think about the console and PC as two different things. We built the console Yeah. Because we wanted to build a better PC Yeah.
Speaker 6:Which could then perform for gaming. Yeah. And so I kinda wanna revisit some of that conventional wisdom. But at the end of the day, console has an experience that is unparalleled. Yeah.
Speaker 6:It delivers performance that's unparalleled. That pushes, I think, the system forward. So I'm really looking forward to the next console, the next PC gaming. But most importantly, on the game business model, we have to invent maybe some new interactive media as well. Because after all, gaming's competition is not other games.
Speaker 6:Gaming's competition is short form video. Yep. And so if we, as an industry, don't continue to innovate, both how we produce, what we produce, how we think about distribution, the economic model. Right? The best way to innovate is to have good margins Yeah.
Speaker 6:Because that's the way you can fund.
Speaker 2:So so interesting saying gaming's competition is short form video. It feels like the entire world's competition is short form video. Yeah.
Speaker 1:I mean, we've heard this thing a while ago, and it just comes up again and again with public SaaS companies that are maybe a little bit more of a point solution, and they have to go through a business model transition. And that can be harder than a tech transition. And we hear about, oh, well, if you wanna change your business model, maybe you wanna be private. But is there some sort of advantage to being a hyperscaler, a $4,000,000,000,000 company, where you can go and retool a piece of the business over here, change the business model, and have almost the privilege of, you know, not having shareholders come to you and beat you down about a slight shift to the business model in a subdivision? Yeah.
Speaker 6:I can't deny that, you know, the diversity of business models, the diversity of the portfolio that Microsoft has, has been helpful. But that said, I don't think you can take that Mhmm. And say somehow you'll make it Yeah. If you don't reinvent yourself.
Speaker 6:See, I think what happens in tech, unfortunately
Speaker 4:Yeah.
Speaker 6:Is that when these shifts happen
Speaker 1:Yeah.
Speaker 6:Whether you like it or not, you have to first be relevant
Speaker 2:Yep.
Speaker 6:Wait, it doesn't matter what the business model is. The business model may be, like, hey, I had, whatever, 90% margins.
Speaker 1:Yeah.
Speaker 6:You are gonna, at best, have 10%. Yeah. But you have to jump all in because even that 90 is going to zero. Yeah. And so given the binary nature, you gotta make it to the other side.
Speaker 6:But then the category economics matters, because if you can't sustain long term
Speaker 1:Yeah.
Speaker 6:Innovation, if there is no category economics I mean, hyperscale is a great one. In fact, the best day in hyperscale business was the day Amazon announced their operating margins.
Speaker 1:On AWS, right? Yeah.
Speaker 6:You know, because that's when everybody knew the hyperscale business is an unbelievable business. It's a commodity, but at scale, nothing is a commodity. And so to me, that is gonna be the key here, even for SaaS applications.
Speaker 1:Quickly unpack why that was so good for you again. Just because the market recognized that you were in the same business and it was fantastic?
Speaker 6:That is one. And more importantly, it was much more expansive. Right? I mean, think about our server business. Right?
Speaker 6:It's super profitable. Yeah. Except it was one tenth the size when I look at it compared to Azure. So we, like, we sold a few servers. Yeah.
Speaker 6:But, man, we sell a lot of cloud VMs Yeah. Or containers. Sure. Who would have thought how expansive Yep. The cloud consumption model is going to be in terms of people being able to sort of it's kind of the Jevons paradox that sort of really played out in a massive way, you could argue, broadly.
Speaker 6:Yeah. So I think on the business
Speaker 2:model Well timed Jevons paradox post, by the way, back during the DeepSeek moment. Oh, yeah. It was an important set
Speaker 1:Spot on with that analysis.
Speaker 2:I wish we could keep going. Okay. I think we have to wrap up,
Speaker 1:but we would love for you to hit this gong.
Speaker 2:And if it only hit for 27%.
Speaker 6:Wow. That's a good
Speaker 2:good hit. Very strong hit.
Speaker 6:Let's swing as
Speaker 1:big as possible. Oh, there we go. That is a fantastic signature. Thank you so much. Thank you for coming on.
Speaker 1:Thank you for having us. It's a really great time. I've always whenever these big news these big tech news things happen, I always wish I could talk to the person who's making the news, and now I get to. And so, what a wonderful conversation.
Speaker 2:What a moment. And What a CEO.
Speaker 1:Yeah. What a yeah. What a moment in the tech world. Well, thank you. If you're new here and you tuned in just because of Satya Nadella, the CEO of Microsoft, live on TBPN.
Speaker 1:Please follow us. Leave us a comment. Add us to your RSS feed. We have a fifteen minute version of the show called Diet TBPN where you can hear All
Speaker 2:the deflators. All the deflators. Calories.
Speaker 1:You'll also hear ads from our sponsors like Vanta. Automate compliance, manage risk, prove trust continuously. Vanta's trust management platform takes the manual work out of your security and compliance process and replaces it with continuous automation, whether you're pursuing your first framework or managing a complex program.
Speaker 2:You could've kept going forever. I know. Absolutely. We both could've. We should do a giga stream with Satya sometime.
Speaker 2:Just twelve hours straight.
Speaker 1:It'd be super easy to get that on
Speaker 2:the calendar. Sure. There's a twelve hour block For sure. Somewhere out in the 2030s that we could lock in.
Speaker 1:But, I mean, there is really so much I mean, that's the problem with these conglomerate CEOs. They've just got too many business lines. You know, you could do a whole hour just on Xbox and Activision.
Speaker 2:Yeah. The thing that stands out to me is when you see some of these other hyperscalers or players Yeah. Their reaction time Satya just makes them look incredibly slow.
Speaker 2:Right? Oh, yeah. He's been making these, like, sizable bets. Like, he's been seeing the future, and yet only this year you've had other players who I won't, I won't directly name
Speaker 1:Mhmm.
Speaker 2:Deciding like, okay. I wanna get in the game now. Yeah. It's like, what what were you doing when when Satya was in the kitchen cooking?
Speaker 1:No. It was fun.
Speaker 2:Anyways, what a what a wild
Speaker 1:day. Alright. What else is on the timeline? What other news should we bring the folks while we are here? I'm seeing mostly people talking about 996 Porsches versus 996ing, working hard.
Speaker 1:What else is driving the news cycle today? Of course, OpenAI just did a livestream with Sam Altman and the head of research over there talking about their side of the deal. All parties kind of aligned to, hey, we got a clean cap table. Let's move forward.
Speaker 1:We're excited. They're a PBC now. Yeah. Which is what Anthropic is as well.
Speaker 1:Right? Yep. And then Microsoft now owns 27%, or a $135,000,000,000 stake, in OpenAI. And OpenAI is contracted to buy $250,000,000,000 of Azure services. That's a lot of Azure.
Speaker 1:Should make it easy to underwrite future CapEx on the Azure side. Earnings is tomorrow. We'll be back in Los Angeles at the TBPN Ultradome, and we probably won't be live when earnings drop post close, but we will be bringing you the news on Thursday, of course.
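The two deal figures quoted above imply an overall OpenAI valuation. A minimal arithmetic sketch, taking the 27% stake and $135,000,000,000 value as reported in the segment (both are the show's numbers, not independently verified here):

```python
# Sanity check of the reported stake figures: a 27% stake worth $135B
# implies a total valuation of stake_value / stake_fraction.
stake_value = 135e9      # reported value of Microsoft's stake, USD
stake_fraction = 0.27    # reported ownership share

implied_valuation = stake_value / stake_fraction
print(f"Implied OpenAI valuation: ${implied_valuation / 1e9:.0f}B")  # prints $500B
```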
Speaker 2:Meta also reporting earnings tomorrow, which will be notable.
Speaker 1:Yes. We also have Alex, the product lead on Codex, but we're wiring him up, so he'll be on in a second.
Speaker 2:I'll go through a post here from SemiAnalysis. Please. They have a greentext here. They say, be me, Qualcomm. Time to enter NVIDIA AI chip market.
Speaker 2:NVIDIA's making money. How hard can it be? Spend years developing AI 200 chip. Finally ready for big announcement. Make fancy slide deck.
Speaker 2:Put 768 gigabytes of memory on there, sounds big. 160 kilowatt power consumption, sounds powerful. Add liquid cooling, sounds cool. Oh shit dot jpg.
Speaker 2:What about flops? Decided just to not mention it. Also, don't mention price, or how many chips per rack, or actual benchmark numbers. Just vibes. Launch presentation.
Speaker 2:Qualcomm AI 200. It exists and uses electricity. Refuse to elaborate. Stock goes up 15%. That feeling when investors don't know what flops are either.
Speaker 2:My face when greater than 10 x with no baseline. Ships in 2026. AI 250 ships in 2027. Still won't tell you the specs by then, probably. Low TCO.
Speaker 2:Trust me, bro. Confidential computing. The performance is confidential. Unreal.
Speaker 1:I have more there, but let me first tell you about Julius. What analysis do you wanna run? Chat with your data and get expert level insights in seconds. You know what's funny is that in the 2019 blog post from OpenAI announcing the deal with Microsoft, they say Microsoft is investing $1,000,000,000 in OpenAI to support us building AGI.
Speaker 1:But, specifically, we're partnering with Microsoft to develop a hardware and software platform within Microsoft Azure. And so I imagine what they mean by hardware platform within Azure is just, like, a bunch of NVIDIA GPUs at that moment in time. Yeah. But it does lead to these natural questions of, like, how deep do you go in the stack? If you're Sam and you're OpenAI, you're seeing that you're ultimately limited on cloud capacity, and then chips, and at first, dollars, because there was plenty of Azure capacity.
Speaker 1:It's not like in 2019 they used all of it. Yeah. But they needed the money, then they needed the data centers, then they needed the chips, and now they need electricity. And it's just going deeper and deeper in the stack.
Speaker 2:Yeah. The thing that's notable is just how married OpenAI and Microsoft are. When you look at OpenAI's relationships with NVIDIA, AMD, Broadcom, and these other players, everybody in the chip space is sort of, you know, in these complex dynamics. Right? We got some backstory on the whole series of events between OpenAI and NVIDIA and AMD and how that all came together.
Speaker 2:And it seems like everybody on the chip side is sort of I wouldn't say desperately, but desperately, sort of, competing for OpenAI's attention and resources. Yeah. Meanwhile, Satya is able to just kinda sit back and ride this partnership out. I wonder where Qualcomm's retraced to, by the way. It's now only up eight and a half percent over the past five days, so it dropped a little bit after the Is that on
Speaker 1:public.com? Investing for those that take it seriously. They've got multi-asset investing, industry-leading yields, and they're trusted by millions.
Speaker 3:We gotta throw out
Speaker 2:a post here from Spooks, early friend of the show. He's quoting a post from Samuel Hammond, who said, melatonin in The US is sold in five milligram doses while the effective dosage range is 0.3 to 3 milligrams. Americans essentially overdose on melatonin by default for no good reason. Spooks says, okay, well, if there's an easier way to soul speak with my ancestors in the dream realm, please let me know.
Speaker 1:Don't know if we're gonna read it all on this particular stream. We don't even have your tweets loaded in. Banger.
Speaker 2:I mean, Grokopedia is now live.
Speaker 1:Do you think OpenAI will launch a Grokipedia competitor? I clicked on Grokipedia yeah. Yes. Immediately. But I clicked on a Grokipedia, like, entry, and I was like, oh, wow.
Speaker 1:It's like a pre baked deep research report on something that I already would have wanted to search for. Yeah. Actually, effectively It's a great product.
Speaker 2:I mean, in the same way that ChatGPT has disrupted search Yeah. Deep Research has disrupted Wikipedia. Yeah. And there's so many prompts that I run where I'm like, I shouldn't be, like, burning up the GPUs for this. Yes.
Speaker 2:Yes. Yes. This should be stored in a database somewhere else so you're able to access it.
Speaker 1:This is the funniest thing is that, like, over time, over the air, it goes.
Speaker 4:We have a slight changes.
Speaker 1:Good to spell it best.
Speaker 2:From Codex. Alex, welcome to the stream. Let me tell you about Fal while he hops on: the generative media platform for developers. Nice to meet you.
Speaker 2:Hey. Hey. Back to the champ. Thank you.
Speaker 1:Congratulations. Incredible event. How many have you been to? Give us give me give me a little read on the ground of, like, what's the scale? Is this the biggest ever?
Speaker 1:Tell me a little bit about what's going on today.
Speaker 5:So we just had a OpenAI Dev Day
Speaker 7:Yeah. Two weeks ago.
Speaker 5:Yeah. That was awesome. It was massive. Actually, I got COVID, like, the day before, so I was not there. So, yeah, this is actually the biggest event like this I've been to this year.
Speaker 2:Yeah. So is it passé to have COVID now? You don't get the same level of, like, oh, I'm so sorry. It's like, okay. So you yeah. I'm just saying.
Speaker 5:Yeah. I didn't think it was a thing anymore. But anyways, here today, we announced a couple things. We announced that Codex is coming natively to GitHub. Okay.
Speaker 5:Very nice. And then we announced that today, actually, we're bringing Codex to Copilot in VS Code. Sorry. I'm gonna mumble this.
Speaker 2:To Copilot Pro Plus subscribers in VS Code. Can you use Pro as part of this? Yeah. Okay. Yeah.
Speaker 2:It's tough. Amazing.
Speaker 1:Walk me through, like, the different flows and the different actual user journeys, because there's something very interesting here: GitHub has the ability to even host pages. Yeah. And Codex allows me to, from my phone, potentially write a web app that can then be deployed. Is it helpful to actually instantiate a web app on the fly that actually lives on the Internet, that I can send to a friend? Are you starting to see, like, what the next era of vibe coding might look like?
Speaker 5:So totally. I mean, mostly, Codex is used by professional software engineers, although we have a good amount of people Yeah. Who aren't as familiar with coding using it. Okay. But, like, I think the best analogy is to think of Codex kind of like a human teammate.
Speaker 5:Right? Sure. If we're working together, I could talk to you in Slack. Yeah. I could talk to you in GitHub.
Speaker 5:Yep. I could text you.
Speaker 1:Yep.
Speaker 5:I could come by your desk, and we could, like, jam on something on your computer. Yeah. And so it's kind of the same you, but you're present in all those tools. So that's what we're trying to build Codex into. It's just, like, an AI software engineering team that works with you wherever you like building.
Speaker 3:Sure. Sure. Sure.
Speaker 5:Yeah. So, yeah, use it from your phone and, like, make some updates there. Maybe that creates a PR. You push it into GitHub. Yep.
Speaker 5:You know, maybe you you use Codex to review your PR in GitHub, and then you land the PR. Like, all those things, no matter where it is, it's just the same Codex agent. Yeah.
Speaker 1:Yeah. Is there some sort of, like, business model flow through there? Like, you have to be subscribed to OpenAI, but then you also subscribe over to GitHub. And then that's just, like, the default stack for a
Speaker 6:lot of people. So so this is, like, actually a kind
Speaker 5:of an interesting part of the deal. So to use Codex today, the main way that most of our users use us is they have a ChatGPT account, of course. You know, they're on Pro, Plus Sure. Enterprise, and then they can use Codex.
Speaker 6:Yeah.
Speaker 5:Now as part of this deal, what we figured out with GitHub is how to partner Yeah. So that if you have a Copilot Pro Plus account, you don't need to have a ChatGPT account; you get the full power of Codex anyways. And when I say the full power, I mean you get to use our model, and you get to use our model harness, which is kinda like the code that provides the prompts, the tools, the run loop. Yeah. And so, you know, our goal in this is just to, like, make Codex as ubiquitously available as possible.
Speaker 5:Yeah. Yeah. And so, yeah, you don't need there's no, like, flow through there. It's just like, you know Yeah. You just need your Copilot account.
Speaker 2:Somewhere out there, there's somebody that just started CS in college, and they're only gonna know a world of, like, just running Codex in GitHub naturally.
Speaker 1:And Yeah.
Speaker 5:And Well, mean, look. So actually, it's interesting. Right? Like, if you think of GitHub, it GitHub does a lot of things. Yeah.
Speaker 5:Right? But at least, like, where I personally spend the most time in GitHub is, like, actually collaborating with the other people contributing to the code base. Right.
Speaker 7:Yeah.
Speaker 5:And so, like, I actually think it's quite unlikely that you would only spend your time doing that type of activity. You're also gonna spend a ton of time in tools like VS Code Sure. Or, like, the Codex CLI or IDE extension, because that's where you're doing your work yourself. Right? Like Yeah.
Speaker 5:Again, like, I think this human teammate analogy kind of works pretty well here. It's like, most of us, 90% of the work we're doing is kind of at our desk. We're not, like, having meetings, like, 90% of the time as a software engineer. Right?
Speaker 1:Yeah. Yeah. Yeah.
Speaker 5:So probably that, you know, that, person who's gonna become an engineer but is currently in college, they'll spend a lot of their time with superpowers but working at their computer doing stuff individually, like Yeah. Yeah. Commanding fleets of agents. Right? Yeah.
Speaker 5:And then they'll spend some of their time collaborating with their team, like, obviously, quite a lot of their time, but not all of it. I don't think the sort of individual productivity is going away.
Speaker 2:Yeah. One kind of follow-up question. Like, how important is the metric of, like, how long Codex is spending working? Is that something that you guys are, like, explicitly, like, tracking and trying to scale? Because it just means that It
Speaker 1:goes from, like, creating more value two hours. We're
Speaker 5:It's interesting. I actually shared at the keynote today that last week, here on the Codex team, we ran Codex for over sixty hours on this incredibly hard task.
Speaker 1:And that Wow.
Speaker 5:That's crazy, and we're, like, we're excited about the capability of that from the perspective of
Speaker 2:It's mind boggling. The model is
Speaker 5:actually able to do, like, very productive work for a long time. Yeah. And it means the model and the harness are working together
Speaker 1:Sure.
Speaker 5:To manage the context window, because the task consumes more than one Yeah. Yeah. Context length. However, it's not like we have an eval that's like, how long did the model work, and let's maximize that. Like Yep.
Speaker 5:That's much more sort of a lagging indicator of, like Yeah. The intelligence capability of a model. Like, what we're really trying to drive is, like, how smart is the model? Yeah. Right?
Speaker 5:And how easy is it to work with? Yeah. And then it just turns out that as you make the model smarter and smarter, it can take on, like, longer and longer tasks.
Speaker 2:What do you think about the user experience if you're setting Codex off to go work for sixty hours? There's some risk that, you know, it's doing things maybe incorrectly, and you come back after sixty hours, and you're like, I kind of just blew sixty hours that I could have been doing this myself. Sure.
Speaker 1:I mean, I mean, that sixty hours,
Speaker 2:all of Game of Thrones. That's possible. Or you're watching Subway Surfers here. We gotta No. Yeah.
Speaker 2:Yeah. It's been good. But you're both playing with it. No. But my question is, like, the workflow that feels like the most natural, using the teammate analogy, is you just get a ping, it's like, hey, can you double check this before I continue?
Speaker 2:Is that is that is that a workflow that you're thinking about? Like, you just you're a developer, you get a push notification, you're at the gym, and you're like, yeah, looks good.
Speaker 5:Yeah. So I think, like, there's two things you said that make a lot of sense to me. Like, one is just, like, steerability. That's something we're working on. You want to be able to steer the model, like, short task, long task, whatever.
Speaker 5:Right? Yeah. The other thing is kinda like proactivity. Right? Like, again, when you hire someone onto your team, maybe at the beginning, you're hanging out, you're collaborating directly, then you start delegating small tasks.
Speaker 5:And at some point, you know, you will give them, a sixty hour task without, like, specifically prompting every single detail. Right? Yeah. And what you expect of, like, a good employee is that they know when to ask you questions. Right?
Speaker 5:Yeah. And so it flips from, like, the bottleneck of my productivity is how frequently I'm able to prompt an LLM Sure. To the bottleneck of my productivity is actually kind of, like, how I can structure the work. Yep. So that, like, independent agents Yeah.
Speaker 5:Or humans or whatever can, like, go do the work and then ask me questions when they have
Speaker 2:to. It makes me think, because we're so used to now, like, trusting a CRUD app. Right? Like, you can trust that you can put data in and you're gonna come back and it's gonna be there. Right? We have that faith.
Speaker 2:And it feels like with agents as a product category broadly, if you're building agents, you need to be focusing on, like, how do I develop trust with the user? Yeah. And, you know, again, maybe it's, like, focusing on short-term tasks initially and having that steerability.
Speaker 5:So so this so if you think about it, like right now, the place where most people are using coding agents is to write code. Right? Like code gen. And again, I keep coming back to the human analogy, but like imagine you had a human teammate and the only thing they can do is write code. They can't read user feedback.
Speaker 5:Yeah. They're not in Slack. Yeah. If there's an outage, they're not gonna see it. Sure.
Speaker 5:Right?
Speaker 2:Sorry. Sorry.
Speaker 5:So, like, you wouldn't trust that human teammate very much. Even if this is the smartest human teammate in the world, like, no. Right? So, like, what we need to do to, like, build this trust is we need to, like, extend what agents can, like, look at across the software development life cycle. Right?
Speaker 5:Yep. So they're, like, present in more of the team conversations and ideation and prioritization and planning. Yep. They're also, like, more and more capable at the code review stage. And, actually, that's one of the recent product releases we shipped, like, Codex code review, people are loving that.
Speaker 5:But more present at code review, more present at, like, the deployment stage and, like, code maintenance stage, aware of, like, what's going on and, like, your telemetry tools. Yeah. And, like, I think that's actually how you get to the trust. So some of this is, like, increasing model capability, but a lot of this is actually changing the form factor of, like, how these models are harnessed so that they can, like, interact with more of what you need.
Speaker 1:So it feels like couple years ago, we were just trying to predict the next token, and it was like, oh, wow. I can do poetry. Oh, wow. I can write code. Like, this is amazing and very, like, undirected fundamental research.
Speaker 1:Now we're in the age of spiky intelligence. Is there a feedback loop where you're actually trying to take feedback from, okay, Karpathy says that the agents can't write, you know, nanoGPT, so let's go work on that. Or, oh, we've seen in the data that it's easy to write a, you know, website in Django and Python. But if you're trying to use Fortran to refactor some obscure high-frequency trading system or something, like, we're falling down on that. Let's actually put a team on this.
Speaker 1:Like, how does feedback work now? And, like, what are the humans on the team doing? Or is it all just zoom out and hope that the emergent properties, like, solve it? Like, is it deus ex machina?
Speaker 5:You know, this is OpenAI. So the way that we build is constantly evolving. Yeah. Like, the Codex team was just, like, five engineers Yeah. Like, a few months ago, and now we're, like I actually don't know, but I think we're, like, 25.
Speaker 3:Okay.
Speaker 5:So, it is constantly evolving. But what I can say is that our team is, like, possibly slightly unhealthily on social media just, like, reading all the feedback. Sure. But we love we love it when people send feedback to us. We're also starting to, like, try to get, like, a better understanding of, like, okay.
Speaker 5:Like, how do different, like, model snapshots and stuff compare?
Speaker 6:Sure.
Speaker 5:And so, yeah, we're we're starting to build up a more and more systematic way of doing it, but I still think it's, like, quite early days. Yeah. And there's still a lot of taste involved.
Speaker 1:Yeah. It does feel like we're entering the era where if you see in the data that, you know, maybe there's developers here that are like, I want to use Codex for this specific thing. You're falling down on this. You're great at everything else, but you're not with this. There is the world where you can actually go in and run the
Speaker 2:actual How do you I wanna make sure you don't get the wrong signal from social media. Because, like, there's a certain type of person who posts product feedback publicly. And for every one person that does that, there could be, you know, a number of people that just churn Yeah. And never say anything. There could be a number of people that are just, like, super hungry power users.
Speaker 2:And yet, you know, these people think, like, getting a push notification and somebody is saying, like, you know, they're DMing you or something that somebody said about Codex. It's like, you need to make sure that it doesn't, you know, consume, like, 100% of your worldview on how the product is actually resonating with users. Totally. So, yeah. Let me answer that.
Speaker 2:And I actually wanna go quickly back to the
Speaker 5:Fortran thing, which I'll pick up as well. But on that note, I think, like, yeah, a lot of the feedback you're gonna get on social media is, like, your power users. Yep. Right? And so power users, I think, are really good.
Speaker 5:Like, I kinda put that in the, like, how do we advance the capabilities? And, like, we are trying to advance capabilities, so that's good feedback. Right? Like, what are they doing? Like, what should the product do to make things easier for them?
Speaker 5:Yeah. And then at the same time, I think I like to balance that kind of with, like, just what is the first mile of the product? Like, literally, like, the first 20 keystrokes. Like, what are those? Right?
Speaker 5:And I think for me, I just, like, kinda focus on mostly these two extremes. Mhmm. Yeah. So we're constantly looking, like, okay. What is the new-user experience when you get to the product?
Speaker 5:And frankly, I think there's a ways to go. It's still a very power-user-y product and
Speaker 6:Yeah.
Speaker 5:Lots to improve there. On the sort of, like, the Fortran question, like, one of the interesting things about building Codex in open source, which we're doing, is that we're seeing, like, larger enterprises with very bespoke needs who are excited about the capability. You know, maybe an engineer was using Codex on the side, wants to bring it to work, notices, like, oh, in this code base, it's not doing as well as it's doing in the code bases that OpenAI, like Sure. Has more exposure to. So, like, what we're actually seeing is, like, certain customers are starting to, like, fork the CLI or work with us to deploy it in a very specific way where you can inject, like, you know, more instructions for the, like, company-specific language into the context.
Speaker 1:If I'm a big enterprise and I have a million lines of Fortran for whatever reason Yeah. And I, you know, authenticate with Codex, you're not training on my No. On my code, which is good for privacy, but maybe bad for performance. So what you're saying is that there's a world where we could work together to figure out how to actually fine-tune the model or train the model or work together to have the actual product work better on my code base?
Speaker 5:Yeah. And I think, like, fine tuning and training are, like, definitely levers that exist. But I think even before that, there's, a ton
Speaker 7:of work
Speaker 6:you could do. In the harness?
Speaker 5:Yeah. In the harness. Got it. Even in, like, in terms of, like, you know, agents.md and just, like, how you tell the model what it needs to know if it doesn't
Speaker 1:So at every layer of abstraction Yeah. We can do it. There's opportunity to squeeze out extra performance before we go back to, like, hey. Let's pre-train on your
Speaker 5:data, which is not necessarily, like Yeah. I think there's a giant capability overhang from models today. And so, yeah, it's kind of exciting. We're seeing, like, a lot of pull from enterprise now. Yeah.
Speaker 5:And it's exciting because we get to kinda, like, go deep Yeah. Right, and invest a lot of time on our side to figure out how to make it work even with, like,
Speaker 2:the current model on the current model.
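For reference, AGENTS.md is the plain-markdown file the Codex CLI reads for repo-specific instructions, the "how you tell the model what it needs to know" mentioned above. Everything in this example (the make target, the `acme-rpc` framework, the `legacy/` directory) is invented to show the shape:

```markdown
# AGENTS.md

## Build & test
- Run `make test` before proposing any change.

## Conventions
- Currency amounts are integer cents; never use floats for money.
- New services use the internal `acme-rpc` framework, not raw HTTP.

## Gotchas
- `legacy/` contains generated Fortran bindings. Do not hand-edit.
```

The agent reads this before working, so company-specific context reaches the model without any fine-tuning.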
Speaker 1:The pull from enterprise. Do enterprises care about benchmarks, or do they feel like they've been hacked? Going back to the social media thing, I think people were really into benchmarks, then pretty quickly, everyone kind of assumed that they were saturated, they were gameable. But what are you hearing on the enterprise side?
Speaker 5:I think yeah. I don't hear a ton about benchmarks, to be completely honest. I mean, maybe folks read it. But I think it very quickly comes out in use. Think of
Speaker 2:the SaaS era. It's like, our CRM is a split second, actually, faster than a competitor's. Like, you should use us instead, actually.
Speaker 1:Yeah. It's up to 10 x.
Speaker 5:So mostly, I think what it comes down to, at least on a lot of things we're seeing, well, it's actually kind of two motions. Sure. One is, like, it's just, like, they give the tooling to developers, and it's like, do you like it? Yeah. Right?
Speaker 5:And this is, what do developers like more? Luckily, developers love Codex, so that's great. The other side is actually, it's like, hey. Like, we have this, like, really big project that we wanna do. It's like a replatforming, like, a migration from one cloud provider to another Yep.
Speaker 5:Or something like that. And that's where we're actually, like, working more closely with enterprises to figure out, like, okay. Let's actually set up, like, a meta harness almost for, like, Codex to do this work. This started with, like, some customers, like, Instacart runs Codex, like, in one of these. They probably don't call it a meta harness.
Speaker 5:Sure. But, like, in basically a system that runs Codex automatically to do stuff that, you know, they wanna do for code maintenance. Yep. And then now we've been like, okay. This is actually a pretty good idea.
Speaker 5:Like, we can go help customers who have these, like, larger things that they wanna do to set up this kind of, like, workflow automation. Okay. So there's kind of the two sides.
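The "meta harness" being described, a thin wrapper that fans one maintenance prompt out across many repositories by invoking the Codex CLI non-interactively, might look roughly like this. The repo paths and prompt are invented, and the exact flags should be checked against `codex exec --help`:

```python
# Hypothetical "meta harness" sketch: build one non-interactive codex CLI
# invocation per repository for a fleet-wide maintenance task. Repo paths
# and the prompt are invented; flag names should be verified locally.

import shlex

def build_plan(repos, prompt):
    """Return one shell command per repository, with arguments safely quoted."""
    return [
        f"codex exec --cd {shlex.quote(repo)} {shlex.quote(prompt)}"
        for repo in repos
    ]

plan = build_plan(
    ["services/billing", "services/search"],
    "replace deprecated logging calls",
)
for cmd in plan:
    print(cmd)  # each line could be handed to subprocess.run or a job queue
```

A production version would add scheduling, retries, and review of the resulting pull requests, but the core is just automated fan-out of agent runs.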
Speaker 1:Last question. Are you feeling GPU rich or GPU poor right now?
Speaker 2:Any any requests for Sam and Sarah?
Speaker 5:We so the Codex team is getting a ton of support. So You feeling GPU rich? I think the Codex team feels supported, but I think OpenAI, like, could definitely like, things are growing and more GPUs would be good. So, yeah.
Speaker 1:Okay. So so internally GPU rich
Speaker 2:Yeah. Every time
Speaker 5:I wouldn't say that. That's probably overdoing it.
Speaker 2:No. Every time I'm in ChatGPT now and I prompt it on something where it should really think, you know, you should think a little bit. And it's like, you know, I had a request yesterday for, like, give me a list of, like, 50 companies that meet these criteria. And it was like, I can't do that. What?
Speaker 2:And I was like, yes, you can. But in that time, it was like, yeah, the GPUs were, like This is you guys, I'm sure. And somewhere out there, there's millions of Codex agents, you know, running and
Speaker 1:Yeah. This is the endless product feedback. But thank you so much for coming. No.
Speaker 2:Thanks, guys. It's great to hear your answers. Thanks for coming on, guys. Have a good time. We
Speaker 1:have a few more people joining this show. If you're tuning in for the first time, please subscribe, follow us on YouTube, or add us to your Spotify. And before We're
Speaker 2:on LinkedIn. Our next guest We're also on LinkedIn. Don't forget about LinkedIn.
Speaker 1:Yeah. Honestly, I know a lot of you who are listening on another platform do not follow us on LinkedIn. That's right. Head over there.
Speaker 1:We are actively hiring someone. We just hired someone. We're we're working on ramping up our LinkedIn presence. Very excited for that. We're also excited to tell you about Turbo Puffer.
Speaker 1:Search every byte, serverless vector and full text search built from first principles on object storage. Fast, 10 x cheaper
Speaker 5:Thank you.
Speaker 2:Thank you. Next up, we have Kyle, COO of GitHub. Very excited.
Speaker 2:Very excited. This is where
Speaker 1:we are wiring him up.
Speaker 2:He will be coming on in a second. Next, we'll have Jay, EVP of Core AI. I talked to Jay
Speaker 1:a couple months ago at this point. I was in the back of a car. It was kind of hard to, to bring, like, the level of enthusiasm that I had, but I'm very excited to talk to him. Because a lot of the questions that I got when I asked people, hey. We're going to Microsoft.
Speaker 1:We're talking about their relationship with OpenAI was we wanna hear what Microsoft's doing inside. What is the Core AI team? Where's that driving the business? So we're excited for that.
Speaker 2:And then next, we'll have Jared Palmer, VP of Product, Core AI, and the SVP of GitHub. And then we'll finish it off with Michael, founder and CEO of Workflow. Very exciting. Excited about that as well.
Speaker 2:Then it's back to the timeline. Tomorrow for the show, we'll have to dive more into Grokipedia. Gotta Yeah. Gotta start using it, checking it out. While we have time, there's an app launched by a product designer at Meta.
Speaker 2:Did you see this? It's an AI app that if you can't afford a vacation, The Verge is saying, an AI app will sell you pictures of one. So you upload images of yourself I saw this. And then it creates vacation photos for you. So, I guess, short the tourism industry.
Speaker 2:People are gonna start.
Speaker 1:This was announced by a founder who just said, like, I wanted to feel the feeling of, like, the warm and fuzzy vacation photos. And so Yeah. I used AI to generate those. It was pretty well received originally, but I think that they're spinning it. And so the narrative might be getting away from them.
Speaker 1:But we have Kyle from GitHub coming into the studio. Hey,
Speaker 3:Hey. Great to hear.
Speaker 2:Good to meet you. Great to Thank
Speaker 1:you so much. Bob, hi, Michelle.
Speaker 2:Yeah. Massive day, beautiful event. Good weather too.
Speaker 3:I know. It's a little iffy here when the weather is not great, but when it is, it's awesome, man.
Speaker 1:Give me a little background on you. How how'd you wind up here? How'd you wind up at Microsoft?
Speaker 3:Yeah. So I joined GitHub twelve years ago Oh, that's so nice. We had no managers. Yeah. We call it open allocation now,
Speaker 1:but it
Speaker 3:was anarchy. Nat Friedman, yeah. Nat joined in 2018 as part of the acquisition. There you go.
Speaker 3:So, yeah. We were a 140 employees Yeah. When I joined back then.
Speaker 1:Wow. Yeah. And and and do you even think about employee count in GitHub now? Because it's so merged into Microsoft, but it's still its own brand.
Speaker 3:Yeah. I mean, we have over 3,000 employees that work full time on GitHub. Yeah. Then, obviously, we partner with teams at Microsoft to do a lot of, you know, the AI model hosting Yeah. Training and so on and so forth.
Speaker 1:On on sort of a meta question, I mean, AI and, you know, AI can do so much. How are you thinking about scaling that team over the next ten years? Is it harder to forecast, like, human capital allocation in the age of AI?
Speaker 3:Yeah. I mean, part of the problem is that there's places where AI is, like, incredibly helpful, like, in software, I think, for sure. Yeah. There's a ton of places that AI hasn't hit. Yeah.
Speaker 3:You know, I mean, there's what we talk about, like, IT, so much of the sort of business operations side that AI hasn't proven to be as valuable to us yet. But I think over time it'll get there. Events like this take people. Yeah. You know, full in, full out.
Speaker 3:And so, it's just an imbalance, a little bit, of where software has been so great, and then the rest of what makes GitHub GitHub. Yes. That's still people.
Speaker 2:Yeah. Earlier with Satya, before we jumped on, we were catching up with him, and I said, my words, GitHub Copilot is criminally underhyped. And I think the reason for that is, like, you guys don't need to go out and raise a venture round every couple months. You know, you're And, obviously, well capitalized. But can you give us a sense of the scale and kind of the growth of Copilot over the last couple years?
Speaker 3:Yeah. I mean, you know, Copilot is used by, like, 80% of everyone that joins GitHub right now. It's 36,000,000, I believe, joined in the last year, the first week. Mhmm. Like, one of the first things they do when they join.
Speaker 3:So we're definitely still hitting the world's developers with Copilot. Yeah. All the time, like every day. Now, these days, just like, I don't know, ten years ago, devs are using whatever tool they want. They're so quick, you gotta keep up, you gotta try all these tools.
Speaker 3:But we keep seeing folks using, you know, Copilot over here and then trying out a new tool, or using Copilot over here and finding this new flow. That's a big part of this like reopening up, like GitHub has done over and over. Yeah. Let's bring us all together so you can collaborate in that single place. Wow.
Speaker 3:You're gonna pick whatever tool you're gonna use. And that's
Speaker 2:part of being a platform. I was saying when he was on the air, it's like, if you wanna be a platform, the more closed off you get, the more you're encouraging other people to go elsewhere. Because the tools are changing so quickly, you just need to be able to give people that flexibility. Right? We had Alex on from Codex, and that's a good example of it.
Speaker 3:Yeah. I mean, this AI moment, it feels a little bit like before every app had an API back in the day. Because that wasn't the norm. And now we're in a kind of quasi walled garden moment where everyone's making their thing really, really great. You know, their model, their app, their service, whatever.
Speaker 3:But in order for all those tools and agents to actually be valuable, we have to interconnect them.
Speaker 4:Yeah.
Speaker 3:So my hope is that, just in general, not just for software devs, we can go back to that platform-first approach, just, like, as an industry. Because then each of our products will be more valuable for our customers. Because we're not gonna have to deal with, well, how do I actually place the grocery order? Yeah. Because there's not an API for that. You know?
Speaker 2:What what were the kind of key moments for you in understanding that AI would completely change software engineering? Because when you well, you know, we're just going back through, like Yeah. The history even of Microsoft's investments in, OpenAI, obviously, because of the announcement today. And it just feels like Satya had this incredible foresight, you know, in in 2019, in the early twenty twenties that only now a lot of other CEOs are kind of reacting to. But I wanna know for you, you've been here Yeah.
Speaker 2:At GitHub for twelve years, like, what were kind of the key moments that were eye-opening to you, where you thought, I've seen the future. We need to just invest heavily, heavily
Speaker 3:in this Yeah. I mean, the first moment was kind of a pure open source moment. Right? Like, when it first started happening and we
Speaker 2:were talking about transformers and everything,
Speaker 3:like, you see the groundswell on GitHub from the open source side. So we started talking about that. And then when we got access to the first, you know, GPT-3, I think, you know, model to ultimately build Copilot, the thing that was so interesting was we were building it to write docs. Like, that was what Copilot was. Copilot was taking Oh, yeah.
Speaker 3:Copilot was taking that model. We were going, oh, what are we gonna do? Devs like writing code. They hate documentation. So we're gonna generate the docs.
Speaker 3:And what happened was Wait. Code is exactly
Speaker 2:the same in the same characters.
Speaker 3:Exactly. And so then we flipped it, and then we got to this from docs
Speaker 1:To code.
Speaker 3:Exactly. And then I think, you know, while it seems very simple now, like, the idea of ghost text, more than just, like, an autocompletion or whatever. The first time we used it, that was truly the moment where we said, oh crap, because no one had to do something differently. And I feel like that's the big problem with some of the AI tools. You gotta go interact with them in a way that's not normal.
Speaker 3:You gotta go, okay, I wanna go write an email. Write an email for me. That's not how our brains work. We just start typing. And when we were able to do that with the IDE, that very, very quickly kind of shook all of us, because it meant, oh, it won't be the same anymore. Now with agents and whatnot, like, because we can verify the code, we have an advantage in software versus some other agents where it's harder to verify.
Speaker 3:But that all started that first time you, like, wrote something and then it just appeared. And I didn't have to learn anything. It just happened. Yeah. And now we just, you know, we all take that for granted because it's a de facto.
Speaker 1:Yeah. Do you have a philosophy of how where inference happens will change over the next few years? Like, in a lot of worlds it used to be there's a decision between, like, fire off an agent, wait twenty minutes, wait an hour or something, or do it quicker, you know, a couple of seconds. But then there is the world where a lot of the work that's going on in software development is so high value. Like, why not just do all three: inference locally immediately, and then also in the fast model, and then also kick off an agent for every single task?
Speaker 1:Is that where we go? Or or is there sort of like some sort of shift in in where inference happens over time?
Speaker 3:Yeah. I think, you know, we clearly are gonna have more and more inference tasks that should happen locally. Yeah. That seems pretty obvious at this point, I think. And then I
Speaker 2:think when we're talking about,
Speaker 3:you know, how much we're gonna kick off into the cloud, agents and models, etcetera. I think the thing that's really interesting is that we're pretty close to having that now. Like, you know, we talked about it a lot today. Other folks have that. The real problem is, like, the age-old garbage in, garbage out problem.
Speaker 1:Yeah.
Speaker 3:Which is, like, we talk about abundance and you just fire off five tasks and we pick the winner. Well, if your input was crappy, then probably all five of those are also kind of crappy. Just Yeah. Variance of crap, I guess, you know. And so I really think it's about when you're having a discussion with a colleague, or you're on a Zoom call, or you're in an issue, or you're in Linear using a ticket.
Speaker 3:That is the moment when we can actually get as much context as possible Yeah. And ask questions then. Like, why isn't Copilot in that moment going Yeah. I think I know what you're trying to build, but like, are you sure about that? Yep.
Speaker 3:Why do I have to carry that over, even via a click Yep. And then go, let's plan to build this. I get Yes. Humans don't do that. Yeah.
Speaker 3:Yeah. You just read it and we get started.
Speaker 1:Yeah.
Speaker 3:So I think it is a bit about where inference happens, but I think it's about how early in I have a problem that needs a solution. Yeah. I should be doing inference in the background immediately, before I ever invoke something, you know, to go say, now it's time to work. It started while we were in the shower thinking about the idea. That's, I think, what we need to get the AI to do more of.
Speaker 1:Yeah. I'm thinking about, like, AI at GitHub is the funniest thing because it's, like, seven different Yeah.
Speaker 3:Yeah. Yeah. Yeah. Definitely.
Speaker 1:Walk me through your thinking on with a lot of companies, I see this idea of AI above the fold or below the fold. Yeah. So above the fold is kind of, like, I put a search box, and I let you interact with my app, my SaaS, my CRUD app Yep. Via natural language. But then behind the fold or below the fold is, like, behind the scenes, I'm running inference over the data to improve the user experience, but I'm instantiating it with HTML, basically, at the end.
Speaker 1:But with GitHub, you also have inference that you're selling directly. You have a whole bunch of stuff. Are you seeing any exciting developments behind the scenes, like, AI to improve GitHub as a product that isn't actually bubbling up to the user experience directly in the form of just, like, you know, a text box? Have you seen developments there?
Speaker 3:Yeah. So, I mean, a big part of what we've been figuring out is, one, we have so much information about your interactions. Yeah. You know?
Speaker 3:Pull requests, the ones you got closed.
Speaker 2:Like, I told a story,
Speaker 3:the first pull request I ever shared didn't make it. And, like, but now you know that about me. And so I think the thing that we're figuring out is, like, we have, like, new models that allow us to really deeply understand your code more. Sure. Beyond just, like, the tabbing and Yeah.
Speaker 3:Asking a question. Because then if we have that, then we can understand how does Kyle work in a pull request?
Speaker 1:Yeah.
Speaker 3:What mistakes does he make every single time?
Speaker 4:I'm super fascinated by this. That's the thing.
Speaker 1:And then we were just talking about, like, Grokipedia today, where basically, it seems like the xAI team went and ran a bunch of deep research reports for all the topics that you'd wanna know, and then you just have their precached output. And I'm so fascinated by this idea of you have a ton of user data. You have a ton of inference. It's gonna be really inference expensive, but what is some sort of, you know, cron job that you can run over your entire user base, all the data, and then just have surfaced results or surfaced action items? That seems like an interesting, like, underexplored territory.
Speaker 3:I mean, okay. But if you think about it today, we're saying we're gonna bring all these coding agents. Sure. Sure. Why does each coding agent have its own memory of how I've interacted with it?
Speaker 3:Yeah. Yeah. I've been a developer for twenty something years. We could just go, here you go. Take this with you.
Speaker 3:Yeah. You know, you can understand how I work. So I'm gonna get a result that matches what I'm looking for.
Speaker 1:Okay. Walk me through the game theory around enterprise pre training for coding agents. So if I'm Coke and he's Pepsi
Speaker 7:Yeah.
Speaker 1:And we both have written a bunch of corporate code and have a massive GitHub installation with you. Yep. And if we both say, yeah, we're gonna train on us Yeah. Yeah. Maybe we will get better models.
Speaker 1:Yeah. But at the same time, we don't want all the information. So, like, what's the current thinking amongst, like, big enterprise customers around, like, will they jump over and say, yeah. You know what? It's worth it.
Speaker 1:Or, from an actual, like would an applied scientist just be like, yeah, we don't need that code anyway? What's the current thesis?
Speaker 3:So we spent a long time trying this. Yeah. And the problem is that everyone goes, hey. Our code's very different. It's super unique.
Speaker 3:Yes. You work a certain way. Yes. You don't. Like, well, so many companies don't.
Speaker 3:Now there are examples of where that's not true. Sure. Particularly companies with a really long legacy of, like, COBOL, mainframe code, etcetera. We've been kind of discussing with them, like, what would it take to get another 100,000,000 lines of COBOL code? Sure.
Speaker 3:Because then that does matter.
Speaker 1:That actually moves the needle on the quality of the coding.
Speaker 3:A 100%. Yeah. Because the problem is is that the practices and principles don't change that much. Yeah. And then most of these companies are also trying to modernize, so they don't want the code to look like their old code.
Speaker 3:They want to use their unique IP and look like the thing they want it to look like in the future.
Speaker 1:But if I have a 100,000 line Django project and he has a 100,000 line Django project, like, you're not like, oh, if only we had that.
Speaker 3:No. No. Because we no. Because we we've, like, done it and, like, there's, you know, margin of error Sure. Improvements, but nothing major.
Speaker 1:Yeah.
Speaker 3:When we look at, like, the chain of commits Sure. Then you can get to some interesting information.
Speaker 1:Okay. How was the enterprise built? Exactly. The history. Exactly.
Speaker 3:And why was that choice made?
Speaker 1:You obviously still instantiate that on the fly with an enterprise partnership. There's just always the question about, like, is there something beneficial in all the companies working together? But that's very helpful. Yeah. Thank you so much for hopping on this.
Speaker 1:Yeah. This was a lot of fun.
Speaker 2:Come back on anytime.
Speaker 1:Thanks, guys. If you ever wrap up, we have Jay Parikh next. Oh, no. We are moving on. Okay.
Speaker 1:We are gonna take you back to the news. Thank you for tuning in. We also have to do an ad read for Google AI Studio. We are behind enemy lines here at a Microsoft event, but we are presented by Google.
Speaker 1:Google AI Studio, it's the fastest way from prompt to production with Gemini. Chat with models, vibe code, monitor usage. We are obviously very happy to be supported by all of our sponsors who make crazy events like this possible. Thank you. We are able to come up here on short notice due to our sponsors.
Speaker 2:And, what else? Jay looks like he's getting miked up here. You are bringing Jay Parikh in? I brought this up yesterday. John, if you remember, I brought up, I did not know that with Interstellar, Christopher Nolan spent a $100,000 to plant 500 acres of real corn Yes.
Speaker 2:In Alberta. Yes. Then sold the corn for a profit after filming. Yes. And it remains the most profitable commodities trade in Hollywood history.
Speaker 2:Yes. You acted like this was
Speaker 1:This is old news. I knew about this years ago, before it went viral.
Speaker 2:And to do that in the AI era of, like, well, I could just generate the scene with AI, but I wanna be
Speaker 1:one of the greatest commodities trades, for sure. No. I think Christopher Nolan just got lucky here, honestly. It is pretty hilarious.
Speaker 1:I also wonder, you know, how how apocryphal is this story? Because, it doesn't account for everything else that went like, if if the corn was planted by production assistants, right, like, you have to burden that cost into the actual ROI. Yeah.
Speaker 2:It costs something to plant corn. Who planted the corn? I'm just saying, like, who planted the corn?
Speaker 1:Is this gross profit or net profit? That's what I wanna know, Christopher Nolan. Everyone's talking a big game about the Interstellar trade, and it might not have been as good as you think. Also, who owns the rights? Does the value of the corn accrue to everyone who has points on the back end?
Speaker 1:Like, does Matthew McConaughey make a couple dollars off of that corn trade? I don't know. Well, we'll get to the bottom of it. We have our next guest. Welcome to the stream.
Speaker 4:Hello. How
Speaker 2:are you? Thank you so much. Welcome to the stream. Introduce yourself for anyone who's been living under a rock. And
Speaker 1:explain a little bit about what you're working on today.
Speaker 8:I am Jay Parikh Yeah. And I am the EVP of Core AI Yes. Here at Microsoft.
Speaker 1:Okay. Important question. Do you have anything to share that updates your job today based on the news with OpenAI? Or is it just exactly the same?
Speaker 6:It's exactly the same.
Speaker 1:It's exactly the same. Yeah. Really? Okay. So are you marching towards AGI?
Speaker 1:Is it a race? Or is it all about developers? If Microsoft becomes a platform for AGI and OpenAI can compete there and Microsoft's internal AI team can compete, is there a world where you're racing to AGI against them?
Speaker 8:I think we we have a process for figuring out what AGI is. Yeah. And that is something that both companies will continue to Sure. Collaborate, work on, research. Yeah.
Speaker 8:Yeah. In the meantime Yeah. We have this mission, which is to focus on developers Yeah. And how we unlock way more creativity Yep. And to build a ton more things.
Speaker 8:So I have this idea, which is or this concept, where you think about all of the potential you think about the Hoover Dam, for example. Right? You guys are familiar with the Hoover Dam? There's 9,300,000,000,000 gallons of water behind that. It's about one gigawatt.
Speaker 8:Right? And it's, like, massive. Right? In terms of the amount of energy that it can generate. So think about all of these large language models, whether they be small ones, big ones, closed, open ones Yeah.
Speaker 8:Multi modal video, audio, text, etcetera. And you think about how we're gonna unlock that intelligence. And in order to unlock that intelligence, we have to write a lot of software. Sure. Right?
Speaker 8:And so if you think about the history of Microsoft, Satya commented on this earlier, you know, it's like Microsoft has been around fifty years. Right? And you think about all the software that's been written by Microsoft and everybody in the last fifty years. And I would posit that that's only 1%, or less than 1%, of the software that will ever be written. And what we're gonna see in the next ten years is just this, like, prolific expansion of the amount that's gonna get created
Speaker 1:A big unlock.
Speaker 8:We might even be undercounting it. Right? Yeah. It's gonna be crazy. Right?
Speaker 8:So Yeah. That is why we're all here today. Right? Which is, like, how do we really drive and use agents, use this technology with the right guardrails, with the observability, being able to customize this, personalize it, be able to tap in and bring in open source, being able to bring in your enterprise specific knowledge and controls and all of that, and to really just change that trajectory of creation, of imagination, and of building. Right?
Speaker 8:And I think, actually, the notion of even what we think of as a software developer is gonna change. Right? How do we make this way more approachable for anybody who has an idea, being able to translate that into, you know, showing something, building an app, getting it out there, getting feedback, iterating on it way faster than we've historically been able to.
Speaker 2:On CodeGen, like, I wanna get your read on how you're thinking about it. Today, developers, you know, maybe they have some favorite tools, but they're willing to constantly be experimenting, trying new things. You guys are in a great position to be able to support that through partnerships, and to sit, you know, at a foundational layer with GitHub. But what's your view on switching costs today and how that might evolve? You know, five years from now, do you believe developers will continue to just, you know, wanna always be trying the latest thing? Or do you think switching costs will eventually get to the point where it doesn't make sense to constantly be looking at other places, and it makes more sense to just focus on what you have?
Speaker 8:Yeah. I think there's like an element to, you know, developers, builders, around craft. Mhmm. And I think you're always gonna want to find like the best tool or the tools that suit your sense of craft. Right?
Speaker 8:Whether you're, like, a woodworker or you're a painter, you know, there's a big element of craft. So I think that there's gonna be use cases where, hey. This is the way to do it. These are the best tools to, say, modernize or upgrade some version of, like, old Java code that you may have. And there may be just, like, one or two tried and true ways of doing it and proper tools that you use.
Speaker 8:Then there's gonna be new use cases that we haven't even seen yet today. You think about some of these rapid prototyping apps, and I think, you know, it's great for the ecosystem that we're seeing different startups. We have different you know, we have GitHub Spark. We have different ideas that are all kinda competing and trying different versions of this. Those things do mature, and there may be a smaller, narrower field.
Speaker 8:But I think right now with this inflection that we're seeing in terms of building and velocity of change, that there will always be lots and lots of things to go try out. And I think that's good for developers. Right? Yeah. And I think our platform is such that we care a lot about that ecosystem of startups, other companies that can bring that choice, bring those tools into it.
Speaker 8:But then we can help, like, hook those things together Yeah. From an observability controls, like, just a sensibility perspective. So if you wanna scale this adoption inside of your enterprise, you need those rails, so to speak. Right?
Speaker 1:There's so much, like, practical, on the ground, just make the piece of software 5% better with AI today. There's so much low hanging fruit. It's a very exciting time. At the same time, I feel like we're taking a breather from all the AI fast takeoff talk, and it's exciting because you can go build so much enterprise software, so much value, so many new companies, so many things built on top of Azure and Microsoft.
Speaker 1:But at the same time, it feels like there is a new need for going back to the roots of academia or these, like, academic labs or these scientific labs. Do you have a pitch for someone out there who thinks that they're gonna write the next "Attention Is All You Need"? They're gonna write the next transformer paper. And you know what? In the short term, they're not actually gonna help optimize, you know, knowledge retrieval or codegen for the next couple years.
Speaker 1:But they believe that's what they wanna do. Do you have a pitch to them where they can come and work at Microsoft and do that level of research?
Speaker 8:Yeah. Absolutely. So I think there's lots of different adventures you can pick inside of Microsoft Yeah. Deeply focused on, like, builders, developers. Right?
Speaker 8:Because one of the other fascinating and fun things about the Core AI team is we have this super tight collaboration with Microsoft Research. Yes. Right? So Microsoft Research has Yeah. All of, you know, thirty plus years of history in science research and programming language research, compilers, security, and you name it.
Speaker 8:Right? So we actually have a lot of collaboration and joint problem solving, right, where they can focus more on that open ended research whether it be, hey. Here's I'm gonna go optimize this model. Here's how I'm gonna do formal verification of the code that comes out. Here's what I'm gonna do in terms of how to secure this code better.
Speaker 8:And so those things are out there. They're, like, big unsolved problems. They're longer time horizons, then as those innovations, those inventions happen in research, we can do the tech transfer. We can do the combined like product making together. And then that accrues into GitHub or in Versus Code or into Foundry, whatever, you know, whatever is the right avenue to bring that stuff to our customers, to developers.
Speaker 8:Yep.
Speaker 2:Yep. Where where do you stand on the should you learn to code debate?
Speaker 1:Oh, that's a good one.
Speaker 8:I think, yes. I think you should learn everything you can learn about these systems because the fundamentals, you know, ultimately, if you can understand, like, how this stuff shows up and it's instructing a computer, a GPU, a mobile phone, then I think that and it's less about maybe even knowing kind of the the the code, but it's that systems thinking mindset. Right? It's the cultural aspect of it. It's like, hey.
Speaker 8:I'm creating. I'm prompting. I'm guiding this thing, but here's how the code is going to generate. I understand what these models can and can't do, how to guide them Mhmm. More with a higher efficacy.
Speaker 8:Right? So absolutely. But I think of it more as, like, less of a narrow question of, like, hey. Should I learn to code or not? Yeah.
Speaker 8:It's like, how do I understand the system, the new system for how we're gonna build software and build innovation? There's understanding the hardware, understanding the software, understanding, for example, evals. Yeah. Super, like, important concept. Yes.
Speaker 8:Totally underrated. Like Yeah. Right? In terms of what's gonna happen, you have these offline evals, and how important those are to get higher, like, quality outputs of these things. Because there's the offline evals that we can sit there and we can Yeah. Score and say, we got these evals.
Speaker 8:Then there's the online or the lived experience, right, where when you put this AI into this product, you're like, wait. That doesn't quite work the way these evals said it was gonna work. Right?
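The offline-versus-online split being described can be sketched in a few lines. The toy model, benchmark, and feedback events below are all invented; the point is only that a perfect offline score and a mixed lived experience can coexist:

```python
# Hypothetical sketch of the offline-vs-online eval gap: an offline eval
# scores the model on a frozen benchmark, while the "lived experience" is
# user feedback collected after shipping. The two can disagree, which is
# the point of tracking both.

def offline_eval(model, cases: list) -> float:
    """Fraction of benchmark prompts where the model hits the expected answer."""
    hits = sum(1 for prompt, expected in cases if model(prompt) == expected)
    return hits / len(cases)

def online_eval(feedback_events: list) -> float:
    """Fraction of live thumbs-up events once the model is in the product."""
    return sum(feedback_events) / len(feedback_events)

def toy_model(prompt: str) -> str:
    """Toy model that only handles simple addition prompts."""
    compact = prompt.replace(" ", "")
    if "+" in compact and compact.replace("+", "").isdigit():
        a, b = compact.split("+")
        return str(int(a) + int(b))
    return "?"

benchmark = [("1 + 1", "2"), ("2 + 3", "5")]
offline_score = offline_eval(toy_model, benchmark)   # perfect on the frozen set

# In production, users ask things the benchmark never covered.
live_feedback = [True, False, False, True]           # mixed lived experience
online_score = online_eval(live_feedback)
```

An offline score of 1.0 next to a 0.5 online score is exactly the gap being described: the frozen benchmark never saw what real users actually ask.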
Speaker 7:And if So what
Speaker 8:is selling in zero ways, right, in terms of sense. Sorry.
Speaker 1:I have one more on that. We got your answer on should you learn to code. I wanna know, should you learn to deal? Should you learn to do deals? Is deal making underrated in 2025, in the age of AI?
Speaker 1:Being a deals guy, understanding incentives, bringing people together around the table, ironing out a deal. This is something that feels like it's growing. We saw it with the Microsoft OpenAI deal. That was a very unique deal. That was something that a lot of people, if they were just saying, oh
Speaker 2:It was much more than a than a traditional
Speaker 1:There were a ton of reasons not to do it. And it got done, and it's probably one of the greatest deals in tech history. I think so. Is there value in learning how to do deals and becoming a deals guy?
Speaker 8:I don't know that that's a twenty twenty five question. I think that is a life skill to know how to collaborate Yeah. And how to negotiate and how to compromise Yeah. And how to see you know, and sometimes, like, there isn't a deal to be made, and other times, there's a greater output or there's sort of a greater, like, a global maxima that you can attain. Right?
Speaker 8:And that's where even if you look at the news today, with our announcements of partnering with OpenAI and with Anthropic, bringing that all into this platform together. What we can go build and what we're gonna discover and how we're going to accelerate our joint learning, I think, is important. Right? And that can turn into a deal, but I think that comes out of this, like, hey. There's a greater good. There's a greater opportunity.
Speaker 8:There's sort of a greater market. There's a greater problem a bigger problem to go solve. Then, yes, figuring out how it's gonna work nuts and bolts.
Speaker 1:I like it.
Speaker 2:How do you think about Jevons paradox in the context of code? During the DeepSeek moment, Satya quickly came out, and I think he posted the Wikipedia link to Jevons paradox. And it sort of, like, steadied the market broadly. There were people that just weren't familiar, but I think it was well timed from his side.
Speaker 2:When it comes to, you know, our side, you know, we're a media company, and we have a developer on our team. And I think that, like, five years ago, we wouldn't have had a developer. And as it's become basically cheaper and faster to create software, we now wanna make software. And we're a company that historically just wouldn't have. So I'm curious how you think of that in the context, you know, going back to your earlier point of, like, we might have a 100, a 1,000, a 100,000 times more code.
Speaker 2:So what's your view there?
Speaker 8:Yeah. I think that's what we wanna see, the acceleration. Right? I think we talked about today, there's a 180,000,000 developers on GitHub today. Right?
Speaker 8:And new developers joining GitHub every second.
Speaker 1:Right? So it's literally a counter.
Speaker 2:It's nuts.
Speaker 1:Every second. And I was like, that sounds like miles per hour, but, like, this is just such an abstract concept.
Speaker 2:That's how they talk about countries. They're like Yeah. Every second. Yeah.
Speaker 1:There's a baby born every second.
Speaker 8:Yeah. But think about, you know, the types of personalities Yeah. And backgrounds. Right? Yeah.
Speaker 8:You can be a product person. You can be a designer. You can be a marketer. You can be a deal maker. Like, all of this stuff, you can join GitHub.
Speaker 8:You can start building. You can start checking in code. You could start mashing up different things. So I actually think it's a super exciting time to see what the industry is doing. Right?
Speaker 8:And I think it's hard to predict the future, but I do actually really, really fundamentally believe, like, from a mission perspective in CoreAI, our job really is to unlock that creativity both in the AI powered tools that you heard about today, plus the platform making these things secure. And, like, really, like, anybody who's got an idea wherever you are in whatever department you are in an organization or an individual, you should be able to actualize that. Like, you know, we have this saying in our team, which is like, you know, more demos, less memos. Right? It's like all about building and showing and iterating.
Speaker 8:Lots of stuff gets thrown out. Like, we don't like it. You know? Yep. Yep. But the fact that I can, in fifteen minutes, go through 15 iterations versus in the past, I might get a quarter of an iteration done.
Speaker 8:That I think is gonna Yeah. No matter
Speaker 2:how good a memo is, like, seeing seeing the product tells you 10 times more
Speaker 8:It sort of gets more creativity from the team, a small group of people. Now we have to make sure we also spend time dealing with the fact that there are gaps in these technologies. Right? They don't, like, work perfectly. Right?
Speaker 8:So we've gotta keep building those guardrails. We gotta keep building that training. The models will get better. The tools have gotta get better as well. And that's where I think the GitHub community working together with these different partners that we have, the platform, We just have to keep learning faster and faster and faster.
Speaker 8:That's what we're focused on.
Speaker 1:So more demos, less memos. Let's role play for a second. Deals. More deals. Potentially.
Speaker 1:We're trying to do a deal. If I'm a Fortune 500 CEO, I'm coming to you and I'm saying, I wanna transform my business with AI. I don't wanna make mistakes. What patterns should I avoid?
Speaker 1:What mistakes have you seen broadly, trends that, I wanna stay away from so that I can move forward with something that actually drives shareholder value and isn't just rah rah. I'm doing AI now.
Speaker 8:Yeah. So the first thing I would say is, like, really understand what the top one, two, three outcomes are, like, specifically Yeah. Hey. I wanna transform my business. Okay.
Speaker 8:Well, what does that mean? Yeah. Okay. Do you know what that means? Are you saying, hey.
Speaker 8:I need to I'm in an understand phase where I even just need to create some bright lines around what is the ideal, or kind of my dreams, around the outcomes of what transformation means. So get into the specifics of what that actually means. Yeah. Is it some revenue thing? Is it some product thing?
Speaker 8:Is it some
Speaker 1:Like, are you trying to cut costs or are you trying to grow top line?
Speaker 8:Right. That's number one. It's just understanding, like, what the customer means. I keep asking lots of whys Yep. To try to get more grounded in what those specific things are.
Speaker 8:Number two is one of the things that I will always encourage them or talk to them about is to then don't just talk about these things. Right? It's like, what are you doing to start learning? Because if you're early in that journey of understanding AI, there is only so much that you can sort of, like, read and talk about and conduct meetings on. You do need to have, like, this internal adoption, right, where you're encouraging, you're incentivizing, you're really driving that experimentation, that curiosity of your organization. Right? So how do you understand what your base level of curiosity and risk taking is? If you are a more risk averse company or a slower moving company, then how do you change that culture? Right?
Speaker 8:So cultural transformation comes up in 90% of my customer conversations. We'll talk some tech stuff and then they're like, okay, Jay, how do we do this people wise? Right? And then the third
Speaker 2:thing Headcount. Headcount planning. They're like, what what's your plan? Maybe I'll adopt that.
Speaker 8:Right. And then the third thing that I always will encourage folks to do when we'll have a conversation is, like, raise your level of ambition. Like, wherever you think you are in terms of ambition and that outcome, I promise you it's not enough. Because the technology, the models are growing way faster. They're getting way smarter than we humanly understand.
Speaker 8:So whatever ambition you have for this fiscal year or this half or this quarter, take it up a notch or two and then strive and push and lead to that to that point.
Speaker 1:Yeah. Do you have a bright line internally? I feel like there's some organizations where AI means not generative AI, but I don't think you use that exact dividing line. But should there be a dividing line between, like, machine learning recommendation systems, how Netflix recommends me the next thing to watch, for example? Like, that is an AI system. What pops up on my news feed is AI, but it's not generative AI.
Speaker 1:It's not what we think of when we think of generative image models. Is it worthwhile in 2025 to have a bright line between those teams or those skill sets, or is everything bleeding together?
Speaker 8:I think things are definitely blurring together, and there's stuff that's informing, you know, from one set of techniques to the other and vice versa. I do think that those systems are very, very sophisticated. They're very, I would say, powerful in terms of, like, user experience today. There are definitely places where people are using Gen AI when they shouldn't be and they should be using, you know Yeah. Machine learning techniques that just really work and Yeah.
Speaker 8:Are faster, better, cheaper. Right?
Speaker 1:They're cheaper. Yeah. You can imagine a bunch of things.
Speaker 8:Those are the things that, you know, we have to watch for in organizations where, you know, Gen AI is the hammer and everything looks like a nail, when we actually have these mature, optimized, really exceptional technologies, and we should use those and not forget about those. But I do think that, at scale, the stuff that we've learned in these more mature, more scale out machine learning systems will feed back into how we make products using Gen AI.
Speaker 1:Well, thank you so much for coming on the show, fellas. This is always a blast.
Speaker 8:I could do this. You do it. Take care. Sweet.
Speaker 2:We got you to come back. I'll be back soon.
Speaker 1:We have Jared Palmer, the vice president of product for Core AI, and the SVP of GitHub.
Speaker 8:The VP. But don't worry. Let me unpack that.
Speaker 2:You're both a vice president and a senior vice president? Yes. Is this like a two phase situation? Max. Yeah.
Speaker 2:Title max. Yeah.
Speaker 7:I might just go, Oh. So, you know, regional branch manager. Yes. Right?
Speaker 1:Yes. Yeah. Yeah. It's like assistant to the CEO, assistant CEO.
Speaker 7:Technically, is VP of product core AI Okay. And SVP of GitHub.
Speaker 1:Okay. It's incredible.
Speaker 2:It really is. Welcome to the Yeah. The gig.
Speaker 7:Thanks. Good.
Speaker 4:Good to see you on the show. Day thirty? Thirteen? Oh, thirteen. Yeah.
Speaker 2:Okay. Maybe we do more for month.
Speaker 7:I was VP of AI at Vercel. Vercel?
Speaker 8:Yeah. That was great.
Speaker 1:We had Guillermo on the show yesterday.
Speaker 2:You created a little thing called v0.
Speaker 1:v0. Yeah. Congratulations.
Speaker 7:Yes. So I've been in the game for, I don't know, a little bit doing that stuff.
Speaker 7:So, yeah. Yeah. It's been fun. Vercel has been great.
Speaker 1:Yeah. So, I mean, have you had time to actually develop, like, a vision for what you're building here? Is it too early to ask? Or or or are you still just in kind of, like, let me assess the tools in the tool chest over here?
Speaker 7:It's day 13. Okay. So definitely, but I've been a longtime GitHub user for a Yeah. Very, very long time. Like, I think it's been over ten years Yeah.
Speaker 7:Since I made my account. So
Speaker 1:And I imagine you've been thinking about, like, a broader developer experience and what this means in the age of AI all through the last I mean, the last five years have been, like, a deafening ring of, like, AGI and takeoff and timelines and stuff. You must have engaged with that, of course.
Speaker 7:Yes. Yes. And we obviously, at Vercel, we thought deeply about
Speaker 1:Totally. Developer experience.
Speaker 7:Yeah. I think that's really the vision, with CoreAI and the formation of it. We just had Jay on. Yeah. By combining Microsoft's assets across the stack.
Speaker 7:Right? We've got VS Code, Visual Studio
Speaker 1:Yeah.
Speaker 7:GitHub, and putting these actually all in one org Yeah. Makes for the ultimate developer experience, and that's what our goal has to be. And focusing just on that is, I think, my first and foremost focus.
Speaker 1:Yeah. Do you think that developer label just melts away eventually? Or do you think there will still be a dividing line in five years, ten years?
Speaker 7:I don't know. Five or ten, I guess. I mean, it
Speaker 1:it just feels like there's a world I don't know. But it just feels that way, you know. Like, there was a time when, to take a photo, you needed to be a professional photographer because you needed to understand how to change film in a dark room. And now everyone has a smartphone camera, and everyone's a photographer. That feels like it's coming.
Speaker 1:I don't know. I just see, like, I can open up an app on my phone, type a prompt, get code. Sure. It's, like, kinda hard that I need to link my GitHub account and set up Pages to, like, actually deploy it. But, like, we're only a couple months away from that, I feel like.
Speaker 1:And then eventually, it becomes, like, more prompt driven, but then there's still value. I don't know. How do how does all this play out?
Speaker 7:I think there's always gonna be a market for people who get stuff done. Yeah. Right?
Speaker 1:Yeah. Just high agency people.
Speaker 7:High agency people. So builders Yeah. Yeah. And whether it shifts into more product focus Yeah. Knowing how to build just, like, systems that are big and large Yeah.
Speaker 7:Yeah. That may be outside the training set.
Speaker 1:Sure.
Speaker 7:Sure. I think it's always gonna be important. I also think, the way I think about it at least, is that some of the pipes, the tooling, probably aren't changing as fast as the AI is. Yeah. What I mean by that is, like, the way that packages and code are distributed, tested, built.
Speaker 7:I don't think that's gonna change as fast as maybe the models will. Yeah. That makes sense. So with that, like, infrastructure in place, I think you're still gonna have human involvement for quite some time. I think the things that people will build may be more ambitious.
Speaker 7:I think that's really exciting. Yeah. And our job is to facilitate that and empower developers and think about, you know, what they need. But, you know, in five years or so, I still think people are gonna be building stuff with they're still gonna be coding in some respects. It just may look very different.
Speaker 2:Where do you wanna see model progress? People talk about how the models are gonna get better. Like, they're just gonna get better Right. And plan around that. But, like, when you're talking to labs, when you were at Vercel or now at Microsoft, where specifically are you pushing them to say, like, hey.
Speaker 2:Like, it needs to be better here.
Speaker 7:Yeah. I mean, that's a great question. At Vercel, we worked deeply with the model labs. We obviously were very focused, with a product like v0, on a specific subset of what models can do. Even in the coding realm, Vercel was always focused on front end, right, and specifically Next.js.
Speaker 7:So not just one language, but one specific tech stack. And so we were always, you know, engaged with how can we make it better for Next.js. Yep.
Speaker 7:Switching gears for a second to GitHub, obviously, we're now multi language. We care about everything, but we do care about coding. That's the primary focus point. But coding involves so much more than just generating, like, autocomplete code. Right?
Speaker 7:It's more than autocomplete. We need models to be great at research, to be great at reasoning, and then also delivering mergeable code. Right? That's, I think, slightly different than just complete my comment.
Speaker 7:So we've been focusing a lot there and focusing on quality and something that we look to continue to hill climb on Yep. As time goes on.
Speaker 1:How much have you studied the, like, scalable open source company business model? Like, what Vercel did with Next.js? Like, are you familiar? Oh, yeah.
Speaker 1:Can can you give me, like like, the the crash course if I'm, like, I'm a developer. I wanna build a business. I'm I'm gonna open source a a, you know, a a package that does something, and then I wanna build a business about it. Like, what are the pitfalls that I need to avoid? How do I actually bala like, what are the trade offs that I'm making to actually build a great, like, open source for profit company?
Speaker 1:Because there does seem to be some tension there, but that model's held up, going back to Red Hat Linux all the way to Vercel today.
Speaker 7:Sure. I have a controversial take. Please. There aren't that many pure open source companies where the core product itself is open source. Sure.
Speaker 7:I think the more successful strategy is actually, if you really dig into Vercel, because Vercel is not open source Yeah. But Next.js is open source. And Next.js is a complementary satellite product Yes.
Speaker 7:That drives attention and that Vercel uses to make a better product. Yeah. They've got this amazing feedback loop of internal dogfooding. And there is a community around the project. Vercel powers a material share of overall Next.js builds, and developers use Vercel, but it's not like Vercel is an open source business.
Speaker 7:Yep. Right? It just has Next.js as one of the largest pieces of its open source portfolio, and it also now has the AI SDK. Yep.
Speaker 7:And with Vercel, the idea was to do something we used to call framework-defined infrastructure. The idea was you can build with this framework, and with no configuration, you can deploy it. Mhmm. And you don't have to think about scaling it.
Speaker 7:And so the analogy I would make is, like, imagine you were asked to, I don't know, cook food for everybody here at Universe. Yeah. With Vercel, the idea was like, oh, what if we gave you the pots and pans, and all you had to focus on was cooking for your family of four? And then Vercel would worry about scaling it to everybody here. Yeah.
Speaker 7:And so to your point about open source, my suggestion for the crash course: a common pitfall you should not run into is assuming that your free open source users are gonna directly translate into paying customers. Interesting. I think that's actually really hard, because you've set up the expectation that you're giving away a free service. We have this free code. Right?
Speaker 7:Yeah. And then all of a sudden, they're gonna convert and pay you x dollars a month, and you're gonna have an enterprise business, which you haven't really been honing in on and grinding on, and that's just gonna happen overnight? I think that's false. You need to start from the beginning with both, and also set expectations with your user base: this is paid, this is open source.
Speaker 7:So And if you can find a beautiful symbiosis between those two, that's where I see, like, a really big success.
Speaker 1:Is there some sort of, like, barbell strategy where you should actually go really broad with your open source package? Anyone can use it, but it's probably, like, small developer startups and solo indie devs using it. And then if you notice some big corporations are using it, you come out with an enterprise plan. They have no ground to stand on if they complain.
Speaker 1:That's a lot easier than saying, okay, actually, I'm nerfing the open source thing, and now all the indie devs need to pay me $25 a month. You know? That's way different than going like, hey. Look.
Speaker 1:Fortune 100 company was using this. Now we got a million dollar contract with them. Is that best practice?
Speaker 7:It's hard. Sometimes these big contracts early on can really be devastating
Speaker 1:Oh, sure.
Speaker 7:Because they can remove your focus from growing that inertia that
Speaker 3:No.
Speaker 7:No. And so you have to be careful. Obviously, they're great when they happen.
Speaker 7:But focus on your core value proposition, your core customers. And it's really great to get feedback from those enterprises early on. And Yeah. I mean, projects I've been involved with, whether it was Turborepo, whether it was Next.js, even v0. Like Mhmm.
Speaker 7:We didn't launch enterprise for almost a year or so. It was a big debate between me and Guillermo, and I think we were actually early; we should have delayed it even further. Really getting that groundswell is so important, and you can always do enterprise later. But be careful. Somebody could come in. Even ChatGPT, by the way, didn't have enterprise for a while, and a lot of people were egging for it. Yeah.
Speaker 2:Yeah. And then when it finally came, a lot of companies reported, like, yeah, we don't pay for ChatGPT, but our employees all use
Speaker 5:it. Right. And then they and then you go
Speaker 7:to the CISO, and you're like, hey, by the way, we have a lot of your users here already.
Speaker 1:I know exactly what you're referencing. I can't say. It is a wild one.
Speaker 2:How do you think, you know, there's so much excitement around the potential of AI in science and law, these other categories. And, obviously, adoption is happening. But how would you imagine adoption will look in those categories? Because AI adoption in software engineering is very natural, because the people building and doing the research are adopting the product, and it's this super tight feedback loop, and you're not really gonna see that in the same way in some of these other categories. So,
Speaker 7:Yes. I don't know if you know this, but before I got into software development, I was actually a banker.
Speaker 1:Let's go.
Speaker 7:Yeah. Goldman Sachs.
Speaker 1:Let's go.
Speaker 7:So I did my banker stint. Right? And I gotta tell you.
Speaker 7:I'll be honest. Like, I think, you know, Anthropic, I was talking to Mikey. They announced Claude for Excel. I think that's gonna do wonders. Yeah.
Speaker 7:I think if you talk to any Goldman Sachs analyst, they'll be very excited to have that deeply integrated. And if you're building off all that data
Speaker 1:There's whole businesses that are built just on, like, templates. Totally. And it's like an obvious way
Speaker 7:What's interesting, though, is if you look at Claude for Excel Mhmm. I think their core foundation is still the coding agent. And there's something about the coding runtime that can then be augmented to other verticals. I think that's what you're gonna see in the next year or so: these model labs building out these harnesses and going vertical by vertical, whether it's banking, health care, or consulting. Right?
Speaker 7:They're gonna go through knowledge work, and they're gonna iterate on it, just like the hill climb.
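The "harness" pattern being described, one core agent loop re-dressed per vertical with a domain-specific tool set, can be sketched minimally like this. Everything in the snippet (class names, tools, the keyword dispatch) is illustrative, not any lab's real API:

```python
# Sketch: the same core agent loop gets re-targeted at a vertical
# (banking, health care, consulting) by swapping in a domain tool set.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Harness:
    vertical: str
    tools: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def run(self, task: str) -> str:
        # A real agent would let the model choose tools in a loop; here
        # we dispatch on a keyword to keep the core idea visible.
        for name, tool in self.tools.items():
            if name in task.lower():
                return tool(task)
        return f"[{self.vertical}] no tool matched"

# One core loop, two verticals: only the tool set changes.
coding = Harness("coding", {"test": lambda t: "ran the unit tests"})
banking = Harness("banking", {"spreadsheet": lambda t: "updated the spreadsheet model"})
```

The point of the sketch is that the agent loop itself is shared; going "vertical by vertical" mostly means curating a different `tools` dictionary.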
Speaker 5:Yeah. Yeah. What do you think
Speaker 2:on how do you think about switching costs now and over time if you're a products company and you're leveraging intelligence from a lab? Like, do you think the labs will, over time, make it harder and harder to, like, rip out one model provider and use another? Because right now, it feels like there's this land grab happening in enterprise and this race between Anthropic and Gemini and OpenAI. But how do you think that evolves?
Speaker 7:I think most of the product builders that I've talked to, like, from the companies that are on here all the time, most of their teams are working with multiple models. And they're constantly evaluating Yeah. With whatever sort of product analytics or test harnesses. They're looking for any edge they can get. They're so competitive Yeah.
Speaker 7:At least in, like, the startup space. But switching models is not easy. That takes time, especially when there's a big rearchitecture, like when reasoning came out, for example. That may require a rewrite of all the prompts and all the edge cases that you've been massaging, and these models have different characteristics. But I think most of the high performance teams are dialing in harnesses for each and every lab, and they're just Yeah. So hungry
Speaker 2:that So switching costs are high. Switching costs are high, so the answer is, like, you use all of them in the beginning.
Speaker 7:Correct. And there may even be certain subsystems or certain tool calls where you're gonna switch models and mix them together, and that's just, you know, part of this. If you look at, like, what Windsurf did when they released that specialized model for research Yeah. You know, they're combining. They're mixing and matching.
Speaker 7:I think that's the next, you know, we'll see that throughout the next year. I don't think it's like, oh, we're just gonna use Anthropic models, or we're just gonna use OpenAI models, or we're just gonna use whatever model. I think you'll see a lot of combination.
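The mixing-and-matching just described can be sketched as a tiny router: each subsystem or tool call goes to whichever provider scored best on that task type in your own eval harness. Provider names and scores below are invented for illustration:

```python
# Sketch: route each task type to the provider that won your internal evals.
# (task_type, provider) -> score from a hypothetical test harness.
EVAL_SCORES = {
    ("reasoning", "provider_a"): 0.91,
    ("reasoning", "provider_b"): 0.84,
    ("codegen", "provider_a"): 0.78,
    ("codegen", "provider_b"): 0.88,
}

def pick_provider(task_type: str) -> str:
    """Best-scoring provider for this task type, per the latest evals."""
    candidates = {p: s for (t, p), s in EVAL_SCORES.items() if t == task_type}
    if not candidates:
        raise ValueError(f"no evals for task type: {task_type}")
    return max(candidates, key=candidates.get)

def route(task_type: str, prompt: str) -> str:
    # In a real system this would call the chosen provider's API.
    return f"{pick_provider(task_type)} <- {prompt}"
```

Re-running the evals and updating the table is what makes switching cheap at the subsystem level, even when switching an entire product over is expensive.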
Speaker 9:Thank you so much for coming on the show.
Speaker 8:We're big fans.
Speaker 2:Congrats. Have a great one. Come back. Come to LA anytime.
Speaker 1:Thank you. Before we bring in Michael Grinich from WorkOS, we had some breaking news. Did you see the blimp? Did you see the blimp? Do you know whose blimp that is?
Speaker 1:It's Sergey Brin's blimp. Let's go.
Speaker 2:I mean, this is Mag seven on Mag seven.
Speaker 2:I'm actually gonna take a little bit of credit for that. Yeah. I told the Gemini team
Speaker 1:Oh, you know I
Speaker 2:I I care. I I have a You gotta split for
Speaker 1:like twenty years. What's up, guys? Hey. Good to see you.
Speaker 2:Great to finally meet you, guys. Feels like you're old friends. Yeah. David. David Senra.
Speaker 1:Everyone. Oh, yeah. Mutual friend David. Love David.
Speaker 4:Yeah. I brought you guys both Oh, please. One of our highly coveted, super rare Enterprise Ready ones. Enterprise ready.
Speaker 1:Okay. Are you enterprise ready? Alright. Enterprise ready.
Speaker 4:What does it mean? Well, pretty much every software company, eventually, when they get product market fit and go up market
Speaker 3:Yeah.
Speaker 4:There's a ton of stuff they have to add to their app to go sell to enterprise. Yes. So the guys at Microsoft and GitHub, they did this years ago. Yeah. But if you're a new company, you have to add all this stuff to your product.
Speaker 4:Sure. And it's things like single sign on Yes. User provisioning Yeah. Logs, security. WorkOS just does all that Sure.
Speaker 4:For you as a developer.
Speaker 1:Got it. Yeah. Okay. So you can just focus on the core products. Yeah.
Speaker 1:Yeah.
Speaker 4:In the same way you use Stripe for payments or Twilio for messaging, WorkOS is really that or
Speaker 1:This is not an easy business, because it's not something that you could just, like, go through YC and sell to another startup. So who was the first client? How did you get into this? What were you doing before?
Speaker 4:So I started working on this a long time ago, almost seven years ago. Okay. So WorkOS is kind of it pains me.
Speaker 2:It's just Yeah.
Speaker 4:We're also, like, a pre-AI company. Someone called us that the other day, which also kinda hurt Yeah. A little. I know. Dinosaur.
Speaker 2:You started as AI native before AI existed. Yeah.
Speaker 4:For real. I saw this problem at another company I started.
Speaker 1:Okay.
Speaker 2:We had
Speaker 4:we had built an email product.
Speaker 1:Yep. Got a bunch of usage. Got a bunch of adoption. Right. You're an enterprise.
Speaker 1:Tried to sell
Speaker 4:it to these guys, they said, no way we can let this
Speaker 1:company get you know, is the CTO, c a CISO, like
Speaker 4:Typically, engineering leaders, co founders, VP of Eng
Speaker 1:Okay.
Speaker 4:Whoever is kind of responsible for the technology.
Speaker 1:But is it because they want those features or they need them for legal reasons?
Speaker 4:They gotta have them. They usually have deals that are blocked because they don't have these features. Okay. Got it. So you start growing up market, and there'll be some customer that says, hey, we'd love to use your product, we'd love to roll it out, you know, at Coinbase or Microsoft or something, but we can't do it unless we have these features.
Speaker 2:Has demand just been insane? Because people are building products so quickly, and then employees at companies start adopting them kind of personally, and then they realize
Speaker 4:It's compounding. Yeah. Sure. So we had a lot of growth, you know, years ago through the kind of the early cloud era SaaS. Like, Vercel is one of our customers, Carta, Plaid, folks like that.
Speaker 1:Yeah.
Speaker 4:In the last year and a half, two years, in the last year and a
Speaker 1:half or two years, what
Speaker 4:we found is it's actually perfect for all these AI companies. Yep. So today, we're powering enterprise off for OpenAI, Anthropic, Perplexity, Cursor, Sierra. Yeah. You know, all these guys that are growing faster.
Speaker 4:So
Speaker 2:I guess I wanna step step on. Yeah.
Speaker 1:Okay. Walk me through the thesis. In the YC era, it became like, the the YC trade was basically, you could be a kid in college, you know, graduate and move to y move move to Mountain View or or Silicon Valley. And for a $100,000 and some cloud credits from Azure or whoever, you could set up a website and go kinda build the first era of consumer. And we got our Airbnbs from there.
Speaker 1:We got a variety of consumer companies. But in the AI era, it's becoming easier to go enterprise on day one. Is that real? Is that is that a is that a reasonable thesis? Do you see any data to that effect?
Speaker 4:Absolutely. So I think that previous era Yeah. The privilege that those companies had is they could take a while to get to enterprise. That's right. If you look at Dropbox, Figma, Slack.
Speaker 1:Yeah. Yeah.
Speaker 4:It was years. It was like three, four, five, six, seven years before they actually went after enterprise. Yep. What we're seeing today is AI businesses get pulled up market way faster. Sure.
Speaker 4:And it's way more competitive. So companies like Cursor or Complexity, pretty much in year one, year one or two Yeah. They get pulled up market to the enterprise. Yep. That's why they
Speaker 1:need it. Because of that competitive dynamic, but also the tools that they're building like, the the enterprise is just so ready for them.
Speaker 4:There's another piece of it as well. It's not just that they they grow faster at market
Speaker 1:Yeah.
Speaker 4:But you think about these AI products, they are touching sensitive data. You have one of these things that it it's only valuable if you get access to all of your stuff. Sure. You give it access to to do things on your behalf. So suddenly it becomes this huge security concern.
Speaker 4:Yeah. Maybe an old product like Figma, could say to the design team, just don't put any sensitive data in it. Yeah. But you get one of these agents or something connected, you you need it to access everything. And so they're scrutinized at a higher amount, plus they grow faster, plus in their life cycle.
Speaker 4:It's a perfect storm where where we come in to help them grow.
Speaker 1:Talk to about domestic versus international. I imagine a lot of your come a lot of your clients are already international. So does that mean you're international, or are you focused on, making American companies enterprise ready immediately, and then maybe you'll go after the European market later? How do think about that?
Speaker 4:So many of our customers are actually right here.
Speaker 1:Yeah. Like, probably at Universe, literally right here.
Speaker 4:We were joking we could cut up our sales territories by north and south of Market Street and SF because we have so many businesses that are here that are growing quickly. Yeah. What we find is their customers are international. Of course. Right?
Speaker 4:So they're going and selling to larger organizations elsewhere in the world. Okay. So our products that are kind of our customer's customer, those types of things we, you know, we localize. We just did a big project to translate everything using AI. So we launched a 100 languages.
Speaker 4:Yep. So we do that kind of stuff. But Yeah. We find that the best product the best companies that are using Work OS are these high growth AI businesses that are taking off.
Speaker 1:Yep.
Speaker 4:And of course, they're mostly here. Yeah. You know, they're they're mostly here.
Speaker 1:Yeah. That makes sense.
Speaker 2:What what's your philosophy around operating the business? I I don't remember. I don't necessarily recall the last time you guys raised money. Like, I'm sure people are throwing money at you all the time when they see the logos. Yeah.
Speaker 4:We raised our our series b, almost exactly four years ago, actually, which is That
Speaker 2:was the last financing.
Speaker 1:Yeah. That was
Speaker 4:the last financing, which is like an eon in the SaaS era. Yeah. Keep that, dude.
Speaker 1:Since Then we've So we got them back, they've defogged our hands.
Speaker 4:You've just been building it slowly since then, you
Speaker 1:know, bit by bit by bit
Speaker 2:by you know, slowly Actually, I do have
Speaker 4:something to announce. It's pretty exciting. Let's do it. You know, you you had Asati here previous talking about how they do a billion in revenue every day. We're very proud to announce that we just crossed 30,000,000 in annualized revenue.
Speaker 4:Amazing. So that's our Congratulations. That's our big number Overnight success. We're sitting today.
Speaker 1:That's great.
Speaker 4:We're a bit smaller than Microsoft, but but we're coming for you. Seth. Yeah. No. We've been compounding since then.
Speaker 4:The the AI stuff has really been this huge tailwind for us. Okay. And it's it's so fun to build infrastructure
Speaker 1:Yeah.
Speaker 4:Where we get to see into all these companies. Yeah. Like, my customers are the fastest growing, most exciting AI businesses out there. Invest? Do you angel invest?
Speaker 2:I do some. Yeah. Because I I actually keep seeing these companies at this, like, crazy Yeah. Blockchain point. I've had It's hard.
Speaker 4:VCs start asking me for the data. They wanna invest just to get the gross data out of it. Yeah.
Speaker 1:It's a little
Speaker 3:Yeah.
Speaker 4:It's it's a little difficult. Yeah. We don't we don't share that kind of stuff. But just through, you know, building for developers and running events and Yeah. I mean, I I love GitHub Universe.
Speaker 4:This is like Coachella for, like, you know, developer stuff. You meet founders and meet other people building stuff and stuff.
Speaker 1:Who are some of the entrepreneurs that you look up to? Who or or where's the what's the story from a founder or a business person that you keep coming back to is like, oh, that one
Speaker 2:I'll go first. Was just David Senora.
Speaker 1:Just his story? Just how he By my guess?
Speaker 2:No. I I do. I think I think I I I he is, like, I I feel like I have the blessing of, like, I by being friends with David, I'm friends with history's greatest entrepreneurs. It's the most fish show down about the fish Yes. That's role model.
Speaker 2:Right? He has a specific type of business, but whatever business you're building, you can learn from. Yeah.
Speaker 1:I mean, yeah. There there's a lot of other things you could have said, but that one's pretty good. You got anything?
Speaker 4:I was trying to think from my David's great. I I'm
Speaker 1:just laughing because, like like like, the stakes are like, you know, Henry Ford inventing the, you know, the the what what's it called? The actual assembly line. Yeah. Yeah. The main the automated assembly line or like, you know
Speaker 4:Well, yeah.
Speaker 2:Yeah. It's something the most impactful versus, like, the most valuable for you.
Speaker 1:Yeah. Yeah. I guess who who do you come back to? I don't know. I always turn back to a mark we're very much in a marketing business, and so I always come back to a quote that I attribute to David Senra, but is actually from David Ogilvie.
Speaker 1:You are not advertising to a standing army. You are advertising to a moving parade. Yeah. And so the question is like, why are we Proving
Speaker 2:to a point you you, you've heard that for the first No. Instance of
Speaker 1:I read I read Ogilvie on advertising. I'm familiar with the book. I read it before David read it. But but he did stick it in my brain, and he advertised it to my moving parade. And, I and I've always liked that idea of of even if you've shown someone an advertisement once or you've sent them a message or you've given them a pitch once, like, you there there is a moving army.
Speaker 1:There is so many distractions. There's attention all over the place that you need to be hitting again and again. And it's why they don't just do one GitHub universe and say, yeah. We did it. We're good.
Speaker 1:They do it every single year.
Speaker 4:And the message is different every year for you too. They're evolving. Yeah. Man, there's so many there's so many to choose from. I'm I'm I feel like a little bit of an old soul in that when I when I heard Satya talking about, like, the early days of Microsoft and Bill, I love building platforms.
Speaker 4:Yeah. Like, I built all this other stuff earlier in my career. And as soon as I started building stuff for developers Yeah. Other people making stuff, Sure. I was like, oh, that's really sick.
Speaker 4:Yeah. It's like, you see other people make stuff with the thing you made and then build their own businesses on top of that. And to me, Microsoft is like the first big software platform company. Yeah. You know?
Speaker 4:Windows enabled so many developers to build and ship these experiences to change change the world. And it's it's it's, you know, previous cycles, I think a lot of people forget it. But it was this huge enabler, this huge, like, democratization of access to Yeah. Technology.
Speaker 1:Is there a, a specific sales funnel that flows through GitHub with your product?
Speaker 4:We do just a ton of stuff with developers. I think, you know, I mean, we do everything from, you know, sponsoring podcasts and newsletters Yeah. Meetups and and doing developer events. Our we did our own conference last week.
Speaker 1:Yeah.
Speaker 4:We do a lot of open source stuff. We run a really op popular open source design system project Really? Called Radix. Oh, that's Yeah. Well, that's on GitHub.
Speaker 4:But I think I think my GitHub account is probably one of the earliest, like, personal identities online. Had an account, know. I've had it since before I was in college, so Wow. I'm thrilled to be here. Yeah.
Speaker 1:Yeah. That's very cool.
Speaker 2:Are you speaking at all today
Speaker 9:or today?
Speaker 4:I am. I'm giving a talk tomorrow, all about AI and identity for agents.
Speaker 1:Okay.
Speaker 4:So so this is a new thing. WorkOS is kind of like an identity security company. You help people with sign in and off. There's And this big question right now of how we're gonna secure agents. Yeah.
Speaker 4:You know, if we have, 7,000,000,000 people on the planet, we'll probably have trillions of agents running around and doing stuff for us connecting to different systems. And security is even more important. You can think of an agent kind of like a a crazy hyperactive intern. Yeah. You're gonna have access to all of your systems.
Speaker 4:And so there's this question of how do you authenticate them, how you build security around it, permissions, approval
Speaker 2:Prompt injected. Right.
Speaker 4:Right. Yeah. There's this old old quote, you know, to is human, but to screw up 10,000 times per second, you need a computer Yeah. To do that. Agents are kinda like that.
Speaker 4:Right? They make it really easy to do stuff really quickly but also make mistakes. Yeah. So my talk is all about that and some some ideas that we have around security for
Speaker 1:Yeah. Do you think, agent security bifurcates along the consumer and biz business to business access? Do you think there's a discrete enterprise versus b to b layer? It feels like we talked to the CEO of 1Password, for example, and it does feel like the like, my the password to my Yelp account, I might not be using, like, Work OS for that in the future.
Speaker 4:It's definitely gonna be some blurring, you know. Yeah. And and, you know, talking to, like,
Speaker 2:like, API keys. Like, like, my my the way I'm thinking about it is, like, a small business might have 200 agents that are out in the world. Maybe some are selling. Maybe some are doing customer support. And it's like Right.
Speaker 2:When when should a CX agent be able to share account data? Right? Right? When when can a sales, agent, like, provide pricing? And I mean, like, there's so many different things.
Speaker 2:And you, you know, CEOs, and they're working with teams. Right? They have, like, processes in place for individual people. And so I think I think it's, like, really important problem area.
Speaker 4:It's it's completely changing the way people think about security. I think if you if you go to any of these, like, security focused conferences, it's the topic on everyone's mind. Might have. Yeah. Exactly.
Speaker 4:Exactly. Yeah. Because we within companies, previously, you've had these kind of silos of information or control. Yeah. You have permissioning systems that are pretty static.
Speaker 4:Yeah. But with agents, you know, you might have 200 today and zero tomorrow. You might spin them up and down depending on a task, depending on a project. Yeah. And so that permissioning model is, like, completely changing.
Speaker 4:Yeah. And it's really exciting. Know, we're we're right in the middle of it. Like, us working with all these different AI businesses Yeah. They themselves are building their own agentic workflows, whether it's, know, stuff like Codex or Claude code or what Cursor's building with their
Speaker 1:Is there a back
Speaker 2:end the
Speaker 4:AI horror story and security yet? Oh, yeah. It
Speaker 2:They didn't even get to
Speaker 1:talk about Is it mean, there there's one thing I I yeah.
Speaker 2:Thuba security, I think it Yeah. It's still feels notable. Like, there hasn't been, like, a specific day on the Internet that everyone was Oh, like
Speaker 1:we went down because of AI. Yeah. That hasn't happened yet.
Speaker 4:Not not yet. I feel probably just latest
Speaker 1:the AWS outage, like, I don't think no one pinned that on AI. No one pinned that on on on generative AI or stochastic systems.
Speaker 4:Yeah. There was one one a few months ago where, Jason Lemkin, you know, from Saster, he was Yeah. Vibe coding an app on Replit. Oh, I saw his AI.
Speaker 1:Yeah.
Speaker 4:Yeah. And he was like, was pure prompting.
Speaker 1:Right? Yeah.
Speaker 4:He's not writing any code. He's like, just talking to the thing. And, he asked the agent to do something and it deleted the full production database. Yes. And then he was like, what the hell?
Speaker 4:And then the agent lied about it. Yeah. It was like, no, didn't do that. You know?
Speaker 1:Adrian was like, yeah.
Speaker 2:He's like, yeah. Cover it. You know?
Speaker 1:It's like you're like, oh, yeah. At what point, think the agent did just You got it.
Speaker 4:My bad. Yeah. My bad.
Speaker 1:But but I do think they were able to roll it back. So I wrote your job getting him a thread and and kinda just to close it out and not leave the
Speaker 4:Yeah. And they've built a lot of stuff since then as guardrails. But that just shows you, like, early on people pushing these systems to their limit. And they can have catastrophic effects if you don't put up these guardrails. Yeah.
Speaker 4:So, that's what the talk's about. We're doing a lot of innovation and and research here, but it's, it's gonna take a while to get right. Yeah. Yeah.
Speaker 2:Well Awesome. Amazing to finally have you on the show.
Speaker 1:Thanks so much.
Speaker 4:Long time. Fan screech. Yeah.
Speaker 7:Yeah. Have you
Speaker 2:in person.
Speaker 3:Come back on again. I will. Take care.
Speaker 1:See you. We'll talk to you soon. Awesome.
Speaker 2:There are a lot of posts here in the timeline that I want to share that we can't Timelines. We can't I mean, I'm We're not paying them up. Being held back by I wanna just come back to them tomorrow. Okay. Held back.
Speaker 1:Tomorrow, you have our word. We're doing lots of timeline. If you're new here, leave us a subscription. Follow us on X. Follow us on LinkedIn.
Speaker 2:Subscription everywhere.
Speaker 1:Yep. Sign up for our our newsletter, tbpn.com. We bring you the news in text form. Yeah. Well, it has been a fantastic day here in San Francisco.
Speaker 1:Thank you
Speaker 2:to Is everyone stunning out?
Speaker 1:It is. I'm gonna go try We gotta go hunt down a blimp. Gotta find out what is not the Gemini blimp. I'm telling you. It's Sergei's blimp, his personal blimp.
Speaker 1:It's not a it's not a Gemini project. It's not a Google I thought
Speaker 2:you said it was branded. No. Thought you
Speaker 1:said Sergei. Was Okay. So been funding Okay.
Speaker 9:Blimp company.
Speaker 2:I They're testing So funny because I sat I sat down with Logan and the Gemini team. We were just talking about marketing ideas, and I was like, the obvious thing that you should do is get a blimp, wrap it with Gemini branding, and just fly it around, San Francisco.
Speaker 1:This founder Moe Fae. Sergei is on top
Speaker 2:of And and we were talking. Was like, okay. Finding a blimp. I was doing some research. There's like six active blimps.
Speaker 2:I was like, man, this is gonna be hard to find a blimp that can get to SF that can be wrapped. And of course, Google, incredible foresight from Sergei to create a beautiful billboard in the sky. Yes. That's just waiting for, branding. But,
Speaker 1:Waiting.
Speaker 2:Well, super fun day. A surreal moment. A lot of fun. Talking to You had a great time. One of the greatest living CEOs.
Speaker 2:Yeah. And
Speaker 1:Thank you to everyone on the Microsoft team. Thank you to everyone, on the GitHub team who helped organize this. Thank you to our sponsors, ProFound Linear, numeralhq.com, sales tax on autopilot. You got a nice bim.ai, the number one AI agent for customer service. Adio, of course, customer relationship magic.
Speaker 1:Aid sleep. Didn't sleep on my aid sleep last night. Can't wait to get back to it tonight. Of course, we mentioned public.com. Also, adquick.com.
Speaker 1:Getbezel.com. Your bezel can see it was available now. So we're Jared
Speaker 2:Palmer had a nice I
Speaker 1:saw it. Saw it. He had a nice Yacht Master. It was good.
Speaker 2:That was looking good. We And, got of course, circle wander. On that.
Speaker 1:Find your happy place. Book a wander with inspiring views. Hotel great amenities. Dreamy beds. Top tier cleaning in twenty four seven concierge service.
Speaker 1:It's a vacation home, but better.
Speaker 2:Thank you, folks. We will be back in Hollywood tomorrow. The podcasting will continue.
Speaker 1:11AM sharp Pacific. Cheers. See you then. Goodbye. Bye.