Hallway Chat

Open format Q&A this week. Fraser and Nabeel explore AI data privacy, the ethics of copying features, and maintaining innovation. They discuss enterprise data challenges, the importance of a strong product identity, and strategies for early-stage startups during fundraising season.
  • (00:00) - Open
  • (00:53) - Q&A Session Kickoff
  • (01:14) - The data you gather is your roadmap
  • (14:03) - The gravity of slowing down in your startup
  • (21:20) - Your quarterly goal: Something Fundamentally Changes
  • (21:57) - Low Risk, Low Outcomes
  • (24:31) - Large Organizations and Mediocrity
  • (30:58) - When to steal a feature?

What is Hallway Chat?

Fraser & Nabeel explore what it means to build great products in this new world of AI.

Two former founders, now VCs, have an off-the-cuff conversation with friends about the new AI products that are worth trying, emerging patterns, and how founders are navigating a world that’s changing every week.

Fraser is the former Head of Product at OpenAI, where he managed the teams that shipped ChatGPT and DALL-E, and is now an investor at Spark Capital. Nabeel is a former founder and CEO, now an investor at Spark, and has served on the boards of Discord, Postmates, Cruise, Descript, and Adept.

It's like your weekly dinner party on what's happening in artificial intelligence.

Q&A: Is NotebookLM an exception? When to copy? Small bets vs big bets?
===

[00:00:00] Open
---

[00:00:00] Fraser Kelton: My strong hypothesis or belief, worldview, my worldview, it's like a religion to me, is that you can do ticky tack type stuff to reduce the risk of those things in the upper right quadrant that have high reward.

[00:00:16] Nabeel Hyatt: I don't want it to be like eBay and just basically be the same for the next 25 years with just more cruft.

[00:00:21] Like, I want it to keep innovating.

[00:00:23] Fraser Kelton: They clearly have taste, and for you to ship something with taste at a large organization takes, like, a lot of chutzpah and, like, great work.

[00:00:31] Nabeel Hyatt: There will be particular types of use cases where their own model development is going to wildly outperform anything out there in the world because they have the context and the data.

[00:00:38] Except unless if they don't have the context and the data.

[00:00:42] Fraser Kelton: Every time I go into the Instagram search and there's an Ask Meta AI experience... oh my gosh. Oh no. You had FOMO.

[00:00:53] Q&A Session Kickoff
---

[00:00:53] Fraser Kelton: Welcome back everybody to Hallway Chat, it's Fraser.

[00:00:56] Nabeel Hyatt: Hey, I'm Nabeel. Welcome. We're going to do a little Q&A today on some of the things we've been handling over the last couple of weeks. I think that's the format because, honestly, these are the hallway conversations we're having right now. We're in the midst of fundraising season and the midst of board meeting season.

[00:01:10] Just trying to help companies. So that feels like the right thing. Sound good? Let's do it.

[00:01:14] The data you gather is your roadmap
---

[00:01:14] Nabeel Hyatt: In our conversation recently with Anil from Meter, we talked about how his contrarian AI take was that the data that you're collecting from your users and the work that you're doing with your users should inform the kind of model building that you're doing.

[00:01:27] That was the nature of the process. It wasn't, look at the one or two pain points that the consumer is having this week and then try to use a model to solve that problem. It was, no, the long-term vision of this company, whatever company you're building, is inferred by the data that you're collecting, because that's going to be the area where you have the largest leverage over time.

[00:01:49] So how do you connect that with all of the data privacy concerns that people have in almost any enterprise context? So, like, I have a company I'm working with, Fraser, that works with enterprises. It's prosumer to enterprise. We'll leave out the name for now; it's not that important, because I think it's a common problem.

[00:02:09] Now, if consumers are using it and enterprises are using it, the first thing that's going to happen in the enterprise conversation is the guy from IT calls and he's like, we've got 10 people in our org using this, and the first thing I want to say is I don't want any data to be used for training, right? And the right answer from a sales perspective is, of course, of course, of course, we don't use your data for training.

[00:02:32] And obviously, if you say that, you should stick to it, you should do it. But the right thing for the company is obviously not that, right? If we want these models to get better, then they need to understand how users are using the product. And so there was the beginning of a conversation in a board meeting I was just having recently where the conversation started from objection mitigation.

[00:02:53] Where do we put the disclaimers about data privacy and make sure that we're not training on their data? And I was like, listen, if this thing is going to be 100x better in four years, then you should flip this on its head. How do you get people comfortable with data and data training? And, I don't know, have you thought about this?

[00:03:12] Have you come across this? I'm sure this came up with some of your portfolio companies. I'm sure it came up at OpenAI. And frankly, like, as a class of companies, what do we do about this? It would suck if we all implemented great AI products that then could never get smarter. Right? Maybe.

[00:03:30] Fraser Kelton: We had that long held view at OpenAI.

[00:03:34] And then the moment we started to get use and attention, I think the very next day, Sam quite publicly said, nope, we've changed it: we are giving you an option so that we will never train on your data. Right? I think the simple way of thinking about it is that the product is good enough today to serve those 10 users within that enterprise where the, you know, the technology officer or whomever is concerned about privacy.

[00:03:58] You didn't need to train the model to get in there in the first place. I would be surprised if the quality of the model is what's keeping them from growing from 10 to 20 to 30 users across that enterprise. Right. And so, like, my historical bias is, don't, don't worry about it for now.

[00:04:18] Nabeel Hyatt: Maybe I have something I'll learn there.

[00:04:20] My worry is... it's not that the data is, you know, the data is valuable and I want to... you're right, the model must be good enough to produce a product that's good enough that these customers care in the first place. Great, right. But aren't we in the business of thinking about how none of these things are static?

[00:04:36] Like, you want all these models to get better and you want these models to serve you. I'm not even talking about value to the customer or reselling the data or anything like that. I'm just talking about, like, if I'm a customer of a product right now, and it's an awesome AI model doing great stuff.

[00:04:51] Right. I personally want it to be 10x better in four years. I don't want it to be static. Yeah. I don't want it to be like eBay and just basically be the same for the next 25 years, with just more cruft. Like, I want it to keep innovating. And in a model development world, that means they have data. And the ways I can imagine solving this are, and OpenAI has done a version of this, you know, oh, we have a free tier and we have a paid tier, right?

[00:05:16] So we're going to train off some free-version set of users. The interesting thing in an enterprise context, though, is that not all data, of course, is equal. What you could end up with is, every enterprise that converts and has 50 users, you're not training on their data, which means that any use of the product that is team oriented, you just don't have the information on, right?

[00:05:39] If you just think about, like, the, uh, I'll just take a random company: Slack was one of the first, like, bottoms-up SaaS companies, right? This random person in the org adopts Slack, and then the kind of trigger for payment and moving up to IT was when it got outside of your expense report.

[00:05:57] So, you know, when we adopted Slack years ago, in the very, very, like, first year of the company or whatever, you know, I put it on my card. I didn't want to have a conversation with anybody about it. I just put it on my card, got it in, and then it was a small-user product. At some point, it starts growing virally internally, bottoms up.

[00:06:12] It triggers, and now I've got to talk to somebody about some expense report and getting it filed and getting admin rights and all the rest of that stuff. That's very typical bottoms-up SaaS. If every single org that converted and was viral and used it en masse suddenly was off limits, if the data was not able to be trained against, you have polluted data, right?

[00:06:32] Fraser Kelton: Yeah, I, I get it. And, um, you know, listen, my, my concise, simple answer earlier... you know, rarely, rarely is there a simple framework that fits all use cases. But there's a bunch of different things going on here: do they require training on the data to create a model, or can they keep the data and do retrieval when it makes sense?

[00:06:56] Like there's all sorts of different things there. And then, you know, the other thing that I would come back to is I think anybody who has bumped up against this friction within an organization understands that there's malleability to it. What do you mean? I was at Airbnb. We had a whole bunch of good policies that were there for good reasons.

[00:07:17] And with one project, I was able to railroad my way through every single policy just to get it done. And it was like the degree of importance to the organization trumped some of these concerns, and we were able to just find a happy path forward through all of them. And so, like, one of my questions is: is the value high enough? That's really the thing.

[00:07:38] Like, is the promise to the end user's experience good enough that they're willing to have these discussions? It feels theoretical at this point, like, hey, in four years we need to get a hundred X better. Nobody internally is able to go and have that discussion today.

[00:07:52] Nabeel Hyatt: Oh, that, that, that's exactly what worries me though, Fraser, right?

[00:07:54] What worries me is you have like, take the average YC company or whatever. Let's just say right now they're an applications company. They're building something awesome for enterprise or consumer. They're probably not doing model development day one. Right. They're, they're a proverbial wrapper on top of other technology, which by the way, I'm a big fan of.

[00:08:09] Fine. All good. The issue is, in the vein of the Meter conversation, my contention is that they will be doing model development at some point in the future. Like, they're three people right now; they're going to take ChatGPT as it is, they're going to run it through a bunch of stuff, they're going to use their own data.

[00:08:23] Fine. If they get to any size and scale and they're suddenly at a hundred thousand users or a million users or whatever it is, they will not necessarily stop using Claude or stop using ChatGPT, but there will be particular types of use cases where their own model development is going to wildly outperform anything out there in the world because they have the context and the data.

[00:08:43] Except, unless, if they don't have the context and the data. As an industry, I hope we can find better knee-jerk reactions than, don't worry, we won't train. Because, you know, people are just throwing up those random terms and conditions. I don't think it's healthy for these companies long term, and I don't think it's good for consumers long term.

[00:09:02] We want to keep innovating on models.

[00:09:04] Fraser Kelton: I wonder if that softens over time, though, right? Like, I wonder if that's the knee-jerk reaction in the moment, and either we realize it's not as acute as we once thought, or solutions appear that allow us to overcome the obstacles that people have with that.

[00:09:23] Nabeel Hyatt: Let's brainstorm for a second, because what's going to happen is, I'll put it this way: either in two years you realize you want to do models, because now you're at 10, 20 million ARR and you want to build the next set of features, and so now you're revoking terms and conditions and changing things on users, which is going to piss off a lot of people.

[00:09:39] Maybe it just happens that way and it's fine. Or maybe that causes a blowback and you churn 10, 20 percent of your users and you're in trouble. The other way is, we just think about that conversation a little bit ahead of time. What we're really talking about is just, like, how to put this...

[00:09:56] Were you trying to get there earlier, Fraser? Like, some of this is just how you word this to a customer, right? Instead of saying, no, we will not use your data for training, you can have a little checkbox in the product that says something like, we will never send your data to any third parties for training.

[00:10:16] Right. So, like, we're not going to send it to ChatGPT. Great. And then maybe you add a second checkbox. I want nuanced permissions. I want a conversation that meets what you're saying, which is, I want three or four permissions that a consumer can have, or a buyer can have, control over, because I honestly think it's in their best interest.

[00:10:37] Like, if you have the right level of conversation, then obviously a customer actually wants you to train on their data. They do. They just don't want, say, if you have their financial data, you don't want to take the fact that, you know, they're doing 4 billion a month, and have that get polluted and get traded and sent to somebody else.

[00:10:55] Like, that's the part they don't want. They just don't want really private, non anonymized information to be used somewhere else. If it's in service of making the product better, I think most customers would be totally down with that. It's just finding a way to word that in a way that doesn't sound scary.

[00:11:17] I guess. I guess I assume there's a language for it in two years that we just haven't come up with yet. What would we say? What would you say to a customer to be like, dude, it's okay?

[00:11:29] Fraser Kelton: Oh, I'm not going to wordsmith on this. This is like a cathartic experience where, like, you have to find the five words that all make it make sense.

[00:11:34] You know, the other thing that you said at the start of this, though, is that, like, you went back to the Meter conversation and talked about how they're using the data to understand what their users are doing. And I would say that that feels like the P0 in this discussion. If they can't actually look at their users' data to understand how their users are interacting with the product today, that feels far more acute and pressing of a problem than, like, how they're going to retain the right to use the data for, you know, augmenting a model in the future. Because, like, I was listening to you very carefully, and I agree we should have a vision that helps us see around some corners years into the future, but my guess is this company is not at 20 million in ARR right now, and it's going to be, like, a small miracle for them to solve the product and go-to-market problems to get to that point. And I'm kind of in the camp of, like, spend some amount of time to make sure that once you get to that point you're in a good position to then, you know, go beyond it, but don't focus on the wrong thing today.

[00:12:36] Nabeel Hyatt: Ah, such good advice. It's a common truism. Like, your point is, you can spend a half an hour on it, but don't spend a week on it. Large companies get to run things in parallel, and a very hard thing is that early-stage companies really have to run things in serial. That's right.

[00:12:55] Fraser Kelton: Yeah. And, like, listen, I get it: it might be tomorrow's problem that you've created by not dealing with it today, but that's a problem plenty of companies would take, because you're alive, you're alive to see it. Yeah, that's right. But I get it, though. My guess is that there's a lot of value that they can squeeze out, and growth, beyond having to solve that.

[00:13:19] I think there are trends such that our response, this knee-jerk response of, whoa, this is really risky and dangerous, either we will soften on that or there will be products that help us soften our perspective on that. And then, like, as you said, there's probably

[00:13:37] some wordsmithing and permission work that can solve it. There are probably some discussions with your customers that can solve it. And then also, I don't know, like, they're not training on it today, right? But retaining the data for other reasons sets them up to not be completely behind the ball in a couple of years, if and when they get to the point where they want to do that.

[00:14:02] Nabeel Hyatt: Let's go back to focus for a second.

[00:14:03] The gravity of slowing down in your startup
---

[00:14:03] Nabeel Hyatt: In working with founders nowadays, you know, we're generally getting involved... we can write a 1 million check or a 50 million check, it's a wide range, but we're generally getting involved at what I would call, you know, post-launch, pre-scale. It's before it's all been figured out; there's lots of stuff that's broken.

[00:14:23] But you finally, like, got a thing that people can play with and use, and there's something to that, even if there's no revenue or a small number of users or whatever it is. And there's a feeling to that that I have always come back to, which is that feeling when you just launch something, or just finish a round, or just do something, where you have the desire to exhale and then go look at all of the spring cleaning of bugs and tiny things that are around.

[00:14:57] Do you know the emotion that I'm talking about? Yeah. And it's probably the emotion that happens literally after we write every check. It's that they just sprinted, they probably just got a big release out, they also went through a fundraise, and then you get to the other side, and the team has a kind of emotional center, a feeling of just, slow down, and also clean, and also, I don't know, be a little unfocused?

[00:15:24] And your comment earlier about being focused and having priority. One, have you come across this with founders as well? And two, have you found any way to help navigate them through this?

[00:15:36] Fraser Kelton: I was at an offsite for an investment that I've recently made that hasn't closed yet. And so let's leave it at that.

[00:15:47] And they pulled me aside and they said, hey, listen, as we go into the next five or six months... They've just raised, they've just had a great set of releases. And they said, you know, we're not going to focus on any of the metrics that you just invested into. I said, okay, tell me more. And they said, because I think there are, like, clear paths for us to

[00:16:13] increase the top-of-funnel conversion by 5 percent in the next month, and I think there's a way for us to get ARR up by 20 percent over the next, you know, two weeks. I'm making stuff up, right. But, uh, they said, we think the real opportunity is to go in this direction, which is a larger swath of work that is building on top of the foundation that we put in place today.

[00:16:44] And we're just going to ignore everything else.

[00:16:47] Nabeel Hyatt: The opposite of what I just said, which is not the ticky tack, but the, I want a bold bet. I want a big... I want to really think long term and I want to make a big thing that's going to take a long time.

[00:16:58] Fraser Kelton: And so, like, I raised that because my feedback to them was two enthusiastic thumbs up.

[00:17:05] Nabeel Hyatt: I'm glad you're not telling me which company it is because then I can ask you in the abstract when really of course the answer is in the particulars.

[00:17:10] Fraser Kelton: Of course. That's why it was two enthusiastic thumbs up: because we're not excited about the ticky tack improvements that they're going to make in the next couple of months.

[00:17:18] We're excited about where this is going to take them over the next couple of years.

[00:17:23] Nabeel Hyatt: That's right. Although I have to say that your framing, I don't know how big the big bet is, but your framing also has me worried. Abstractly, like,

[00:17:32] Fraser Kelton: I'll tell you. Oh, that's so great that you raised that because I told them very clearly that I wasn't worried about what they wanted to do, but the risk was in how they were going to go about doing it.

[00:17:42] And we spent a couple of minutes understanding their thought process on the latter there. Yeah. Which then made me feel good, right? You know, clearly the situation that's a failure in that world is that you put your head down, you don't talk to customers, and you think that you have, like, this grand vision, and then five months later you ship it and you've built exactly the wrong thing.

[00:18:02] That's right. Right?

[00:18:04] Nabeel Hyatt: Yeah. My phrasing there is always: the amount of effort that you put into shipping something without getting feedback from users should be directly related to how convinced you are that you know exactly what users want. And the truth is that you probably don't fully know what users want, especially if you're an early-stage company.

[00:18:26] And that's a process that takes years and years to get a nuanced understanding of the pain points of your customers and why they're coming to you and why they're telling their friends about you and why they're paying. So the size of the bet should match that at least before you get signal. It's not that you should only work on small things, right?

[00:18:42] It's how quickly can you get signal that you're on the right path?

[00:18:45] Fraser Kelton: Well, that last part is super important, right? The nice thing is that your concern is also easily mapped to the last thing that you said, which is, if you're at the start of that journey, you can get signal fairly quickly on some of the larger unknowns.

[00:19:02] The extent to which you are refining your hypothesis should probably also match the degree of uncertainty or the risk, right? And so when you take that first step toward that product, there's probably some very sizable things that are unknown that will determine whether or not you're even heading in the right direction.

[00:19:27] And the nice thing is, like, you don't need a full, refined UI that's pixel perfect to be able to get signal on that. You can probably go and get signal with a very small experiment of some form or another. Even "data" is, like, a slight exaggeration on this, right? But, like, even to give you some intuition, among a sea of ambiguity, that there's signal here.

[00:19:48] And then I've been thinking of it as, like, concentric circles, or a bullseye that you narrow in on: the degree of precision in your experiments, and the signal that you're getting to cut through that ambiguity, increases as your certainty increases.

[00:20:02] Nabeel Hyatt: Yeah, I mean, the nudge of the discussion that I had recently, the way that my version of that discussion just ended with a company, was just, it's the old, you know, risk versus

[00:20:12] reward rubric, which is, you have an XY graph, and the bottom left corner is things that are small experiments that are also low risk with low potential outcome, right? That's the bottom left of that graph. And then the top right of that graph is, you know, things that are really high risk with high potential outcome, right?

[00:20:33] And the trap of a startup is people get allergic to risk very quickly. They feel like they need to win. And so they end up doing things that are low risk, that are probably low outcome. Because let's be honest, if there was a low-risk bet that had really high outcomes, that'd be wonderful. And it happens, like, twice in a startup's life, but there are probably not five of those sitting in your pipeline.

[00:20:57] You're kidding yourself. The low-risk bets are probably, you know, on the bad end, right? They're probably not gonna move a number enough. They're probably not gonna change your business. Like, that's right. I had a board meeting this week where, at the end, we kind of all agreed that a lot of work had been done.

[00:21:13] These people are grinding 24 hours a day. Like, it's constant. But since the previous board meeting, basically nothing had happened in the company.

[00:21:21] Your quarterly goal: Something Fundamentally Changes
---

[00:21:21] Nabeel Hyatt: Like, a lot of work had been done, but when we tell the history of this company five years from now and it's wildly successful and awesome, we're going to look at that quarter and be like, nothing changed.

[00:21:30] And for a very early-stage company, you can't go a quarter with nothing changing. It doesn't mean you had to ship something, to go to your point, Fraser. It doesn't mean that the only things you put out there are little ticky tack things that are viral growth experiments, or whatever it is you can get out in a week.

[00:21:47] But it does mean that, like, your fundamental understanding about the market or your customer... you should be learning so quickly that you just have a different insight about this business, right?

[00:21:56] Fraser Kelton: Yep. Yep. Yep.

[00:21:57] Low Risk, Low Outcomes
---

[00:21:57] Fraser Kelton: And actually, so if I can go back and try to tidy up what I said earlier with the rubric that you just described: you can spend a small amount of time on the small features that are going to have a small effect.

[00:22:09] And, you know, as you said, ticky tack. The big sin and the big risk is that you go and you spend five months on the stuff that has high risk, high reward, and then you, like, ship it and you're like, all right, let's see if we land it. My strong, like, hypothesis or belief, worldview, my worldview, it's like a religion to me, is that you can do ticky tack type stuff to reduce the risk of those things in the upper right quadrant that have high reward.

[00:22:35] Nabeel Hyatt: Yes, those things can be related.

[00:22:36] Fraser Kelton: And go do, like, not even a week, go do two days of really bad experimentation, and you can find a way to nudge that risk down a little bit, so that when you actually embark on shipping it, you have higher and higher degrees of certainty. Yeah. To the point where, when you've then introduced it, it's going to actually be good.

[00:22:56] Nabeel Hyatt: You know who's doing a really good job of this right now? Uh, Descript. Andrew's been doing this really well; it just came to mind. I won't describe what it is, but they have a big, bold bet on a product that is related to Descript and is connected to Descript, which I won't explain because I'm going to let him talk about it in the future. But they've been working on it, well, you know, off and on, I'd say for, like, nine months, small team, sometimes it gets turned off, sometimes it gets turned on. And it's a great example of this: if and when it works, it would change how you use Descript in a really fundamental way, it would be super crazy, and they never over-indexed.

[00:23:29] And by the way, it's a wildly different idea than when it started. Like, I remember I was at a product offsite with the team where the idea came up and, like, you know, your hair's blown back and you're like, awesome. And then they go and prototype and work on it a little bit and so on and so forth, test and look with users. And if I explained to you what product A was nine months ago and what product B is, like, where it is right now, they're not even related.

[00:23:53] You would be like, how are those two things in the same conversation? And it's like, no, no, no, we're walking the path. And that doesn't mean we worked for nine months and made the thing. It means we found ways to stay in touch with users all along the way and try and learn.

[00:24:05] Fraser Kelton: I love it. You know, I, I don't think I've said this to you at all.

[00:24:09] Andrew is one of my, like, founder heroes. He is the type of founder that I think founders should look up to, in the fact that he does it because he just needs to create, not because it's, like, a status thing or anything like that. Um, it's just, he's such a thoughtful... I don't know. He's a good founder. I'm a big fan.

[00:24:31] Large Organizations and Mediocrity
---

[00:24:31] Fraser Kelton: This, this whole topic is actually kind of, like, a core belief of mine as to why I think large companies, for the most part, haven't shipped anything great. Like, I can't think of a large company that has shipped something great where you have to go and explore and solve the mystery, in your parlance, rather than just, like, incrementally deliver great improvements at scale to something that already is working.

[00:24:55] And that's because you're planning for six or 12 months, and you can't go and run these ticky tack experiments to try to reduce the risk, to figure out what it is that you should build, and then go and build it.

[00:25:08] Nabeel Hyatt: I think there are two problems in large orgs that we're going to have to, you know, talk about with a broad brush, but I think it boils down to earned trust and the budget allocation process, right?

[00:25:20] It is what gets you promoted and what gets you more money to work with. And if you just think about the startup ecosystem and the way that works, think about it the same way. I'm the CEO founder of a company. How do I get more money and more time? Over the course of the next 18 months, because I probably have 18 months before I'll have to raise again,

[00:25:40] I have to produce something seismically different. Right. And so maybe I fumble around for two or three months and I do some bug fixing, and I come up for air, and I very quickly realize, oh crap, like, that's not going to be enough. Like, I need to be in an entirely different universe. And by the way, I need three or four or five months to raise.

[00:25:59] So I have seven more months now to do something dramatic. I really do think the venture structure is a good structure for generating innovation, with all of my issues with venture capital and all the other stuff. And then, and then try to map the same thing internally. It never works that way. Like, even in situations where you have budget allocation processes.

[00:26:16] One, how many people do you have to get approval from? You're trying to get buy-in from, like, 15 different executives. It's terrible. Two, you can say, hey, you've got a year and a half to work on this thing, but we all know that then, like, three months or five months later, there's a reorg that goes across the entire company and you get sideswiped because of it, and so forth.

[00:26:34] Or the inverse happens, which is like, you basically effectively have tenure. Like, you are in the org, you have earned your stripes, you've been at Google for five years, you can basically do whatever. And like, that's not the right pressure point either for taking that level of risk. It doesn't generate the right outcome.

[00:26:50] I will push back, though, on the idea that we've never seen good products from internal teams at large orgs in AI. Because we would talk about NotebookLM as a good product. It's an early product, it's a seed-stage product, but a good product. Ironically, I think it's because they had a loose approximation of a startup thought process there.

[00:27:12] Like, when you talk to that team about what they were doing, how it came about, you know, it was, it was an experiment inside of Labs. It was really structures that Clay Bavor, who's now at Sierra doing a startup, set up, where he was trying to pair people from the industry with people inside.

[00:27:28] And so, in a way, I think he did a half-decent approximation. It will probably all die immediately now that Clay's not there and Google will do Google things. In other words, I don't think there are going to be five more NotebookLMs popping out of Google next month. But it is a good example of, like, that worked, right?

[00:27:42] I don't know if it'll keep working, but that's a decent product, right? And it was probably because, in many ways, there was a loose approximation. Nobody on the NotebookLM team had enough power to think that they were going to last forever and had tenure. It was a small, tight team of a handful of people working with somebody who was a customer that's internal to them.

[00:28:00] Like it was as close as you can get to a loose approximation of a seed fund investment inside of a large org. Oh, you don't like that.

[00:28:09] Fraser Kelton: No, no, no, listen, uh, I, I have become a fan of that team. I forget the individual's name on the product side. She's taken to Twitter and it's just, it's just like, it's great.

[00:28:20] It's great content. Undoubtedly, great product decisions have been made. Um, and I'm speaking particularly about the podcast experience; I forget what it's called. I don't know if it's a good product, right? I think there's a big difference between a good product and a good product experience for a feature that is part of a broader product.

[00:28:41] I don't know. Are you a daily active user of NotebookLM?

[00:28:44] Nabeel Hyatt: I'm not. I have not had that many podcast situations. Although I would definitely think to go use it for podcasts, they don't have a DAU solution, and the nature of most consumer products is that consumers remember things based on either big, huge emotional triggers in their life.

[00:29:00] Hey, I'm getting married, I've got to go SEO myself and find which wedding website I should use that one time in my life. Or it's daily. Like, it's either a huge external seismic event, or it really is habit, right? And outside of those two, nobody does things monthly on the internet as consumers. You just forget those things.

[00:29:16] Fraser Kelton: And, like, listen, I think we'll look back in maybe a year and we will have an amazing product and we'll say, hey, listen, if you had squinted at the NotebookLM podcast feature, this was all foreseeable. That's so right. And they clearly have taste. And for you to ship something with taste at a large organization takes, like, a lot of chutzpah and, like, great work.

[00:29:35] Uh, right? And so I don't mean to take away any of that. I think that we have had a great, novel experience that has really illuminated what's likely to happen in the future. But other than trying the podcast feature a couple of times, I'm like, I'm not converting to a regular user of NotebookLM.

[00:29:55] Nabeel Hyatt: We're having, like, a good deep product-development-life of a conversation here, which is probably indicative of the fact that we are coming out of a lot of board meetings where we're having a lot of these conversations. So this is literally our hallway chat right now. It's what we're trying to process. It could be that, or fundraising, because it's fundraising season.

[00:30:09] So we could do that as well. Fundraising, it's busy season. So, um, yeah, you're right. And there are two things that are important to that product, NotebookLM. One is the uploading of a repository for RAG, which I think already exists in things like Claude Projects, although it didn't. You know, there weren't a lot of things a year ago that were doing that.

[00:30:33] And so there's a first good-taste, excellent surface there. The second one is the podcast product, which is really, I think, remarkable. But you're right, probably someone else will steal that feature and iterate on that feature. And can we talk to those startups instead of to Google? Since we can't talk to Google: when is the right time to steal a feature?

[00:30:57] Fraser Kelton: So that, that's a great question.

[00:30:59] When to steal a feature?
---

[00:30:59] Fraser Kelton: When to steal a feature? First of all, what do you mean by that? I mean, I think it's obvious, but give a little bit of color.

[00:31:07] Nabeel Hyatt: This is everything from looking at the front page of a competitor's website, feeling like a call to action was really well worded, being like, man, I wish I had come up with that myself, and just doing it, up to, you know, Zuck, who is probably the most, you know, lamented public figure for unabashedly being like, TikTok is a thing.

[00:31:33] Like, we're going to launch TikTok inside of Instagram. Just anytime he sees a thing, he's not shy. One side of it is, it is shameless and feels like it lacks taste. And the other side is, it feels like it comes from a place of low ego, strangely, right? High ego is, I will never, ever, ever learn from anybody else, because if it didn't come from my wonderful, beautiful, curved brain, then it's not worth it.

[00:32:04] But there's something that feels smarmy and terrible and lacking in taste if all you're doing to try and figure out what you're supposed to do next is stare at what other people are doing and then copy those things. And so there's finding that bit in between. So steal, in my steal example, from my previous experience, it was social games, with Conduit, and how things behaved in that environment, and, um, FarmVille has partly gotten lost to history, but FarmVille was a pretty direct copy of Farm Town, right?

[00:32:37] And FarmVille was, you know, for a while, I don't know if it still is, the fastest-growing, you know, consumer product, like, ever on the revenue side, probably up until ChatGPT, to be honest. And yet it was, I mean, not just, like, a copy; that was pixel by pixel, right? Like, everything was in exactly the same place, and there's a part of that that feels not great.

[00:33:05] So stealing, I'm using the pejorative term, versus learning, uh, you know.

[00:33:11] Fraser Kelton: Listen, um,

[00:33:13] Nabeel Hyatt: great artists steal. Yes, yes, good artists make and great artists steal. But there's something about that that doesn't feel right either, right? When do you steal? You have a product. I'll put it this way.

[00:33:23] I'll word it differently. You walk into a board meeting. You're sitting down with the founder. They've got a product roadmap. You just talked about it. They've got five months. They're going to go work on a thing. That's their big bet, and kudos to them for not just being ticky tack. But in AI especially, we're in a world where

[00:33:38] everybody else is also experimenting at the same time. This is not you off somewhere else. And we know that all of these affordances, all of these features, are rapidly changing. So what do you do as an org? You've got five months to work on a big, awesome thing. And then, like, let's say for the sake of this argument that it's podcasting in nature.

[00:33:56] It's loosely audio in nature as a startup. And then, two months into development of some new fancy thing, which is the next thing you think is going to help you raise the next round and grow, and it's what you usually want, you know, NotebookLM comes out, and you wake up one morning and you look at it and you're like, crap, that's good. What do you do?

[00:34:15] Fraser Kelton: I think if you have a North Star as to what you're doing for your users, and something that clearly, clearly resonates in an outsized way and is supportive of your North Star, like, furthers your effort toward the North Star, you just prioritize it and you do it. And so in this case, like, if you are trying to, you know, simplify the creation and consumption of audio content for your end users, and, like, that's what you care about,

[00:34:58] there are aspects of that feature that you happily take. I don't think you take the NotebookLM product, right? You don't say, okay, let's go ship that. You take the stuff that really resonated, like the product work. They did great product work on that. You can call it product, product design, but, like, the fact that there were two hosts that had a rapport, that they went back and forth, and there were, like, affects built into it, all of that stuff

[00:35:23] is great. And you just go in and you take that and you have an opinionated view on what's actually going to work for your audience and your North Star, and you happily leave the rest.

[00:35:33] Nabeel Hyatt: Uh, I worry that what you said will come off not as profound as it really is. Like, I actually think that's really

[00:35:45] Fraser Kelton: I'm always either left or right on the curve.

[00:35:48] I'm always left or right on the curve.

[00:35:50] Nabeel Hyatt: No, I, I... because I think we have had lots of conversations about what we see in really great product founders, and the behaviors, the patterns you pick up, even though they're wildly different and they do different things, they lead different ways, they have different backgrounds.

[00:36:08] But I think one of those patterns is having this feeling of a North Star that isn't fixed, that is, how you get there is unknown. Sometimes the unbelievably brilliant founders, the people that are trying to play chess with a startup, have a really hard time. They want to work the whole path out, uh, ahead of time.

[00:36:30] And that's obviously not the right answer, especially in a world where the technology is changing constantly, and you might wake up tomorrow and a competitor's come out with a good thing and you couldn't learn from it, blah, blah. You can't, you certainly can't path-plan it all out. But similarly, you have to know where you're headed, and not just broadly, in VC-speak land, that we operate inside of this market with this TAM and that kind of bull, but, you know, what is the emotional resonance, the deeper, deeper problem that you're trying to solve?

[00:37:01] And then, does that feature that you saw out in the world that got you excited, does it fit that mission or not? You can deviate on product strategy, but not deviate on mission. Without talking about how kind of gross it feels when somebody copies something pixel by pixel instead of having a point of view on it, I think maybe part of the issue, when we react negatively to copying, is when it feels like a founder is running at whatever is hot because it feels like they don't know who they really are.

[00:37:30] Yeah. That's a moment for pause. It's not about the, did I learn from another startup or not? It's, why did you decide to get interested in it? Because if it's, like, you think it's just going to solve a user growth problem, but it's a little off-piste if it's just because it's going viral on Twitter right now and your friends are talking about it all the time, that is different than putting it through the filter of, what is the emotional resonance of this product?

[00:37:54] I always think about the one I like as this example, because I think a good example is, like, Evan at Snapchat in the early days there, and just, like, having this emotional insight of, I'm a teenage girl and I want 15 people to feel like my BFFs. And in a group text, it doesn't feel intimate and authentic enough if I'm just texting them all.

[00:38:17] And so you need this one-to-many relationship. Because if I text every single one of them individually and maintain 15 relationships, it's just a lot of time. So a lot of work. And a lot of work. And so, like, the core is this feeling of intimacy. And that's such a business-y way to talk about the way Snapchat manifests itself, because the way it manifests itself is deeply intimate and wonderful, but it comes from a North Star of understanding what value you're giving to your customer and then letting everything else fit into that.

[00:38:50] And so then you decide like, Oh, well, in that context, does a feed make sense? Should I copy the feed? Like, obviously not. That, like, decreases authenticity. Or whatever, does a TikTok environment make sense? And the answer was like, no, not for what Snapchat is, but maybe we need to start a new product on the side.

[00:39:09] Right.

[00:39:10] Fraser Kelton: Yeah. My favorite example, it's, it's overly simplistic, but it's so crisp: Reels in Instagram feels like it is a great extension of the emotional experience that we have. And I will tip my hat to you and say you were right. It didn't take a lot of foresight, but every time I go into the Instagram search and there's an Ask Meta AI experience... oh my God.

[00:39:36] Oh no. You, you had FOMO. That is the example of jamming something in because you had FOMO about what is happening and it doesn't fit the emotional core of what it is that you're trying to deliver for your end users. And like those two, those two things I, I happily flip between all the time. I'm like, Oh, I love Reels.

[00:39:55] I didn't know that I loved Reels, and it's a great, great experience in this product. And then I go into this other one and it's like, oh, super jarring. What are you doing here? I thought Kevin at, at OpenAI did a great job. He was on an interview with Sam recently; this is the new chief product officer. And

[00:40:16] he tipped his hat to Anthropic's Projects, and I thought it was so loving, like, 'cause what does that do? That gives the organization permission to look afield and find inspiration. And I think, like, just a great moment of leadership. And I hope that they ship something that's inspired. We don't need to say copy or steal.

[00:40:34] That's a, that's a tough word. But inspired by that.

[00:40:38] Nabeel Hyatt: Well, you called out several pieces of good behavior there. And you're right. Like, first of all, Kevin's just, like, good people. I think he seems to be great there. Not surprising that he would have an egoless version of trying to present that to the team, and it probably will have good output, for the team to feel like they then can come back and say, look at this awesome thing that happened, that happened to not be done inside of this org. Right, and that's what you really want. Yeah, it is permission to build that conversation.

[00:41:09] Yeah. Yeah, I think we should call it here, okay, 'cause I realized I have another...

[00:41:13] Fraser Kelton: Great. Great. This was, this was great. We, we shouldn't go every four to five weeks; we should find a way to do it more often. I know it's easier said than done, but this is great.

[00:41:25] Nabeel Hyatt: Great, great, great. Okay. See you later.