The Circuit

Summary

In this episode, Ben Bajarin and Jay Goldberg discuss the outsized expectations for AI growth and the resulting stock market reactions. They explore the challenges of modeling AI growth, the difficulty of charging more for AI features, and whether AI will accelerate refresh cycles. The conversation highlights how the gains from AI in software are currently small and incremental, and that the industry is still in an experimentation stage. They conclude by emphasizing the need to keep expectations measured and reasonable.
Takeaways

  • Outsized AI expectations have led to negative stock market reactions, as companies have not met ambitious growth models.
  • Modeling AI growth is challenging due to variables such as product availability and demand.
  • Charging more for AI features is difficult, as customers may not be willing to pay a premium.
  • AI may not accelerate refreshment cycles, as the average consumer may not see significant improvements that warrant more frequent upgrades.
  • The gains from AI are often small and incremental, but still important in improving efficiency and productivity.

What is The Circuit?

A podcast about the business and market of semiconductors

Ben Bajarin (00:00.808)
Hello everyone. We are recording this episode on February 2nd, which, because it's Apple Vision Pro launch day, I have now nicknamed early adopters day. This is the early adopters' celebration, or early adopters' Christmas, when those who have to have the latest and greatest are excited and waiting in line, or waiting for UPS as I am. So I am Ben Bajarin. Welcome to The Circuit.

Jay Goldberg (00:30.286)
Greetings, internet. I am Jay Goldberg. I am Vision Pro-poor, apparently, in this relationship, because I am not... I guess I'm not an early adopter. My 25-year-old self would be very disappointed.

Ben Bajarin (00:46.256)
Yes, you're pragmatic in the later stages of life. I am not. So yes, we will obviously talk about that. I look forward to the episode where I have this futuristic face computer on my face in the opening and Jay reacts weirdly to it, which will be fun. Fun to discuss at a component and technological level.

like we like to do on this show. But this episode, so this week, as I'm sure many people who listen to this also follow the semiconductor industry closely, had a number of earnings reports. Some of them related to semis, some of them not. But a theme emerged that I think was interesting. One of the things that I don't know if everybody caught, that I'm gonna point out, is:

Lots of companies had some very, very good earnings, a lot of it driven by AI growth, but their stock was down. There was a sort of negative reaction. And as we talked and looked at sort of the observation of why and just listened to some of the investor calls, it feels like this was a result of what I'm just calling outsized AI expectations.

These companies grew, but they didn't grow enough. And because, and I've talked to a bunch of folks on the street about this, they have these very ambitious AI TAM growth models, right? They've been building these models saying, hey, we think so-and-so is going to benefit from X million or X billions of dollars of revenue over the next couple of years from an AI lift. And sadly these companies didn't meet those expectations. So that's why

I'm coming at this from the angle of AI's outsized expectations and some of the reactions of a number of tickers based on their lack of enough growth: not no growth, not failing to grow or to raise guidance, but just not enough growth relative to those AI expectations. So that's where I've landed. Lobbing that out there, Jay.

Jay Goldberg (03:09.17)
Yeah. So yeah, it's hard. It's hard to talk about semis this year without talking about AI. And I've actually been thinking about this a lot. I published a piece this morning, mostly because I knew we were going to talk about this today. So I wanted to get this piece out, but I've been working on it for weeks. And that's unusual for me. My writing method: usually I write a post in 15 minutes to an hour, one draft, I proofread it,

And I publish it. Like, I've gotten pretty efficient at this over the years. But this one dragged on. And this is like the third version, not just the draft, but the third version of this note. There's other versions that will never see the light of day. It was really hard for me to write this note about AI. Because I realized I was stuck in this weird middle position where I don't want to be the guy who says, oh, AI is just a toy.

Jay Goldberg (04:07.746)
photos, right? I don't want to dismiss it. These are important technologies, important advances. But on the other hand, almost everything I read about AI is just complete nonsense. There's so much hype out there. And it just rattles around in my brain. And I think this earnings season was a pretty good example of that because...

any company that had anything remotely adjacent to AI, if it didn't have enough AI in the numbers, the stock got pummeled, no matter how good the results were. And there are other things going on in the stock market too. Last year was pretty good. The last six months of the year were really good for semis. And so expectations in general are very high. And so it's always hard. That's a bad setup for earnings season.

When everyone is expecting more, you can never quite give the street enough. But definitely there is a strong take on AI. People wanna see AI. And management teams are responding to that, mostly in sensible fashion, and getting punished for being rational and sensible about it. Which means next earnings season, we're gonna have a few companies that go AI crazy in their earnings calls. And I won't name names, but I can think of a few

who have nothing to do with AI, but they're gonna find a way to work it into their story.

Ben Bajarin (05:33.308)
Yeah, and I wonder partially too if that's...

Ben Bajarin (05:38.652)
like an overcorrection, since so many companies believed, right, for the past six to eight months, really more than the past six months, that the more they could benefit from the AI lift, the more it would help their story. And all that did was create these really lofty expectations. I mean, you know, I know a couple of big banks on the sell side whose AI TAM models I've contributed a little bit to. It feels like, again, it's a very, very difficult thing

to model, because you've got a number of variables that are out there. You've got: how many products can this company even make? It's not that there might not be a massive amount of money and demand, but can you even make enough? That was always my first point to keep in mind: look, you think they're going to do X amount of revenue, but the demand might be there and can they even make enough chips?
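To make that concrete, here is a minimal sketch of the kind of supply-constrained revenue math Ben is describing. The unit counts and price below are hypothetical placeholders, not figures from the episode or estimates for any real company.

```python
# Toy illustration of a supply-constrained AI revenue model.
# All numbers are made up for illustration; they are not estimates
# for any real company or product discussed in the episode.

def constrained_revenue(demand_units: float, supply_units: float, asp: float) -> float:
    """Revenue is capped by whichever is smaller: demand or what you can ship."""
    return min(demand_units, supply_units) * asp

demand_units = 600_000   # hypothetical units the market would buy
supply_units = 350_000   # hypothetical units that foundry/packaging capacity allows
asp = 10_000             # hypothetical average selling price, in dollars

print(f"Unconstrained demand: ${demand_units * asp / 1e9:.1f}B")
print(f"Shippable revenue:    ${constrained_revenue(demand_units, supply_units, asp) / 1e9:.1f}B")
```

The point of the sketch is simply that a TAM model built off demand alone can overshoot what a supply-constrained vendor can actually book.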

Ben Bajarin (06:36.468)
Where the budget shifts were going to come from, or the capex shifts, whether from someone like, again, a cloud hyperscaler or an enterprise looking to invest or double down, or a software company that was going to build their own model. You just don't know how fast that's going to ramp. So you're dealing with these assumptions, and everybody has to build these models, but it felt like, going back to this, that they were kind of just numbers in a bucket

without a tremendous amount of logic put behind them. And then that got applied to any, name your semiconductor company's, possibility of lift. And again, it comes back to: they did great. They sold more product. I mean, even with AMD, right? They raised their estimate by about a billion and a half for MI300X, but they didn't double it, right? They didn't triple it, right? And again, part of that's because...

They can't make that many yet, right? But at the same time, it's a complicated problem, and I appreciate that, but you can see kind of this now: all right, well, maybe we should over-correct. And I get that, and I guess that's where we're at right now. Some form of over-correction or rebalancing of our expectations on what the revenues or the financial model for AI is gonna look like.

Jay Goldberg (07:56.042)
Yeah, I mean, AMD is the poster child for this, right? Like you said, they had a really good quarter. They raised guidance. Everything looked really good. But then on the call, like after they printed, people looked at the results, stock traded up. But then after the earnings call, stock was down and it's down for the week. Which is crazy because the results were good. And the root of the problem was two things. One is Lisa Su, very good CEO.

started talking, she gave her estimate for the TAM, total market size for AI, I think it was $400 billion. And she got three questions from analysts on the call. How did you get to that number? How did you build that model? And it was so weird to me because she didn't build the model. She's got other important things to do. Somebody deep in the AMD team, probably a committee of people, built that model and sort of cobbled it

Ben Bajarin (08:32.424)
Hmm. Mm-hmm.

Jay Goldberg (08:55.022)
together with a bunch of reports. It's a long-term TAM model. Those are never super reliable. They're always estimates. But people whose job it is to build models were asking the CEO on the call, how did you get there? Like, help us, what were the mechanics? I mean, they should probably just do a separate call for people who really wanna geek out on it. That would save them the trouble. But it was like such a focus, right? So that was one of the problems. The other big problem they had was MI300.

Ben Bajarin (09:16.928)
Mmm.

Jay Goldberg (09:25.318)
And MI300 is their GPU, AI accelerator monster, which is a really interesting product. It is fairly competitive with what Nvidia has. People have been wondering, since it got announced back in October, how many of these do you think you can sell? And on the call, she said 3.5 billion. And it's this year, like 3.5 billion for a new product

is a pretty good size. It's a pretty good number. And remember, it's January. And having been involved in how companies come up with those sorts of revenue forecasts for a new product, it's not a science. It's a lot of art. How much can we produce? Which customers do we give it to? What's their ramp going to look like? We have to allocate some for people to do demos, for sales that won't happen until next year. How do you allocate that? They don't have a lot of supply. And so 3.5 billion

Ben Bajarin (10:19.825)
Yeah.

Jay Goldberg (10:22.142)
in January is a really reasonable number. Right, again, a product that just started shipping in December.

What happened on the street, apparently, was that ever since that product got announced, there was like an arms race among buy-side and sell-side analysts to come up with the estimate. I've actually been involved in that too, back when I was an analyst. Like, let's say you like a company, you like AMD stock, you have a buy on it, it's one of your top picks. And somebody will come to you and say, hey, what's your estimate for MI300? And you'll come up with a number.

And then they'll say, well, that's not that exciting. And then you'll go back and look at your model and go, oh, it doesn't really move the needle much, I feel pretty good about this, I'm going to raise the number. And then you raise your estimate. The next analyst down the line, they have a number, and somebody on the buy side comes to them and says, oh, your number is only 2 billion. Well, Jay over there is saying it's 3 billion. Do you really like this stock? And then, like, it's a weird game of

Ben Bajarin (11:21.096)
Yeah.

Ben Bajarin (11:25.521)
Yeah.

Jay Goldberg (11:29.498)
inverse telephone that sort of just, right? And as a result, the expectations for MI300 were like crazy numbers. There were some people talking about, like, nine, ten billion dollars, which is ludicrous. They can't get that much capacity out of TSMC, right? So three and a half billion is a good number, but somehow it still disappointed expectations. Sometimes Wall Street and the market do weird things like this.

Ben Bajarin (11:36.497)
Right.

Ben Bajarin (11:43.86)
Yep.

Ben Bajarin (11:48.361)
Yep.

Ben Bajarin (11:51.548)
Yep. And that, I think, again, was not just a reminder but also, you know, again, when you're in a growth cycle, and I think that's the most important part. And I'm looking at this from two things, right? One, how do you create that kind of excitement and positioning of your product's opportunity while also, secondarily, managing the expectations when, again, we're in a situation where you're up against very drastic supply constraints.

And it's just an interesting tension. And, you know, we've seen this play out in a couple of different, you know, epochs in the industry over the past 20-ish years. And we're in another one right now where it is so early. Everyone wants to talk about AI, you know, the software AI side of things. It's interesting. Like, I was having a conversation with a number of software developers and they kind of said the same thing, and I didn't really fully appreciate that software is actually up against the same, call it, you know,

compute demand shortage as hardware is, right? Because I thought, well, you know, your software can just duplicate this, train it on more H100s or run inference on other things. But even they are seeing, when they want to push the boundaries in terms of software latency, increased compute demands that they can't meet. And so, because I thought, like, you know, you're going to build a software model, and I'm just going to throw this out there, right? Because I've seen a number of models do this, where, like,

the software TAM is anything from one to four trillion dollars, right? Over the next decade. Okay, great, great. But my point is, you would assume software actually can scale, that that model could scale faster than the hardware model. And to some degree that's true. But it took me a bit to really appreciate that even that software TAM, or ramp, is somewhat constrained also by compute accessibility.

And they're a little bit more closely related. And I just, again, I wonder, you know, are we basically in, and you kind of alluded to this, but I guess this is the broader question, because I'm sure a lot of people who listen to this are also very into what AI and semis look like: are we in a six-month period of basically trying to rebalance those expectations? Still be super optimistic, we agree the potential is there, but we kind of need to reassess

Ben Bajarin (14:19.858)
all of these things in light of new information that's come to light.

Jay Goldberg (14:25.518)
Yeah, I mean, I think back to, probably a year ago, we had a series of conversations about whether AI is additive to the data center CPU market or the semiconductor market. And we came to the conclusion that it was: like, this would be additive; people are going to have to spend more than they otherwise would have. And I think about that a lot lately. I revisit that a lot.

Jay Goldberg (14:57.956)
Like you said, it's early days for all this. And so we're all struggling to come up with a number. Sooner or later, Gartner or IDC, one of those companies, will just come up with a number. And we'll all stick to that, and that'll just be the reference point everyone sort of frames around. But we haven't gotten there yet, because it's a hard number to calculate. And...

Jay Goldberg (15:21.934)
And that's actually really what sort of spurred the piece I wrote this morning about this market, which is: I'm less excited about generative AI and the consumer applications that we're seeing. I actually think that's going to be, at least for the time being, less significant than the sort of under-the-hood, under-the-fold type improvements that transformer-based AI models are going to deliver.

And I get that OpenAI and ChatGPT is this hot consumer app, but I think the utility of that is a fairly narrow space. The set of people who really, really need that for their work or their life is really small.

Ben Bajarin (16:07.668)
Yeah.

Jay Goldberg (16:09.63)
Maybe they'll come up with something huge and new. Don't discount that. But it's a small number of people who really, really would pay for it. But much more important, I think, are the gains we're getting from AI in all kinds of tiny little ways that we really wouldn't notice if we weren't all talking about AI. And I think every software company has AI in their stack now. And it's a big deal, but it

probably doesn't improve things in ways that we notice. It makes things 10% more efficient. It makes the chat agent a little bit more communicative. It makes it a little bit more efficient to do threat response if you're a security company. A lot of chip companies I know are using AI; they've trained a bunch of AI models to improve branch prediction, the ultimate obscure, super-low-level process. It makes their chips faster. It's a definite improvement. It saves a bit of power, 5% of power, whatever.

We would never notice that. If it weren't for ChatGPT, we wouldn't be talking about those improvements, but they're real and they're important, because you compound them across the whole stack and you start to get real meaningful gains. And so I don't want to dismiss AI. I just think we need to appreciate where the gains are coming from right now. It's in a lot of small places, still important, but not as glamorous, not as headline-catching.
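As a rough illustration of that compounding point, here is a minimal sketch; the individual percentages and the names attached to them are hypothetical, not figures cited in the episode.

```python
# Compounding several small, independent efficiency gains across a stack.
# The individual gains below are made-up examples, not real measurements.

gains = {
    "chat agent deflection": 0.10,   # 10% more tickets handled automatically
    "threat response":       0.05,   # 5% faster triage
    "branch prediction":     0.05,   # 5% faster / lower-power chips
    "build pipeline":        0.03,   # 3% faster CI
}

combined = 1.0
for name, g in gains.items():
    combined *= (1.0 + g)

print(f"Combined improvement: {(combined - 1.0) * 100:.1f}%")
# Four small gains of 3-10% compound to roughly a 25% overall improvement.
```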

Ben Bajarin (17:30.248)
Well, and I wonder too, like, when everybody's thought about this, you know, how additive is it to the TAM, which I do think is true. At the same time, I think a question within that model was: are we talking about dedicated products that are positioned for AI, versus the evolution of a CPU or the GPU? Because clearly people were going to buy, you know, GPUs; they're obviously buying

new ones now, and that architecture is getting more optimized and performant for something specific like training. But it's not like Nvidia wasn't going to keep selling GPUs; they just wouldn't be selling as many as they are now, thanks to AI training. So you could sort of argue the same thing, which is, well, does that mean that we'll sell more CPUs? Will someone pay a premium for that, not just order more? And that's where I think the dedicated parts like MI300X or Intel's Gaudi and future versions of

that product are. But I think the same question applies when you say it's additive: does that mean people are going to start paying a premium for those parts, like a CPU or something else, or maybe even memory, versus just buying more of them? So that's question A. And then the same is on the software front. I think this is the real tough thing that Microsoft and SAP and Google and others are really going through, which is: how much more will people pay you

for those software experiences. And we don't know the answer to that. It might be far less than people expect today, right? Like, that might not work. The $30 to $50 per person in your organization, that might not be it, right? It might be that you just need to, in fact, take what we're recording on today. Riverside is a great app. It adds tons of AI features: AI editing, AI captions,

AI summaries, like all this stuff that I didn't have six months ago, and they don't charge me any more for it. It's the same price. It just gets better. And I'm never leaving this platform, because it gets better every quarter. And I just wonder, like, okay, should you charge people for this? How much does that impact the TAM, versus are you going to have to settle on just adding these features? Maybe it hurts your margins. I don't know. But people aren't really going to pay a lot more for this. And maybe that's a fair assumption

Ben Bajarin (19:56.893)
that will come to light again over this next year.

Jay Goldberg (20:01.178)
Absolutely. I mean, I've tried to build a TAM model for AI. And I tried a long time ago, back when we were talking about it last year. And what really, where I really got stuck was this issue. Like, do we include the neural processors in the A series Apple chip in the iPhone? How do we factor that into a TAM? It's an important chip. It's probably the...

AI that all of us use the most on a daily basis without even realizing it. So it's significant, but does that go into a TAM model? It's like, I mean, in theory, we could calculate, oh, it's this much die size and it costs this much for them to produce it, and we should add that to the TAM. But no one's going to do that. It's not how you build one of these models. And I agree with you. It's interesting. You started with software. Software companies aren't necessarily able to charge for AI features.

I think about this a lot when it comes to semiconductor companies, because everybody's talking about inference chips. And I'm not convinced that any of these companies are going to be able to charge more for edge inference. Data center is a different thing. We can debate that. But on the edge, when we're talking about putting AI capabilities into a PC or a phone, who's going to pay more for that? This is something I always ask.

Ben Bajarin (21:07.145)
Yeah.

Right.

Jay Goldberg (21:28.554)
when we go to these industry and company events: like, why should I as a consumer pay $1 more to run AI on my laptop without the internet? And nobody has a good answer for that. Like, I understand some people pay for ChatGPT or DALL-E or whatever, but running it on a device is a separate thing. I don't really care; you know, I'll run it in the cloud, it's fine. And so...

Ben Bajarin (21:34.888)
Yeah.

Ben Bajarin (21:40.147)
Right.

Jay Goldberg (21:56.734)
I've been on this theme for a while. I know I've talked about it before. And it was interesting. Qualcomm, who reported this week too, when Cristiano was talking about the uplift that AI would bring, because they have a new line of Snapdragon coming soon with all the Nuvia goodness in it, that Snapdragon will have AI capabilities, inference capabilities built into it. He didn't talk about it as a premium. What he said was it would provide positive uplift to their

Ben Bajarin (22:11.582)
Yeah.

Jay Goldberg (22:25.986)
blended ASP, which is not the same as saying he can raise prices. What he's saying is there's going to be a small subset of Snapdragon chips which they can charge a little bit more for. They can maintain their price premium. And that's a very, very different story than saying this is a big new TAM for us. Snapdragon is going to have it as one of the features: you're going to be able to buy a dozen different SKUs of Snapdragon out there, and a couple of them will have varying degrees of AI capabilities.

And so that's the premium tier of Snapdragon. And maybe you pay a buck more for it than you paid for last year's version of Snapdragon premium tier. Maybe not. It's a much, much weaker argument. And I'm glad he said that, because that's realistic. I think that's what's happening. And so in the context of Intel and AMD talking about AI PCs, I think they're in the same camp. Yeah.
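For readers who want to see what "uplift to blended ASP" means mechanically, here is a minimal sketch; the SKU names, volumes, and prices are hypothetical, not Qualcomm figures.

```python
# Blended ASP: the volume-weighted average selling price across a SKU mix.
# All SKUs, prices, and volumes below are invented for illustration only.

sku_mix = [
    # (name, units shipped, price in dollars)
    ("mid-tier",            60, 100),
    ("premium",             30, 160),
    ("premium + AI engine", 10, 175),  # small subset carrying a modest premium
]

units = sum(u for _, u, _ in sku_mix)
blended_asp = sum(u * p for _, u, p in sku_mix) / units
print(f"Blended ASP: ${blended_asp:.2f}")

# Compare against the same mix without the AI-tier premium:
baseline = (60 * 100 + 30 * 160 + 10 * 160) / units
print(f"Uplift vs. no AI tier: {100 * (blended_asp / baseline - 1):.1f}%")
```

With these made-up numbers the uplift is on the order of 1%: a real effect on the mix, but nothing like a new TAM, which is the distinction Jay is drawing.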

Ben Bajarin (23:07.232)
Sure.

Ben Bajarin (23:16.336)
Yes. So that was insightful. These two points are related relative to Qualcomm, and also, obviously, Intel and AMD on the client side. I think I just hadn't fully appreciated this point that we've made, right? People probably aren't going to pay more for their client devices. The hope from investors is just that it keeps a cycle going and maybe makes a shorter refreshment cycle, you know, cycles will get shorter as people want new ones every,

whatever, three, four years, because they get so interesting on a yearly basis. Fine. But I think my point is they didn't fully appreciate how much that may actually also translate to the data center. But what's interesting, to the Qualcomm point, is that Apple was asked this, Tim Cook was asked this very same question, right? Toward the end of the call, I forget who it was, it might've been David from UBS. He said:

Do you believe the edge AI story, that it will benefit Apple? Right, that whole thesis that we want to run more of this stuff at the edge. And Tim Cook basically said, you know, absolutely. We believe that Apple silicon is well positioned for this, that there's a lot of value that can come to Apple's platforms around AI and gen AI on device, for privacy and security. But I don't think Apple's going to raise ASPs because of that feature,

right? To the same degree that we're having this conversation, it's just going to come to the product. And yes, they already charge a lot of money for these products, but it's going to be built in. It just happens to be something that they are well positioned for, also to this Qualcomm point. But it was just interesting that, for the first time, I think Tim Cook was very clear: they believe that thesis, that a lot of AI processing and whatnot is going to come to devices. Same as Qualcomm's thesis.

But I think what we're layering on top of that is, at the end of the day, the only thing it might do is shorten your refreshment cycles or create a boost in the cycle; people aren't going to pay more for these products.

Jay Goldberg (25:20.522)
Yeah, and to be fair, at least to Qualcomm, if we can accelerate the upgrade cycle by three months or six months, that's huge, that's good for their financials. It'll take quarters to play out, but it is something. It's not a small thing. Again, this falls in that category of things, like, let's not talk about the big, big

Ben Bajarin (25:29.584)
Yes, right.

Totally.

totally.

Jay Goldberg (25:49.662)
gains. For right now, AI is going to be all these little small things that are sort of blended into lots of other things that are going to provide the uplift.

Ben Bajarin (25:57.5)
Right. But I think that's actually, so what you said made me think of an interesting point, right? And I think this is why everyone's getting excited, back to this outsized expectations point. Stuff has slowed down. And this is a data center point: I've had this conversation, prior to a year ago, with people looking at infrastructure, and they didn't always feel like they needed the latest and greatest. It was like, well, when I need to upgrade, I'll upgrade. And sure,

I'll buy a better product, but they weren't pushed to upgrade any sooner. That's true of PCs. That's true of China. In fact, I think it's worth mentioning that the single greatest reason why China has been so ugly for a lot of smartphone vendors has been the clear evidence that the length of the refreshment cycle in China has extended significantly. It used to be less than two years, like 1.8 to 1.9

years that they refreshed. And now that's elongated. So every market has elongated their refresh cycle. Again, I'm making a client side point here, right? PCs, smartphones, everything, they have extended. There's no reason for the person to be like, yeah, man, I got to have that new one. And that's what I think this hope is. These expectations are maybe, maybe for, again, a short period of time, because this is not going to be a 10-year point I'm making. But for the next few years, there's a reason why you might upgrade.

or upgrade your infrastructure more regularly. I think that's the fundamental premise of hope, if you will.
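To put a rough number on why a shorter refresh cycle matters to vendors, here is a minimal back-of-envelope sketch; the installed base and cycle lengths are hypothetical assumptions, not figures from the episode.

```python
# How shortening the replacement cycle changes steady-state annual unit demand.
# Installed base and cycle lengths are illustrative assumptions only.

installed_base = 1_000_000_000  # hypothetical devices in use

def annual_units(installed_base: float, refresh_years: float) -> float:
    """In steady state, annual replacements ~= installed base / cycle length."""
    return installed_base / refresh_years

before = annual_units(installed_base, 3.5)   # e.g. a 3.5-year cycle
after  = annual_units(installed_base, 3.25)  # cycle shortened by ~3 months

print(f"Annual units at 3.5-year cycle:  {before / 1e6:.0f}M")
print(f"Annual units at 3.25-year cycle: {after / 1e6:.0f}M")
print(f"Demand uplift: {100 * (after / before - 1):.1f}%")
```

Under these assumptions, pulling the cycle in by about a quarter lifts steady-state unit demand by roughly 8%, which is why even a modest acceleration is meaningful to chip vendors' financials.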

Jay Goldberg (27:32.522)
I think that's a good point. I would extend that to the PC because there is an awful lot of hope coming out of AMD and Intel. Every time you ask them about this question, it's like, oh, what's the big deal? Why do I need AI on my PC? They all say, oh, Microsoft's coming up with something really cool soon. And OK, Microsoft, to their credit, is ahead of the curve on this.

Ben Bajarin (27:42.289)
Yes.

Jay Goldberg (28:02.19)
They've clearly been thinking about this for a long time. I'm willing to believe that they're gonna bring out a whole bunch of new AI features that are gonna make PowerPoint and Word and Excel more useful. But I'm hard pressed to think of a way in which I really am like, what do I really need out of Excel to be better? What's AI gonna do there? I've used ChatGPT to give me Excel code before and it's okay. It's...

a little bit useful, but I guess my point is we're hanging a lot of hope on Microsoft. And I don't want to discount their abilities, but this is not a company that's well known for user experience. They don't have that same telepathy the way that Apple does into understanding what consumers really intuit. And so I'm a little bit more downbeat on AI PCs because I just don't think that...

Ben Bajarin (29:00.841)
Right.

Jay Goldberg (29:01.294)
Microsoft, it's gotten to the point where the expectations around Microsoft are too high, unfairly so.

Ben Bajarin (29:05.893)
Right, right. I think that's reasonable criticism to your point.

Jay Goldberg (29:09.662)
Yeah. I mean, I think one of the things, the whole industry, we're all waiting to see what Apple's gonna do. Like, that's where tech has been for, you know, 17 years now. What's Apple gonna do? Okay, well, they'll figure it out and we'll copy them. I think that's gonna apply here in AI as well. And the fact that Apple has taken so long to talk about AI, I mean, this is the first call where they really talked about AI, first call where they've mentioned generative AI. Like...

They're just being deliberate. I, you know, I was, on my note, I was saying they're not, they're not asleep at the wheel. They're not, you know, falling behind. They're just lying in wait. They're trying to figure out how to do, I mean, this is what Apple is really good at. They sit around, they think hard about a problem from first principles, from first principles of user experience, and then find ways to apply the technology, rather than just apply the technology and hope it works. Like, and so they're going to take their time and we'll see what they come up with. And maybe, you know, nothing, or maybe something interesting.

Ben Bajarin (29:52.788)
Right.

Ben Bajarin (29:59.432)
Yeah. Right.

Ben Bajarin (30:09.036)
Yeah, so I will say though, like, I think it's worth pointing out, I'm in general, I'm generally a tech optimist. Like, I don't, I'm not skeptical or conservative, like, very often. I do have a hard time though, believing that AI will accelerate any refreshment cycle of any category.

whatsoever. And I realize I don't say that lightly, because generally I'm always very bullish on these things. But I guess I'm questioning the premise that that's going to happen, right? That it will shorten anybody's refreshment cycle anytime soon. And here's why. Over the next 12 to 18 months, I would bet very good money that your average consumer...

not your enterprise worker or your person in an enterprise being productive, your average consumer over the next 24 months couldn't care less about AI. By the time they care...

It's gonna be good. It's probably gonna be good enough. Do you know what I mean? It's not going to have the effect where, and I hate to use this analogy, but I'm just gonna go back to the Pentium analogy, because year over year, stuff got faster in the Pentium era. Word opened faster, apps opened faster. Like, that was a pain point. Stuff got drastically faster. By the time this works itself out,

I can't see what the year-over-year on-device gains are going to be that make people go, yeah, man, I've held my phone for four years, now all of a sudden I want one every other year. Outside of Apple, who's trying to convince more people to go on a payment plan to get a new phone every year by default, which is clever hardware-as-a-service for Apple, but not going to come to other vendors. Also, that's a United States-only point, not China or Europe. But you see what I'm saying? By the time it's there, I'm just not sure anybody's like, yeah,

Ben Bajarin (32:13.404)
every two years, because it gets so much better. It's gonna be good. And I just don't think the, and again, I hope I'm wrong. I'm generally not this skeptical, but I just wanted to make that point.

Jay Goldberg (32:25.766)
You're stealing my persona. I'm supposed to be the super skeptical one, right? But actually, this is what I was struggling with: I didn't want to be skeptical, because I recognize the technical achievement that transformers and generative AI are. My take was the gains will be small and that's important. And that is something we should appreciate. I'm open to the possibility that there'd be something more out there. I don't...

Ben Bajarin (32:28.401)
I'm sorry.

Jay Goldberg (32:54.462)
assign it a high degree of probability, but I think it's possible that we'll get something more exciting sooner or somebody will come up with something out of the blue. We're in really early stages, and so there's lots of experimentation, lots of room for experimentation here. So I'm oddly less skeptical than you, but...

Ben Bajarin (33:13.384)
Don't do this to me today, you need to be more skeptical.

Jay Goldberg (33:15.998)
Well, because I've been fighting my urge for months to just say, oh, this is all AI nonsense. It's all crypto. It's crypto 2.0.

Ben Bajarin (33:18.288)
Yeah, yeah. Yeah.

Right. To be honest with you, I think this year and next year we will learn this. I think if we see enterprises justify the additional cost per employee for the software, because they have justified an X percent productivity gain, I think that will be a very good sign. I don't think we're there yet. I don't think they know. We did a survey, an AI survey.

I'm going to completely blank on the numbers, but I think the average that they were hoping for, to make it worth this initial investment, was between 20 and 30% productivity gains. So that's a decent amount. They're not expecting 100%. They were expecting a reasonable amount, and I think that's a reasonable expectation. If that holds out and they're willing to pay Salesforce, SAP, Microsoft, et cetera, X number of dollars for those products, because they've justified it, that to me would be a positive sign

in this direction. I'm just not sure; I'm still not sold that dynamic is going to take place.
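To see the shape of the "justify the cost per employee" calculation Ben describes, here is a minimal sketch; the salary and seat price are hypothetical assumptions, not figures from the survey.

```python
# Back-of-envelope for the cost-per-employee justification discussed above.
# Salary and seat price are hypothetical assumptions, not survey data.

fully_loaded_cost = 120_000   # hypothetical annual cost of an employee, dollars
seat_price_month  = 40        # hypothetical AI add-on, in the $30-50/seat range discussed

annual_seat_cost = seat_price_month * 12

for gain in (0.05, 0.10, 0.20, 0.30):   # Jay's 5-10% vs. the hoped-for 20-30%
    value = fully_loaded_cost * gain
    print(f"{gain:.0%} productivity gain -> ${value:,.0f} of value "
          f"vs. ${annual_seat_cost:,} seat cost")
```

The open question in the conversation is not the arithmetic but whether those productivity gains actually materialize and get credited to the software.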

Jay Goldberg (34:30.262)
Yeah, 20%, 30% gains are big numbers from what I'm seeing. So I don't think we're gonna get that. I think it's five, 10% improvements, which is fine. Because I think one of the interesting dynamics that could sort of put us off mark here is the fact that AI software is still developing really, really fast, which means that deployment costs are falling rapidly. Right?

You don't need a trillion-parameter model; a seven-billion-parameter model works just fine for a lot of things. And that is still improving. We'll get to a point soon where all those numbers will become smaller and the compute required to run these models will be much less. And that opens the door for a lot more experimentation. And that's what's really going to drive this: the cost of running these models is declining really, really rapidly because of the software improvements.
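For a rough sense of why smaller models cut deployment cost so sharply, here is a minimal sketch using a simple bytes-per-parameter estimate for the weights alone (activations, KV cache, and serving overhead are ignored); the model sizes are generic, not specific products.

```python
# Rough weight-memory footprint: parameters x bytes per parameter.
# Ignores activations, KV cache, and serving overhead; illustration only.

def weights_gib(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 2**30

for name, params in [("7B model", 7e9), ("1T model", 1e12)]:
    for precision, bpp in [("fp16", 2), ("int4", 0.5)]:
        print(f"{name:>9} @ {precision}: ~{weights_gib(params, bpp):,.0f} GiB of weights")

# A 7B model quantized to 4 bits is a few GiB of weights and can fit on a
# single consumer GPU; a trillion-parameter model needs a multi-GPU or
# multi-node deployment even at low precision.
```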

Ben Bajarin (35:28.392)
Yes, yes. Yep.

Jay Goldberg (35:30.414)
Um, and I think we're still in that dipping-their-toe-in-the-water, trying-to-feel-around stage. Like, I actually talked to somebody on the sell side this week who was at a bank, a midsize regional bank, and got laid off. A publishing sell-side analyst, somewhere on the II ranking, lost his job and was replaced by an AI team.

and

Jay Goldberg (36:04.194)
So that's a really bad idea, but I won't go into my rant about the decline in quality of sell-side research. AI sell-side research is really dumb. Very bad idea. But I think that's where we are, where people are gonna try out all these different things. And the fact that somebody is trying this really dumb idea tells you that nobody knows exactly what to use here yet. So we're gonna try all the good things and all the bad things, and we're gonna sort out which is which.

Ben Bajarin (36:14.172)
Very bad.

Ben Bajarin (36:19.793)
Yes, yes.

Jay Goldberg (36:34.002)
And a year from now, that bank will probably have no presence in sell-side research anymore, because their content is useless. But the flip side of that is somebody else has figured out a way to do something faster. And they won't be able to fire all their sell-side analysts, but they will be able to process claims 5% faster.

Jay Goldberg (37:03.138)
Closing, end-of-the-quarter closing, will be possible to do a lot faster. Something like that will come up. And it won't be something that we write headlines about. It'll just be a little bit of gain here, a little bit of gain there. And I think that's fine. Right? I'll tell you what, when I think about this stuff, I've taken to thinking about it in the context of my brother-in-law. He works in a, he's like an executive at a mid-level retail company.

And his big pastimes are he likes to ride motorcycles, dirt bikes. And when I go and visit him, we go fishing. He's got a nice boat. Like he drives an F-150, like he is like the sort of stereotypical middle American male. For the life of me, I can't think of a reason why he cares about AI. Right. And his company is probably going to have, they use all kinds of software, like all other companies.

Ben Bajarin (37:56.348)
Yeah.

Jay Goldberg (38:02.686)
All that software will improve five or 10% here, it'll get a little easier there. He won't know it's AI. He doesn't need to know that it's AI. It's not gonna change his day job at all. But that's what AI will do for him, in ways that he won't know about but that we care about.

Ben Bajarin (38:09.408)
Sure, right, right.

Ben Bajarin (38:18.94)
Yeah. No, I agree. I mean, my wife's a teacher. She is the absolute last to adopt things. And it's the same thing. You know, she's tried ChatGPT, she's tried whatever, and she's like, I don't see how this helps me teach. At the end of the day, if it can help her create lesson plans more quickly and efficiently, fine. It's not there yet. Certainly not something she's probably going to pay a ton more for, or her school would pay a ton more for. So

I'm with you. Like I said, that's where my skepticism comes from: the biggest chunk of this market, they just kind of expect these evolutions to come, and they will. They will come in software, and I just don't know what that lever, if any, is going to be that's going to get them to pay anything extra, versus expecting that these features just come to their phone. I mean, again, if you look at the example today of where consumers have no idea that AI is being applied,

but it is, it's camera technology. And no one is out there really paying more, and it's not helping the refresh cycle at all, right? And I get it that this is a behind-the-scenes experience, but there's a lot of AI going on there. If you said, you can have this bokeh blur and this Smart HDR, and it's going to cost you $2 a month extra, they'd be like, nope, sorry.

Jay Goldberg (39:19.918)
Right, right.

Jay Goldberg (39:41.432)
That's right.

Ben Bajarin (39:42.432)
All right, anyway, I don't want to be too much of a downer for anybody. We are optimistic about AI. The market is very different. I think it comes back to measuring our perhaps outsized expectations and being more reasonable with the outlook.

Jay Goldberg (39:59.755)
I consider myself optimistic about AI, but I think that we just need to shift our focus in what we expect, where that upside will come from. It's not gonna be flashy things until it is, but for the most part, it's gonna be small mundane things that are really important, but in small ways.

Ben Bajarin (40:09.493)
Yes.

Ben Bajarin (40:19.696)
Yeah. All right. Well, everybody, you know, this is a deep tech, semi, you know, podcast. So in that vein, I will just say: if you don't want to buy a Vision Pro, just go try it. Because it's freaking amazing at a technological, semiconductor, component level, and everyone should try it, because that's the kind of thing that you look at and you go, you know what, there's a big-dollar hardware category five, six years from now, or something around this. I don't know what it is.

But on top of the AI excitement, we've got a new category of face computers, which is kind of what I'm saying: it's its own excitement. So check that out. Thanks for listening, everybody. And until next week, when a Vision Pro is on my face, we'll talk to you later.

Jay Goldberg (41:06.69)
Thank you everybody. Click like, subscribe, tell your friends. Have a good week.

Ben Bajarin (41:17.02)
Let me get this uploaded and then I will tell you the Microsoft...

Jay Goldberg (41:18.664)
Excellent.