A podcast about the business and market of semiconductors
Ben Bajarin (00:01.061)
Hello, happy second episode of this calendar 2024 of The Circuit. I am Ben Bajarin.
Jay Goldberg (00:10.21)
Greetings, Internet. I am Jay Goldberg.
Ben Bajarin (00:13.401)
So we went to CES. We survived, though my feet and back hurt, although you and I have a hack for that, which we'll share with our audience: how to alleviate your body pain with some very willing show floor vendors. But I also caught a little bit of the crud, which I'm on the tail end of, which doesn't surprise me. There were 150,000-ish people there,
many people breathing right in your face when you ask for demos. A lot of people like to get right up in your face to give you a demo, which, you know, wasn't ideal. But that's CES. All right. So today, independently of each other, we didn't plan this. Literally, it went from me to you, like, look where I am. Why don't you talk about our show experience of relaxation?
Jay Goldberg (01:10.002)
Yeah, all the thousands of gadgets and crazy technology. And by far the highlight of the show for me was the hour or so I spent doing intense research into the subject of massage chairs.
Ben Bajarin (01:22.821)
which, I'll point out to everybody, is an advancing semiconductor category, because they're getting... no, listen, hear me out, hear me out, hold on. I'm not even joking. This was, minus COVID, which I attended virtually, my 23rd CES. My first CES, the big news of the show was the launch of TiVo and ReplayTV,
in a very muted way. So now you know how long I've been going to CES. But for as long as I can remember, I have always done this: I go to the massage chair section. So I'd like to say I have keenly observed their evolution. You know, when they first started, it looked like a leather recliner and it did some massage, and then through the years your feet get involved. Now these things encompass your body. You've got to stick each one of your little fingers in a hole; it's doing a hand
and arm massage. But this year, relevant to the topic we're going to talk about, where was AI at the show, there was a chair that said it did AI sensing of your pain points. I didn't get to try this because I didn't have time, but somebody was telling me it does a full-body kind of roll. It basically pushes
up through your hips, through your spine, and senses where there is some friction. And it's like, all right, you've got pain in your lower back, or your upper back was more firm than somewhere else, and then it creates a nice little program for you. There are some semiconductors going on there. There are sensors, there's a microcontroller, there's a brain of some sort. That's my grand point: it is an evolving category of semis that we have to keep trying on a year-to-year basis.
Jay Goldberg (03:21.222)
I agree with that part. I will also say that I think this is not just a spurious example of Ben and Jay's adventures at CES. It's a pretty good proxy for how a lot of AI plays out, because I did the AI scan chair. I sat in that one. I also sat in it last year, before they called it AI. And it was the exact same experience.
Ben Bajarin (03:44.901)
Right, right.
Jay Goldberg (03:49.17)
It was a great massage, don't get me wrong. The folks at OSIM know a thing or two. But it's AI-washing, let's just be blunt. By far the more impressive, important advance in the world of massage chairs came last year with the invention of 5D rollers. I don't know exactly what that means, because no one can really explain it to me, but I'm pretty sure it means they have rollers in the seat that massage your hip muscles particularly well. That was more important than AI. But you're right, there are a lot of semiconductors in some of these chairs, as with everything else.
Ben Bajarin (04:32.769)
Yep. But you hit on the topic, so this is what we want to talk about: where was the AI? Because I do think there were two tracks to the show, which is worth pointing out. If anybody followed this remotely, I think there was a recognition that AI was maybe there in a subtle fashion, but not overblown, which I would say is the exact opposite of the show where smart assistants showed up and were in everything.
You were exactly right that there were a number of products that were probably doing what we would call either AI or not-AI that then labeled themselves AI, but it wasn't giant on the sign: we've got AI in our whatevers. There was a tempered, more muted recognition from many, many vendors, and I think for good reason, to not overblow that, right? Not try to capitalize on the hype, but instead maybe just show some features. And this is where I think this is an interesting conversation: was there some machine learning that's been going on for a while that people just didn't want to call AI because they thought it was going to freak people out? Which has to do with the fact that there are a lot of sensors that show up on end products that need to do some sensor fusion and some working together of data. But again,
it wasn't generative AI, but there was some intelligence happening. And this year people started to use those examples. So for example, Kohler had a smart toilet, which I've seen for years, that does any number of things with sensor data around orientation. Some of them were bidets, so as weird as that is to talk about, there was some sensor fusion happening to be more accurate, I should say more thorough.
But this year that was an AI feature. So I did see some of the media who were doing these walking-around show demos saying, it's a smart toilet, it's got AI. And I was like, really, it's doing the same things it's done for a while, but we're comfortable saying AI now. But the meta point: it wasn't as pronounced as I thought, coming off that one show where the smart speakers and Alexa and Google Assistant, et cetera, were just
Ben Bajarin (06:55.433)
everywhere, whether they needed to be or not. It was much more muted than that, which I sort of take as a positive.
Jay Goldberg (07:01.678)
Yeah, I see sort of two tracks for AI. One is the pure marketing term. There are lots of gadgets that have AI in them, with air quotes, right? There's the AI chair, the AI backpack, and the AI toothbrush. That's just the label AI being thrown on, you know, machine learning at best, but really just software with "AI" in it. And then, look, I'm not dismissing AI and neural networks and large language models. Those are important, but the vast majority of things that are called AI today are not that. And in the category of real machine learning applications, there wasn't a lot, for a bunch of reasons we could get into.
But I thought that was interesting. I was really going to the show expecting AI to be in my face everywhere, and we didn't get that. And that's probably a good thing, because Vegas is too much as is. But you know.
Ben Bajarin (08:06.07)
Yeah, I agree.
Ben Bajarin (08:15.013)
So I will say, though, that's why I say there were two tracks to the show. There was the kind of main area, where this was exactly as we say, perhaps more muted. Then there was Eureka Park, where you had two parts. You have the general section, where there were just labeled themes: smart home, robotics, smart health, AR, VR, et cetera, with booth vendors there. And then each country...
Jay Goldberg (08:15.159)
What?
Ben Bajarin (08:44.749)
has its own sort of pavilion, right? So you name a country, they've got startups there. So I did as much as I could. There was a lot more AI there than on the main show floor. There was AI everywhere, which says two things. And I appreciate the candidness and transparency of the people who exhibit in Eureka Park, because they are startups.
And a lot of venture capitalists go down to Eureka Park and use it as a way to sift through, to try to find a needle in a haystack, if you will. So there are a number of them right there just saying, seeking funding, halfway through a $700,000 round, please inquire. Just in bold letters there on the sign. But that was where I saw what we were expecting to see, which was...
Ben Bajarin (09:43.501)
a lot of AI for the sake of AI, whether it was or wasn't, because in this context that happens to be a buzzword that's helping some companies raise money, although I know most investors aren't gullible to that. But that's where I saw it, which is where you point out the AI backpack. There was a lot of random stuff that was definitely not AI but said it had AI, and I tried to ask what their AI was and didn't really get any answer.
So yeah, like I said, two tracks to the show. Eureka Park was a very different story, and probably where I got sick, because it was very packed. I mean, extremely busy down there.
Jay Goldberg (10:24.402)
Yeah.
Jay Goldberg (10:29.323)
Yeah. But where we didn't see AI was in the booths of the big vendors. They were more muted or just totally silent on the subject, which I thought was telling. That was really interesting to me. I was really expecting Qualcomm, if no one else, to have a line of AI-enabled laptops. That was a big thing.
Ben Bajarin (10:35.669)
Yes, correct. Right. They were more muted.
Jay Goldberg (10:56.854)
Their big push for the last few months has been this subject, and they didn't have any on display. In large part, I think, because they're not quite ready. I mean, they're ready, but the rest of the ecosystem is not ready. And I'm sure things are coming, but it just wasn't quite there yet. And it really left me with the sense that a lot of the big companies are waiting to see what the
three or four big platforms are going to do. What's Microsoft going to do? What's Google going to do? What's Apple going to do with AI? Apple famously doesn't exhibit at CES; they're there, but they don't tell people that. But what is AI going to be for Apple, if anything? And I think once those big companies start to answer, we'll start to see more AI things,
AI-enabled laptops in particular, and phones.
Ben Bajarin (12:00.337)
So on that front, from a semiconductor standpoint, I think a couple of interesting things happened. One was, and I'm going to make this point without necessarily saying it's the norm for every semiconductor vendor, but I did think it was interesting relative to someone recognizing where things are going.
So I always like to go by Nordic Semi and just kind of see what's new with them, because I don't really talk to that company very much at all except at CES. So it was nice to get an update on their platforms. Obviously, they make a lot of reference boards, they make a range of different chips, lots of IoT. You have to appreciate that their dev board for IoT is called the Thingy, right? So this was Thingy version whatever. Love the Thingy. But I said, you know,
Jay Goldberg (12:51.455)
Yeah. Oh yeah. It's fantastic.
Ben Bajarin (12:58.713)
what are you guys thinking about on-device AI relative to integration with your existing cores? Because they don't really have, and again, I'm loosely using this term, an NPU; they didn't really have a dedicated AI core. And I had missed this news, which they brought up to me: in August they acquired Atlazo, which is a team that's got AI/ML, sensor, health-related applications, et cetera. And so...
You know, they basically said, you can see where this is going. Imagine that for a roadmap; hint, it's obviously coming, more built into the core SoCs or whatever their products are. But the point I'm making is, I do think that's an example, because there are a lot of examples like this, where companies may or may not have been going deeply down this idea of starting to figure out how much on-device
AI is needed, and therefore adjusting their roadmaps, which we know we've seen, right, relative to the NPU discussion you and I have been having, that most people's NPUs are retrofitted, which is your point, and I agree with it. But this is requiring, again, if we believe it all is what it is, which we'll talk about with what happens on device in a little bit, a re-architecting of your design
in semis. And so you either have that IP or you don't; if you don't, you need to get it, which is the Nordic example, because you need to do more of it on device than you were doing before, which I think is what they're recognizing. However much it is, there is more that may be happening. So like I said, I don't know how indicative this is, but I think the point is everybody is re-examining their roadmaps and making these adjustments as they figure out how much they want to bring to the edge.
Jay Goldberg (14:50.722)
So there are two parts to that, though, and Nordic's a good example here, because Nordic makes connectivity chips: Bluetooth, Wi-Fi, a little bit of cellular, some power management, right? And what exactly do they need neural networks for? What do they need machine learning for? Because I think there are really two categories. One is AI tools,
machine learning tools that improve the functionality of their chip in some way. And then there's AI that's going to be a customer-facing feature, where you're going to say, oh, this is going to make your customer experience better. And in Nordic's case, it's very hard to come up with what that is, right? As you mentioned, they have the Thingy, and they're known for being fairly customer-friendly in terms of their tools and their support.
But are they going to have... I mean, I guess they could have a chatbot to help you program their devices, but I don't see them doing that. And it's not like you need rich, robust software interfaces and generative language models on your Bluetooth device. It just doesn't work. I think where it's more interesting for them is that they're going to find ways to make their connectivity more efficient: reduce power usage, make network-hopping and channel-hopping decisions around the Bluetooth stack based on some machine learning model that they're going to train, and then they embed a little tiny neural core of some kind inside their existing chip. And it's tiny, it'll be a tiny little piece, because what they're really doing is they've found a new form of calculation that they can do more efficiently with a neural-network-type model
Jay Goldberg (16:49.47)
and slot that into their chip in ways that, if we weren't all talking about AI, we would probably never notice. Oh, they just found a way to make their chip a little more power efficient, great. And so that's probably more significant. The stuff that they're not even going to talk about, or normally wouldn't even talk about, that's probably much more important for someone like Nordic. And I think that's true of everybody, right? Most of what's going to end up as AI is going to be stuff we wouldn't otherwise notice if AI weren't such a big marketing feature.
Ben Bajarin (16:55.248)
Right.
Ben Bajarin (17:17.685)
Sure, sure. Agreed. And I think it's a reasonable understanding of how some of these problems will get solved. Another interesting example: I was in the Lenovo booth for a while, and Lenovo was showing off a couple of products, Intel Core Ultra products.
One of them, the Yoga Book 9i, is a very interesting sort of dual-screen laptop that kind of mounts so you've got a full screen on the top, a full screen on the bottom, and a keyboard. And they were saying they've actually got an AI chip in there whose sole purpose is battery life management. So it's doing whatever it does at an algorithm level, because you've got two 4K OLED screens staring you in the face
in this 9i that's kind of docked. It's super clever. It's the second version of this product, the first one running Intel Core Ultra, so it should get better battery life anyway. But they were using this dedicated chip to do smarter management of the system, not calling it AI. They're not going out saying, hey, we're calling it AI. They're just saying, look, this is something we did to solve a problem. In this case we wanted better battery life, and we felt we could build this thing and it could take all these inputs and help manage thermals and do whatever, intelligently.
An intelligent and clever way to solve a problem, right? It's a discrete chip, but that was them saying, this is how we're going to solve this problem. It's not doing generative AI, it's just got a job. This little chip has a very specific job, and this is what it's doing.
Jay Goldberg (18:56.31)
Yeah, for me, my favorite example was the folks at John Deere, who, in full disclosure, provided me with this lovely hat for free. Other than that, it's not a sponsored event. But I was in the John Deere booth, and I think people underestimate John Deere's technical capabilities. I think it is the most advanced large industrial company when it comes to all things compute and semis, and software for that matter. Right? And
Ben Bajarin (19:14.061)
Yes, agree.
Jay Goldberg (19:24.942)
if you look around the world, the only company that actually has an autonomous vehicle right now is John Deere. In their booth, they had a live feed from a tiller plowing up a cotton field in Texas, live-streamed to the show: an autonomous tilling vehicle. Very advanced in what they're doing. So I was in their booth, and I ended up talking to one of their leading software and data analytics people.
We had a 15-minute conversation about what they were working on, all the different things, the full explanation. A pretty good conversation. We made it through 15 minutes and he didn't mention machine learning or AI once. He was just talking about what they were doing. And I thought that was very telling, because here's someone who actually built an AI system, an autonomy system, who didn't describe it as AI, didn't describe it as machine learning.
And I really appreciated that. It was funny, though, because we had a long conversation about all of these things, and it's clear they're very advanced and doing all kinds of important features. They just don't think about it in those terms. They just think, oh, this is what we want to accomplish, and the way we're going to do it is using these kinds of models.
Ben Bajarin (20:48.385)
Yeah, no, I agree with you. In fact, I visited one of the research plants for, man, I hope I remember that, Blue River, which is who they purchased for software and AI; it's just down from me by Gilroy. So last year I went and did the demo where we watched it do that thing. And they threw stuff in front of it, and it would stop if it thought there was a human or an animal, and it wouldn't move until it went away.
Incredible. And one of the things they had actually said, and I thought this was true when you think about it, is that agriculture has generally been a fairly early adopter of advanced technology because of the efficiencies it brings. Now, yes, it was expensive, because I had asked how likely it is that farmers adopt this. And they're like, look, if a farmer doesn't have to spend eight hours a day tilling a field or spraying weeds, that frees them up for other things.
It's an economic efficiency; despite the cost, this will pay for itself over time. And so it's interesting how agriculture is going to look for these things and appreciate those advancements, again, without needing to go deep in the weeds and call it AI, et cetera. Those are useful applications. And there is obviously a ton happening on device, but it also has a relationship with cloud connectivity in order to monitor things; you know, it's cellular. And that was the other point I thought was interesting: four or five months ago, there was such a push toward on-device AI, on-device AI. And from a lot of my kind of backroom conversations, it's now, yes, there'll be on-device, yes, there'll be cloud. These two things will work together in some harmonious system. It might take time to figure out how that happens, but it's a recognition of this hybrid environment. You need the cloud for certain things.
You need that on-device to get the data for the certain things that go back to the cloud, but it is going to be this more symbiotic relationship of cloud and device to do the whole AI story.
Jay Goldberg (22:57.538)
So my assumption has been, for a long time, that the only way for the economics of AI, and really what I'm talking about here is inference, the only way that inference economics pencil out is if a lot of inference is actually done on device, at the edge, right? Because that means the end user is paying the capex. My assumption is AI inference is so expensive that even Amazon and Google will struggle to afford to build out
all that compute. I mean, that's a lot of silicon. And so I assumed a very large proportion of the workloads would end up on the edge; a lot of the queries would end up getting resolved at the edge.
Ben Bajarin (23:27.141)
Sure. It is. Yeah.
Jay Goldberg (23:39.498)
I still think that's the case, but one of the standouts for me at the show was that I'm starting to reconsider it. And again, I'll go back to John Deere. A lot of what they're doing with that autonomous vehicle is still being done in the cloud. And
Ben Bajarin (23:57.957)
Yes.
Jay Goldberg (24:01.818)
some of it, like safety, probably still has to be done at the edge, because you need ultra-low latency; you don't want to run over somebody. But I think even navigation, to some degree, is done in the cloud. Things that you would imagine would have to be on the device, they were doing in the cloud. In part because they have their own cellular network, an MVNO-type operation, so they have pretty good control over that. But I was surprised at how little they were really doing on the edge, in the tractor.
And so I'm starting to reconsider. I think we could get to a world where, if the software moves in certain directions and we get some of these models small enough and efficient enough, or even if they don't get that super small, if the silicon moves in the right direction, we could end up doing a lot in the cloud, a lot more than I would have expected. So yes, I agree, it'll probably be hybrid, but there is a world in which we could do
much, much more cloud AI than I would have ever guessed. I don't think it has to be 50-50.
Ben Bajarin (25:07.565)
Yeah, yeah, I think that's the right point, which is we don't know what the balance is. The one thing that's interesting about the John Deere tractor, though, is that, again, setting aside this cloud-hybrid AI bit, it's got connectivity, but it's basically a giant computer. I don't even know if we'd consider that it's doing AI; it's basically taking all those sensors at the same time and just functioning autonomously. And that's just edge compute. That's just it doing what it's doing because it's got...
Jay Goldberg (25:23.522)
Mm-hmm.
Ben Bajarin (25:36.629)
I forget how many, four or five GPUs, a ton of cameras everywhere. It's just doing its thing, just computing locally. I don't think getting hung up on whether it's doing AI is really the point for that machine. Which is the same thing if Nordic or NXP or even Qualcomm just has something doing advanced sensor fusion. I think we're in this unfortunate situation where when we say AI right now, we mean generative AI; you think GPT.
But there's all this stuff that's been happening for years behind the scenes, trying to do anticipatory work, trying to gather data and crunch that data. Again, is that AI? I don't know, but that's just the compute that happens. And now, unfortunately, we have more questioning of, what is that, AI? I mean, it's trying to be predictive, but it's not natural-language predictive. It's a very different
situation, but that's just the compute that's been happening at the edge for some time.
Jay Goldberg (26:40.81)
Yeah, but I think it is important to a number of companies, who we speak to and speak of often, just how much inference gets done at the edge. Because there are quite a few companies now who are sort of depending on a future where inference is very edge-heavy. And I don't know.
Ben Bajarin (26:47.321)
Yes, agree.
Ben Bajarin (27:00.289)
Yes. So the point of this, though, where the rubber meets the road, and I'm sure you talked to some folks on the investment side of things, leads to what I think is a big question for the whole industry: what does this mean for semis this year? Right? Yes, last year was a little bit weird. I think everybody's sort of hoping two things happen. We have an
increase in demand for edge inference products, which helps the cycle; we already knew that demand was going to be there in the cloud. And not that last year was a terrible year for semis, but people are right now thinking this could be an up year, which could be true. But that was sort of the question: the state of this industry coming into this year, as more people
want more compute at the edge, maybe more generative-AI-ish stuff in TVs and smart sensors and appliances, wherever that goes. From a semi standpoint, I use CES to ask, is it going to be an interesting year? Is it going to be a dud of a year? Are certain categories that we need to be successful, phones, PCs, TVs, automotive, et cetera, just going to have a boring year?
It feels like everyone thinks this could be an up year just based on some of these trends. Obviously it's early, but there is excitement that 2024 might be a stronger cycle for semis than last year and maybe even a couple of years before.
Jay Goldberg (28:47.566)
So it's a good question. I always get nervous about CES as a leading indicator of anything, because it's so big and there's so much going on that the quality of the signal that comes out of it is not great. I think auto and industrial are not going to have a great year. They're already having a terrible year, and I don't think it's going to get better until pretty late.
Ben Bajarin (29:00.345)
Yeah, it's tough, right?
Ben Bajarin (29:10.641)
Yeah.
Inventory digestion is a thing for autos.
Jay Goldberg (29:17.586)
And for consumer semis, it's more of a question. We've seen that turn: last year was pretty bad, then it turned in the last quarter. The question is really, is it going to be good or is it going to be really good? And I don't see a lot of signs yet that it's going to be really good. It's just going to be better than last year, which is not a high bar.
It sounds like it's going to be a good year for memory, because everyone's being very price-disciplined, and there's lots of memory needed everywhere. So I'm more mixed, right? For me, the big question is, other than Nvidia, who's going to have a good year? And at what point does Nvidia hit a one-quarter air pocket? Nvidia's going to have a good year. Everyone else is, I think, still to be determined. And a lot of it, I think, will come down to software.
What are the really interesting things that people are going to want to do on their devices? As it stands now, I didn't see a lot at the show that really got me excited to think, oh, it's going to be a big year for consumer because of AI. It's not there yet. So I think we'll start to see more of those things percolate up through the year, but probably not until around the middle of the year.
And that means those things don't get deployed in actual devices until the end of the year. So I'm a little bit more muted in my outlook for semis this year. I think it's going to be a little bit trickier. There's just not a lot of it yet. There are a lot of interesting ideas out there; people are talking about things that could be, but I'm not seeing the products yet.
Ben Bajarin (31:03.813)
So I have a similar take. I do think this year will be a little bit better than last year, but I don't think it's the cycle everyone's hoping for. There are still some wrinkles to get ironed out in a number of these categories, still some, as you like to say, retrofitting of prior architectures
that's going into these client categories, anything edge. But I think that sets the table. This is my going thesis: that sets the table for 2025 and probably 2026, when things start to get really, really interesting, because around that time horizon, I do think
mixed reality is going to be a bigger thing, which is going to require a ton of silicon. I do think AI on device will start to feel a little more mature than it is, and we'll have some new silicon in the cloud, across the board, and on device that is re-architected with the AI era in mind, which I think is going to make a very big difference on performance, because all of that has to
Ben Bajarin (32:29.881)
enable use cases that take advantage of those things, which I think we'll start to see scratch the surface. But again, limited by these old compute architectures today, it's just not going to have the capabilities that it will. Once we start to see these things get re-architected and redesigned with this in mind, I think we'll start to see some of that S-curve swing here in terms of performance and...
demand on devices, in somewhat of that timeline. So that's my thesis: a three-year gradual build toward an environment where things start getting really interesting, probably late 2025 and then into 2026.
Jay Goldberg (33:15.734)
So if you think about it, ChatGPT, which really sparked all this latest excitement around AI, is 13, 14 months old at this point. And so if you had seen the ChatGPT announcement and the next day started designing a chip,
you'd just be getting first samples back from the fab right now. That's how long it takes. Look at Microsoft, who's obviously very connected and very plugged into AI and what's going on at OpenAI. They announced their AI accelerators two months ago; all of those were designed prior to ChatGPT. And the company's made this big bet on GPT. And it's a big bet.
Ben Bajarin (33:56.273)
way. Yeah.
Jay Goldberg (34:02.398)
Like, their silicon, we're having to wait for the next generation of their silicon. And they are arguably the most ahead of the curve on this, second after Nvidia, right? And so nobody else's hot AI consumer-facing accelerator is really going to be in silicon until the end of this year at the earliest; that's just how the design cycles work.
So that means the silicon's available at the end of this year, and it's not in a device until next year. So yeah, I think that timing lines up. It's a little disappointing, though, because it means 2024 is going to be more of a holding pattern.
Ben Bajarin (34:40.494)
Of course.
Right. Yep, yep, agreed. All right, well, I guess we'll see if we're right. But that doesn't change, I think, the way we analyze this year, which is really: who's making progress on some of those specific architectures? Who's starting to show some advancement in some of these more specific
use cases, right, in advancements of CPU, GPU, NPU, all of the things that are involved in compute and AI? I think that momentum is this gradual build, and this year is kind of the foundation bit. So as we look for those signposts this year, what are they saying about where we're going?
That's at least the lens I'm going to view this year through: what does this build toward that timeline? And assuming there's no slowdown, what are the impacts going to be over a two-to-three-year timeframe of perhaps accelerated or strong growth within the category?
Jay Goldberg (35:56.566)
Yeah, I think this year will be cyclically a good year for most of semis. The cycle is turning for most things. And then you have sort of underlying structural factors where more and more things have more and more semis in them. I think this year we're back to status quo. We're back to sort of neutral, steadily growing market. I'm going to be keeping an eye on software, user applications,
Ben Bajarin (36:03.225)
Right.
Jay Goldberg (36:26.802)
what normal people can do with AI, both at home and at work. I think those are the kinds of announcements that are going to be most significant in determining the longer-term trajectory for anything AI. This is an important year on the software front, because if we start to see really compelling things, that will drive a big, big uptick in numbers beyond. And then just going back to the point you made about VR, AR, XR: 100% agree. I saw a lot of
companies demoing all kinds of stuff for VR, all kinds of accessories, all kinds of software for VR. I think that had kind of disappeared from the show. Eight, nine years ago it was the hot thing, and then it sort of fizzled out and went away, and now it was back, in large part because of Apple. But I think it's broader than just the Vision Pro ecosystem, because we all know Vision Pro's gonna be pretty small for the first few years, but I do think people are starting to pay attention to it more.
VR content, VR accessories, that's starting to become real. We've gone through the hype cycle, we've gone through the complete disillusionment, and now we're starting to build real products, right? And it'll take a while, and a good showing from Vision Pro will help a lot, but it's not required. And on the AR front, this is something I talked about in my newsletter today: there are still a lot of technical problems with getting augmented reality glasses. Like, that's where we need to go.
Ben Bajarin (37:38.817)
Yep. Agreed.
Jay Goldberg (37:54.242)
glasses that work. And that's incredibly challenging, technically: battery power, video signal, video processing, all that stuff. But I think there were some important technical advancements this year, things that will dramatically improve power and performance. The device that everybody who wasn't at the show asked me most about was: hey, did you see those transparent TVs? Like,
LG and Samsung and a couple of other companies had these transparent LED TVs that were pretty interesting. They were using them in large-scale formats. I think it gets really interesting when they shrink those down. I don't know how long that's going to take, probably not soon, but that's exactly the kind of technology we need to get these glasses to work. And so we're inching forward on AR in ways that I didn't think we were going to see for a while, so it's encouraging.
Ben Bajarin (38:51.329)
Yeah, agreed. Well, I have my alarm set for a few days from now to get up at 4:30, prepared to order the Vision Pro at 5am on the dot, in the hopes that I get in the early cycle. So that's happening. And it will be very, very fun to talk about that on the channel, because there is some indication that it is a spec'd-up version of the M2,
which might mean that there's a lot of compute. So we will see, but that will be a fun category to talk about, amongst others. Interesting. Well, I'm excited to see how this year plays out. It's going to be an interesting one. There's gonna be a lot of people talking and a lot of people doing, and it'll be fun to be in the midst of all of that. So thanks, everybody, for listening. Have a wonderful rest of the week, and we will talk to you next week.
Jay Goldberg (39:50.51)
Thank you everybody for listening and tell your friends, click like and subscribe.