TechSurge: Deep Tech Podcast

Digital imaging is so ubiquitous today that it’s easy to forget how improbable it once was. In this episode of TechSurge, guest host Nic Brathwaite sits down with Dr. Eric Fossum, inventor of the CMOS active pixel image sensor, to unpack the breakthrough that made it possible to embed cameras into billions of devices and the deeper lessons behind it.

Eric explains how his work began not with consumer electronics, but with a NASA constraint: how to shrink a refrigerator-sized space camera into something small enough for spacecraft. The solution required a fundamental shift in architecture. By moving from CCD-based imaging to CMOS, where sensing and processing could happen on a single chip, he enabled a level of miniaturization and scalability that transformed cameras from standalone systems into embedded infrastructure.

But the conversation goes far beyond the invention itself. Nic and Eric explore what it takes to commercialize deep technology, from the early days of Photobit to its acquisition by Micron, and the critical role ecosystems play in turning breakthroughs into global platforms. They discuss why intellectual property is less about protection and more about leverage, and why even the most important inventions require manufacturing scale, capital, and partnerships to succeed.

The episode also looks forward. As AI systems increasingly rely on visual and physical data, sensors are shifting from tools designed for human perception to components optimized for machine intelligence. Eric highlights the challenges of pushing intelligence to the edge, the limitations of current architectures, and the growing importance of sensing technologies beyond traditional imaging—including molecular detection and new materials that go beyond silicon.

While much of today’s investment is concentrated in models and compute, this conversation makes the case that the next wave of innovation may come from deeper layers of the stack, where machines interact directly with the physical world. The future of AI may depend not just on how systems think, but on how they see, detect, and understand their environment.

If you enjoy this episode, please subscribe and leave us a review on your favorite podcast platform.

Sign up for our newsletter at techsurgepodcast.com for updates on upcoming TechSurge Live Summits and future Season 2 episodes.

Timestamps
  • 02:00 From CCD to CMOS: Rethinking How Images Are Captured
  • 06:45 The NASA Problem: Shrinking a Camera for Space
  • 12:30 From Refrigerator to Coffee Cup and Beyond
  • 19:30 From Lab to Market: Founding Photobit
  • 26:00 Scaling the Technology: Micron, Manufacturing, and Cost
  • 31:00 The Role of IP in Deep Tech: Leverage vs Protection
  • 39:30 From Human Vision to Machine Perception
  • 44:30 Edge AI vs Centralized Compute: Where Intelligence Lives
  • 49:30 Beyond Imaging: Molecular Sensing and New Frontiers
  • 53:30 What Comes Next: Materials, Sensors, and the Limits of Silicon

What is TechSurge: Deep Tech Podcast?

The TechSurge: Deep Tech VC Podcast explores the frontiers of emerging tech, geopolitics, and business, with conversations tailored for entrepreneurs, technologists, and investment professionals. Presented by Celesta Capital, and hosted by Founding Partners Nic Brathwaite, Michael Marks, and Sriram Viswanathan. Send feedback and show ideas to techsurge@celesta.vc.

Each discussion delves into the intersection of technology advancement, market dynamics, and the founder journey, offering insights into the vast opportunities and complex challenges ahead. Episode topics include AI, data center transformation, blockchain, cyber security, healthcare innovation, VC investment trends, tips for first-time founders, and more.

Tune in to hear directly from Silicon Valley leaders, daring new founders, and visionary thinkers. Past guests include investor Vinod Khosla, former PepsiCo CEO Indra Nooyi, the Global Head of McKinsey, and executive leaders from Microsoft, OpenAI, and other leading tech companies.

New episodes release every two weeks. Visit techsurgepodcast.com for more details and to sign up for our newsletter and other content!

TS004 - Eric Fossum
Eric: [00:00:00] So I'm Eric Fossum. I am a professor at Dartmouth Engineering School and I am also director of our PhD innovation program. And I am also, uh, Vice Provost for Entrepreneurship and Technology Transfer at Dartmouth.
Nic: So we can get started right now. All right. I am very pleased and honored to welcome Dr. Eric Fossum to the TechSurge podcast today. Eric, this has been a long time coming. We probably first, um, contacted each other maybe 20 years ago. I'm a huge fan, and I've spent a significant part of my career working on stuff related to your invention, so welcome to the show.
Eric: My [00:01:00] pleasure. My pleasure.
Nic: You know, you are the inventor of the CMOS active pixel image sensor, sometimes referred to by many people as the camera on a chip. You know, your invention transformed digital imaging and made it possible for cameras to become small, inexpensive, and embedded into billions of devices.
Billions of people use your technology every day, and your work has had just an extraordinary impact on how the world captures and shares information. But very few people have probably heard your name, and I hope we help to change that after this.
Eric: Well, at least they know what a CMOS image sensor is. That's good.
Nic: So why don't you start by telling our listeners a little bit about your [00:02:00] invention and, you know, to contextualize it, talk a little bit about, you know, the technology that existed at the time, CCD, and the difference between CMOS sensors and CCD.
Eric: Sure. So, um, yeah, you know, I wonder how long you'd like me to answer for, 'cause I could talk for 30 minutes on this topic.
Nic: Well, you know, um, let's answer it in a way that a wide range of people with different backgrounds would be able to understand.
Eric: Okay. I can give like a three or four minute analogy that might be helpful,
Nic: Yes.
Eric: which I do. So, the charge-coupled device, uh, was invented at Bell Labs around 1969, something like that. Uh, and, um, it quickly became a very good technology for capturing images, uh, a solid-state [00:03:00] device. So it wasn't a camera tube that was easy to break or something like that.
Uh, and the way it would work is, uh, maybe the best analogy is: imagine it's raining on a football field and you wanna measure the rain across the field. So you get everybody, you know, to stand out in the field with a bucket. And, uh, they collect the water for a while. And of course how much water they collect depends on how hard the rain is falling.
Uh, and then, uh, you say, okay, everybody, we're gonna do a bucket brigade, and you're going to pour your water into the bucket of the person next to you, et cetera, et cetera. And eventually, when you get to the edge of the field, there's gonna be another line of people with buckets, and they're gonna pass the water down to the corner of the field as fast as they can without spilling anything, hopefully.
And you just do that over and over again until every single bucket on the field has been emptied out and makes it to the corner of the field. And at the corner of the field there's [00:04:00] somebody with a measuring stick that measures how much water was collected. And, uh, basically you work backwards and figure out where that water originated from in the field.
And then you can make a map of how much water fell on different parts of the field. Uh, that's all great and good. It takes a lot of energy to do all that bucket brigade passing. Um, you could lose some water along the way just from spillage, which is not very good. Uh, you could have, um, well, in space, for example, you have radiation effects, which basically put holes in the bottom of the buckets as you're passing them along.
Uh, so that's also not good. Uh, so there are a lot of things that can lead to less-than-perfect images with the CCD. But generally speaking, uh, I always say there are like two things that are amazing to me in engineering. The first one is that 747s can actually fly. The [00:05:00] second one is that, uh, CCDs work pretty well despite all those limitations.
Uh, so that part is great. Uh, but in the CMOS image sensor, which technically is the CMOS active pixel image sensor with intra-pixel charge transfer, just for the purists out there who ask what it was that we invented, um, instead of having everybody do the water bucket brigade thing, uh, you give everybody, uh, their own measuring stick and a walkie-talkie or phone or something to call in the results.
And then you don't have to worry about passing that signal through so many stages, uh, without spillage. So it's a much simpler way of doing it. There's just, uh, a little more equipment required for every bucket that you're collecting water in, 'cause everybody needs their own measuring stick, everybody needs their own way of communicating the information out.
So the buckets are actually pixels on an image sensor chip, uh, collecting photons of light, which are like little [00:06:00] energy rainfall. Uh, and, uh, when that hits the silicon, those little itty-bitty bullets of energy, uh, actually can generate an electron-hole pair in a semiconductor. And we collect the electrons, so we just count how many electrons, or measure how many electrons, we collect in each pixel. So that's kind of connecting the analogy to the actual silicon device.
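Eric's bucket-brigade analogy maps neatly onto a toy simulation. This is a sketch for illustration only, not device physics: the function names, the grid, and the `keep` spillage factor are invented here, and real CCD charge-transfer efficiency is far better than the toy value used below.

```python
# Toy model of the two readout schemes in the analogy above.
# "keep" is the fraction of charge that survives each bucket-brigade
# transfer (the "spillage" Eric describes).

def ccd_readout(grid, keep=0.999):
    """Bucket brigade: each pixel's charge is shifted, one transfer at
    a time, down its column and then along the edge row to the corner.
    Every transfer loses a little charge to spillage."""
    out = {}
    for r, row in enumerate(grid):
        for c, charge in enumerate(row):
            transfers = r + c  # shifts needed to reach the corner (0, 0)
            out[(r, c)] = charge * keep ** transfers
    return out

def cmos_readout(grid):
    """Active pixel: each pixel has its own 'measuring stick', so the
    value is read in place with no charge transfers at all."""
    return {(r, c): charge
            for r, row in enumerate(grid)
            for c, charge in enumerate(row)}

# Uniform "rainfall" on a 1000x1000 sensor: the farthest CCD pixel
# endures 1998 transfers, while every CMOS pixel endures none.
grid = [[1.0] * 1000 for _ in range(1000)]
worst_ccd = ccd_readout(grid)[(999, 999)]    # 0.999**1998, noticeably < 1
worst_cmos = cmos_readout(grid)[(999, 999)]  # exactly 1.0
```

The point of the sketch is the one Eric makes next: the per-pixel scheme trades a little extra circuitry in every pixel for the elimination of thousands of lossy transfers.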
Nic: Hopefully this analogy helps even the layperson understand a little bit more about the complexities, um, or the differences between the way a CMOS image sensor captures the image and the way CCDs do. But what was the problem that you were trying to solve when you started working on this CMOS active pixel, um, approach?
Eric: Yep. So I was, uh, [00:07:00] working at the NASA Jet Propulsion Laboratory. And the Jet Propulsion Laboratory, which is managed by Caltech in Pasadena, California, uh, is charged with the robotic exploration of space. So all those spacecraft that have gone to the planets, the Voyagers, and Galileo and Cassini, um, and then the rovers on Mars, for example, those tend to be, uh, built by JPL.
Um, and they have a lot of expertise in that robotic space exploration area. Uh, so they were building these spacecraft to go visit the outer planets. Uh, the cameras based on CCD technology resulted in a very large, uh, massive, uh, camera system, so kind of the size of a large refrigerator, perhaps, ultimately.
And, uh, these cameras contain both the sensor part, the part that's sensitive to [00:08:00] light, and all the other electronics needed to control and digitize the image. And, uh, of course also there's the optics part. Um, but at some point, uh, the NASA administrator wanted to do more space missions that were faster, cheaper, better.
And, um, our job, when I was at JPL, um, and by the way, I came to JPL as an expert on charge-coupled devices, CCDs. Um, but they said, uh, wow, these resulting cameras are too big. So how do we miniaturize the cameras? And that was the problem that I was working on. Uh, how do we miniaturize 'em? And I had this, uh, idea: instead of having all these separate boards of electronics and everything, it's actually obvious to most electronics engineers, you want to integrate everything on a single chip if you can. But there's a problem, and the problem is that the recipe to [00:09:00] make a CCD chip is different than the recipe to make mainstream microelectronics. Mainstream microelectronics uses a recipe that's called complementary metal oxide semiconductor, or CMOS, as we say it.
Uh, and at that time, it was very hard to build a very high quality image sensor using the CMOS recipe. Uh, but if you wanted to integrate electronics onto a CCD, somehow you had to, like, build both. It was very hard. It was like trying to bake a pie and a cake, uh, in the same oven at the same time,
and have both come out satisfactory. Well, I don't know, I don't actually cook, so I don't really know if that's true, but that's my guess. So, uh, the real trick was how to make a good, high quality image capture [00:10:00] device using the CMOS recipe. And so what I did was, uh, use a technology that was actually created even before CCDs, just a couple years beforehand, uh, what is now called an active pixel device. And I combined that with the things that made CCD imaging work so well, but avoided that bucket brigade transfer of signal that has to be done thousands or tens of thousands of times. So I actually put a little tiny CCD into every single pixel, and that was the, uh, the idea that allowed us to get good image quality, uh, and yet still be able to use the, uh, CMOS recipe for building them.
And once we had that, then you could put all the other electronics onto that chip, like all the control electronics, [00:11:00] signals and timing, and the signal processing afterwards to take those electrons that are generated by light and generate a set of digital bits that represent the value of that light intensity, and then maybe even do some image processing on the chip to either improve the image quality or to compress the image, or things like that.
But now you could do it all on the same chip.
Nic: And you mentioned that when you started working on this, the camera systems for these, um, NASA missions were, you know, the size of a refrigerator. And I believe I read somewhere where you said the goal was to try to get the camera system down from that size to about the size of a teacup or something like that.
Was that the actual goal that was set or was that something you just came up with later on?[00:12:00]
Eric: Uh, no, that was an actual conversation, but not a teacup. It was actually a, uh, a coffee cup, because we were sitting around drinking coffee at the time, and my manager said, can't we make something this big, you know, the size of a coffee cup? Uh, and, uh, it seemed, um, like a very good goal at that time. Um, we didn't know how to solve it right away.
Um, but eventually, wow, we, we overachieved. Now we're the size of the sugar cube that goes into the coffee cup, or even smaller than that. So.
Nic: But, but, but part of the story here is that, you know, these were big audacious goals and one of the, one of the things you, you see often in the technology space is these big audacious goals being set by the military or, you know, the, the aerospace industries. Um, and then later on, of course, being adopted for commercial applications.
Um, [00:13:00] but when you, when you first got started, at what point did you realize that this thing could become much bigger than you thought? Not in size, but the application could become much bigger than what you initially started working on, which was space technology.
Eric: Right. Yeah. So, um, we weren't thinking about other applications as we were doing the initial development of the chip. But after, uh, some time, naturally your mind starts to say, oh, you know, wouldn't it be convenient if I had a little camera in the back of my car when I'm backing out of the driveway, so I could see what's going on behind me? Or, um, wouldn't it be nice if, you know, I think I left my briefcase on the kitchen table, if I could just, like, call up the house and the camera and be able to look and see if I actually left it there or not? Or, uh, you know, 20, 30 other things like that, that, um,
you realize, hey, this little [00:14:00] camera that we're developing would be really helpful for all those things. And maybe, uh, in fact, this NASA space technology would be really good here on planet Earth. Uh, so we were daydreaming for a while. Um, one of my colleagues, we were over at his house for dinner, uh, at that time and, uh, drinking some wine, and, uh, he was like, we should start a company and make these little tiny cameras.
Uh, and, uh, well, ha ha ha, you know, we don't know anything about running a company or doing that, but I'll drink to that anyway. Uh, and, uh, but as it turns out, um, well, there's a long story as to how we wound up actually getting, uh, our company launched. Uh, it was a simple story but took a while. And it's not your typical entrepreneur story of, oh, I've got this great new technology, lemme go out and find funding and do it.
Lemme go out and find funding and, and do it. Uh, it was really [00:15:00] more that, um, my, uh, wife at the time, Dr. Serena Ney, who was also working at JPL and also an image sensor technologist, um, she had just given birth to our second daughter. And, uh, at the same time that that was happening, I was getting these calls from various people saying, Hey, could you make a chip that does this for us?
Is that possible? And I'm like, I'm sorry. You know, we're a federal laboratory. We can't just like do projects for, for companies, for products. I was like, well, uh, Sabrina, you know, if we got a little computer at home and, uh, some simple CAD software. You know, you're pretty good at designing these things when can, uh, put together these chips for other people, uh, while you're home watching the babies, which of course was pretty presumptuous on my part.
Uh, but it turned out [00:16:00] she was very enthusiastic about that, thought that was a great idea. And, uh, and so we, uh, opened up, uh, doing business as a company, uh, called Photobit. Um, and started doing, uh, some custom design work for, uh, people. Um, now there's another part of the story, which is really the luck part. And anybody who's an entrepreneur knows that you can be the greatest engineer on the planet, but if luck is not on your side, it's not gonna go anywhere.
And, uh, also, it's hard for engineers to understand that you can't control luck, really. I mean, you can't schedule that.
Nic: I wanna talk a little bit about the Photobit experience and the commercialization, but before we go there, you know, I wanna understand a little bit about how we go from a refrigerator-size camera [00:17:00] to the idea of a coffee-cup-size camera. And, you know, there's a lot of innovation that you need in order to do that.
But then to go from a coffee-cup-size camera to a camera the size of a fingernail, usually the amount of work it takes to get to that next stage is often a lot more than what it took to get from a refrigerator to a coffee cup. So help me understand, what was some of the major innovation that happened to take the camera from, you know, a coffee cup to a fingernail?
Eric: Sure. Um, actually, once we started using the CMOS recipe, uh, we started at a pretty large technology node size, and that kind of controls the size of the [00:18:00] pixels and the size of the chip.
Uh, but because we were using mainstream CMOS, we got kind of a free ride on, uh, Moore's Law. Now, if you're, uh, doing a CCD recipe, there are only a few people in the world that manage the CCD recipe, and so the shrinkage rate for that technology was pretty slow. But if you're riding the coattails of CMOS, you don't have to do anything, really.
And you get a free ride to the next technology node by foundries who have that capability. The shrink rate is pretty fast. Um, but also, as it turns out, the coffee cup was, uh, very easy to achieve. In fact, even the very first camera we made was much smaller than a coffee cup. Um, so, uh, yeah, it was a good goal to sell to people.
But actually we blew past [00:19:00] that goal right out of the gate. Um, but yes, you are absolutely right, continued shrinkage, uh, is, uh, a challenge. Performance is a bigger problem, uh, maintaining the performance as you shrink. Uh, and that's where clever circuit design and clever pixel design, uh, come into play. Um, but as our company grew, uh, yeah, I know you're trying to, like, steer me back to the technology development side, but these things kind of went hand in hand a little bit.
Nic: No, but I do want to transition to the company side, because, you know, you started off, I would say, more as a researcher, kind of, you know, at JPL. Um, not trying to make it look like you weren't developing products, but you know, there was a lot of research involved. It was research intensive.
Then you go into starting Photobit, [00:20:00] which is now all about commercializing technologies, and then from there you got acquired by Micron. Um, and that became the, you know, the foundation for what became known as Micron Imaging, which then got spun out from Micron as Aptina. I became the first CEO of Aptina, and that's how we initially became connected.
But walk me through the experience of going from, you know, primarily this research-focused environment to more of a commercialization-of-technology environment at Photobit, and then being acquired by a bigger company. And, um, walk me through those experiences, especially for some of our audience who are entrepreneurs who will be going through similar types of transitions in their own careers.
Eric: Yeah. So, um, there are actually a lot of different [00:21:00] angles to the question that you just asked me. Um, okay, because, uh, before I went to JPL, I was a professor at Columbia, and part of a professor's job, besides working with a team of students that are learning along the way, uh, is to raise funding. So you're always out trying to figure out how to get everybody paid, right?
So it's not that far removed from a small startup company, which is doing the same thing. And in fact, it turns out that's also true inside the NASA system, that you have to figure out, uh, how to get your share of that NASA pot of gold, if you can. Uh, which is a very, very competitive process. Uh, so the financial part is actually very similar.
Uh, the technical part also, uh, was not as much of a transition as you might think for us, because we had people calling us, asking us if [00:22:00] we could do a custom chip for them to solve a particular problem. One example was, uh, a group of Israeli, uh, missile designers, I guess. I don't know what they were, uh, but they, um, they had a different idea.
They wanted to build a missile for the human body, basically, that you could swallow. You could swallow this, uh, pill, and as it goes down through your intestines, it would take pictures along the way. And when they heard about our technology, uh, they were very excited and wanted to know if we could build them a chip for their little tiny pill camera, uh, which we did.
So in a way, uh, engineers, we like to solve problems, and there was a, uh, a very relevant problem, and we had the tools to solve it. Uh, it was laid out for us. So it wasn't like we had to imagine what the [00:23:00] product would be. They already knew what the product was gonna be. We just had to figure out how to make it.
Uh, and, uh, it wasn't as hard as you might think. Uh, or, I had a great team, lemme put it that way. Uh, so, uh, we were able to recruit some really, uh, very, very talented people at Photobit.
Nic: And I met some of them. Some of them ended up working with me at Aptina. People like Sandor Barna, for example, who I think is just an incredible technical talent.
Eric: Yes, yes. Uh, Roger Panicacci is another one.
Nic: Yes, that's right. Roger. Roger as well.
Eric: So, um, so really, uh, as long as you can lay out what the problem is and define the problem, engineers are very good at finding ways to solve that problem. Uh, really, so that part of the transition wasn't that difficult either. The more difficult part was when we started to try to imagine that we wanna make products that other people would use, like for web cameras and that sort of thing. [00:24:00] Then having to, uh, sort out, um, what would be normal MBA stuff, like, oh, well, how big is the market? What should we design? How do you do all this market research on what people need? And that was something we were not well versed in. And that was the more difficult challenge. Uh, other problems were like hiring people outside engineering, like,
we need to hire a, uh, a sales and marketing person. And you know, the thing about hiring a sales and marketing person, as I'm sure you know really well, is that they're very good at presenting themselves. And figuring out who's the person that actually closes deals, which turns out to be the part that counts, uh, for business development,
uh, that's a lot harder thing to sort out, because, uh, engineers aren't used to looking and poking that deeply at those sorts of talents. Engineering, yes, go to the [00:25:00] whiteboard and solve this equation, that's easy enough. But, uh, go close a business deal? They're good talkers, and you don't know when you've hired a good one or not.
Um, so those challenges were actually a lot harder in terms of growing the company and making it successful. And we did finally find a great, uh, business development person. Um, uh, did you run into Sean Maloney, for example? No? He was actually quite adept at, uh, closing deals.
Nic: But I would imagine that when the company got acquired by Micron, with Micron's process capability and expertise and their, um, maniacal focus on driving cost out, being in the memory business, that must have provided a significant boost to the continued development of the CMOS image sensor, as you [00:26:00] know, in terms of its ubiquitous application potential.
Eric: Absolutely. Absolutely. I mean, it wasn't by accident that we were acquired by Micron. I mean, we went out and sought them as a potential acquirer, because we thought that it was a really good match, uh, for all the reasons you just mentioned: that they have this, uh, great, uh, process technology knowledge, deep, uh, deep knowledge, and also know how to do things at scale at the cost that is required for that market.
Uh, so they were, uh, quite good at that. Um, well, it was a happy day when we sold the company, but it also, uh, felt like a really good, uh, plan at the time. So, um, I mean, there are always, uh, things that happen when you sell the company. I mean, for me, uh, it was very difficult because, um, I mean, it's my personality, but it's really hard going from being [00:27:00] the boss to being the worker,
and having somebody else tell you what you ought to do, especially if they don't know the market as well as you do, and, uh, don't know the technology as well as you do. So that was a difficult adjustment for me. And, uh, about a year after the acquisition, I had to exit stage left. That wasn't my plan; it was too frustrating.
Nic: Yeah, but the truth is that, you know, around that time is when, you know, we at Flextronics had decided to go invest in embedded cameras and the business around embedded cameras, because we felt that that was gonna be the next big wave in cell phones. And the truth is that while the technology had great potential for producing a small embedded camera,
the quality of the images coming out of them was not as good as one would've expected if you were using, you know, a digital still camera, not even to mention an SLR. So, um, there was a lot of work that still needed to be [00:28:00] done. In your mind, and this is a two-part question: one, what would you consider to be the biggest breakthrough that caused us to get to the kind of quality we're seeing now in these tiny cameras? Um, and secondly, is there a part of this whole CMOS image sensor story, or the creation of it, that the world usually misses? Is there something that we don't talk about that we should be talking about?
Eric: Hmm. Uh, well, one thing: uh, we're talking about me, but really, the development to where that technology moved was because of thousands of other engineers around the world that, uh, made that happen. Because, uh, the holy grail for that technology was smartphone applications, and everybody knew it was gonna be a huge, huge market.
A lot of companies jumped [00:29:00] onto the bandwagon to develop it. They had much deeper pockets than we did at Photobit. That was one of the reasons why we decided to get, uh, acquired by Micron, 'cause we knew we'd just never catch up, um, at the rate that, uh, Samsung or Sony was able to put money into that, uh, technology.
So, uh, so really, uh, part of the reason why that technology really developed was because the, um, companies that were very experienced with, uh, CCDs, like Sony, um, were very good at solving the quality problems. Maybe, um, not that Micron, uh, didn't, or couldn't; Micron actually was also very good at solving other problems.
But, uh, the bench depth at Sony was far deeper in people that had solved that problem. So really, especially for deep tech, the number of people with [00:30:00] that specialized, uh, knowledge that you can apply to the problem, uh, not only at the front end of the R&D, but also in manufacturing, um, is really critical to, um, eventually, uh, overtaking others in the marketplace, which is what Sony did and what Samsung did.
Nic: Yeah, but I'm glad you mentioned that, because one of the things that sometimes people miss in the deep tech space is the fact that oftentimes, when you're driving the development and implementation of new technologies or new approaches, it takes an ecosystem to be able to drive success. And oftentimes people don't understand the need to develop a strong ecosystem in helping to drive the successful implementation or adoption of new technological approaches.
And so just because you have the best technology doesn't [00:31:00] necessarily mean you'll win; it is the technology that has the ecosystem behind it that often wins. And you mentioned the role of, you know, companies like Samsung and Sony, but also the role of companies like Apple and others as well that helped to make this successful. But I want to piggyback on that a little bit and talk about, you know, the value of intellectual property, because in deep tech, when you invest in businesses, you're investing in businesses that are often built on unique intellectual property. Um, and you are a prolific inventor. My research suggests you have over 175 US patents
Eric: A little bit more, but yes.
Nic: more.
Yeah. So why don't you talk a little bit about the value of intellectual property in terms of [00:32:00] protecting, um, the products that you develop, products such as these, as it relates to deep tech in particular?
Eric: Yeah. So, um, well, first of all, um, IP has value in a couple of places. Um, it does not work well as a fence, that's for sure.
Nic: right.
Eric: Uh, if a company really wants to, uh, eat your lunch, um, and they can do that successfully, they will. Um, well, they could buy your company, or they could, uh, compete and almost drive you out of business, uh, or, um, eventually pay you something.
But by that time, they may have gained so much in sales and revenue, it's, uh, a relatively small drop in the bucket for them. So, uh, but IP is very important from, uh, the fact that later you [00:33:00] can, uh, um, do reciprocal licensing arrangements, uh, with companies. Because chances are pretty good, especially in the electronics area, that, uh, your core technology, the deep tech, uh, you own that IP.
And it's pretty clear at some point that you own it, even though it doesn't stop somebody from doing something, uh, immediately. Um, but you're probably using a lot of other IP that you don't own, that you did not invent. And uh, let's say you're going up against a company like Sony. I mean, you, uh, you've got your little basket of intellectual property that you own, and Sony has a great big.
Tractor trailer truck full of ip. Uh, and if they want, they can turn around and, uh, attack you on the basis of that large inventory of IP that they have. Um, so it's very as asymmetrical [00:34:00] in that regard. And, uh, as long as you've got something to trade, it's good. If you have nothing to trade, then I think you're dead.
Uh, and I think the venture, uh, business, um, understands this too. You know, it's very hard to imagine getting, uh, someone to invest in your company if you've got an empty basket of ip, that there's just absolutely nothing that you can clinging to, uh, to create even a small competitive advantage. Even if it's just temporary, um, you're just pretty much dead in the water.
It's hard to get investment, so you're, you're dead on capital. You're dead on. Uh, long term, just no place to hide.
Nic: See, I think of intellectual property as added value that you're creating, and how you use it depends on the situation. You mentioned the challenges associated [00:35:00] with sometimes dealing with bigger companies. What's valuable there is when you have essential IP, because oftentimes people focus too much on the number of patents, the size of the IP portfolio, but what matters most is actually the quality of the IP portfolio. I had an experience with that when I was at Aina, for example, where we were able to use the quality of our IP portfolio to extract significant benefits in partnerships with other companies.
We had around 2,500 patents and we were going up against companies that had tens of thousands, and we were able to extract significant value in those relationships as a result of the quality of the patent portfolio. But let's [00:36:00] talk more about when you started Photobit. Did any of the IP from JPL get transferred into Photobit, or did you have to start from scratch?
Eric: Yeah. So, up to the time of Photobit, Caltech, which manages the Jet Propulsion Laboratory, would not license technology to the inventors if they worked at JPL, because they felt the government might see that as some sort of conflict of interest, which in retrospect is kind of crazy.
So Photobit was actually the very first company created by JPL engineers that got to license their own technology back from Caltech. It was Sabrina Kemeny's negotiation with Larry Gilbert at Caltech that made that happen. Larry was a [00:37:00] very visionary thinker at Caltech.
He realized that the people with a strong burning desire to make their baby successful are the inventors of the technology; no one else will have that same drive and interest in making sure it succeeds. He licensed Photobit the IP portfolio we had developed in the active pixel sensor technology.
Now, what happened next was interesting. After we were acquired by Micron, Micron didn't want to pay any royalties to Caltech, but what they said was, look, why don't we just give you all the IP back, and you give us a license to practice that technology, [00:38:00] a royalty-free, fully paid-up, perpetual license.
And Caltech accepted that offer. That had nothing to do with Sabrina or me negotiating; that was between Caltech and Micron. But Caltech wound up with all that IP back inside their house. And then later they used that, let's call it quality IP. In fact, they used six of my patents to extract value from the electronic camera makers and sensor makers that were out there.
It happened long after I was out of Micron, but it happened years later. So it was an interesting process to see Caltech successfully deploy that portfolio [00:39:00] against larger giants like Canon and Sony and, as you put it, extract value from it.
Nic: CMOS sensors, you know, are evolving from devices that were initially developed and designed for capturing images into devices that are now becoming core components in improving the intelligence of systems or machines. Is that something you had envisioned at any point?
That's a two-part question. The second part is: how should people who are in this business think about the requirements of the sensors? How are those requirements changing?[00:40:00]
Eric: Nice question. So, when I was working on my PhD at Yale, the title of my thesis was basically smart image sensors, which today someone would describe as sensors with sort of edge AI built into them. The technology readiness at that time was just way too early. It was an idea, a good idea, but far, far before its time. And now it's coming back, as people think about edge AI and applying it to sensors. The idea, of course, is for machines or cars, and I was thinking about cars at the time, to [00:41:00] be able to be intelligent, so they can make decisions without having to go to the main brain, wherever that main brain is. They can do some things on the fly that save time in decision-making, whether it's averting an accident or picking a part off an assembly line, that sort of thing.
So it's not a new idea; I think now it's an idea whose time has sort of come. The difficulty these days is that the kind of processing required to do any kind of image comprehension requires comparing pixels across relatively long distances on the image plane. You might have a 10-megapixel sensor, and you want to see whether something is a tree. You've got to look at a bunch of pixels spread out across the chip, and they have to be interconnected somehow in order to cognitively recognize that it's a tree. That's not the best example, [00:42:00] perhaps, but that's the idea. And that is not well suited to implementation on the sensor chip itself unless you go to three-dimensional integration, which is possible now because of further advancements in chip stacking. It's still a difficult problem, though, how to do this kind of long-distance connectivity among pixels to do even the pre-cognitive recognition of objects. So I keep coming back to this problem.
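A rough way to see why recognizing a large object requires long-range connectivity: in convolutional image processing (one plausible way an on-chip "smart sensor" might work; the layer stack below is a hypothetical example, not anything discussed in the episode), each output value only "sees" a limited receptive field of the input, and that field grows slowly with depth:

```python
def receptive_field(layers):
    """Receptive field (in input pixels, along one axis) of stacked conv layers.

    `layers` is a list of (kernel_size, stride) pairs. Each layer widens the
    field by (kernel_size - 1) times the cumulative stride ("jump") so far.
    """
    rf, jump = 1, 1
    for kernel, stride in layers:
        rf += (kernel - 1) * jump
        jump *= stride
    return rf

# Hypothetical stack: five 3x3 convolutions, each with stride 2.
stack = [(3, 2)] * 5
print(receptive_field(stack))  # 63: one output value sees a 63x63-pixel patch
```

An object spanning thousands of pixels on a 10-megapixel sensor needs a far wider receptive field than a few such layers provide, which is one way to frame the interconnect and stacking problem Eric describes.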
The technology's there, but you always have to ask yourself: is this the best way to approach it? And I know from my very early work on focal-plane image processing, which, to be honest, I was doing at Columbia before I even went to JPL, [00:43:00] that it's very difficult. It takes a lot of work, and life is a lot easier if you can just use a faster CPU and do it digitally.
So can a couple of groups make progress over the course of, say, five years that equals 10,000 programmers around the world speeding up image processing, so you don't have to do it on the focal plane, you can do it somewhere else? I think that's the thrust of the problem: digital electronics are much easier than analog electronics, computing is much easier than hardwired digital electronics, which is why FPGAs work so well, and software is really easy; practically anybody can code. [00:44:00] So you can have millions of people working on a coding problem, and they're going to make progress a lot faster than a team of five people trying to do it in a research laboratory.
So in the end, it's that kind of numbers game. In my more recent years, I've focused on just improving the image quality that comes out of the image sensor, and I have not really found the killer application for some sort of edge AI thing, even though I keep returning to that problem every few years.
At Dartmouth, in my research lab, for example, we worked on sensors that could count individual photons of light, one at a time. So what's that good for? Well, if you're imaging in very dark conditions, whether it's a dark bar, or it's nighttime and you're driving, or it's astronomy [00:45:00] or biotechnology, you're usually dealing with very weak light signals.
And if you can detect every photon that comes into the chip, you can't do any better detection than that, than to count every single photon. So I've been working on photon-counting image sensors, for example, as my personal frontier. We developed that technology at Dartmouth, we spun it out as a company, and so far, so good.
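The advantage of photon counting at low light can be illustrated with a small simulation (not from the episode; the photon and noise levels are made up for illustration). Photon arrivals are Poisson-distributed, so even a perfect detector faces shot noise with SNR equal to the square root of the mean photon count; a conventional pixel adds read noise on top, which dominates at very weak signals:

```python
import math
import random

def poisson(lam, rng):
    """Sample a Poisson variate via Knuth's method (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def pixel_snrs(mean_photons, read_noise_e, trials=40000, seed=42):
    """SNR of an ideal photon-counting pixel vs. a pixel with Gaussian read noise."""
    rng = random.Random(seed)
    counting, conventional = [], []
    for _ in range(trials):
        n = poisson(mean_photons, rng)          # photons actually arriving
        counting.append(n)                      # every photon counted exactly
        conventional.append(n + rng.gauss(0.0, read_noise_e))  # plus read noise

    def snr(samples):
        mu = sum(samples) / len(samples)
        var = sum((x - mu) ** 2 for x in samples) / len(samples)
        return mu / math.sqrt(var)

    return snr(counting), snr(conventional)

snr_count, snr_conv = pixel_snrs(mean_photons=2.0, read_noise_e=2.0)
# snr_count approaches the shot-noise limit sqrt(2); snr_conv is pulled
# down toward 2 / sqrt(2 + 4) by the added read-noise variance.
```

At an average of only two photons per pixel, the counting readout sits at the physical limit, while the read-noise-limited pixel loses roughly half its SNR, which is why photon counting matters most in exactly the dark-scene cases Eric lists.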
Nic: I remember reading about that. But you know, when people talk about the role of sensors in AI, it's a more complex problem than might first be perceived. I think we all, at some point, knew or suspected that images were going to be used more for machine perception than human perception over time.
[00:46:00] But I'm not sure we understood exactly what that meant. And now, with AI, we have a situation where the quote-unquote viewer of the image is an AI model, not a human eye. How do we deal with those changing requirements from a technical perspective?
That's one part of the question. The second part is: while there's a lot of focus on GPUs and large models and all of those things, don't you think there should be more attention placed on the sensors and the requirements of these sensors in this new type of environment?
Eric: Yes, that is just what we were talking about, really. And you know, the interesting [00:47:00] thing about human eyes and that sort of thing is that we know how they work at the chemical level and electrical level and neuron level, to some degree, but how your brain recognizes an image is not clear, and it's not clear that it's the same as the way AI is doing it today. Even with AI, a lot of people cannot tell you deterministically exactly what the model is doing to come up with its answers. You can train it, and we understand the perception theory and all that as to what's going on, but what is it really responding to when you train it to, say, avoid a dog in the street? It's looking at all kinds of things, and it's a massive model. You're probably not going to be able to put all that onto one chip, and then that chip will want to be able to recognize dogs crossing in front of a car.
It's [00:48:00] got to be responsive to all kinds of other things as well. And this is why you have connections to an AI machine through your phone, but the AI is not really in your phone, per se, right? It's somewhere else, in a much bigger server system somewhere.
And then, thinking about how we're going to put this together: if we knew exactly what information was required from the sensor to speed up AI recognition, we'd be golden, and we could implement that on the sensor. But at the moment it's a little bit unclear, especially for a general-purpose sensor for a self-driving car, for example.
Nic: You know, one of the challenges we have in our industry, oftentimes on the investment side, is that people tend to follow the [00:49:00] puck, to put it that way. But there are other areas that need as much attention as some of the areas being looked at.
And I think ultimately sensors are one of the devices we need to spend more time looking at, trying to understand how designs can be optimized to support this whole AI movement. Because at the end of the day, the sensors will become a very important and sophisticated piece of this whole intelligence system we are trying to put together.
So, I want to take it one level away from image sensors: what are your thoughts around molecular sensing?
Eric: Yeah, we're at a very early part of [00:50:00] that story right now. There are so many things to do there, so many different ways you can recognize things and make very compact sensor systems. If I were a young person right now, I might be aiming a little bit more in that direction, or the bioelectronics area, the merger of electronics and biological systems.
I mean, we still can't build an electronic nose that's as good as a dog's sensor system, right? But doing it to recognize diseases, to recognize signature fingerprints of things in the defense space, or just to help us with [00:51:00] human health: the sky's the limit right now, I think, as far as that goes.
Nic: Yeah. So, obviously AI is probably the most significant innovation of our lifetime, and it is catalyzing a complete re-architecture of everything. But as we do so, we need to also look at other technologies that are just as important to the proliferation of AI and applications such as medicine, robotics, climate, industry, space, all of these things.
And I think more advanced sensors, whether on the imaging side or the molecular side, are areas we all need to be focused on. Are you seeing more and more work at universities in terms of research and scientific discovery [00:52:00] efforts in the molecular sensing space?
Eric: I would say it could be better funded at this time. We have done some work; I'm actually leading an effort at Dartmouth on biosensors for warfighters, for example. It's very important to monitor the health of warfighters under extreme conditions. So yes, I'm well aware of at least a tiny sliver of that potential, and we do see some of it, but I think the government is still waking up in terms of funding academia to do this work.
I think there's also incredible potential if private industry, recognizing the long-term [00:53:00] potential of some of this technology, would partner with academic institutions. Figuring out how to create strong academic-industry partnerships that do meaningful research is an untapped potential in this country right now.
Nic: So I have one more question, and it's kind of a general question. As sensing technologies, including CMOS image sensors, continue to evolve, what do you see as some of the most interesting frontiers for sensing technology today? Are there technologies or applications that deserve more attention than they're getting?
Eric: For imaging, in the imaging space, yeah, I think other wavelengths are [00:54:00] very interesting. We don't think about it because humans are really only sensitive in the visible wavelengths, but the universe has lots of signals in other wavelengths that we can't see. And the issue there is that you can't use silicon, which is like God's gift to technology; you can't use silicon for detecting some of those wavelengths. So you get into a materials development problem, which is a long, tedious process of developing materials that are sensitive in these other wavelengths. But that's a big frontier still. Of course, we do it for infrared. And take molecular sensing, for example: how do you measure whether a molecule is there or not? Well, you might want to use something like Raman spectroscopy, and that's in some wavelength range [00:55:00] that is very inconvenient for silicon. So again, you've got to go into these other material spaces. I think imaging in those areas has a lot of potential, but a lot of work to do.
It's not low-hanging fruit, that's for sure. Low light, I think we're in good shape on low-light imaging. High-dynamic-range imaging is good. High-speed imaging is also pretty well covered. The sensors going into smartphones today are of amazing quality; they do all kinds of things we didn't think we could do with that kind of sensor, especially coupled to AI that you're connected to through your phone to [00:56:00] work on the image.
I hate to be the person who says everything that can be invented has been invented, because that's never true. But CMOS image sensors today are a pretty mature technology, so there's no low-hanging fruit; it's all very expensive to try new things. It would accelerate the process if more research-minded people, academic institutions for example, could get access to advanced processes that help you realize some of these desired performance characteristics.
But that's very hard for the big foundries to accommodate, the pesky academics who want to change this and that and only want five devices when they're done. That's a very hard part of the process, too. In 1993, when we were doing it at JPL, we could make image sensors in a low-cost, high-rate-of-discovery [00:57:00] regime. It's very hard to do that today. It costs a million dollars to do a mask set, it takes a year to turn around devices, and generally speaking, you can't get anybody to talk to you about doing it anyway. So it really stifles research and creativity, unfortunately.
Nic: Well, you made a point that resonates with me a lot, and I'm paraphrasing what you said, so if I'm wrong, please correct me. Many of our existing technologies are pretty mature, and it's not just Moore's Law that's slowing down. We are continuing to squeeze a lot out of existing technologies, but it seems to me there needs to be a lot more money invested, at [00:58:00] the university level and beyond, in materials science, because the next big breakthroughs are likely to be heavily influenced by new discoveries in materials science.
Do you agree or disagree with that?
Eric: I agree with it completely. It's just that we get so spoiled with silicon. Anytime we can find a way to solve a problem using silicon instead of an exotic technology, everybody immediately goes to the silicon solution if they can, because,
Nic: Well, you know, the ecosystem is set up to utilize silicon, and it's not set up for the others.
Eric: Yes.
Nic: So, Eric, I want to thank you very much for joining us today. This has been a long time coming. I am a huge fan of yours, and I have benefited in my career from your invention, and hopefully I helped contribute to some of its adoption as well.
Thank you for joining us. We'll link your [00:59:00] Dartmouth profile and your work in the episode description so people will have access to not just what you've done in the past, but what you're currently working on.
Nic: Again, I appreciate your time.
Eric: Well, it was my great pleasure to spend some time with you, Nic, so thanks so much for inviting me on.
Nic: It's a pleasure.