Quantum hardware iteration is gated by the simple fact that real QPUs are expensive, scarce, and noisy in ways that generic simulators don't capture. In this episode, Izhar Medalsy, co-founder and CEO of Quantum Elements, explains why the industry is still building "wooden models in the wind tunnel" — and how hardware-faithful digital twins, scaled to a distance-7 rotated surface code with 97 physical qubits on AWS HPC, could give pulse engineers, circuit designers, and QEC decoder teams a faster, cheaper place to iterate.
Your host, Sebastian Hassinger, interviews brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era. We will cover topics in quantum computing, networking and sensing, focusing on hardware, algorithms and general theory. The show aims for accessibility - Sebastian is not a physicist - and we'll try to provide context for the terminology and glimpses at the fascinating history of this new field as it evolves in real time.
Sebastian Hassinger (00:01.376)
Izhar, thank you very much for joining me. It's good to see you. Can you start by providing a little bit of context? You are not a physicist by training, so tell us a little bit about where your training lies and how it brought you to where you are in quantum today.
Izhar (00:21.612)
Yeah, hi, Sebastian. Great to be here. We've waited way too long to have this podcast together, so I'm excited to be able to do it. My journey started with really gravitating to where my interests lie. My bachelor's degree was in neuroscience, and then I felt that I wanted to go maybe a bit deeper.
Sebastian Hassinger (00:28.428)
Agree.
Izhar (00:50.082)
So I switched to physical chemistry and did my PhD at the intersection between quantum and nanoelectronics. We were taking quantum dots, connecting them to very stable proteins that can create those two-dimensional lattices, and then, using atomic force microscopy, I was actually able to realize
Sebastian Hassinger (01:03.706)
Hmm.
Izhar (01:18.114)
nanoscale computational systems based on ternary rather than binary state logic. So it was pretty interesting. And the idea back then was that ternary logic is mathematically more efficient than binary. So yeah, it was quantum dots, quantum charge effects, et cetera, but not quantum computing.
Sebastian Hassinger (01:26.006)
Interesting. Yeah.
Izhar (01:47.598)
I did my postdoc at ETH, and then once again said, I feel like I need to reinvent myself, and switched to industry and took it from there.
Sebastian Hassinger (01:59.468)
And initially, I think you were in 3D printing. That was your first entrepreneurial venture, from what you told me, right?
Izhar (02:07.106)
Yeah, the first entrepreneurial venture was 3D printing, but what led me towards that was my 15 years of experience with atomic force microscopy. I was developing atomic force microscopy technologies, you know, hardcore scientific instrumentation. And I took my initial design as a...
Sebastian Hassinger (02:16.517)
Mmm, right.
Izhar (02:29.09)
as a scientist, which was then adopted by Bruker. And I kind of switched to, call it the dark side, of product management and bizdev, and took this product to the market. So it was quite a nice journey, about two years from ideation all the way to launching a fully integrated atomic force microscope connected to a confocal microscope. It was quite something to see firsthand how you take an idea and launch it into a product that has quite a high price tag.
Sebastian Hassinger (02:36.462)
Ha ha ha ha.
Izhar (02:58.894)
But what it taught me is this intersection between control electronics, the mechanics of moving things, control software, and materials. So when I got the opportunity to build a 3D printing company, it kind of all came together. Obviously, I had to reinvent myself, because I didn't know anything about 3D printers, only that they look cool and they make parts. But nevertheless, we made the fastest
printer on the market. I was the chief product officer, then switched to CTO. It was quite a successful company for about seven years. And yeah, that kind of led me to rethink my view of another industry, the quantum industry, which led me to where I am today.
Sebastian Hassinger (03:59.29)
That's really cool. I mean, it puts you in a very small group of people in the quantum computing community who have a scientific background, PhD and postdoc experience, and entrepreneurial and operator experience as a chief product officer, bringing something all the way to maturity in the market. So that's a very rarefied set of skills, I would say, which is great. It's a huge advantage. So, okay.
Sebastian Hassinger (04:28.908)
You've co-founded Quantum Elements with Amir Yakobi, who's a condensed matter physicist, and with Daniel Adar, a theorist whom we've had on the podcast, and who's very well regarded in particular for noise modeling and its effect on coherence. Did the idea come out of the three of you all together, or was there a sequencing of discussions with Amir and with Daniel
Izhar (04:47.33)
Yeah.
Sebastian Hassinger (04:56.782)
that sort of brought you to the thesis that's represented by Quantum Elements?
Izhar (05:03.214)
So I'm going to say something that probably will resonate with a lot of people who have built companies, and it is the fact that if you want to have a successful company, there are so many things that can go wrong. So choosing the people you will go on this journey with is probably the most important decision you're going to make.

I met Amir more than a decade ago. We actually worked on NV-center detection systems for atomic force microscopes, and we stayed very close friends, and we felt that, you know, together we could do something in the quantum space. I'm also very good friends with Daniel. So the three of us kind of started the ideation process a few years back,

where we looked at the industry, and I came with my background, Daniel with three decades of quantum scientific discovery, and Amir on the applicative side and as an experimentalist. I also have a background in aviation. And for me, it was really amazing to see such a cutting-edge technology,

like quantum, you know, an industry that ultimately can really transform how society thinks about big problems and addresses them, really lacking fundamental simulation and development tools. So the fact that we now have this amazing ability, which Daniel really developed over the years, to describe a

quantum system from first principles in a very accurate way, not only how those qubits behave but also how the environment affects them, and also to solve those systems at a large scale, allowed the three of us to have this really very unique canvas, to be able to say: okay, we think there is an opportunity here to build a company that addresses this key need, to
Sebastian Hassinger (07:07.374)
Mm-hmm.
Izhar (07:27.243)
be able to understand how those systems behave at scale, and to provide the industry, for every user persona in the stack, we think, with the ability to move faster, with a much deeper understanding of the governing principles behind the things that go well, but also go wrong.
Sebastian Hassinger (07:51.66)
Interesting. Given Daniel's work in understanding noise, I've heard you and him refer to this: at the core of the value proposition of Quantum Elements is building digital twins of physical qubits. Is that right?
Izhar (08:09.269)
It is right. And the reason we're using this term, digital twin, is because we are looking at the system as a whole. You know, when you have qubits, you have crosstalk; they influence each other, they don't work in isolation. When you work with a two-level system, you have leakage: sometimes you will excite the third energy level. So you need to take all of those things into account, and you need to take them into account
in the time domain, meaning how those effects evolve over time, because quantum circuits start at a certain point and then, through the gates, evolve to the point where you measure your end results. If you're not able to see the evolution of everything that is going on in the system, you're effectively missing out on a lot of information that is critical to understanding what is going on in the system.
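The leakage Izhar describes can be seen in a minimal model. Here is a sketch with purely illustrative numbers (not any particular device): treat the qubit as the lowest two levels of a three-level anharmonic system, drive the 0-1 transition, and watch population spill into the unwanted third level.

```python
import numpy as np

# Three-level model of a "two-level" qubit: a resonant drive on the 0-1
# transition also couples 1-2 (with a sqrt(2) matrix element), detuned
# only by the anharmonicity alpha. All numbers are illustrative.
omega = 0.2   # drive strength (rad/ns)
alpha = -1.0  # anharmonicity of the third level (rad/ns)
H = np.array([
    [0.0,       omega / 2,              0.0],
    [omega / 2, 0.0,                    np.sqrt(2) * omega / 2],
    [0.0,       np.sqrt(2) * omega / 2, alpha],
], dtype=complex)

# Propagate |0> under U = exp(-i H t) via eigendecomposition (H is Hermitian).
t = np.pi / omega  # roughly the duration of a pi-pulse on the 0-1 transition
vals, vecs = np.linalg.eigh(H)
U = vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T
psi = U @ np.array([1, 0, 0], dtype=complex)
print("population leaked into |2>:", abs(psi[2]) ** 2)
```

Even this toy model shows why pulse design has to be hardware-aware: the leaked population depends on the drive strength relative to the anharmonicity, which differs per qubit and per device.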
Sebastian Hassinger (09:08.674)
Interesting. I mean, it's funny: every time I've had this conversation with John Martinis, he will get around to essentially saying what he thinks his strength is, which is thinking in systems engineering terms, not just in experimental physics terms: understanding the broader dynamics of the system and how those dynamics can be marshalled and managed in a way to make a more
predictable, more effective entire system. Is that kind of how you see it? It sounds like you're saying that understanding those dynamics through the simulation of the digital twin can lead to better operations or better design of the system. Is that right?
Izhar (09:56.654)
It absolutely is, right? And I'll take it one step further. I think when you control a classical device, a zero is a zero and a one is a one; the physics is very well defined. You have a gate, you provide some bias, you're flipping the switch, you're opening or closing the tap. When you're looking at superconducting qubits versus neutral atoms versus ions versus
any other modality, each configuration, each modality, each QPU, each QPU generation might have a different set of pulses and different strategies just to realize a gate. Once those gates are alive and they start interacting with other gates, idling, or going through whatever processes they go through,
they develop different noise processes and different errors that are all related to where they originated from and how they were realized. So, in other words, in order for us to be able to create the control stack for the quantum computer, we need to be able to understand where we're starting from and where we're going, meaning we have to be hardware-aware.
And the way to be hardware-aware is to be able to look at the system from first principles, develop those digital twins, and scale them to a point where you can address the main bottlenecks the industry is facing. If you can save time and money by doing hardware experimentation virtually, you'll get
quicker to your end goal. I don't want to be too cynical, but sometimes when I look at the quantum industry and compare it to the aviation industry, there are instances where you think, well, it's almost like building the next fighter jet or commercial airplane with wooden models in the wind tunnel. You kind of shave it a bit, you put it in the wind tunnel, you see if it shakes and wriggles, and you iterate again and again and again.
Sebastian Hassinger (11:51.768)
Yeah.
Sebastian Hassinger (12:15.599)
Right.
Sebastian Hassinger (12:19.898)
It's like an artisan model. Yeah. Thankfully we're not making quantum airplanes yet.
Izhar (12:20.16)
We need to move away from that. Not yet, yeah. One day. So, you know, going back to where we are: I think those tools are critical, and using those tools we can now be, as I said, hardware-aware and accelerate the development of every layer in the stack, right? From optimized pulses
for advanced gates and new gate strategies, to reducing noise at the circuit level through error suppression and error mitigation that you can try and test virtually and then implement on hardware, all the way to testing and developing new quantum error correction strategies and decoders.
Sebastian Hassinger (13:12.932)
Yeah, okay. So I want to pick that apart, because we've had this conversation before, and, full disclosure, before I left AWS I helped drive the collaboration with Quantum Elements that resulted in a recent blog post, which we'll get into in a moment. But I want to probe the applications of your technology from that
R&D phase through to the operational phase. So, starting with design and research and development: you mentioned error correction. What we did together when I was with AWS, the collaboration with Quantum Elements, was creating a simulation of physical qubits that mimicked the implementation of the surface code that Google had done with their physical chip. So talk me through how that was done,
and what the value of that kind of simulation might be for a hardware vendor or other commercial entity.
Izhar (14:22.39)
So we had to overcome a few scientific barriers in order to get to this result. But let me start with the value proposition: why it matters, why people should care. When you look at developing, specifically, surface codes, and then developing decoders based on that,
the go-to platform is Stim. The advantage of Stim is that it's extremely quick; you can run relatively large systems using Stim. The thing is that it uses only Clifford gates, so it's kind of unaware of the noise that is the underlying governing principle of the things that limit your...
Sebastian Hassinger (15:10.618)
Well, and it is sort of the equivalent of a classical circuit simulation; it makes certain simplifying assumptions. It has a noise model, but the noise model is also very abstract, right? It's sort of a toy model, in a sense, of what the real operating conditions would be. Yeah.
Izhar (15:25.004)
Exactly.
Izhar (15:30.942)
Exactly. So the goal we set in front of us, together with AWS and, you know, your amazing partnership, was: hey, can we take our digital twins and enlarge them to the point where they can address this kind of milestone of a distance-seven surface code? Distance seven meaning 97 noisy qubits.
That's kind of the threshold where we can start looking at the conversion from physical to logical qubits using surface codes. Now, it's very non-trivial, because if you look at the way you traditionally solve those master equations, the ones that allow you to take into account an open quantum system, a quantum system that interacts with the environment,
they're very expensive computationally. So if you look at brute force, you're usually limited to about 16 qubits, because the computation just explodes. And, you know, it makes sense; if we could brute-force it...
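The brute-force wall he mentions is easy to quantify: an n-qubit density matrix has 4^n complex entries, so just storing it (at 16 bytes per entry, before any solver overhead) blows past commodity memory right around 16 qubits.

```python
# Memory needed just to *store* an n-qubit density matrix:
# 4**n complex entries at 16 bytes each (double-precision complex).
for n in (10, 16, 20):
    gib = (4 ** n) * 16 / 2 ** 30
    print(f"{n:2d} qubits: {gib:12,.3f} GiB")
```

At 16 qubits this is already 64 GiB; at 20 qubits it's 16 TiB, and that's before the solver does any work on the matrix.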
Sebastian Hassinger (16:40.354)
Yeah. I mean, it's sort of a proof that quantum computing, if it works, will be more powerful than classical computing: that it's impossible to simulate classically.
Izhar (16:46.876)
Exactly, exactly. Absolutely. Otherwise, what are we doing here, if we can build a classical representation of any quantum device? So you have to come up with a new method of breaking those barriers. There are other technologies, like tensor networks, but they have problems with the connectivity or the entanglement size of the system.
Sebastian Hassinger (16:54.135)
Exactly.
Izhar (17:16.556)
So we had to come up with a new way of enlarging those digital twins to 97 qubits and beyond. And we used a new method that we developed on top of quantum Monte Carlo, using stochastic compression, meaning we are looking at the whole, let's call it, universe of solutions, but really cherry-picking the areas where
there is sparsity in the system. That allows us to maintain the noise models, this high-fidelity understanding of what's going on in the system (crosstalk, leakage, and others), and take it to sizes big enough to address this distance seven and beyond. And in the blog post, what we've shown is that when we are running those kinds of surface codes,
we are able to almost predict which qubits are more prone to developing errors, because we understand how their crosstalk affects the behavior of the system, along with the other noise parameters that we can really insert into the system. So it's easy to see how, once you understand the noise
and the behavior of the system at such a large scale, your decoder should naturally be able to more faithfully predict how to address errors as you're running codes.
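The flavor of the stochastic approach, though certainly not Quantum Elements' actual algorithm, can be shown with the classic quantum-trajectory trick: instead of evolving the full density matrix, sample many random "quantum jump" trajectories of a pure state and average them. For a single qubit with T1 decay, the ensemble average recovers the expected exponential:

```python
import numpy as np

# Monte Carlo (quantum-jump) unraveling of T1 decay for one qubit.
# Each trajectory starts in |1> and jumps to |0> with probability dt/T1
# per step; averaging trajectories approximates the master-equation
# prediction exp(-t/T1).
rng = np.random.default_rng(0)
T1, dt, steps, n_traj = 20.0, 0.05, 200, 4000  # microseconds

excited = np.ones(n_traj, dtype=bool)  # all trajectories start in |1>
for _ in range(steps):
    jumps = rng.random(n_traj) < dt / T1  # random jump events this step
    excited &= ~jumps                     # a jump collapses |1> -> |0>

p_sim = excited.mean()
p_exact = np.exp(-steps * dt / T1)
print(f"trajectories: {p_sim:.3f}  exact: {p_exact:.3f}")
```

The point of the trick is that each trajectory is a state vector (2^n amplitudes) rather than a density matrix (4^n entries); the stochastic-compression method Izhar describes pushes further by exploiting sparsity in which parts of the solution space actually matter.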
Sebastian Hassinger (18:54.522)
Mmm.
Sebastian Hassinger (19:00.76)
Interesting. I mean, intuitively it makes sense. It was so difficult for Google to do what they did, standing up a hundred-plus physical qubits in an experiment and then operating them to get the surface code implemented. There are so many plates spinning to make that happen that if you can do that in a simulated environment, it feels like there's enormous opportunity to learn.
As you said: where are the sensitivities? Where are the things that really matter most, versus things that you can take more for granted? How can you lay out the topology of the chip, or design the approach to fabrication, et cetera? There seems like there's so much you can learn that would make the next iteration of Google's Willow experiment, or someone else's implementation of their own surface code,
easier to do. So that makes total sense. And you mentioned decoding. I mean, decoding is sort of an invisible unsolved problem in quantum computing. People talk about fault tolerance; they talk about surface codes or other error-correcting codes. But decoding is where the potential error is detected and corrected. That's actually what makes it fault-tolerant. And that's a
classical loop in the middle of the quantum operation, meaning it's very, very sensitive to performance, because if it's too slow, it doesn't matter how fast your quantum calculation is going. So do you see a role for Quantum Elements' technology, not just in the design, but actually, potentially, in the operation of something like a readout scheme and syndrome detection
Izhar (20:30.485)
Mm-hmm.
Sebastian Hassinger (20:54.904)
scheme, and feed-forward for that matter?
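The decoding loop Sebastian describes, reading parity "syndromes" and inferring the most likely correction in classical software, can be illustrated with the smallest possible example, a three-bit repetition code. Production surface-code decoders (for example, minimum-weight perfect matching) do the same job at vastly larger scale and under hard latency budgets:

```python
# Three-bit repetition code: two parity checks (syndromes) locate any
# single bit-flip, and a lookup table supplies the correction.
SYNDROME_TO_FLIP = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # flip on bit 0
    (1, 1): 1,     # flip on bit 1
    (0, 1): 2,     # flip on bit 2
}

def decode(bits):
    """Measure the syndromes, then apply the table's correction."""
    syndrome = (bits[0] ^ bits[1], bits[1] ^ bits[2])
    corrected = list(bits)
    flip = SYNDROME_TO_FLIP[syndrome]
    if flip is not None:
        corrected[flip] ^= 1
    return corrected

print(decode([0, 1, 0]))  # single error on the middle bit -> [0, 0, 0]
```

Feed-forward is what makes the latency pressure real: the corrected outcome has to be available before the next conditional gate fires, which is why decoder speed matters as much as decoder accuracy.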
Izhar (20:58.645)
So it's interesting that you mention readout, because readout is one of the things that is still not fully solved; we don't have robust models for readout. So that's something that is definitely of interest for us. But I think another thing we need to add, on top of this ability to generate the digital twin, is that it creates real democratization. It allows teams
in companies, or independent companies, to run a track parallel to the hardware development track. Meaning, if we go back to this analogy of the wooden models in the wind tunnel: if you're able to develop decoders, predicting, or hoping, that
the hardware will get to where it's supposed to get in time, you've almost accelerated the development, because you don't sit idle and wait. The other aspect of the ability to generate such a large digital twin, one that has all the intricacies and all the noise models that govern the behavior of a quantum computer, is the ability to generate large amounts of data and to use this data to train different
Sebastian Hassinger (22:01.529)
Yeah.
Sebastian Hassinger (22:21.53)
Mmm.
Izhar (22:24.221)
AI workloads, primitives, and models. And we might see (we're not there yet, but we might see) a future where AI decoders can work in real time in the quantum-classical loop. Because we're seeing a trend in the AI community toward smaller
models that can be deployed at the edge. Meaning, your edge compute resources are limited; they're not usually cloud-connected. It's a computer that sits at the edge, whether it's in a robot or whatever. In order to allow independence for those systems, you ideally would have a fully deployed AI system at the edge.
That means it has to be small enough and efficient enough to work independently. So if we see the continuous drive of making AI more effective, and the continuous drive of making those GPUs more efficient, can we use AI-based decoders that are ultimately deployed in quantum-classical
hybrid workloads?
Sebastian Hassinger (23:52.794)
Interesting. How would you classify... there's another experiment that you guys did. I think you announced it back in the fall, with the IBM Eagle chip, where you increased the logical fidelity from something like 45% to above 90%; I think it was 95 or 96%. Is that the kind of
Izhar (24:17.64)
Yeah.
Sebastian Hassinger (24:20.634)
machine-learning-enhanced operation that you're describing?
Izhar (24:27.994)
I think that's where it will ultimately develop towards. So what we've done is this: when you have a quantum circuit, obviously those qubits develop noise. One of the noise processes that develops there is coherent noise, meaning those qubits sit idle and they start to drift. You can think of a spinning top:
when the energy of the spinning top starts to go down, it starts to wobble. That's the analogy and the intuition I developed for those noise models. So you can address the physical-qubit noise evolution using error suppression on the physical qubits. But what we've shown is that you can now go to the logical level and apply the same
Sebastian Hassinger (24:59.066)
Mm.
Izhar (25:21.922)
methodology and strategy at the logical level, and address logical noise using those strategies, improving the overall circuit fidelity significantly. And that's kind of the next phase, right? The industry is talking about this transition from physical to logical qubits. We cannot assume that logical qubits will be noise-free. So how do you deal with the noise at the logical level? That's
the work we've shown with IBM that demonstrated this kind of additional capability. And if you think about how you adapt that to every modality, every QPU, maybe every calibration cycle: you can do that if you're able to train AI models that will adapt to those changes in real time and will not require running, each time,
a full digital twin simulation. The advantage of running a digital twin simulation is that you get the full density matrix, meaning you get the full picture of what's going on in your system. But in some cases it can be computationally expensive. So if you add AI to the mix, now you see the very strong synergy between digital twins and AI to accelerate those kinds of
Sebastian Hassinger (26:24.026)
Hmm.
Izhar (26:48.23)
workloads and really build this full software stack that addresses the failure modes of the system.
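The "wobbling spinning top" is a coherent error: a small detuning makes an idling qubit precess, accumulating unwanted phase. A textbook example of the suppression strategies being discussed, though not necessarily the specific technique used in the IBM work, is a spin echo, where an X pulse between two idle windows refocuses the phase exactly:

```python
import numpy as np

# Coherent idle error: an unwanted Z-rotation from a small detuning.
X = np.array([[0, 1], [1, 0]], dtype=complex)

def rz(theta):
    """Phase accumulated during one idle window."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # start in |+>
drift = rz(0.3)                                      # error per idle window

no_echo = drift @ drift @ plus     # two idle windows, uncorrected
echoed = drift @ X @ drift @ plus  # echo (X) pulse between the windows

def fidelity(psi):
    return abs(plus.conj() @ psi) ** 2  # overlap with the ideal |+> state

print(f"no echo: {fidelity(no_echo):.4f}  with echo: {fidelity(echoed):.4f}")
```

The echo works because the X pulse swaps the two phase-accumulating amplitudes, so the second idle window undoes what the first one did; the insight in the transcript is that the same style of suppression can be applied at the logical level, not just to physical qubits.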
Sebastian Hassinger (26:57.786)
Interesting. You mentioned every modality, but we've only been talking about superconducting qubits. Are you using the same technology on other modalities as well?
Izhar (27:09.022)
Yes. So we are now already showing neutral-atom digital twins. We have some very interesting results where, running codes using our capability on the digital twin, we can show significant improvement in performance,
and we're definitely looking to add ion traps in the near future as well.
Sebastian Hassinger (27:43.93)
That's really interesting. So we've talked about the collaborative work that you and I did when I was still at AWS, and you've got an ongoing relationship with AWS. We've talked about your work with IBM. Now you're working with neutral atoms. And most recently you have announced a partnership, or a continuation of work you started in the past, with Rigetti. Is there anything you can tell us about what
the plans are with Rigetti, and what that collaboration is meant to address?
Izhar (28:19.524)
With Rigetti, they've been amazing partners of ours. You know, we've been working with them for a couple of years now. First we showed that we can push the performance of their Novera system, the single- and two-qubit gate fidelities, using our capabilities. And now what we're doing is taking it deeper.
I cannot disclose all the details, but what I can say is that, using our deep understanding of how their system behaves, and through this ability to develop digital twins of their specific system, we're helping them pinpoint noise models and optimize performance through closed-loop operation,
and really be part of their roadmap to fault-tolerant quantum computers.
Sebastian Hassinger (29:20.654)
That's really cool. And you mentioned the ability to parallelize with a digital twin, being able to run multiple efforts at once. I imagine that's also really interesting from a pedagogical perspective, right? I mean, you can get a whole classroom running on twins of the same physical qubits, so to speak. Is there a way for people to
kick the tires on the platform that you're building and try out the technology?
Izhar (29:55.156)
They absolutely can. They can go to our website, quantumelements.ai, and get started; they'll get access to our platform. And what we are providing is something very unique: you can build your own virtual QPU. You can insert your own T1, T2, and detuning. You can experiment with different connectivity. And once you're happy with your QPU, you can run circuits and see
the bit-string output, experiment with different error suppression and error mitigation strategies, and then take it all the way to quantum error correction. Quantum computing runtime is expensive, and I think for good reason; these are expensive systems that the companies spent a lot of money and time to build. But there should be a synergy between where you experiment and where you execute your workloads:
try out a lot of things in the virtual environment, and then, when you gain the certainty and the understanding of where you're heading, go ahead and spend with confidence on the quantum hardware. I think it's a very nice way of getting faster to where you want to go, with higher certainty and ultimately with better results.
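As a rough picture of what "build your own virtual QPU" means, here is a hypothetical sketch of such a device spec; the names and structure are illustrative only, not Quantum Elements' actual Constellation API:

```python
from dataclasses import dataclass, field

@dataclass
class QubitSpec:
    """Per-qubit noise parameters a user might supply (hypothetical names)."""
    t1_us: float            # energy-relaxation time, microseconds
    t2_us: float            # dephasing time, microseconds (t2 <= 2 * t1)
    detuning_mhz: float = 0.0

@dataclass
class VirtualQPU:
    """A user-defined device: qubit specs plus a connectivity graph."""
    qubits: list
    couplings: list = field(default_factory=list)  # (i, j) coupled pairs

qpu = VirtualQPU(
    qubits=[QubitSpec(t1_us=80.0, t2_us=60.0),
            QubitSpec(t1_us=95.0, t2_us=70.0, detuning_mhz=0.1)],
    couplings=[(0, 1)],
)
print(len(qpu.qubits), qpu.couplings)
```

The workflow Izhar describes then runs circuits against a spec like this virtually, so that paid hardware time is spent only once a strategy already looks promising.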
Sebastian Hassinger (31:16.238)
Yeah, that was a leading question, because, full disclosure, I've already played around with your platform, Constellation, as you know, and I think it's actually remarkably well designed from a UI/UX perspective compared to a lot of other products in the market. So it's quite a pleasure to use. And to your point about lowering the barrier to entry, democratization: that distance-7 surface code experiment that you ran on AWS,
Izhar (31:29.993)
Thank you.
Sebastian Hassinger (31:45.828)
that was on a large HPC instance, but I think the total cost was actually pretty reasonable if somebody ran that themselves, right?
Izhar (31:55.069)
Yeah, it's interesting. I'm sure you remember, when we started this journey together, we thought it was going to be horrifically expensive. We were, I think, talking about thousands of virtual CPUs and terabytes of memory. And you know what? We were able to optimize the code to ultimately run distance seven on only 96 virtual CPUs and a few hundred gigabytes of memory. So yeah, it's extremely
cost-effective, and it doesn't take too long to run such a large system. So we're very happy, and we're not stopping there. We're going to keep pushing scale and speed, to make it more accessible and to keep pushing on this democratization aspect that you just mentioned.
Sebastian Hassinger (32:43.834)
Excellent. Well, thank you very much, Izhar. This has been really interesting. I'll put all the links in the show notes; I really encourage people to read the blog posts, read your papers, and get on Constellation and try it out for themselves. I think it's really exciting stuff that you guys are doing.
Izhar (33:01.747)
Thank you Sebastian, it was a pleasure as always.
Sebastian Hassinger (33:04.346)
Thanks.