How I Tested That

Markus Muller, co-founder of Flinn, shares his journey of testing ideas and building a business in the healthcare industry. 

He emphasizes the importance of testing assumptions and using experiments to validate ideas. Markus discusses the use of LOIs (Letters of Intent) as a tool to gain commitment from potential customers. 

He also highlights the value of co-creation with customers and the need to balance risk in heavily regulated industries. Overall, Markus provides insights into the testing mindset and its application in building a successful business.

Is your innovation pipeline clogged?
  •  Uncover the risks, bottlenecks, and gaps holding your best ideas back.
  •  With the EMT Diagnostic, you'll get a clear, actionable plan to fix them.
👉 Book a free discovery call at https://www.precoil.com/innovation-diagnostic

What is How I Tested That?

Testing your ideas against reality can be challenging. Not everything will go as planned. It’s about keeping an open mind, having a clear hypothesis and running multiple tests to see if you have enough directional evidence to keep going.

This is the How I Tested That Podcast, where David J Bland connects with entrepreneurs and innovators who had the courage to test their ideas with real people, in the market, with sometimes surprising results.

Join us as we explore the ups and downs of experimentation… together.

David J Bland (0:04.490)
Hi, welcome to another episode of How I Tested That. Today's guest is Markus Muller. Markus is the co-founder of Flinn, and he has a mission to enhance access to high quality health products. Markus wants to create significant, sustainable value for society through technological and business innovation. Thanks for joining us, Markus.

Markus (0:23.827)
Hi, thank you for having me.

David J Bland (0:26.154)
So we first connected online because I was on LinkedIn and I saw this amazing story come through my feed of you testing ideas with Flinn using the Testing Business Ideas book. And I thought, I have to get this guy on the podcast because I want to hear more. It was just a few paragraphs and I was really wanting to hear more. So I just wanted to thank you so much for joining and being able to talk about your testing journey and how you're living this day in and day out.

Markus (0:58.163)
Yes, and you cannot imagine how excited I was when you actually connected and reached out, because Testing Business Ideas has been one of my most recommended books. In the courses I've given and with the many product managers I've worked with, I've recommended it so many times, and I was excited to talk to one of the authors. So, yeah, the pleasure is on my side.

David J Bland (1:21.994)
No, I appreciate it. You never know if a book's going to be received well until it's out there. And as much as we tried to test it, I was just pleasantly surprised that people are using it and are so happy with it. But coming back to you, why don't you give us maybe a little background on your journey and how you came into this kind of testing mindset with Flinn. So maybe just give our listeners a little background from you.

Markus (1:48.659)
Yes, I started my journey in school, where I was a bit of a web designer and web engineer. I had my first business while studying business administration in Vienna. We actually opened up a restaurant with a tablet-based ordering solution. That was 2011, when tablets were quite a new thing. At that time, I had no idea about testing ideas. We just said, well, it sounds cool, let's do this. I designed the interface and asked my friends,
Do you like it? Yes, I like it. Okay, let's build it. So very, very little testing. At university, I learned a little bit about this, but didn't really understand what they actually meant by testing assumptions. I then joined a company called N26, a bank, in its early days when there were like 15 people in a small flat in Berlin. I was the first product manager and also had very little idea about systematically testing assumptions. We were just listening to customers and
more or less building what they were asking us for. And at some point, I remember we built a product at N26 that really didn't work out at all. And then I stumbled upon some interesting Medium articles, I think it was 2014 or 2015, about experimentation-driven product management and hypothesis-driven product development. I thought, that's interesting. I read some articles about Booking.com and Skyscanner applying such things.
And so I got interested in this type of product management and thinking and asked, okay, how could we have avoided the mistake of launching something, putting so much effort into it, only to figure out it doesn't work? I found my way into this field and read more and more about it. At some point I stumbled upon your book, learning about all the different methods out there. And then I joined a couple of other companies as a head of product before starting my own business two years ago,
which, as you mentioned, is building software for medical device manufacturers. We're now a 15-person team, a remote-first company, software as a service in B2B. And yeah, here I am.

David J Bland (3:59.946)
So I was just unpacking that journey. It sounds as if you firsthand experienced the pain of building something that either nobody wanted or that didn't have the impact you were hoping it would. I personally experienced that at a couple of startups I joined early in my career. And it's like, how do you learn from that experience? You either say, oh, next time we're just going to polish it even more and make it even better. Or, you know, maybe I should test some of these assumptions before jumping in next time. It feels as if that really influenced your mindset on how you view testing.

Markus (4:41.169)
100%. And today I would say we're not doing only product development in an assumption-driven way. That's why initially I was a bit irritated by Testing Business Ideas, because it reads like a product toolbox, but over time I understood it's really the whole way you think about building businesses. Everything is an assumption, and each step is just the next level, a higher-fidelity test that you run. And then
you have your first product up and running and working, and you repeat the whole cycle again. A country you expand to is an experiment. Even a new business unit you build up is an experiment. And to be honest, I actually run a lot of experiments with the organization. So I experiment with organizational design and say, okay, when we reorg our team, let's make this an experiment. We will see in a few months whether it works or not. Here are the assumptions. This is how it could fail.
So I consider almost everything we do an experiment, because there's so much uncertainty, and we want to make our assumptions explicit. So I think it started with understanding, okay, this can help me in product development. And now I feel that the whole entrepreneurial journey is many, many, many experiments. And actually, I read this somewhere once: the quality of your company is like the quality of all the decisions that you take, added together. And
experiments just help you take better, more informed decisions. Sometimes it's better to just decide and go, but sometimes it's good to run some experiments upfront. So for me it's: experiment, reflect, next experiment.
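
The "here are the assumptions, this is how it could fail" framing maps closely onto the test card format popularized by Testing Business Ideas. As a minimal illustrative sketch (not Flinn's actual tooling; the field names, numbers, and the reorg example are invented), the same structure works whether the experiment is a product feature, a new market, or a team reorg:

```python
# Minimal sketch of capturing any experiment (product, org design, new market)
# as an explicit hypothesis with a falsifiable success criterion.
# Illustrative only; the fields loosely follow the test card idea from
# Testing Business Ideas, and all values below are hypothetical.

from dataclasses import dataclass
from datetime import date


@dataclass
class Experiment:
    hypothesis: str         # "We believe that..."
    test: str               # "To verify that, we will..."
    metric: str             # "And measure..."
    success_criterion: str  # "We are right if..."
    review_on: date         # when to decide: persevere, pivot, or stop


reorg_experiment = Experiment(
    hypothesis="Splitting into two cross-functional teams speeds up delivery",
    test="Run the new team setup for one quarter",
    metric="Cycle time from 'ready' to 'released' per feature",
    success_criterion="Median cycle time drops by at least 20%",
    review_on=date(2025, 1, 15),
)

print(reorg_experiment)
```

Writing the failure condition down before the reorg starts is what turns "let's try it" into an experiment you can actually evaluate a few months later.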

David J Bland (6:23.626)
Yeah, it's almost like applying the scientific method, to an extent, anywhere you feel uncertain, to try to navigate your way through that. I think that's really a smart approach. I'm curious about the early days of starting this new company. How did that come to be? What kind of big assumptions did you have? What kind of tests did you run when you were starting Flinn?

Markus (6:47.219)
Maybe to tell you how we initially got the company started: I have a co-founder, and we just said, the two of us want to start a business. We had a business together in the past, our restaurant business. We come with different backgrounds, he's more business, I'm more product. And we said, we want to start a business and it should be something that we enjoy and that is impactful, so in healthcare or sustainability. And we went out there looking for problems: where is a problem that is interesting to solve
and that can also be an interesting business? And somehow we stumbled upon the meta problem that bringing medical devices to the market is very complex. My co-founder had spent the seven, eight years before building Flinn in the medical device industry, building products. And he said, I never want to do this again, it's so complex. And we started interviewing, and we actually ran 40 interviews in one and a half months, from startups
all the way up to corporate organizations, just learning about the different problems, needs, and challenges that customers have. So the experiment was basically customer interviews to learn about potential problems. And once we had a bit of an idea of the different clusters of potential personas, different problems, and first patterns, we picked a handful of problems and went to the next level, deeper into deep-dive problem interviews.
Initially, we thought the biggest problem was getting a new medical device certified. But then we learned that, due to some major regulatory changes, a big problem is actually keeping your portfolio compliant, managing the continuous compliance of a portfolio of 20, 50, 100 products. And this is a problem that is even more present for those companies. So we picked this one, we went deeper, and we built the first click prototype, really a
simple Figma prototype for a specific use case, and did some first solution testing on it. We presented the solution after the initial interview and were interested to get some first reactions. And when we saw that there was some base interest in the solution and thought, okay, this could actually be a customer for the solution, we asked them: would you be willing to sign an LOI, a letter of intent, saying that

Markus (9:11.187)
once we actually build this product, you will be willing to commit more time and potentially even money to it. So out of a total of more than 50 customer interviews, we presented the prototype, I think, to about 20 of these customers, and we got 10 of those 20 to sign LOIs, letters of intent, over a journey of three, four months, with just two people. And
these 10 LOIs, a pitch deck, and a click prototype then got us our 1.8 million pre-seed funding that helped us build the first team and so on. So we spent seven, eight months before we'd written the first line of code. We started in August, and the first line of code was written in March, just to give you an idea of how much upfront research we did. And we then continued with usability tests, concierge services, and so on. So we ran a lot of experiments,
one after another, to build our business on top of that.

David J Bland (10:14.474)
So you started very wide in your discovery, just exploratory, trying to find out what pains these folks are dealing with, then narrowing in on one, going deep on that, and then having something clickable you could put in front of them that could potentially solve for it. I'm curious, with that process, were you coming back to some of the people you spoke to previously? Let's say, for example, they had some interest and wanted to be contacted again, and so when you come back with a Figma prototype, they're really eager to see what you've created to solve their issue. Is that how that flow sort of went?

Markus (10:55.955)
Yes, exactly. So in the first interview, we were first trying to learn and understand, and what we wanted to see was how excited they would be to talk further about this topic with us and spend more time with us exploring potential solutions together. So for me, a first indicator of validation is this: if it's very hard to get people
talking with you about the topic, you either suck at sales and acquiring people to talk to, or you're just not approaching a field or theme that is important enough for them. So we had two Gmail addresses, and we were reaching out to top managers of companies saying, we want to talk about automation and regulatory and quality management in MedTech with you, very broad, and we want to see how we can help you. And they responded and said, yeah, we are open to talking with you.
One of the first questions in the interview was quite broad: what do you spend most time on to keep your portfolio of products compliant? And they would speak for 10 minutes. And so we saw patterns, and then we usually picked one and said, would you be open to talking more about this topic with us in a follow-up call? That is the first validation, if they say, yeah, I'm happy to spend more time. And in the follow-up call, we brought some first sketches and ideas and showed them. So with some clients
that are now paying customers, we've built a relationship over one and a half years, with at least 10 interviews. In the beginning, extremely broad; at the end, it was usability: do you understand this button? Do you understand what you do in this step of the process? So yes, we identified some clients where we invested heavy time and who went really deep with us on this, which is a strong validation that there's a need, because otherwise they
would not spend the time. They are not folks we knew from the past. And with some clients or prospects, we talked once and then identified that either they don't have the pain points or they're not clicking with the topic. But there were also some that said, please come back to me when you have a solution, it's too early. And then we came back one and a half years later and they said, really cool, let's do a trial period. So,

Markus (13:22.803)
Not everyone is so open and interested in joining a journey this early on, because we always told them: if you join now, you can influence it from the beginning. Some of them we made our advisors. So it's really, yeah, it's kind of early sales, almost.

David J Bland (13:40.074)
Yeah, it feels as if it was really almost like co-creation. You have this shared experience where you're giving them space to talk through their problems, and then you're able to co-create with them over time, and they feel maybe even invested in what you're building, because obviously some of them became advisors later on. You almost have your own advisory council now that came from those interviews. I think that's a really smart way to approach it versus
pitching them a solution right away, which tends to be the culture in a lot of these companies, especially B2B: well, if you're talking to me, I expect you to be selling me something that's really polished, and you're just going to tell me how much it is. And over the last, I'd say, five to ten years, that culture is slowly changing to: hey, I'm not going to pitch you right away, can you just talk about some of the problems you're experiencing? Because we're exploring the space and potentially we can solve for this.
I think that's a big shift that's happening in this industry. I mean, what are your thoughts?

Markus (14:41.779)
Totally agree. But at the end of the day, there are people on all sides of the spectrum. You have some folks, as I mentioned, who said, come back with a solution. Some said, we're going to build this in-house, and then, interestingly, one and a half years later we talked to them and they said, we have the requirements done now, but actually we'd like to test your solution. And some are super excited. One even said, can I join your company? For 15 years I've tried to solve this problem internally, but I can't.
And of course you want to look for a good number of people who are
interested in talking about the problem, who maybe have tried to solve it somehow themselves with workarounds, because that suggests it's really a significant pain. And have a bit of a mix, particularly in the beginning: smaller companies, bigger companies, very big companies, different personas like team leads, experts, C-levels, and see where you should focus in the beginning. Because I think this is a second big mistake many people make: they
either go only broad and always stay broad and then never solve a problem really well for anyone, because they try to do everything at the same time, or they start only super narrow and then optimize in a local maximum. So start broad, but then don't be afraid to narrow it down and say, okay, we start with a very niche problem, a very small thing, and then step by step build from there.

David J Bland (16:12.106)
Yeah, I think that makes sense. So I'm curious what you've done since. I mean, it's a big journey, several years of going from interviews to, as you mentioned, LOIs. Maybe let's dive into that a little bit. For our listeners, an LOI, a letter of intent, is usually a one-page, non-legally-binding contract, but it's slightly better evidence because people are putting in writing what they might otherwise just be saying verbally, and not everyone puts things in writing. So maybe explain
just the LOI process and how you came to the realization that this, as a call to action, would help you get better evidence.

Markus (16:49.299)
In the beginning, we were thinking, should we even put it in a document, or should we just send them an email and ask, can you send this email back to us, confirming that you're interested in this? And then we asked ourselves, what is a good enough level of commitment? Because sometimes you can make LOIs that say, if this product is live with these requirements, this is how much I will pay. We had heard that they are generally willing to pay for software
if it solves their problems, so the risk was lower on the assumption of whether they would be able or willing to pay. It was more: will we be able to build a good enough solution that they're willing to pay for? So we said the more important aspect for us is that we get enough time from a handful of clients to optimize the solution to a level that is good enough. So we said, okay, let's make this part of the commitment in the LOI:
if we build this software according to these requirements, are you willing to test it over eight weeks and spend half an hour every week with us to give us feedback? This is what they had to sign. Initially we sent the first two or three as emails: can you send us this email back, here's a template, so to say, and feel free to add to it if you want. And then the second client put it into a Word document, signed it, and put their logo on it. And I was like,
wow, okay, let's take this further. So we sent them a Word document with one or two pages of high-level requirements. It's very important not to be too narrow on the features, because you can lock yourself into something that will be a pain afterwards; describe more or less the themes and the key problems you saw for them, maybe one or two flagship features that you might include. Then
add a logo, and then add a commitment that can be financial, or time if time is the more critical part of the game, and let them sign it. For some clients, we had to follow up two or three times; some sent it over immediately. For one client, it went all the way to top management, because we had in the LOI that we were allowed to use their logo. And as soon as there's a logo, the legal department was involved, top management of a billion-dollar company was involved. And

Markus (19:12.477)
That was actually fun to see such a letter of intent really creating commitment in that company, even though, as I said, it's not legally binding. So I think it's a great tool for something in between, because at this point we didn't even have a product to sell, nothing, not a single line of code. Of course you might sell it as a bit more than you actually have, but you still need to be mindful.
I think it's a wonderful technique to do some early sales. And that's why I always say: first sell, then build. Because building is the most expensive way to test. I don't know whether I have that quote from your book or from another one.

David J Bland (19:56.554)
Yeah, probably in a few books at this point. But you deferred building for quite some time, and it feels as if the risk on your side was less on desirability and even viability, because they had painful problems, they were willing to pay, and you seemed to have evidence that they had spent money trying to solve for this. It was almost shifting really quickly back to feasibility, which is: you need to give yourselves enough time
to build something that can solve their problems, because it was almost a take-my-money-now type approach from them, where it's, yeah, we'll sign the LOI, when can we have this? And so being able to craft that in a way where you have time to respond, it sounds as if that was maybe the riskiest part at this point in the journey. What are your thoughts on that?

Markus (20:46.771)
I would say it's a mix of feasibility and desirability. Definitely not feasibility in the sense of, is it technically feasible? We currently have a lot of projects going on in the AI space where it's clearly a tech feasibility question: can we actually do this AI thing that we promised with good enough accuracy? So in the AI space there's a lot of technical feasibility risk. But for our initial proposition there was not;
it was clear that technically it could be done. It was more a question of whether it's a three-, four-, five-, six-month project to get the first version done. But it is of course feasibility in the sense of: is it feasible to build a good enough, simple enough product for a first version? So it's more user experience feasibility, so to say, figuring out all the little details to make it work and make it superior to competing products.
And on the other side, I would say it's desirability at a more granular level: not "do I desire a solution that generally solves this problem" but "is my specific solution more desirable than the solution of a competitor?" So that's why I would say it's a mix of desirability and feasibility at a lower level, not at the meta level. The value proposition was a tick, the problem was a tick; the solution was the stuff to be figured out.

David J Bland (22:14.058)
Yeah, being able to break it down into smaller chunks and tests. It sounds like that's where your journey has led you today, and that's what I was taken aback by when I saw that come through my LinkedIn feed: in that specific post, you were talking about, well, here's how we're testing features now. So maybe you can give our listeners some detail about how you're
taking that broad-level testing over time, getting to a solution that's a good fit, and then how you're still applying that testing mindset even to the details of the solution.

Markus (22:54.675)
So basically for every theme, topic, idea, whatever you want to call it, on the roadmap, we try to think about what could be an experiment to test the most critical aspects. Sometimes it's feasibility, and we say, okay, we need to win pilot customers willing to share some test data with us and run a proof of concept to see whether it's even technically feasible. Sometimes it's
desirability: will they need it, will they love it? For the feature I wrote that LinkedIn post about, it was more like, will they like it, and what would they like most about it? For context: what our first product does is basically monitor databases, very specific databases, for clients. Today they do this completely manually, and with our product it's somewhat automated; we run it continuously. And then we said, okay.
We had already tested this feature early on in a click prototype: you can tick a box and say, send me notifications whenever there's something important, an important finding. And we heard already in that early testing that users said, that's cool, this means I don't need to log in every day and look at it; I can just trust it and maybe double-check once a month or so, so this will save me a lot of time.
So we heard this very early on, but we kept pushing it back a bit; that was the overall desirability check, so to say. Then, once we had the foundation built, we said, okay, now it's time to build our notification thing. But we had so many ideas about what we could do around this feature: how to configure how frequently it's sent, daily, weekly, monthly; who should get the report, one person or everyone;
what should be in the report, illustrations, statistics, just links, what do they want to see? So there were so many questions around this, and it was not so easy. We did some interviews, but we said, okay, we want to combine this with a real-life test. We'll make it a Wizard of Oz test and pretend to have this feature already, because it's so easy to serve it manually in the beginning. So we created in our tool a little

Markus (25:18.452)
tick box that says, yes, send me notifications. And then basically what happened is that at the end of the month, our product manager checked: okay, who ticked the box? We had a small enough group of customers that we offered this to in the beginning that we could still handle it manually. The product manager went through these customer accounts, asked what interesting findings he could pull out, handcrafted a manual email for them, an interesting one, and sent it. And then we interviewed them afterwards,
with real data, and asked, what did you like most? And we had a little survey in there: tell us whether it was helpful or not and what you liked. We did this over three months and learned, okay, some customers explained to us, I would love to have this information in addition, more of this, that's not interesting. And we didn't have to write a single line of code; our engineers could focus on building stuff that was already validated while our product manager could really test this.
And we were also already able to sell the feature as a premium add-on, so to say. After three months, we said, okay, now we understand what the first version is. And we started really automating it, so that the poor product manager doesn't have to keep doing this manually, because it was quite time-consuming, and so we could also scale it across our whole customer base. So yeah, that's this concrete example. And we apply this thinking
really across every initiative that is on our roadmap.
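
To make the mechanics concrete, here is a minimal sketch of that kind of Wizard of Oz setup, assuming a simple opt-in flag and a product manager in the loop. It is illustrative only, not Flinn's actual code; all names and data are invented.

```python
# Illustrative Wizard of Oz sketch: the "send me notifications" tick box only
# stores a flag, and nothing is automated behind it. At the end of each month,
# a product manager runs this script to see who opted in, reviews those
# accounts by hand, and sends a handcrafted email plus a short feedback ask.

from dataclasses import dataclass


@dataclass
class CustomerAccount:
    name: str
    email: str
    wants_monthly_digest: bool = False  # the Wizard of Oz tick box


def opted_in(accounts: list[CustomerAccount]) -> list[CustomerAccount]:
    """Return the customers the product manager needs to serve manually."""
    return [a for a in accounts if a.wants_monthly_digest]


def draft_digest(account: CustomerAccount, findings: list[str]) -> str:
    """Produce a starting point the product manager edits by hand before sending."""
    lines = [f"Hi {account.name},", "", "Here is what we found this month:"]
    lines += [f"- {finding}" for finding in findings]
    lines += ["", "Was this useful? Reply with what you'd like to see next time."]
    return "\n".join(lines)


if __name__ == "__main__":
    accounts = [
        CustomerAccount("Acme Med", "qa@acme.example", wants_monthly_digest=True),
        CustomerAccount("Beta Devices", "ra@beta.example"),
    ]
    for account in opted_in(accounts):
        print(draft_digest(account, ["2 new field safety notices in your device category"]))
```

Once the interviews and survey answers converge on what the report should contain, the manual step gets replaced by real automation, as Markus describes above.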

David J Bland (26:50.570)
I love that. So you're taking something that doesn't scale and trying to find out whether there's any value at all in what you think you could build, getting real feedback from your customers on that. And then at some point, it sounds like over the course of three months, you get to where you say, hey, I think we have enough information here to build this. It may not look exactly how we imagined it, but we've used customer feedback
to shape it in a way that we think is a better fit. And then it scales, because it's something you can build in software and have automated. And so it works as long as you can balance the manual aspects of that, right?

Markus (28:43.998)
That's exactly it, because if we had just started based on our assumptions, we might have completely over-engineered it, built features that users would not have used, and wasted a lot of time. Because we are in a very regulated industry, everything we build needs to undergo very thorough quality processes, so everything we build is even more expensive than at other companies. So we need to be very certain that what we build works. But still,
despite doing all this, it doesn't always work. We recently took out one feature that we had built three months before, because we just learned we were wrong; despite having done user tests, we had misunderstood some aspects of what we were building. But your hit rate gets much better. Usually, I would say, more like 19 out of 20 features we release at least work and
make sense to be developed further. Whereas if we didn't apply all of this, maybe 50% would be off; it's more like 50% of gut feel is good and the other 50% is off. So for us, it's really a numbers game. We will still fail sometimes, because if we over-engineer the testing and go too crazy on it, we never progress. So we have to take risks somewhere and be smarter about prioritizing and balancing.
Yeah, that's also what I like about the risk matrix, for example, in your books: to say, okay, what are the riskiest assumptions? I test those, and then I need to take the next step. And that might be the experiment of building a version one.

David J Bland (30:22.314)
So you're doing this in a pretty heavily regulated industry. What kind of advice would you give to folks that are also in heavily regulated industries and would like to test things but feel very restricted, feel as if maybe they aren't allowed to test things? What kind of advice would you give other founders or other innovators that are looking to test in heavily regulated industries? What would be some words of wisdom?

Markus (30:49.438)
I think there are two things. First, we are regulated, and second, we are in an industry where we have a small quantity of users or customers. It's really about few customers, but big ones, where we build long-term, big contracts. So many people tell me, yeah, you cannot experiment if you don't have huge quantities of users. And that's not true. It just means we need to focus a bit more on the qualitative spectrum of experiments:
concierge services, Wizard of Oz tests, interviews, LOIs, and so on. But as the toggle thing showed, we can also mix in some quantitative elements. Even though I would say that was not really quantitative; we didn't measure how many users did it, because it was, again, 10. It was more a qualitative method that felt a bit quantitative, to get people talking in a different way. Because if we had only interviewed them on this,
they often cannot imagine certain things if they don't see it and experience it themselves. So we are special, and that's why I would say if you're similar to us, you might focus more on the qualitative spectrum of experiments; maybe large-scale A/B tests are not the thing for you. And you also need to be smart about it: certain things just cannot be tested in a
live environment, because it might be too risky for reputation or for the core job of your user. So certain things simply cannot be done. But I would say many things can be done at least in a concierge way. For example, we have also done a concierge on one thing that we're now trying to automate with AI. We basically acted like consultants and manually
pre-evaluated search results for clients, which we are now step by step replacing with AI. We did it manually to understand: how do you even evaluate such results? No one on our team had ever done this before. So we said, okay, let's get some experts shadowing what we do. It was the first time for us doing it, so we used our best knowledge and then had an expert double-checking whether what we did was actually good. And over time we became experts ourselves. So...

Markus (33:13.904)
When you do such concierge services, for example, make sure you have someone quality-proofing the things you do, the emails you send out and so on. Make sure you don't get your customers into trouble by promising something and then not delivering on it, so that they have issues. So it's really a risk management approach: what's the risk of doing this? What's the risk for yourself of losing a reputable customer? But also, what could in the worst case happen to your client?
For example, with this email experiment I mentioned, there's very little risk, because we never promised our client, we will keep you informed about everything, and if you're with us, you cannot miss any death case anymore. We made it very clear to them that this is an add-on feature, but the responsibility to monitor the database stays with them.
It's an add-on feature that provides them an additional service. And now, as we have more confidence, we also ramp up our promise and say, okay, you can actually fully rely on this service. So be mindful in balancing what you sell and what you promise, so you don't cause harm to users; I think that can be more damaging than anything else. And yeah, come at it with a risk approach. And maybe one last tip: we have
to have a heavy validation and testing process for releases, but not every release we do undergoes the same validation. We do official releases that are fully checked and tested, where we say, okay, this release is bulletproof. And then: here's a beta version, if you want to test the new features you can have a beta account. It's something quite common in the consumer world, early-access beta things.
Some clients just say, I'm fine with this. Smaller companies say, if there's a little bug somewhere, it's okay, as long as it's a beta version that I'm just testing; in my live system, the production version, everything needs to be perfect. And some clients just say, give me the features once they are done, complete, and working perfectly. So also figure out which customers are willing to be on your experimental version, and test with them.

Markus (35:37.214)
And we sometimes also highlight features in our software as beta or alpha features and say, this is still in testing. With this, we also avoid creating false expectations around features that might not yet be 100% consistent in the way they work. So in this case, create transparency, even though a Wizard of Oz test is not necessarily the most transparent thing you could possibly do.
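
As a small illustration of that release split, here is a sketch of per-customer beta gating with maturity labels. It assumes a simple in-app feature registry and an opt-in flag per customer; the feature names are invented for the example, not Flinn's actual setup.

```python
# Illustrative sketch: stable features are shown to everyone, while beta and
# alpha features are shown only to customers who opted into the experimental
# track, and carry a visible maturity label to keep expectations honest.

from enum import Enum


class Maturity(Enum):
    STABLE = "stable"
    BETA = "beta"
    ALPHA = "alpha"


# Hypothetical feature registry; the names are made up for the example.
FEATURES = {
    "database_monitoring": Maturity.STABLE,
    "monthly_digest": Maturity.BETA,
    "ai_prescreening": Maturity.ALPHA,
}


def visible_features(accepts_experimental: bool) -> dict[str, str]:
    """Return the features a customer sees, labelling anything non-stable."""
    visible = {}
    for name, maturity in FEATURES.items():
        if maturity is Maturity.STABLE:
            visible[name] = name
        elif accepts_experimental:
            visible[name] = f"{name} ({maturity.value})"  # e.g. "ai_prescreening (alpha)"
    return visible


print(visible_features(accepts_experimental=True))   # full list, with labels
print(visible_features(accepts_experimental=False))  # stable features only
```

The point of the label is the transparency Markus mentions: the customer always knows which parts of the product are still being tested.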

David J Bland (36:06.698)
Yeah, it seems as if it's about reducing risk, and even if you have a small quantity of customers, erring more towards co-creation and the quality of your experiments, trying to learn as much as you can in a very thoughtful way.
It is possible, and you're doing it; you're living that journey and building a company that way in a heavily regulated field. So I really appreciate you sharing your journey. As we wrap up, how can people find out more about you? Where can they go to find what you're working on? And if they have any questions about anything they've heard here, where would they find you online?

Markus (36:44.798)
I would say it's best if they reach out over LinkedIn. I have also written five Medium articles around product management topics such as roadmapping or how to free up time for discovery; feel free to check those out. If you want to learn a bit more about our journey, we also have our podcast, Founders Journey Unplugged, where we basically document our journey as founders. We speak about topics like the ones we spoke about today, maybe from a bit of a different angle, and we have a monthly
episode where we just talk about different things. So if you want to learn a little bit more about the founders' journey, feel free to check that out. Otherwise, it's just LinkedIn; that is, I think, the most straightforward. You can, of course, also send me an email to our business email; you'll find our company email on flinn.ai. Yeah, that would also work.

David J Bland (37:39.370)
Awesome. Thanks so much for spending time sharing this wide-ranging journey of going from the problem space to testing solutions in a heavily regulated industry. I think our listeners will really enjoy hearing this. So thanks so much for sharing your story with us today.

Markus (37:55.934)
Thank you David for the interesting questions.