The Healthy Enterprise

Chris Dayton, Co-Founder & CEO of Quality Assured AI, explores how AI is being responsibly integrated into the pharmaceutical industry, with a focus on quality management, regulatory compliance, and data security. He discusses the use of closed systems and air-gapped architectures to protect sensitive data, the challenges of deploying AI in regulated environments, and the importance of mitigating hallucinations in AI models. Chris also shares lessons from launching and scaling the platform, transitioning into the CEO role, and building AI literacy within organizations—highlighting how thoughtful AI adoption can drive innovation while reducing operational strain and burnout.

Chapters:
00:00 Introduction to Quality Assured AI
02:51 The Role of AI in Pharmaceuticals
06:02 Understanding AI Implementation Challenges
09:04 The Importance of Data Security
11:50 Navigating Regulatory Compliance
15:00 The Evolution of AI in Quality Management
17:59 The Human Element in AI Integration
20:56 Client Onboarding and Customization
23:49 Feedback and Market Reception
26:59 Future Growth and Scalability
29:50 Parting Thoughts and Advice


Guest Information:
  • Guest's Name: Chris Dayton
  • Guest's Title/Position:  Co-Founder & CEO
  • Guest's Linkedin: https://www.linkedin.com/in/chris-dayton-b414bb57/
  • Company / Affiliation: Quality Assured AI  https://www.qualityassured.ai/
  • Guest's Bio: Chris Dayton is the Co-Founder and CEO of Quality Assured AI, a company delivering closed-system, on-premise AI solutions purpose-built for pharmaceutical Quality Assurance and Quality Management. With over a decade of experience across protein, cell, and gene therapy manufacturing, he has served as a lead investigator, QMS author, and technical operations manager overseeing QDS authorship and validation approvals. Chris has designed and deployed AI/ML models that streamline deviation drafting, SOP generation, and CAPA support while maintaining strict compliance with FDA, EMA, and MHRA regulatory frameworks. He has consulted with top-25 global pharmaceutical companies on GxP-aligned AI pilots and is a recognized thought leader in AI governance for life sciences, speaking at PDA events and leading AI literacy initiatives for QA and regulatory professionals.

Takeaways:
  • Quality Assured AI focuses on the quality and regulatory side of pharmaceuticals.
  • AI can significantly reduce the time needed for documentation in the pharmaceutical industry.
  • Closed system deployments ensure data security and compliance with regulations.
  • Understanding and addressing hallucinations in AI is crucial for reliability.
  • AI should not continuously learn from user interactions in regulated environments.
  • Effective training and onboarding are essential for successful AI integration.
  • Feedback from users highlights the practical benefits of AI in quality management.
  • Collaboration with consulting groups enhances scalability and deployment efficiency.
  • AI literacy is vital for stakeholders in the pharmaceutical industry.
  • Future growth includes expanding AI capabilities to automate more processes. 

The Healthy Enterprise Podcast is produced by Bullzeye Global Growth Partners  https://bullzeyeglobal.com/


Creators and Guests

Host
Heath Fletcher
With over 30 years in creative marketing and visual storytelling, I’ve built a career on turning ideas into impact. From brand transformation to media production, podcast development, and outreach strategies, I craft compelling narratives that don’t just capture attention—they accelerate growth and drive measurable results.
Guest
Chris Dayton
Chris Dayton is the Co-Founder and CEO of Quality Assured AI, a company delivering closed-system, on-premise AI solutions purpose-built for pharmaceutical Quality Assurance and Quality Management. With over a decade of experience across protein, cell, and gene therapy manufacturing, he has served as a lead investigator, QMS author, and technical operations manager overseeing QDS authorship and validation approvals. Chris has designed and deployed AI/ML models that streamline deviation drafting, SOP generation, and CAPA support while maintaining strict compliance with FDA, EMA, and MHRA regulatory frameworks. He has consulted with top-25 global pharmaceutical companies on GxP-aligned AI pilots and is a recognized thought leader in AI governance for life sciences, speaking at PDA events and leading AI literacy initiatives for QA and regulatory professionals.
Editor
Griffin Fletcher
Griffin Fletcher is a Junior Project Manager who wears a lot of hats. He’s skilled in podcast and video editing, film production, cinematography, and social media management, bringing creativity and organization to every project he touches. Griffin also has a sports background—he’s worked in hockey analytics and as a referee—which sharpened his attention to detail and teamwork skills. With a BA in Economics, he mixes analytical thinking with a creative edge, making him a versatile and hands-on contributor to our team.
Producer
Meghna Deshraj
Meghna Deshraj is the CEO and Founder of Bullzeye Growth Partners, a strategic consultancy that helps businesses scale sustainably and profitably. With a background spanning corporate strategy, IT, finance, and process optimization, she combines analytical rigor with creative execution to drive measurable results. Under her leadership, Bullzeye has generated over $580M in annual growth and more than $1B in client revenue, guiding organizations through large-scale integrations, business transformations, and organizational change initiatives. A Certified Six Sigma Black Belt, Meghna’s superpower lies in strategic marketing and growth consulting, helping businesses grow through innovation, efficiency, and strong, trusted partnerships.

What is The Healthy Enterprise?

Join host Heath Fletcher on The Healthy Enterprise as he explores how healthcare leaders and innovators are transforming the industry from the inside out. Whether you’re a provider, tech entrepreneur, marketing strategist, or industry executive, these conversations deliver actionable strategies, innovative solutions, and human-centered insights to help you grow, lead, and make a lasting impact.

Created and Produced by Bullzeye Global Growth Partners — Let’s build it together!

Heath Fletcher:

Hello, everyone. Welcome to the Healthy Enterprise Podcast. Thank you for returning if you're back for more, and if it's your first time, I hope you enjoy today's episode. It's gonna be very interesting.

Heath Fletcher:

Chris Dayton is the cofounder and CEO of Quality Assured AI. It's a company delivering secure, closed-system AI solutions for pharmaceutical quality and regulatory workflows. With experience leading AI pilots for top global pharma companies and speaking at industry forums, Chris is helping life science organizations streamline compliance, reduce burnout, and accelerate digital transformation. Please welcome to this episode, Chris Dayton. Hey, Chris.

Heath Fletcher:

Hi. How are you today? Thank you for joining me for this episode. Welcome to the Healthy Enterprise.

Chris Dayton:

Yeah. Well, thanks for having me. I'm glad you guys reached out and we get to do this.

Heath Fletcher:

Yeah. Awesome. So why don't we start off just, you know, take this initial time to introduce yourself and the company and, tell listeners about, what you do there.

Chris Dayton:

Yeah. So I'm Chris Dayton. I am the CEO and cofounder of a company, Quality Assured AI. At the crux of it, we develop AI systems focused on supporting the quality and regulatory side of the pharmaceutical industry and, you know, life sciences in broader strokes.

Heath Fletcher:

Right. And was this an area that you were already in in the life sciences and and

Chris Dayton:

Yeah. So I actually spent the better part of the last decade in pharmaceutical manufacturing. I've done everything from running autoclaves to mopping ceilings. Yeah. Worked my way up.

Chris Dayton:

I was running a technical operations group, was responsible for floor execution and all the documentation. Admittedly, that got kind of tiresome, you know, deviations, CAPAs, SOPs, all of it. So we developed a language model that does technical writing. And then from there, we kind of built on the technical writing expertise of it. So, yeah, now we support a lot more on the quality and regulatory sides.

Heath Fletcher:

So interesting. You really embrace this whole, you know, flood of AI technology that's come into all our worlds, you know, not just in life sciences, but all over the place. But you've managed to find something, a way to implement it that really suits this niche of the industry. Right?

Chris Dayton:

Yeah. So our implementations, they're a bit different. We don't use those, like, you know, ChatGPT, OpenAI calls. We actually do completely secure, closed-system, on-premise deployments, which is really just fancy talk for we show up with all the hardware. We install it in client server racks.

Chris Dayton:

Everything lives within their VPN. No data ever leaves. It's kind of a neat trick we've figured out. Everybody thinks you need these massive data centers to run AI, and, you know, that's not true if you're not, like, a Gemini or, you know, Google or one of those.

Heath Fletcher:

Right. You don't need fields of generators to power this stuff, or solar. Right? And I'm really glad you brought that up, because closed system is an important term, the one you referred to. That means it doesn't, yeah, it doesn't leave the premises.

Heath Fletcher:

It doesn't leave. Your server-based information stays enclosed within the organization. Right?

Chris Dayton:

Yeah. So the way we usually do it, we can do fully air-gapped. It's all up to kind of the client and what they're willing to expose the network to. You know, if they want more of the API calls to pull from, like, your TrackWise or your Veeva, you open it up a little more. But we can do a completely air-gapped deployment, which is not something you usually see in pharmaceuticals.

Chris Dayton:

That's more reserved for, like, DOD, you know, SCIF-type stuff. But it's, you know, it's a hard sell to walk up to a pharmaceutical company and go, hey, give me all of your information. I promise I'm not gonna do anything with it. So the best way to do it, you know, deploy the hardware there.

Chris Dayton:

They know nothing's ever gonna leave, and then you can actually have more of those beneficial conversations with language models where, you know, you are writing deviations about something that actually happened or, you know, doing your audit responses that can actually pull from your site SOPs, which is, you know, one of the other neat tricks that we've managed to do.

Heath Fletcher:

Okay. And you're chucking some acronyms at me. I'm going, nope, don't know what that one is.

Heath Fletcher:

And so there's probably a few listeners that are saying the same thing. So part of my job is to kind of pull us back and ask, what are those acronyms? What do they mean? And bring us up to speed, because you're very familiar with some of these. So there's a couple you used there.

Heath Fletcher:

Can you go through those again for me, just to clarify what they are?

Chris Dayton:

Yeah. So I guess I'll start with some of them. So, like, air gapping, one of the less familiar terms. Yeah. That's when we have hardware that never actually connects to the Internet.

Chris Dayton:

Right. So what we do is we leave one port open. And then, you know, when the client's using it, it feels a lot like it's on the Internet, because you can just go to it from a web browser. Some of the other ones: QMS. I don't know if I hit on that one.

Chris Dayton:

That's quality management systems.

Heath Fletcher:

Okay.

Chris Dayton:

So that's, like, when you're making pharmaceuticals. Right? You want everything to go perfectly. Sometimes not everything goes perfectly. Sometimes you have to, like, pause what's going on, and then you have to write a report as to why you paused, how you fixed it, you know, why there's no impact from it.

Chris Dayton:

So that's something that we actually help with. Some of the audit responses, right? If you're, like, a CDMO, which is a contract development and manufacturing organization, you make different clients' products. I don't know if everybody knows this, but pharmaceutical companies can outsource their manufacturing, you know, to a different site.

Heath Fletcher:

Right.

Chris Dayton:

When that happens, you know, there's a lot of paperwork that goes along with it, like tech transfers, batch records, even audit responses. So, like, let's say I outsource some of my manufacturing. I can show up and audit them, make sure that everything's going well. And then usually, any of my findings, I draft a report about it. You send it back to whoever's doing the manufacturing. They respond to all of it with, you know, per our standard operating procedures, or SOPs.

Chris Dayton:

This is how we do things. This is why everything's fine. When the FDA does it, it's called a 483 letter. We help turn those around. You usually have about fourteen calendar days to get those turned around, and it takes a lot of time to draft those.

Chris Dayton:

So our system can have those drafted in a matter of minutes instead of a matter of days.

Heath Fletcher:

Interesting. And so, really, you're leveraging time, right, which is also money. So you're actually saving time, which saves money. And these are some of the activities that most of these organizations actually have to go through several times, over and over again, for each and every product. Right?

Chris Dayton:

Yeah. Absolutely. The way that we like to look at it is that in the world of pharmaceuticals, you make two things: you make drug product and you make paperwork. While everybody else is, you know, throwing their AI money at drug product trying to find the next big thing, I've always been on the other side of the equation.

Chris Dayton:

If the paperwork's not done, I know the drug product doesn't make it out the door. So we try to step in to, you know, help support your quality teams that are trying to do their lot disposition, which is, like, the release of the drug lot. It's a mountain of paperwork, and, you know, some of it's kinda templated. It kind of follows the same flow. So we tap into that flow with the templates and how everybody likes it to sound and read, and that's really how we help, you know, get the proper, well-made drug product out the door to the patients in need.

Heath Fletcher:

Is there an aspect of the technology that you're using that is helping with the drug development side of things, or is it mostly an administrative venture?

Chris Dayton:

So we don't do much in the way of development. We don't wanna create, like, a product, if you will, a system, if it's not the best system that we can make. You know, there's AlphaFold, some good open source stuff out there for protein folding.

Chris Dayton:

There's somebody that's better at, like, you know, the protein folding, trying to find the next good monoclonal antibody. On the flip side, I mean, our expertise was quality and technical operations. So we live in the language and reasoning model world, and we do pretty amazing stuff with it.

Heath Fletcher:

That's cool. I mean, it's great when you come from an area of expertise, so you know the gaps. You know the weaknesses. You know the holdups and the setbacks. So it's cool.

Heath Fletcher:

You're kinda taking that knowledge and now implementing it and using it with this artificial intelligence technology, which is really interesting. And it makes sense. You know? You know where the gaps are.

Chris Dayton:

Yeah. I actually had a joke about that when we first started, that the first couple models sounded exactly like me when they would write. I was like, well, these are the best models at doing this out there, and I'm the one they sound like. That must mean I must be the best out there doing this. That joke has a tendency to fall flat in some crowds.

Chris Dayton:

But

Heath Fletcher:

So it's made in your image, I guess.

Chris Dayton:

Yeah. There's definitely some times, like, with me and our CTO, Nick, where you read through some of the output and you're like, oh, that definitely sounds like me, and that definitely sounds like Nick. And you can kinda see where it's absorbed some of our writing techniques and, you know, how we investigate things.

Heath Fletcher:

Hey, all you gotta do next is create the AI of yourself and your voice, and you can just roll it all out, the two of you. It'd look like you and sound like you.

Chris Dayton:

Yeah. Believe it or not, that's actually something we've talked about. We're kind of calling it, like, the ghost models, where, right, you have these folks that have developed something, and then they move to a different department, different company, on to the next project. And it's like, well, they take all that knowledge with them.

Chris Dayton:

Man, it'd be really nice, like, you know, if you had their ghost around that you could talk to, that would have all that information and be able to answer all of your questions

Heath Fletcher:

Right.

Chris Dayton:

A year or two after they leave. And that's actually something that we're kicking around now with some of the consultants and everybody we work with, to be like, hey, we should really start building these ghost models.

Heath Fletcher:

That's an interesting idea. I mean, I guess the IP is yours, right? It belongs to the company, whoever is working there. And so it's really just emulating whoever it was that came up with it or developed it, there to tell you what they did and how they did it.

Heath Fletcher:

That's that's an interesting approach, actually. Yeah.

Chris Dayton:

The way we like to think of it, we have one model that really just functions as that QA assistant. And the way I like to think of it is, you know, like, when you're really trying to grind through something and you're like, oh, I need to go talk to, like, the QA director about this, but they're always in a meeting because they're the QA director. Like, man, it'd be really beneficial if I had something that just, like, knew the regulations or knew industry best practices, so you can kind of bounce ideas off of it and do some brainstorming without having to kick down somebody's door in the middle of a meeting for

Heath Fletcher:

Right.

Chris Dayton:

You know, a five minute question.

Heath Fletcher:

Interesting. Well, it'd be interesting to see what you guys come up with with that one.

Chris Dayton:

Yeah. That's something that's piqued our interest in the last few weeks. I think that's a direction we're gonna have to add.

Heath Fletcher:

Some value added products there.

Chris Dayton:

Yeah.

Heath Fletcher:

Yeah. So in the early days, when you were thinking about this as a business idea, were there any sort of setbacks or challenges where you were kinda like, I don't know if I wanna go down this road? Or, you know, was it just like, yeah, let's dive in. Let's go right into it.

Chris Dayton:

So I got the idea driving home. You know, I was reading the news about these 483 letters, and I've been there. Like, if you have a site at your company that gets a 483 letter, right? You know, the ones we're talking about, where the FDA does an investigation.

Chris Dayton:

You now have to respond to all this. Usually, you know, the rumors that kinda go out in the company are, oh, man. There's gonna be some layoffs. There's gonna be some type of impact.

Heath Fletcher:

Oh, really?

Chris Dayton:

Yeah. And, you know, the way we approached this was we don't want bad lots making it out the door. We don't want somebody that shows up to work every day and does the best that they can to lose their job because, at a different site, you know, a deviation maybe goes sideways. It doesn't get investigated correctly. Now there's, you know, FDA scrutiny.

Chris Dayton:

And it just kinda hit me that this is probably something we can use these models for. And it was one of those ideas that I just couldn't shake, you know, day two, week two, week three. I just reached out to one of the guys I was working with, and I was like, alright, what do you know about language models? And he's like, I don't know.

Chris Dayton:

Let's find out. And, yeah, we just started running with it. To answer your question, yeah, there have been some setbacks. Nothing quite goes the way that you hope and plan right off the bat. I know everybody in the AI world, they talk about hallucinations.

Chris Dayton:

That was quite the trick, to get it to not hallucinate. There's some prompting that goes into it. You have to do really good training. We developed a good RAG system.

Chris Dayton:

I don't know if you or the listeners are familiar with RAG. It's retrieval-augmented generation. Right? So one of the things that we really overcame was, when you do these deployments at, let's say, like, a pharmaceutical company, everybody says, oh, we wanna train it on SOPs. I can tell you we tried that.

Chris Dayton:

It doesn't work very well. The main reason is, as these SOPs go through revisions, you start to get conflicts in the neural network, where you might be asking it a question and your current revision is, you know, let's say, revision seven, but you've been training it since revision two. It might try to pull information from revision two to answer your question, and that's not current and effective, and that causes all these other problems. So we developed a really robust RAG system to go along with it, retrieval-augmented generation, where it will pull from your current and effective procedures and use that to supplement your prompting rather than having to train the model on it.

Chris Dayton:

And that in itself took a hot minute to figure out, if you will.
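
For readers who want to see the shape of what Chris is describing, here is a minimal, illustrative Python sketch of a retrieval-augmented-generation step: it looks up only the current effective revision of an SOP and folds it into the prompt, with an instruction to answer from that procedure or say "I don't know." The field names and prompt wording are hypothetical, not Quality Assured AI's actual pipeline.

    # Illustrative sketch only -- not Quality Assured AI's actual pipeline.
    # Field names, the SOP store, and the prompt wording are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class SOP:
        number: str      # e.g. "SOP-0042"
        revision: int    # superseded revisions stay in the store but are never retrieved
        effective: bool  # True only for the current, approved revision
        text: str

    def current_effective(sops, number):
        """Return the current effective revision of one SOP, never an old one."""
        live = [s for s in sops if s.number == number and s.effective]
        return max(live, key=lambda s: s.revision) if live else None

    def build_prompt(question, sop):
        # The retrieved procedure supplements the prompt instead of being baked
        # into model weights, and the model is told to refuse rather than guess.
        if sop is None:
            header, body = "PROCEDURE: none found", ""
        else:
            header = f"PROCEDURE {sop.number} rev {sop.revision}:"
            body = sop.text
        return (
            "Answer using only the procedure below. If it does not cover the "
            "question, reply exactly: I don't know.\n\n"
            f"{header}\n{body}\n\nQUESTION: {question}"
        )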

Heath Fletcher:

Yeah. No doubt. And just explain what hallucinations are, because, you know, some people may not really know what that means.

Chris Dayton:

So the crux of a hallucination

Heath Fletcher:

on some sort of recreational drug or something.

Chris Dayton:

Yeah. So these hallucinations, I guess to take a step back, these language models, how they're trained is on either organic or synthetic datasets. It's basically question-and-answer pairs. And as you train these models, every question has an answer.

Heath Fletcher:

Right.

Chris Dayton:

Very, very rarely in the training data is the answer to a question just flat out, I don't know. So when, you ask a model a question that it doesn't know the answer to, it just creates an answer because it's not trained to say,

Heath Fletcher:

I don't know. Right.

Chris Dayton:

Yeah. So in a regulated environment, you definitely do not want it to just start creating SOPs and creating regulatory guidelines, and, oh, this will probably be fine because, and then it just kinda trails off into something that it's creating. So, yeah, we've done a lot of work to tamp down those hallucinations, to the point where now, if it tells you to do something, it just assumes that you listened to it unless you told it otherwise. That's really the only place we're seeing

Heath Fletcher:

them now. Oh, very interesting. I mean, we've all experienced that. You use ChatGPT for something, and you're right, it will never tell you it doesn't know.

Heath Fletcher:

It'll give you anything, and it's not always necessarily accurate. Yeah. And that's kind of the scary thing, especially when you're talking about a world where accuracy is extremely important, in your world.

Chris Dayton:

Yeah. A thousand percent. And those hallucinations, when we first started, I mean, even today, when we talk to, like, the QA and the regulatory folks, because that's usually who we talk to, well, them and IT. They always ask, well, I've heard about hallucinations.

Chris Dayton:

What's happening? Why is that a problem? And then you kinda gotta walk them through a little bit of the technical, like, how these things work. That's actually why I started a webinar series. It's, you know, really short, fifteen-minute webinars.

Chris Dayton:

The first one was like, hey. Here's some key terms. Let's make sure we're all speaking the same language. Perfect example. Right?

Chris Dayton:

The software folks, you say API, they instantly go to API call. That's how we're transferring information back and forth. I go to the, you know, a QA director, and I go, oh, yeah. API. And the first thing they think of is active pharmaceutical ingredient, and those two could not be more different.

Chris Dayton:

Yeah. Those are two completely different things. So it's like, well, here's some new acronyms. Pharmaceutical industry, we love acronyms. We love a good acronym.

Chris Dayton:

But you kinda have to do that education of, like, hey, these are some new acronyms. This is how these things work. They don't work on words. They work on tokens, which are little pieces of words that it sticks together.

Chris Dayton:

You know, context windows, all of that. There's a good learning curve to it. But once everybody is speaking the same language, it gets a lot easier. Yeah.

Heath Fletcher:

I bet. And is the hallucination thing something you're always having to work with, with the technology? Like, are you always sort of reviewing it, making sure, I mean, where does the learning aspect of AI fit in here? Because, you know, you talk about, well, AI is constantly learning. It's always learning new things.

Heath Fletcher:

And are you always having to sort of test it out and make sure that it's, you know

Chris Dayton:

That is a great question. Okay. So in the normal world of AI, right, you have, like, your stateful models, which are like your ChatGPTs, where you make an account and it kinda remembers the way that you like to talk.

Chris Dayton:

In the world of, like, regulated pharmaceutical manufacturing, that's a pretty big no-no. Right? You don't want models that are constantly learning from the interactions people are having, for a couple reasons. Right?

Chris Dayton:

If I'm doing multiple drug products at my facility, I don't want data from drug product A to filter into the results of drug product B. I don't want client A's information going in to answer a question for client B, and vice versa.

Chris Dayton:

So that's one of the things that we've actually done: we do stateless. So each chat is independent of all the other chats.

Heath Fletcher:

Oh, they're siloed. Yeah.

Chris Dayton:

Yeah. That's kind of the way you have to do it. So you're not getting any of that data bleed from one into another.
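
A rough sketch of the stateless idea, with hypothetical names only: every question is answered from scratch with whatever context is supplied for it, and nothing is appended to a stored history or fed back into training, so one client's or product's data never bleeds into another's answer.

    # Illustrative sketch only: a stateless handler in the spirit described
    # above. The llm callable is a stand-in for the locally hosted model.
    from typing import Callable

    def answer_stateless(question: str, context: str, llm: Callable[[str], str]) -> str:
        # The prompt is rebuilt from scratch for every request: no session
        # object, no saved history, no learning from prior chats, so drug
        # product A's data can never surface in an answer about product B.
        prompt = (
            "Answer using only the context below. If it does not cover the "
            "question, reply: I don't know.\n\n"
            f"CONTEXT:\n{context}\n\nQUESTION: {question}"
        )
        return llm(prompt)

    # A stateful assistant would instead do something like
    #   history.append((question, reply))   # and reuse history on the next call
    # which is exactly the cross-client, cross-product bleed being avoided.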

Heath Fletcher:

Spilling over. Interesting. Wow.

Chris Dayton:

Yeah. Which also, on the regulatory aspect of it, allows us to do, like, some people call them PCCPs, predetermined change control plans

Heath Fletcher:

Mhmm.

Chris Dayton:

Where we can actually leave it to QA and or regulatory to go, hey. We actually have an update coming out. You know, there's some new regulations that came out. We have to update the model training. This is what we're going to train it on.

Chris Dayton:

When you guys are ready, let us know. We'll, you know, launch your change control, make sure that you're all good, we're gonna release a new version. We can come out, plug in the USB, update all the model training, make sure that it works well in the sandbox, that all the prompting is still the way that QA likes it, and then we can just, you know, roll over into the actual deployment for the new version. So we don't do continuous, like, updated training.

Chris Dayton:

Ours is very version controlled. We don't learn from chats that people are having with the model. We do use the RAG pipeline, you know, retrieval-augmented generation, to pull in the current and effective SOP. So you're always getting, like, what is effective today, while I'm asking the question.

Chris Dayton:

Yeah. But because of that, the system itself doesn't function like a normal ChatGPT where it knows everything. Like, it's very domain specific versus, like, the public ones, which is, you know, a whole thing of its own. But the domain specificity, like, the way I describe it: ChatGPT, really good if you wanna, like, draft an email, you know, draft some HR policy.

Chris Dayton:

But if you need to dig into, like, why is my pH a little different, or what's the best way to answer this observation? Like, you want something that's not necessarily gonna be giving you a really good, like, chocolate chip cookie recipe. You want something that's really focused on, like, how are the regulators gonna look at this, and what's the right way to word this so I don't open myself up to anything else.
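
One way to picture the version-controlled release process Chris describes is a release manifest that pins the model to a specific, QA-approved artifact. This is a hedged sketch with made-up field names and values, not a real Quality Assured AI document: the point is only that nothing about the model drifts between approved, change-controlled versions.

    # Illustrative sketch only: a hypothetical release manifest showing how a
    # model version can be pinned and tied to a change-control record. These
    # field names are invented for the example, not a real QA AI artifact.
    import hashlib
    import json

    def release_manifest(weights_path, version, change_control_id, approved_by):
        with open(weights_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return {
            "model_version": version,             # bumped only through change control
            "weights_sha256": digest,             # deployed weights match what QA approved
            "change_control": change_control_id,  # links to the site's change-control record
            "approved_by": approved_by,           # QA / regulatory sign-off
            "continuous_learning": False,         # weights never drift between releases
        }

    if __name__ == "__main__":
        manifest = release_manifest("model.bin", "2.3.0", "CC-2025-014", ["QA Director"])
        print(json.dumps(manifest, indent=2))     # placeholder values throughout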

Heath Fletcher:

Oh, wow. That's very interesting. And then how customized does everything have to be for each client? Like, there's probably a certain system of ways that you deliver it, but do you have to customize it slightly, or quite a bit?

Chris Dayton:

Yeah. So everything that we do is completely custom to the clients. Well, maybe not everything. There's a couple little things that don't necessarily have to be. So, like, RAG pipelines, those are completely customized.

Heath Fletcher:

Right.

Chris Dayton:

The system itself, we have the base architecture of how it works, with our datasets that we train it on, as well as some of the prompting stuff that we do, some of the parameter tweaking to get it good. But as far as a client deployment goes, it's gonna be completely custom. The only exception is if you're dealing with something before it becomes company confidential, and really the only time we see those is with those, like, startup biotechs. They're like, hey, we're starting up.

Chris Dayton:

We're gonna be outsourcing everything, and everybody's working remote. But we do have those, like, 20 or 30, like, base SOPs, like your doc controls, your quality management, all of those. We actually have some stuff that will draft those in a chatbot that will sit alongside of it to really bolster them up, you know, check it against your regulatory stuff, make sure it's good. And then once we hand those off, now it's client confidential, and then that goes into the next realm of, if you want the system, we can deploy it and all that.

Heath Fletcher:

So cool. When you first sort of threw this out there so that, you know, the world could see it and potential clients could start looking at it as an option, what was your initial feedback from people?

Chris Dayton:

The very first initial feedback that we get, even to this day, time and time again, is: what makes yours different from ChatGPT? Like, why can't we just go there?

Heath Fletcher:

Kidding. Really?

Chris Dayton:

Yeah. I mean, it's a great question. Right? Yeah. GPT-5 came out, what, two, three weeks ago?

Chris Dayton:

You know, not everybody likes it as much as GPT-4. It doesn't have that, like, sycophancy where it kind of supports you on everything. Yeah. But the question we always get is, why are you guys different? The reception from users has been far and away some of the best feedback we get.

Chris Dayton:

We had one user, they were kinda, you know, doing one of those pilot projects to see us versus ChatGPT.

Heath Fletcher:

Right.

Chris Dayton:

And in one of the conversations he had with it, he goes, oh, you know, it's nice that you're back up. You know, we'd pushed a new version. Oh, yeah, it's nice you're back up. I've been using the ChatGPT bot.

Chris Dayton:

He's cool. He's like my little cousin. Like, he's fun to hang out with, but he's not really that helpful. Yeah. The people that use it, I mean, we're seeing 30% time reductions from whenever

Chris Dayton:

Like, a deviation is assigned to a writer until it goes to QA. So that's you drafting your investigation, everything else, and we're shaving, what, four days off for somebody that's never written one before compared to somebody that is a seasoned professional, because these things do take, you know, eight, ten, twelve hours of just sitting there, really focused, typing.

Heath Fletcher:

Right.

Chris Dayton:

And our system can write a deviation in ten minutes. You just give it the information. It'll just section it off to where it needs to be sectioned,

Heath Fletcher:

That's incredible.

Chris Dayton:

putting in the rest of your filler words.

Heath Fletcher:

That's incredible. Wow. So the closed system, that must be a really important aspect for people? Once they know that, their comfort level seems to go up in the process?

Chris Dayton:

Yeah. So they do like the closed system. They like to know, I like to say, you know, we use the best IT security in the world, because we use your IT security. So whatever it is that the client is comfortable with on their site, it legitimately plugs in. We do DNS forwarding, so we just leave one, you know, IP port exposed, and all the information travels through there.

Chris Dayton:

It helps with the legal, you know, we talk to legal departments, the IT folks, that, you know, we're not like, oh, there are API calls going out. And then even with some of the, like, your GDPR, your General Data Protection Regulation, basically your data handling for, like, Europe and the UK. You can't really be tagging it to a person and all that other stuff. And it's, well, we're not sending it to offshore servers.

Chris Dayton:

It's it's you know, I can

Heath Fletcher:

Staying here.

Chris Dayton:

The room that it's in. Yeah.

Heath Fletcher:

There's

Chris Dayton:

Another good thing is we're not collecting data. I don't want client data. I'm not trying to, you know, sneak around and start my own pharmaceutical company. Like, your data is your data. It should stay with you.

Chris Dayton:

It should live with you. And with how much, you know, these pharmaceutical companies can dump into language models, you know, process specifications, the actual process as a whole, all your critical process parameters. I mean, if it was my company, I wouldn't feel too great about those just kinda getting sent over to a server somewhere. And, you know, what if it has to get undeleted? What if, you know, data gets caught halfway between?

Chris Dayton:

That's kind of where we started. And, you know, like we said earlier, we don't need massive data farms. It plugs into a regular wall outlet.
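
To make the single-exposed-port idea concrete, here is a toy Python endpoint bound to a made-up internal address: all traffic stays on the client's network and the process makes no outbound calls. This is only an illustration of the pattern, not the actual deployment, DNS setup, or security controls.

    # Illustrative sketch only: a toy endpoint bound to a made-up internal
    # address. The real deployment and security controls are the client's own;
    # this just shows the "one port, nothing leaves" pattern.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve responses on the LAN only; the process makes no outbound
            # calls to external APIs or cloud services.
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"on-prem model endpoint (placeholder)\n")

    if __name__ == "__main__":
        # Bind to an internal interface and a single port (both invented here).
        HTTPServer(("10.0.0.5", 8443), Handler).serve_forever()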

Heath Fletcher:

Yeah. Like you said at the very beginning, it's hardware you provide, so you're actually bringing in a piece of equipment, and that's where the data stays. Right? It's not on a cloud. It's not on a satellite.

Heath Fletcher:

It's not floating around virtually. So it's actually on-site, in that company's space. Are there backups? Does the backup go somewhere else?

Chris Dayton:

Nope. So we actually completely air gap when we deploy. So there's no Internet connection at all. The data is stored within the system, and then I guess you could say we have a backup in that same set of hardware.

Chris Dayton:

But that's for your, like, 21 CFR compliance, for, you know, making sure that your data integrity is there. It's gotta be enduring. I mean, that's one of ALCOA+. I'm not sure if you folks are familiar. It's attributable, legible, contemporaneous.

Chris Dayton:

It's how you have to document your data for the FDA. One of them is it has to be enduring. So once you generate data, you can't do anything with it. So our data, as it's generated, it's locked.

Chris Dayton:

It's actually really easy for QA to go back through and audit it. They can just hop in with their admin privilege, click on a person, click on the chat, and you can see everything that goes back and forth with it.
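
A minimal sketch of that "enduring," auditable record idea, not the product's actual storage layer: each chat exchange is appended to a hash-chained log, so entries are attributable, time-stamped, and tamper-evident when QA goes back through them. All names here are invented for the example.

    # Illustrative sketch only: an append-only, hash-chained chat log in the
    # spirit of "enduring" ALCOA+ records. Not the product's storage layer.
    import hashlib
    import json
    import time

    def append_record(log, user, prompt, response):
        prev_hash = log[-1]["hash"] if log else "0" * 64
        record = {
            "user": user,                 # attributable
            "timestamp": time.time(),     # contemporaneous
            "prompt": prompt,
            "response": response,
            "prev_hash": prev_hash,       # chaining makes silent edits detectable
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        log.append(record)                # records are only appended, never rewritten
        return record

    # QA review is then just reading the log per user or per chat and
    # re-verifying the hash chain.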

Heath Fletcher:

Oh, wow. So great. Like, it's, yeah. It's amazing. Yeah.

Heath Fletcher:

I have no doubt you're getting a lot of good feedback.

Chris Dayton:

Yeah. So the way that we designed it was, when Nick and myself sat down originally, I don't remember how long ago now, we asked ourselves: if, like, today, I had an AI system, what would I want it to do, and how would I build it, coming from, like, the pharmaceutical side of things? So it's not like we're software developers trying to, like, shoehorn a ChatGPT account into something for, pick any large pharmaceutical company. We were on the other side of the fence. We knew where the headaches were.

Heath Fletcher:

Right.

Chris Dayton:

Where you effectively can use it to bolster things up. And then, you know, the number one question is always, where's the data? So we don't build models unless we know the data is gonna exist to really support the boots-on-the-ground folks, like your deviation writers that

Heath Fletcher:

are

Chris Dayton:

I mean, I've done it. I know how stressful it gets. You can burn out pretty quick. You know? Because at the end of the day, you have to do it right for the patients.

Chris Dayton:

You wanna make sure you investigate everything. You chase down all your different avenues so that you're sure that you're right before you release. So, you know, we developed a system that tells you what to investigate, and then you just give it your notes, and then boom. There's eight hours of writing you don't have to do.

Heath Fletcher:

Yeah. And you just identified something really important: burnout. I mean, time, burnout, human resources, that's really what your whole business leverages, humans. Right? And so anywhere you can provide that kind of support or that kind of relief on burnout, or just exhaustion, you know, that's an amazing ability.

Chris Dayton:

Yeah. So when I was running the technical operations group, there were a lot of, like, deviation writers that come in. You know, they're from manufacturing. They kinda wanna step out of that being-on-the-floor role.

Heath Fletcher:

Mhmm.

Chris Dayton:

And it's a good two years before somebody's really solid. You know, they start off, you learn your root cause analysis tools, your six Ms, your fishbone diagrams, your five whys, which are just, again, probably more acronyms for how to investigate everything. But it takes some time to learn how to investigate, to learn how to write it, to learn how to draft it so QA is okay with it. And then when you get audited, you know, the FDA or the EMA pulls your recent deviations. They wanna talk to the people that investigated them.

Chris Dayton:

So there's really an art form to writing it.

Heath Fletcher:

Mhmm.

Chris Dayton:

And it takes two years until you get people trained. And then when they get really good, you know, the curse of competency, they get more and more work. And then you've got about a year and a half usually before they start to burn out and find another job, move over into QA, or just, you know, leave the organization. And now you're stuck, and you gotta do it all over again. So this

Heath Fletcher:

And during the whole process, someone else has to review their work. Right? As they're learning, someone else has to review it and proof it and everything else. And so you've got more than just one person involved in that whole process of getting somebody to a point where they are, you know, writing these things independently.

Chris Dayton:

Yeah. And there's this other thing that happens. This is more of, like, the inside baseball stuff. When you write these things, sometimes there'll be a couple different people, like your MSAT group, your manufacturing science and technology, your process development. You'll have a couple questions.

Chris Dayton:

They'll write a few sentences here and there, maybe a paragraph, and you'll have this mosaic of a report where I see what I wrote, and then this is, like, what MSAT wrote, and then here's PD. And now I have to sit in front of an auditor, and I'm like, oh, man, those two sentences don't sound like me. So one thing that we do is, when we sit down with QA, we workshop it with them to make sure that all that background, like, you know, prompting actually works to the tone that they want.

Chris Dayton:

So when everybody uses it, it's a very smooth-reading report, and not that kind of choppy thing where you can feel the tone change from person to person to person. And that's actually a pretty big thing, because it does help a lot to just have a smooth report.

Chris Dayton:

It flows a lot better. It also helps kind of chain that logic together, so that, as they say, you can, like, build the narrative: this is what happened, this is what we investigated, and this is why we're sure it's okay.

Heath Fletcher:

Mhmm. Wow. So how did you find this transition into your role as CEO? I mean, you're a cofounder and Nick's your other cofounder. Right?

Heath Fletcher:

Yep. How did you transition into the CEO role? Did was that smooth for you, or did you have some yeah. A bit of a learning curve yourself?

Chris Dayton:

Yeah. It definitely has a bit of a learning curve. The good news is I've had some good people to work with that have, you know, like, started up businesses before.

Heath Fletcher:

Yep.

Chris Dayton:

I will say, everybody warns you, like, oh, you know, there's gonna be some ups and downs too. And you're like, oh, I'm ready for some ups and downs. And that roller coaster of emotions is something that, no matter how many conversations I had, I definitely was not prepared for. Like, there's some days where you're just like, oh, man, this thing is working perfectly.

Heath Fletcher:

Yeah.

Chris Dayton:

And then you try an update or you try, like, a new model for something, and you're like, oh, man. We are starting out right at the beginning again.

Heath Fletcher:

Yeah. There is some hallucination going on there too. Different kind.

Chris Dayton:

Yeah. I'll put it this way. There's not always an answer to every question that

Heath Fletcher:

you have. No. So you mentioned role models. You had some good role models in the past that you worked with, and that was part of what helped you through that process. What else did you leverage?

Heath Fletcher:

What else did you kinda lean on in this? You know, it's a lot of personal growth, becoming the CEO of a company, especially one that you founded.

Chris Dayton:

Yeah. So, like I said, some of the good news is, you know, through my career I've always had the mindset of, whenever somebody needs help, do whatever you can to help them, because you never know when it'll come back to help you. That's turned out in spades for us. There's some folks that I've worked with, you know, those 20 consultants that are sharp as a tack.

Chris Dayton:

To this day, I can still call them, work through some of the questions I have. Hey, we're rolling out a new model for, like, quality risk management. You got a couple hours to kinda kick the tires on it, test it out? And what we found is that, you know, the folks that are consultants or that have spent so long in pharmaceuticals, they kinda wanna see what's out there in the AI world, and they're very willing to just, you know, step in and help. From those late nights when, you know, I was helping bail them out of problems,

Chris Dayton:

They'll, you know, hop on a meeting for a little bit to try to bail me out of one. That's been really helpful. The other thing is I'm a firm believer that you can't give from a cup that's empty, so you have to make sure that you're taking care of yourself. Mhmm. You know, you have to hold yourself to those schedules.

Chris Dayton:

Late nights are gonna happen. You just roll with them the best you can. But the overall health of the organization, I think, really comes down to the health of the people that are, you know, working in the organization. We're not a huge company, but I do try to make sure that everybody's in a good space before the really big work starts to hit.

Heath Fletcher:

When did you actually go to market?

Chris Dayton:

We went to market in October.

Heath Fletcher:

K.

Chris Dayton:

We spent a good amount of time in development. Not having run a business before, I didn't realize that, especially in these tech circles, people will have an idea and just try to lock down some funding and, you know, kinda go to market with hopes and dreams, if you will.

Heath Fletcher:

Yeah.

Chris Dayton:

We built it. We thought we'd hit, you know, minimum viable product, that MVP. And then we deployed, and we were like, oh, man, this thing works better than we thought it was going to. We probably could have done this, you know, six months ago and been fine.

Chris Dayton:

But, yeah, we launched in October, and since then, we've just kind of been talking to, you know, folks in the quality field. You know, asking them the question, where are your hang-ups right now? What are your real big time sinks? And then from there, we try to workshop, could AI help with this, or does the data just not exist, so this is just not something we'll be able to

Heath Fletcher:

And when you find that client, you know, what's the onboarding process once you engage? Is it a two-month window? Is it six months to kinda get them up and rolling? Like, what's that look like?

Chris Dayton:

So usually, the longest part is getting all of, like, IT, legal, and QA on the same page at the same time. From there, rolling out, it's usually about a month or so of, like, data normalization, building the data layers, all the DNS forwarding, the IT stuff. And then we like to do usually another month or two of, like, the consulting side, like the AI maturity assessments, making sure that quality is ready for it. We have a pretty robust training program that comes along with it.

Chris Dayton:

SOPs, work instructions of, like, hey, this is where you can use it, this is where you shouldn't use it. There's a big thing where, you know, people have to understand that it can be influential. Like, even the FDA put out some draft guidance in January on model influence versus, like, the impact of the decision, and you wanna, like, tamp down your risk profiles.

Chris Dayton:

So there's a lot of training that goes along with that of, like, you know, you have to make sure that you keep that human in the loop to review everything. You don't wanna just copy and paste it. You don't want it to create a document and then route it for approval without that human set of eyes.

Heath Fletcher:

Yeah. Right.

Chris Dayton:

And that's kind of hard to do when you're talking to folks that have been burnt out for a while. They're like, man, I just wanna hit seven buttons and have this thing be done and out of my hair. But you really wanna be like, hey, you know, take some time. Read it.

Chris Dayton:

It generates a lot of words very quickly. Make sure that you're actually reading the thing before you submit it. That's why we have ours function as a chatbot, so that you are sure you have that human-in-the-loop review, because somebody has to take it out of the chat window and then put it into your, like, TrackWise or your Veeva, whatever your management system is

Heath Fletcher:

Right.

Chris Dayton:

To make sure that, like, oh, this is the person that put this in. They're the ones that signed for it. This is why we know

Heath Fletcher:

it's, yeah. That makes sense. And then from a scalability point of view, for you, for the company, how quickly can you grow this? Because there is a certain amount of hands-on work, and you actually have to go on-site and install hardware.

Heath Fletcher:

And so what does scalability look like for you? How quickly do you think you can grow?

Chris Dayton:

So the good news is we've kinda shifted a bit on how we do our deployments. It used to just be, like, myself and the folks at, you know, Quality Assured. We call ourselves QA AI for short. Instead of just doing the QA AI deployments, we've actually started working with other consulting groups. So, even independent consultants, we work with them to kind of walk them through, like, this is the AI maturity assessment.

Chris Dayton:

This is how you wanna look at it. You wanna make sure that everybody's trained. Like, if you talk to somebody in the c suite versus, like, management versus more boots on the ground, do they all know the same kind of terminology which

Heath Fletcher:

For sure. Yeah.

Chris Dayton:

Comes in.

Heath Fletcher:

Probably not. Yeah.

Chris Dayton:

Yeah. And the other thing is, I like to say we do words, not wisdom. So we found that partnering with consulting groups really helps, because our models will get you 90% of the way there, but you need that wisdom that comes over top of it. You know, it'll generate an SOP, and we have the checks in place to check against the regulatory bodies. But you're not gonna replace twenty years of wisdom, of seeing so much stuff go wrong and being through so many audits. We found you really do need that extra layer of consulting wisdom from the folks that have been there.

Chris Dayton:

Yeah. That's why we pivoted to working with more consulting groups, which I guess is the long way to answer your question about scalability. Yeah. We use a lot of NVIDIA.

Chris Dayton:

The CUDA architecture is there. So if there's more data that we need to train on, we can drop in more GPUs. And then, because we work with consulting groups, it's kind of like a force multiplier for me, for, you know, my team, so that once we know that they're trained up, they can go out and help with the deployments as well. Which we found they really like, because now they're part of the AI world, and they're not gonna be left in the dust. They're gonna be the folks that really know what's going on and know the regulations and everything.

Heath Fletcher:

Well, you brought up a good point. I mean, people are being asked to find ways to incorporate AI into organizations to streamline things. And so there are people whose job is to actually go and find a way to get their company in the AI game, because there are some companies who haven't found a way yet. So this is one way, anyway, for some organizations in pharma to get out there and get some AI into their network. That's cool.

Heath Fletcher:

So when we first talked, you mentioned a webinar, and you're doing webinars as part of your marketing initiative. Is that right? It's a way to educate the public?

Chris Dayton:

Yeah. So marketing, we're kind of approaching it from a couple different avenues, if you will. We do the webinars to try to educate folks. We think that AI literacy is something that people really should have.

Chris Dayton:

Whether it's from what is a token, to what are hallucinations, some of the stuff we've talked about, all the way up to the real big one: does the data exist to solve this problem that we have, and then kinda how to scope that out. We also do a lot with, this is how you properly scope out a pilot project, this is where you should look for a quick ROI. You know, it's always helpful if there's an inspection coming and somebody needs to get through a lot of paperwork. You see that ROI a bit quicker.

Chris Dayton:

But the other side to the marketing, besides just, like, the webinars and the outreach, is we do, like, events for PDA, the Parenteral Drug Association, I probably pronounced that wrong, and ISPE, your International Society for Pharmaceutical Engineering. We try to do these to just educate the folks that are there.

Chris Dayton:

I like to say, like, you know, always set up a meeting, reach out. If we can help with it, we'll absolutely try to. But if AI won't work, we'll still part as friends, and at least we got to have a cool conversation about what AI can do.

Heath Fletcher:

Mhmm.

Chris Dayton:

But that education piece of even if you don't go with our system, I would like to know that at least you were a bit more educated and you were making a proper choice just because we were able to talk about it.

Heath Fletcher:

Yeah. I like that. And then, you know, growth for you, looking ahead, where do you wanna be in the next two to five years? Is it growing the venture, or is it something where you may be looking at acquisition or something like that?

Chris Dayton:

So right now, I mean, we have some big plans. Right?

Heath Fletcher:

Okay. Good.

Chris Dayton:

We're working with what we have. We're building the models that we can, the architecture. But we really want to be able to take, like, somebody that has an idea for a drug product, incorporate, like, our language and reasoning models, and be able to build all that paperwork, scale into your INDs and your BLAs, so, like, your investigational new drug applications, your biologics license applications, the ones that really get you out there. That's a lot more paperwork. But, you know, you need bigger and better models to do this. Another thing that we'd really like to move into is being able to just build a quality system from scratch.

Heath Fletcher:

Mhmm.

Chris Dayton:

So, like I was saying, hey, you have a drug product idea. I'm not gonna say it's just hitting a couple buttons, but a lot quicker than it used to take, we can now have your quality system in place. And then the moonshot would be to automate some of the simpler stuff, maybe do some more in the way of machine vision, tap into those security cameras.

Chris Dayton:

Like, you know, you need to, like, clean a BSC, right, a biological safety cabinet. Let your disinfectant sit for fifteen minutes. You know, you're in there. You clean it. It kinda starts a timer internally.

Chris Dayton:

It pings when you're at fifteen minutes, so you know you've done it right. And, if not, maybe that happens on a Sunday, and you show up Monday and your deviation's halfway written because it's got a lot of historic data to pull from.

Heath Fletcher:

Very cool. Well, yeah, it sounds like there's no limit to where you can take this, and the technology is just gonna improve as time goes on, and the better it'll get, I'm sure. So awesome. Hey, this has been really interesting, and thank you so much. You really picked up on the acronym explanations quick. You said them and then you immediately defined what they were.

Heath Fletcher:

So thank you. I know listeners will appreciate that too. So

Chris Dayton:

Oh, yeah. I always always forget about those.

Heath Fletcher:

I know. Yeah. Every industry has got them. So I I remind everybody. So thanks for doing that, and thanks for your time today.

Heath Fletcher:

Are there any parting words of wisdom for, you know, someone maybe in your shoes that's working their way into the industry that you would like to share?

Chris Dayton:

Yeah. Don't be afraid to try it out. If you really wanna start, there's some smaller open source models out there that you can kinda start, you know, kicking the tires on and playing around with. It's a really great place to get your feet wet. If you really believe that AI is coming for everybody's jobs, which, I do this for a living, it definitely isn't, you know, try to see if you can teach it what you know how to do, and maybe you'll be the next person that's got the great idea that's really filling that void, or that niche that nobody really identified as being a problem.

Heath Fletcher:

Right. That's good advice. And how about for listeners who are actually thinking, oh, I need this. Who do I talk to? Where do I go?

Heath Fletcher:

They can go to the website, which is, what's your address?

Chris Dayton:

So we are qualityassured.ai. You can check us out on our website. It's got some of the stuff up there. You can reach out on LinkedIn. We have, like, a contact us page on the website.

Chris Dayton:

Reach out. We can set up a demo. If you or any of your listeners have any great ideas, or, you know, a headache, something that you would love to not be a headache anymore, shoot me a line, shoot me a message. I'd love to talk about it and see if it's something we can help with.

Chris Dayton:

That's where we get some of the coolest ideas: oh, wow, this is a problem they have. I think we might be able to solve this in a unique and interesting way.

Heath Fletcher:

That's great, Chris. Alright. And I'll put all those links in the notes below so people can just click on them and get in contact with you. Thank you again for your time today. I really appreciated our conversation, and I wish you all the best.

Heath Fletcher:

Okay. Well, that brings us to the end of that episode. Lots to learn from Chris Dayton, CEO of Quality Assured AI. He shared his personal journey into the role of CEO, and how he has been able to harness his passion for tech and problem solving to tackle challenges in an industry he knows deeply. Chris highlighted the value of that closed system, or air gapping as he called it, and the customization required in a highly regulated environment like pharmaceuticals.

Heath Fletcher:

And the real-world impact: reducing the time that quality assurance documentation takes up by 30%, addressing burnout, and also creating more space for innovation beyond the technology. Thank you for joining me for this episode of the Healthy Enterprise. Please subscribe and share it with someone you know who would appreciate it. And I hope you have a great day. Thanks for listening, and we'll see you next time.