QT9 Q-Cast

Episode 7: Software Validation: A Practical Guide

Watch Episode 7 Below

Episode 7: What happens when an FDA investigator asks for your validation evidence—are you ready, or scrambling? In this episode, QT9 Vice President Max Austin breaks down FDA and ISO 13485 software validation requirements in plain language. From risk-based validation to common misconceptions and real-world inspection scenarios, this conversation demystifies what “fit for intended use” truly means.

Learn how QT9’s developer-level validation supports life sciences and medical device companies, why risk assessments are essential, and how cloud-based QMS software can significantly reduce validation burden while keeping you compliant and audit-ready.

What this episode explores:

  • What FDA and ISO 13485 really expect from software validation

  • Validation vs. verification explained in practical terms

  • Why risk-based validation matters—and how to right-size your effort

  • How QT9 provides developer-level validation to support inspections

  • Common pitfalls that cause audit stress and how to avoid them

Watch now to understand validation the right way—and see how QT9 QMS helps you stay compliant, confident, and inspection-ready. Learn more at QT9 Software.

Tags & hashtags:
QT9 Software, QMS validation, FDA validation, ISO 13485, software validation, life sciences compliance, medical device QMS, risk-based validation, FDA inspections, cloud QMS, regulated manufacturing
#QT9 #QMS #FDACompliance #ISO13485 #SoftwareValidation #LifeSciences #MedicalDevices #QualityManagement #RegulatoryCompliance

 

Episode Transcript

Christian (00:00)
Picture this. It's a Tuesday. The coffee is perfect. And an FDA investigator walks in and asks for your validation evidence. Could you produce it? Or would you have a panic attack on the spot? Today we're unpacking what the FDA and ISO 13485 expect from you.

What happens if you skip validation, and how do you do it right the first time? Our guest, the validation whisperer himself, Max Austin, Vice President of QT9, is here to join me and talk you through validation. Max, thank you for joining us today. Now, as a starter, what's the number one most ridiculous misconception that you hear when it comes to validation?

Max (00:35)
There are so many. I don't know if I could really pull one. What I could say about any regulatory standard is that they're always open for interpretation. And when I go through a training or talk to people about any kind of regulatory standard or whatever, I kind of treat them like they're a sandwich. You know, meat in the middle and two slices of bread.

Christian (00:37)
It is tough to pull just one.

Max (00:57)
To get a little bit more in detail, it's like a delicious deli sandwich from Katz's Deli in New York. Which, if you don't know what that is, the famous scene from When Harry Met Sally. I've been there, great sandwiches. But you put like five pounds of meat in between two little slices of bread. And everyone approaches it like they just want the delicious meat, and the bread is just kind of a thing that holds it all together. And you look at a regulatory standard, they will always have an introduction, the body of work, and a conclusion at the end. There's so much good information that's contained within

the bread, if you will, that a lot of people just skip over. You can go to a simple ISO 9001. So many people get confused between what is a shall, what is a should, what is a can, what is a may. They read it in the standard and it says you should do this. Is that a requirement? Do I have to do that? Or is it a shall, where that is a requirement? So a lot of the stuff of just going through the bread of it

could clear up so much, but so many people just skip over it and they get right to the middle and they just focus, hyper-focus on one part of it and they just don't understand it because they haven't taken the context of the entire document and you get a lot of confusion out of that. But within that, yeah, there's a million other examples of people getting things wrong and not understanding what they're exactly trying to achieve.

Christian (02:03)
Yes, and honestly, having dealt with validation for over a decade myself, I would say there are still some questions out there. So it's great that we're doing this today. I'm going to lay it out for even the most simple layman in terms of validation. But let's roll back just a second and start by defining what validation means.

Max (02:24)
Sure. And again, you know, if you put it all into context, you get a much clearer picture. It's maybe not the best answer, but it provides some clarity as to why there's so much confusion. So the number one document that everyone always likes to refer to and go to is the FDA's guidance on validation. And they put a lot of weight into it, or meaning behind it, or strength behind it. And there's nothing wrong with that, but you need to put it in context. That guidance document was originally written in 1997.

Which means they didn't just put it together in '97. They were working on it throughout the 90s. In '97 is when they finally published it. It had a slight revision in 2002 or 2003, I think. It was not a substantial change; they were kind of cleaning up a few things. So you have to put yourself in that mindset. This document that we're all trying to be obedient to and honor and give justice to was written in '97. What did computers look like? What was the internet

like in 1997? What were computer software programs like in '97? How did we get them? And it's very much written in the mindset of, you know, maybe I go to a store and I buy some software and I bring it back to my company and we install it on a local server or a network. And then we have to install client versions, maybe on each individual machine or PC. And laptops really weren't a thing back then.

Christian (03:33)
Yeah, 97 is taking it back.

Max (03:35)
It'd be really rich to have any kind of portable computing device, and what existed was about the size of a suitcase. In 2025, this is the document we're using to validate software systems. So the language in there, a lot of it goes back to that. You know, they talk about a printable, readable format. For a lot of software systems, printing a report wasn't something they factored into their dev cycle. They just made software. You know, today we make sure that every form has a print option so you can

Christian (03:39)
Yeah, portable being relative.

Max (04:00)
print it out, make a readable format. That's why a lot of old keyboards have that button on there that says Print Screen. You know, back in the day, in the 90s, if an auditor was looking at a company and a software system that they were using in the production cycle, they might say, hey, I would like to see evidence of that. There might not have been a printout; you'd have to print the screen. So a lot of it comes from that kind of philosophy of what computing was like in the 90s. So unfortunately, we are today trying to validate software systems based on this very old and dated document.

I think that's where we start getting into a lot of the confusion and misconceptions when people are talking about validation.

Christian (04:31)
Absolutely. In terms of validation versus verification, I know some people get caught up on that. Would you say it's fair to say that verification is more asking, did we build it right, while validation is more about whether the system, as we use it, is fit for its intended use?

Max (04:50)
And I can even take it a step further and make it worse for you, because there is the life sciences world, the FDA world, of what verification and validation is. And in the software development world, verification and validation mean two different things. It's true. Yeah. And we sit right in the middle of it, because we are a software developer selling a product that's used by medical device and life sciences companies. So we kind of get it from both ends. We get it from our devs and then we get it from our customers. You know, what is verification and validation?

Christian (05:12)
Yeah, we have to.

Max (05:17)
the summary you gave was pretty accurate.

Christian (05:18)
Cool, cool. So what does fit for intended use actually look like for a QMS, let's say?

Max (05:24)
Sure, and that should be the number one question you're asking whenever you're looking at a validated system. Now, today we're mainly talking about QT9 quality management software, which is quality assurance software. But validation applies to any software system that is somehow going to impact a product that could eventually affect the health and safety of a human being. So the scope and intended use, that should be the first question you ask, because that is going to set the path forward

to how much validation effort is required. It's going to impact your risk assessment for evaluating that system. If it is a software system that is eventually going to wind up implanted in a human being, that scope statement is going to have a significant impact on how you validate it and how much validation effort you put into it, as opposed to a

lighter software system, maybe a spreadsheet or something like that, that you're just using to keep track of records. And for that failure, ultimately, sure, could you come up with a scenario where a bad thing would potentially happen? Yeah, but you'd have to string a lot of things together. That ultimate failure would not necessarily lead to an immediate risk to the safety or health of a person. So that statement is very important and critical in just setting the stage for what you do next.

Christian (06:38)
Yeah, regulators in their infinite wisdom expect that software impacting product quality, patient safety, data integrity, any of that, all needs to be validated. But they also say that the depth of that validation should be based on a risk-based analysis, which to me kind of sounds like code for: don't overdo it if you don't have to.

Max (06:58)
Yeah, and the FDA, to their credit, actually calls out many times in that document from 1997 to use the least burdensome approach. They're basically tipping their hat. My interpretation when I read that is that they're saying, we want you to make sure that your software has been tested, is working, and is doing its intended-use function. And at multiple points throughout that standard, again, if you go into the bread and you look into it

and you read the story, they talk about statistics. You know, prior to requiring validation, how many problems did we have with patient care? And of all those problems, how many of them actually had a root cause of a software failure or a code failure or a bug? And there was a significant number. All they're basically saying is, we're just trying to reduce that number. They understand you can never get it to zero, but in the absence of doing nothing, we should at least do something, which is validation and verification of your software,

to at least try to catch as much as we can and reduce it, and again, ultimately improve health and safety of end users.

Christian (07:56)
So ultimately, it does sound like there are different validation methods based on the risk presented by any particular scenario or the software that you're using, piece of equipment, what have you. Now, a bit of a side question. Does equipment validation mirror what we see in software validation?

Max (08:14)
It does. So say you have a machine, something as simple as a CNC machine, and it manufactures parts that eventually go into a finished good. That's probably controlled by a computer. So you're going to want to run a test to make sure that when a piece comes off that CNC machine, it's made properly. If I told it to be two inches tall, is it two inches tall? I want to verify and validate that the outputs match what my inputs were.

So equipment, software, data processing, it doesn't matter what it's doing. We always want to make sure that the outputs are matching the inputs we put in, so that we have expected results. And that's basically the process, whether it's a piece of equipment, machinery, anything. Again, it all starts the same way: we want to define what the scope of this particular piece of equipment is. How does it fit into the big cog of all the things we're putting in place to manufacture a good?

And what's the risk if it fails or gets it wrong? Are there other safety nets downstream that could potentially catch it? Does that mess with our production? Do we need to make an improvement? A lot of things can lead to that, and that's when you get into the thick of it.
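The "outputs match inputs" check Max describes for the CNC example can be sketched as a tiny automated test. This is a hypothetical illustration only; the function name, tolerance, and evidence-record format are invented here, not QT9 functionality or an FDA-prescribed method:

```python
# Hypothetical sketch: verify a measured output against the commanded input,
# and keep a small pass/fail record as validation evidence.

def verify_output(commanded_mm: float, measured_mm: float, tol_mm: float = 0.1) -> dict:
    """Return an evidence record for one output-vs-input verification check."""
    deviation = abs(measured_mm - commanded_mm)
    return {
        "commanded_mm": commanded_mm,
        "measured_mm": measured_mm,
        "deviation_mm": round(deviation, 4),
        "pass": deviation <= tol_mm,
    }

# A part commanded at 50.8 mm (two inches) and measured at 50.85 mm
record = verify_output(50.8, 50.85)
print(record["pass"])  # True: within the assumed 0.1 mm tolerance
```

The point is not the arithmetic but the artifact: each run produces a record you could hand to an auditor, which mirrors the "packet" Max mentions later.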

Christian (09:23)
At least they kind of mirror each other in terms of process.

Max (09:26)
Yeah, and again, it's going to be a risk assessment. It's going to be how much testing we need to do, and what steps we need to compile, so that when that FDA auditor does come in, if they were to pick out that piece of machinery or that piece of software, we're prepared with a packet that says: we've measured the risk, we've taken these steps, we've confirmed all these other things, and ultimately we've determined that this process is effective and we're confident in what we're looking for.

Christian (09:49)
So it is important to keep that risk register, if you will, with the validation, correct? Absolutely. Now, what types of software fall under this glorious scope of validation? I mean, obviously QMS systems, ERP systems, those are wide-ranging, but you mentioned spreadsheets, like spreadsheets with macros. Is anything safe from validation?

Max (10:11)
And again, that's why that first thing you talked about a few minutes ago is so important. You know, what is the scope? Because the scope is going to determine ultimately where we go from there. So yeah, if you're looking at any software system or any automated system, and the scope of that is going to have a potential impact on the health and safety of your end users or patients or whatever, then yeah, you should definitely be validating that. And if you don't,

and something bad were to happen, then that would be bad.

Christian (10:38)
Problems. Problems. And that transitions us nicely into the next segment, where we get a little dramatic: the oops-I-forgot inspection scenario. Walk us through a nightmare inspection scenario. An investigator asks for your validation evidence and you have nothing. What do they typically want to see in the first place?

Max (10:57)
Sure. The good news is, with QT9 specifically, for all of our life science customers, we provide developer-level validation. What that is, is we basically take the software application and we simulate what any one of our end users, our customers, might use it for. It's fairly generic, it has to be. We have a wide base of customers, but we do our best to simulate it.

And the nice thing about our software is that it's not too overly specific, so we can use generalizations when we're doing our testing, in such a way that we get consistent, good results that most of our customer base can use. So all of our customers at any point have access to the validation that's in our customer portal. They can log in and grab a copy of it. It is large. It's about 1,400 pages.

It takes about six weeks to complete an entire validation. That's our entire team working on it, you know, as we improve our software and make new versions. So it's a robust document. At the very least, you would have that to provide to an FDA auditor. What I would suggest going beyond that is, you know, if you're a brand new customer of QT9, the first thing you would want to do is a risk assessment. And you have software that can manage risk.

Christian (12:07)
No,

I know, but with the software, there's a module for that.

Max (12:10)
So you just go to that module and you're going to initiate a risk assessment. That's where we're going to start measuring the risk, and we're going to start looking at our scope statement and doing planning as to how deep we're going to go with this. And this is where I kind of have to stop, because every company is going to be different. I can't tell you how to create your risk. I can't tell you how many risk assessments you need to create. You need to kind of decide that on your own. One of the misconceptions a lot of people have is that we're going to do it all for you. It would be nice if we could do that. We would love to be able to provide that as a service to our customers. The problem is,

and then we kind of get back to where we started in the beginning: the documentation that we're all going off of was written in 1997. SaaS didn't exist. So there are certain components of validation that that document is looking for. As an example, your installation qualification. Our customers don't install the software. We do all that. They just go on a website and access it. So we do that portion for you. We have an entire section dedicated to the installation qualification.

For the OQ and the PQ, we simulate that. We try to come up with test case scenarios and examples to demonstrate that in the software, inputs are matching the outputs, and we have screenshots, so many screenshots, that go into it. And we produce this very large document showing that everything's been tested, all the buttons, all the fields. We come up with as many test case scenarios as we can to try to trip it up, and we are confident that this software will work for its intended use. Now, that's our description of intended use. Every customer will have their own.

So that's when they kind of go on their own. But that would be a good start to get going on your validation.

Christian (13:35)
Definitely, definitely. So if you've got QT9, you're covered. QT9 has your back.

It's definitely a lot easier if you've got a system like QT9, where they provide a level of validation, developer validation. Now, if you're using an eQMS that does not provide that at all, are we talking about twice the amount of work, three times the amount of work? How does that compare?

Max (13:54)
Definitely

harder, and I can't put a number on it, because again, it comes down to the scope statement and the risk. You know, big scope, big risk, big work. Little scope, little risk, little work. That's the easiest way you can kind of set it up. Yeah, if you're looking at an entire QMS system and you're starting from ground level, and the developer has given you nothing to go off of, that's going to be a pretty substantial task, because you just have to start from somewhere. You know, every section, every module that you're going to use,

you should test and validate with some level of testing, and be prepared to show that to the FDA when they come knocking.

There's kind of a famous case.

It was a big-ticket item. It was NASA. Basically, they were making these filter devices for the space shuttle system. And the supplier had sent a screen material that NASA had spec'd out. But they sent two different materials, all under the same part number. Now, when NASA was

doing risk assessments on all the systems and all the components that go into the space shuttle, when they got to this filter thing, they said, this is pretty straightforward. It's pretty simple. It's a metal mesh. It gets wrapped around in a circle. It gets installed in the space shuttle. Hard to screw up. So what are we going to do for receiving inspection on this particular good? Just look at it. Make sure the part numbers match. Make sure we got the right good and the right quantity. And if everything checks out, put it in inventory. It's that easy.

Well, they found out later on that they got these wrong materials, and the consequence of that would have been complete loss of the shuttle at launch. And that's a bad thing. When they caught that mistake, they immediately went back and revamped the risk assessment process for that. Now, here's the interesting thing about that whole story: at some point you have to draw the line. I understand why, a long time ago, maybe NASA said this is not a significant risk, we're not going to do it, because

for every component on that thing, probably 90% of it, you could come up with a case scenario where, if it fails, a really bad thing is gonna happen and someone's gonna get hurt. But if we do so much testing, so much rigorous investigation, then we never get to launch, because we're always tied up in testing.

So it's always a balance, but measuring risk and figuring out where you draw those lines, every company needs to do that on their own and do their best and try to document as much as they can. And as long as you're showing a good concerted effort, you should be fine. But again, hindsight's 20/20. It's always hard to look back and say, how did we miss this? You know, how did that get through?

Christian (16:05)
Hindsight's 20/20. Now, in terms of the support that your team gives to QT9 customers and companies that are utilizing QT9 QMS, when the FDA comes in and does the inspection, have you ever had one of the QT9 customers call you and either have you speak to the...

Max (16:06)
Thanks.

Sure,

customers are always looking for some clarification. We've definitely got those panic phone calls: I have an auditor here, I wasn't ready for this, where are these documents? And we point them in the right direction. So we're always happy to jump in and help and provide some information. It's a 1,400-page document, so maybe they're spending an hour trying to find something. They can call us up. We know exactly where that code, that spec, that thing they're looking for was tested, and we can call it up really quickly. And we're happy to do that.

You know, one of the things that our customers need to realize and understand from that perspective is, again, going back to this 1997 standard: it's really written about all of your software being contained within your facility, and there is no SaaS. You know, even though we do validation and we test under as many scenarios as we can, using multiple browsers, you know, Firefox, IE, Edge, Safari, we try to test all these different environments, there are still

a lot of things that we don't control, and potentially the customer doesn't even control, that could affect the performance of the software. How's it going over the wire? How is it getting received into your building? Do you have firewalls in place? Do you have email filtering? If we're not whitelisted, email notifications aren't going to get through. That's a performance-of-the-software issue, because we've proven that it works on our end. But by the time that data has gotten all the way through the wire onto the PC in your building,

is there something interfering with it that's going to make the software have some kind of anomaly that we just can't anticipate? And that's why you have to do end user testing. All of our customers have to do end user testing. They have to get on that. They will call us up and ask, you know, not necessarily because an FDA auditor is there; maybe they're just having trouble with the validation. They'll reach out to us and ask for clarification. A big-ticket item we're getting right now, with a lot of people looking for guidance and support, is SSO. There's nothing

in the 1997 guidance that talks about it. You kind of have to interpret it, and they're like, well, where's your validation and testing on SSO? We did it. We tested it. But what are you using for SSO? Are you using the same platform we're using, or a different one? Because if you're using a different one, you're going to have to test for that. So the end user, our customers, are always going to have to do some work, and they're going to have to make those determinations as to what they're going to do on their end. And we're always here and happy to help and answer questions.

Christian (18:10)
Single sign-on didn't exist in the nineties.

Do you guys charge for that? Do they have to pay if they're a customer, a QT9 customer?

Max (18:36)
We're

happy to help. We're your friendly neighborhood quality management software manufacturer. So give us a call. We like building relationships with our customers, being involved, and pointing them in the right direction. It would be nice if they listened a little bit better. We're giving you good advice, we're pointing you in the right direction. Usually we get that pushback when it comes to, sorry, you have to test that. But we don't want to! I get it. I'm not bringing you the best of news. We can't do this for you. You have to actually test that on your own.

Christian (19:00)
I'm sure that's at least mitigated by the fact that 80 to 90 percent of the additional work is already done for them. Have you ever had an agent of the FDA or an ISO 13485 auditor challenge the QT9 validation report?

Max (19:14)
You'll get some FDA auditors that aren't sure about a certain thing or a test, or they just can't find something, and that's usually a simple phone call where they get to us. But we've never really been challenged per se, where they say this is garbage, this doesn't work. Again, we're a software developer, we're doing our best. The onus is on the end user to prove this stuff. But at the same time, that doesn't mean we just wipe our hands and we're done.

The feedback I've gotten from the vast majority of our customers that have gone through an audit, when I ask them how the validation of QT9 QMS went, is, you know, no one prints this thing out. It's just an electronic file. But when the auditor opens up that file and sees how big it is, and they just keep scrolling and scrolling and scrolling and see all the test cases that we go through and the screenshots and the evidence, they're just like, that's clearly tested. Yeah. And then they look at your risk assessment and your own supporting documents and everything else, and they're just like, done.

They have other things they want to look at. This is one of the things they have to check for; they're not spending a lot of time on it. They're going to go through all of your systems. You probably have multiple other systems whose validation they have to go through, and they just need to check the boxes. They might not even look at all your systems; they might just do a sample. Show me your QMS. Show me that CNC machine over there. Show me that they're validated. And if you've got three out of three, you're fine. If you've got one out of three, they're probably going to do some more digging.

As long as you're covered and you're doing due diligence and you're validating the systems that are appropriately needed, you should be fine.

Christian (20:31)
All right, well, kind of wrapping up here, would you be able to list out the five essentials for our listeners that would ensure that they're inspection ready when it comes to validation?

Max (20:41)
I don't want to say there are X many essentials. It's going to be different for everyone. We deploy primarily in two different formats. You are either SaaS, so you go on a website, you just log into your site and manage your data, or you're on-premise, which means you've purchased the software and you're going to install it locally at your organization.

Which way you deploy will completely determine how you validate. Unfortunately, if you're on-prem, you've got to do a lot more validation, because you can no longer lean on our installation qualification. You have to come up with your own installation qualification. There's literally nothing we can do there, because they're your servers. We don't touch them. We're hands-off. We don't even know what you're using. So you would have to do that completely on your own. So are there five steps, seven steps, 12 steps, two steps? It all depends on how you're deploying it and how your organization intends to use the application.

So for essentials, I would just say: a scoping statement. There should always be a risk assessment. There should always be some kind of sample inspection. If your software developer or machine manufacturer has provided you with some kind of footprint testing or things they've done on their own, you want to include that information. But you want to do some kind of sample testing yourself. Trust but verify. I trust the software vendor. I trust this machine manufacturer. But I want to verify that what they're selling me is

going to meet our expectations and our use, you know, as long as we're operating within the specifications the developer provided to us. So as long as you've got that information when your auditor comes in, you should be fine. Again, auditors are human. They have all different kinds of personalities. So it all depends on what kind of auditor you get, and, you know, whether the coffee shop had their favorite flavor of coffee that morning or was all out. A lot of things can impact how an audit goes. Be nice, be supportive, answer questions, and

auditors tend to be very good.
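Max's "big scope, big risk, big work" rule of thumb resembles the severity-times-likelihood scoring that many risk assessments use to right-size validation effort. The sketch below is purely illustrative: the 1-to-5 scales, thresholds, and tier names are invented for this example, not an FDA, ISO 13485, or QT9 method:

```python
# Hypothetical risk-based scoping sketch: score each system by severity of
# failure and likelihood of occurrence, then map the product to a suggested
# validation depth ("big risk, big work").

def validation_depth(severity: int, likelihood: int) -> str:
    """severity and likelihood on an assumed 1-5 scale; returns an effort tier."""
    score = severity * likelihood
    if score >= 15:
        return "full"      # e.g. software ending up in an implanted device
    if score >= 6:
        return "targeted"  # e.g. a QMS module touching product records
    return "minimal"       # e.g. a record-keeping spreadsheet

print(validation_depth(5, 4))  # full
print(validation_depth(3, 2))  # targeted
print(validation_depth(1, 2))  # minimal
```

Where exactly the thresholds sit is the line-drawing problem Max describes with the NASA filter story; each company has to set and document its own cutoffs.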

Christian (22:17)
Awesome, awesome. Well, Max, we really appreciate you taking the time to jump on the Q-Cast here today. Where can listeners or QT9 customers contact you or your team if they want to reach out?

Max (22:31)
So obviously our website; there are tons of areas where you can reach out to us for contact. If you're a QT9 user, you can submit support requests and go through that channel. Also, if you're a QT9 user, there's a link in your help section where you can go to the community. It's a community built of, basically, forums for all of our users to go in and collaborate, work together, share case examples of things that have worked at your organization, and ask for help from other users on unique data management issues you have and how others might have handled them.

We have an entire section for ISO 13485 and med device. You can go in there for additional information on guidance and, you know, how you want to validate.

Christian (23:07)
Absolutely. Well, that's all the time we have for today. But thanks to you again, and thanks to our listeners for tuning in. If you enjoyed or learned something from Max, please like, comment, and subscribe so you don't miss our next episode. Until then, stay compliant.

Max (23:10)
I gotta get back to work.