During the early development of many therapeutic products, analytical and data management processes are often created quickly and pragmatically. Excel spreadsheets, macros, and bespoke code are widely used because they are flexible and allow rapid iteration while assays are evolving. And, since many novel products do not make it past the R&D stage, it rarely makes sense to invest time and resources into a GxP-ready solution any earlier than is required.
For those products which do make it to the later stages of the development process, however, pragmatic solutions no longer cut the mustard. Processes that worked well during development may not have been designed with the controls, documentation, and traceability expected in a regulated environment. A higher standard of rigour is required to ensure that the analysis not only meets the stringent regulatory requirements of GxP but, ultimately, that the risk of harm to patients is minimised. In short, validated software solutions are a key requirement in any GxP environment.
Key Takeaways
- Bespoke tools are common, but risky in GxP contexts. Spreadsheets, scripts, and custom-built analytical tools are invaluable during R&D, but their flexibility, lack of controls, and susceptibility to error make them unsuitable for regulated use without a robust validation strategy.
- Traditional IQ/OQ/PQ validation can be burdensome for bespoke software. While full validation is possible, it can lock analyses to specific software versions, create ongoing revalidation overhead, and introduce productivity and security challenges, particularly for tools such as Excel.
- Independent cross-checking offers a proportionate validation alternative. Approaches such as Diverse Self-Checking Pairs Programming provide strong assurance by independently reproducing analyses in a different platform, enabling efficient validation and revalidation while retaining bespoke analytical methods.
The question, therefore, is this: what to do next? One approach could be to employ one of the several commercially available GxP-ready software packages. This is a good solution in many cases, but it can fall short if bespoke analysis choices are not included as part of the software package. Even if it is possible to fully recreate the analysis, you are still effectively reinventing the wheel.
Would it not be better to have the option to continue to use the existing bespoke software by validating it for GxP? Here, we’ll discuss approaches to validating bespoke software, including examining common reasons why bespoke software might not meet the regulatory standard for GxP and how these can be overcome through a strategic approach to validation.
Why Validate Software?
Quality is a key consideration at almost every stage of the pharmaceutical manufacturing process, so it is a concept which will be familiar to many in the field. A high-quality product is one whose key properties – including identity, purity, and potency – are established to be within well-defined specification limits. There are two main components to ensuring the quality of a product: Quality Assurance (QA) and Quality Control (QC). QA is the overall strategic design and optimisation of a process to enable a high-quality product to be manufactured and to minimise errors. QC, by contrast, consists of the checks on the product implemented throughout the process to catch any errors which do occur and prevent them from propagating into a risk to patients.
Thanks to the complexity of the calculations and modelling required for testing pharmaceutical products, software is inevitably employed as part of most modern manufacturing processes. A key element of QA for these processes, therefore, is ensuring that the results are free of errors introduced by the software. If we’re making high-leverage batch release decisions based on the output of relative potency software, we had better be sure that the software doesn’t think that 2+2=5.
That’s where validation comes in. Software validation attempts to ensure that the chance of errors being introduced to the manufacturing process by software is minimised. This is a requirement for software used in a commercial pharmaceutical manufacturing process: 21 CFR 211.68 (a) states that “Automatic, mechanical, or electronic equipment [including software]…shall be routinely calibrated, inspected, or checked according to a written program designed to assure proper performance”.
Traditionally, there are three main components to validating software for a GxP environment:
- Installation Qualification (IQ): tests that the software is correctly implemented and installed.
- Operational Qualification (OQ): tests that the software functions correctly across its defined operating range, including edge cases and failure modes (a brief code sketch of this idea follows the list).
- Performance Qualification (PQ): tests that the software functions as required under real-world conditions.
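To make the OQ idea concrete, here is a minimal sketch of what such checks might look like in code. The relative_potency() function, its inputs, and the expected values are all hypothetical and invented purely for illustration; a real OQ protocol would be a formal, documented test script run against the actual software.

```python
# Illustrative sketch only: OQ-style checks for a hypothetical
# relative_potency() function, with made-up inputs and expected values.

def relative_potency(test_ec50: float, reference_ec50: float) -> float:
    """Toy relative potency: ratio of reference EC50 to test EC50."""
    if test_ec50 <= 0 or reference_ec50 <= 0:
        raise ValueError("EC50 values must be positive")
    return reference_ec50 / test_ec50

# Cases spanning the normal operating range and its edges, each with a
# pre-specified expected result (all values purely illustrative).
cases = [
    {"test": 1.0, "ref": 1.0, "expect": 1.0},  # nominal case
    {"test": 0.5, "ref": 1.0, "expect": 2.0},  # high-potency edge of range
    {"test": 2.0, "ref": 1.0, "expect": 0.5},  # low-potency edge of range
]

for case in cases:
    result = relative_potency(case["test"], case["ref"])
    assert abs(result - case["expect"]) < 1e-9, f"OQ check failed: {case}"

# Failure mode: invalid input should be rejected, not silently processed.
try:
    relative_potency(0.0, 1.0)
    raise AssertionError("expected a ValueError for a non-positive EC50")
except ValueError:
    pass

print("All illustrative OQ-style checks passed")
```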
The Challenges of Bespoke Software in a GxP Setting
For many products in R&D, it can make sense to develop software solutions which are useful for only a small number of bioassays, or even just one. The software might use an unconventional model, for example, or suitability criteria might be hard-coded. It is in this sense that such solutions are bespoke: they are designed and used for a very specific purpose.
As a result, bespoke software solutions are often less robust than commercially developed packages. Many are simple R or Python scripts, and still others are built as spreadsheets in Microsoft Excel or other similar packages. And these are perfect for developing an analysis method. Easy to change and quick to run, they provide exactly the flexibility which is required when innovating and optimising an analysis.
This flexibility, however, becomes a problem in a GxP setting. When we need to be sure of how the software is going to perform every analysis, the ability to make changes – even unintentional changes – has the potential to increase the risk of harm to patients. Consider, for example, a spreadsheet built to analyse the results of a relative potency assay. It would be all too easy to unintentionally overwrite a cell containing a key suitability criterion. In development, this isn’t a problem: you simply move onto the next assay. But for a routine assay, this could lead to an incorrect decision being made about the suitability of a batch of product with all the consequences that entails.
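As a toy illustration of how quietly this can happen, the Python sketch below mimics the spreadsheet scenario: a suitability criterion (a hypothetical minimum R-squared of 0.95) sits in an editable value, just like an unprotected cell, and a single accidental edit flips a failing batch to a pass with no warning. All numbers are made up.

```python
# A toy illustration of the failure mode described above, not a real assay
# system: the suitability criterion lives in an easily edited value, much
# like an unprotected spreadsheet cell.

def assay_is_suitable(r_squared: float, criterion: float) -> bool:
    """Return True if the model fit meets the suitability criterion."""
    return r_squared >= criterion

suitability_criterion = 0.95   # intended acceptance criterion (hypothetical)
observed_fit = 0.90            # this batch's model fit (made-up number)

print(assay_is_suitable(observed_fit, suitability_criterion))  # False: correctly flagged

# The equivalent of overwriting the criterion cell: one stray edit and the
# same batch now passes, with no error and no visible warning.
suitability_criterion = 0.85
print(assay_is_suitable(observed_fit, suitability_criterion))  # True
```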
A further consequence of the flexibility provided by bespoke software is the inevitability of bugs creeping into the code. As anyone who has attempted to code an Excel spreadsheet can attest, it is all too easy for a stray bracket or a missing dollar sign to cause an otherwise-functional analysis to come crashing down around your ears. Even more consequential are the bugs which don’t get spotted and have an undetected influence on the results. Again, in development, these bugs are usually little more than an inconvenience, but in a GxP environment they can prove a catastrophe.
Validating Bespoke Software
All of this means that, when it is time for an assay to move from R&D into regulated use, the use of bespoke software can prove a challenge. Very rarely is it appropriate to take a homemade solution straight out of development and into a GxP environment.
As we mentioned previously, one option is to migrate the analysis into a commercial GxP-validated system. This increases confidence in the integrity of the analysis – it is reasonable to expect that any bugs which might have affected the analysis results will have been detected and corrected before release to commercial users. Most GxP software packages also implement rigorous change control processes, meaning changes to the analysis – whether intentional or otherwise – are difficult without detailed documentation.
While this can be appropriate in many cases, it may not always be the most practical route. Rebuilding analyses can require significant time and effort, particularly where methods are non-standard or highly assay-specific. Reimplementation introduces delays and requires teams to adapt to new tools late in the development lifecycle. Not to mention that most (but not all) GxP-approved software still requires a full IQ, OQ, PQ validation before it can be used in anger.
An alternative approach, therefore, is to perform a validation of the bespoke software solution. This can save time and resources: if you’re going to have to validate your GxP software solution anyway, why not have it be the one which does exactly what you want it to do already?
It is certainly possible to go through the full IQ, OQ, PQ validation process for bespoke software. This, however, can lead to problems down the line. For example, the spreadsheet for an analysis might only be validated for a specific version of Excel. Any update to Excel might introduce changes to key formulae, so it is crucial that the GxP analysis is only ever run in that specific version of the software. That means that, for the life of the assay, any update to Microsoft Excel – a ubiquitous, multipurpose tool – becomes a high-risk operation requiring an expensive revalidation process. In reality, in many cases the software is simply never updated, which can reduce productivity elsewhere and even raise security concerns as software versions fall out of support.
Such concerns apply whenever traditional validation processes are used, no matter the platform on which the bespoke software is built. And that's to say nothing of the time and resources required for a full validation using this approach. Thankfully, there exist alternatives which fulfil regulatory requirements while overcoming the challenges presented by traditional validation.
One such approach is known as Diverse Self-Checking Pairs Programming (DSCPP). This employs the principle that if two different systems give the same answer, then this is strong evidence that the answer is correct. It's like checking multiple weather apps to figure out whether it's going to rain: if two trusted apps say it's going to be dry, you'll be more confident leaving your coat at home than if you had only checked a single source.
To validate a bespoke software solution using DSCPP, the analysis is replicated in a second piece of software. This forms a validation process with two branches: one branch being the original software solution, and the other being the replica built for the validation. Both branches are then tested using a series of identical inputs which represent normal usage as well as edge cases and failure states. If the branches provide identical results for identical inputs, then the original analysis can be considered validated for GxP use.
Importantly, the second branch should be written using a different platform or language: if you were validating an Excel spreadsheet, you might write the second branch using R code, for example. This ensures that any errors which result from the implementation of key functions or tools in each platform are not replicated across the two branches. Where possible, the branches should also differ in the methodology used to implement the analysis method.
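As a rough sketch of what the cross-check itself might look like, suppose the original Excel branch exports its results for the validation data set to a CSV file and the second branch is written in Python. The file name, column names, toy potency calculation, and tolerance below are all illustrative assumptions rather than a prescribed protocol.

```python
# A minimal sketch of the DSCPP cross-check, assuming the original (Excel)
# branch has exported its results for the validation data set to a CSV with
# columns "sample_id", "test_ec50", "reference_ec50", and "relative_potency".

import csv

TOLERANCE = 1e-9  # pre-specified acceptance criterion for agreement

def python_branch_potency(test_ec50: float, reference_ec50: float) -> float:
    """Independent (toy) re-implementation of the potency calculation.

    In practice this branch would re-fit the full dose-response model from
    raw data, ideally via a different algorithmic route from the original.
    """
    return reference_ec50 / test_ec50

def cross_check(excel_results_path: str) -> list[str]:
    """Return the sample IDs where the two branches disagree beyond tolerance."""
    discrepancies = []
    with open(excel_results_path, newline="") as f:
        for row in csv.DictReader(f):
            excel_value = float(row["relative_potency"])
            python_value = python_branch_potency(
                float(row["test_ec50"]), float(row["reference_ec50"])
            )
            if abs(excel_value - python_value) > TOLERANCE:
                discrepancies.append(row["sample_id"])
    return discrepancies

if __name__ == "__main__":
    failures = cross_check("excel_branch_results.csv")
    print("Validation passed" if not failures else f"Discrepancies: {failures}")
```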
Combined, these principles ensure that the probability of the two branches making an error and still producing the same result is vanishingly small. To produce such an outcome, there would need to be two different errors – one in each branch – which occurred simultaneously and coincidentally produced the same incorrect result. As a result, cross checking the results from the two branches provides a strong test for the validity of the analysis.
A key benefit of a DSCPP validation approach is that it is far more efficient to implement initially than traditional IQ, OQ, PQ validation. Rather than demonstrating the validity of the software in two or three separate stages, the DSCPP approach demonstrates that the software is fit for GxP use in a single run of a validation data set, testing the three validation components simultaneously. This also means that any revalidation exercise can be even swifter once the validation branch is in place. If an Excel sheet is validated using DSCPP, an update to the Excel package need only be followed by another comparison against the validation branch. If everything matches, then routine use can continue as normal.
Carving a Path
Bespoke analytical tools such as spreadsheets and statistical scripts play a crucial role in the development and ongoing support of bioassays. Their flexibility and transparency make them invaluable during assay design and optimisation, and for many organisations they represent a deep repository of scientific and operational knowledge. The challenge arises when those same tools are carried forward into a GxP environment, where flexibility must give way to control, traceability, and assurance.
Where it is not feasible or desirable to replicate the bespoke analysis in a GxP-ready software platform, the answer is often to validate the bespoke solution directly. Approaches such as Diverse Self-Checking Pairs Programming demonstrate that it is possible to meet regulatory expectations without unnecessary burden. By independently verifying analytical results using a second, deliberately different implementation, organisations can achieve a high level of assurance while retaining the benefits of bespoke analysis. Crucially, such approaches also offer a sustainable path forward, enabling efficient revalidation as environments evolve, rather than locking critical analyses to frozen software versions.
Ultimately, the question is not whether a spreadsheet or script can be validated in principle, but whether its use is supported by a validation strategy that is proportionate, well-reasoned, and defensible. When validation is approached as a tool for risk reduction rather than a box-ticking exercise, bespoke software can continue to support GxP decision-making with confidence, preserving both scientific intent and regulatory integrity.
Quantics has supported many organisations facing the transition from development to routine use, helping teams apply regulator-aligned validation strategies to bespoke analytical tools developed during early-stage work. If you are considering how to bring homegrown analysis into a GMP environment, we can help you assess your current processes and determine an appropriate path forward that meets regulatory expectations without unnecessary rework.


