Involving appropriate statistical support from the early stages of ELISA development can help to ensure a linear development pathway by optimising the experimental designs and thus minimising repeated work. Mathematical simulation can replace the need for some laboratory work altogether. This combination can save considerable time and expense throughout your product development life cycle.
BIOANALYTICAL STATISTICS IS COMPLEX
Quantics can help you understand the statistical principles of ELISA analysis so you can achieve a well-characterised, reliable, precise assay with low failure rates.
End-to-end statistical support.

ELISA Development Studies
Interpolation Analysis
In drug development, interpolation analysis (IA) is used to determine the concentration of a target analyte in a sample. It is a common technique in ELISA analysis and is based on the assumption that the dose-response curves of the reference standard and test samples are the same.
Quantics are highly experienced in this type of analysis and have recently developed a new interpolation analysis method which both simplifies the sample suitability criteria and makes use of all of the data points, including data beyond the upper and lower limits of quantification that would normally be discarded in traditional interpolation analysis. This is particularly useful when analytes are present at very low levels (e.g. host cell proteins), and can save you substantial time and money, especially with high-throughput assays.
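For readers unfamiliar with the mechanics, the sketch below illustrates the standard interpolation step only: a 4PL curve is fitted to the reference standard and inverted to back-calculate a dilution-corrected sample concentration. The concentrations, optical densities and starting values are illustrative assumptions, and the code does not represent the new Quantics method described above.

```python
# Minimal sketch of standard interpolation against a 4PL reference curve.
# All concentrations, ODs and starting values are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4-parameter logistic: a = response at zero dose, d = response at
    infinite dose, c = EC50, b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Back-calculate concentration from an observed response."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Reference standard: known concentrations and measured optical densities
conc = np.array([0.5, 1, 2, 4, 8, 16, 32, 64])              # e.g. ng/mL
od   = np.array([0.08, 0.15, 0.27, 0.48, 0.80, 1.20, 1.55, 1.75])

params, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.2, 10.0, 1.9], maxfev=10000)

# Interpolate a test sample run at a 1:10 dilution
sample_od, dilution = 0.65, 10
estimate = inverse_four_pl(sample_od, *params) * dilution
print(f"Interpolated concentration: {estimate:.1f} ng/mL")
```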
Contact us to find out more about how you can reduce data wastage in ELISA assays with our Interpolation Analysis method.



Relative Potency
A relative potency measure is often used in cell-based ELISA formats for lot release and stability testing, to determine if two substances are similar in biological activity. It is also used in cell-based ELISA development, for example measuring antibody-drug conjugate binding to target antigens on a cell surface. Because of the inherent variability of a biological drug measured in a biological system, relative rather than absolute potency is used as a measure of biological activity of the drug.
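As a rough illustration of how a relative potency can be estimated, the sketch below fits reference and test data with a constrained ("parallel") 4PL model that shares asymptotes and slope, and reports RP as the ratio of EC50s. All doses, responses and starting values are illustrative assumptions, not a recommended analysis.

```python
# Minimal sketch of a relative potency estimate from a constrained
# ("parallel") 4PL fit: reference and test share asymptotes and slope,
# and RP is the ratio of their EC50s. Data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def parallel_4pl(x_and_flag, a, b, log_c_ref, log_rp, d):
    """x_and_flag: (dose, is_test). Test EC50 = reference EC50 / RP,
    i.e. a horizontal shift of log_rp on the log-dose axis."""
    dose, is_test = x_and_flag
    c = np.exp(log_c_ref - is_test * log_rp)
    return d + (a - d) / (1.0 + (dose / c) ** b)

dose    = np.tile([1, 2, 4, 8, 16, 32, 64], 2).astype(float)
is_test = np.repeat([0.0, 1.0], 7)
resp = np.array([0.12, 0.25, 0.48, 0.85, 1.25, 1.55, 1.72,   # reference
                 0.10, 0.18, 0.34, 0.62, 1.02, 1.40, 1.62])  # test (less potent)

p0 = [0.05, 1.2, np.log(10.0), 0.0, 1.9]
params, cov = curve_fit(parallel_4pl, (dose, is_test), resp, p0=p0, maxfev=10000)
rp = np.exp(params[3])
print(f"Estimated relative potency: {rp:.2f}")
```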
Quantics are leading experts in determining relative potency for complex biological assays and have been helping clients to develop their cell-based ELISAs for 15 years. We can help you design and optimise your assay at any stage of development.
In 2017 we launched QuBAS: a feature-rich biostatistical data analysis software package with all of the statistical methods required to help you develop and validate your bioassay or ELISA, and to support routine GMP use.
Parallelism Testing
Parallelism testing is important in both interpolation analysis and relative potency (RP) analysis, as it is the primary way of confirming that the unknown is behaving biologically like the reference (and thus that the reference is appropriate). In the standard interpolation analysis method, the %CV around the dose-adjusted results from two or more dilutions is a surrogate measure of parallelism: if the dose-adjusted results are very different, this suggests non-parallelism. In RP analysis, parallelism can be tested directly in a number of ways.
See the Quantics paper, Parallelism in Practice: Approaches to Parallelism in Bioassays, and the associated flowchart to guide users to the best option for parallelism testing. Download HERE
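The sketch below shows the %CV surrogate in its simplest form: dilution-corrected results for one sample are compared, and a large %CV is flagged. The values and the 20% limit are illustrative assumptions only, not a recommended suitability criterion.

```python
# Minimal sketch of the %CV surrogate for parallelism described above:
# dose-adjusted (dilution-corrected) results from several dilutions of the
# same sample should agree; a large %CV suggests non-parallelism.
# The 20% acceptance limit below is an assumed, illustrative value.
import numpy as np

# Dose-adjusted concentrations for one test sample interpolated from the
# standard curve at three dilutions (illustrative values).
dose_adjusted = np.array([41.8, 44.2, 39.5])   # e.g. ng/mL

mean = dose_adjusted.mean()
cv_percent = 100.0 * dose_adjusted.std(ddof=1) / mean
print(f"Mean = {mean:.1f} ng/mL, %CV = {cv_percent:.1f}%")
if cv_percent > 20.0:                           # assumed suitability limit
    print("Possible non-parallelism: review before reporting.")
```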
PROJECT INSIGHT
A major pharmaceutical manufacturer had been running an assay for GMP batch release for some years without formal parallelism testing. The FDA had warned them that the method would need updating. Quantics was instrumental in:
• The development of a new analysis method that included formal parallelism testing
• Optimising the experimental design to significantly reduce the laboratory costs
• Establishing new system and sample suitability criteria for the new process
• Formally validating the new process
• Documenting all the analysis changes for FDA submission
The subsequent FDA submission was successful, allowing the company to continue production of the biologic.

Choice of Statistical Model
Quantics have years of experience in helping clients choose the optimal statistical model. We can advise on data transformations, outlier management and the handling of values beyond the limits of quantification (LOQ) to ensure the model fits the data well and is robust to data variability.
Why does model choice matter?
The statistical model has a major impact on assay accuracy, sensitivity, reliability, and cost. Some models are more likely to fail a test item if the RP is low, or the data are highly variable.
For a traditional ELISA, a quadratic model is usually the model of choice for analysis; however, the curve fit should always be checked before generating results.
Common models for continuous response (e.g. optical density) are:
- 4 Parameter Logistic (4PL)
- 5 Parameter Logistic (5PL)
- Linear*
These models are suitable for cell-based ELISAs with a relative potency measure.
*The linear model is mathematically simpler, but may result in (unnecessary) assay failures with low RP due to failure of parallelism tests.
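As an illustration of why the curve fit should always be checked, the sketch below fits a linear (log-dose), 4PL and 5PL model to the same plate of data and compares residual sums of squares and a simple AIC. The data, starting values and diagnostics are assumptions chosen for the example, not a prescribed model-selection procedure.

```python
# Minimal sketch of comparing candidate curve fits (linear on log dose,
# 4PL, 5PL) for one illustrative plate of reference-standard data,
# using residual sum of squares and a rough Gaussian AIC as diagnostics.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([1, 2, 4, 8, 16, 32, 64, 128], dtype=float)
od   = np.array([0.10, 0.19, 0.36, 0.63, 0.98, 1.31, 1.55, 1.68])
logc = np.log(conc)

def linear(x, m, k):
    return m * x + k

def four_pl(x, a, b, c, d):
    return d + (a - d) / (1.0 + (x / c) ** b)

def five_pl(x, a, b, c, d, g):
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

def aic(rss, n, k):
    # Gaussian AIC up to an additive constant: n*log(RSS/n) + 2k
    return n * np.log(rss / n) + 2 * k

fits = {
    "linear (log dose)": (linear, logc, [0.3, 0.0]),
    "4PL": (four_pl, conc, [0.05, 1.0, 16.0, 1.8]),
    "5PL": (five_pl, conc, [0.05, 1.0, 16.0, 1.8, 1.0]),
}
for name, (f, x, p0) in fits.items():
    p, _ = curve_fit(f, x, od, p0=p0, maxfev=20000)
    rss = np.sum((od - f(x, *p)) ** 2)
    print(f"{name:18s} RSS={rss:.4f}  AIC={aic(rss, len(od), len(p)):.1f}")
```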
CUSTOMER INSIGHT
“Overall Quantics helped us minimise the number of assays required but also helped us understand the importance of the equivalence test and how it should be applied. The end result was that the FDA accepted the submission which was the goal. I will make sure to recommend Quantics if any further work comes up.”
Mark Kuy – Ipsen
Variability and ELISA DoE
Any assay that measures either a biologic drug or a drug in a biological system can be highly variable. This variability impacts the response and hence the precision of the reportable value. Quantics can help you understand the contributions of the various factors to the overall variability, in order to design an assay that will meet all regulatory requirements.
The variability inherent in ELISA assays (both traditional and cell-based) arises from a range of sources. Discovering and controlling the important ones can be made much more efficient using a formal Design of Experiments (DoE) approach, which provides statistically efficient analysis of the data, optimising the speed of development and minimising resource use and costs. In some situations, simulation can be used to reduce the laboratory work required.
Instead of studying one factor at a time, these techniques provide an efficient multi-factor approach that generally requires fewer experiments to achieve optimisation. They also provide insight into interactions between factors that affect ELISA performance, which can help in further development or validation.
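As a simple illustration of the multi-factor idea, the sketch below lays out a two-level full-factorial design for three hypothetical assay factors; with eight runs, main effects and two-factor interactions can be estimated from a single experiment. The factor names and levels are illustrative assumptions only.

```python
# Minimal sketch of a two-level full-factorial DoE layout for three
# assay factors. Factor names and levels are illustrative assumptions,
# not a recommended design.
from itertools import product

factors = {
    "incubation_min": (30, 60),
    "coating_ug_ml": (1.0, 2.0),
    "wash_cycles": (3, 5),
}

# Every combination of low/high levels: 2**3 = 8 runs, allowing main
# effects and two-factor interactions to be estimated from one experiment.
runs = list(product(*factors.values()))
for i, levels in enumerate(runs, start=1):
    settings = dict(zip(factors.keys(), levels))
    print(f"Run {i}: {settings}")
```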
Dilution Group Optimisation
Quantics can save you time and experimental costs by using simulation studies to determine the design that will optimise assay performance and throughput. Simulation can explore the impact of dilution group spread, distribution, and the number of replicates required for optimal assay response.
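The sketch below shows the kind of simulation this involves, under assumed values: synthetic plates are generated from a notional "true" 4PL curve and noise level, refitted, and the precision of the recovered EC50 compared between two candidate dilution schemes. All parameter values and designs are illustrative assumptions, not recommendations.

```python
# Minimal sketch of using simulation to compare two candidate dilution
# schemes. An assumed "true" 4PL curve and noise level generate synthetic
# plates; the precision of the refitted EC50 is compared across designs.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
TRUE = dict(a=0.05, b=1.2, c=10.0, d=1.8)    # assumed true curve
NOISE_SD = 0.04                               # assumed response SD

def four_pl(x, a, b, c, d):
    return d + (a - d) / (1.0 + (x / c) ** b)

def simulate_cv(dilutions, replicates, n_sim=200):
    """%CV of the fitted EC50 across simulated plates for one design."""
    x = np.repeat(dilutions, replicates)
    ec50s = []
    for _ in range(n_sim):
        y = four_pl(x, **TRUE) + rng.normal(0, NOISE_SD, size=x.size)
        try:
            p, _ = curve_fit(four_pl, x, y, p0=[0.05, 1.0, 8.0, 1.8], maxfev=5000)
            ec50s.append(p[2])
        except RuntimeError:
            continue  # a real study would also track failed fits
    ec50s = np.array(ec50s)
    return 100.0 * ec50s.std(ddof=1) / ec50s.mean()

wide   = np.array([0.5, 2, 8, 32, 128])       # wide spread, 5 points
narrow = np.array([2, 4, 8, 16, 32, 64])      # narrower spread, 6 points
print(f"Wide, duplicate wells:   %CV of EC50 = {simulate_cv(wide, 2):.1f}%")
print(f"Narrow, duplicate wells: %CV of EC50 = {simulate_cv(narrow, 2):.1f}%")
```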


Validation of ELISAs
The same sources of variability must be characterised and controlled when validating an ELISA (traditional or cell-based). Understanding the important factors during development speeds validation and minimises resource use and costs in routine use, and in some situations simulation can be used to reduce the laboratory work required.
Ongoing Monitoring and Statistical Process Control

It is important to monitor the performance of an assay over time. Quantics can help you to implement a monitoring protocol for your ELISA.
Simple monitoring by plotting data over time may be adequate during development, but in GMP manufacturing regulators are starting to expect a more formal approach known as statistical process control (SPC). This methodology typically charts suitable parameters of the reference standard response curve and of QC or test samples, and applies a number of statistically derived rules that trigger warning or action alarms if the assay shows signs of shifting or drifting. SPC control limits are set based on an analysis of historical data, and Quantics can help you implement a suitable SPC protocol for your ELISA.
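As a minimal illustration of SPC limits, the sketch below derives 2 SD warning and 3 SD action limits for one monitored parameter (here a notional reference-standard EC50) from assumed historical data and checks new runs against them. Real SPC schemes typically apply additional rules; all values here are illustrative only.

```python
# Minimal sketch of Shewhart-style control limits for one monitored
# parameter (e.g. the reference standard EC50 per run), with 2 SD warning
# and 3 SD action limits derived from historical runs. Values are illustrative.
import numpy as np

historical = np.array([9.8, 10.4, 10.1, 9.6, 10.7, 10.0, 9.9, 10.3,
                       10.5, 9.7, 10.2, 10.0, 9.9, 10.6, 10.1])  # past EC50s
centre, sd = historical.mean(), historical.std(ddof=1)
warn_lo, warn_hi = centre - 2 * sd, centre + 2 * sd
act_lo, act_hi = centre - 3 * sd, centre + 3 * sd

def check(value):
    if value < act_lo or value > act_hi:
        return "ACTION: outside 3 SD limits"
    if value < warn_lo or value > warn_hi:
        return "WARNING: outside 2 SD limits"
    return "within control limits"

for run_ec50 in [10.2, 10.9, 11.6]:    # new runs as they are reported
    print(f"EC50 {run_ec50}: {check(run_ec50)}")
```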