Bioassay statistics is complex

Quantics can help you understand the statistical principles of bioassay analysis so you can achieve a well-characterised, reliable and precise bioassay with low failure rates.

Quantics can support you through the complete bioassay development pathway.

Bioassay Development Studies

Involving appropriate statistical support from the early stages of bioassay development can help to ensure a linear development pathway by optimising the experimental designs and thus minimising repeated work. Mathematical simulation can replace the need for some laboratory work altogether. The combination can save considerable time and expense over the course of a bioassay development process, from bench to product.


• Relative Potency Analysis
• Parallelism
• Statistical Model Choice
• Bioassay Variability & DOE
• Dose Group Optimisation
• Validation of Bioassays
• Bridging Studies
• Trend Analysis


Relative Potency

With small-molecule chemical drugs, potency is closely related to how much drug is in the preparation, and that can be measured with good accuracy. With a biologic, potency is related not so much to the amount of material in the preparation as to its biological activity, and that has to be measured in a biological system (a bioassay) that is itself variable.
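As a minimal illustration (not Quantics' own implementation), the "horizontal shift" idea behind relative potency can be sketched with a four-parameter logistic (4PL) curve in Python; all parameter values here are arbitrary:

```python
import numpy as np

def pl4(x, a, d, ec50, b):
    # 4PL dose-response curve: lower asymptote a, upper asymptote d,
    # half-maximal dose ec50, slope b
    return a + (d - a) / (1 + (x / ec50) ** (-b))

# If the test item behaves as a dilution of the reference, its curve is the
# reference curve shifted along the log-dose axis; the shift is the relative potency.
doses = np.logspace(-2, 2, 50)
ec50_ref, ec50_test = 1.0, 2.0
rp = ec50_ref / ec50_test            # test needs twice the dose, so RP = 0.5
ref = pl4(doses, 0, 1, ec50_ref, 1)
test = pl4(doses, 0, 1, ec50_test, 1)

# The test response at dose x equals the reference response at dose rp * x
assert np.allclose(test, pl4(rp * doses, 0, 1, ec50_ref, 1))
print("RP =", rp)
```

Because the whole curve is used, the relative potency estimate does not depend on choosing any single response level.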

Quantics are leading experts in this field and have been helping clients with relative potency assays for 15 years.

Read our blog that introduces the concept of relative potency: What is Relative Potency?

In 2017 we launched QuBAS, a feature-rich statistical analysis software package designed with Continuous real time Validation (CrtV) for regulatory use, including all of the commonly required statistical methods for relative potency assays.


Parallelism Testing

Parallelism testing is fast becoming a regulatory requirement. Quantics can advise on the best method to use.

If the test material is simply acting as a dilution of the reference material (i.e. it is a similar biological entity) the dose response curves will be parallel. One curve is just a horizontal shift of the other.

In practice the curves are never exactly parallel: each curve is just a best fit to the data points, with associated confidence limits. So how parallel is parallel enough?

A number of methods are available, and choosing the best one is not simple. Account must be taken of the data available, likely future data variability, the statistical model used and regulatory requirements.

Common testing methods are based on significance testing (e.g. the “F” test) or equivalence testing.

Choosing the wrong test, or setting the pass/fail criteria incorrectly, may result in unnecessary assay failures.
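One common significance-based approach is the extra-sum-of-squares F-test: fit the two curves with and without the parallelism constraint and test whether the constraint makes the fit significantly worse. The sketch below uses simulated data and 4PL fits; the parameter values and design are illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import f as f_dist

def logistic4(x, a, d, logc, b):
    # 4PL: asymptotes a and d, log-EC50 logc, slope b
    return a + (d - a) / (1 + np.exp(b * (np.log(x) - logc)))

rng = np.random.default_rng(0)
doses = np.tile(np.logspace(-2, 2, 8), 3)            # 8 doses, 3 replicates
ref  = logistic4(doses, 0.1, 1.9, 0.0, 1.2) + rng.normal(0, 0.05, doses.size)
test = logistic4(doses / 2, 0.1, 1.9, 0.0, 1.2) + rng.normal(0, 0.05, doses.size)

def rss(y, yhat):
    return float(np.sum((y - yhat) ** 2))

# Full model: separate 4PL curves for reference and test (8 parameters)
p_ref, _ = curve_fit(logistic4, doses, ref, p0=[0, 2, 0, 1], maxfev=10000)
p_tst, _ = curve_fit(logistic4, doses, test, p0=[0, 2, 0, 1], maxfev=10000)
rss_full = rss(ref, logistic4(doses, *p_ref)) + rss(test, logistic4(doses, *p_tst))

# Reduced (parallel) model: shared a, d, b; only log-EC50 differs (5 parameters)
x_all = np.concatenate([doses, doses])
y_all = np.concatenate([ref, test])
grp = np.concatenate([np.zeros(doses.size), np.ones(doses.size)])

def parallel(x, a, d, logc_ref, logc_tst, b, g=grp):
    logc = np.where(g == 0, logc_ref, logc_tst)
    return a + (d - a) / (1 + np.exp(b * (np.log(x) - logc)))

p_par, _ = curve_fit(parallel, x_all, y_all, p0=[0, 2, 0, 0, 1], maxfev=10000)
rss_par = rss(y_all, parallel(x_all, *p_par))

# Extra-sum-of-squares F-test: does forcing parallelism fit significantly worse?
df1 = 8 - 5                        # parameters removed by the constraint
df2 = y_all.size - 8               # residual df of the full model
F = ((rss_par - rss_full) / df1) / (rss_full / df2)
p_value = f_dist.sf(F, df1, df2)
print(f"F = {F:.2f}, p = {p_value:.3f}")   # large p: no evidence of non-parallelism
```

An equivalence test inverts this logic: instead of asking "is there evidence of non-parallelism?", it asks whether any non-parallelism is small enough to fall within a pre-specified margin, which is why it behaves better with highly precise assays.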

See the Quantics paper Parallelism in Practice: Approaches to Parallelism in Bioassays and its associated flowchart, which guide users to the best option for parallelism testing.

PROJECT INSIGHT

A major pharmaceutical manufacturer had been running a bioassay for GMP batch release for some years without formal parallelism testing. The FDA had warned them that the method would need updating. Quantics was instrumental in:

• Developing a new analysis method that included formal parallelism testing
• Optimising the experimental design to significantly reduce laboratory costs
• Establishing new system and sample suitability criteria for the new process
• Formally validating the new process
• Documenting all the analysis changes for FDA submission

The subsequent FDA submission was successful, allowing the company to continue production of the biologic.

Read our blogs: 4PL or 5PL Model Choice, and Choosing a Statistical Model: Continuous Response Data.

Choice of Statistical Model

Quantics have years of experience helping clients choose the optimal statistical model. We can advise on data transforms, outlier management and the handling of LOQ values to ensure the model fits the data well and is robust to data variability.

Why does model choice matter?

The statistical model has a major impact on assay accuracy, sensitivity, reliability and cost. Some models are more likely to fail a test item if the relative potency (RP) is low or the data are highly variable.

Common models for continuous response (e.g. optical density) are:

  • 4 Parameter Logistic (4PL)
  • 5 Parameter Logistic (5PL)
  • Linear*


*The linear model is mathematically simpler, but may result in (unnecessary) assay failures with low RP due to failure of parallelism tests.
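The 5PL adds an asymmetry parameter to the 4PL, and a standard way to judge whether the extra parameter is worth having is an information criterion such as AIC. This sketch, on simulated asymmetric data with arbitrary parameter values, is illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

def pl4(x, a, d, logc, b):
    # symmetric sigmoid
    return a + (d - a) / (1 + np.exp(b * (np.log(x) - logc)))

def pl5(x, a, d, logc, b, g):
    # adds asymmetry parameter g; reduces to the 4PL when g = 1
    return a + (d - a) / (1 + np.exp(b * (np.log(x) - logc))) ** g

rng = np.random.default_rng(1)
x = np.repeat(np.logspace(-2, 2, 9), 2)     # 9 doses, duplicate wells
y = pl5(x, 0.05, 2.0, 0.0, 1.5, 0.5) + rng.normal(0, 0.03, x.size)

p4, _ = curve_fit(pl4, x, y, p0=[0, 2, 0, 1], maxfev=20000)
p5, _ = curve_fit(pl5, x, y, p0=[0, 2, 0, 1, 1], maxfev=20000)
rss4 = np.sum((y - pl4(x, *p4)) ** 2)
rss5 = np.sum((y - pl5(x, *p5)) ** 2)

# AIC trades goodness of fit against the number of parameters;
# a lower value indicates the preferred model
aic = lambda rss, k: x.size * np.log(rss / x.size) + 2 * k
print("4PL AIC:", round(aic(rss4, 4), 1), " 5PL AIC:", round(aic(rss5, 5), 1))
```

In practice model choice also has to weigh stability: the 5PL needs more data to estimate reliably, and an unstable fit can cost more assay failures than a slightly biased but robust 4PL.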

Common models for quantal (“yes/no”) response bioassays are:

  • Probit
  • Logit
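For quantal data, the probit model relates the response probability to log dose through the normal CDF. The toy sketch below fits observed response proportions by least squares for simplicity (real analyses typically use maximum likelihood), and all counts are invented:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Probit model: P(response) = Phi(alpha + beta * log10(dose))
def probit(log_dose, alpha, beta):
    return norm.cdf(alpha + beta * log_dose)

log_dose = np.log10([0.5, 1, 2, 4, 8, 16])
n = np.array([20] * 6)                         # subjects per dose group
responders = np.array([1, 3, 8, 14, 18, 20])   # hypothetical counts

(alpha, beta), _ = curve_fit(probit, log_dose, responders / n, p0=[0, 1])
ed50 = 10 ** (-alpha / beta)                   # dose giving a 50% response rate
print(f"ED50 estimate: {ed50:.2f}")
```

The logit model is identical in structure but replaces the normal CDF with the logistic function; in most bioassay data sets the two give very similar answers.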

Assays involving dose-time-response measurement (e.g. challenge assays) can also use a survival analysis model. Survival analysis can result in a significant reduction in animal use per assay over conventional logit or probit models.

Read our paper describing how survival analysis techniques can significantly increase precision, or reduce the number of test subjects (usually animals), thanks to this mathematically efficient analysis.

Publications

CUSTOMER INSIGHT

“Overall Quantics helped us minimise the number of assays required but also helped us understand the importance of the equivalence test and how it should be applied. The end result was that the FDA accepted the submission which was the goal. I will make sure to recommend Quantics if any further work comes up.”

Mark Kuy – Ipsen

Variability and DoE in Bioassays

Bioassays, both in vitro and, more particularly, in vivo, can be very variable. This variability impacts the response and hence the precision of the reportable value. Quantics can help you understand the contributions of the various factors to the overall variability, in order to design an assay with the required performance.

The variability inherent in bioassays arises from a range of sources. Discovering and controlling the important ones can be made much more efficient using a formal Design of Experiments (DoE) approach, which provides statistically efficient analysis of the data, optimising the speed of development and minimising resource and costs. In some situations, simulation can be used to reduce the laboratory work required.

Instead of studying one factor at a time, these techniques provide an efficient multi-factor approach that generally requires fewer experiments to achieve optimisation, and also provides insight into interactions of factors that affect bioassay performance.
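As a small illustration of the multi-factor idea, a two-level full factorial design varies all factors at once and estimates each main effect from the same set of runs. The factor names, levels and simulated responses below are entirely hypothetical:

```python
import itertools
import numpy as np

# Hypothetical two-level factors suspected of driving assay variability
factors = {
    "incubation_h": (18, 24),
    "cell_passage": (5, 15),
    "serum_pct":    (5, 10),
}

# Full 2^3 factorial design: all 8 combinations of factor levels
design = np.array(list(itertools.product(*factors.values())), dtype=float)
coded = np.where(design == design.min(axis=0), -1.0, 1.0)   # -1 = low, +1 = high

# Simulated assay responses: incubation has a large effect, passage a small one
rng = np.random.default_rng(2)
y = 100 + 6 * coded[:, 0] + 1 * coded[:, 1] + rng.normal(0, 0.5, len(design))

# Main effect of each factor = mean response at high level - mean at low level
effects = {
    name: y[coded[:, j] == 1].mean() - y[coded[:, j] == -1].mean()
    for j, name in enumerate(factors)
}
print(effects)
```

Every run contributes to every effect estimate, which is why the factorial approach needs fewer experiments than varying one factor at a time; fractional designs extend the same idea when the number of factors grows.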

Dose Group Optimisation

In bioassay development, dose groups and required replicate numbers are typically determined by experimentation.

Quantics can save you time and experimental costs by using simulation studies to determine the design that will optimise assay performance.

Simulation can explore the impact of dose group spacing and distribution and the number of replicates; for in vivo studies, this may include unequal numbers of animals in the dose groups.
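One way such a simulation study can work is a simple Monte Carlo loop: generate synthetic assay runs under each candidate design and compare the precision of the resulting relative potency estimates. This sketch uses arbitrary parameter values and is illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

def pl4(x, a, d, logc, b):
    return a + (d - a) / (1 + np.exp(b * (np.log(x) - logc)))

def sim_log_rp_sd(n_doses, n_reps, sd=0.05, n_sim=100):
    """Monte Carlo estimate of the standard deviation of the fitted
    log relative potency for a candidate design (true RP = 1)."""
    rng = np.random.default_rng(0)
    x = np.tile(np.logspace(-1.5, 1.5, n_doses), n_reps)
    estimates = []
    for _ in range(n_sim):
        ref = pl4(x, 0, 2, 0, 1) + rng.normal(0, sd, x.size)
        tst = pl4(x, 0, 2, 0, 1) + rng.normal(0, sd, x.size)
        pr, _ = curve_fit(pl4, x, ref, p0=[0, 2, 0, 1], maxfev=5000)
        pt, _ = curve_fit(pl4, x, tst, p0=[0, 2, 0, 1], maxfev=5000)
        estimates.append(pr[2] - pt[2])   # log RP = shift between log-EC50s
    return float(np.std(estimates))

# Compare a few candidate (doses, replicates) designs
results = {dsn: sim_log_rp_sd(*dsn) for dsn in [(6, 2), (8, 3), (10, 4)]}
print(results)   # smaller SD means a more precise reportable value
```

The same loop can be extended to vary dose spacing, add plate effects, or use unequal group sizes, letting much of the design exploration happen before any laboratory work.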


Validation of Bioassays

Quantics can help ensure that your assay validation study follows the appropriate regulatory guidance in all its statistical aspects, both design and analysis.

What is bioassay validation?

Assay validation is the process of demonstrating and documenting that the performance characteristics of the procedure and its underlying method meet the requirements for the intended application and that the assay is thereby suitable for its intended use.

Reference Bridging and Method Transfer

Bridging and transfer studies aim to demonstrate that an old and a new method (or laboratory location) have equivalent performance. A number of factors may have to be considered, including defining the important characteristics to equate for accuracy and precision.

Quantics can help design bridging studies, including:
– types and numbers of samples
– lots / runs required
– establishing acceptance criteria

Several statistical approaches are available and Quantics will advise on the best approach for the particular circumstances taking into account known regulator preferences.
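One widely used statistical approach for equivalence-style acceptance criteria is the two one-sided tests (TOST) procedure on paired log-potency differences. The data and the ±0.10 log-scale margin below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np
from scipy import stats

# Hypothetical potency results (as % of label) for the same lots
# measured by the old and new methods, analysed on the log scale
old = np.log([98, 101, 99, 102, 100, 97])
new = np.log([99, 103, 100, 101, 102, 99])
diff = new - old                            # paired log differences

# TOST: equivalence margin of +/- 0.10 on the log scale (roughly +/- 10%)
margin = 0.10
n = diff.size
se = diff.std(ddof=1) / np.sqrt(n)
t_lower = (diff.mean() + margin) / se       # H0: difference <= -margin
t_upper = (diff.mean() - margin) / se       # H0: difference >= +margin
p_lower = stats.t.sf(t_lower, n - 1)
p_upper = stats.t.cdf(t_upper, n - 1)
p_tost = max(p_lower, p_upper)              # both one-sided tests must reject
print(f"mean log difference = {diff.mean():.4f}, TOST p = {p_tost:.4f}")
```

Choosing the margin is the hard part: it should reflect what difference in reported potency is practically unimportant, not what the data happen to support.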

All work can be carried out to GMP.


Ongoing Monitoring and statistical process control


It is important to monitor the performance of an assay over time. Quantics can help you to implement a monitoring protocol for your bioassay.

Simple monitoring by plotting data over time may be adequate in development situations, but in GMP manufacturing regulators are starting to expect a more formal approach known as statistical process control (SPC). This methodology typically charts suitable parameters of the Reference Standard response curve and QC samples or test samples, and has a number of statistically derived rules that trigger warning or action alarms if the assay is showing signs of shifting or drifting. SPC control limits are set based on an analysis of historical data.

Bioequivalence

Quantics is working at the leading edge of the emerging field of statistical evaluation of potency assay data for bioequivalence of biosimilars.

Quantics can help present biosimilar data to regulators, and advise on the different approaches to common biosimilar statistical problems, including ways of managing non-parallel dose response curves.


Recent Bioassay Discussions from Quantics