In life science and biopharma manufacturing, demonstrating consistent, repeatable processes is essential for both regulatory compliance and product quality. Creating data-driven, performance-based objectives, and aligning process control strategies with compliance and business performance goals, lets companies take their data analysis to the next level: the level at which it becomes meaningful for the bottom line.
For biopharma manufacturers, ensuring that products used to treat humans and animals are made with a consistent manufacturing process is essential to staying compliant. Multivariate data analysis (MVDA) can help identify patterns of deviation in batch processes so corrections can be made.
At a recent PI World conference in San Francisco, Will A. Penland, MSPA, Principal Data Scientist at Boehringer Ingelheim Animal Health in St. Joseph, MO, explained how his company used MVDA to develop prescriptive process controls and batch evolution trajectories. Ultimately, this led to reductions in production variances and the ability to define what a “golden batch” approach looks like.
Boehringer Ingelheim was able to develop criticality classifications based on statistical properties, identify and even predict when a batch process wasn’t conforming to norms, and then reduce the rate of production variances. Penland explained how the company used middleware databases, the OSI Asset Framework, event frames, and a series of analytics engines (statistics software and visualizations from Umetrics Suite) to enable this work.
“It’s about demonstrating time and time again that your batch and process is under control and that you’re very capable of meeting your specification limits,” Penland said. In life science and pharma, especially, “to be compliant, you need to demonstrate consistent, repeatable processes.”
Focus on batch evolution with parameter limits
At the end of the day, the overall manufacturing objective is to maintain yield with consistent quality that meets the predefined critical quality attributes (CQAs). Using batch evolution techniques and multivariate statistics, with advanced statistical modeling techniques like principal component analysis (PCA) and PLS/OPLS, you can create a model of the acceptable limits for a "golden batch" process. For example, in a biofermentation process you can put in the same ingredients, but any variance in the process, such as temperature drift or contamination, can yield something quite different from day to day.
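The PCA step of this approach can be sketched in plain NumPy: standardize historical batch data, decompose it, and read off per-batch scores and explained variance. The data below is synthetic and the variable count is arbitrary; this is an illustration of the technique, not the Umetrics tooling or data from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "golden batch" history: 50 good batches, 6 process
# variables (e.g. temperature, pH, feed rate). Illustrative only.
X = rng.normal(size=(50, 6))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]   # make two variables correlated

# PCA via SVD on mean-centered, unit-variance data
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)

k = 2                                 # retain two principal components
scores = Z @ Vt[:k].T                 # per-batch score-plot coordinates
explained = s[:k] ** 2 / (s ** 2).sum()  # variance captured by each PC
```

Plotting the two columns of `scores` against each other gives the kind of score plot used to compare batches: batches that evolved similarly cluster together, and outliers stand apart.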
So the goal is to reduce process variances by homing in on critical process parameters and detecting anything that has a direct effect on the product’s quality. Doing so can reduce deviations in the process and improve yields.
With batch evolution modeling, the objective is to define the limits under which the process can produce a normal result. The challenge is to develop a good data set.
In Boehringer Ingelheim’s case, they were able to detect some major differences in certain batches by looking at the score plots of the principal components. Identifying this sort of variation, and using the data to create models that help predict and then correct batches that are going wrong, saved the company huge amounts of money.
In the plot above you can see the difference spread out over several years (color coded) between a successful series of batches (far right, red) and those with low potency (left, blue).
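One common way to turn such a model into an automatic check (a generic MVDA device, not necessarily Boehringer Ingelheim's exact pipeline) is Hotelling's T² on the PCA scores: a batch whose scaled distance from the golden-batch center exceeds a control limit gets flagged. All data, dimensions, and limits below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# 40 synthetic "normal" batches, 5 process variables. Illustrative only.
X = rng.normal(size=(40, 5))

# Fit a 2-component PCA model on the normal batches
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
_, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2
P = Vt[:k].T                          # loadings (5 x 2)
lam = s[:k] ** 2 / (len(X) - 1)       # score variance per component

def hotelling_t2(x):
    """Variance-scaled squared distance of one batch from the model center."""
    t = ((x - mu) / sd) @ P           # project the batch onto the score plane
    return float((t ** 2 / lam).sum())

# A deliberately off-spec batch, pushed far out along the first component
bad = mu + 8 * sd * P[:, 0]

t2 = np.array([hotelling_t2(x) for x in X] + [hotelling_t2(bad)])
limit = np.percentile(t2[:40], 99)    # simple empirical control limit
flagged = np.where(t2 > limit)[0]     # batches outside the control limit
```

In practice the limit would come from an F-distribution or from validated historical data rather than a raw percentile, and the check would run continuously against the batch's evolving trajectory.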
Watch the video
Watch a video of the presentation with Will Penland from PI World to learn more.
Download the presentation here.
Want to know more?
Sign up for our upcoming complimentary webinar on batch process data analysis. Advance registration also gives you access to the recorded video for on-demand viewing later.