Continuous manufacturing is one of the key trends in the pharmaceutical industry, both for the production of ‘classical’ drugs and for large molecules. Companies are looking for ways to shift from traditional batch processing to a continuous mode of operation. The main advantages of these processes are greater modularity, automation and flexibility thanks to a smaller footprint, as well as more consistent drug product quality.
For pharmaceutical companies facing challenges such as rising costs, stricter regulations and declining profit margins, innovative technologies like artificial intelligence (AI) and digital twins have become an essential part of the strategy to future-proof their businesses. A digital twin is the next evolution of machine learning: it combines advanced data analytics and equipment simulation with comprehensive system models that blend historical information with real-time data to predict the future behavior of a process. According to Gartner, the digital twin was one of the top 10 strategic technology trends of 2019.
Biosimilars are an exciting route to increasing access to the highly effective therapies made possible by biologics, but ensuring a biosimilar meets the critical quality attributes (CQAs) of the original biologic is a major challenge. Optimizing production at full scale is impractical, which makes a quality by design (QbD) approach based on a reliable scale-down model of the process an attractive alternative. A process development team at Zhejiang Hisun Pharmaceuticals, Taizhou, China, therefore developed a scaled-down model of the cell culture process used to produce the biosimilar adalimumab. They qualified the model using multivariate data analysis (SIMCA) and explored the design space for key process attributes (KPAs) and CQAs using MODDE.
Over the last several years, the use of artificial intelligence (AI) in the pharma and biomedical industries has gone from science fiction to science fact. Increasingly, pharma and biotech companies are adopting more efficient, automated processes that incorporate data-driven decisions and predictive analytics tools. The next evolution of this approach to advanced data analytics incorporates AI and machine learning.
Most biopharma manufacturing companies are keen to adopt new methods that would streamline production, reduce errors and ensure product quality. That was the goal of Bristol-Myers Squibb when they implemented a complex real-time process monitoring system that involved integrating data from a number of different technologies, systems and vendors to gain greater control over complex batch processes.
In many manufacturing industries, variability in raw materials can lead to unexpected and undesirable changes in the final products. In regulated industries such as pharmaceuticals, this is especially problematic due to the need to maintain carefully controlled processes that stay within approved regulatory parameters for drug development and production. Embracing a total company-wide digital transformation enabled Amgen to align data across multiple systems and not only control, but also predict, unacceptable deviations in time to make the necessary adjustments. Read on to find out how they used data analytics to implement real-time process control.
The pharmaceutical industry, from R&D and manufacturing to product sales and use, generates enormous amounts of data. The question is: how can we understand our data better, get more out of it, and unlock its potential in the most rational way possible to reach the knowledge we need? And how can we gain control over our research, or over the processes needed to generate a stable, reliable product that consistently meets regulatory requirements? The answer is Multivariate Data Analysis.
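To make this concrete, here is a minimal sketch of one core multivariate data analysis technique, principal component analysis (PCA), which underpins tools like SIMCA. This illustrative implementation uses NumPy and synthetic "batch" data; it is not the vendor's implementation, just a demonstration of how many correlated process variables can be compressed into a few interpretable components.

```python
import numpy as np

def pca(X, n_components=2):
    """Project a data matrix onto its top principal components.

    X: (n_samples, n_features) array of observations.
    Returns (scores, loadings, explained_variance_ratio).
    """
    # Mean-center each variable (column), standard practice in MVDA
    Xc = X - X.mean(axis=0)
    # Singular value decomposition of the centered data matrix
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]    # sample coordinates
    loadings = Vt[:n_components].T                     # variable weights
    var_ratio = (s**2 / np.sum(s**2))[:n_components]   # variance explained
    return scores, loadings, var_ratio

# Hypothetical example: 100 batches measured on 5 correlated process variables,
# driven by 2 underlying latent factors plus a little measurement noise
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(100, 5))

scores, loadings, var_ratio = pca(X, n_components=2)
```

Because only two latent factors drive the five measured variables, the first two principal components capture most of the variance, letting analysts monitor batches on a simple two-dimensional score plot instead of tracking every raw variable separately.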
Throughout the evolution of manufacturing, many industries have gradually shifted away from batch process to continuous process manufacturing as production technologies matured. This has included industries such as chemical, petroleum, steel, automobile, consumer goods and food manufacturing.
Breast cancer is the most commonly diagnosed cancer among women worldwide and a leading cause of cancer-related deaths among females. It’s the second most common type of cancer overall. According to the International Agency for Research on Cancer, there were more than 2 million new cases in 2018.
The natural variability of botanical material often makes it difficult to ensure a consistent, high-quality process for pharmaceuticals made from plant-based products. In addition, botanical drug products (BDPs) are often produced using a series of separate batch processes, which adds even more variability to the manufacturing process.
Advancements in cell and gene therapy hold promise for the future of personalized medicine, especially for cancer treatments. However, bioprocessing methods for autologous cellular therapies, and CAR-T in particular, present unique manufacturing challenges due to the variability of the starting material and the unique nature of each batch. Is there a way to create more efficient processes to bring down costs and make personalized medicine a viable option for more patients?
In life science biopharma manufacturing, demonstrating consistent, repeatable processes is essential both for regulatory compliance and product quality. Being able to create data-driven, performance-based objectives, and aligning the process control strategies with compliance and business performance objectives, allows companies to take their data analysis to the next level: the level at which it becomes meaningful for the company’s bottom line.