In the last few years, many pharmaceutical companies have started investing in continuous production, and some have already succeeded in filing new pharmaceuticals manufactured using a continuous flow process. This article summarizes a study at GlaxoSmithKline (GSK), where real-time multivariate monitoring added value to the development of a continuous production process for an active pharmaceutical ingredient (API).
The pharmaceutical industry, including R&D, manufacturing and also product sales and use, creates a lot of data. The question is, what can we do to understand our data better, get more out of it, and unlock its potential in the most rational way possible to get to the knowledge we need? And how can we gain control over our research, or the processes needed to generate a stable, reliable product that consistently meets regulatory requirements? The answer is Multivariate Data Analysis.
In agrochemical, pharmaceutical and other industries that manufacture complex chemicals, finding ways to reduce waste and inefficiency often hinges on selecting the right chemical compounds. Data analytics can help manufacturers find alternative compounds that meet complex requirements, decrease raw material usage or enable more cost-effective, sustainable processes.
A challenge for the regenerative medicine industry is to develop cell culture processes that can be scaled up for high-volume production. Finding a better way to scale up commonly used research cells like HEK293T (used for protein expression and the production of recombinant retroviruses or lentiviral vectors) would be beneficial for biologists in many fields of medicine. Dr. Franziska Bollmann, virus scientist at Sartorius Stedim Biotech in Germany, recently conducted two experiments to find out whether micro bioreactor systems can help facilitate the transition from the traditional shake flask process to an improved method with better process control.
The 2019 Umetrics User Meeting drew more than 100 engineers, operations managers, process experts, researchers, and data scientists from industries ranging from biopharma to food and beverage to chemicals, who gathered to share ideas and insights into new methods for streamlining their processes and reducing waste and cost of goods sold.
The potential for Artificial Intelligence (AI) is enormous and the applications seemingly unlimited. One subset of AI, deep learning, offers the promise of efficiently solving a large range of challenges involving unstructured data by harnessing neural networks to save time and money, and even perform seemingly impossible tasks.
Deep learning has revolutionized the fields of artificial intelligence, computer vision, speech recognition, and more in recent years. Deep learning can draw information from unstructured data such as images or text in a way that was unthinkable a decade ago. In industries like Pharma and Biopharma, deep learning can help all the way from understanding how cells work using live cell imaging to monitoring manufacturing using audio.
Whether it’s fake olive oil, coffee bulked up with husks and twigs, or honey tainted with antibiotics, food fraud is a growing problem worldwide. The Australian research organization CSIRO states that the economic damage from food fraud alone reached US$35 billion in 2018. The underlying cause is nearly always financial gain and economic pressure to save money by using inferior (or mislabeled) products. Predictive analytics is one tool manufacturers are using to combat food fraud.
SIMCA 16 offers improved ribbons, tours, wizards, data merging, multiblock analysis and more.
SIMCA is a multivariate data analytics tool that helps users make sense of complex data by transforming numbers and statistics into visual information for easy interpretation and understanding. Across many industries, ranging from pharmaceuticals and chemicals to food and beverage manufacturing to academia, SIMCA helps production managers and researchers alike make better decisions in order to take action quickly and with confidence.
One key to reducing R&D costs in the biopharmaceutical market is streamlining and speeding up process data flow for Design of Experiments (DOE). Now, a direct integration of Genedata Bioprocess® platform and Umetrics Suite MODDE® software enables seamless data flow and facilitates the design, execution and evaluation of experiments in large-molecule process development.
Recently the FDA issued a new draft guideline for continuous manufacturing of small-molecule drugs. With this draft guidance, the FDA aims to encourage more pharmaceutical manufacturers to shift from traditional batch (start-stop) processing to continuous manufacturing. The main advantages of continuous processes are more room for modularity, automation and flexibility thanks to a smaller footprint, as well as more consistent drug product quality. The FDA's main incentive for promoting this way of processing is its belief that it will have a positive impact on drug prices and help prevent drug shortages.
Injection molding is the most important production method for manufacturing plastic components used in products ranging from cars to medical devices. Although the plastic components themselves are often inexpensive to produce, any defect can lead to expensive errors that can affect the performance or safety of the finished product. Creating a system of early fault detection and continuous process improvement can mean big payoffs for manufacturers.
Throughout the evolution of manufacturing, many industries have gradually shifted away from batch process to continuous process manufacturing as production technologies matured. This has included industries such as chemical, petroleum, steel, automobile, consumer goods and food manufacturing.
The key to being able to innovate, improve and streamline your processes often lies in gaining as many insights as you can from the variety of data sources scattered throughout your operations. Making sense of all that data can be difficult. But it's not an impossible dream.
Breast cancer is the most commonly diagnosed cancer among women worldwide and a leading cause of cancer-related deaths among females. It’s the second most common type of cancer overall. According to the International Agency for Research on Cancer, there were more than 2 million new cases in 2018.
In industries that depend on bioprocessing, achieving the highest possible yields in the shortest time frame, while keeping costs down and product quality high is often challenging. Meeting these goals requires having a well-designed, well-defined and well-controlled process. And at the core of any effective process control is a set of effective process modeling tools.
In industries ranging from biopharmaceuticals to chemicals, executives in today’s manufacturing marketplace face ever-increasing pressures to grow profit margins, reduce time to market and optimize processes across all aspects of their business. Everything from constraints in the supply of raw materials to multiple steps in a manufacturing process can affect productivity—making process optimization an amorphous target.
Speed and changing market conditions are significant challenges for businesses across many industries and markets. The need for improved efficiency, the ability to adapt to changing market conditions and digitalization are often key drivers of change. By focusing on data, a company can make significant improvements in processes or systems, and gain insights into the driving forces behind business operations. Data analytics is the key to business optimization.
Consumers expect a certain consistency in quality and taste from the food and beverage brands they love. But many factors can influence the way a product tastes when it reaches the consumer – ranging from the manufacturing process to seasonality of ingredients to storage temperatures. Similarly, a number of other factors may influence the overall quality attributes that matter, such as alcohol content of beer or stability of the whiskey aging process.
The natural variability of botanical material often makes it difficult to ensure a consistent quality process for pharmaceuticals made from plant-based products. In addition, botanical drug products (BDPs) are often produced using a series of separate batch processes, which adds even more variability into the manufacturing process.
Many elderly people are afraid of falling – and for good reason. Falls can have serious consequences for the individual, and even the fear of falling can have serious effects on health and independence. A new research project at Luleå University of Technology in Sweden has taken a closer look at fall-related concerns among elderly people, using multivariate data analysis (MVDA), with the ultimate goal of finding diagnostic and training methods that could help reduce falls. Results from the first studies have given some interesting answers.
What’s the secret formula for creating long-lasting bubbles? Is expert knowledge of liquid dynamics needed to optimize the mixture design and develop the best bubble solution? Or can we use design of experiments (DOE) and data analytics to draw conclusions? Let’s take a look at a fun example of how DOE can be used to optimize a mixture design in order to achieve our goal: create long-lasting bubbles.
Advancements in cell and gene therapy hold promise for the future of personalized medicine, especially for cancer treatments. However, bioprocessing methods for autologous cellular therapies, and CAR-T in particular, often present unique challenges in manufacturing due to the variability of the starting material and unique nature of each batch. Is there a way to create more efficient processes in order to bring down costs and make personalized medicine a viable option for more patients?
Could data analytics aid in the diagnosis of severe neurological diseases? In a recent study, a research group at Umeå University has conducted statistical data analysis of biomarkers from patients suffering from Amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease) and Parkinson’s disease to investigate whether data analytics could help in the diagnosis of – and help distinguish between – the two diseases.
Pressure to cut development costs and lower regulatory barriers while assuring product quality has stimulated the pharmaceutical industry to apply Quality by Design (QbD) to manage risk and gain process and product understanding. As a result, QbD is being widely promoted by regulatory authorities such as the Food and Drug Administration, and the International Conference on Harmonization.
Mining information in unstructured text can be a real challenge. Patent documents, for example, provide a rich source of technological and scientific knowledge that can reveal technological trends as well as information on the legal landscape of the market. This makes analysis of the vast and ever-growing number of patents an important part of corporate business strategies.
Making the perfect bar of chocolate is not just about mixing the right amount of sugar and cocoa, or adjusting the process for product quality. Another factor that must be taken into account to optimize both taste and profits is the grinding time of the cocoa beans. Let’s take a look at how data analytics can be used to elevate both, and to find the right combination of ingredients and process to support the business goals — an important factor in the food and beverage industry.
Making sure your data and processes from research and development through to production are compliant is essential in today's highly regulated life science, biopharma, pharmaceutical and food industries. But it's no easy task. Following all of the required steps and ensuring the integrity of your data at every stage is easier and more successful when you use a product designed to keep your data compliant.
On the west coast of southern Sweden, facing the expanse of the ocean, is the beautiful city of Gothenburg. Surrounded by a string of islands, this city has been the home for sailors and merchants, seafaring and shipping, since ancient times. One of the islands to the north of Gothenburg is the picturesque island of Tjörn. Once every year, Tjörn is the location for one of the most famous sailing races in Sweden – “Tjörn Runt” or “Around Tjörn”.
In a manufacturing setting where consistent quality matters, variability in how individual technicians and operators perform their jobs can be frustrating for managers. Companies need a way to achieve consistent quality, without reducing the capacity for innovation and improvement.
In production, your media pass through several different refinement steps. To really understand and be assured of good progression and a good state of production, all of these processing steps need to be monitored continuously. With SIMCA® and SIMCA®-online, both part of the Umetrics® Suite of Data Analytics Solutions, you can confidently monitor and control every step of your process. The web clients allow you to access manufacturing data anytime, anywhere.
In life science biopharma manufacturing, demonstrating consistent, repeatable processes is essential both for regulatory compliance and product quality. Being able to create data-driven, performance-based objectives, and aligning the process control strategies with compliance and business performance objectives, allows companies to take their data analysis to the next level: the level at which it becomes meaningful for the company’s bottom line.
When it comes to continuous quality improvement and removing defects from a process, Six Sigma continues to be the gold standard in manufacturing and process management. This structured, data-driven methodology for discovering problems relies on rigorous analysis of production and process data. For many companies, engaging in a Six Sigma process can be time consuming or even a bit daunting.
You may have heard the term Six Sigma used in conjunction with lean manufacturing, a Kaizen approach or continuous quality improvement. Perhaps you thought Six Sigma only applied to large-scale business operations, or that newer philosophies had overtaken Six Sigma as the most updated approach to quality management? But if you're looking for a way to improve your production processes or solve a problem you’re having with quality, Six Sigma might be the answer. Are you and your team familiar with these concepts? Here's an overview.
[This blog was a favorite last year, so we thought you'd like to see it again. Send us your comments!]
Whether you work in engineering, R&D, or a science lab, understanding the basics of experimental design can help you achieve more statistically optimal results from your experiments or improve your output quality.
Product development and innovation are important elements for the survival of many companies. Whether introducing a new food flavor or adding new product features, understanding consumer preferences can help guide both design and production decisions. The right decisions can make a product launch more successful, and ultimately more profitable.
Formative assessment has come into focus in recent years. In Sweden, the use of formative assessment is typically emphasized in the curriculum of upper secondary schools. However, scientific studies show both positive effects and no effects at all of formative assessment on student performance.
Furthermore, formative assessment has proved to be time consuming, which obviously is a problem if it has no effects on learning. A new thesis by Daniel Larsson at the Linnæus University, Sweden, shows that multivariate data analysis, MVDA, can be used to give some answers about the effectiveness of such teaching practices.
Multivariate data analysis (MVDA) is a statistical technique that can be used to analyze data with more than one variable in order to look for deviations and understand the relationships between the different data points. In practice, this can mean taking data from a number of different sources and turning it into meaningful information from which you can draw some conclusions.
In bioprocessing today, a shift is happening that takes the ability to monitor, optimize and control processes to the next level. Whereas in the past manufacturers aspired to measure data in order to find out why a bioprocess action happened (using descriptive and diagnostic analytics), today we are able to use predictive analytics to determine what will happen in a bioprocess based on specific process data measured in real-time. This migration “up the food chain” to a higher level of data analytics requires automation, ongoing process monitoring and the ability to make adjustments in real-time.
At the heart of any process used to manufacture biological products is a bioreactor setup that supports a stable and reproducible biologically active environment. The bioreactor provides a controlled environment to achieve optimal growth for the particular cell cultures being used.
Biopharmaceutical companies today are challenged to develop high producing cell lines as quickly as possible. Commercially available media may fall short of performance expectations required to meet targets. The alternative —fully customized media and feed development — requires significant funding, time and in-house expertise in media development.
In life science, biopharma and other areas of research, development and production, design of experiments (DOE) provides a systematic method to determine cause and effect relationships between factors and responses affecting a process, product or analytical system. But the key to understanding your results is effective analysis of your experimental data.
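To make the idea concrete, here is a minimal Python sketch of the simplest kind of experimental design, a two-level full factorial, where every combination of factor levels becomes one experimental run. The factor names and levels are illustrative only, not from any particular study.

```python
from itertools import product

# Hypothetical two-level factors for a bioprocess experiment
factors = {
    "temperature_C": [30, 37],
    "pH": [6.5, 7.2],
    "feed_rate_mL_h": [5, 10],
}

# Full factorial design: every combination of factor levels
names = list(factors)
design = [dict(zip(names, levels)) for levels in product(*factors.values())]

for run, settings in enumerate(design, start=1):
    print(run, settings)  # 2^3 = 8 runs in total
```

Running the experiments in this order (usually randomized in practice) and recording the responses for each run is what makes the later cause-and-effect analysis possible.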
Principal component analysis, or PCA, is a statistical procedure that allows you to summarize the information content in large data tables by means of a smaller set of “summary indices” that can be more easily visualized and analyzed. The underlying data can be measurements describing properties of production samples, chemical compounds or reactions, process time points of a continuous process, batches from a batch process, biological individuals or trials of a DOE-protocol, for example.
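As a rough illustration of those "summary indices", here is a minimal NumPy sketch of PCA via singular value decomposition; the toy data (four correlated process variables) is invented for the example.

```python
import numpy as np

def pca(X, n_components=2):
    """Return scores and loadings for the first principal components."""
    Xc = X - X.mean(axis=0)  # mean-center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]  # summary indices per sample
    loadings = Vt[:n_components].T                   # each variable's contribution
    return scores, loadings

# Toy example: 10 samples described by 4 correlated process variables
rng = np.random.default_rng(1)
base = rng.normal(size=(10, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(10, 1)) for _ in range(4)])

scores, loadings = pca(X)  # 4 variables compressed into 2 summary indices
```

Because the variables are strongly correlated, most of the variation ends up in the first component, which is exactly the compression PCA exploits.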
In pharmaceutical and other industries that rely on spectroscopy and multivariate calibration for quality control of manufacturing processes, optimizing the analysis of spectral data is imperative. Using a tool that is specifically designed with spectral analytics in mind can make the job faster, easier and more reliable.
What do we mean by pre-processing of data, and why is it needed? Let's take a look at some data pre-processing methods and how they help create better models when using Principal Component Analysis (PCA) and other methods of data analytics.
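One of the most common pre-processing steps is autoscaling (mean-centering plus unit-variance scaling), sketched below in NumPy with made-up numbers; it keeps a variable measured in hundreds from dominating one measured in single digits.

```python
import numpy as np

def autoscale(X):
    """Mean-center each column and scale it to unit variance (UV scaling)."""
    mean = X.mean(axis=0)
    std = X.std(axis=0, ddof=1)
    return (X - mean) / std

# Two variables on very different scales
X = np.array([[1.0, 100.0],
              [2.0, 300.0],
              [3.0, 200.0]])
Z = autoscale(X)
# After autoscaling, each column has mean 0 and standard deviation 1,
# so both variables contribute equally to a subsequent PCA model.
```

Other common choices include Pareto scaling (dividing by the square root of the standard deviation) and various spectral filters; which one helps depends on the data.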
For manufacturing companies, process control is essential, even for those producing low-cost items such as small plastic parts. That’s because even when units are small and inexpensive, the cost of defects becomes exponentially higher when they reach the next manufacturing step at another plant.
In this blog post, we’ll take a closer look at a feature of the SIMCA data analytics software called the Omics skin. So what exactly is an “omics” skin?
In chemical manufacturing, the process involved in creating a breakthrough product often takes several years — with ongoing tests that may be based on trial and error as much as specifically applied knowledge. One area of development in the specialty chemicals market involves the creation of new additives called plasticizers that can help resins or polymers retain a more supple or flexible nature.
In this blog post we will take a closer look at OPLS*, or Orthogonal PLS, a method to model process data. The advantage of OPLS compared to PLS is that you can uncover hidden details and get a more precise understanding of your data – all of which will help you build better predictive models of your processes.
Worldwide demand for energy escalates every year, and the consumption of fossil fuels continues to increase despite the growing supply of alternative energy options. Globally, about 81 percent of energy comes from a finite supply of fossil fuels like oil, coal and natural gas. Fossil fuels are used to heat homes, run vehicles, power industry and manufacturing, and provide electricity.
All manufacturing industries need good control and a good overview of their production processes. As already discussed in a previous blog post, SIMCA-online enables you to apply advanced multivariate data analytics in real time to monitor your production processes, for example to make sure that your production process is behaving as it should or that the quality is what it should be.
In manufacturing and other industries that have complex processes, knowing which variables have the most impact on quality and at what point, or knowing which combination of variables to change in order to improve your process, can have a huge impact on the overall quality or profitability of your manufacturing process. But without making expensive and time-consuming changes in the physical processes in order to test all possible scenarios, how can you identify and predict the variables that have the most significant impact on your outputs?
Using real-time data analytics monitoring has become the accepted way to monitor processes in several industries. The goal is to detect and diagnose issues as they happen, which is a great leap forward compared to traditional analysis conducted in retrospect. This has been highlighted in a previous blog post.
In a previous blog post we discussed how SIMCA-online can help you make complex data simple and ensure that you get maximum value from your data.
In this blog post we will introduce a number of benefits of the newly released versions of SIMCA 15 and SIMCA-online 15. To mention just a few things, you get a much improved ability to model and control complete processes, including processes with a very high complexity. You also get a much better connection between SIMCA and SIMCA-online so that information can flow in both ways.
An important environmental issue that has come into focus is the increasing number of chemicals that we are exposed to in our everyday life. Chemicals are found in products ranging from cars and furniture to clothing and skincare, and are also by-products from combustion. The CAS REGISTRY℠, an international standard for chemical information, currently contains more than 134 million unique organic and inorganic chemical substances and more than 67 million sequences.
Using a Quality by Design (QbD) approach for DOE supports ICH Q8 compliance
In pharmaceutical development, manufacturers must be able to demonstrate product robustness and deliver the intended quality of the product within allowable ranges for the claimed shelf-life period. Both international and country specific regulatory agencies, such as the FDA, pay close attention to these claims.
Using advanced data analytics models in real time opens up a whole new world of possibilities for improving your production processes. Not only does real-time process monitoring provide a level of confidence in your process performance, it can also help improve the overall quality of your production output.
If data analytics were easy, everyone would do it, right? Well, what if it were easy enough for anyone to do? Can you imagine what sort of insights you might glean from the vast pools of data your company collects about your manufacturing processes, sales or production outputs?
If all of your data stays hidden in the depths of some process control computer or in Excel spreadsheets on the manufacturing floor manager’s desk, are they doing anyone any good?
Analyzing batch process data is a lot like juggling. You have multiple sets of data from different sources and in order to turn them into a meaningful presentation, you need a method of handling them to make sure they are all in the right place at the right time.
When to apply OPLS-DA vs PCA for metabolomics and other omics data analysis
Do you know when to use OPLS-DA and when to use PCA/SIMCA data analysis techniques? Find out how to uncover the differences in your data with these classification and discriminant analysis methods.
How Multivariate Data Analysis Can Separate the Players from the Gorillas (MVDA for beginners)
We have more data than ever before coming at us from many sources – both in our personal lives as well as business. Data is everywhere: from the production flow of a manufacturing floor to the sales results in a grocery store to the number of shares a page gets on Facebook. How do you sort it all out in a way that makes sense? Which data should you worry about and which should you ignore?
“The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.” — Bill Gates
Producing and distributing raw materials and foodstuffs with a low profit margin is a challenging business. One major supplier has made significant gains through applying multivariate data analysis (MVDA) to their manufacturing processes and logistics.
The key to process manufacturing success is a mixture of knowledge and experience supported by mastery of data. A presentation I recently attended put this into sharp focus. A major paper manufacturer was faced with the challenge of maintaining paper smoothness during production. They approached this problem in a way that gave them enormous insight into their process and the ability to control it in real time, and that ultimately led to cost savings while maintaining quality. There were also a few added benefits, including the ability to spot, diagnose and solve problems in real time.
Attending IFPAC 2017? Come to booth #407 to see how augmented reality and data analytics work hand-in-hand. Using Microsoft Hololens technology, Sartorius Stedim Data Analytics is developing groundbreaking innovations that we will showcase and demo at IFPAC 2017. Get a personal demo, and see, among other interesting use cases, how multidimensional design spaces can be visualized as a hologram using augmented reality.