Principal component analysis, or PCA, is a statistical procedure that allows you to summarize the information content in large data tables by means of a smaller set of “summary indices” that can be more easily visualized and analyzed. The underlying data can be measurements describing properties of production samples, chemical compounds or reactions, process time points of a continuous process, batches from a batch process, biological individuals or trials of a DOE-protocol, for example.
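As a minimal sketch of the idea, the "summary indices" PCA produces are simply projections of the mean-centered data table onto its leading principal directions. Assuming numpy is available, this can be illustrated in a few lines (the data here is invented for illustration):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project a data table onto its first principal components
    (the "summary indices"). X: (n_samples, n_variables) array.
    Columns are mean-centered, then scores come from the SVD."""
    Xc = X - X.mean(axis=0)                     # center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T             # one row of scores per sample

# 5 samples described by 4 highly correlated process variables
rng = np.random.default_rng(0)
base = rng.normal(size=(5, 1))
X = np.hstack([base + 0.01 * rng.normal(size=(5, 1)) for _ in range(4)])

scores = pca_scores(X, n_components=2)
print(scores.shape)   # four variables summarized in two indices per sample
```

Because the four variables move together, almost all of the information ends up in the first summary index, which is exactly the compression PCA is designed to deliver.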
Whether you work in engineering, R&D, or a science lab, understanding the basics of experimental design can help you get statistically sound results from your experiments and improve your output quality.
Consumers expect a certain consistency in quality and taste from the food and beverage brands they love. But many factors can influence the way a product tastes when it reaches the consumer – ranging from the manufacturing process to seasonality of ingredients to storage temperatures. Similarly, a number of other factors may influence the overall quality attributes that matter, such as alcohol content of beer or stability of the whiskey aging process.
In the midst of a global COVID-19 pandemic, a top priority for many pharma and biopharma companies is to get a vaccine developed, produced and delivered to the public as quickly as possible. Ushering a vaccine through rigorous testing protocols and regulatory approvals is not an easy (or quick) effort, but incorporating advanced data analytics could help accelerate the process. Data analytics has proven effective in speeding vaccine development both by enabling more efficient Design of Experiments (DOE) and by creating rapid-scale production rollout processes.
You’ve probably heard the terms artificial intelligence (AI), machine learning (ML) and deep learning (DL) being used in conjunction with digital transformation and data science. You may be wondering what the relationship is between these subjects. How are businesses in industries ranging from biopharma to chemicals to food & beverage incorporating AI, machine learning and data science to improve their processes? Let’s take a look at what these terms mean and how businesses are using them to make more strategic decisions and improve production processes.
Digital transformation in biopharma promises to deliver exponential results and make new discoveries and solutions to complex problems a reality, but it requires companies to make big changes to get there—changes in processes as well as adoption of new technologies. For some companies and facilities, this is a bigger leap than for others. Depending on the level of digitalization and integration that currently exists within a company, the process can take from months to years.
Digital transformation in biopharma, like other industries, is accelerating as Pharma 4.0 and Industry 4.0 begin to take shape in companies of all sizes. But knowing that digital transformation is inevitable is one thing; successfully managing the transition process is another. What are the steps biopharma companies should be taking to ensure a smooth digital transformation?
Looking for ways to improve the efficiency of its power plant operations while reducing costs and environmental emissions, the Department of Power and Water at Michigan State University (MSU) began a study using multivariate data analytics that led to some surprising findings. The results have implications that could help other operators of large-scale power facilities reduce their carbon footprint and improve power plant operations.
Finding the right balance between efficient power output from boilers and other energy producing equipment while also reducing environmental emissions is an important objective for power plant operators. Governments and environmental agencies around the world establish emission standards as part of air pollution regulations, but finding the right way to meet the standards can vary greatly depending on location, equipment and other operating factors.
From employing artificial intelligence (AI) to identify drug candidates to using big data to support continuous process manufacturing, the prospects for digital transformation in the biopharma industry are huge. Yet, biopharma and life sciences lag behind many other industries when it comes to digital transformation.
In the midst of a global crisis, many industrial manufacturing operations, including those in the chemical industry, are facing shortages of supplies and equipment or staff reductions, and finding it difficult to keep operations running as normal. Are there process improvements or tools that can be used to manage production more efficiently during this time of COVID-19 (and moving forward)?
While many other industries have implemented multivariate data analysis software for process optimization and control, it is still not very common in the pulp and paper industry. However, multivariate data analysis has a very promising potential for both cost reductions and quality improvements in pulp and paper mills. No capital investments are needed, the implementation can be done remotely, and the software typically requires no permits.
A team at GlaxoSmithKline (GSK) has carried out several studies using Raman spectroscopy and multivariate data analytics in order to monitor upstream cell culture processes. The studies show that Raman spectroscopy is a valuable analytical tool for enabling real-time monitoring and control of production bioreactors, and that improved control can lead to improved product quality. Below is a summary of the studies.
An important step on the road to creating treatments for illnesses like COVID-19, which has caused the recent global pandemic, may start with understanding the similarities and differences between the various strains of coronavirus known to exist today. Making sense of large and complex sets of data, especially those that require novel interpretation, calls for a powerful analytics toolset to speed up the process.
Out of control processes in pharma manufacturing are not something to take lightly. If your production runs are seeing frequent deviations, leading to expensive batch losses or frequent rework, it’s time to take a look at ways to correct any process deviations in a more expedient manner. Uncorrected deviations or processes that vary from approved process parameters can lead to costly and dangerous mistakes.
Keeping your pharmaceutical manufacturing processes under control is important not only to ensure a quality product, but also for regulatory compliance. Process or raw material deviations can affect the downstream quality of a product and could mean tossing out an entire batch or end product if process corrections aren’t made soon enough — or if you can’t document that a correction was made before it affected your critical quality attributes.
For pharmaceutical manufacturers, a process deviation may not only mean a bad batch that affects a downstream process, it can also risk a regulatory violation that leads to fines or expensive market setback, or worse, it could endanger the health of the patient.
A new diagnostic method for detecting a rare kidney stone disease has recently been developed at the University of Iceland. Instead of using urine microscopy, which has certain disadvantages, the diagnostic method is based on mass spectrometry of plasma samples. Preliminary clinical data shows very promising results both in terms of detecting the disease and therapeutic drug monitoring. Design of Experiments (DOE) was used as a chemometric approach to optimize the assay. Below is a summary of the assay development and optimization.
Continuous manufacturing is one of the key trends within the pharmaceutical industry, both for the production of ‘classical’ drugs as well as large molecules. Companies are looking for ways to shift from traditional batch processing to a continuous method of operation. The main advantages associated with these processes are more room for modularity, automation and flexibility due to a smaller footprint, as well as more consistent quality of the drug product.
For pharmaceutical companies facing challenges such as rising costs, stricter regulations and declining profit margins, innovative new technologies like artificial intelligence (AI) and digital twins have become part of an essential strategy to future-proof their businesses. A digital twin is the next evolution of machine learning, combining advanced data analytics and equipment simulation with comprehensive system models that blend historical information with real-time data to predict the future of a process. According to Gartner, the digital twin concept was one of the top 10 strategic technology trends in 2019.
Biosimilars are an exciting route to increasing access to the highly effective therapy made possible by biologics, but ensuring a biosimilar meets the critical quality attributes (CQA) of the original biologic is a major challenge. Optimizing production at full scale is impractical, which makes a quality by design (QbD) approach using a reliable scale down model of the process an attractive alternative. A process development team at Zhejiang Hisun Pharmaceuticals, Taizhou, China, therefore developed a scaled down model of the cell culture process used to produce the biosimilar adalimumab. They qualified the model using multivariate data analysis (SIMCA), and explored the design space for key process attributes (KPA) and CQAs using MODDE.
Optimizing the function of boilers, turbines and other capital equipment used to generate power requires a careful balance of fuel, heat, pressure, operator proficiency and many other variables. Managing the process on a day-to-day, or minute-to-minute basis, is like performing a skilled and well-orchestrated dance—partly based on data, but also based on operator expertise. Yet, adding more accurate information to the equation can potentially save millions of dollars, cut emissions significantly and even expand the working life of your equipment.
There is a strong demand for devices such as mobile phones, tablets and large screen TVs all over the globe. The business is competitive, which puts pressure on prices. At the same time, production costs are fairly high due to complex production processes. Consequently, a high yield becomes paramount for good profit margins. Multivariate data analysis (MVDA) is being employed by an increasing number of manufacturing companies to increase yield, and the electronics industry is no exception. This article provides examples of where and how real-time data analytics can be used in the electronics industry.
Over the last several years, the use of artificial intelligence (AI) in the pharma and biomedical industry has gone from science fiction to science fact. Increasingly, pharma and biotech companies are adopting more efficient, automated processes that incorporate data-driven decisions and use predictive analytics tools. The next evolution of this approach to advanced data analytics incorporates artificial intelligence and machine learning.
The key to being able to innovate, improve and streamline your processes often lies in gaining as many insights as you can from a variety of data sources scattered throughout your operations. Making sense of all that data can be difficult. But it's not an impossible dream.
Most biopharma manufacturing companies are keen to adopt new methods that would streamline production, reduce errors and ensure product quality. That was the goal of Bristol-Myers Squibb when they implemented a complex real-time process monitoring system that involved integrating data from a number of different technologies, systems and vendors to gain greater control over complex batch processes.
In many manufacturing industries, variability in raw materials can lead to unexpected and undesirable changes in the final products. In regulated industries such as pharmaceuticals, this is especially problematic due to the need to maintain carefully controlled processes that stay within approved regulatory parameters for drug development and production. Embracing a total company-wide digital transformation enabled Amgen to align data across multiple systems to not only control, but also predict unacceptable deviations in time to make necessary adjustments. Read on to find out how they used data analytics to implement real-time process control.
Several trends in the food and beverage industry are leading to challenges for manufacturers that can be best addressed with data analytics. With growing digitalization, more companies have access to the kinds of data that can transform their processes to meet the latest consumer demands as well as to shorten time to market, reduce costs, and shrink health and safety risks.
In the last few years, many pharmaceutical companies have started investing in continuous production, and some have already succeeded in filing new pharmaceuticals using a continuous flow manufacturing process. This article summarizes a study at GlaxoSmithKline (GSK), where real-time multivariate monitoring added value to the development of a continuous production process for an active pharmaceutical ingredient (API).
The pharmaceutical industry, including R&D, manufacturing and also product sales and use, creates a lot of data. The question is, what can we do to understand our data better, get more out of it, and unlock its potential in the most rational way possible to get to the knowledge we need? And how can we gain control over our research, or the processes needed to generate a stable, reliable product that consistently meets regulatory requirements? The answer is Multivariate Data Analysis.
In agrochemical, pharmaceutical and other industries that manufacture complex chemicals, finding ways to reduce waste and improve inefficiencies often hinges on selecting the right chemical compounds. Data analytics can help manufacturers find alternative compounds that meet complex requirements, decrease raw material usage or enable more cost-effective, sustainable processes.
A challenge for the regenerative medicine industry is to develop cell culture processes that can be scaled up for high volume production. Finding a better way to scale up commonly used research cells like HEK293T (used for protein expression and the production of recombinant retroviruses or lentiviral vectors) would be beneficial for biologists in many fields of medicine. Dr. Franziska Bollmann, virus scientist at Sartorius Stedim Biotech in Germany, recently conducted two experiments to find out if micro bioreactor systems can help facilitate the transition from the traditional shake flask process to a more improved method optimizing process control.
The 2019 Umetrics User Meeting drew more than 102 engineers, operations managers, process experts, researchers, and data scientists from industries ranging from biopharma to food and beverage to chemicals, who gathered to share ideas and insights into new methods for streamlining processes, reducing waste and lowering cost of goods sold.
The potential for Artificial Intelligence (AI) is enormous and the applications seemingly unlimited. One subset of AI, deep learning, offers the promise of efficiently solving a large range of challenges involving unstructured data by harnessing neural networks to save time and money, and even perform seemingly impossible tasks.
Deep learning has revolutionized the fields of artificial intelligence, computer vision, speech recognition, and more in recent years. Deep learning can draw information from unstructured data such as images or text in a way that was unthinkable a decade ago. In industries like Pharma and Biopharma, deep learning can help all the way from understanding how cells work using live cell imaging to monitoring manufacturing using audio.
Whether it’s fake olive oil, coffee bulked up with husks and twigs, or honey tainted with antibiotics, food fraud is a growing problem worldwide. The Australian research organization CSIRO estimated that the economic damage from food fraud alone reached US$35 billion in 2018. The underlying cause is nearly always financial gain and economic pressure to save money by using inferior (or mislabeled) products. Predictive analytics is one tool manufacturers are using to combat food fraud.
SIMCA 16 offers improved ribbons, tours, wizards, data merging, multiblock analysis and more.
SIMCA is a multivariate data analytics tool that helps users make sense of complex data by transforming numbers and statistics into visual information for easy interpretation and understanding. Across many industries, ranging from pharmaceuticals and chemicals to food and beverage manufacturing to academia, SIMCA helps production managers and researchers alike make better decisions so they can take action quickly and with confidence.
One key to reducing R&D costs in the biopharmaceutical market is streamlining and speeding up process data flow for Design of Experiments (DOE). Now, a direct integration of Genedata Bioprocess® platform and Umetrics Suite MODDE® software enables seamless data flow and facilitates the design, execution and evaluation of experiments in large-molecule process development.
Recently the FDA issued a new draft guidance for continuous manufacturing of small molecule drugs. With these draft guidelines, the FDA wants to encourage more pharmaceutical manufacturers to shift from traditional batch/start-stop processing to continuous manufacturing. The main advantages associated with these processes are more room for modularity, automation and flexibility due to a smaller footprint, as well as more consistent quality of the drug product. Of course, the main incentive for the FDA to promote this way of processing is its belief that it will have a positive impact on drug prices and prevent drug shortages.
Injection molding is the most important production method for manufacturing plastic components used in products ranging from cars to medical devices. Although the plastic components themselves are often inexpensive to produce, any defect can lead to expensive errors that can affect the performance or safety of the finished product. Creating a system of early fault detection and continuous process improvement can mean big payoffs for manufacturers.
Throughout the evolution of manufacturing, many industries have gradually shifted away from batch process to continuous process manufacturing as production technologies matured. This has included industries such as chemical, petroleum, steel, automobile, consumer goods and food manufacturing.
Breast cancer is the most commonly diagnosed cancer amongst women worldwide and a leading cause of cancer-related deaths among females. It’s the second most common type of cancer overall. According to the International Agency for Research on Cancer, there were more than 2 million new cases in 2018.
In industries that depend on bioprocessing, achieving the highest possible yields in the shortest time frame, while keeping costs down and product quality high is often challenging. Meeting these goals requires having a well-designed, well-defined and well-controlled process. And at the core of any effective process control is a set of effective process modeling tools.
In industries ranging from biopharmaceuticals to chemicals, executives in today’s manufacturing marketplace face ever-increasing pressures to grow profit margins, reduce time to market and optimize processes across all aspects of their business. Everything from constraints in the supply of raw materials to multiple steps in a manufacturing process can affect productivity—making process optimization an amorphous target.
Speed and changing market conditions are significant challenges for businesses across many industries and markets. The need for improved efficiency, the ability to adapt to changing market conditions and digitalization are often key drivers of change. By focusing on data, a company can make significant improvements in processes or systems, and gain insights into the driving forces behind business operations. Data analytics is the key to business optimization.
The natural variability of botanical material often makes it difficult to ensure a consistent quality process for pharmaceuticals made from plant-based products. In addition, botanical drug products (BDPs) are often produced using a series of separate batch processes, which adds even more variability into the manufacturing process.
Many elderly people are afraid of falling, and for good reason. Falls can have serious consequences for the individual, but the fear of falling itself can also affect health and independence. A new research project at Luleå University of Technology in Sweden has taken a closer look at fall-related concerns among elderly people using multivariate data analysis (MVDA), with the ultimate goal of finding diagnostic and training methods that could help reduce falls. Results from the first studies have provided some interesting answers.
What’s the secret formula for creating long-lasting bubbles? Is expert knowledge of liquid dynamics needed to optimize the mixture design and develop the best bubble solution? Or can we use design of experiments (DOE) and data analytics to draw conclusions? Let’s take a look at a fun example of how DOE can be used to optimize a mixture design in order to achieve our goal: create long-lasting bubbles.
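Mixture designs differ from ordinary factorial designs in that the component fractions must sum to one. One standard way to lay out candidate recipes is a simplex-lattice design, which can be generated with the standard library alone; the sketch below uses hypothetical bubble-mix components (water, soap, glycerin) that are illustrative, not taken from the actual experiment:

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(n_components, m):
    """{n, m} simplex-lattice design: every combination of the levels
    0, 1/m, ..., 1 across n components whose fractions sum to 1."""
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=n_components) if sum(pt) == 1]

# Three components (e.g. water, soap, glycerin) at levels 0, 1/2, 1
design = simplex_lattice(3, 2)
for run in design:
    print([float(x) for x in run])   # 6 candidate mixtures: pure components + 50/50 blends
```

Using exact `Fraction` arithmetic avoids floating-point rounding when filtering on the sum-to-one constraint, which is why the comprehension can test `sum(pt) == 1` directly.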
Advancements in cell and gene therapy hold promise for the future of personalized medicine, especially for cancer treatments. However, bioprocessing methods for autologous cellular therapies, and CAR-T in particular, often present unique challenges in manufacturing due to the variability of the starting material and unique nature of each batch. Is there a way to create more efficient processes in order to bring down costs and make personalized medicine a viable option for more patients?
Could data analytics aid in the diagnosis of severe neurological diseases? In a recent study, a research group at Umeå University has conducted statistical data analysis of biomarkers from patients suffering from Amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease) and Parkinson’s disease to investigate whether data analytics could help in the diagnosis of – and help distinguish between – the two diseases.
Pressure to cut development costs and lower regulatory barriers while assuring product quality has stimulated the pharmaceutical industry to apply Quality by Design (QbD) to manage risk and gain process and product understanding. As a result, QbD is being widely promoted by regulatory authorities such as the Food and Drug Administration, and the International Conference on Harmonization.
Mining information in unstructured text can be a real challenge. Patent documents, for example, provide a rich source of technological and scientific knowledge that can reveal technological trends as well as information on the legal landscape of the market. This makes analysis of the vast and ever-growing number of patents an important part of corporate business strategies.
Making the perfect bar of chocolate is not just about mixing the right amount of sugar and cocoa, or adjusting the process for product quality. Another factor that must be taken into account to optimize both taste and profits is the grinding time of the cocoa beans. Let’s take a look at how data analytics can be used to elevate both, and to find the right combination of ingredients and process to support the business goals — an important factor in the food and beverage industry.
Making sure your data and processes from research and development through to production are compliant is essential in today's highly regulated life science, biopharma, pharmaceutical and food industries. But it's no easy task. Following all of the required steps and ensuring the integrity of your data at every stage is easier and more successful when you use a product designed to keep your data compliant.
On the west coast of southern Sweden, facing the expanse of the ocean, is the beautiful city of Gothenburg. Surrounded by a string of islands, this city has been the home for sailors and merchants, seafaring and shipping, since ancient times. One of the islands to the north of Gothenburg is the picturesque island of Tjörn. Once every year, Tjörn is the location for one of the most famous sailing races in Sweden – “Tjörn Runt” or “Around Tjörn”.
In a manufacturing setting where consistent quality matters, variability in how individual technicians and operators perform their jobs can be frustrating for managers. Companies need a way to achieve consistent quality, without reducing the capacity for innovation and improvement.
In production, your media will pass through several different refinement steps. To truly understand the state and progression of production, and be confident in both, all of these processing steps need to be monitored continuously. With SIMCA® and SIMCA®-online, both part of the Umetrics® Suite of Data Analytics Solutions, you can confidently monitor and control every step of your process. The web clients allow you to access manufacturing data anytime, anywhere.
In life science biopharma manufacturing, demonstrating consistent, repeatable processes is essential both for regulatory compliance and product quality. Being able to create data-driven, performance-based objectives, and aligning the process control strategies with compliance and business performance objectives, allows companies to take their data analysis to the next level: the level at which it becomes meaningful for the company’s bottom line.
When it comes to continuous quality improvement and removing defects from a process, Six Sigma continues to be the gold standard in manufacturing and process management. This structured, data-driven methodology for discovering problems relies on rigorous analysis of production and process data. For many companies, engaging in a Six Sigma process can be time consuming or even a bit daunting.
You may have heard the term Six Sigma used in conjunction with lean manufacturing, a Kaizen approach or continuous quality improvement. Perhaps you thought Six Sigma only applied to large-scale business operations, or that newer philosophies had overtaken Six Sigma as the most updated approach to quality management? But if you're looking for a way to improve your production processes or solve a problem you’re having with quality, Six Sigma might be the answer. Are you and your team familiar with these concepts? Here's an overview.
Product development and innovation are important elements for the survival of many companies. Whether introducing a new food flavor or adding new product features, understanding consumer preferences can help guide both design and production decisions. The right decisions can make a product launch more successful, and ultimately more profitable.
Formative assessment has come into focus in recent years. In Sweden, the use of formative assessment is typically emphasized in the upper secondary school curriculum. However, scientific studies of its effect on student performance are mixed: some find positive effects, while others find no effect at all.
Furthermore, formative assessment has proved to be time consuming, which is obviously a problem if it has no effect on learning. A new thesis by Daniel Larsson at Linnæus University, Sweden, shows that multivariate data analysis (MVDA) can be used to provide some answers about the effectiveness of such teaching practices.
Multivariate data analysis (MVDA) is a statistical technique that can be used to analyze data with more than one variable in order to look for deviations and understand the relationships between the different data points. In practice, this can mean taking data from a number of different sources and turning it into meaningful information from which you can draw some conclusions.
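One common MVDA way to "look for deviations" across many variables at once is Hotelling's T-squared, which collapses each observation into a single distance from the data's center that respects the correlation structure between variables. A minimal numpy sketch on synthetic data (not from any real process):

```python
import numpy as np

def hotelling_t2(X):
    """Per-observation Hotelling's T-squared: a single multivariate
    distance that accounts for correlations between variables."""
    Xc = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(Xc, rowvar=False))
    return np.einsum('ij,jk,ik->i', Xc, inv_cov, Xc)

# 50 observations of 3 variables that normally move together
rng = np.random.default_rng(1)
base = rng.normal(size=(50, 1))
X = np.hstack([base + 0.3 * rng.normal(size=(50, 1)) for _ in range(3)])
X[10] = [8.0, -8.0, 8.0]   # inject an observation that breaks the correlation

t2 = hotelling_t2(X)
print(int(np.argmax(t2)))  # the deviating observation stands out
```

Note that the injected point might look unremarkable in any single variable; it is the broken relationship between the variables that the multivariate distance catches, which is precisely the advantage of MVDA over one-variable-at-a-time monitoring.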
In bioprocessing today, a shift is happening that takes the ability to monitor, optimize and control processes to the next level. Whereas in the past manufacturers aspired to measure data in order to find out why a bioprocess action happened (using descriptive and diagnostic analytics), today we are able to use predictive analytics to determine what will happen in a bioprocess based on specific process data measured in real-time. This migration “up the food chain” to a higher level of data analytics requires automation, ongoing process monitoring and the ability to make adjustments in real-time.
At the heart of any process used to manufacture biological products is a bioreactor setup that supports a stable and reproducible biologically active environment. The bioreactor provides a controlled environment to achieve optimal growth for the particular cell cultures being used.
Biopharmaceutical companies today are challenged to develop high producing cell lines as quickly as possible. Commercially available media may fall short of performance expectations required to meet targets. The alternative —fully customized media and feed development — requires significant funding, time and in-house expertise in media development.
In life science, biopharma and other areas of research, development and production, design of experiments (DOE) provides a systematic method to determine cause and effect relationships between factors and responses affecting a process, product or analytical system. But the key to understanding your results is effective analysis of your experimental data.
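As one concrete illustration of analyzing DOE results, the coefficients of a main-effects-plus-interaction model for a two-factor, two-level factorial can be estimated by ordinary least squares. The response values below are invented for the example:

```python
import numpy as np

# A full 2^2 factorial in coded units (-1/+1) for two factors,
# e.g. temperature and pH (hypothetical)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
y = np.array([10.0, 14.0, 12.0, 20.0])   # invented measured responses

# Model matrix: intercept, two main effects, and their interaction
M = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
b0, b1, b2, b12 = coef
print(b0, b1, b2, b12)   # intercept 14, main effects 3 and 2, interaction 1
```

Because the factorial design's columns are orthogonal, each coefficient can be read independently: factor 1 raises the response by 3 units per coded unit, factor 2 by 2, and the nonzero interaction term shows the factors do not act purely additively.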
In pharmaceutical and other industries that rely on spectroscopy and multivariate calibration for quality control of manufacturing processes, optimizing the analysis of spectral data is imperative. Using a tool that is specifically designed with spectral analytics in mind can make the job faster, easier and more reliable.
What do we mean by pre-processing of data, and why is it needed? Let's take a look at some data pre-processing methods and how they help create better models when using Principal Component Analysis (PCA) and other methods of data analytics.
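For instance, the most common pre-processing combination, mean-centering followed by scaling each variable to unit variance (often called autoscaling or UV scaling), takes only a few lines with numpy; the example data is made up:

```python
import numpy as np

def autoscale(X):
    """Mean-center and scale each column to unit variance (UV scaling),
    so variables measured in different units get equal weight in PCA."""
    mean = X.mean(axis=0)
    std = X.std(axis=0, ddof=1)
    return (X - mean) / std

# Two variables on wildly different scales (e.g. pH vs. flow rate)
X = np.array([[1.0, 100.0],
              [2.0, 300.0],
              [3.0, 500.0]])
Z = autoscale(X)
print(Z.mean(axis=0))           # each column now centered at ~0
print(Z.std(axis=0, ddof=1))    # each column now has unit variance
```

Without this step, PCA would be dominated by whichever variable happens to have the largest numerical range, regardless of how informative it actually is.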
For manufacturing companies, process control is essential, even for those producing low-cost items such as small plastic parts. That’s because even when units are small and inexpensive, the cost of defects becomes exponentially higher when they reach the next manufacturing step at another plant.
In this blog post, we’ll take a closer look at a feature of the SIMCA data analytics software called the Omics skin. So what exactly is an “omics” skin?
In chemical manufacturing, the process involved in creating a breakthrough product often takes several years — with ongoing tests that may be based on trial and error as much as specifically applied knowledge. One area of development in the specialty chemicals market involves the creation of new additives called plasticizers that can help resins or polymers retain a more supple or flexible nature.
In this blog post we will take a closer look at OPLS*, or Orthogonal PLS, a method to model process data. The advantage of OPLS compared to PLS is that you can uncover hidden details and get a more precise understanding of your data – all of which will help you build better predictive models of your processes.
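As a rough illustration of the idea (a numpy sketch following the published single-response recipe, not the Umetrics implementation), the core OPLS step removes one Y-orthogonal component from X, so the structured variation that is unrelated to the response no longer clouds the predictive model; the data here is synthetic:

```python
import numpy as np

def opls_filter(X, y):
    """Remove one Y-orthogonal component from mean-centered X
    (single-response OPLS, one orthogonal component).
    Returns the filtered X and the removed orthogonal scores."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc / (yc @ yc)            # Y-predictive weight vector
    w /= np.linalg.norm(w)
    t = Xc @ w                           # predictive scores
    p = Xc.T @ t / (t @ t)               # loading for those scores
    w_orth = p - (w @ p) * w             # part of the loading unrelated to y
    w_orth /= np.linalg.norm(w_orth)
    t_orth = Xc @ w_orth                 # orthogonal (structured noise) scores
    p_orth = Xc.T @ t_orth / (t_orth @ t_orth)
    return Xc - np.outer(t_orth, p_orth), t_orth

rng = np.random.default_rng(2)
X = rng.normal(size=(20, 5))
y = X[:, 0] + 0.1 * rng.normal(size=20)

X_filt, t_orth = opls_filter(X, y)
# The removed scores carry no information about the response:
print(abs(t_orth @ (y - y.mean())) < 1e-8)
```

By construction the orthogonal weight is perpendicular to the predictive weight, so the removed scores are exactly uncorrelated with the response, which is what makes the remaining predictive component easier to interpret than an ordinary PLS component.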
Worldwide demand for energy escalates every year, and the consumption of fossil fuels continues to increase despite the growing supply of alternative energy options. Globally, about 81 percent of energy comes from a finite supply of fossil fuels like oil, coal and natural gas. Fossil fuels are used to heat homes, run vehicles, power industry and manufacturing, and provide electricity.
All manufacturing industries need good control and a good overview of their production processes. As already discussed in a previous blog post, SIMCA-online enables you to apply advanced multivariate data analytics in real time to monitor your production processes, for example to make sure that your production process is behaving as it should or that the quality is what it should be.
In manufacturing and other industries that have complex processes, knowing which variables have the most impact on quality and at what point, or knowing which combination of variables to change in order to improve your process, can have a huge impact on the overall quality or profitability of your manufacturing process. But without making expensive and time-consuming changes in the physical processes in order to test all possible scenarios, how can you identify and predict the variables that have the most significant impact on your outputs?
Using real-time data analytics monitoring has become the accepted way to monitor processes in several industries. The goal is to detect and diagnose issues as they happen, which is a great leap forward compared to traditional analysis conducted in retrospect. This has been highlighted in a previous blog post.
In a previous blog post we discussed how SIMCA-online can help you make complex data simple and ensure that you get maximum value from your data.
In this blog post we will introduce a number of benefits of the newly released versions of SIMCA 15 and SIMCA-online 15. To mention just a few things, you get a much improved ability to model and control complete processes, including highly complex ones. You also get a much better connection between SIMCA and SIMCA-online, so that information can flow in both directions.
An important environmental issue that has come into focus is the increasing number of chemicals that we are exposed to in our everyday life. Chemicals are found in products ranging from cars and furniture to clothing and skincare, and are also by-products from combustion. The CAS REGISTRY℠, an international standard for chemical information, currently contains more than 134 million unique organic and inorganic chemical substances and more than 67 million sequences.
Using a Quality by Design (QbD) approach for DOE supports ICH Q8 compliance
In pharmaceutical development, manufacturers must be able to demonstrate product robustness and deliver the intended quality of the product within allowable ranges for the claimed shelf-life period. Both international and country-specific regulatory agencies, such as the FDA, pay close attention to these claims.
Using advanced data analytics models in real time opens up a whole new world of possibilities for improving your production processes. Not only does real-time process monitoring provide a level of confidence in your process performance, it can also help improve the overall quality of your production output.
If data analytics were easy, everyone would do it, right? Well, what if it were easy enough for anyone to do? Can you imagine what sort of insights you might glean from the vast pools of data your company collects about your manufacturing processes, sales or production outputs?
If all of your data stays hidden in the depths of some process control computer or in Excel spreadsheets on the manufacturing floor manager’s desk, are they doing anyone any good?
Analyzing batch process data is a lot like juggling. You have multiple sets of data from different sources and in order to turn them into a meaningful presentation, you need a method of handling them to make sure they are all in the right place at the right time.
When to apply OPLS-DA vs PCA for metabolomics and other omics data analysis
Do you know when to use OPLS-DA and when to use PCA/SIMCA data analysis techniques? Find out how to uncover the differences in your data with these classification and discriminant analysis methods.
How Multivariate Data Analysis Can Separate the Players from the Gorillas (MVDA for beginners)
We have more data than ever before coming at us from many sources – both in our personal lives as well as business. Data is everywhere: from the production flow of a manufacturing floor to the sales results in a grocery store to the number of shares a page gets on Facebook. How do you sort it all out in a way that makes sense? Which data should you worry about and which should you ignore?
“The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.” — Bill Gates
Producing and distributing raw materials and foodstuffs with a low profit margin is a challenging business. One major supplier has made significant gains through applying multivariate data analysis (MVDA) to their manufacturing processes and logistics.
The key to process manufacturing success is a mixture of knowledge and experience supported by mastery of data. A presentation I recently attended put this into sharp focus. A major paper manufacturer was faced with the challenge of maintaining paper smoothness during production. They approached this problem in a way that gave them enormous insight into their process and the ability to control it in real time, and ultimately led to cost savings and maintained quality. There were also a few added benefits, including the ability to spot, diagnose and solve problems in real time.
Attending IFPAC 2017? Come to booth #407 to see how augmented reality and data analytics work hand-in-hand. Using Microsoft HoloLens technology, Sartorius Stedim Data Analytics has developed groundbreaking innovations that we will showcase and demo at IFPAC 2017. Get a personal demo and see, among other interesting use cases, how multidimensional design spaces can be visualized as a hologram using augmented reality.