In a manufacturing setting where consistent quality matters, variability in how individual technicians and operators perform their jobs can be frustrating for managers. Companies need a way to achieve consistent quality, without reducing the capacity for innovation and improvement.
Whether you work in engineering, R&D, or a science lab, understanding the basics of experimental design can help you achieve statistically sound results from your experiments or improve your output quality.
Product development and innovation are important elements for the survival of many companies. Whether introducing a new food flavor or adding new product features, understanding consumer preferences can help guide both design and production decisions. The right decisions can make a product launch more successful, and ultimately more profitable.
Formative assessment has come into focus in recent years. In Sweden, the use of formative assessment is typically emphasized in the curriculum of upper secondary schools. However, scientific studies show both positive effects and no effect at all of formative assessment on student performance.
Furthermore, formative assessment has proved to be time-consuming, which is obviously a problem if it has no effect on learning. A new thesis by Daniel Larsson at Linnæus University, Sweden, shows that multivariate data analysis (MVDA) can be used to give some answers about the effectiveness of such teaching practices.
Multivariate data analysis (MVDA) is a statistical technique that can be used to analyze data with more than one variable in order to look for deviations and understand the relationships between the different data points. In practice, this can mean taking data from a number of different sources and turning it into meaningful information from which you can draw some conclusions.
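As a minimal illustration of that idea (not any specific MVDA software), the sketch below takes a small invented table of process measurements, standardizes the variables, examines their correlations, and flags a sample that deviates from the rest. The variable meanings and values are assumptions for demonstration only.

```python
import numpy as np

# Toy data: 6 samples measured on 3 process variables
# (e.g. temperature, pH, concentration). Values are invented.
X = np.array([
    [37.0, 7.1, 1.2],
    [36.8, 7.0, 1.1],
    [37.1, 7.2, 1.3],
    [36.9, 7.1, 1.2],
    [42.5, 6.2, 2.9],   # a clearly deviating sample
    [37.0, 7.0, 1.2],
])

# Standardize each variable to zero mean and unit variance
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Correlation matrix summarizes relationships between the variables
corr = np.corrcoef(X, rowvar=False)

# Flag samples whose overall standardized distance is unusually large
distance = np.sqrt((Z ** 2).sum(axis=1))
outliers = np.where(distance > 2.0)[0]
print(outliers)  # → [4], the deviating sample
```

Real MVDA tools go much further (latent-variable models, cross-validation, diagnostics), but the core move is the same: combine many variables into a joint view instead of inspecting them one at a time.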
In bioprocessing today, a shift is happening that takes the ability to monitor, optimize and control processes to the next level. Whereas in the past manufacturers aspired to measure data in order to find out why a bioprocess action happened (using descriptive and diagnostic analytics), today we are able to use predictive analytics to determine what will happen in a bioprocess based on specific process data measured in real time. This migration “up the food chain” to a higher level of data analytics requires automation, ongoing process monitoring and the ability to make adjustments in real time.
At the heart of any process used to manufacture biological products is a bioreactor setup that supports a stable and reproducible biologically active environment. The bioreactor provides a controlled environment to achieve optimal growth for the particular cell cultures being used.
Biopharmaceutical companies today are challenged to develop high-producing cell lines as quickly as possible. Commercially available media may fall short of the performance required to meet targets. The alternative, fully customized media and feed development, requires significant funding, time and in-house expertise.
In life science, biopharma and other areas of research, development and production, design of experiments (DOE) provides a systematic method to determine cause and effect relationships between factors and responses affecting a process, product or analytical system. But the key to understanding your results is effective analysis of your experimental data.
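To make the cause-and-effect idea concrete, here is a minimal sketch of a two-level full factorial design analysed by ordinary least squares. The two factors and the response values are invented for illustration; DOE software automates this and adds replication, randomization and diagnostics.

```python
import itertools
import numpy as np

# Two-level full factorial design for two hypothetical factors
# (say, temperature and pH), coded as -1 (low) and +1 (high)
levels = [-1, 1]
design = np.array(list(itertools.product(levels, repeat=2)))

# Invented response values (e.g. product yield), one per run
y = np.array([60.0, 70.0, 65.0, 85.0])

# Model matrix: intercept, two main effects, and the interaction
X = np.column_stack([
    np.ones(len(design)),        # intercept
    design[:, 0],                # factor A
    design[:, 1],                # factor B
    design[:, 0] * design[:, 1]  # A x B interaction
])

# Least-squares fit estimates the coefficients
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # → [70.   5.   7.5  2.5]
```

Here factor B has the largest coefficient (7.5), so moving it from its low to its high level changes the response most; the nonzero interaction term shows the factors do not act independently.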
Principal component analysis, or PCA, is a statistical procedure that allows you to summarize the information content in large data tables by means of a smaller set of “summary indices” that can be more easily visualized and analyzed. The underlying data can be measurements describing properties of production samples, chemical compounds or reactions, process time points of a continuous process, batches from a batch process, biological individuals or trials of a DOE-protocol, for example.
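The summarizing step itself can be sketched in a few lines: mean-center the data table, take a singular value decomposition, and read off the scores (the summary indices for each sample) and the variance explained by each component. The data values below are invented; dedicated PCA tools add scaling options, cross-validation and plots.

```python
import numpy as np

# Small illustrative data table: 5 samples x 3 correlated variables
X = np.array([
    [2.5, 2.4, 0.5],
    [0.5, 0.7, 2.1],
    [2.2, 2.9, 0.6],
    [1.9, 2.2, 0.8],
    [3.1, 3.0, 0.3],
])

# Center the columns (PCA is applied to mean-centered data)
Xc = X - X.mean(axis=0)

# SVD yields the principal component directions in the rows of Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Scores: coordinates of each sample along the summary indices
scores = Xc @ Vt.T

# Fraction of total variance captured by each component
explained = s**2 / np.sum(s**2)
print(explained)  # first component dominates for correlated data
```

Because the three variables move together, the first summary index captures the bulk of the variation, which is exactly why a large table can often be visualized with just two or three components.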