
The quality of a product is dynamic in nature and develops over time. We present a case study from the food industry in which the concept of measuring the end-product quality is extended to incorporate the shelf-life period of the product with practical examples showing the harm of ignoring the changes that occur in the product quality as a function of time. The article also addresses the use of in-line spectroscopy to relate the variations in the input parameters, such as the raw materials, and the process variables to the final product quality over the entire shelf-life of the product. We also discuss multivariate statistical process control and monitoring issues.


... Kang and Albin (2000) [3] introduced semiconductor manufacturing applications. Sahni et al. (2005) [4] described a production process of food mayonnaise with profile. For these different applications, we need to use a variety of appropriate control charts to monitor the parameters. ...

In recent years, linear profile monitoring has become a popular research direction in SPC. Although the linear profile model is simple and widely applicable, conventional charts are overly sensitive to small parameter changes, which inflates the false-alarm rate. This paper presents a new control chart for linear profiles that incorporates practical importance. By tolerating small shifts that a conventional control chart would flag, the chart ensures that only the really important changes are detected. A simulation study shows that the new control chart detects changes in the intercept and slope efficiently. Building on the control chart provided by Kim (2003), we obtain the threshold for the chart with tolerance under the null hypothesis at a nominal in-control run length (ARL0), and also report run lengths for out-of-control situations (ARL1).
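The run-length behaviour described above can be illustrated with a small simulation. The sketch below is a toy setup, not the chart of the paper: design points, limits, and tolerances are all invented. Each sampled profile is fitted by least squares, and the usual 3-sigma limits on the intercept and slope estimates are widened by a tolerance so that practically unimportant shifts are ignored:

```python
import math
import random

def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    return ybar - b * xbar, b

def run_length(rng, x, a0, b0, sigma, limits, tol, shift_b=0.0, max_n=10000):
    """Number of profiles sampled until the chart signals.
    `limits` are 3-sigma half-widths for the (intercept, slope) estimates;
    `tol` widens them so that practically unimportant shifts are ignored."""
    for t in range(1, max_n + 1):
        y = [a0 + (b0 + shift_b) * xi + rng.gauss(0.0, sigma) for xi in x]
        a_hat, b_hat = fit_line(x, y)
        if (abs(a_hat - a0) > limits[0] + tol[0]
                or abs(b_hat - b0) > limits[1] + tol[1]):
            return t
    return max_n

def arl(n_runs, seed=42, **kw):
    """Average run length over repeated simulated chart runs."""
    rng = random.Random(seed)
    return sum(run_length(rng, **kw) for _ in range(n_runs)) / n_runs

# Hypothetical design: 10 equispaced points, sigma = 1.
x = list(range(1, 11))
sxx = sum((xi - 5.5) ** 2 for xi in x)        # = 82.5
sd_b = 1.0 / math.sqrt(sxx)                   # sd of the slope estimate
sd_a = math.sqrt(1.0 / 10 + 5.5 ** 2 / sxx)   # sd of the intercept estimate
limits = (3 * sd_a, 3 * sd_b)
```

With `tol=(0, 0)` this reduces to a conventional chart; a positive tolerance lengthens the in-control run length (fewer alarms for unimportant shifts) while large shifts are still caught quickly.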

... They also gave an example of designing a robust alternator, where the aim is to obtain a desired current profile over a range of speeds. Jin and Shi (2001) ... Sahni et al. (2005) present an example of a profile response from a mayonnaise production process in the food industry. Some examples of profiles are shown in Figures 1 and 2. ...

... uately using a polynomial model or using piecewise polynomial models. In this article, we restrict our attention to the types of nonlinear profiles that can be adequately modeled using lower-order polynomials. Examples of polynomial profiles include the acceleration and deceleration profile of an airbag in automobiles, Marklund and Nilsson (2003). Sahni et al. (2005) discuss a scenario where monitoring the viscosity of mayonnaise over time is of interest. In this study we investigate the changepoint approach for Phase I analysis of polynomial profiles and conclude the article with our comments on the Phase II aspect of profile monitoring using the changepoint approach. The changepoint approach can be def ...

... Wärmefjord (2004) described the problem for the assembly process of the Saab automobile. Sahni et al. (2005) suggest how to monitor the raw materials and different process variables in the food industry in order to assure the quality of the final product. ...

The aim of sequential surveillance is on-line detection of an important change in an underlying process as soon as possible after the change has occurred. Statistical methods suitable for surveillance differ from hypothesis testing methods. In addition, the criteria for optimality differ from those used in hypothesis testing. The need for sequential surveillance in industry, economics, medicine and for environmental purposes is described. Even though the methods have been developed under different scientific cultures, inferential similarities can be identified. Applications contain complexities such as autocorrelations, complex distributions, complex types of changes, and spatial as well as other multivariate settings. Approaches to handling these complexities are discussed. Expressing methods for surveillance through likelihood functions makes it possible to link the methods to various optimality criteria. This approach also facilitates the choice of an optimal surveillance method for each specific application and provides some directions for improving earlier suggested methods.
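Likelihood-based surveillance of the kind discussed above is often implemented as a CUSUM, which accumulates evidence for a change sequentially and signals as soon as the cumulative sum crosses a decision interval. A minimal one-sided sketch for an upward mean shift (the reference value `k` and decision interval `h` below are illustrative choices, not values from the paper):

```python
def cusum(xs, target=0.0, k=0.5, h=5.0):
    """One-sided CUSUM for detecting an upward shift in the mean.
    k is the reference value (half the shift, in sd units, to detect quickly);
    h is the decision interval. Returns the index of the first alarm, or None."""
    s = 0.0
    for t, x in enumerate(xs):
        s = max(0.0, s + (x - target) - k)   # accumulate evidence, reset at zero
        if s > h:
            return t
    return None
```

On a stream that sits at the target for 50 observations and then jumps to 2.0, the statistic climbs by 1.5 per observation after the change and alarms on the fourth post-change point.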

... Marengo et al. (2005) considered an example involving organic pigments applied to cotton surfaces. Sahni et al. (2005), on the other hand, considered an example related to the shelf-life of a food product. Wang and Tsung (2005) used profile monitoring methods to detect changes in a Q-Q plot which reflected the relationship between the current sample and a baseline sample. ...

In many applications, the quality of a process or product is better characterized and described by a functional relationship between a response variable and one or more explanatory variables. Profile monitoring is used to understand and check the stability of this relationship over time. Each time a sample is selected, a collection of points that can be represented by a curve (or profile) is observed. In some calibration applications, the profile can be represented adequately by a simple linear regression model, while in other applications more complicated models are needed. The aims of this article are to present and summarize recent research on the use of control charts to monitor process and product-quality profiles, and to encourage further research in this area.

... Wärmefjord (2004) described the multivariate problem for the assembly process of the Saab automobile. Sahni et al. (2005) suggest that the raw materials and the different process variables in the food industry should be analysed in order to assure the quality of the final product. Surveillance of several parameters of a distribution (such as the mean and the variance) is multivariate surveillance (see, for example, Knoth and Schmid (2002)). ...

Multivariate surveillance is of interest in industrial production as it enables the monitoring of several components. Recently there has been an increased interest also in other areas such as detection of bioterrorism, spatial surveillance and transaction strategies in finance. Several types of multivariate counterparts to the univariate Shewhart, EWMA and CUSUM methods have been proposed. Here a review of general approaches to multivariate surveillance is given with respect to how suggested methods relate to general statistical inference principles. Suggestions are made on the special challenges of evaluating multivariate surveillance methods.

... The vast majority of SPM methods were developed to handle scalar (0th-order tensor) sensor-like data, either univariately [29][30][31] or multivariately [32][33][34][35]. More recently, profile monitoring emerged as a new SPM branch dedicated to functional relationships [36][37][38] or higher-order tensorial data structures such as near-infrared (NIR) spectra [39][40][41], surface profilometry [37], grey-level images [42], colour and hyperspectral images [43][44][45][46][47], and hyphenated instruments [48], among others, expanding the SPM domain to new processes/products. The one aspect shared by all methods proposed in the past, and the fundamental premise established since the seminal work of Walter A. Shewhart [31], is that all common-cause variation must be present in the reference historical Normal Operating Conditions (NOC) dataset collected to conduct Phase 1 analysis, i.e., in order to assess process stability and establish the control limits for the monitoring statistics. ...

Modern industrial units collect large amounts of process data based on which advanced process monitoring algorithms continuously assess the status of operations. As an integral part of the development of such algorithms, a reference dataset representative of normal operating conditions is required to evaluate the stability of the process and, after confirming that it is stable, to calibrate a monitoring procedure, i.e., estimate the reference model and set the control limits for the monitoring statistics. The basic assumption is that all relevant “common causes” of variation appear well represented in this reference dataset (using the terminology adopted by the founding father of process monitoring, Walter A. Shewhart). Otherwise, false alarms will inevitably occur during the implementation of the monitoring scheme. However, we argue and demonstrate in this article that this assumption is often not met in modern industrial systems. Therefore, we introduce a new approach based on the rigorous mechanistic modeling of the dominant modes of common cause variation and the use of stochastic computational simulations to enrich the historical dataset with augmented data representing a comprehensive coverage of the actual operational space. We show how to compute the monitoring statistics and set their control limits, as well as to conduct fault diagnosis when an abnormal event is declared. The proposed method, called AGV (Artificial Generation of common cause Variability) is applied to a Surface Mount Technology (SMT) production line of Bosch Car Multimedia, where more than 17 thousand product variables are simultaneously monitored.

... The model structure presented in equation (2) is usually estimated through principal components regression (PCR) [26][27][28][29] or partial least squares (PLS) [30][31][32][33]. This model structure finds wide applicability and success in chemometric problems [34][35][36], soft sensor development [37][38], large-scale process monitoring and predictive analysis [20][39][40], and biosystems [41]. However, there is a mismatch between the model structure in equation (2) and the structure of data collected in the present situation, as detailed next. ...

The optimized operation of modern analytical instrumentation is a critical but complex task. It involves the simultaneous consideration of a large number of factors, both qualitative and quantitative, where multiple responses should be quantified and several goals need to be appropriately weighed, such as global quantification performance, selectivity, and cost. Furthermore, the problem is highly case-specific, depending on the type of instrument, the target analytes, and the media where they are dispersed. Therefore, an optimization procedure should be conducted frequently, which implies that it should be efficient (requiring a low number of experiments), as simple as possible (from experimental design to data analysis), and informative (interpretable and conclusive). The success of this task is fundamental for achieving the scientific goals and for justifying, in the long run, the high economic investment and the significant costs of operation. In this article, we present a systematic optimization procedure for the prevalent class of situations where multiple responses are available regarding a family of chemical compounds (instead of a single analyte). This class of problems leads to responses exhibiting mutual correlations, for which several goals need to be considered simultaneously. Our approach explores the latent variable structure of the responses created by the chemical affinities of the compounds under analysis and the orthogonality of the interpretable extracted components to conduct their simultaneous optimization with respect to different analysis goals. The proposed methodology was applied to a real case study involving the quantification of a family of analytes with impact on wine aroma.

... Multivariate problems for the assembly process of the Saab automobile were described in [40]. In the food industry different raw materials and several process steps are used, and in [32] it is suggested that these be analyzed in order to assure the quality of the final product. In recent years there has been an increased need for, and interest in, continuous monitoring in many areas apart from industrial production. ...

Multivariate surveillance is of interest in many areas such as industrial production, bioterrorism detection, spatial surveillance, and financial transaction strategies. Some of the suggested approaches to multivariate surveillance have been multivariate counterparts to the univariate Shewhart, EWMA, and CUSUM methods. Our emphasis is on the special challenges of evaluating multivariate surveillance methods. Some new measures are suggested and the properties of several measures are demonstrated by applications to various situations. It is demonstrated that zero-state and steady-state ARL, which are widely used in univariate surveillance, should be used with care in multivariate surveillance.
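For concreteness, the most common multivariate counterpart of the Shewhart chart mentioned above, a Hotelling T² chart, can be sketched as follows. The two correlated in-control variables and the Phase I sample below are invented for illustration; for p = 2 variables the chi-square control limit has a closed form:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Phase I: reference data for two correlated in-control variables
# (all numbers are invented for illustration).
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
ref = rng.multivariate_normal([0.0, 0.0], cov, size=500)
mu = ref.mean(axis=0)
S_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def t2(x):
    """Hotelling T^2 statistic for a new observation vector x."""
    d = np.asarray(x, dtype=float) - mu
    return float(d @ S_inv @ d)

# For p = 2 the chi-square(2) quantile is available in closed form.
alpha = 0.005
ucl = -2.0 * math.log(alpha)   # upper control limit
```

A point like (3, -3) that breaks the positive correlation of the two variables gives a large T² even though each coordinate alone might look only moderately extreme, which is exactly the advantage of the multivariate chart.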

... Profiles are applicable in many other situations, such as performance testing, in which the response is a performance curve over a range of an independent variable like frequency or speed (Bisgaard & Steinberg 1997). Sahni et al. (2005) presented a non-linear profile to monitor a mayonnaise production process. Jin & Shi (2001) developed profiles as waveform signals and gave examples of force and torque signals collected from online sensors. ...

In this paper, a non-parametric approach is first proposed to monitor simple linear profiles with non-normal error terms in Phase I and Phase II. In this approach, two control charts based on a transformation technique and on decision on beliefs are designed in order to monitor the intercept and the slope simultaneously. Then, simulation experiments are performed to evaluate the performance of the proposed control charts in Phase II under both step and drift shifts, in terms of out-of-control average run length (ARL1). In addition, the performance of the proposed control charts is compared to that of seven other existing schemes in the literature. Simulation results show that the proposed control charts outperform the other control charts in detecting both small step and small drift shifts in the intercept. However, they perform worse than the other control charts in detecting both small step and small drift shifts in the slope. Finally, a real example from the electronics industry is used to illustrate the implementation of the proposed method.

... The mixtures were the whole plots and the process variables the sub-plots. The design is presented and discussed more thoroughly in Sahni et al. [22]. Note that this is a crossed split-plot situation leading to OLS = GLS. ...

Split-plot designs are frequently needed in practice because of practical limitations and issues related to cost. This imposes extra challenges on the experimenter, both when designing the experiment and when analysing the data, in particular for non-replicated cases. This paper is an overview and discussion of some of the most important methods for analysing split-plot data. The focus is on estimation, testing and model validation. Two examples from an industrial context are given to illustrate the most important techniques. Copyright © 2006 John Wiley & Sons, Ltd.

... See Bisgaard and Steinberg (1997) for an example. Sahni et al. (2005) discuss an example from a mayonnaise production process where the quality characteristic being monitored is a non-linear profile. Nair et al. (2002) present an example from injection moulding where the response of interest is the compression strength of foam measured over different levels of compression. ...

We compare the performance of two phase II monitoring schemes for linear profiles, one based on the classical calibration method monitoring the deviations from the regression line (referred to as the NIST method) and the second based on individually monitoring the parameters of the linear profile (referred to as the KMW method). The comparison criterion is average run length performance under different sustained shifts in the intercept, slope and error standard deviation of the linear calibration line. A simulation study shows that the NIST method performs poorly compared to the combined control charting scheme of the KMW method.

... Gained more information from the data. [Flattened table fragment from the review, listing citing studies: Alsaleh (2007); Beardsell and Dale (1999); Bidder (1990); Hersleth and Bjerke (2001); Hung and Sung (2011); Jha, Michela, and Noori (1999); Matsuno (1995); Rohitratana and Boon-itt (2001); Sanigar (1990); Narinder, Aastveit, and Naes (2005); Negiz et al. (1998); Orr (1999); Srikaeo and Hourigan (2002); Van Der Spiegel, Luning, Boer, Ziggers, and Jongen (2005) — Business] ...

In this paper we demonstrate both a design strategy and a set of analysis techniques for a designed experiment from an industrial process (cheese making) with multivariate responses (sensory data). The design strategy uses a two-level factorial design for the factors that can be controlled, and blocking on the raw material to cover other non-designed variation in the raw material. We measure the raw materials, and the process at several points, with FT-IR spectroscopy. The methods of analysis complement each other to give more understanding and better modelling. The 50–50 MANOVA method provides multivariate analysis of variance to test for significance of effects for the design variables. Ordinary PLS2 analysis gives an overview of the data and generates hypotheses about relations. Finally, the orthogonal LS–PLS method is extended to multivariate responses and used to identify the source of the observed block effect and to build models that can be used for statistical process control at several points in the process. In these models, the information at one point is corrected for information that has already been described elsewhere.

A new methodology is proposed to describe the evolution of the main chemical compounds of grape must during wine alcoholic fermentation using Attenuated Total Reflectance Mid Infrared (ATR-MIR) spectra in combination with Multivariate Curve Resolution Alternating Least-Squares (MCR-ALS). In addition, we have developed a process control strategy to detect differences between fermentations running under Normal Operation Conditions (NOC) and fermentations intentionally spoiled with lactic acid bacteria at the beginning of alcoholic fermentation (MLF) to promote deviations from NOC.
MCR-ALS models on these data showed a good data fit (R² = 99.95% and lack of fit = 2.31%). It was possible to resolve the spectral profiles of relevant molecules involved in the alcoholic fermentation process, including the one related to bacterial spoilage (lactic acid). MSPC charts were built based on the concentration profiles obtained from the MCR-ALS models and using T² and Q statistics. Spoiled wines showed off-limit values for T² after 96 hours, making it possible to detect lactic acid bacteria spoilage in early stages of alcoholic fermentation.
Thus, the use of ATR-MIR spectra and MCR-ALS analysis shows great potential for rapid control of the state of the alcoholic fermentation process, making it possible to detect the appearance of undesired molecules early in the process, which allows the winemaker to apply corrective measures and obtain a good final product.
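The MCR-ALS decomposition used in this study can be sketched in a few lines: alternating least squares with nonnegativity enforced by clipping. The code below is a bare-bones illustration on synthetic two-component data, not the authors' implementation, and the "lack of fit" it reports is the relative residual norm quoted for such models:

```python
import numpy as np

def mcr_als(D, n_comp, n_iter=200, seed=0):
    """Bare-bones MCR-ALS sketch: factor D (samples x channels) into
    nonnegative concentration profiles C and spectra St so that D ≈ C @ St."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_comp))
    St = None
    for _ in range(n_iter):
        # alternate least-squares solves, clipping negatives to zero
        St = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0.0, None)
        C = np.clip(np.linalg.lstsq(St.T, D.T, rcond=None)[0].T, 0.0, None)
    return C, St

def lack_of_fit(D, C, St):
    """Relative residual norm (the 'lack of fit' reported for such models)."""
    return float(np.linalg.norm(D - C @ St) / np.linalg.norm(D))
```

In practice MCR-ALS implementations add closure, unimodality, or known-spectrum constraints; clipping is the simplest way to convey the idea.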

Industrial production requires multivariate control charts to enable monitoring of several components. Recently there has been an increased interest also in other areas such as detection of bioterrorism, spatial surveillance and transaction strategies in finance. In the literature, several types of multivariate counterparts to the univariate Shewhart, EWMA and CUSUM methods have been proposed. We review general approaches to multivariate control charts. Suggestions are made on the special challenges of evaluating multivariate surveillance methods.

The characterization of products and processes by space- or time-related sets of measurements (profiles) is becoming an increasingly common situation in today’s highly instrumented manufacturing systems. Even though many applications and methodologies in which profile measurements are employed as inputs to analysis tasks have already been proposed and described, problems in which they naturally appear as outputs are rare. Therefore, in this work, we present real-world applications in which profiles are the desired prediction targets and describe the methodologies followed to address the underlying analysis goals. In this context, we show how to properly establish the sample-specific prediction intervals for profiles, in a simple and flexible way, through nonparametric resampling or noise addition procedures. The added value of the various analyses carried out during the description of the case studies is also highlighted from the standpoint of the associated benefits for process improvement.
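One of the nonparametric resampling procedures mentioned, residual bootstrapping, can be sketched for the simplest straight-line case (the case studies involve richer profile models; everything below is illustrative):

```python
import random

def fit_line(x, y):
    """OLS fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    return ybar - b * xbar, b

def bootstrap_pi(x, y, x_new, n_boot=2000, alpha=0.1, seed=0):
    """Residual-bootstrap prediction interval at x_new for a straight-line
    model (assumes i.i.d. residuals; alpha = 0.1 gives a 90% interval)."""
    rng = random.Random(seed)
    a, b = fit_line(x, y)
    res = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    preds = []
    for _ in range(n_boot):
        ystar = [a + b * xi + rng.choice(res) for xi in x]   # resample residuals
        ab, bb = fit_line(x, ystar)                          # refit the model
        preds.append(ab + bb * x_new + rng.choice(res))      # add a future noise draw
    preds.sort()
    return preds[int(alpha / 2 * n_boot)], preds[int((1 - alpha / 2) * n_boot) - 1]
```

Because the interval is built from the empirical residual distribution, it adapts to non-normal noise without any distributional assumption, which is the flexibility the abstract refers to.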

The number and diversity of Process Analytics applications is growing fast, impacting areas ranging from process operations to strategic planning or supply chain management. However, this field has not yet reached a maturity level characterized by a stable, organized and consolidated body of knowledge for handling the main classes of problems that need to be faced. Data-Driven Process Systems Engineering and Process Analytics only recently received wider recognition, becoming a regular presence in journals and conferences. As a tribute to the groundbreaking Process Analytics contributions of George Stephanopoulos, namely through his academic tree, to which we are proud to belong, this article aims to contribute to the systematization and consolidation of this field in the broad PSE scope, starting from a fundamental understanding of the key challenges facing it, and constructing from them a workflow that can flexibly be adapted to handle different problems, aimed at supporting value creation through good decision-making. In this path, we base our foresight and conceptual framework on the authors’ experience, as well as on contributions from other researchers that, across the world, have been collectively pushing forward Data-Driven Process Systems Engineering.

This paper is a systematic review of the literature on statistical process control (SPC) implementation in the food industry. Using systematic searches across three decades of publications, 41 journal articles were selected for the review. Key findings of the review include motivations and benefits (reducing product defects and complying with food laws and regulations), barriers (high resistance to change and a lack of sufficient statistical knowledge), and limitations (an absence of statistical thinking and a dearth of SPC implementation guidelines). Further findings highlight the predominance of publications from the USA and the UK within this topic. Future research directions concerning SPC implementation issues, as well as a ready reference of the SPC literature in the food manufacturing industry, are also discussed.

This paper studies an important type of multi-factor experimental design, the split-blocks design. We use an orthogonal matrix to transform the observation matrix, derive the analysis of variance for the multivariate split-blocks design, and discuss the test statistics for the Lawley-Hotelling test, Roy's union-intersection test, and Wilks' lambda test.

A systematization of complex data structures, originated in current processes and product analysis activities, is presented as a starting point for developing generalized and flexible frameworks for handling such challenging information sources. In this article, we present an abstract and unifying definition for a class of multidimensional data arrays (here called profiles), built upon which a taxonomy is proposed for their classification, according to aspects relevant for the development of software platforms, namely, the underlying data structure characteristics and the nature of information they convey. Such taxonomy is based on an extensive bibliographic review involving the analysis of complex datasets. Then, we identify the classes of profiles with higher demand (dominant classes) and, for these, conduct a comparison study involving those methodologies that could be applied in the context of two tasks: calibration/regression and process monitoring. The comparison study showed that the Tucker3 and N-way Partial Least Squares methods are good candidates to incorporate a computational framework addressing the dominant classes of profiles that were identified. The purpose of this ongoing work is to provide fundamental information for developing new software tools able to handle a broad scope of problems, involving the type of complex datasets found in practice, as well as on how to modularly add features regarding the treatment of other, less frequent, classes, to add value to users with more specific interests. Copyright © 2012 John Wiley & Sons, Ltd.

In most statistical process control (SPC) applications, it is assumed that the quality of a process or product can be adequately represented by the distribution of a univariate quality characteristic or by the general multivariate distribution of a vector consisting of several correlated quality characteristics. In many practical situations, however, the quality of a process or product is better characterized and summarized by a relationship between a response variable and one or more explanatory variables. Thus, at each sampling stage, one observes a collection of data points that can be represented by a curve (or profile). In some calibration applications, the profile can be represented adequately by a simple straight-line model, while in other applications, more complicated models are needed. In this expository paper, we discuss some of the general issues involved in using control charts to monitor such process-and product-quality profiles and review the SPC literature on the topic. We relate this application to functional data analysis and review applications involving linear profiles, nonlinear profiles, and the use of splines and wavelets. We strongly encourage research in profile monitoring and provide some research ideas.
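As a minimal illustration of the parameter-based profile monitoring discussed in this review, one can fit each sampled profile by least squares in Phase I and place limits on the estimated slopes (the data and 3-sigma-style limits below are purely illustrative):

```python
import random

def slopes(profiles, x):
    """OLS slope of each profile (list of y-vectors) over common design points x."""
    n = len(x)
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    out = []
    for y in profiles:
        ybar = sum(y) / n
        out.append(sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx)
    return out

def phase1_limits(values, width=3.0):
    """Mean +/- width * sd limits computed from the Phase I estimates themselves."""
    m = len(values)
    mean = sum(values) / m
    sd = (sum((v - mean) ** 2 for v in values) / (m - 1)) ** 0.5
    return mean - width * sd, mean + width * sd
```

Profiles whose slope falls outside the limits are flagged, removed after investigation, and the limits recomputed, which is the usual iterative Phase I workflow described in the review.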

"A Scientific Approach to the Design of Experiments One Factor Designs Factorial Designs Nested Designs Restrictions on Randomization Play it Again, Sam Two Level Fractional Designs Other Fractional Designs Response Surface Designs Appendices Key Word Index "

Robust design studies with functional responses are becoming increasingly common. The goal in these studies is to analyze location and dispersion effects and optimize performance over a range of input-output values. Taguchi and others have proposed the so-called signal-to-noise ratio analysis for robust design with dynamic characteristics. We consider more general and flexible methods for analyzing location and dispersion effects from such studies and use three real applications to illustrate the methods. Two applications demonstrate the usefulness of functional regression techniques for location and dispersion analysis while the third illustrates a parametric analysis with two-stage modeling. Both a mean-variance analysis for random selection of noise settings as well as a control-by-noise interaction analysis for explicitly controlled noise factors are considered.

Many industrial experiments involve the blending of ingredients and the changing of process conditions to produce end products of highest quality. Such experiments are known as mixture experiments with process variables.
This paper discusses the analysis of data generated from mixture experiments with process variables where the design is of the split-plot type. Two examples are given of experiments consisting of three mixture components and two process variables to illustrate the testing of hypotheses concerning the coefficients in the combined mixture components-process variables model.

Statistical process control methods for monitoring processes with multivariate measurements in both the product quality variable space and the process variable space are considered. Traditional multivariate control charts based on χ2 and T2 statistics are shown to be very effective for detecting events when the multivariate space is not too large or ill-conditioned. Methods for detecting the variable(s) contributing to the out-of-control signal of the multivariate chart are suggested. Newer approaches based on principal component analysis and partial least squares are able to handle large, ill-conditioned measurement spaces; they also provide diagnostics which can point to possible assignable causes for the event. The methods are illustrated on a simulated process of a high pressure low density polyethylene reactor, and examples of their application to a variety of industrial processes are referenced.

The article describes a case study based on implementation of in-line near infrared (NIR) spectroscopy using fibre-optic transmittance probes. This study is based on an experimental design incorporating raw materials and process variables, a split-plot design. The results show that in-line NIR spectroscopy can be used for monitoring and predicting parameters related to both the input parameters (raw materials and process variables) and the final product quality (viscosity of the product) for emulsion-based products in the food industry. However, the results are dependent on the proper choice of validation of the calibration models. The article also proposes a way to update calibration models in situations when new changes not accounted for in the calibration model are experienced.

Motivated by specific problems involving radar-range profiles, we suggest techniques for real-time discrimination in the context of signal analysis. The key to our approach is to regard the signals as curves in the continuum and employ a functional data-analytic (FDA) method for dimension reduction, based on the FDA technique for principal coordinates analysis. This has the advantage, relative to competing methods such as canonical variates analysis, of providing a signal approximation that is best possible, in an L2 sense, for a given dimension. As a result, it produces particularly good discrimination. We explore the use of both nonparametric and Gaussian-based discriminators applied to the dimension-reduced data.
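The dimension-reduce-then-discriminate idea can be sketched with ordinary PCA standing in for the paper's principal coordinates analysis, followed by a simple nearest-centroid rule (the synthetic sine/cosine "signals" and all settings below are illustrative):

```python
import numpy as np

def pca_fit(X, n_comp):
    """Principal axes of X (rows = signals) via the covariance eigendecomposition."""
    mu = X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    W = vecs[:, ::-1][:, :n_comp]   # columns = leading components
    return mu, W

def nearest_centroid(scores, labels, new_scores):
    """Classify each row of new_scores by the closest class centroid."""
    classes = sorted(set(labels))
    cents = {c: scores[[l == c for l in labels]].mean(axis=0) for c in classes}
    return [min(classes, key=lambda c: np.linalg.norm(s - cents[c]))
            for s in new_scores]
```

Working in the low-dimensional score space makes the distance computations cheap enough for the real-time setting the abstract targets.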

In this article the two-way array interaction model with one observation per cell is discussed. The model is given by y_ij = μ + τ_i + β_j + λα_iγ_j + ε_ij. Likelihood ratio tests are presented for two hypotheses: (1) no interaction (λ = 0) and (2) equality of treatments (τ1 = τ2 = … = τt). Maximum likelihood estimators are also given for all parameters, including σ, when λ ≠ 0.

A batch process is finite in duration and can be separated into two stages: startup and production. We develop a methodology to monitor a batch process during the startup stage to reduce the length of the startup stage. We focus on processes that are characterized by multiple process parameters and product characteristics. Because of the complex interdependencies characterizing the process parameters and product characteristics, it is more effective to evaluate them simultaneously. To address the multivariate nature of the process we use a multivariate statistical model: PLS (Projection to Latent Structures). PLS has been applied to several applications in statistical process monitoring. We present a new application of PLS to the startup stage of a batch process. Iterative adjustments made during startup in search of an acceptable production zone consume considerable amounts of material, labor and equipment time. We develop a monitoring procedure to reduce the time as well as the number of iterations and adjustments needed for startup. A PLS model is constructed, using baseline data, to characterize the relationship among process parameters during good production. The startup stage is monitored using the PLS characterization to determine if the process is consistent with good production. We illustrate the improved startup operations with an example from a batch process in filament extrusion, the application that motivates this work. Copyright © 2001 John Wiley & Sons, Ltd.

With the goal of understanding global chemical processes, environmental chemists have some of the most complex sample analysis problems. Multivariate calibration is a tool that can be applied successfully in many situations where traditional univariate analyses cannot. The purpose of this paper is to review multivariate calibration, with emphasis on developments in recent years. The inverse and classical models are discussed briefly, with the main emphasis on the biased calibration methods. Principal component regression (PCR) and partial least squares (PLS) are discussed, along with methods for quantitative and qualitative validation of the calibration models. Non-linear PCR, non-linear PLS and locally weighted regression are presented as calibration methods for non-linear data. Finally, calibration techniques using a matrix of data per sample (second-order calibration) are discussed briefly.
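A minimal sketch of the inverse-model PCR the review covers, on simulated spectra. The peak shape, noise level, and three-component choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated calibration set: 50 "spectra" over 200 wavelengths; the analyte
# concentration c scales a Gaussian absorption peak
c = rng.uniform(0.0, 1.0, size=50)
peak = np.exp(-0.5 * ((np.arange(200) - 100) / 10.0) ** 2)
X = np.outer(c, peak) + 0.01 * rng.normal(size=(50, 200))

# PCR (inverse model): regress c on scores of the leading principal components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
T = Xc @ Vt[:k].T                    # scores on the first k PCs
b, *_ = np.linalg.lstsq(np.column_stack([np.ones(50), T]), c, rcond=None)

# Predict an unseen spectrum whose true concentration is 0.5
x_new = 0.5 * peak + 0.01 * rng.normal(size=200)
t_new = (x_new - X.mean(axis=0)) @ Vt[:k].T
c_hat = float(b[0] + t_new @ b[1:])
print(round(c_hat, 2))
```

Truncating to k components is what makes the estimator biased but stable when the wavelengths are highly collinear, which is the trade-off the review discusses.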

The present paper is a study of the use of robust design methodology in a three-component mixture experiment containing two process variables. The mixture components are three wheat flours and the process variables are mixing time and proofing time of the dough. The main focus is on comparing three different techniques for analyzing the loaf volume, which is one of the key parameters of bread quality. The first technique considers the mean square error (MSE) computed across the levels of the process variables in the noise array, for each three component blend in the control array. The second method is based on analysis of variance (ANOVA) of the three factors, flour blend, mixing time and proofing time. The third method is a regression approach, where all factors (the three mixture components and two process variables) are modeled simultaneously.
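The first technique can be sketched as follows; the design size, target volume, and data are invented for illustration and are not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical crossed design: 6 flour blends (control array) evaluated at
# 4 mixing-time x proofing-time settings (noise array); response = loaf volume
volume = rng.normal(loc=550.0, scale=20.0, size=(6, 4))
target = 560.0

# Technique 1: mean square error about the target loaf volume, computed
# across the noise-array settings for each blend in the control array
mse = ((volume - target) ** 2).mean(axis=1)
best_blend = int(np.argmin(mse))
print(best_blend, np.round(mse, 1))
```

The blend with the smallest MSE is closest to target on average while also being least sensitive to variation in mixing and proofing time.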

With process computers routinely collecting measurements on large numbers of process variables, multivariate statistical methods for the analysis, monitoring and diagnosis of process operating performance have received increasing attention. Extensions of traditional univariate Shewhart, CUSUM and EWMA control charts to multivariate quality control situations are based on Hotelling's T2 statistic. Recent approaches to multivariate statistical process control which utilize not only product quality data (Y), but also all of the available process variable data (X) are based on multivariate statistical projection methods (Principal Component Analysis (PCA) and Partial Least Squares (PLS)). This paper gives an overview of these methods, and their use for the statistical process control of both continuous and batch multivariate processes. Examples are provided of their use for analysing the operations of a mineral processing plant, for on-line monitoring and fault diagnosis of a continuous polymerization process and for the on-line monitoring of an industrial batch polymerization reactor.
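The Hotelling T² statistic that these multivariate extensions build on can be sketched as follows, using simulated in-control data and the common large-sample chi-square control limit; the dimensions and shift size are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# In-control reference data: 200 observations of 4 correlated quality variables
Sigma = 0.5 * np.eye(4) + 0.5                  # unit variances, 0.5 correlations
X = rng.multivariate_normal(np.zeros(4), Sigma, size=200)
xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def t2(x):
    """Hotelling's T2 statistic for a new observation x."""
    d = x - xbar
    return float(d @ S_inv @ d)

ucl = stats.chi2.ppf(0.9973, df=4)   # common large-sample upper control limit

x_ok = np.zeros(4)                   # on-target observation
x_shift = np.full(4, 4.0)            # large mean shift in every variable
print(t2(x_ok) < ucl, t2(x_shift) > ucl)
```

A single T² chart accounts for the correlation structure among the variables, which separate univariate Shewhart charts on each variable would ignore.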

The goal of the presented study is two-fold. First, we want to emphasize the power of Near Infrared Reflectance (NIR) spectroscopy for discrimination between mayonnaise samples containing different vegetable oils. Secondly, we want to use our data to compare the performances of different classification procedures. The NIR spectra with 351 variables correspond to equally spaced wavelengths in the 1100–2500 nm area. Feature extraction both by automatic wavelength-selection and by projection onto principal components (PCs) is discussed. The discriminant methods considered are linear discriminant analysis (LDA), quadratic discriminant analysis (QDA) and regression with categorical {0,1}-responses. A dataset containing 162 spectra of mayonnaise samples based on six different vegetable oils is analyzed. By LDA with authentic cross-validation (PC-models re-estimated for each cross-validation segment), only one sample was misclassified. Classification by allocating a sample according to the largest fitted value of a linear regression (Discriminant-Partial least squares (DPLS) or Discriminant-Principal components regression (DPCR)) is demonstrated to be sub-optimal compared to LDA of the corresponding PLS- or PCR-scores. QDA significantly outperforms LDA for projections of the data onto subspaces of moderate size (scores of 7–9 PCs). Two automatic variable-selection procedures choose 16 and 26 wavelengths (variables), respectively, from the spectra. Based on the selected wavelengths, LDA gives considerably better classification than the regression approach.
By reporting the performances of several feature extraction techniques in tandem with three of the most common classification methods, we hope that the reader will notice two relevant aspects: (1) By using the DPLS and DPCR (classification by `dummy' regressions) one is exposed to a significant risk of obtaining sub-optimal classification results; (2) The automatic wavelength selections may give valuable information about what is actually causing a successful discrimination. Such knowledge can, for instance, be used to select the most suited filters for online applications of NIR. Besides demonstrating different classification strategies, our study clearly shows that classification methods with NIR spectra can be used to discriminate between mayonnaise samples of different oil types and fatty acid composition.
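The PC-scores-plus-LDA strategy can be sketched on simulated two-class spectra; this is not the mayonnaise dataset, and the bump positions, noise level, and five-component choice are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)

# Two classes of simulated "spectra" over 100 wavelengths, each with a
# class-specific absorption bump plus measurement noise
wave = np.arange(100)
bump_a = np.exp(-0.5 * ((wave - 30) / 5.0) ** 2)
bump_b = np.exp(-0.5 * ((wave - 70) / 5.0) ** 2)
X = np.vstack([bump_a + 0.05 * rng.normal(size=(40, 100)),
               bump_b + 0.05 * rng.normal(size=(40, 100))])
y = np.repeat([0, 1], 40)

# Feature extraction by projection onto PCs, then LDA in the score space
scores = PCA(n_components=5).fit_transform(X)
lda = LinearDiscriminantAnalysis().fit(scores, y)
print(lda.score(scores, y))   # training accuracy
```

Fitting LDA on a handful of scores rather than on all wavelengths keeps the within-class covariance estimate well-conditioned, which is why the score-space route tends to beat dummy regression on the raw spectra.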

The main goal of this article is to present a general framework for looking at an industrial experimental problem—starting from the problem definition stage, utilizing an appropriate experimental design, taking proper response measurements using techniques that describe the desired phenomenon that is to be studied and finally analyzing the data using multivariate techniques. We have focused on three main elements: (1) detect the origin of the different response characteristics in a process, the critical control points; (2) relate the response measurements to Near-Infrared spectroscopy (NIRS); and (3) establish the relationship between the response measurements taken on the final product and the NIR spectra collected for each sample at different sampling points along the process line. The results show the suitability of NIRS not only as a rapid measurement technique for detecting changes in the final product quality at an early stage, but also for process control at the critical control point.

A tutorial on the partial least-squares (PLS) regression method is provided. Weak points in some other regression methods are outlined and PLS is developed as a remedy for those weaknesses. An algorithm for a predictive PLS and some practical hints for its use are given.
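One common NIPALS-style formulation of predictive PLS1 can be sketched as follows. This is an assumed textbook form, not the tutorial's own listing, and the regression problem is simulated.

```python
import numpy as np

def pls1_nipals(X, y, n_components):
    """PLS1 regression via NIPALS: extract and deflate latent components."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w = w / np.linalg.norm(w)        # weight vector: covariance direction
        t = X @ w                        # score vector
        p = X.T @ t / (t @ t)            # X-loading
        q_a = (y @ t) / (t @ t)          # y-loading
        X = X - np.outer(t, p)           # deflate X
        y = y - q_a * t                  # deflate y
        W.append(w); P.append(p); q.append(q_a)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # coefficients for centered data

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 10))
beta = np.zeros(10); beta[:3] = [1.0, -2.0, 0.5]
y = X @ beta + 0.01 * rng.normal(size=60)

b = pls1_nipals(X, y, n_components=3)
yc = y - y.mean()
resid = yc - (X - X.mean(axis=0)) @ b
r2 = 1 - (resid @ resid) / (yc @ yc)
print(round(float(r2), 3))
```

Each component is chosen along the direction of maximal covariance between X and y, which is what gives PLS its stability advantage over ordinary least squares on collinear predictors.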

- Milliken G. A.