Article

Batch process monitoring and its application to polymerization systems

Abstract

Slight changes in raw material properties or operating conditions during critical periods of operation of batch and semi-batch polymerization reactors may have a strong influence on reaction mechanism and impact final product quality. Online process monitoring, fault detection, fault diagnosis, and product quality prediction in real-time ensure safe reactor operation and warn operators about excursions from normal operation that may lead to deterioration in product properties. Multivariate statistical process monitoring and quality prediction using multiway principal components analysis and multiway partial least squares have been successful in detecting abnormalities in process operation and product quality. When abnormal process operation is detected, fault diagnosis tools are used to determine the source cause of the deviation. Illustrative case studies are presented via simulated polyvinyl acetate polymerization.
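As a rough illustration of the multiway approach described in the abstract, the sketch below unfolds a three-way array of past successful batches (batches × variables × time points), fits a PCA model to the scaled data, and computes Hotelling's T² and SPE (Q) statistics for a completed batch. The array shapes, autoscaling, component count, and function names are illustrative assumptions, not the authors' polyvinyl acetate case study.

```python
# Minimal MPCA-style batch monitoring sketch (illustrative, not the paper's code).
import numpy as np

def fit_mpca(X, n_components=3):
    """X: (I batches, J variables, K time points) array of good batches."""
    I, J, K = X.shape
    Xu = X.reshape(I, J * K)                 # batch-wise unfolding
    mean = Xu.mean(axis=0)
    std = Xu.std(axis=0, ddof=1) + 1e-12     # guard against constant columns
    Z = (Xu - mean) / std
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                  # loadings, shape (J*K, R)
    T = Z @ P                                # scores of the training batches
    return {"mean": mean, "std": std, "P": P,
            "score_var": T.var(axis=0, ddof=1)}

def monitor_batch(model, x_new):
    """x_new: (J, K) trajectories of a completed batch -> (T^2, SPE)."""
    z = (x_new.ravel() - model["mean"]) / model["std"]
    t = z @ model["P"]
    t2 = float(np.sum(t ** 2 / model["score_var"]))   # Hotelling's T^2
    residual = z - model["P"] @ t
    return t2, float(residual @ residual)             # SPE (Q-statistic)
```

A batch whose T² or SPE exceeds the control limits estimated from the training batches would be flagged for fault diagnosis, for example with contribution plots.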

Chapter
Chemical engineering principles apply widely in the development and production of biologics. This chapter describes in greater detail the various unit operations that are typically involved in the production of biologics and the specific fundamental principles associated with these unit operations. The complexity of protein drugs and their biological production systems poses extraordinary challenges to the chemical engineers responsible for industrial-scale manufacturing of these products. Design, development, and operation of these processes require a complete understanding of, and appreciation for, the intricacies of proteins and living cells. The chapter examines how concepts taught in traditional chemical engineering are applied to design equipment and develop processes that allow modern-day mass manufacture of biotechnology products. Large-scale protein manufacturing utilizes a series of unit operations to grow cells, produce product, and isolate and purify product from the cell culture. The chapter examines these steps in more detail and discusses the chemical engineering challenges associated with them.
Chapter
This overview of process fault diagnosis concentrates on steady-state processes, continuous dynamic processes, and batch processes. For steady-state processes, the classic linear model for process fault diagnosis based on the use of principal component analysis is discussed in some detail, followed by extensions of this model to nonlinear steady-state (non)Gaussian processes. These extensions include higher-order statistical models, such as those based on the use of independent components, as well as principal curves and surfaces and neural networks as nonlinear extensions of principal component analysis. Likewise, innovations and applications of kernel methods are considered, among them kernel principal component analysis, kernel partial least squares, kernel independent component analysis, and multiple kernel learning variants of some of these approaches. Continuous dynamic processes are considered in terms of manifold models, adaptive methods, and phase space methods, where the application of diagnostics such as the correlation dimension and recurrence quantification analysis has been proposed. The multitude of recent developments in batch processing is similarly reviewed in terms of the multiway principal component model, extended to multiphase and multiblock models. These developments are considered in the broad framework outlined in Chap. 1.
Book
This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data projections. Topics and features: discusses machine learning frameworks based on artificial neural networks, statistical learning theory and kernel-based methods, and tree-based methods; examines the application of machine learning to steady state and dynamic operations, with a focus on unsupervised learning; describes the use of spectral methods in process fault diagnosis.
Article
A novel networked process monitoring, fault propagation identification, and root cause diagnosis approach is developed in this study. First, process network structure is determined from prior process knowledge and analysis. The network model parameters including the conditional probability density functions of different nodes are then estimated from process operating data to characterize the causal relationships among the monitored variables. Subsequently, the Bayesian inference‐based abnormality likelihood index is proposed to detect abnormal events in chemical processes. After the process fault is detected, the novel dynamic Bayesian probability and contribution indices are further developed from the transitional probabilities of monitored variables to identify the major faulty effect variables with significant upsets. With the dynamic Bayesian contribution index, the statistical inference rules are, thus, designed to search for the fault propagation pathways from the downstream backwards to the upstream process. In this way, the ending nodes in the identified propagation pathways can be captured as the root cause variables of process faults. Meanwhile, the identified fault propagation sequence provides an in‐depth understanding as to the interactive effects of faults throughout the processes. The proposed approach is demonstrated using the illustrative continuous stirred tank reactor system and the Tennessee Eastman chemical process with the fault propagation identification results compared against those of the transfer entropy‐based monitoring method. The results show that the novel networked process monitoring and diagnosis approach can accurately detect abnormal events, identify the fault propagation pathways, and diagnose the root cause variables. © 2013 American Institute of Chemical Engineers AIChE J, 59: 2348–2365, 2013
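The networked monitoring scheme above rests on probability densities estimated for each node from normal operating data. The toy sketch below shows only the simplest ingredient, a Gaussian likelihood-based abnormality index for a single monitored variable; the conditional (parent-dependent) densities, the dynamic Bayesian contribution indices, and the propagation-path search of the paper are not reproduced, and all names and thresholds are assumptions.

```python
# Toy likelihood-based abnormality index for one monitored variable
# (illustrative only; not the paper's Bayesian network formulation).
import numpy as np

def fit_node(normal_samples):
    """Estimate a Gaussian density for one variable from normal operating data."""
    return float(np.mean(normal_samples)), float(np.std(normal_samples, ddof=1))

def abnormality_index(x, mean, std):
    """Negative log-likelihood of the new value; large values flag abnormality."""
    return 0.5 * ((x - mean) / std) ** 2 + np.log(std * np.sqrt(2.0 * np.pi))
```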
Chapter
Chapter outline: Why are Biologics Unique from a Mass, Heat, and Momentum Transfer Standpoint?; Scale-Up Approaches and Associated Challenges in Biologics Manufacturing; Challenges in Large-Scale Protein Manufacturing; Example 3.1; Example 3.2; Specialized Applications of Chemical Engineering Concepts in Biologics Manufacturing; Conclusions; Acknowledgments; References.
Article
This work presents results on the problem of robust on-line diagnosis of abnormal situations in the case of the free-radical solution polymerization of styrene in a continuous well-mixed reactor operating under feedback control. The control system consists of a model predictive control strategy in conjunction with a ratio control law. The proposed fault diagnosis system is based on an open-loop approach, which uses a linearized model of the polymerization process for the implementation of a bank of reduced-order unknown input observers. Each observer is designed to detect changes in a particular process parameter, external disturbance or malfunctioning of sensors and actuators. Fault isolation is obtained through a structured residual approach; meanwhile, fault estimation is performed using the degree of freedom remaining in the observer design. The effectiveness of the proposed technique is verified through numerical simulations carried out on a well-defined industrial styrene polymerization benchmark.
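The core mechanism of such observer-based diagnosis is a bank of observers, each tuned to a different fault hypothesis, whose residual pattern isolates the fault. The sketch below uses plain discrete-time Luenberger observers with placeholder matrices rather than the reduced-order unknown input observers of the paper, simply to show how the residual bank is generated.

```python
# Residual generation with a bank of Luenberger observers (illustrative;
# A, B, C and the gains are placeholders, not the styrene benchmark model).
import numpy as np

def observer_bank_residuals(A, B, C, gains, u_seq, y_seq):
    """Run one observer per gain matrix L and collect its output residuals."""
    n = A.shape[0]
    residual_bank = []
    for L in gains:                                # one observer per fault hypothesis
        x_hat = np.zeros(n)
        residuals = []
        for u, y in zip(u_seq, y_seq):
            r = y - C @ x_hat                      # innovation / residual
            residuals.append(r)
            x_hat = A @ x_hat + B @ u + L @ r      # observer state update
        residual_bank.append(np.array(residuals))
    return residual_bank
```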
Book
Contents overview: Statistical Inference; Different Types of Statistical Intervals: An Overview; The Assumption of Sample Data; The Central Role of Practical Assumptions Concerning "Representative Data"; Enumerative versus Analytic Studies; Basic Assumptions for Enumerative Studies; Additional Aspects of Analytic Studies; Convenience and Judgment Samples; Sampling People; Infinite Population Assumptions; Practical Assumptions: Overview; Practical Assumptions: Further Example; Planning the Study; The Role of Statistical Distributions; The Interpretation of a Statistical Interval; Comments Concerning Subsequent Discussion.
Article
Due to sophisticated experimental designs and to modern instrumental constellations the investigation of N-dimensional (or N-way or N-mode) data arrays is attracting more and more attention. Three-dimensional arrays may be generated by collecting data tables with a fixed set of objects and variables under different experimental conditions, at different sampling times, etc. Stacking all the tables along varying conditions provides a cubic arrangement of data. Accordingly the three index sets or modes spanning a three-way array are called objects, variables and conditions. In many situations of practical relevance even higher-dimensional arrays have to be considered. Among numerous extensions of multivariate methods to the three-way case the generalization of principal component analysis (PCA) has central importance. There are several simplified approaches of three-way PCA by reduction to conventional PCA. One of them is unfolding of the data array by combining two modes to a single one. Such a procedure seems reasonable in some specific situations like multivariate image analysis, but in general combined modes do not meet the aim of data reduction. A more advanced way of unfolding which yields separate component matrices for each mode is the Tucker 1 method. Some theoretically based models of reduction to two-way PCA impose some specific structure on the array. A proper model of three-way PCA was first formulated by Tucker (so-called Tucker 3 model among other proposals). Unfortunately the Tucker 1 method is not optimal in the least squares sense of this model. Kroonenberg and De Leeuw demonstrated that the optimal solution of Tucker's model obeys an interdependent system of eigenvector problems and they proposed an iterative scheme (alternating least squares algorithm) for solving it. With appropriate notation Tucker's model as well as the solution algorithm are easily generalized to the N-way case (N > 3). There are some specific aspects of three-way PCA, such as complicated ways of data scaling or interpretation and simple-structure-transformation of a so-called core matrix, which make it more difficult to understand than classical PCA. An example from water chemistry serves as an illustration. Additionally, there is an application section demonstrating several rules of interpretation of loading plots with examples taken from environmental chemistry, analysis of complex round robin tests and contamination analysis in tungsten wire production.
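To make the idea of separate component matrices per mode concrete, the sketch below computes a Tucker-type decomposition from truncated SVDs of the three mode unfoldings (a HOSVD, which is the usual starting point for the alternating least squares iterations of Kroonenberg and De Leeuw rather than the converged solution). The ranks and the example array shape are illustrative assumptions.

```python
# Tucker-style three-way decomposition via truncated SVDs of the unfoldings
# (HOSVD initialisation; illustrative sketch only).
import numpy as np

def mode_unfold(X, mode):
    """Unfold a three-way array along one mode (0, 1, or 2)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def tucker3_hosvd(X, ranks=(2, 2, 2)):
    """Return component matrices A, B, C (one per mode) and the core array G."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(mode_unfold(X, mode), full_matrices=False)
        factors.append(U[:, :r])
    A, B, C = factors
    G = np.einsum('ijk,ia,jb,kc->abc', X, A, B, C)   # core array
    return A, B, C, G

def tucker3_reconstruct(A, B, C, G):
    """Rebuild the array from the model to check the fit."""
    return np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
```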
Article
Producing high-quality, value-added products is a common objective across industries. This objective is more difficult to achieve in batch processes whose key quality measurements are not available on-line. In order to reduce variations in product quality, an on-line batch monitoring scheme is developed based on multivariate statistical process control. It uses only the past measured process variables, without requiring real-time quality measurements before the end of the batch run. The method, referred to as BDPCA and BDPLS, integrates time-lagged windows of process dynamic behavior with principal component analysis and partial least squares, respectively, for on-line batch monitoring. Like traditional MPCA and MPLS approaches, the only information needed to set up the control chart is the historical data collected from past successful batches. This leads to simple monitoring charts, easy tracking of the progress of each batch run, and monitoring of the occurrence of observable upsets. The BDPCA and BDPLS models use only the data already collected during the batch run, avoiding expensive computations to anticipate future measurements. Three examples are used to investigate the potential application of the proposed method and to compare it with some traditional on-line MPCA and MPLS algorithms.
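The key construction behind a BDPCA/BDPLS-type model is a time-lagged data matrix built from measurements already collected. A minimal sketch, with an assumed lag order and without the covariance-averaging details of the actual algorithm:

```python
# Time-lagged (dynamic) data matrix and a pooled PCA fit (illustrative sketch).
import numpy as np

def lagged_matrix(X, d):
    """X: (K time points, J variables) for one batch -> (K-d, J*(d+1)) matrix
    whose rows stack the current sample and the d previous samples."""
    K, _ = X.shape
    return np.array([X[k - d:k + 1][::-1].ravel() for k in range(d, K)])

def fit_bdpca(batches, d=2, n_components=3):
    """Pool the lagged matrices of past successful batches and fit PCA."""
    Z = np.vstack([lagged_matrix(b, d) for b in batches])
    Z = Z - Z.mean(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Vt[:n_components].T            # loadings used for on-line monitoring
```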
Article
A semibatch flow scheduling strategy proposed by Teymour and Ray (1989, 1996) is evaluated for a polymerization reaction conducted in a pilot-plant reactor. The reaction used is the free radical terpolymerization of styrene, α-methyl styrene, and acrylic acid monomers initiated by an organic peroxide initiator and carried out in the presence of a reactive glycol ether solvent. This strategy was tested in both single batch and sequential semibatch modes. The process was shown to produce polymer of constant molecular weight properties and composition as inferred from acid number and monomer conversion measurements. This process could be used for obtaining polymer products from a semibatch reactor that are of comparable quality to CSTR products. Results indicate success of this process at meeting this objective; however, practical considerations relating to agitation and temperature control need to be properly addressed to ensure this success.
Article
The industrial application of a new monitoring scheme for batch and semi-batch processes is presented. Multi-way Principal Component Analysis is used to analyze the information from the on-line process measurements. The basic idea is to build a statistical model based on process measurements from past successful batches, which describes the normal operation of the process. Subsequently future batches are compared against this model and characterized as normal or abnormal. The algorithms and all the design equations are presented for setting up Statistical Process Control charts which monitor the performance of a batch process. Contribution plots for detected abnormal operations are developed to identify the measurement variables and time periods of abnormal operation.
Article
The power of multivariate statistical methodologies, namely, MPCA (multiway principal component analysis) and MPLS (multiway projection to latent structures or multiway partial least squares), for batch process analysis, monitoring, fault diagnosis, product quality prediction, and improved process insight is illustrated. These techniques were successfully applied to an industrial emulsion polymerization batch process. One key feature of this work is that reaction extent was used as the common reference scale to align batches with varying time durations. MPCA/MPLS technology (1) detected potential process abnormalities, (2) determined the time an abnormal event occurred, and (3) indicated the likely variable or variables which caused the abnormality. The results also indicated that variations in an ingredient trajectory and heat removal variables were primarily associated with viscosity variability. The resultant PLS model predicted the product viscosities to within measurement error. Process knowledge played a key role in variable selection and interpretation of the results.
Article
The dynamics of semibatch reactors is often neglected in the literature despite their industrial importance. This article analyzes the stability and dynamic behavior of semibatch polymerization reactors operated according to a flow scheduling strategy designed to impart a steady-state nature to the dynamics of these essentially transient reactors. It can also improve the operation of these reactors, especially the quality of the polymer produced (such as molecular weight distribution and polymer composition). In the proposed strategy of flow rate scheduling, the intensive states of the reactor can be made to reach steady-state values. A comparison of the dynamics of the proposed and classical operating strategies illustrates this possibility. Further dynamic analysis reveals the emergence of phenomena characteristic of continuous operation in a CSTR. Examples are multiplicity of the trajectories, limit cycle oscillations, as well as nonhomogeneous oscillations belonging to a period doubling cascade. Operation in a sequential semibatch mode is discussed, as well as the importance of selecting parameters for the fillup and discharge operation. In this mode of operation either periodic or chaotic behavior is obtained. The effect of both on polymer properties is studied in detail.
Article
The application of dynamic time warping (DTW) to the analysis and monitoring of batch processes is presented. This dynamic-programming-based technique has been used in the area of speech recognition for the recognition of isolated and connected words. DTW has the ability to synchronize two trajectories by appropriately translating, expanding, and contracting localized segments within both trajectories to achieve a minimum distance between the trajectories. Batch processes often are characterized by unsynchronized trajectories, due to the presence of batch-to-batch disturbances and the existence of physical constraints. To compare these batch histories and apply statistical analysis, one needs to reconcile the timing differences among these trajectories. This can be achieved using DTW with only a minimal amount of process knowledge. The combination of DTW and a monitoring method based on multiway PCA/PLS is used for both off-line and on-line implementation. Data from an industrial polymerization reactor are used to illustrate the implementation and the performance of this method.
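A minimal dynamic-programming sketch of the warping step is shown below, assuming Euclidean local distances and unweighted symmetric moves; the endpoint and slope constraints discussed in the speech-recognition literature are omitted.

```python
# Basic dynamic time warping between a reference and a test trajectory
# (illustrative sketch; variable weighting and constraints omitted).
import numpy as np

def dtw_align(ref, test):
    """ref: (K1, J), test: (K2, J) arrays. Returns (warping path, total distance)."""
    K1, K2 = len(ref), len(test)
    D = np.full((K1 + 1, K2 + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, K1 + 1):
        for j in range(1, K2 + 1):
            d = np.linalg.norm(ref[i - 1] - test[j - 1])          # local distance
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the optimal path from the end point
    path, i, j = [], K1, K2
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1], float(D[K1, K2])
```

The aligned (warped) trajectories can then be stacked into the usual three-way array and analyzed with MPCA/MPLS as in the other references.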
Article
Multivariate statistical procedures for monitoring the progress of batch processes are developed. The only information needed to exploit the procedures is a historical database of past successful batches. Multiway principal component analysis is used to extract the information in the multivariate trajectory data by projecting them onto low-dimensional spaces defined by the latent variables or principal components. This leads to simple monitoring charts, consistent with the philosophy of statistical process control, which are capable of tracking the progress of new batch runs and detecting the occurrence of observable upsets. The approach is contrasted with other approaches which use theoretical or knowledge-based models, and its potential is illustrated using a detailed simulation study of a semibatch reactor for the production of styrene-butadiene latex.
Article
Measurements collected from batch processes naturally produce a third-order or three-dimensional data form. The same structure also results when multiple samples are measured using hyphenated analysis techniques such as liquid chromatography with diode array detection. Analysis of third-order data by principal components analysis (PCA) is achieved by a nonunique rearrangement that produces a two-dimensional array. This preferentially models only one of the three orders present. In contrast, methods such as parallel factor analysis (PARAFAC) apply a particular decomposition that accounts for all three orders explicitly. The results from either approach should be related if data are to be interpreted reliably for applications to batch processes such as on-line monitoring and control. This work compares these two approaches from an applied point of view. To accomplish this objective, exemplary methods are selected from each type of analysis, parallel factor analysis (PARAFAC) and multiway principal components analysis (MPCA). These are employed to analyze data obtained during the manufacture of a condensation polymer in an industrial batch reactor.
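For comparison with the unfolding-based MPCA treatment, a bare-bones PARAFAC (CP) decomposition by alternating least squares is sketched below; the rank, random initialisation, and fixed iteration count are illustrative, and no convergence check or factor normalisation is included.

```python
# Bare-bones PARAFAC (CP) decomposition by alternating least squares
# (illustrative sketch only).
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J, r) and C (K, r) -> (J*K, r)."""
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def parafac(X, rank=2, n_iter=100, seed=0):
    """X: (I, J, K). Returns factor matrices A (I, r), B (J, r), C (K, r)."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                       # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)    # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)    # mode-3 unfolding
    for _ in range(n_iter):
        A = X1 @ np.linalg.pinv(khatri_rao(B, C)).T
        B = X2 @ np.linalg.pinv(khatri_rao(A, C)).T
        C = X3 @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C
```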
Article
Chemometrics, the application of mathematical and statistical methods to the analysis of chemical data, is finding ever widening applications in the chemical process environment. This article reviews the chemometrics approach to chemical process monitoring and fault detection. These approaches rely on the formation of a mathematical/statistical model that is based on historical process data. New process data can then be compared with models of normal operation in order to detect a change in the system. Typical modelling approaches rely on principal components analysis, partial least squares and a variety of other chemometric methods. Applications where the ordered nature of the data is taken into account explicitly are also beginning to see use. This article reviews the state-of-the-art of process chemometrics and current trends in research and applications.
Article
A new approach to monitoring batch processes using the process variable trajectories is presented. It was developed to overcome the need in the approach of Nomikos and MacGregor [P. Nomikos, J.F. MacGregor, Monitoring of batch processes using multi-way principal components analysis, Am. Inst. Chem. Eng. J. 40 (1994) 1361–1375; P. Nomikos, J.F. MacGregor, Multivariate SPC charts for batch processes, Technometrics 37 (1995) 41–59; P. Nomikos, J.F. MacGregor, Multi-way partial least squares in monitoring batch processes, Chemometrics Intell. Lab. Syst. 30 (1995) 97–108] for estimating or filling in the unknown part of the process variable trajectory deviations from the current time until the end of the batch. The approach is based on a recursive multi-block (hierarchical) PCA/PLS method which processes the data in a sequential and adaptive manner. The rate of adaptation is easily controlled with a parameter which controls the weighting of past data in an exponential manner. The algorithm is evaluated on industrial batch polymerization process data and is compared to the multi-way PCA/PLS approaches of Nomikos and MacGregor. The approach may have significant benefits when monitoring multi-stage batch processes where the latent variable structure can change at several points during the batch.
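The adaptive flavour of such an approach can be illustrated with an exponentially weighted PCA, in which a forgetting factor down-weights old data as the mean and covariance are updated; the multi-block (hierarchical) structure of the actual algorithm is omitted, and the parameter values below are assumptions.

```python
# Exponentially weighted (adaptive) PCA sketch with a forgetting factor
# (illustrative; not the recursive multi-block PCA/PLS algorithm itself).
import numpy as np

class AdaptivePCA:
    def __init__(self, n_vars, n_components=2, forget=0.95):
        self.mean = np.zeros(n_vars)
        self.cov = np.eye(n_vars)
        self.lam = forget          # closer to 1 = slower adaptation
        self.r = n_components

    def update(self, x):
        """Fold one new observation into the mean/covariance and return loadings."""
        self.mean = self.lam * self.mean + (1 - self.lam) * x
        dx = x - self.mean
        self.cov = self.lam * self.cov + (1 - self.lam) * np.outer(dx, dx)
        vals, vecs = np.linalg.eigh(self.cov)
        order = np.argsort(vals)[::-1][: self.r]
        return vecs[:, order]      # current loadings used for monitoring
```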
Article
This paper discusses contribution plots for both the D-statistic and the Q-statistic in multivariate statistical process control of batch processes. Contributions of process variables to the D-statistic are generalized to any type of latent variable model with or without orthogonality constraints. The calculation of contributions to the Q-statistic is discussed. Control limits for both types of contributions are introduced to show the relative importance of a contribution compared to the contributions of the corresponding process variables in the batches obtained under normal operating conditions. The contributions are introduced for off-line monitoring of batch processes, but can easily be extended to on-line monitoring and to continuous processes, as is shown in this paper.
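As a rough sketch of how such contributions can be computed for an unfolded batch observation, the snippet below gives per-variable squared residuals for the Q-statistic and a commonly used loading-weighted decomposition for the D-statistic; the control limits and the generalisation to arbitrary latent variable models developed in the paper are not reproduced.

```python
# Contribution plot ingredients for an unfolded, scaled batch observation
# (illustrative sketch; other contribution definitions exist).
import numpy as np

def q_contributions(z, P):
    """z: scaled observation (JK,); P: loadings (JK, R).
    Per-variable share of the Q-statistic (squared prediction error)."""
    residual = z - P @ (z @ P)
    return residual ** 2

def d_contributions(z, P, score_var):
    """Loading-weighted contributions of each variable to Hotelling's T^2."""
    t = z @ P
    return z * (P @ (t / score_var))
```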
Article
Multivariate statistical procedures for monitoring the progress of batch processes are developed. Multi-way partial least squares (MPLS) is used to extract the information from the process measurement variable trajectories that is more relevant to the final quality variables of the product. The only information needed is a historical database of past successful batches. New batches can be monitored through simple monitoring charts which are consistent with the philosophy of statistical process control. These charts monitor the batch operation and provide on-line predictions of the final product qualities. Approximate confidence intervals for the predictions from PLS models are developed. The approach is illustrated using a simulation study of a styrene-butadiene batch reactor.
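A minimal sketch of the quality-prediction side is given below, using scikit-learn's PLSRegression on batch-wise unfolded data; the on-line handling of partially completed batches and the approximate confidence intervals developed in the paper are not shown, and the shapes and component count are assumptions.

```python
# MPLS-style end-of-batch quality prediction via PLS on unfolded data
# (illustrative sketch only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def fit_mpls(X, Y, n_components=3):
    """X: (I, J, K) good-batch trajectories; Y: (I, q) final quality values."""
    I = X.shape[0]
    Xu = X.reshape(I, -1)                    # batch-wise unfolding
    pls = PLSRegression(n_components=n_components)
    pls.fit(Xu, Y)                           # centring/scaling handled internally
    return pls

def predict_quality(pls, x_new):
    """x_new: (J, K) completed batch; returns the predicted quality vector."""
    return pls.predict(x_new.reshape(1, -1))[0]
```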
Article
This is the first of two papers describing a study of the effect of departures from assumptions, other than normality, on the null-distribution of the F-statistic in the analysis of variance. In this paper, certain theorems required in the study and concerning the distribution of quadratic forms in multi-normally distributed variables are first enunciated and simple approximations tested numerically. The results are then applied to determine the effect of group-to-group inequality of variance in the one-way classification. It appears that if the groups are equal, moderate inequality of variance does not seriously affect the test. However, with unequal groups, much larger discrepancies appear. In a second paper, similar methods are used to determine the effect of inequality of variance and serial correlation between errors in the two-way classification.
Article
The technique of dynamic time warping for time registration of a reference and test utterance has found widespread use in the areas of speaker verification and discrete word recognition. As originally proposed, the algorithm placed strong constraints on the possible set of dynamic paths; namely, it was assumed that the initial and final frames of both the test and reference utterances were in exact time synchrony. Because of inherent practical difficulties with satisfying the assumptions under which the above constraints are valid, we have considered some modifications to the dynamic time warping algorithm. In particular, an algorithm in which an uncertainty exists in the registration of both initial and final frames was studied. Another modification constrains the dynamic path to follow (within a given range) the path which is locally optimum at each frame. This modification tends to work well when the location of the final frame of the test utterance is significantly in error due to breath noise, etc. To test the different time warping algorithms, a set of ten isolated words spoken by 100 speakers was used. Probability density functions of the distances from each of the 100 versions of a word to a reference version of the word were estimated for each of three dynamic warping algorithms. From these data, it is shown that, based on a set of assumptions about the distributions of the distances, the warping algorithm that minimizes the overall probability of making a word error is the modified time warping algorithm with unconstrained endpoints. A discussion of this key result along with some ideas on where the other modifications would be most useful is included.
Article
This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition. First, a general principle of time-normalization is given using a time-warping function. Then, two time-normalized distance definitions, called symmetric and asymmetric forms, are derived from the principle. These two forms are compared with each other through theoretical discussion and experimental study, and the superiority of the symmetric form algorithm is established. A new technique, called slope constraint, is introduced, in which the slope of the warping function is restricted so as to improve discrimination between words in different categories. The characteristics of the slope constraint are qualitatively analyzed, and the optimum slope constraint condition is determined through experiments. The optimized algorithm is then extensively subjected to experimental comparison with various DP algorithms previously applied to spoken word recognition by different research groups. The experiments show that the present algorithm gives no more than about two-thirds of the errors of even the best conventional algorithm.
• R. Boque, A. Smilde, AIChE J. 1999, 45, 1504.
• T. Kourti, J. Lee, J. F. MacGregor, Comput. Chem. Eng. 1996, 20, 745.
• H. Sakoe, S. Chiba, IEEE Trans. Acoustics, Speech and Signal Process. 1978, 26, 43.
• S. Wold, N. Kettaneh, H. Friden, A. Holmberg, Chemo. Intell. Lab. Sys. 1998, 44, 331.
• P. Nomikos, J. F. MacGregor, Technometrics 1995, 37, 41.
• N. D. Tracy, J. C. Young, R. L. Mason, J. Quality Technology 1992, 24, 88.
• S. Wold, P. Geladi, K. Esbensen, J. Ohman, J. Chemometrics 1987, 1, 41.
• K. Dahl, M. Piovoso, K. Kosanovich, Chemo. Intell. Lab. Sys. 1999, 46, 161.
• D. J. Louwerse, A. K. Smilde, Chem. Eng. Sci. 2000, 55, 1225.
• P. Nomikos, J. F. MacGregor, AIChE J. 1994, 40, 1361.
• T. Kourti, J. F. MacGregor, Chemo. Intell. Lab. Sys. 1995, 28, 3.
• H. Cho, K. Kim, J. Qual. Technol. 2003, 35, 59.
• L. R. Rabiner, A. E. Rosenberg, S. Levinson, IEEE Trans. Acoustics, Speech and Signal Process. 1978, 26, 575.
• R. Henrion, Chemo. Intell. Lab. Sys. 1994, 25, 1.
• P. Miller, R. E. Swanson, C. F. Heckler, Int. J. Appl. Math. Comput. Sci. 1998, 8, 775.
• D. Neogi, C. Schlags, Ind. Eng. Chem. Res. 1998, 37, 3971.
• J. Chen, K. Liu, Chem. Eng. Sci. 2002, 57, 63.
• S. Rannar, J. F. MacGregor, S. Wold, Chemo. Intell. Lab. Sys. 1998, 41, 73.
• B. M. Wise, N. B. Gallagher, J. Process Control 1996, 6, 329.
• P. Nomikos, J. F. MacGregor, Chemo. Intell. Lab. Sys. 1995, 30, 97.
• A. Kassidas, J. F. MacGregor, P. A. Taylor, AIChE J. 1998, 44, 864.
• G. E. P. Box, The Annals of Mathematical Statistics 1954, 25, 290.
• J. A. Westerhuis, S. P. Gurden, A. K. Smilde, Chemo. Intell. Lab. Sys. 2000, 51, 95.
• F. Teymour, AIChE J. 1997, 43, 145.
• R. W. Chylla, J. D. Campbell, F. Teymour, AIChE J. 1997, 43, 157.