Víctor Gustavo Tercero Gómez
Tecnológico de Monterrey | ITESM · Escuela de Ingeniería y Ciencias
Ph.D. in Engineering Science & Ph.D. in Systems and Engineering Management
Statistical process monitoring, with a focus on multivariate, nonnormal, and nonparametric approaches. Open to collaboration.
About
69 Publications
12,738 Reads
429 Citations
Introduction
Víctor G. Tercero-Gómez currently works at the School of Engineering and Sciences, Tecnológico de Monterrey. Víctor does research in Statistical Process Control and Nonparametric Statistics.
Publications (69)
Data-driven approaches in machine learning are increasingly applied in economic analysis, particularly for identifying business cycle (BC) turning points. However, temporal dependence in BCs is often overlooked, leading to what we term single path analysis (SPA). SPA neglects the diverse potential routes of a temporal data structure. It hinders the...
Following the 2008 financial crisis, Hyman Minsky’s Financial Instability Hypothesis (FIH) emerged as a prominent financial theory to explain the occurrence of business cycles in the U.S. economy. There have been many theoretical, but few empirical studies dedicated to FIH. The current literature also lacks the statistical support to confirm the ne...
Even though most control chart developments have revolved around the normal distribution, productive operations, in fact, often use non‐normal data, and some even use skewed distributions. This occurs with cycle times and many processes with a single specification limit. Under these conditions, gamma distributions often offer a better characterizat...
The monitoring of condition variables for maintenance purposes is a growing trend amongst researchers and practitioners where decisions are based on degradation levels. The two approaches in Condition-Based Maintenance (CBM) are diagnosing the level of degradation (diagnostics) or predicting when a certain level of degradation will be reached (prog...
We propose a distribution-free Shewhart-type control chart for jointly monitoring location and scale in a finite horizon production, commonly presented in manufacturing environments characterized by high flexibility, production variety, and a limited number of scheduled inspections. The Lepage-type statistic used in this research combines the Wilco...
Thermoluminescence (TL) is a property of some materials utilized to measure the radiation dose in a material exposed to a radiation source for a period of time. Dosimetry is based on measuring the light that a material emits when it is heated after irradiation. The TL response of a material can be derived from the band gap theory of solids. In this...
In recent years, the adoption of statistical process monitoring (SPM) techniques in healthcare has been successful; for instance, biosurveillance and biosignal monitoring have found direct benefits. As the latest reviews of the literature show, parametric SPM techniques have been implemented to evaluate the quality-of-service hospitals provide, tra...
When comparing control chart performance, one should use steady-state properties and strong competitors. Moving average-based methods should not give greater weight to past data than to current data; doing so defies common sense and leads to poor steady-state performance.
At present, the research literature contains no assessment of the accuracy of macro-level diffusion rate forecasting. This research reveals underlying macro-level trends of diffusion rate assessment using historical technological innovation diffusion data to explore the statistical characteristics of the diffusion rate percent-error of the Bass and log...
Many extensions and modifications have been made to standard process monitoring methods such as the exponentially weighted moving average (EWMA) chart and the cumulative sum (CUSUM) chart. In addition, new schemes have been proposed based on alternative weighting of past data, usually to put greater emphasis on past data and less weight on current...
Multivariate statistical process control (MSPC) addresses the concurrent monitoring of several measurements. Since multivariate normality is rare in practice, nonparametric schemes become useful alternatives. Inspired by the development of distribution-free Shewhart-type control charts to monitor multivariate data streams as a means to address high-...
We argue against the use of generally weighted moving average (GWMA) control charts. Our primary reasons are the following: (1) There is no recursive formula for the GWMA control chart statistic, so all previous data must be stored and used in the calculation of each chart statistic. (2) The Markovian property does not apply to the GWMA statistics,...
The Business Cycle paradigm of Mitchell and Burns has evolved from their original goal of understanding the entire economic process to the binary identification of growth and recessionary Turning Points. We propose a new paradigm for modelling the Business Cycle based on the Statistical Process Monitoring technique of Self-Starting Cumulative Sum (...
We discuss issues related to the use of the normality assumption in statistical process monitoring with continuous data. Our illustrations involve the Shewhart X-chart. We illustrate some of the dangers and pitfalls in using nonlinear transformations in order to obtain the approximate normality of the process data. We argue that such transformation...
There is growing literature on new versions of "memory-type" control charts, where zero-state average run-length (ARL) performance is deceptively good. Using steady-state run-length analysis in combination with the conditional expected delay (CED) metric, we show that the increasingly discussed progressive mean (PM) and homogeneously wei...
Risk perception can be quantified in measurable terms of risk aversion and sensitivity. While conducting research on the quantization of programmatic risk, a bridge between positive and normative decision theories was discovered through the application of a novel a priori relationship between objective and subjective probabilities and the applicati...
The uncertainty, or entropy, of an atom of an ideal gas being in a certain energy state mirrors the way people perceive uncertainty in the making of decisions, uncertainty that is related to unmeasurable subjective probability. It is well established that subjects evaluate risk decisions involving uncertain choices using subjective probability rath...
We propose a distribution‐free cumulative sum (CUSUM) chart for joint monitoring of location and scale based on a Lepage‐type statistic that combines the Wilcoxon rank sum and the Mood statistics. Monte Carlo simulations were used to obtain control limits and examine the in‐control and out‐of‐control performance of the new chart. A direct compariso...
For an economic system such as a nation, it is important to assess the efforts of its constituent economic activities that are directed toward greater efficiency, which in aggregate determine the overall efficiency at the national level. Such an exercise provides information on which constituent economic activities are underperforming and require att...
System Dynamics (SD)-based simulation is gaining ground as simulation itself becomes increasingly embedded in the decision-making process. Given the wide availability of simulation software like Vensim and Simulink, complex systems can now be constructed, deconstructed, and replicated with ease unlike in the past. The concomitant increase in the ne...
Six Sigma (SS) is now a well-entrenched methodology for performance improvement and comprises highly sophisticated tools targeted at quality improvement or cost reduction initiatives. However, the application of such tools is often not predicated on a foreknowledge or an informed supposition of the potential effects but is instead autotelic. One ap...
Over the past decade, leasing, as opposed to purchasing, has gained prominence as a means of acquiring capital-intensive industrial robots. Investments in plant automation have seen significant growth over this time given the increasing need for higher productivity and quality. To offset the concomitant rise in automation equipment costs, organizations...
System Dynamics (SD)-based simulation has gained traction in recent times as a technique to study a system’s behavior. It has been employed to model diverse complexes, from reactive chemical assemblages to socio-economic systems. However, SD-based simulation models, like models derived from competing simulation techniques, typically suffer the prob...
Over time, all industries have adapted to the ever-increasing demand for higher product quality and throughput by embracing automation and industrial robots. The impact of the purchasing decision of automation equipment such as robots on an organization’s profitability needs to be better understood, given the paucity of published empirical data on...
Despite R&D having long been identified by firms as a critical investment because of its relationship with innovation, there is still much to be learned about this activity and its effect. For instance, most firms report R&D expenditures as an aggregated value, as a percentage of either revenues or profit. However, the granular level allocation of...
While the comprehension of an economic system has improved over the years, there is still much to be learned about hidden phenomena that shape a country's economic performance. A plethora of leading, lagging, and coincident type economic indexes are now available in the econometric toolbox that fulfill several traditional economic requirements like...
To maintain the desired quality of a product or service it is necessary to monitor the process that results in the product or service. This monitoring method is called Statistical Process Monitoring, or Statistical Process Control. It is in widespread use in industry. Extensive statistical methodology has been developed to make it possible to det...
Several studies have been performed to review the state of the art in the cost of quality (COQ) literature. These studies have focused on different aspects such as models, adoption, limitations, etc. Few works have been found addressing COQ simulation procedures. Simulation is a crucial tool to attain preliminary, inexpensive results prior to poten...
Control charts are powerful Statistical Process Monitoring tools to detect departures from in-control situations. However, their detection power relies on the fact that all assumptions underlying their design are met, such as independence of data and knowledge of the process model parameters. When parameters are estimated, the average and the stand...
Mexico and the US have a strong relationship that goes beyond border sharing; their economies seem to be tied together in several ways, such as social, cultural, and market composition. To better understand the interrelationship between these two countries, at least in the economic dimension, it is necessary to analyze how connected their stock mark...
The current work aims to shed light on the effect of technology to reduce defects in manufacturing, via a systematic literature review in industrial robots, six-sigma, radio-frequency identification, design for six-sigma, and advanced manufacturing technologies. The systematic literature review is a technique to gather and analyze information from...
Tests for equality of variances using independent samples are widely used in data analysis. Conover et al. [A comparative study of tests for homogeneity of variance, with applications to the outer continental shelf bidding data. Technometrics. 1981;23:351–361], won the Youden Prize by comparing 56 variations of popular tests for variance on the bas...
A nonparametric control chart for variance is proposed. The chart is constructed following the change-point approach through the recursive use of the squared ranks test for variance. It is capable of detecting changes in the behaviour of individual observations with performance similar to a self-starting CUSUM chart for scale when normality is assu...
Quantile estimation is a problem presented in fields such as quality control, hydrology, and economics. There are different techniques to estimate such quantiles. Nevertheless, these techniques use an overall fit of the sample when the quantiles of interest are usually located in the tails of the distribution. Regression Approach for Quantile Estim...
The sequential analysis of series often requires nonparametric procedures, where the most powerful ones frequently use rank transformations. Reranking the data sequence after each new observation can become too intensive computationally. This led to the idea of sequential ranks, where only the most recent observation is ranked. However, difficultie...
Capability analysis corresponds to a set of methods used to estimate and test the ability of an in-control process to provide a specific output. When there is only one quality characteristic that behaves as a continuous random variable, indices like Cp and Cpk can be used to measure how well requirements are met. Under normality, variation is ind...
A wide variety of Condition-Based Maintenance (CBM) techniques deal with the problem of predicting the time for an asset fault. Most statistical approaches rely on historical failure data that might not be available in several practical situations. To address this issue, practitioners might require the use of self-starting approaches that consider...
The main objective of Condition-Based Maintenance (CBM) is to assess the level of degradation of equipment for maintenance decision-making. CBM literature shows a wide variety of methodologies to fulfill this objective, with positive results on diagnostics and prognostics of failure. However, the degradation variable could suffer an unexpected c...
The total value of domestic market capitalization of the Mexican Stock Exchange was calculated at 520 billion dollars by the end of November 2013. To manage this system and make optimum capital investments, its dynamics need to be predicted. However, randomness within the stock indexes makes forecasting a difficult task. To address this issue,...
Performance of maximum likelihood estimators (MLE) of the change-point in normal series is evaluated considering three scenarios where process parameters are assumed to be unknown. Different shifts, sample sizes, and locations of a change-point were tested. A comparison is made with estimators based on cumulative sums and Bartlett's test. Performan...
Professional certificate training is a challenge for trainees and trainers. A certification program's financial success depends not only on its quality, but also on certification rates. In this study, the authors theorize that trainee performance in the field project component of a training certificate program is the key variable to success; theref...
An important topic in the study of time series behavior, and in particular meteorological time series, is long-range dependence. This paper explores the behavior of rainfall variations in different periods, using long-range correlation analysis. Semivariograms and the Hurst exponent were applied to historical data from different pluviometric sta...
Detection of a special cause of variation and the identification of the time it occurs are two important activities in any quality improvement strategy. Detection of changes in a process can be done using control charts. One of these charts, the self-starting CUSUM chart, was created to detect small sustained changes and be implemented without a Ph...
In Statistical Process Control (SPC), to implement a control chart, process parameters have to be estimated from a sample that is assumed to be in control. This estimation is prone to be contaminated with special causes of variation. When using rational subgroups, traditional approaches that use within-sample variation offer protection against sh...
When sustained changes due to special causes of variation are present, tests and control charts can be used to detect them, and change-point estimators can be applied to approximate their location. When dealing with normal observations, classical maximum likelihood estimation of a change-point does not consider prior knowledge about the change-poin...
To improve a system, from a statistical process control approach, tools from the field of design and analysis of experiments might be used. Within this field, experimental and observational studies address the issue of finding causal relationships between potential factors and response variables. To collect data, experimental design techniques are...
In statistical process control, detection of special causes of variation and the estimation of the time when they occur are two important tasks for process improvement. When dealing with normal independent observations, maximum likelihood estimators for a change-point have been derived, and their structure happens to correspond with a least squared...
Living systems tend to have non‐normal behaviors, are autocorrelated, exhibit patterns of growth or decrement, and achieve states of dynamic equilibrium, making them hard to manage. One way to manage and improve these complex systems is by identifying assignable causes of variation whenever they occur, and control charts are one of the most known t...
Performing retrospective analysis, control charts are capable of detecting whether the process is in statistical control. Nevertheless, traditional approaches like the Shewhart, CUSUM, and EWMA charts do not use all the available information in the data that might help detect sustained changes in a process. To solve this situation, the likelih...
Most living systems undergo periods of growth and decay. Engineering managers involved in monitoring living systems are in a predicament because most statistical process control monitoring tools are designed for systems with zero slopes. This paper analyzes control charts available for monitoring non-zero slope living systems. The paper identifies stren...
Engineering Managers do not have an adequate tool for monitoring the progression of a living system. Most living systems exhibit non-zero slope behavior, which makes it difficult to discern special causes from common causes of variation. The features of the Self-Starting CUSUM chart provide a fertile base for the development of a tool that addresses...
To manage transactional and manufacturing processes, it is often necessary to monitor time between events. This is the case of failure times or arrival times, which might be modeled using a Gamma distribution. Control charts are known to be used to detect process changes that may lead to the identification of assignable causes of variation; which c...
Implementation of a control chart requires knowledge of the probability distribution of the measured characteristic under concern. If this knowledge does not exist, a large amount of data has to be collected, its distribution has to be derived, and parameters have to be estimated. This is called Phase I of statistical process control. However, wh...
Phase I of control chart analysis requires a large amount of data to fit a distribution and estimate the corresponding parameters of the process under study. However, when only individual observations are available, and no a priori knowledge exists, the presence of outliers can bias the analysis. A relatively recent and successful approach to address this...
A low certification achievement rate (32%) was measured between 2007 and 2010 on the Lean Six Sigma Open Enrollment Certification Program at Tecnológico de Monterrey. The main cause of this problem was the failure to implement Lean Six Sigma theory into real life projects. 78% of the students that didn't complete a project were unable to fulfill the...
To manage a system, analysts need to control the system, and control charts are known tools used to achieve this objective. However, when dealing with sustained changes, traditional charts are not capable of estimating the initial moment of a change. The detection and estimation of moments of sustained changes belong to change-point analysis techniq...
A low certification achievement rate was measured during 2007 and 2008 in the Lean Six Sigma open enrollment program at Tecnológico de Monterrey. The use and application of the tools covered during this training to complete a project is what most participants failed to achieve. The objective of this research is to develop a computerized roadmap to g...
Most traditional statistical tools have been developed under the assumption of zero slope behavior over time and normally distributed data. However, living systems tend to have non-normal behavior, exhibit a pattern of constant growth or a steady decrement, and achieve states of dynamic equilibrium, making them hard to manage. This...