Hon Keung Tony Ng

Bentley University · Department of Mathematical Sciences

Ph.D.

About

213 Publications · 25,419 Reads
3,533 Citations
Additional affiliations
August 2002 – June 2022 · Southern Methodist University · Professor (Full)

Publications (213)
Article
Full-text available
There has been a considerable amount of literature on binomial regression models that utilize well-known link functions, such as logistic, probit, and complementary log-log functions. The conventional binomial model is focused only on a single parameter representing one probability of success. However, we often encounter data for which two differen...
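The link functions named in this entry can be compared numerically. The following is a minimal illustrative sketch (not taken from the publication itself) of the inverse logit, probit, and complementary log-log links:

```python
import math
from statistics import NormalDist

def inv_logit(x: float) -> float:
    """Inverse logistic link: maps a linear predictor to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def inv_probit(x: float) -> float:
    """Inverse probit link: the standard normal CDF."""
    return NormalDist().cdf(x)

def inv_cloglog(x: float) -> float:
    """Inverse complementary log-log link: 1 - exp(-exp(x))."""
    return 1.0 - math.exp(-math.exp(x))

# At x = 0 the symmetric logit and probit links both give 0.5,
# while the asymmetric cloglog link gives 1 - exp(-1), about 0.632.
for g in (inv_logit, inv_probit, inv_cloglog):
    print(f"{g.__name__}(0) = {g(0.0):.3f}")
```

The asymmetry of the cloglog link is what makes it useful when success and failure probabilities behave differently near the tails.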
Article
Analysis of means (ANOM) is a graphical alternative to the analysis of variance (ANOVA) that was primarily developed for multiple mean comparisons. The ANOM is a simple graphical display that provides a visualization of statistically significant results and allows their practical significance to be validated without deep statistical knowledge. The cl...
Chapter
Load‐sharing models originated in the study of the breaking strength of bundles of threads in 1945 and of fibrous composites 20 years later. Their first use in physics was apparently in 1989, after which they were popularized in the physical science community as the fiber bundle model for modeling the breakdown of physical systems. The history and backgroun...
Chapter
Because of the capacitor laws, a series circuit of capacitors can be considered as a load‐sharing bundle, while a parallel/series circuit can be treated as a chain of bundles. Such a chain is of interest in reliability since, if the parallel/series circuit fails when one of the series circuits fails, it is a weakest link chain of bundles (a series/...
Chapter
The fiber bundle model for modeling the breakdown of physical systems is discussed and illustrated for circuits of capacitors and dielectrics. Concepts such as the weakest link, chain of bundles, parallel/series capacitor circuits and series/parallel reliability systems, and size effects for chains are discussed in terms of fibrous composites and c...
Article
Background Suicide with complex etiology and increased global prevalence is a challenge in genome-based biomarker discovery. The present study explores whole-genome association in a small, isolated, historical tribal population of Idu Mishmi from Dibang Valley of Arunachal Pradesh, India with a high rate of suicide. Methods Microarray genotyping (...
Chapter
Statistical analysis of progressively censored data has received much attention during the past decades. This article features different statistical analysis procedures for data obtained with a progressive censoring scheme (PCS). In Section 1, valid and efficient statistical analysis procedures for PCS with progressive staggering and nonstaggering...
Article
Traditionally, in using an auxiliary variable for quality control applications, the response and auxiliary variables are often assumed to follow a bivariate normal distribution. Given that the response variable and an auxiliary variable are linearly related, a novel process monitoring method is proposed to scrutinize the shifts of the model paramet...
Article
Full-text available
In a repairable consecutive C(k,n:F) system, after the system operates for a certain time, some components may fail, some failed components may be repaired and the state of the system may change. The models developed in the existing literature usually assume that the state of the system varies over time depending on the values of n and k and the st...
Article
Full-text available
Streakiness is an important measure in many sports data for individual players or teams in which the success rate is not a constant over time. That is, there are many successes/failures during some periods and few or no successes/failures during other periods. In this paper we propose a Bayesian binary segmentation procedure using a bivariate binom...
Article
Full-text available
In this paper, we aim to identify the significant variables that contribute to the injury severity level of the person in the car when an accident happens and build a statistical model for predicting the maximum injury severity level as well as estimating the potential economic cost in a car accident based on those variables. The General Estimates...
Preprint
Full-text available
We study a statistical model for panel data with unobservable grouped factor structures which are correlated with the regressors and whose group membership can be unknown. We assume the factor loadings belong to different subspaces and consider the subspace clustering for factor loadings. We propose a method called least-squares subspace clusteri...
Article
Full-text available
A tolerance interval is a statistical interval that covers at least 100ρ% of the population of interest with 100(1−α)% confidence, where ρ and α are pre-specified values in (0, 1). In many scientific fields, such as pharmaceutical sciences, manufacturing processes, clinical sciences, and environmental sciences, tolerance intervals are used f...
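As a rough illustration of the two-sided normal tolerance interval described above (a generic sketch, not the paper's method), Howe's approximation gives the factor k such that the interval "sample mean ± k × sample SD" covers at least 100ρ% of a normal population with 100(1−α)% confidence; the chi-square quantile below is approximated by the Wilson-Hilferty transformation:

```python
import math
from statistics import NormalDist

def chi2_quantile(p: float, df: int) -> float:
    """Wilson-Hilferty approximation to the chi-square p-quantile."""
    z = NormalDist().inv_cdf(p)
    a = 2.0 / (9.0 * df)
    return df * (1.0 - a + z * math.sqrt(a)) ** 3

def tolerance_factor(n: int, rho: float, alpha: float) -> float:
    """Howe's approximate two-sided normal tolerance factor k."""
    z = NormalDist().inv_cdf((1.0 + rho) / 2.0)
    df = n - 1
    return math.sqrt(df * (1.0 + 1.0 / n) * z * z / chi2_quantile(alpha, df))

# e.g. n = 30, rho = 0.90, alpha = 0.05 gives k close to 2.14
print(round(tolerance_factor(30, 0.90, 0.05), 3))
```

Exact factors require the noncentral t or chi-square distribution; Howe's approximation is typically accurate to a few percent for moderate n.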
Article
The four-parameter generalized gamma (GΓ) distribution, also known as the Amoroso family of distributions, is a flexible and versatile statistical distribution that encapsulates many well-known lifetime distributions, including the exponential, Weibull, lognormal, and gamma distributions as special instances. The four-parameter GΓ distribution is s...
Article
Full-text available
In geometry and topology, a family of probability distributions can be analyzed as the points on a manifold, known as a statistical manifold, with intrinsic coordinates corresponding to the parameters of the distribution. Considering the exponential family of distributions with progressive Type-II censoring as the manifold of a statistical model, we use...
Article
In this article, a class of sequential precedence tests based on ranked set samples (RSS-SPT) with different sequential rank schemes and different α-spending is proposed. The exact null distributions of the proposed RSS-SPT test statistics are derived and the corresponding critical values are tabulated. The performance of the class of RSS-SPT is ev...
Preprint
In this paper, we present a critical overview of statistical fiber bundles models. We discuss relevant aspects, like assumptions and consequences stemming from models in the literature and propose new ones. This is accomplished by concentrating on both the physical and statistical aspects of a specific load-sharing example, the breakdown (BD) for c...
Preprint
Full-text available
In this paper, we propose two simple yet efficient computational algorithms to obtain approximate optimal designs for multi-dimensional linear regression on a large variety of design spaces. We focus on the two commonly used optimal criteria, $D$- and $A$-optimal criteria. For $D$-optimality, we provide an alternative proof for the monotonic conver...
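The flavor of such multiplicative algorithms for D-optimality can be shown on a toy problem (a sketch under simplifying assumptions, not the authors' algorithms): for simple linear regression with candidate points {-1, 0, 1}, the D-optimal design puts weight 1/2 on each endpoint, and the classical update w_i ← w_i · d_i(w)/p converges to it:

```python
def d_optimal_weights(xs, iters=300):
    """Multiplicative algorithm for D-optimal weights, model y = b0 + b1*x.
    Update: w_i <- w_i * d_i(w) / p, where d_i = f(x_i)' M(w)^{-1} f(x_i),
    f(x) = (1, x) and p = 2 is the number of regression parameters."""
    p = 2
    w = [1.0 / len(xs)] * len(xs)
    for _ in range(iters):
        # information matrix M = sum_i w_i f(x_i) f(x_i)'
        m00 = sum(w)                                  # stays equal to 1
        m01 = sum(wi * x for wi, x in zip(w, xs))
        m11 = sum(wi * x * x for wi, x in zip(w, xs))
        det = m00 * m11 - m01 * m01
        # variance function d_i via the explicit 2x2 inverse of M
        d = [(m11 - 2.0 * m01 * x + m00 * x * x) / det for x in xs]
        w = [wi * di / p for wi, di in zip(w, d)]
    return w

w = d_optimal_weights([-1.0, 0.0, 1.0])
print([round(wi, 3) for wi in w])  # the weight at x = 0 shrinks toward 0
```

Because sum_i w_i d_i = p for any design, the update automatically keeps the weights summing to one; monotonic convergence of such schemes for D-optimality is the classical result the paper's proof revisits.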
Article
The characteristic of dependence widely exists among different failure modes of systems, which brings extra difficulty for the reliability analysis. In this paper, a nonparametric Bayesian analysis method is proposed for dependent masked data under accelerated lifetime test with censoring. Using the copula function, the dependence structure is cons...
Article
In reliability engineering, obtaining lifetime information for highly reliable products is a challenging problem. When a product quality characteristic whose degradation over time can be related to lifetime, then the degradation data can be used to estimate the first-passage (failure) time distribution and the mean-time-to-failure (MTTF) for a give...
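The first-passage idea described here can be sketched with a Monte Carlo toy example (illustrative assumptions only: a stationary gamma degradation process with arbitrary parameters, not the article's model): simulate degradation paths and record when each crosses the failure threshold.

```python
import random

def simulate_mttf(shape_rate, scale, threshold, dt=0.1, n_paths=2000, seed=7):
    """Empirical mean first-passage time for a gamma degradation process.
    Increments over dt are Gamma(shape_rate*dt, scale), so the mean drift
    is shape_rate*scale per unit time and MTTF is roughly threshold/drift."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_paths):
        level, t = 0.0, 0.0
        while level < threshold:
            level += rng.gammavariate(shape_rate * dt, scale)
            t += dt
        times.append(t)
    return sum(times) / n_paths

# drift = 2.0 * 0.5 = 1 per unit time, threshold 10 -> MTTF near 10
print(round(simulate_mttf(2.0, 0.5, 10.0), 2))
```

The gamma process is a common choice here because its increments are independent and nonnegative, matching monotone degradation.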
Article
In system engineering, the reliability of a system depends on the reliability of each subsystem. Those subsystems have their own performance characteristics (PCs), which can be dependent. The degradation of those dependent PCs of the subsystems is used to assess the system reliability. Parametric frameworks have been developed to model bivariate deg...
Book
This edited collection brings together internationally recognized experts in a range of areas of statistical science to honor the contributions of the distinguished statistician, Barry C. Arnold. A pioneering scholar and professor of statistics at the University of California, Riverside, Dr. Arnold has made exceptional advancements in different are...
Chapter
In the bivariate case, we illustrate the situations where the normalizing constant is in a closed form and situations where the normalizing constant is not in a closed form. Distributional properties of such models are investigated. Discussions on some conjectures related to hidden truncation paradigm for non-normal models are also provided.
Chapter
Evaluating the first-passage time (FPT) distribution of a stochastic process is a prominent statistical problem, which has long been studied. This problem has important applications in reliability and degradation data analysis since the time that a degradation process of a product passes a critical level is considered as the failure time of the pro...
Article
The proportional reversed hazard model and exponentiated distributions have received considerable attention in the statistical literature due to their flexibility. In this paper, we develop the tools for statistical inference of the lifetime distribution of components in an n-component coherent system when the system lifetimes are observed, the system st...
Chapter
In this paper, we discuss the optimal allocation problem in a multi-level accelerated life testing experiment under time (Type-I) censoring when an extreme-value regression model is used for statistical analysis. We derive the expected Fisher information and the asymptotic variance-covariance matrix of the maximum likelihood estimators. Three optim...
Article
In this paper, we consider maximum likelihood estimation of the proportional parameter in a proportional hazard rate (PHR) model based on single and multiply censored order statistics, and progressively Type-II censored order statistics from lifetimes that follow the PHR model. The expectation-maximization (EM) algorithm is proposed for computing t...
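For intuition on how EM handles censoring in such likelihoods, here is a toy sketch (exponential lifetimes with right censoring, a special case rather than the paper's PHR setting): the E-step replaces each lifetime censored at c by its conditional expectation c + 1/λ, and the M-step re-estimates λ.

```python
def em_exponential(observed, censored, iters=200):
    """EM for the exponential rate lambda with right-censored data.
    E-step: a unit censored at c has expected lifetime c + 1/lambda
    (memoryless property). M-step: lambda = n / (total expected lifetime)."""
    n = len(observed) + len(censored)
    lam = 1.0  # arbitrary starting value
    for _ in range(iters):
        total = sum(observed) + sum(c + 1.0 / lam for c in censored)
        lam = n / total
    return lam

obs = [0.5, 1.2, 2.0, 0.8]
cens = [1.5, 3.0]
lam_em = em_exponential(obs, cens)
# For this special case the MLE is also available in closed form:
lam_closed = len(obs) / (sum(obs) + sum(cens))
print(round(lam_em, 6), round(lam_closed, 6))
```

The EM fixed point here coincides with the closed-form MLE r / (total time on test), which makes this a convenient sanity check before moving to models without closed-form estimators.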
Article
Full-text available
Following Arnold and Beaver (2000, Sankhyā A, 62, 23–35), we re-visit the hidden truncation paradigm for non-normal models with a feature that the resulting hidden truncated distribution arises from two different families of distributions with the same support set as well as from the same family. In the bivariate case, we illustrate the situations...
Article
This paper considers a step-stress accelerated dependent competing risks model under progressively Type-I censoring schemes. The dependence structure between competing risks is modeled by a general bivariate function, the cumulative exposure model is assumed and the accelerated model is described by the power rule model. The point and interval esti...
Article
Full-text available
In this paper, E-Bayesian estimation of the scale parameter, reliability and hazard rate functions of Chen distribution are considered when a sample is obtained from a type-I censoring scheme. The E-Bayesian estimators are obtained based on the balanced squared error loss function and using the gamma distribution as a conjugate prior for the unknow...
Article
Information geometry has been attracting considerable attention in different scientific fields including information theory, neural networks, machine learning, and statistical physics. In reliability and survival analysis, methods of information geometry are employed to discuss the geometry on a reliability model. Most of the existing work of infor...
Article
Analysis of means (ANOM), similar to a Shewhart control chart in that it exhibits individual mean effects on a graphical display, is an attractive alternative mean-testing procedure to the analysis of variance (ANOVA). The procedure is primarily used to analyze experimental data from designs with only fixed effects. Recently introduced, the ANOM procedure...
Article
In this paper, we discuss the parameter estimation for the generalized gamma distribution based on left‐truncated and right‐censored data. A stochastic version of the expectation‐maximization (EM) algorithm is proposed as an alternative method to compute approximate maximum likelihood estimates. Two different methods to obtain reliable initial esti...
Article
Motivated by an accelerated degradation test (ADT) on the power gain of microwave power amplifiers, in this article we propose a model-ranking approach for the estimation of some important reliability characteristics. Different degradation models and statistical lifetime distributions are applied to model the data obtained from the ADT. We study th...
Article
For degradation data in reliability analysis, estimation of the first‐passage time (FPT) distribution to a threshold provides valuable information on reliability characteristics. Recently, Balakrishnan and Qin (2019; Applied Stochastic Models in Business and Industry, 35:571–590) studied a nonparametric method to approximate the FPT distribution of...
Article
Copula models have become one of the most popular tools, especially in finance and insurance, for modeling multivariate distributions in the past few decades, and they have recently received increasing attention for data analysis in reliability engineering and survival analysis. This paper considers two Archimedean copula models — the Gumbel-Hougaa...
Article
Full-text available
Degradation models have been investigated extensively for the evaluation of the quality and reliability of highly reliable products. In practical applications, for one thing, the proper model for a degradation dataset is often unknown and may be misspecified; for another, the dataset may be contaminated or contain outliers. Here, contamination means the deg...
Article
In this paper, motivated by a real data example about borderline ovarian tumors, we study a two-dimensional dynamic panel model with confounding individual effects for modeling binary panel data. We propose using the maximum likelihood estimation method to estimate the model parameters. The properties of the maximum likelihood estimators are studied....
Chapter
Designing a proper life test plan to evaluate the quality of a lot of products, so that the manufacturer and customers can decide on accepting or rejecting the lot, is an important objective in quality control studies. Most existing life test plans are developed based on the mean time to failure (MTTF) of the products, in which a lot is accept...
Chapter
In quality control, the quality of a process or product can be characterized by a profile, defined as a functional relationship between a quality response variable and one or more explanatory variables. Much research has been done on statistical process control for the simple linear profile with independent or autocorrelated observatio...
Article
Information geometry has been productively used in different research fields involving machine learning, neural networks, optimization and statistics. It is also applied in reliability analysis, where time-to-failure data are available for study. For highly reliable products, however, it is difficult to obtain failure data in a reasonable time period....
Article
In this paper, we discuss the statistical inference of constant-stress accelerated dependent competing risks model under Type-II hybrid censoring schemes. The dependency structure is modeled by a Marshall-Olkin bivariate Weibull distribution. Both the shape and the scale parameters in the model are assumed to be dependent on the stress levels throu...
Article
Full-text available
We model the interaction between a qubit and optical radiation field (ORF) using an excited binomial distribution (EBD) and excited negative binomial distribution (ENBD). The explicit form of the density matrix of the qubit–ORF system is given in terms of the photon number distribution. The Mandel parameter is used to quantify the statistical prope...
Article
The type‐I interval‐censoring scheme documents only the number of failed units between two prespecified consecutive examination times, recorded at the larger time point, after all units are put on test at the initial time schedule. It is challenging to use the information collected from the type‐I interval‐censoring scheme to evaluate the reliability of a unit when not all adm...
Article
Full-text available
In this article, two different types of precedence tests, each with two different test statistics, based on ranked set samples for testing the equality of two distributions are discussed. The exact null distributions of proposed test statistics are derived, critical values are tabulated for both set size and number of cycles up to 8, and the exact...
Article
In this paper, we propose several tests for monotonic trend based on Brillinger's test statistic (1989, Biometrika, 76, 23–30). When there are highly correlated residuals or short record lengths, Brillinger's test procedure tends to have a significance level much higher than the nominal level. It is found that this could be related to the discrep...
Book
This book explores different statistical quality technologies including recent advances and applications. Statistical process control, acceptance sample plans and reliability assessment are some of the essential statistical techniques in quality technologies to ensure high quality products and to reduce consumer and producer risks. Numerous statist...
Article
Full-text available
In survival or reliability data analysis, it is often useful to estimate the quantiles of the lifetime distribution, such as the median time to failure. Different nonparametric methods can construct confidence intervals for the quantiles of the lifetime distributions, some of which are implemented in commonly used statistical software packages. We...
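One classical nonparametric construction of this kind (a generic sketch, not necessarily one of the methods compared in the paper) is the distribution-free confidence interval for the median based on order statistics: the interval (X_(r), X_(n−r+1)) covers the median with an exact confidence computed from the Binomial(n, 1/2) distribution.

```python
from math import comb

def binom_cdf_scaled(k: int, n: int) -> int:
    """P(X <= k) * 2^n for X ~ Binomial(n, 1/2), kept as an exact integer."""
    return sum(comb(n, j) for j in range(k + 1))

def median_ci_confidence(n: int, r: int) -> float:
    """Exact confidence of the interval (X_(r), X_(n-r+1)) for the
    population median: 1 - 2 * P(X <= r - 1)."""
    return 1.0 - 2.0 * binom_cdf_scaled(r - 1, n) / 2 ** n

def median_ci(sample, level=0.95):
    """Narrowest symmetric order-statistic interval with confidence >= level."""
    n = len(sample)
    xs = sorted(sample)
    r = 1
    while r + 1 <= n // 2 and median_ci_confidence(n, r + 1) >= level:
        r += 1
    return xs[r - 1], xs[n - r], median_ci_confidence(n, r)

# For n = 20, ranks (6, 15) give exact confidence of about 0.9586
lo, hi, conf = median_ci(list(range(1, 21)))
print(lo, hi, round(conf, 4))
```

Because the achievable confidence levels are discrete, the exact confidence usually overshoots the nominal level slightly, which is one reason interpolated and bootstrap variants exist.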
Article
In this paper, a new censoring scheme, named the adaptive progressive interval censoring scheme, is introduced. The competing risks data come from the Marshall–Olkin extended Chen distribution under the new censoring scheme with random removals. We obtain the maximum likelihood estimators of the unknown parameters and the reliability function by using t...
Article
Information geometry and statistical manifold have attracted wide attention in the past few decades. The Amari–Chentsov structure plays a central role both in optimization and statistical inference because of the invariance under sufficient statistics. This paper discusses the Amari–Chentsov structure on statistical manifolds induced by the curved...
Article
The case-control and case-only designs are commonly used to detect the gene–environment (G–E) interaction. In principle, the tests based on these two designs require a pre-specified genetic model to achieve an expected power of detecting the G–E interaction. Unfortunately, for most complex diseases the underlying genetic models are unknown. It is w...
Article
This paper presents two model selection approaches, namely the random data-driven approach and the weighted modeling approach, to construct robust bootstrap control charts for process monitoring of percentiles of the shape-scale class of distributions under model uncertainty. The generalized exponential, lognormal and Weibull distributions are cons...
Article
The invariant geometric structures on the statistical manifold under sufficient statistics have played an important role in both statistical inference and information theory. In this paper, we focus on one of the commonly used invariant geometric structures, the Amari–Chentsov structure, on a statistical manifold. The manifold is derived from stati...
Article
In this paper, robust control charts for percentiles based on location‐scale family of distributions are proposed. In the construction of control charts for percentiles, when the underlying distribution of the quality measurement is unknown, we study the problem of discriminating different possible candidate distributions in the location‐scale fami...
Article
Full-text available
A simple yet efficient computational algorithm for computing the continuous optimal experimental design for linear models is proposed. An alternative proof of the monotonic convergence for the $D$-optimal criterion on continuous design spaces is provided. We further show that the proposed algorithm converges to the $D$-optimal design. We also provide an...
Article
Full-text available
In environmental studies, many data are typically skewed and it is desired to have a flexible statistical model for this kind of data. In this paper, we study a class of skewed distributions by invoking arguments as described by Ferreira and Steel (2006, Journal of the American Statistical Association, 101: 823--829). In particular, we consider usi...
Article
The joint cumulative residual entropy (CRE), a measure of information, based on progressively Type-II censored order statistics is discussed. Some useful representations, recurrence relations, a characterization result and two nonparametric estimation methods for the CRE are developed.
Article
Information geometry has attracted wide attention in the past few decades. This paper focuses on the Bayesian duality on a statistical manifold derived from the exponential family with data from life tests. Based on life-testing data, the statistical manifold is constructed with a new cumulant generating function. The Bregman divergence betwe...
Article
In science and engineering, we are often interested in learning about the lifetime characteristics of a system as well as those of the components that make up the system. However, in many cases, the system lifetimes can be observed but not the component lifetimes, and we may also not have any knowledge of the structure of the system. Statistic...
Article
Remaining useful life prediction has been one of the important research topics in reliability engineering. For modern products, due to physical and chemical changes that take place with usage and with age, a significant degradation rate change usually exists. Degradation models that do not incorporate a change point may not accurately predict the r...
Article
Full-text available
In this paper, the methods of information geometry are employed to investigate a generalized Bayes rule for prediction. Taking α-divergences as the loss functions, optimality and asymptotic properties of the generalized Bayesian predictive densities are considered. We show that the Bayesian predictive densities minimize a generalized Bayes risk. We...
Article
In this paper, we propose a stochastic gamma process model for assessing the similarity of two dissolution profiles. Based on the proposed stochastic model, we utilize the difference factor and similarity factor to test the similarity of two dissolution profiles based on bootstrap confidence intervals. The performances of the proposed methods are c...
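For reference, the conventional similarity factor f2 that the proposed procedure builds on can be sketched as follows (this shows the standard f2 with a plain percentile bootstrap over time points, not the authors' gamma-process method; the example profiles are made up):

```python
import math
import random

def f2(ref, test):
    """Similarity factor: f2 = 50*log10(100 / sqrt(1 + mean squared difference)).
    f2 >= 50 is the usual criterion for declaring two dissolution profiles similar."""
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

def bootstrap_f2_ci(ref, test, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for f2, resampling time points with replacement."""
    rng = random.Random(seed)
    idx = range(len(ref))
    stats = sorted(
        f2([ref[i] for i in s], [test[i] for i in s])
        for s in ([rng.choice(idx) for _ in idx] for _ in range(n_boot))
    )
    return stats[int(n_boot * alpha / 2)], stats[int(n_boot * (1 - alpha / 2)) - 1]

ref = [20.0, 40.0, 60.0, 80.0, 90.0, 95.0]  # hypothetical % dissolved
tst = [18.0, 37.0, 58.0, 79.0, 90.0, 96.0]
print(round(f2(ref, tst), 2), bootstrap_f2_ci(ref, tst))
```

Identical profiles give f2 = 100, and the bootstrap interval quantifies how sensitive the similarity conclusion is to the sampled time points.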
Article
Full-text available
Analysis of means (ANOM) is a powerful tool for comparing means and variances in fixed-effects models. The graphical exhibit of ANOM is considered as a great advantage because of its interpretability and its ability to evaluate the practical significance of the mean effects. However, the presence of random factors may be problematic for the ANOM me...
Chapter
The lifetime information of highly reliable products is usually very difficult to obtain within an affordable amount of experimental time using traditional life testing methods. Because of the benefit of lower manufacturing costs for many highly reliable products, manufacturers can offer more highly reliable products for implementing an accel...
Chapter
Prognostics and system health management has become an important topic in modern reliability studies. In prognostics and system health management, remaining useful life is one of the vital indexes to yield an advance warning of impending failure in a system, thereby helping in executing preventive actions prior to failure occurrence and helping in makin...
Article
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between some existing q-Weibull distributions and q-extreme value distributions that is parallel to the well-est...
Article
Full-text available
In this paper, the moment-based, maximum likelihood and Bayes estimators for the unknown parameter of the Lindley model based on Type II censored data are discussed. The expectation-maximization (EM) algorithm and direct maximization methods are used to obtain the maximum likelihood estimator (MLE). Existence and uniqueness of the moment-based an...
Chapter
In the process of designing life-testing experiments, experimenters always establish the optimal experiment scheme based on a particular parametric lifetime model. In most applications, the true lifetime model is unknown and needs to be specified for the determination of optimal experiment schemes. Misspecification of the lifetime model may lead to...
Book
This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation dat...
Article
Full-text available
The Lindley distribution has received considerable attention in the statistical literature due to its simplicity. In this paper, we consider the problem of predicting the failure times of experimental units that are censored in a right-censored experiment when the underlying lifetime is Lindley distributed. The maximum likelihood predictor, the best un...