
Optimal censoring schemes in the model of progressive type II censoring are obtained for a location-scale family of distributions which includes exponential, uniform and Pareto distributions. In the one-parameter set-up the variance of the respective BLUE is used as an optimality criterion. In the two-parameter situation two criteria are applied which are based on the covariance matrix of the corresponding BLUEs, namely the trace and the determinant.
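The search for an optimal plan that this abstract describes is, computationally, an exhaustive minimization over all admissible censoring schemes. A minimal Python sketch, in which the `criterion` callable is a hypothetical stand-in for, e.g., the variance of the BLUE or the trace/determinant of its covariance matrix:

```python
def censoring_schemes(n, m):
    """All admissible plans (R1, ..., Rm): Ri >= 0 and R1 + ... + Rm = n - m."""
    def gen(k, rem):
        if k == 1:
            yield (rem,)
            return
        for r in range(rem + 1):
            for tail in gen(k - 1, rem - r):
                yield (r,) + tail
    return list(gen(m, n - m))


def best_scheme(n, m, criterion):
    """Exhaustive search: the admissible plan minimizing the given criterion."""
    return min(censoring_schemes(n, m), key=criterion)
```

For n = 5, m = 3 there are 6 admissible plans; with a toy criterion that penalizes late censoring, the search returns the one-step plan (2, 0, 0).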


... The idea of looking for optimal schemes with respect to particular criteria has received some interest in recent articles (cf. Burkschat et al., 2006; Hofmann et al., 2005). Balakrishnan and Aggarwala (2000, Ch. 10) address optimization of best linear unbiased estimators (BLUEs) in location-scale families. ...

... Optimal schemes are obtained for several distributions (e.g., normal, extreme value, and log-normal) via computational comparisons. Burkschat et al. (2006) take up their approach and explicitly derive optimal schemes for a generalized Pareto location-scale family (where sgn q denotes the sign of the shape parameter q ≠ 0). ...

... In the present article, the two-parameter set-up considered in Burkschat et al. (2006) is extended. In Sec. 2 it will be shown that known results can be generalized to a larger class of functionals of the covariance matrix, e.g., the φ_p-criteria (see Pukelsheim, 1993; Shah and Sinha, 1989). ...

Best linear unbiased estimation for parameters of a particular location-scale family based on progressively Type-II censored order statistics is considered, and optimal censoring schemes are determined. The φ_p-criteria from experimental design, applied to the covariance matrix of the BLUEs, serve as optimality criteria. The results are supplemented by monotonicity properties of the trace and the determinant with respect to the sample size and the initial number of items in the experiment.
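The φ_p-criteria mentioned here can be sketched as power means of the eigenvalues of the covariance matrix: p = 1 recovers the (scaled) trace criterion, and the limit p → 0 the determinant (geometric-mean) criterion. A small illustrative implementation, assuming a symmetric positive-definite input:

```python
import numpy as np

def phi_p(cov, p):
    """phi_p criterion: the p-th power mean of the eigenvalues of cov.

    p = 1  -> trace criterion (arithmetic mean of eigenvalues),
    p -> 0 -> determinant criterion (geometric mean of eigenvalues),
    p = -1 -> harmonic-mean criterion.
    """
    lam = np.linalg.eigvalsh(cov)      # eigenvalues of the symmetric matrix
    k = len(lam)
    if p == 0:                         # limiting case: geometric mean
        return float(np.prod(lam) ** (1.0 / k))
    return float(np.mean(lam ** float(p)) ** (1.0 / p))
```

For cov = diag(1, 4) this gives φ_1 = 2.5, φ_0 = 2.0, and φ_{-1} = 1.6; smaller values indicate a more precise pair of BLUEs.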

... As noted in [2], one of the basic properties of order statistics is that the fewer operating items are withdrawn, the shorter the expected total test time. Thus, once the experiment lasts beyond the expected time, the procedure will be accelerated. ...

... Using the general likelihood functions (1) and (2), together with the functions of the inverse Weibull distribution, the likelihood functions of the censored sample are given by ...

... With suitable transformations of the expression in the balanced loss function, all the estimators mentioned above, including maximum likelihood estimation and symmetric and asymmetric Bayes estimation, can be presented as special cases. When ρ denotes different functions, such as (φ − σ)², the balanced loss function describes squared-error estimation, the Linex loss function, and the general entropy loss function. ...

This paper discusses entropy estimation for two-parameter inverse Weibull distributions under adaptive Type-II progressive hybrid censoring schemes. Entropy estimates derived by the maximum likelihood method and the Bayes method are both considered. Different Bayes estimators using the squared-error loss function, Linex loss function, general entropy loss function, and balanced loss function are derived. Numerical results are obtained by Lindley's approximation method. In particular, the interval estimation of entropy is derived through the maximum likelihood method. To test the effectiveness of the estimates, simulation studies are conducted. These entropy estimation methods are illustrated and applied to analyze a real data set.

... , R*_m) such that ∼ n^{m−1}/(m − 1)!. According to Burkschat et al. [15], it is equivalent to consider optimality w.r.t. the set Γ_{m,n} = {(γ_1, ..., γ_m) ∈ ℕ^m | n = γ_1 > γ_2 > ⋯ > γ_m ≥ 1}, ...

... For some optimality criteria, Balakrishnan et al. [3] and Burkschat [12] noticed that one-step plans (OSPs) turn out to be optimal in many situations (see also [9,15,16]). In some cases, the following partial ordering on CS(m, n), due to Cramer [17], is useful to establish optimality of O(1) and O(m). For R = (R_1, ... ...

Fisher information about multiple parameters in a progressively Type-II censored sample is discussed. A representation of the Fisher information matrix in terms of the hazard rate of the baseline distribution is established which can be used for efficient computation of the Fisher information. This expression generalizes a result of Zheng and Park [On the Fisher information in multiply censored and progressively censored data, Comm. Statist. Theory Methods 33 (2004), pp. 1821–1835] for Fisher information about a single parameter. The result is applied to identify A- and D-optimal censoring plans in a progressively Type-II censored experiment. For illustration, extreme value, normal, and Lomax distributions are considered.
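Once the Fisher information matrix of a candidate plan is available, the A- and D-criteria used in the abstract reduce to simple matrix functionals. A hedged sketch; the diagonal matrices in the illustration are made-up stand-ins for the information of two competing plans:

```python
import numpy as np

def a_value(info):
    """A-criterion: trace of the inverse Fisher information (to be minimized)."""
    return float(np.trace(np.linalg.inv(info)))

def d_value(info):
    """D-criterion: determinant of the Fisher information (to be maximized)."""
    return float(np.linalg.det(info))
```

The two criteria need not agree: for I1 = diag(2, 2) and I2 = diag(5, 1), plan I1 is A-optimal while plan I2 is D-optimal.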

... The interested readers may refer to the book by Balakrishnan and Aggarwala [1]. See also, Balakrishnan et al. [3] and Burkschat et al. [4]. The rest of the paper is as follows: In Section 2, some preliminaries are presented. ...

... where Z_1 and Z_2 are as defined in (4). So, by some algebraic calculations, we get π = 1 − [1 − n(τ − μ_0)/(mσ_0)]^{1−m}. ...

The interval estimation of the survival function of the two-parameter exponential distribution on the basis of progressively Type-II censored samples is investigated. Toward this end, the concept of generalized confidence intervals (GCIs) is used, and the lower and upper generalized confidence limits (GCLs) are obtained. A simulation study shows that the coverage probabilities of the GCLs are satisfactory. Finally, some concluding remarks are presented.
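Checking coverage probabilities by simulation, as the abstract does for the GCLs, follows a generic recipe: repeatedly draw a sample, build the interval, and count how often it captures the true value. A minimal sketch using an ordinary normal-approximation interval in place of the paper's generalized limits:

```python
import math
import random
import statistics

def coverage(n=30, reps=2000, mu=0.0, sigma=1.0, z=1.96, seed=1):
    """Monte Carlo estimate of the coverage probability of the usual
    normal-approximation interval  mean +/- z * s / sqrt(n)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = [rng.gauss(mu, sigma) for _ in range(n)]
        m = statistics.fmean(x)
        half = z * statistics.stdev(x) / math.sqrt(n)
        hits += (m - half <= mu <= m + half)
    return hits / reps
```

With n = 30 and 2000 replications, the estimated coverage falls slightly short of the nominal 95%, as expected for a z-based interval with estimated standard deviation.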

... It allows the experimenter to remove units from a life test at various stages during the experiment, which may lead to a saving of costs and of time (see Cohen, 1963 and Sen, 1986). In such a random experiment, a group of n independent and identical experimental units is put on a life test at time zero with continuous, identically distributed failure times X_1, X_2, ..., X_n. ...
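The sampling model just described is easy to simulate for a standard exponential baseline, since the normalized spacings of a progressively Type-II censored sample are i.i.d. standard exponential. A sketch of this representation (for a general continuous F, each value x would then be transformed by F^{-1}(1 − e^{−x})):

```python
import random

def progressive_sample(R, seed=11):
    """Progressively Type-II censored sample from the standard exponential
    distribution, via the independent normalized-spacings representation."""
    m = len(R)
    n = m + sum(R)
    rng = random.Random(seed)
    x, t = [], 0.0
    for j in range(m):
        gamma = n - j - sum(R[:j])         # units still on test before failure j+1
        t += rng.expovariate(1.0) / gamma  # normalized spacing ~ Exp(1) / gamma
        x.append(t)
    return x
```

The returned failure times are increasing by construction, and the scheme R controls how many surviving units leave the test at each failure.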

... Here, the drop-outs of patients may be caused by migration, lack of interest or by personal or ethical decisions, and they are regarded as random withdrawals. For a detailed discussion of progressive censoring and the relevant developments in this area, one may refer to Sen (1986), Balakrishnan and Aggarwala (2000) and Aggarwala (2001). ...

In this paper we derive some general recurrence relations between moments of progressively Type-II right censored order statistics from a general class of doubly truncated distributions, thus unifying the earlier results in this direction due to several authors.

... The purpose of experimental design is to pick a censoring plan R from C_{m,n} (or from the set of restricted censoring schemes) which is best according to some optimality criterion. Initiated in Balakrishnan and Aggarwala (2000), plenty of such criteria have been discussed in the literature, including minimum expected test duration, minimum variance of estimators (of parameters and quantiles), maximum Fisher information, and minimum entropy (see, e.g., Ng et al. (2004), Burkschat et al. (2006, 2007), Abo-Eleneen (2007), Burkschat (2008), Kundu and Pradhan (2009), Kundu (2009, 2013)). A recent review of these approaches as well as a survey of available results is provided in Balakrishnan and Cramer (2014, Chapter 26). ...

... Further results can be obtained for the variances of the above random variables. Applying the idea of Balakrishnan and Aggarwala (2000), Burkschat et al. (2006, 2007) considered the precision of BLUEs as a criterion for optimality of censoring plans in the family of generalized Pareto distributions. ...

Unlike traditional Type I and II censoring, progressive censoring allows for removal of surviving units throughout the life test. This feature has several benefits and in the literature, much work has been done on inference based on progressively censored samples and identifying optimal progressive censoring schemes. In this paper, we introduce restricted progressive censoring and within this class, the problem of identifying optimal schemes under different minimum variance criteria. We explore these plans geometrically as well as provide some useful properties. In particular, we look at the vertices of the set of admissible plans and their role in approximating optimal plans. We also provide computational results for illustrative purposes.
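In the unrestricted case, the vertices discussed in this abstract correspond to the one-step plans, in which all n − m removals happen at a single failure stage. A small sketch enumerating them:

```python
def one_step_plan(n, m, k):
    """One-step plan O(k): all n - m removals occur at the k-th failure."""
    R = [0] * m
    R[k - 1] = n - m
    return tuple(R)

def vertices(n, m):
    """The one-step plans O(1), ..., O(m): vertices of the (unrestricted)
    set of admissible censoring plans."""
    return [one_step_plan(n, m, k) for k in range(1, m + 1)]
```

For example, vertices(6, 3) yields the three plans (3, 0, 0), (0, 3, 0), and (0, 0, 3); restricted classes of schemes, as studied in the paper, carve out a subset of the full simplex spanned by these points.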


... Most frequently, the one-step schemes R^(1) and R^(m) are optimal; see, e.g., Balakrishnan and Aggarwala (2000), Ng et al. (2002, 2004), Burkschat et al. (2006, 2007), and Balakrishnan et al. (2007). For the same family, Burkschat (2008) considered best linear equivariant estimation and the Löwner ordering on mean squared error matrices. ...

... Use the definitions of the entropy measures together with (6) and (2) to get (8) and (10). Then, substituting the corresponding quantities in (8) and (10) yields (9) and (11). ...

The main objective of this paper is to explore the suitability of some entropy-information measures for introducing a new optimality censoring criterion and to apply it to some censoring schemes from some underlying lifetime models. In addition, the paper investigates four related issues, namely: the effect of the parameter of the parent distribution on the optimal scheme; the equivalence of schemes based on Shannon and Awad sup-entropy measures; the conjecture that the optimal scheme is a one-stage scheme; and a conjecture by Cramer and Bagh (2011) about Shannon minimum and maximum schemes when the parent distribution is reflected power. Guidelines for designing an optimal censoring plan are reported together with theoretical and numerical results and illustrations.

... In practical situations, when conducting a life testing experiment, it is advisable to choose the optimum censoring scheme (OCS) out of all possible censoring schemes, i.e., the censoring scheme which will provide maximum information about the unknown parameters based on some scientific criterion. Burkschat et al. (2006) provided optimum censoring schemes based on minimizing the variance of the best linear unbiased estimator. Ng et al. (2004) obtained the optimum censoring scheme by minimizing the variance of maximum likelihood estimators for the Weibull distribution. ...

Progressive censoring schemes have received considerable attention recently. All of these developments are mainly based on a single population. Recently, Mondal and Kundu (2016, arXiv:1609.05805) introduced the balanced joint progressive censoring (BJPC) scheme and studied exact inference for two exponential populations. It is well known that the exponential distribution has some limitations. In this article, we implement the BJPC scheme on two Weibull populations with a common shape parameter. The treatment here is purely Bayesian in nature. Under the Bayesian set-up we assume a Beta-Gamma prior for the scale parameters and an independent Gamma prior for the common shape parameter. Under these prior assumptions, the Bayes estimators cannot be obtained in closed form, and we use the importance sampling technique to compute the Bayes estimators and the associated credible intervals. We further consider the order-restricted Bayesian inference of the parameters based on ordered Beta-Gamma priors for the scale parameters. We propose a precision criterion based on the expected volume of the joint credible set of model parameters to find the optimum censoring scheme. We perform extensive simulation experiments to study the performance of the estimators, and finally analyze one real data set for illustrative purposes.
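Self-normalized importance sampling, as used above for the Bayes estimators, can be sketched generically; the Gamma target and exponential proposal below are illustrative assumptions, not the paper's Beta-Gamma/Weibull setup:

```python
import math
import random

def posterior_mean_is(log_target, draw, log_proposal, n=20000, seed=7):
    """Self-normalized importance sampling estimate of E[theta] under an
    unnormalized target density exp(log_target)."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = draw(rng)
        w = math.exp(log_target(x) - log_proposal(x))  # importance weight
        num += w * x
        den += w
    return num / den

# Illustration: Gamma(shape=3, rate=2) target (mean 1.5), Exp(1) proposal.
est = posterior_mean_is(
    lambda x: 2.0 * math.log(x) - 2.0 * x,  # log of x^2 * exp(-2x), unnormalized
    lambda rng: rng.expovariate(1.0),       # proposal sampler
    lambda x: -x,                           # log density of Exp(1)
)
```

The estimate lands close to the true posterior mean 1.5; choosing a proposal with heavier tails than the target keeps the importance weights bounded.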

... Suppose that n independent units with identical cumulative distribution function (CDF) F(·) and probability density function (PDF) f(·) are placed on a lifetime experiment. Moreover, assume that at the first failure time (y_1), s_1 surviving experimental units are removed, and at the second failure time ... For more detail about Type-II progressive censoring, readers can refer to Balakrishnan and Aggarwala (2000), Balakrishnan (2007), Balakrishnan et al. (2008), Burkschat (2008), Burkschat et al. (2006), Cohen (1963), Cramer (2014), Cramer and Kamps (2001), and Herd (1956). Some valuable results can also be found in Ghitany et al. (2013, 2014), Kang and Seo (2011), Krishna and Kumar (2013), Pakyari and Balakrishnan (2013), Rezapour et al. (2013a, 2013b), and Seo and Kang (2014). ...

... When applying progressive Type-II right censoring in life-testing experiments, the selection of an optimal censoring scheme (R_1, ..., R_m) is an important issue, so there is a need for criteria that lead to optimal designs. Different optimality criteria have been discussed in the literature, among them variance optimality (for the one-parameter case) and trace and determinant optimality (for the multi-parameter case); these criteria aim to minimize the variance or the determinant of the variance-covariance matrix of the estimator under consideration (Balakrishnan and Aggarwala, 2000; Burkschat et al., 2006; Hofmann et al., 2005). ...

... In the literature there are several researchers who have treated the problem of selecting an optimal censoring/truncation sampling scheme. They used several optimality criteria; for example, see the work of Awad and Alawneh (1987), Burkschat et al. (2006, 2007), Hamed (2007), Abo-Eleneen (2008), Balakrishnan et al. (2008), Kittaneh (2008), and Haj Ahmad and Awad (2009, 2010). In the rest of this section we will introduce basic notation, terminology, and entropy-information measures that will be used in the paper, together with some preliminary results. ...

... Thus, n failures are observed and R 1 + · · · + R n items are progressively censored; hence N = n + R 1 + · · · + R n . For details on the model of progressive type II censoring we refer to Sen (1986) and to the recent publications of Viveros and Balakrishnan (1994), Sandhu (1995, 1996) and Balakrishnan (1996, 1998). Point and interval estimation as well as relations for single and product moments are presented in Balakrishnan et al. (1999). ...

... In recent years, there has been a lot of interest in finding the optimal censoring scheme in the statistical literature; for example, see Refs. [47–53]. Possible censoring schemes refer to any (R_1, ..., R_m) combination such that n = m + ∑ R_i, and finding the optimum sampling approach means locating the progressive censoring scheme that offers the most information about the unknown parameters among all conceivable progressive censoring schemes for fixed n and m. ...

It is extremely frequent for systems to fail in their demanding operating environments in many real-world contexts. When systems reach their lowest, highest, or both extreme operating conditions, they usually fail to perform their intended functions, something to which researchers have paid little attention. The goal of this paper is to develop inference for multi-reliability using unit alpha power exponential distributions for stress-strength variables based on progressive first-failure censoring. As a result, the problem of estimating the stress-strength function R, where X, Y, and Z come from three separate alpha power exponential distributions, is addressed. Conventional methods are examined: maximum likelihood for point estimation, and Bayesian, asymptotic, boot-p, and boot-t methods for interval estimation. Various confidence intervals have been obtained. Monte Carlo simulations and real-world application examples are used to evaluate and compare the performance of the various proposed estimators.

... It has received considerable attention in the literature; see, for example, [4,5,9,10,17,21,24]. It may be noted that finding the optimum censoring scheme depends upon the proper choice of an optimality criterion. ...

Hybrid censoring scheme is a combination of Type-I and Type-II censoring schemes. Determination of optimum hybrid censoring scheme is an important practical issue in designing life testing experiments to enhance the information on reliability of the product. In this work, we consider determination of optimum life testing plans under hybrid censoring scheme by minimizing the total cost associated with the experiment. It is shown that the proposed cost function is scale invariant for some selected distributions. Optimum solution cannot be obtained analytically. We propose a method for obtaining the optimum solution and consider Weibull distribution for illustration. We also studied the sensitivity of the optimal solution to the misspecification of parameter values and cost components through a well-designed sensitivity analysis. Copyright © 2013 John Wiley & Sons, Ltd.
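The optimization the abstract describes, minimizing a total cost over feasible designs, can be sketched as a grid search; the cost function in the example is hypothetical, not the paper's:

```python
def optimum_plan(n_range, m_range, cost):
    """Grid search over feasible (n, m) designs minimizing a user-supplied
    cost function; only designs with m <= n are admissible."""
    return min(((n, m) for n in n_range for m in m_range if m <= n),
               key=lambda nm: cost(*nm))
```

For instance, `optimum_plan(range(5, 11), range(2, 11), lambda n, m: n + 10.0 / m)` trades a per-unit cost against a precision penalty that decreases with the number of observed failures, and returns (5, 5). In practice the cost would include the expected test duration under the assumed lifetime model, as in the paper.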

... Progressive Type-II right censoring is a useful scheme in which a specific fraction of individuals at risk may be removed from the experiment at each of several ordered failure times (see [7]). The experimenter can remove units from a life test at various stages during the experiment, possibly resulting in a saving of costs and time (see [22]). A schematic illustration is depicted in Fig. 1, where x_{1,n}, x_{2,n}, ... ...

... Due to the computational complexity, they presented the optimal schemes only up to n = 50 and m = 3 (see Balakrishnan and Aggarwala, 2000, pages 197-204). Using the same optimality criterion, Burkschat et al. (2006, 2007) computed optimal censoring schemes for the generalized Pareto distribution. Some more choices of ψ, such as total time on test, expected test duration, and variance of the test time, were considered by Burkschat (2008). ...

... For fixed values of the sample and failure time sizes, scheme II, in which the censoring occurs after the first observed failure, gives more accurate results in terms of the MSEs and RABs than the other schemes; this coincides with Theorem 2.2 of Burkschat et al. [42]. (3) Results in CSs III and IV are close to each other. (4) The MCMC CRIs give more accurate results than the approximate CIs and bootstrap CIs, since the lengths of the former are less than those of the latter, for different sample sizes, observed failures, and schemes. ...

Accelerated life testing is widely used in product life testing experiments since it provides significant reduction in time and cost of testing. In this paper, assuming that the lifetime of items under use condition follow the two-parameter Pareto distribution of the second kind, partially accelerated life tests based on progressively Type-II censored samples are considered. The likelihood equations of the model parameters and the acceleration factor are reduced to a single nonlinear equation to be solved numerically to obtain the maximum-likelihood estimates (MLEs). Based on normal approximation to the asymptotic distribution of MLEs, the approximate confidence intervals (ACIs) for the parameters are derived. Two bootstrap CIs are also proposed. The classical Bayes estimates cannot be obtained in explicit form, so we propose to apply Markov chain Monte Carlo method to tackle this problem, which allows us to construct the credible interval of the involved parameters. Analysis of a simulated data set has also been presented for illustrative purposes. Finally, a Monte Carlo simulation study is carried out to investigate the precision of the Bayes estimates with MLEs and to compare the performance of different corresponding CIs considered.
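The bootstrap intervals proposed here follow the usual resampling recipe; a minimal percentile-bootstrap sketch for a generic statistic, with the sample mean standing in for the model's MLE:

```python
import random
import statistics

def boot_ci_mean(data, level=0.95, B=2000, seed=3):
    """Percentile bootstrap confidence interval for the mean: resample the
    data with replacement B times and take empirical quantiles of the
    resampled statistic."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(statistics.fmean(rng.choices(data, k=n)) for _ in range(B))
    lo_i = int((1 - level) / 2 * B)
    hi_i = int((1 + level) / 2 * B) - 1
    return means[lo_i], means[hi_i]
```

Replacing `statistics.fmean` with any estimator of interest (here, the MLE solved from the nonlinear likelihood equation) gives the corresponding boot-p interval.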

... In this paper, we consider the case of progressive Type-II censoring, a scheme in which a specific fraction of individuals at risk may be removed from the experiment at each of several ordered failure times (see Cohen [14]-[16], Sen [17], Balakrishnan and Cohen [18], Viveros and Balakrishnan [19], Balakrishnan and Sandhu [20], Balakrishnan and Aggarwala [21], Balakrishnan et al. [22], Balakrishnan and Lin [23], Fernandez [24], Asgharzadeh [25] and Wu et al. [26]). ...

Effective management and the assessment of quality performance of products is important in modern enterprises. Often, business performance is measured using the lifetime performance index C_L to evaluate the potential of a process, where L is a lower specification limit. In this paper the maximum likelihood estimator (MLE) of C_L is derived based on progressive Type-II sampling and assuming the Lomax distribution. The MLE of C_L is then utilized to develop a new hypothesis testing procedure for a given value of L. Moreover, we develop the Bayes estimator of C_L assuming the conjugate prior distribution and applying the squared-error loss function. The Bayes estimator of C_L is then utilized to develop a credible interval, again for given L. Finally, we propose a Bayesian test to assess the lifetime performance of products and give two examples and a Monte Carlo simulation to assess and compare the ML approach with the Bayes approach with respect to the lifetime performance index C_L.

... For this reason, we consider the case of progressive Type-II censoring, a useful scheme in which a specific fraction of individuals at risk may be removed from the experiment at each of several ordered failure times (see Cohen [1-3], Sen [4], Balakrishnan and Cohen [5], Viveros and Balakrishnan [6], Balakrishnan and Sandhu [7], Balakrishnan and Aggarwala [8], Balakrishnan et al. [9], Balakrishnan and Lin [10], Fernandez [11], Asgharzadeh [12], Wu et al. [13], Madi and Raqab [14] and Mahmoud et al. [15]). ...

In this paper, we investigate the problem of point and interval estimation of the parameters, reliability, and hazard functions for distributions having a power hazard function when a sample is available from a progressive Type-II censoring scheme. The maximum likelihood, Bayes, and parametric bootstrap methods are used for estimating the unknown parameters as well as some lifetime parameters (reliability and hazard functions). Based on the asymptotic normality of the maximum likelihood estimators, the approximate confidence intervals (ACIs) are obtained. Moreover, in order to construct the asymptotic confidence intervals of the reliability and hazard functions, we need the variances of the reliability and hazard functions, which are approximated by the delta and parametric bootstrap methods. The Markov chain Monte Carlo (MCMC) technique is used to compute the Bayes estimates of the parameters; a Metropolis-Hastings-within-Gibbs algorithm has been applied to generate MCMC samples from the posterior density function. Based on the generated samples, the Bayes estimates and highest posterior density credible intervals of the unknown parameters as well as the reliability and hazard functions have been computed. The results of the Bayes method are obtained under both the balanced squared error (bSE) loss and balanced linear-exponential (bLINEX) loss. Finally, a numerical example using a real data set is provided to illustrate the proposed estimation methods.
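The delta-method approximation used above for the variances of the reliability and hazard functions has a compact generic form: Var(g(θ̂)) ≈ g′(θ̂)² Var(θ̂). A sketch with a numerical derivative; the exponential reliability in the illustration is an assumed stand-in, not the paper's power-hazard model:

```python
import math

def delta_se(g, theta_hat, var_theta, h=1e-6):
    """Delta-method standard error of g(theta_hat): |g'(theta_hat)| times the
    standard error of theta_hat, with g' from a central difference."""
    dg = (g(theta_hat + h) - g(theta_hat - h)) / (2 * h)
    return abs(dg) * math.sqrt(var_theta)
```

For example, for exponential reliability R(t) = exp(-t/θ) at t = 1 with θ̂ = 2 and Var(θ̂) = 0.1, the result matches the analytic value (t/θ²)·e^{-t/θ}·√Var(θ̂).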

... 197-204). Burkschat et al. (2006, 2007) used the same utility function and obtained the optimum solution for the generalized Pareto distribution. Further choices of ψ, such as expected test duration, total time on test, and variance of the test time, are also discussed in Burkschat (2008). ...

In the determination of an optimum Type-II progressive censoring scheme, the experimenter needs to carry out an exhaustive search within the set of all admissible censoring schemes. The existing recommendations are only applicable for small sample sizes, and the implementation of exhaustive search techniques for large sample sizes is not feasible in practice. In this article, a meta-heuristic algorithm based on the variable neighborhood search approach is proposed for large sample sizes. It is found that the algorithm gives exactly the same solution for small sample sizes as that obtained by an exhaustive search; for large sample sizes, it gives a near-optimum solution. We propose a cost function-based optimum criterion which is scale invariant for location-scale and log-location-scale families of distributions. A sensitivity analysis is also considered to study the effect of misspecification of parameter values or cost coefficients on the optimum solution.
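The neighborhood moves underlying such a search over censoring schemes can be sketched as transfers of single censored units between stages; the greedy descent below is a simplified stand-in for the paper's variable neighborhood search, not its actual algorithm:

```python
def neighbors(R):
    """Schemes reachable by moving one censored unit between two stages."""
    m = len(R)
    for i in range(m):
        if R[i] == 0:
            continue
        for j in range(m):
            if i == j:
                continue
            S = list(R)
            S[i] -= 1
            S[j] += 1
            yield tuple(S)

def descend(R, criterion):
    """Greedy descent: repeatedly move to the best improving neighbor."""
    best, val = tuple(R), criterion(R)
    improved = True
    while improved:
        improved = False
        for S in neighbors(best):
            v = criterion(S)
            if v < val:
                best, val, improved = S, v, True
    return best
```

On small instances this reproduces the exhaustive-search optimum; a full VNS would additionally randomize restarts across nested neighborhood structures to escape local minima.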

... • With an increase of the effective sample size (n, m), the ABs and MSEs of both MLEs and BEs decrease as expected, which implies the consistency of the associated estimates as sample sizes increase. • For fixed CSs, the BEs of the parameters α, β and λ perform better than the conventional MLEs in most cases in terms of ABs and MSEs. • For fixed sample sizes n, m, the CSs in which the censoring occurs after the first observed failure give more accurate results in terms of ABs and MSEs than the others; this coincides with the results of Burkschat et al. [8]. • From the sample-size perspective, it is seen that the proposed CSs [1] and [2] have smaller sample sizes than those of the other CSs, which also indicates a relatively large disparity between the quantities derived from the MLEs in Figure 1. ...

This paper presents methods of estimation of the parameters and acceleration factor for the Nadarajah-Haghighi distribution based on constant-stress partially accelerated life tests. Based on progressive Type-II censoring, maximum likelihood and Bayes estimates of the model parameters and acceleration factor are established, respectively. In addition, approximate confidence intervals are constructed via the asymptotic variance and covariance matrix, and Bayesian credible intervals are obtained based on an importance sampling procedure. For comparison purposes, alternative bootstrap confidence intervals for the unknown parameters and acceleration factor are also presented. Finally, extensive simulation studies are conducted to investigate the performance of our results, and two data sets are analyzed to show the applicability of the proposed methods.

... In the literature there are several researchers who have treated the problem of selecting an optimal censoring/truncation sampling scheme. They used several optimality criteria; for example, see the work of Awad and Alawneh (1987), Azzam and Awad (1997), Balakrishnan and Aggarwala (2000), Ng et al. (2004), Zheng and Park (2004), Hattab (2005), Aich (2006), Burkschat et al. (2006, 2007), Hamed (2007), Abo-Eleneen (2008), Balakrishnan et al. (2008), Kittaneh (2008), and Haj Ahmad and Awad (2009, 2010). ...

... In recent years, there has been much interest in finding the optimal censoring scheme in the statistical literature; for example, see [30–33]. For fixed n and m, the possible censoring schemes are all (R_1, ..., R_m) combinations such that n = m + R_1 + ⋯ + R_m, and choosing the best sampling technique entails finding the progressive censoring scheme that provides the most information about the unknown parameters among all conceivable progressive censoring schemes. ...

In this study, the estimation of the unknown parameters of an alpha power Weibull (APW) distribution using the concept of an optimal strategy for the step-stress accelerated life testing (SSALT) is investigated from both classical and Bayesian viewpoints. We used progressive type-II censoring and accelerated life testing to reduce testing time and costs, and we used a cumulative exposure model to examine the impact of various stress levels. A log-linear relation between the scale parameter of the APW distribution and the stress model has been proposed. Maximum likelihood estimators for model parameters, as well as approximation and bootstrap confidence intervals (CIs), were calculated. Bayesian estimation of the parameter model was obtained under symmetric and asymmetric loss functions. An optimal test plan was created under typical operating conditions by minimizing the asymptotic variance (AV) of the percentile life. The simulation study is discussed to demonstrate the model's optimality. In addition, real-world data are evaluated to demonstrate the model's versatility.

... In this section, we determine the amount of FI about θ contained in the data set B consisting of the first m progressively Type-II censored order statistics. Toward this end, we assume that n = 8, m = 4 and consider four n-tuples of proportionality rates: λ_1 = (1, 1, 1, 1, 1, 1, 1, 1), which corresponds to the case of independent and identically distributed random variables with We(θ, 1) distribution; λ_2 = (1, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3); λ_3 = (1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5); and λ_4 = (1, 3, 5, 7, 9, 11, 13, 15). Using (29), the numerical values are reported in Table 3 for some choices of progressive censoring plans R, and it is deduced that: ...

Suppose that the failure times of the units placed on a life-testing experiment are independent but nonidentically distributed random variables. Under progressively type II censoring scheme, distributional properties of the proposed random variables are presented and some inferences are made. Assuming that the random variables come from a proportional hazard rate model, the formulas are simplified and also the amount of Fisher information about the common parameters of this family is calculated. The results are also extended to a fixed covariates model. The performance of the proposed procedure is investigated via a real data set. Some numerical computations are also presented to study the effect of the proportionality rates in view of the Fisher information criterion. Finally, some concluding remarks are stated.

... In this paper, we consider the case of progressive Type-II censoring, a useful scheme in which a specific fraction of individuals at risk may be removed from the experiment at each of several ordered failure times; see Cohen [9,10], Sen [29], Balakrishnan and Cohen [5], Viveros and Balakrishnan [30], Balakrishnan and Sandhu [7], Balakrishnan and Aggarwala [4], Balakrishnan et al. [8], Balakrishnan and Lin [6], Fernandez [12], Asgharzadeh [3] and Wu et al. [31]. ...

Health-related quality of life has not been adequately measured in bladder cancer; a recently developed, reliable and disease-specific quality-of-life instrument, the Bladder Cancer Index (BCI), was used for measurement. Progressive Type-II censoring schemes have potential usefulness in practice where budget constraints are in place or there is a necessity for a speedy test. To test process capability, the lifetime performance index C_L is widely recommended for evaluating the performance of the product's lifetime; here C_L is evaluated for the three-parameter Weighted-Lomax (WLx) distribution under a progressive Type-II censored sample for a lower specification limit L. The statistical inference concerning C_L is conducted by obtaining the maximum likelihood estimator of C_L on the basis of progressive Type-II censoring. The asymptotic normal distribution of the MLE of C_L and the confidence interval are proposed. Moreover, hypothesis testing of C_L for evaluating the lifetime performance of WLx data is conducted. Also, assuming the conjugate prior distribution and squared-error loss function, this study constructs a Bayes estimator of C_L. The Bayes estimator of C_L is then utilized to develop a credible interval under the condition of known L. Moreover, we propose a Bayesian test to assess the lifetime performance of products. Finally, two examples are given: one considers real lifetime data on the remission times of bladder cancer patients in an endurance lifetime test, and the other is a simulated example to illustrate the usage of the proposed procedure.

... Their computational results show that, according to this criterion, optimal censoring schemes are not necessarily one-step schemes. Burkschat, Cramer, and Kamps (2006) studied optimality in the case of the generalized Pareto distribution, in terms of the determinant and trace of the variance-covariance matrix of the BLUEs. These criteria depend on the scheme parameters only. ...

Finding the optimal censoring scheme is a discrete optimization problem in the space of schemes. Under the entropy criterion, we examine optimal censoring schemes, preferring the choice of one-step censoring schemes as suggested by Balakrishnan. Exact one-step optimal schemes under the entropy criterion for distributions with decreasing failure rate were specified by Cramer and Bagh. We consider distributions with increasing, right-tailed, and bathtub failure rates, compare the entropy of the one-step censoring schemes with that of the optimal ones, and observe that the loss in entropy is negligible.
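As a rough illustration of the search space involved (our own sketch, not code from the cited paper): a progressive Type-II censoring scheme with n units and m observed failures is a vector (R_1, ..., R_m) of nonnegative integers summing to n - m, and a one-step scheme places all n - m removals at a single position. A minimal Python enumeration:

```python
from math import comb

def all_schemes(n, m):
    """Enumerate all progressive Type-II censoring schemes
    (R_1, ..., R_m): nonnegative integers summing to n - m."""
    def rec(slots, total):
        if slots == 1:
            yield (total,)
            return
        for r in range(total + 1):
            for rest in rec(slots - 1, total - r):
                yield (r,) + rest
    yield from rec(m, n - m)

def one_step_schemes(n, m):
    """The m one-step schemes: all n - m removals at one position."""
    for k in range(m):
        scheme = [0] * m
        scheme[k] = n - m
        yield tuple(scheme)

n, m = 8, 3
schemes = list(all_schemes(n, m))   # C(n - 1, m - 1) = C(7, 2) = 21 schemes
```

The full space has C(n - 1, m - 1) elements and grows quickly with n and m, which is why restricting the search to the m one-step schemes is computationally attractive.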

... Many optimality criteria have been proposed, and many results on optimal censoring designs have been established. In particular, given some optimality criterion, Burkschat et al. [10,11] obtained optimal censoring designs in terms of minimum variance of best linear unbiased estimators. Ng et al. [18] considered minimum variance criteria for maximum likelihood estimates for Weibull distributions. ...

Selection of optimal progressive censoring schemes for the normal distribution is discussed according to maximum likelihood estimation and best linear unbiased estimation. The selection is based on the variances of the estimators of the two parameters of the normal distribution. The extreme left censoring scheme is shown to be an optimal progressive censoring scheme, while the usual Type-II right censoring case is shown to be the worst progressive censoring scheme for estimating the scale parameter, as it can greatly increase the variance of the estimators.

... For fixed values of the sample and failure-time sizes, Scheme II, in which the censoring occurs after the first observed failure, gives more accurate results in terms of the MSEs and RABs than the other schemes, and this coincides with Theorem [2.2] of [54]. ...

The main purpose of this paper is to obtain the inference of parameters of heterogeneous population represented by finite mixture of two Pareto (MTP) distributions of the second kind. The constant-partially accelerated life tests are applied based on progressively type-II censored samples. The maximum likelihood estimates (MLEs) for the considered parameters are obtained by solving the likelihood equations of the model parameters numerically. The Bayes estimators are obtained by using Markov chain Monte Carlo algorithm under the balanced squared error loss function. Based on Monte Carlo simulation, Bayes estimators are compared with their corresponding maximum likelihood estimators. The two-sample prediction technique is considered to derive Bayesian prediction bounds for future order statistics based on progressively type-II censored informative samples obtained from constant-partially accelerated life testing models. The informative and future samples are assumed to be obtained from the same population. The coverage probabilities and the average interval lengths of the confidence intervals are computed via a Monte Carlo simulation to investigate the procedure of the prediction intervals. Analysis of a simulated data set has also been presented for illustrative purposes. Finally, comparisons are made between Bayesian and maximum likelihood estimators via a Monte Carlo simulation study.

... Due to the computational complexity, they presented the optimal schemes only up to n = 50 and m = 3 (see Balakrishnan and Aggarwala, 2000, pages 197-204). Using the same optimality criterion, Burkschat et al. (2006, 2007) computed optimal censoring schemes for the generalized Pareto distribution. Some more choices of ψ, such as total time on test, expected test duration and variance of the test time, were considered by Burkschat (2008). ...

We present here a simple probabilistic approach for determining an optimal progressive censoring scheme by defining a probability structure on the set of feasible solutions. Given an initial solution, the new updated solution is computed within the probabilistic structure. This approach will be especially useful when the cardinality of the set of feasible solutions is large. The validation of the proposed approach is demonstrated by comparing the optimal scheme with these obtained by exhaustive numerical search.
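The authors' probability structure and update rule are not reproduced here, but a minimal sketch of a randomized search of this flavor (function names and the toy criterion are our illustrative assumptions) proposes a neighbour by moving one removed unit to another position and keeps it when it improves a user-supplied criterion:

```python
import random

def neighbour(scheme, rng):
    """Move one removed unit from a random nonzero slot to another slot."""
    s = list(scheme)
    src = rng.choice([i for i, r in enumerate(s) if r > 0])
    dst = rng.choice([i for i in range(len(s)) if i != src])
    s[src] -= 1
    s[dst] += 1
    return tuple(s)

def random_search(criterion, n, m, steps=2000, seed=0):
    """Randomized descent over schemes (R_1, ..., R_m) with sum n - m."""
    rng = random.Random(seed)
    best = (n - m,) + (0,) * (m - 1)        # start: all removals up front
    best_val = criterion(best)
    for _ in range(steps):
        cand = neighbour(best, rng)
        val = criterion(cand)
        if val < best_val:                  # keep only improving proposals
            best, best_val = cand, val
    return best, best_val

# Toy criterion (illustrative only): penalize early removals.
toy = lambda s: sum(r * (len(s) - i) for i, r in enumerate(s))
best, val = random_search(toy, n=10, m=4)   # drifts toward (0, 0, 0, 6)
```

Under this toy criterion the unique optimum places all removals at the last failure; a real application would plug in, e.g., the trace or determinant of the covariance matrix of the BLUEs as the criterion.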

... The first and the third assume that censoring occurs only at the first and the last observed failure, while the second scheme assumes censoring occurs only at the middle observed failure. It can be noticed from the simulation study that the bootstrap CIs give more accurate results than the approximate CIs, since the lengths of the former are less than those of the latter, for different sample sizes, observed failures and schemes [18]. ...

In this paper, we consider the estimation problem in the case of constant-partially accelerated life tests using progressively Type-II censored samples. The lifetime of items under use conditions follows the two-parameter linear exponential distribution. The maximum likelihood estimates of the parameters are obtained numerically. Approximate confidence intervals for the parameters, based on the normal approximation to the asymptotic distribution of the maximum likelihood estimators, as well as studentized-t and percentile bootstrap confidence intervals, are derived. A Monte Carlo simulation study is carried out to investigate the precision of the maximum likelihood estimators and to compare the performance of the confidence intervals considered. Finally, two examples are presented to illustrate our results, followed by conclusions.

... For fixed values of the sample and failure-time sizes, Scheme II, in which the censoring occurs after the first observed failure, gives more accurate results in terms of the MSEs and RABs than the other schemes, and this coincides with Theorem [2.2] of Burkschat et al. [42]. (3) Results in the CSs III and IV are close to each other. (4) The MCMC CRIs give more accurate results than the approximate CIs and bootstrap CIs, since the lengths of the former are less than those of the latter, for different sample sizes, observed failures and schemes. ...

... This includes likelihood, linear, and Bayesian inference, various kinds of statistical intervals, prediction of future or censored lifetimes, and goodness-of-fit tests. Optimal experimental design of progressively censored life tests has been addressed in, for example, [Refs. 18, 4, Chapter 10, and 19] (see also [Refs. 5, Chapter 26] and the references cited therein). While progressive type-II censoring often leads to tractable expressions, the probabilistic and statistical analysis of progressively type-I censored data turns out to be complicated in most cases (see [Ref. ...

Progressive censoring has received great attention in the last decades especially in life testing and reliability. This article highlights fundamental applications as well as probabilistic and inferential results on progressive censoring.

In the design of constant-stress life-testing experiments, the optimal allocation in a multi-level stress test with Type-I or Type-II censoring based on the Weibull regression model has been studied in the literature. Conventional Type-I and Type-II censoring schemes restrict our ability to observe extreme failures in the experiment and these extreme failures are important in the estimation of upper quantiles and understanding of the tail behaviors of the lifetime distribution. For this reason, we propose the use of progressive extremal censoring at each stress level, whereas the conventional Type-II censoring is a special case. The proposed experimental scheme allows some extreme failures to be observed. The maximum likelihood estimators of the model parameters, the Fisher information, and asymptotic variance–covariance matrices of the maximum likelihood estimates are derived. We consider the optimal experimental planning problem by looking at four different optimality criteria. To avoid the computational burden in searching for the optimal allocation, a simple search procedure is suggested. Optimal allocation of units for two- and four-stress-level situations is determined numerically. The asymptotic Fisher information matrix and the asymptotic optimal allocation problem are also studied and the results are compared with optimal allocations with specified sample sizes. Finally, conclusions and some practical recommendations are provided.

Selecting the optimal progressive censoring scheme for the exponential distribution according to the Pitman closeness criterion is discussed. For small sample sizes the Pitman closeness probabilities are calculated explicitly, and it is shown that the optimal progressive censoring scheme is the usual Type-II right censoring case. It is conjectured that this is the case for all sample sizes. A general algorithm is also presented for the numerical computation of the Pitman closeness probabilities between any two progressive censoring schemes of the same size.

We study a competing risks model using Gompertz distribution under progressive Type-II censoring when probability distributions of failure causes are identically distributed with common scale and different shape parameters. Maximum likelihood estimates (MLEs) of these parameters are obtained and their uniqueness and existence behavior are also discussed. The asymptotic intervals are derived from the observed Fisher information matrix. We compare the performance of all the estimators numerically using simulations. Analysis of a real data set is presented as well. We further determine optimal censoring scheme using expected Fisher information matrix. The design parameters are selected based on suitable measures like cost-based and variance-based criteria functions. Finally, we discuss single and multi-objective optimization approaches to find optimal censoring schemes.

In this article, optimal progressive censoring schemes are examined for the nonparametric confidence intervals of population quantiles. The results obtained can be universally applied to any continuous probability distribution. By using the interval mass as an optimality criterion, the optimization process is free of the actual observed values from the sample and needs only the initial sample size n and the number of complete failures m. Using several sample sizes combined with various degrees of censoring, the results of the optimization are presented here for the population median at selected levels of confidence (99, 95, and 90%). With the optimality criterion under consideration, the efficiencies of the worst progressive Type-II censoring scheme and ordinary Type-II censoring scheme are also examined in comparison to the best censoring scheme obtained for fixed n and m.

This article considers the determination of optimum settings of design parameters of a life testing experiment under progressive censoring scheme by minimizing the total cost associated with the experiment. It is shown that the proposed cost function is scale invariant for location-scale and log-location-scale families of distribution. This is a discrete optimization problem, in which the optimum solution cannot be obtained analytically. However, a meta-heuristic algorithm based on variable neighborhood search approach is proposed to obtain optimum censoring scheme. Finally, a well-planned sensitivity analysis is done in order to study the effect of mis-specification of parameter values and cost components on the optimum solution.

The aim of this paper is twofold. First we discuss the maximum likelihood estimators of the unknown parameters of a two-parameter Birnbaum–Saunders distribution when the data are progressively Type-II censored. The maximum likelihood estimators are obtained using the EM algorithm by exploiting the property that the Birnbaum–Saunders distribution can be expressed as an equal mixture of an inverse Gaussian distribution and its reciprocal. From the proposed EM algorithm, the observed information matrix can be obtained quite easily, which can be used to construct the asymptotic confidence intervals. We perform the analysis of two real and one simulated data sets for illustrative purposes, and the performances are quite satisfactory. We further propose the use of different criteria to compare two different sampling schemes, and then find the optimal sampling scheme for a given criterion. It is observed that finding the optimal censoring scheme is a discrete optimization problem, and it is quite a computer intensive process. We examine one sub-optimal censoring scheme by restricting the choice of censoring schemes to one-step censoring schemes as suggested by Balakrishnan (2007), which can be obtained quite easily. We compare the performances of the sub-optimal censoring schemes with the optimal ones, and observe that the loss of information is quite insignificant.

Process capability analysis has been widely applied in the field of quality control to monitor the performance of industrial processes. In practice, the lifetime performance index CL is a popular means to assess the performance and potential of a process, where L is the lower specification limit. This study applies large-sample theory to construct a maximum likelihood estimator (MLE) of CL under the progressive first-failure-censored sampling plan for the Weibull distribution. The MLE of CL is then utilized to develop a new hypothesis testing procedure in the condition of known L.

In this paper we consider the Bayesian inference of the unknown parameters of progressively censored competing risks data, when the lifetime distributions are Weibull. It is assumed that the latent causes of failure have independent Weibull distributions with a common shape parameter but different scale parameters. It is further assumed that the shape parameter has a log-concave prior density function and that, given the shape parameter, the scale parameters have Beta-Dirichlet priors. When the common shape parameter is known, the Bayes estimates of the scale parameters have closed-form expressions, but when the common shape parameter is unknown, the Bayes estimates do not have explicit expressions. In this case we propose to use MCMC samples to compute the Bayes estimates and highest posterior density (HPD) credible intervals. Monte Carlo simulations are performed to investigate the performances of the estimators. Two data sets are analyzed for illustration. Finally, we provide a methodology to compare two different censoring schemes and thus find the optimum Bayesian censoring scheme.

An accurate procedure is proposed to calculate approximate moments of progressive order statistics in the context of statistical inference for lifetime models. The study analyses the performance of power series expansion to approximate the moments for location and scale distributions with high precision and smaller deviations with respect to the exact values. A comparative analysis between exact and approximate methods is shown using some tables and figures. The different approximations are applied in two situations. First, we consider the problem of computing the large sample variance–covariance matrix of maximum likelihood estimators. We also use the approximations to obtain progressively censored sampling plans for log-normal distributed data. These problems illustrate that the presented procedure is highly useful to compute the moments with precision for numerous censoring patterns and, in many cases, is the only valid method because the exact calculation may not be applicable.

The progressive censoring scheme has received considerable amount of attention in the last fifteen years. During the last few years joint progressive censoring scheme has gained some popularity. Recently, the authors Mondal and Kundu ("A new two sample Type-II progressive censoring scheme", arXiv:1609.05805) introduced a balanced two sample Type-II progressive censoring scheme and provided the exact inference when the two populations are exponentially distributed. In this article we consider the case when the two populations follow Weibull distributions with the common shape parameter and different scale parameters. We obtain the maximum likelihood estimators of the unknown parameters. It is observed that the maximum likelihood estimators cannot be obtained in explicit forms, hence, we propose approximate maximum likelihood estimators, which can be obtained in explicit forms. We construct the asymptotic and bootstrap confidence intervals of the population parameters. Further we derive an exact joint confidence region of the unknown parameters. We propose an objective function based on the expected volume of this confidence set and using that we obtain the optimum progressive censoring scheme. Extensive simulations have been performed to see the performances of the proposed method, and one real data set has been analyzed for illustrative purposes.

Linear inference for progressively Type-II censored order statistics is discussed for location, scale, and location-scale families of population distributions. After a general introduction, results for exponential, generalized Pareto, extreme value, Weibull, Laplace, and logistic distributions are presented in detail.

We consider a general progressively Type-II censored life test where the life time distribution of each test unit belongs to a scale family. We derive an exact confidence interval for the scale parameter. Using Monte Carlo simulation method, we assess the expected lower and upper limits of the proposed confidence interval for the exponential distribution. Finally, we present a numerical example to illustrate the proposed procedure.
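Monte Carlo assessments of this kind require simulating progressively Type-II censored samples; a standard generator is the uniform-sample algorithm of Balakrishnan and Sandhu (1995), sketched here in Python (function names are ours):

```python
import random

def progressive_sample(scheme, quantile, rng):
    """Progressively Type-II censored sample via the uniform-sample
    algorithm of Balakrishnan and Sandhu (1995).
    scheme   -- censoring numbers (R_1, ..., R_m); n = m + sum(scheme)
    quantile -- inverse cdf F^{-1} of the lifetime distribution
    """
    m = len(scheme)
    w = [rng.random() for _ in range(m)]
    # V_i = W_i^{1 / (i + R_m + ... + R_{m-i+1})}
    v = []
    for i in range(1, m + 1):
        expo = i + sum(scheme[m - i:])
        v.append(w[i - 1] ** (1.0 / expo))
    # U_{i:m:n} = 1 - prod_{j=m-i+1}^{m} V_j, increasing in i
    u, prod = [], 1.0
    for i in range(1, m + 1):
        prod *= v[m - i]
        u.append(1.0 - prod)
    return [quantile(p) for p in u]
```

For example, passing quantile = lambda p: -theta * math.log(1.0 - p) yields an exponential sample with scale theta, which can feed the Monte Carlo evaluation of confidence limits described above.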

This work considers optimum design of a life testing experiment with progressive type I interval censoring. A cost minimization-based optimality criterion is proposed. The proposed cost function incorporates the cost of conducting the experiment, opportunity cost, and post-sale cost. It is shown that the proposed cost function is scale invariant for any lifetime distribution whose support does not depend on the parameters of the distribution. Weibull distribution is considered for illustration. Optimum solution is obtained by a suitable numerical method. A sensitivity analysis is undertaken to study the effect of small perturbations in lifetime model parameter values or cost coefficients.

The problem of an optimal censoring plan in progressive Type-II censoring is discussed for several criteria including minimum experimental time, maximum Fisher information, minimum variance of estimates, as well as further criteria proposed in the literature.

In the model of progressive type II censoring, point and interval estimation as well as relations for single and product moments are considered. Based on two-parameter exponential distributions, maximum likelihood estimators (MLEs), uniformly minimum variance unbiased estimators (UMVUEs) and best linear unbiased estimators (BLUEs) are derived for both location and scale parameters. Some properties of these estimators are shown. Moreover, results for single and product moments of progressive type II censored order statistics are presented to obtain recurrence relations for exponential and truncated exponential distributions. These relations may then be used to compute all the means, variances and covariances of progressive type II censored order statistics based on exponential distributions for arbitrary censoring schemes. The presented recurrence relations simplify those given by Aggarwala and Balakrishnan (1996).

By assuming a general progressive Type-II censored sample, we derive the best linear unbiased estimators (BLUE’s) for the parameters of one- and two-parameter exponential distributions. For the latter, we also derive the maximum likelihood estimators and show that they are simply the BLUE’s adjusted for their bias. An example is given to illustrate the methods of estimation discussed in this paper.
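For the one-parameter exponential distribution, the BLUE of the scale parameter coincides with the MLE and has a closed form: with censoring scheme (R_1, ..., R_m) and observed failure times x_1 < ... < x_m, the estimator is the total time on test divided by m, i.e. (1/m) * sum_i (R_i + 1) x_i. A small Python sketch of this standard result (our own code, not taken from the cited paper):

```python
def blue_exponential_scale(scheme, x):
    """BLUE (= MLE) of the exponential scale parameter from a
    progressively Type-II censored sample x_1 < ... < x_m with
    censoring numbers R_1, ..., R_m: total time on test / m."""
    m = len(x)
    ttt = sum((r + 1) * xi for r, xi in zip(scheme, x))
    return ttt / m

def normalized_spacings(scheme, x):
    """Z_i = gamma_i * (x_i - x_{i-1}), gamma_i = sum_{j>=i} (R_j + 1);
    under the exponential model the Z_i are i.i.d. exponential."""
    z, prev = [], 0.0
    for i in range(len(x)):
        gamma_i = sum(r + 1 for r in scheme[i:])
        z.append(gamma_i * (x[i] - prev))
        prev = x[i]
    return z

scheme = (2, 0, 1, 0)        # n = 4 + 3 = 7 units, m = 4 observed failures
x = (0.3, 0.8, 1.1, 2.0)     # illustrative failure times
theta_hat = blue_exponential_scale(scheme, x)   # = 5.9 / 4 = 1.475
```

The identity sum_i (R_i + 1) x_i = sum_i Z_i, with the normalized spacings Z_i i.i.d. exponential with scale theta, is what makes the estimator unbiased with variance theta^2 / m.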

This article presents progressively censored variable sampling plans for the Weibull distribution. Approximate maximum likelihood estimators are developed for estimating the parameters of interest. In the construction of sampling plans, asymptotic distribution theory is used to determine the sample size and the acceptance constant. Sampling plans are tabulated for selected progressive censoring patterns and specifications, for demonstration and comparison. A Monte Carlo experiment, conducted to investigate the accuracy of the asymptotic normal approximation, has shown that the procedure is sufficiently accurate for practical purposes. An example, based on data reported by Montanari and Cacciari from progressively censored aging tests on XLPE-insulated cable models, is given for illustration.

A conditional method of inference is used to derive exact confidence intervals for several life characteristics such as location, scale, quantiles, and reliability when the data are Type II progressively censored. The method is shown to be feasible and practical, although a computer program may be required for its implementation. The method is applied for the purpose of illustration to the extreme-value and the one- and two-parameter exponential models. Prediction limits for the lifelength of future units are also discussed. An example consisting of data from an accelerated test on insulating fluid reported by Nelson is used for illustration and comparison.

This paper is an extension of previous work by the writer concerning progressively censored sampling in the normal distribution [4] and in the Weibull distribution [6]. Here, local maximum likelihood estimators and estimators which utilize the first order statistic are derived for the three-parameter log-normal distribution when samples are progressively censored. An illustrative example involving life test data is included. Various properties of the proposed estimators are investigated.

In life and dosage-response studies, progressively censored samples arise when at various stages of an experiment, some though not all of the surviving sample specimens are eliminated from further observation. The sample specimens remaining after each stage of censoring are continued under observation until failure or until a subsequent stage of censoring. In this paper maximum likelihood estimators of the distribution parameters are derived for the normal, and for the exponential distribution when samples are progressively censored.

Best linear invariant estimators of log reliable life are derived for a model in which failure times have a two-parameter Weibull distribution and removal of some surviving items from life test is allowed at the time of any failure. Weights for obtaining best linear invariant estimates under this model are given in tabular form for all censoring patterns for sample sizes 2 through 6.

Linear estimation and prediction based on several samples of generalized order statistics from generalized Pareto distributions is considered. Representations of best linear unbiased estimators (BLUEs) and best linear equivariant estimators in location-scale families are derived, as well as corresponding optimal linear predictors. Moreover, we study positivity of the linear estimators of the scale parameter. An example illustrates that the BLUE may attain negative values with positive probability in certain situations.

Upper and lower bounds for moments of progressively Type II censored order statistics in terms of moments of (progressively Type II censored) order statistics are derived. In particular, this yields conditions for the existence of moments of progressively Type II censored order statistics based on an absolutely continuous distribution function.

Relations are derived for expectations of functions of generalized order statistics within a class of distributions including a variety of identities for single and product moments of ordinary order statistics and record values as particular cases. Since several models of ordered random variables are contained in the concept of generalized order statistics and since there are no restrictions imposed on their parameters, the identities can be applied to all of these models with their different interpretations.

Pukelsheim, F., 1993. Optimal Design of Experiments. Wiley Series in Probability and Mathematical Statistics. Wiley, New York.

The problem of evaluating the time-to-failure percentiles in progressively-censored tests on solid insulating materials is addressed. Statistical methods to estimate the parameters of the Weibull distribution (and their confidence limits) are examined on the basis of the results of aging tests with combined thermal-electrical stresses carried out on XLPE insulated cable models. These tests are performed at the same stresses on samples more than 1 m long subjected to progressive censoring of aging times, or on short specimens about 20 cm long subjected to complete, or singly-censored, life tests. This procedure allows the effectiveness of progressively-censored tests in estimating life percentiles to be verified, and the accuracy of the methods to be compared.

Balakrishnan, N., Aggarwala, R., 2000. Progressive Censoring. Birkhäuser, Boston.

- N Balakrishnan
- R Aggarwala

Balakrishnan, N., Aggarwala, R., 2000. Progressive Censoring. Birkha¨user, Boston.

Balasooriya: Progressively censored reliability sampling plans for the Weibull distribution.