
# On deviations between empirical and quantile processes for mixing random variables

Indian Statistical Institute, Calcutta, India
Journal of Multivariate Analysis (Impact Factor: 0.93). 12/1978; 8(4):532-549. DOI: 10.1016/0047-259X(78)90031-3
Source: RePEc

ABSTRACT

Let {Xn} be a strictly stationary φ-mixing process with mixing coefficients satisfying Σn φ^{1/2}(n) < ∞. It is shown in the paper that if X1 is uniformly distributed on the unit interval, then, for any t ∈ [0, 1], |Fn^{−1}(t) − t + Fn(t) − t| = O(n^{−3/4}(log log n)^{3/4}) a.s. and sup_{0≤t≤1} |Fn^{−1}(t) − t + Fn(t) − t| = O(n^{−3/4}(log n)^{1/2}(log log n)^{1/4}) a.s., where Fn and Fn^{−1}(t) denote the sample distribution function and the t-th sample quantile, respectively. In case {Xn} is strong mixing with exponentially decaying mixing coefficients, it is shown that, for any t ∈ [0, 1], |Fn^{−1}(t) − t + Fn(t) − t| and sup_{0≤t≤1} |Fn^{−1}(t) − t + Fn(t) − t| admit analogous almost sure bounds of order n^{−3/4} up to logarithmic factors. The results are further extended to general distributions, including some nonregular cases in which the underlying distribution function is not differentiable. The results for φ-mixing processes give the sharpest possible orders in view of the corresponding results of Kiefer for independent random variables.
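As a rough numerical illustration of the quantity studied in the abstract (not part of the paper itself), the sketch below generates a dependent sequence with uniform marginals via a Gaussian AR(1) transform and evaluates the Bahadur–Kiefer deviation Fn^{−1}(t) − t + Fn(t) − t at t = 1/2. The AR(1) construction, the correlation parameter, and the sample size are all illustrative assumptions:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

def dependent_uniforms(n, rho=0.5):
    """Gaussian AR(1) mapped to Uniform(0,1) marginals -- an
    illustrative stand-in for a mixing process (not from the paper)."""
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for i in range(1, n):
        z[i] = rho * z[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    return 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))  # Phi(z)

def bahadur_kiefer_deviation(x, t):
    """D_n(t) = (F_n^{-1}(t) - t) + (F_n(t) - t) for uniform-marginal data."""
    x = np.sort(x)
    n = len(x)
    q = x[min(int(np.ceil(n * t)) - 1, n - 1)]       # t-th sample quantile
    fn_t = np.searchsorted(x, t, side="right") / n   # empirical CDF at t
    return (q - t) + (fn_t - t)

n = 20000
u = dependent_uniforms(n)
d = bahadur_kiefer_deviation(u, 0.5)
# the two summands are each of order n^{-1/2}, but they largely cancel
print(abs(d))
```

The point of the experiment is the cancellation: each summand fluctuates on the n^{−1/2} scale, while their sum is of the smaller n^{−3/4}-type order described in the abstract.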

• "However, it can easily be extended to plug-in estimators of more general L-functionals L_K with dK having compact support strictly within (0, 1). Under the stronger mixing conditions α(n) ≤ Ke^{−εn}, ε > 0, and α(n) ≤ Kn^{−8}, the result of Theorem 3.6 is basically already known from [4] and [36]."
##### Article: Marcinkiewicz-Zygmund and ordinary strong laws for empirical distribution functions and plug-in estimators
ABSTRACT: Both Marcinkiewicz-Zygmund strong laws of large numbers (MZ-SLLNs) and ordinary strong laws of large numbers (SLLNs) for plug-in estimators of general statistical functionals are derived. The key observation is that if a statistical functional is "sufficiently regular", then a (MZ-) SLLN for the estimator of the unknown distribution function yields a (MZ-) SLLN for the corresponding plug-in estimator. It is shown in particular that many L-, V- and risk functionals are "sufficiently regular", and that known results on the strong convergence of the empirical process of α-mixing random variables can be improved. The presented approach not only covers some known results but also provides new strong laws for plug-in estimators of particular statistical functionals.
Statistics: A Journal of Theoretical and Applied Statistics 01/2013; 48(5). DOI:10.1080/02331888.2013.800075 · 0.53 Impact Factor
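As a concrete sketch of the plug-in idea in the excerpt above — an L-functional whose weight has compact support strictly within (0, 1), estimated by weighting order statistics — the following is a minimal, hypothetical example. The weight function, the trimming level, and the choice of a constant weight (which yields a trimmed mean) are all illustrative assumptions, not the article's construction:

```python
import numpy as np

def plugin_L(x, k, eps=0.1):
    """Plug-in estimate of the L-functional  integral F^{-1}(t) k(t) dt,
    where the (hypothetical) weight density k is supported on [eps, 1-eps]."""
    x = np.sort(x)
    n = len(x)
    t = (np.arange(1, n + 1) - 0.5) / n   # quantile level of each order statistic
    w = np.where((t >= eps) & (t <= 1 - eps), k(t), 0.0)
    return float(np.sum(w * x) / n)

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
# constant weight 1/0.8 on [0.1, 0.9] -> the 10%-trimmed mean
est = plugin_L(x, lambda t: 1.0 / 0.8)
print(est)
```

For symmetric data the estimate should be near 0; a SLLN for the empirical distribution function transfers to such estimators precisely because the weight avoids the extreme quantiles.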
• "Using this representation, one can asymptotically express sample quantiles as an average of i.i.d. random variables and obtain their limiting properties. Among others, Sen (1972), Babu and Singh (1978) and Yoshihara (1995) gave Bahadur representations of sample quantiles for dependent sequences, such as φ-mixing and strongly mixing random variable sequences."
##### Article: The Bahadur representation for sample quantiles under negatively associated sequence
ABSTRACT: In this article, we investigate a Bahadur representation of sample quantiles based on a negatively associated (NA) sequence. Our results extend those of Sun [Sun, S.X., 2006. The Bahadur representation of sample quantile under weak dependence. Statist. Probab. Lett. 76, 1238-1244], which were obtained under other weak-dependence conditions.
Statistics &amp; Probability Letters 11/2008; 78(16):2660-2663. DOI:10.1016/j.spl.2008.03.026 · 0.60 Impact Factor
• "As can be seen in the proof of Lemma 3.5 of Babu and Singh [1], the existence of the term log log log M is due to the second moment bound for the sum of α-mixing random variables. One can prove this lemma easily by following the same arguments used by Babu and Singh [1] and using the fact that F(X) follows a uniform (0, 1) distribution for any continuous random variable X with smooth underlying distribution function F. We omit the proof here."
##### Article: On the accuracy of bootstrapping sample quantiles of strongly mixing sequences
ABSTRACT: In this paper, we examine the rate of convergence of moving block bootstrap (MBB) approximations to the distributions of normalized sample quantiles based on strongly mixing observations. Under suitable smoothness and regularity conditions on the one-dimensional marginal distribution function, the rate of convergence of the MBB approximations to the distributions of centered and scaled sample quantiles is of order O(n^{−1/4} log log n).
Journal of the Australian Mathematical Society 03/2007; 82(2):263-282. DOI:10.1017/S1446788700016074 · 0.14 Impact Factor
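A minimal sketch of the moving block bootstrap (MBB) for a sample quantile under dependence, as studied in the abstract above. The AR(1) data-generating process, the block length, and the number of replicates are illustrative assumptions, not the paper's choices:

```python
import numpy as np

rng = np.random.default_rng(7)

# illustrative strongly mixing series: Gaussian AR(1)
n, rho = 2000, 0.5
x = np.empty(n)
x[0] = rng.standard_normal()
for i in range(1, n):
    x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

def mbb_quantile(x, t=0.5, block_len=50, B=500):
    """Moving-block-bootstrap replicates of the t-th sample quantile:
    resample overlapping blocks of consecutive observations, concatenate,
    and recompute the quantile on each resample."""
    blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)
    k = len(x) // block_len                   # blocks per resample
    reps = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, len(blocks), size=k)
        reps[b] = np.quantile(blocks[idx].ravel(), t)
    return reps

reps = mbb_quantile(x)
# spread of the replicates approximates the sampling error of the median
print(reps.std())
```

Resampling blocks rather than single observations preserves the local dependence structure, which is what makes the MBB approximation valid for mixing sequences.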