Article

# Root n Consistent and Optimal Density Estimators for Moving Average Processes

02/2003;
Source: CiteSeer

ABSTRACT The marginal density of a first-order moving average process can be written as the convolution of two innovation densities. Saavedra and Cao (2000) propose to estimate the marginal density by plugging in kernel density estimators for the innovation densities, based on estimated innovations. They show that, for an appropriate choice of bandwidth, the variance of their estimator decreases at the rate 1/n. Their estimator can be interpreted as a specific U-statistic. We suggest a slightly simplified U-statistic as estimator of the marginal density, prove that it is asymptotically normal at the same rate, and describe the asymptotic variance explicitly. We show that the estimator is asymptotically efficient if no structural assumptions are made on the innovation density. For innovation densities known to have mean zero or to be symmetric, we describe improvements of our estimator which are again asymptotically efficient.
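The estimator described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact construction: it assumes a Gaussian kernel, a known MA(1) coefficient θ (in practice θ would be estimated), and simulated standard normal innovations; all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an MA(1) process X_t = eps_t + theta * eps_{t-1}.
theta, n = 0.5, 500
eps = rng.normal(size=n + 1)
x = eps[1:] + theta * eps[:-1]

# Recover residuals recursively: eps_hat_t = X_t - theta * eps_hat_{t-1}
# (theta is treated as known here for simplicity).
eps_hat = np.empty(n)
prev = 0.0
for t in range(n):
    prev = x[t] - theta * prev
    eps_hat[t] = prev

def u_statistic_density(x0, resid, theta, bandwidth=0.3):
    """U-statistic form of the convolution density estimate at x0:
    average of Gaussian kernels centered at resid_i + theta * resid_j
    over all pairs i != j."""
    centers = resid[:, None] + theta * resid[None, :]   # all (i, j) pairs
    k = np.exp(-0.5 * ((x0 - centers) / bandwidth) ** 2) / (
        bandwidth * np.sqrt(2 * np.pi)
    )
    np.fill_diagonal(k, 0.0)                            # drop i == j terms
    return k.sum() / (len(resid) * (len(resid) - 1))

est = u_statistic_density(0.0, eps_hat, theta)
```

Because each kernel is centered at a *pair* of residuals, the estimate averages roughly n² rather than n terms, which is the intuition behind the variance decreasing at the parametric rate 1/n.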

##### Article: Root-n consistency in weighted L¹-spaces for density estimators of invertible linear processes
Statistical Inference for Stochastic Processes 02/2008; 11(3):281-310.
##### Article: Uniformly root-$n$ consistent density estimators for weakly dependent invertible linear processes
ABSTRACT: Convergence rates of kernel density estimators for stationary time series are well studied. For invertible linear processes, we construct a new density estimator that converges, in the supremum norm, at the better, parametric, rate $n^{-1/2}$. Our estimator is a convolution of two different residual-based kernel estimators. We obtain in particular convergence rates for such residual-based kernel estimators; these results are of independent interest. Comment: Published at http://dx.doi.org/10.1214/009053606000001352 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
08/2007;
##### Article: Prediction in moving average processes
ABSTRACT: For the stationary invertible moving average process of order one with unknown innovation distribution F, we construct root-n consistent plug-in estimators of conditional expectations E(h(Xₙ₊₁) | X₁, …, Xₙ). More specifically, we give weak conditions under which such estimators admit Bahadur-type representations, assuming some smoothness of h or of F. For fixed h it suffices that h is locally of bounded variation and locally Lipschitz in L₂(F), and that the convolution of h and F is continuously differentiable. A uniform representation for the plug-in estimator of the conditional distribution function P(Xₙ₊₁ ≤ · | X₁, …, Xₙ) holds if F has a uniformly continuous density. For a smoothed version of our estimator, the Bahadur representation holds uniformly over each class of functions h that have an appropriate envelope and whose shifts are F-Donsker, assuming some smoothness of F. The proofs use empirical process arguments.
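The plug-in idea behind this prediction abstract can be sketched for the MA(1) case. Since Xₙ₊₁ = εₙ₊₁ + θεₙ, and εₙ can be recovered from the observations, one can estimate E(h(Xₙ₊₁) | X₁, …, Xₙ) by averaging h over the empirical distribution of the residuals. This is a simplified illustration assuming a known θ and simulated normal innovations; the names are hypothetical, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an MA(1) process X_t = eps_t + theta * eps_{t-1}.
theta, n = 0.5, 500
eps = rng.normal(size=n + 1)
x = eps[1:] + theta * eps[:-1]

# Recover residuals recursively (theta treated as known for the sketch).
eps_hat = np.empty(n)
prev = 0.0
for t in range(n):
    prev = x[t] - theta * prev
    eps_hat[t] = prev

def plug_in_prediction(h, resid, theta):
    """Plug-in estimate of E(h(X_{n+1}) | X_1, ..., X_n) for MA(1):
    X_{n+1} = eps + theta * eps_n, with eps drawn from the empirical
    distribution of the residuals and eps_n replaced by the last residual."""
    return np.mean(h(resid + theta * resid[-1]))

# Example: predicted conditional second moment of X_{n+1}.
pred = plug_in_prediction(np.square, eps_hat, theta)
```

The paper's contribution is to show that such plug-in estimators are root-n consistent and admit Bahadur-type representations under weak smoothness conditions on h or F.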
Journal of Statistical Planning and Inference.

### Keywords

appropriate choice

asymptotic variance

average process

bandwidth

Cao

convolution

estimator

estimator decreases

innovation densities

innovation density

innovations

kernel density estimators

marginal density

first order

Saavedra

simplified U-statistic

specific U-statistic

structural assumptions

symmetric

variance