
This chapter discusses nonparametric smoothing methods for a class of nonstandard curve estimation problems. A class of models is introduced in which an unknown function is defined as the solution of an integral equation. The intercept and kernel of the integral equation are unknown, but they can be estimated directly by classical smoothing methods. Estimates of the unknown function are then given as solutions of the empirical integral equation. A key consideration is the nature of the operator, or family of operators, that defines the integral equation. Results on the asymptotic properties of the estimated functions are presented, including pointwise normal distributions and uniform stochastic expansions. Simulations suggest that smooth backfitting is stable under weaker assumptions on the design and for a rather large number of additive components. The development of an asymptotic distribution theory for the estimate of the unknown function is elaborated, and a series of examples is given that motivates the class of models.
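The plug-in idea described above can be sketched numerically: once estimates of the intercept and kernel are available, the empirical integral equation becomes a linear system on a grid. The following minimal sketch uses toy functions `g_hat` and `H_hat` in place of actual kernel-smoothing estimates; all names are illustrative, not the chapter's estimators.

```python
import numpy as np

# Hedged sketch: solving an empirical integral equation of the second kind,
#   m(x) = g_hat(x) + \int H_hat(x, t) m(t) dt,
# on a grid. g_hat and H_hat stand in for smoothing-based estimates of the
# intercept and the kernel; here they are toy functions for illustration.
grid = np.linspace(0.0, 1.0, 200)
dt = grid[1] - grid[0]

g_hat = np.sin(2 * np.pi * grid)                              # toy intercept
H_hat = 0.3 * np.exp(-np.abs(grid[:, None] - grid[None, :]))  # toy kernel

# Discretize the integral with the rectangle rule and solve the linear
# system (I - H_hat * dt) m = g_hat for the function values on the grid.
m_hat = np.linalg.solve(np.eye(len(grid)) - H_hat * dt, g_hat)
```

The direct solve is feasible here because the grid is small; for fine grids or operator families, iterative schemes are the usual alternative.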


... In nonparametric regression models, however, it often yields an additive model where classical smoothing methods cannot be applied, as we illustrate with several cases in this section. Some of the models of this section were also discussed in the overview papers [31] and [44]. A general discussion of smooth least squares in a general class of nonparametric models can also be found in [39]. ...

We give an overview of smooth backfitting type estimators in additive models. Moreover, we illustrate their wide applicability in models closely related to additive models, such as nonparametric regression with dependent error variables where the errors can be transformed to white noise by a linear transformation, nonparametric regression with repeatedly measured data, nonparametric panels with fixed effects, simultaneous nonparametric equation models, and non- and semiparametric autoregression and GARCH models. We also discuss extensions to varying coefficient models, additive models with missing observations, and the case of nonstationary covariates.

... Additive regression is an example of a nonparametric model where the nonparametric function is given as the solution of an integral equation. This has been outlined in Linton and Mammen [24] and Carrasco, Florens and Renault [6], where other examples of statistical integral equations are also given. Examples are additive models where the additive components are linked as in Linton and Mammen [25], and regression models with dependent errors where an optimal transformation leads to an additive model, see Linton and Mammen [26]. ...

This paper is dedicated to Piet Groeneboom on the occasion of his 65th birthday. It concerns optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting, and convergence of the algorithm is shown. Finite sample properties are also compared through simulation experiments.
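The backfitting scheme for additive isotone regression can be sketched as follows: each component is fitted in turn to the partial residuals by the pool-adjacent-violators algorithm (PAVA), the standard least-squares monotone fit. This is a minimal toy sketch under assumed simulated data, not the paper's estimator; `pava` is a hypothetical helper implemented here for self-containment.

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: best nondecreasing least-squares fit to y."""
    blocks = []  # each block holds [mean, weight]
    for v in y:
        blocks.append([float(v), 1.0])
        # merge adjacent blocks while monotonicity is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    return np.concatenate([np.full(int(w), m) for m, w in blocks])

# Toy data: two increasing additive components plus noise (an assumption,
# not data from the paper).
rng = np.random.default_rng(0)
n = 300
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
y = x1 ** 2 + np.sqrt(x2) + rng.normal(0.0, 0.1, n)

f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(25):  # backfitting sweeps
    o1 = np.argsort(x1)
    f1[o1] = pava((y - y.mean() - f2)[o1])  # monotone fit in x1-order
    f1 -= f1.mean()                         # centering for identifiability
    o2 = np.argsort(x2)
    f2[o2] = pava((y - y.mean() - f1)[o2])  # monotone fit in x2-order
    f2 -= f2.mean()
```

Each sweep leaves both fitted components nondecreasing in their own covariate, matching the isotonicity constraint of the model.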

In this overview we show that many statistical estimation problems can be considered as empirical solutions of noisy integral equations of the second kind. We illustrate this with a series of examples, and we discuss a general nonparametric approach based on plugging in nonparametric estimates of the integral kernel and of the intercept of the integral equation.
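When the plugged-in operator is a contraction, the empirical equation can also be solved by successive substitution rather than a direct matrix solve. A minimal sketch, again with toy stand-ins for the estimated intercept and kernel (all names are illustrative assumptions):

```python
import numpy as np

# Grid discretization of m(x) = g_hat(x) + \int H_hat(x, t) m(t) dt.
grid = np.linspace(0.0, 1.0, 100)
dt = grid[1] - grid[0]
g_hat = np.cos(np.pi * grid)           # toy plug-in intercept estimate
H_hat = 0.4 * np.tile(grid, (100, 1))  # toy plug-in kernel H(x, t) = 0.4 t

# Successive substitution m_{k+1} = g_hat + H_hat m_k converges here
# because the toy operator is a contraction (row integrals are 0.2 < 1).
m = np.zeros_like(grid)
for _ in range(60):
    m = g_hat + (H_hat @ m) * dt
```

The fixed point of the iteration is the solution of the discretized empirical equation; the iteration mirrors the Neumann-series view of second-kind equations.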

We introduce a new method for the estimation of discount functions, yield curves and forward curves from government-issued coupon bonds. Our approach is nonparametric and does not assume a particular functional form for the discount function, although we do show how to impose various restrictions in the estimation. Our method is based on kernel smoothing and is defined as the minimizer of a localized population moment condition. The solution to the sample problem is not explicit, and our estimation procedure is iterative, much like the backfitting method for estimating additive nonparametric models. We establish the asymptotic normality of our methods using the asymptotic representation of our estimator as an infinite series with declining coefficients. The rate of convergence is standard for one-dimensional nonparametric regression. We investigate the finite sample performance of our method, in comparison with other well-established methods, in a small simulation experiment.

Motivated by a nonparametric GARCH model, we consider nonparametric additive autoregression models in the special case that the additive components are linked parametrically. We show that the parameter can be estimated at the parametric rate and derive its normal limit. Our procedure is based on two steps. In the first step, nonparametric smoothers are used for the estimation of each additive component without taking into account the parametric link between the functions. In the second step, the parameter is estimated using the parametric restriction between the additive components. Interestingly, our method needs no undersmoothing in the first step.