ABSTRACT: We study the asymptotic behavior of one-step weighted $M$-estimators that are based on samples from arrays of not necessarily identically distributed random variables and represent explicit approximations to the corresponding consistent weighted $M$-estimators. Sufficient conditions are presented for the asymptotic normality of the one-step weighted $M$-estimators under consideration. As a consequence, we consider some well-known nonlinear regression models where the procedure mentioned allows us to construct explicit asymptotically optimal estimators.
ABSTRACT: We study the asymptotic behavior of one-step $M$-estimators that are based on samples from arrays of not necessarily identically distributed random variables and represent explicit approximations to the corresponding consistent $M$-estimators. These estimators generalize Fisher's one-step approximations to consistent maximum likelihood estimators. Sufficient conditions are presented for the asymptotic normality of the one-step $M$-estimators under consideration. As a consequence, we consider some well-known nonlinear regression models where the procedure mentioned allows us to construct explicit asymptotically optimal estimators.
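The abstracts above describe one-step approximations in the spirit of Fisher: starting from a $\sqrt{n}$-consistent preliminary estimator, a single Newton (scoring) step toward the maximum likelihood estimator is already asymptotically efficient. The papers themselves work in a far more general array setting; as an illustrative sketch only, the classical textbook case of the Cauchy location parameter (preliminary estimator: the sample median; the function name and interface here are invented for the example) looks like this:

```python
import numpy as np

def one_step_cauchy_location(x, theta0=None):
    """One Fisher scoring step toward the Cauchy location MLE.

    x      : sample from a Cauchy distribution with unknown location theta.
    theta0 : sqrt(n)-consistent preliminary estimator; the sample median
             is used by default.
    Returns theta0 + I_n(theta0)^{-1} * score(theta0), which is
    asymptotically equivalent to the MLE.
    """
    x = np.asarray(x, dtype=float)
    if theta0 is None:
        theta0 = np.median(x)
    r = x - theta0
    # Score: derivative of the Cauchy log-likelihood in the location parameter.
    score = np.sum(2.0 * r / (1.0 + r ** 2))
    # Fisher information of a size-n Cauchy sample: I_n(theta) = n / 2.
    fisher = len(x) / 2.0
    return theta0 + score / fisher
```

The same recipe applies with any score-type estimating function in place of the likelihood score, which is the setting of the one-step $M$-estimators studied in the papers.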
ABSTRACT: We study the accuracy of estimation of unknown parameters in the case of two-step statistical estimates admitting special representations. An approach to the study of such problems previously proposed by the authors is extended to the case of the estimation of a multidimensional parameter. As a result, we obtain necessary and sufficient conditions for the weak convergence of the normalized estimation error to a multidimensional normal distribution.
Siberian Advances in Mathematics 04/2014; 24(2):119-139. DOI:10.3103/S1055134414020035
ABSTRACT: In this article, we consider the problem of finding a solution to a functional equation which in a special way depends on the distribution of a random variable. Such equations naturally arise in the construction of consistent estimates in regression problems when the variances of the main observations depend on the underlying unknown parameter and the regression coefficients are determined with random errors. A simple example of a regression problem in which the equation under consideration occurs is demonstrated.
Siberian Advances in Mathematics 10/2012; 22(4). DOI:10.3103/S1055134412040037
ABSTRACT: We consider the linear regression model in the case when the independent variables are measured with errors, while the variances of the main observations depend on an unknown parameter. In the case of normally distributed replicated regressors we propose and study new classes of two-step estimates for the main unknown parameter. We find consistency and asymptotic normality conditions for first-step estimates and an asymptotic normality condition for second-step estimates. We discuss conditions under which these estimates have the minimal asymptotic variance.
Keywords: linear regression; errors in independent variables; replicated regressors; dependence of variances on a parameter; two-step estimates; consistent estimate; asymptotically normal estimate
ABSTRACT: We study the two-step statistical estimates that admit certain expressions of a sufficiently general form. These constructions arise in various statistical models, for instance in regression problems. Under rather weak restrictions we find necessary and sufficient conditions for the normalized difference of a two-step estimate and the unknown parameter to converge weakly to an arbitrary distribution.
ABSTRACT: Under consideration is the problem of estimating the linear regression parameter in the case when the variances of observations depend on the unknown parameter of the model, while the coefficients (independent variables) are measured with random errors. We propose a new two-step procedure for constructing estimators which guarantees their consistency, find general necessary and sufficient conditions for the asymptotic normality of these estimators, and discuss the case in which these estimators have the minimal asymptotic variance.
Keywords: linear regression; errors in the independent variables; dependence of variance on a parameter; two-step estimation; asymptotically normal estimator
ABSTRACT: We consider the problem of estimating the unknown parameters of linear regression in the case when the variances of observations depend on the unknown parameters of the model. A two-step method is suggested for constructing asymptotically linear estimators. Some general sufficient conditions for the asymptotic normality of the estimators are found, and an explicit form of the best asymptotically linear estimators is established. The behavior of the estimators is studied in detail in the case when the parameter of the regression model is one-dimensional.
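Several of these abstracts share one construction: a crude but consistent first-step estimator is used to estimate the observation variances, and a second, weighted step then attains the smaller asymptotic variance. The papers treat this in much greater generality (errors in the regressors, parameter-dependent variances); purely as a hedged sketch of the two-step idea, for a scalar model $y_i = \beta x_i + \varepsilon_i$ with $\mathrm{Var}(\varepsilon_i)$ depending on $\beta$ (the function `two_step_wls` and its interface are invented for this illustration):

```python
import numpy as np

def two_step_wls(x, y, var_fn):
    """Two-step estimator for y_i = beta * x_i + eps_i,
    where Var(eps_i) = var_fn(beta, x_i) depends on the unknown beta.

    Step 1: ordinary least squares, ignoring the heteroscedasticity,
            gives a consistent preliminary estimate beta0.
    Step 2: weighted least squares with weights 1 / var_fn(beta0, x_i).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    beta0 = np.sum(x * y) / np.sum(x ** 2)      # step 1: unweighted OLS
    w = 1.0 / var_fn(beta0, x)                  # step 2: plug-in inverse variances
    return np.sum(w * x * y) / np.sum(w * x ** 2)
```

Because `beta0` is consistent, the plug-in weights converge to the optimal ones, so the second step inherits the asymptotic variance of the infeasible optimally weighted estimator.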
ABSTRACT: We consider the problem of estimating the unknown parameter of the one-dimensional analog of the Michaelis-Menten equation when the independent variables are measured with random errors. We study the behavior of the explicit estimates that we have found earlier in the case of known independent variables and establish almost necessary conditions under which the presence of the random errors does not affect the asymptotic normality of these explicit estimates.
ABSTRACT: We consider the problem of estimating an unknown one-dimensional parameter in the linear regression problem in the case when the independent variables (called coefficients in this article) are measured with errors, and the variances of the principal observations can depend on the main parameter. We study the behavior of two-step estimators, previously introduced by the authors in ibid. 50, No. 2, 302–315 (2009), which are asymptotically optimal in the case when the independent variables are measured without errors. Under sufficiently general assumptions we find necessary and sufficient conditions for the asymptotic normality and asymptotic optimality of these estimators in the new setup.
ABSTRACT: Under consideration is the problem of estimating the unknown parameters in the Michaelis–Menten equation, which frequently arises in the natural sciences. The authors suggest and study asymptotically normal explicit estimates of the unknown parameters which often have a minimal covariance matrix.
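The Michaelis–Menten model relates reaction velocity to substrate concentration by $v = V_{\max}\, s / (K_m + s)$. The abstracts do not spell out the authors' explicit estimators, so as a generic illustration only, the classical Lineweaver–Burk linearization $1/v = 1/V_{\max} + (K_m/V_{\max})(1/s)$ also yields closed-form estimates (the function name is invented; the authors' asymptotically optimal estimates differ from this naive scheme):

```python
import numpy as np

def michaelis_menten_explicit(s, v):
    """Explicit estimates of (V_max, K_m) in v = V_max * s / (K_m + s),
    via least squares on the Lineweaver-Burk reciprocal form
    1/v = 1/V_max + (K_m / V_max) * (1/s). Requires s, v > 0.
    """
    x = 1.0 / np.asarray(s, dtype=float)
    y = 1.0 / np.asarray(v, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)  # fit y = slope * x + intercept
    V_max = 1.0 / intercept
    K_m = slope * V_max
    return V_max, K_m
```

The reciprocal transform is known to distort the error structure (small velocities dominate the fit), which is one reason estimators with better asymptotic properties, such as those studied in the paper, are of interest.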
ABSTRACT: Suppose that in some experiment we observe a sequence of independent random variables $X_1, X_2, \ldots, X_N$ such that the following representation is valid for every $i$: