
ABSTRACT:
We study the accuracy of estimation of unknown parameters in the case of two-step statistical estimates admitting special representations. An approach to the study of such problems previously proposed by the authors is extended to the case of the estimation of a multidimensional parameter. As a result, we obtain necessary and sufficient conditions for the weak convergence of the normalized estimation error to a multidimensional normal distribution.
Siberian Advances in Mathematics 04/2014; 24(2):119–139. DOI:10.3103/S1055134414020035

ABSTRACT:
In this article, we consider the problem of finding a solution to a functional equation which depends in a special way on the distribution of a random variable. Such equations arise naturally in the construction of consistent estimates in regression problems when the variances of the main observations depend on the underlying unknown parameter and the regression coefficients are determined with random errors. A simple example of a regression problem in which the equation under consideration arises is given.
Siberian Advances in Mathematics 10/2012; 22(4). DOI:10.3103/S1055134412040037

ABSTRACT:
We consider the linear regression model in the case when the independent variables are measured with errors, while the variances of the main observations depend on an unknown parameter. In the case of normally distributed replicated regressors, we propose and study new classes of two-step estimates for the main unknown parameter. We find consistency and asymptotic normality conditions for first-step estimates and an asymptotic normality condition for second-step estimates. We discuss conditions under which these estimates have the minimal asymptotic variance.
Keywords: linear regression; errors in independent variables; replicated regressors; dependence of variances on a parameter; two-step estimates; consistent estimate; asymptotically normal estimate
Siberian Mathematical Journal 07/2011; 52(4):711–726. DOI:10.1134/S0037446611040148 · 0.30 Impact Factor

Yu. Yu. Linke
ABSTRACT:
We study two-step statistical estimates that admit certain expressions of a sufficiently general form. These constructions arise in various statistical models, for instance in regression problems. Under rather weak restrictions, we find necessary and sufficient conditions for the normalized difference of a two-step estimate and the unknown parameter to converge weakly to an arbitrary distribution.
Siberian Mathematical Journal 07/2011; 52(4):665–681. DOI:10.1134/S0037446611040112 · 0.30 Impact Factor

ABSTRACT:
Under consideration is the problem of estimating the linear regression parameter in the case when the variances of observations depend on the unknown parameter of the model, while the coefficients (independent variables) are measured with random errors. We propose a new two-step procedure for constructing estimators which guarantees their consistency, find general necessary and sufficient conditions for the asymptotic normality of these estimators, and discuss the case in which these estimators have the minimal asymptotic variance.
Keywords: linear regression; errors in the independent variables; dependence of variance on a parameter; two-step estimation; asymptotically normal estimator
Siberian Mathematical Journal 01/2011; 52(1):113–126. DOI:10.1134/S0037446606010125 · 0.30 Impact Factor

ABSTRACT:
We consider the problem of estimating the unknown parameters of linear regression in the case when the variances of observations depend on the unknown parameters of the model. A two-step method is suggested for constructing asymptotically linear estimators. Some general sufficient conditions for the asymptotic normality of the estimators are found, and an explicit form of the best asymptotically linear estimators is established. The behavior of the estimators is studied in detail in the case when the parameter of the regression model is one-dimensional.
Siberian Mathematical Journal 03/2009; 50(2):302–315. DOI:10.1007/s11202-009-0035-2 · 0.30 Impact Factor

ABSTRACT:
We consider the problem of estimating the unknown parameter of the one-dimensional analog of the Michaelis–Menten equation when the independent variables are measured with random errors. We study the behavior of the explicit estimates that we have found earlier in the case of known independent variables and establish almost necessary conditions under which the presence of the random errors does not affect the asymptotic normality of these explicit estimates.
Siberian Mathematical Journal 01/2008; 49(3):474–497. DOI:10.1007/s11202-008-0047-3 · 0.30 Impact Factor

ABSTRACT:
Considering the linear-fractional regression problem with errors in the independent variables, we construct and study asymptotically optimal estimators for unknown parameters in the case of violation of the classical regression assumptions (the variances of the observations are different and depend on the unknown parameters).
Siberian Mathematical Journal 01/2006; 47(6):1128–1153. DOI:10.1007/s11202-006-0120-8 · 0.30 Impact Factor

ABSTRACT:
Under consideration is the problem of estimating the unknown parameters in the Michaelis–Menten equation, which arises frequently in the natural sciences. The authors suggest and study asymptotically normal explicit estimates of the unknown parameters which often have a minimal covariance matrix.
Siberian Mathematical Journal 04/2001; 42(3):517–536. DOI:10.1023/A:1010475226779 · 0.30 Impact Factor
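For readers unfamiliar with the model: the Michaelis–Menten equation relates a reaction rate v to a substrate concentration s via v = V·s/(K + s), with unknown parameters V and K. The sketch below is a hypothetical illustration, not the authors' explicit estimates: it simulates noisy observations and recovers the parameters via the classical Lineweaver–Burk linearization 1/v = 1/V + (K/V)·(1/s); all numerical values are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

V, K = 2.0, 0.8                  # hypothetical true Michaelis-Menten parameters
s = np.linspace(0.2, 5.0, 200)   # substrate concentrations (design points)
v = V * s / (K + s) + 0.001 * rng.standard_normal(s.size)  # noisy rates

# Lineweaver-Burk linearization: 1/v = 1/V + (K/V) * (1/s),
# so an ordinary linear fit of 1/v against 1/s recovers both parameters.
slope, intercept = np.polyfit(1.0 / s, 1.0 / v, 1)
V_hat = 1.0 / intercept
K_hat = slope * V_hat
print(V_hat, K_hat)
```

Note that this linearization is only a classical benchmark; with heteroscedastic errors of the kind studied in these papers it is generally not optimal, which is precisely the motivation for the explicit weighted estimates the authors construct.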

ABSTRACT:
Let a sequence of random variables Z_1, …, Z_N be defined by Z_i = α_i(θ)/β_i(θ) + ξ_i, i = 1, …, N, where α_i(θ) = a_{0i} + ∑_{j=1}^{m} a_{ji} θ_j and β_i(θ) = 1 + ∑_{j=1}^{m} b_{ji} θ_j are linear combinations depending on an unknown m-dimensional parameter θ, while b_{ji} ≥ 0, a_{0i}, and a_{ji} are known numbers. The authors consider the problem of estimating the unknown vector θ from the observations Z_1, …, Z_N. They construct an estimator which is asymptotically normal under rather general assumptions on the constants {c_i} and the random errors ξ_1, ξ_2, … . Conditions for optimality of the estimators are also obtained. The article generalizes some previous results of the authors [ibid. 41, No. 1, 125–137 (2000; Zbl 0943.62025)].
Siberian Mathematical Journal 01/2001; 42(2):317–331. DOI:10.1023/A:1004893114744 · 0.30 Impact Factor
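In the one-dimensional case (m = 1) the model above reduces to Z_i = (a_{0i} + a_{1i}θ)/(1 + b_{1i}θ) + ξ_i, and an explicit weighted estimator in the spirit of these papers can be obtained by clearing the denominator: θ̂ = ∑ c_i (Z_i − a_{0i}) / ∑ c_i (a_{1i} − b_{1i} Z_i), which is exact when the noise vanishes. The sketch below is purely illustrative: the weights c_i = 1 and all numerical values are assumptions, not the authors' optimal choices.

```python
import numpy as np

rng = np.random.default_rng(2)

theta = 0.7                        # true scalar parameter (hypothetical)
N = 20_000
a0 = rng.uniform(0.0, 0.5, N)      # known constants a_{0i}
a1 = rng.uniform(1.0, 2.0, N)      # known coefficients a_{1i}
b1 = rng.uniform(0.1, 0.9, N)      # known coefficients b_{1i} >= 0
xi = 0.1 * rng.standard_normal(N)  # random errors

# Observations from the linear-fractional model with m = 1:
Z = (a0 + a1 * theta) / (1.0 + b1 * theta) + xi

# Explicit weighted estimator obtained by clearing the denominator;
# without noise every term of the ratio equals theta exactly.
c = np.ones(N)                     # illustrative weights c_i = 1
theta_hat = np.sum(c * (Z - a0)) / np.sum(c * (a1 - b1 * Z))
print(theta_hat)
```

The choice of the weights {c_i} governs the asymptotic variance, which is what the optimality conditions in the abstract address.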

ABSTRACT:
Let ξ_1, …, ξ_N be independent, identically distributed random variables with zero mean and unit variance. Let a sequence of independent random variables X_1, X_2, … be defined in the following way: X_i = a_i (1 + b_i θ)^{−1} + σ_i ξ_i. The values a_i > 0 and b_i > 0 are assumed known, while the value of the parameter θ and the variances DX_i ≡ σ_i^2 are unknown. The authors consider the problem of estimating the unknown parameter θ from the observations X_1, …, X_N. They propose to use the following simple estimator: θ* = ∑ c_i (a_i − X_i) · (∑ c_i b_i X_i)^{−1}. It turns out that this estimator is asymptotically normal under rather general assumptions on the constants {c_i}. Moreover, in the case when some information on the behavior of the variances {σ_i} is available, the authors show how to choose functions {γ_i(θ)} so that the “improved” estimator θ** = ∑ γ_i(θ*) (a_i − X_i) · (∑ γ_i(θ*) b_i X_i)^{−1} becomes asymptotically efficient in some sense.
Siberian Mathematical Journal 01/2000; 41(1):125–137. DOI:10.1007/BF02674002 · 0.30 Impact Factor
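The first-step estimator θ* can be checked numerically: with no noise, each term satisfies a_i − X_i = θ · b_i X_i, so the ratio of weighted sums returns θ exactly, and with noise the errors average out. The sketch below, with all numerical values assumed for illustration, simulates the model and computes θ* with the simplest weights c_i = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 0.5                     # true parameter (hypothetical value)
N = 10_000
a = rng.uniform(1.0, 2.0, N)    # known constants a_i > 0
b = rng.uniform(0.5, 1.5, N)    # known constants b_i > 0
sigma = 0.1                     # unknown noise scale (constant here for simplicity)
xi = rng.standard_normal(N)     # i.i.d. errors with zero mean, unit variance

# Observations: X_i = a_i * (1 + b_i * theta)^(-1) + sigma_i * xi_i
X = a / (1.0 + b * theta) + sigma * xi

# First-step estimator theta* with the simplest weights c_i = 1.
c = np.ones(N)
theta_star = np.sum(c * (a - X)) / np.sum(c * b * X)
print(theta_star)
```

Replacing c_i by γ_i(θ*) and recomputing the same ratio yields the second-step estimator θ** described in the abstract, whose efficiency depends on how the γ_i track the variances σ_i.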