Yu. Yu. Linke

Sobolev Institute of Geology and Mineralogy, Novosibirsk, Russia

Publications (12)

  • Yu. Yu. Linke
    ABSTRACT: We study the asymptotic behavior of one-step $M$-estimators based on samples from arrays of not necessarily identically distributed random variables; these estimators are explicit approximations to the corresponding consistent $M$-estimators and generalize Fisher's one-step approximations to consistent maximum likelihood estimators. Sufficient conditions are presented for the asymptotic normality of the one-step $M$-estimators under consideration. As a consequence, we consider some well-known nonlinear regression models where the procedure mentioned allows us to construct explicit asymptotically optimal estimators. (A brief numerical sketch of a one-step update of this kind is given after the publication list.)
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: We study the accuracy of estimation of unknown parameters in the case of two-step statistical estimates admitting special representations. An approach to the study of such problems previously proposed by the authors is extended to the case of the estimation of a multidimensional parameter. As a result, we obtain necessary and sufficient conditions for the weak convergence of the normalized estimation error to a multidimensional normal distribution.
    Siberian Advances in Mathematics 04/2014; 24(2):119-139. DOI:10.3103/S1055134414020035
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: In this article, we consider the problem of finding a solution to a functional equation that depends in a special way on the distribution of a random variable. Such equations naturally arise in the construction of consistent estimators in regression problems in which the variances of the main observations depend on the underlying unknown parameter and the regression coefficients are measured with random errors. A simple example of a regression problem in which such an equation arises is given.
    Siberian Advances in Mathematics 10/2012; 22(4). DOI:10.3103/S1055134412040037
  • A. I. Sakhanenko, Yu. Yu. Linke
    ABSTRACT: We consider the linear regression model in the case when the independent variables are measured with errors, while the variances of the main observations depend on an unknown parameter. In the case of normally distributed replicated regressors we propose and study new classes of two-step estimates for the main unknown parameter. We find consistency and asymptotic normality conditions for first-step estimates and an asymptotic normality condition for second-step estimates. We discuss conditions under which these estimates have the minimal asymptotic variance. Keywords: linear regression, errors in independent variables, replicated regressors, dependence of variances on a parameter, two-step estimates, consistent estimate, asymptotically normal estimate.
    Siberian Mathematical Journal 07/2011; 52(4):711-726. DOI:10.1134/S0037446611040148 · 0.30 Impact Factor
  • Yu. Yu. Linke
    ABSTRACT: We study the two-step statistical estimates that admit certain expressions of a sufficiently general form. These constructions arise in various statistical models, for instance in regression problems. Under rather weak restrictions we find necessary and sufficient conditions for the normalized difference of a two-step estimate and the unknown parameter to converge weakly to an arbitrary distribution.
    Siberian Mathematical Journal 07/2011; 52(4):665-681. DOI:10.1134/S0037446611040112 · 0.30 Impact Factor
  • A. I. Sakhanenko, Yu. Yu. Linke
    ABSTRACT: Under consideration is the problem of estimating the linear regression parameter in the case when the variances of observations depend on the unknown parameter of the model, while the coefficients (independent variables) are measured with random errors. We propose a new two-step procedure for constructing estimators which guarantees their consistency, find general necessary and sufficient conditions for the asymptotic normality of these estimators, and discuss the case in which these estimators have the minimal asymptotic variance. Keywords: linear regression, errors in the independent variables, dependence of variance on a parameter, two-step estimation, asymptotically normal estimator.
    Siberian Mathematical Journal 01/2011; 52(1):113-126. DOI:10.1134/S0037446606010125 · 0.30 Impact Factor
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: We consider the problem of estimating the unknown parameters of linear regression in the case when the variances of observations depend on the unknown parameters of the model. A two-step method is suggested for constructing asymptotically linear estimators. Some general sufficient conditions for the asymptotic normality of the estimators are found, and the explicit form of the best asymptotically linear estimators is established. The behavior of the estimators is studied in detail in the case when the parameter of the regression model is one-dimensional.
    Siberian Mathematical Journal 03/2009; 50(2):302-315. DOI:10.1007/s11202-009-0035-2 · 0.30 Impact Factor
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: We consider the problem of estimating the unknown parameter of the one-dimensional analog of the Michaelis-Menten equation when the independent variables are measured with random errors. We study the behavior of the explicit estimates that we have found earlier in the case of known independent variables and establish almost necessary conditions under which the presence of the random errors does not affect the asymptotic normality of these explicit estimates.
    Siberian Mathematical Journal 01/2008; 49(3):474-497. DOI:10.1007/s11202-008-0047-3 · 0.30 Impact Factor
  • A. I. Sakhanenko, Yu. Yu. Linke
    ABSTRACT: Considering the linear-fractional regression problem with errors in independent variables, we construct and study asymptotically optimal estimators for unknown parameters in the case of violation of the classical regression assumptions (the variances of the observations are different and depend on the unknown parameters).
    Siberian Mathematical Journal 01/2006; 47(6):1128-1153. DOI:10.1007/s11202-006-0120-8 · 0.30 Impact Factor
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: Under consideration is the problem of estimating the unknown parameters in the Michaelis–Menten equation, which occurs frequently in the natural sciences. The authors suggest and study explicit, asymptotically normal estimates of the unknown parameters which often have a minimal covariance matrix.
    Siberian Mathematical Journal 04/2001; 42(3):517-536. DOI:10.1023/A:1010475226779 · 0.30 Impact Factor
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: Let a sequence of random variables $Z_1,\dots,Z_N$ be defined by $Z_i=\alpha_i(\theta)/\beta_i(\theta)+\xi_i$, $i=1,\dots,N$, where $\alpha_i(\theta)=a_{0i}+\sum_{j=1}^m a_{ji}\theta_j$ and $\beta_i(\theta)=1+\sum_{j=1}^m b_{ji}\theta_j$ are linear combinations depending on an unknown $m$-dimensional parameter $\theta$, while $b_{ji}\ge 0$, $a_{0i}$, and $a_{ji}$ are known numbers. The authors consider the problem of estimating the unknown vector $\theta$ from the observations $Z_1,\dots,Z_N$. They construct an estimator which is asymptotically normal under rather general assumptions on the constants $\{c_i\}$ and the random errors $\xi_1,\xi_2,\dots$. Conditions for optimality of the estimators are also obtained. The article generalizes some previous results of the authors [ibid. 41, No. 1, 125-137 (2000; Zbl 0943.62025)].
    Siberian Mathematical Journal 01/2001; 42(2):317-331. DOI:10.1023/A:1004893114744 · 0.30 Impact Factor
  • Yu. Yu. Linke, A. I. Sakhanenko
    ABSTRACT: Let $\xi_1,\dots,\xi_N$ be independent, identically distributed random variables with zero mean and unit variance. Let a sequence of independent random variables $X_1,X_2,\dots$ be defined in the following way: $X_i=a_i(1+b_i\theta)^{-1}+\sigma_i\xi_i$. The values $a_i>0$ and $b_i>0$ are assumed known, while the values of the parameter $\theta$ and the variances $DX_i\equiv\sigma_i^2$ are unknown. The authors consider the problem of estimating the unknown parameter $\theta$ from the observations $X_1,\dots,X_N$. They propose to use the following simple estimator: $\theta^*=\sum c_i(a_i-X_i)\bigl(\sum c_i b_i X_i\bigr)^{-1}$. It turns out that this estimator is asymptotically normal under rather general assumptions on the constants $\{c_i\}$. Moreover, in the case when some information on the behavior of the variances $\{\sigma_i\}$ is available, the authors show how to choose functions $\{\gamma_i(\theta)\}$ so that the "improved" estimator $\theta^{**}=\sum\gamma_i(\theta^*)(a_i-X_i)\bigl(\sum\gamma_i(\theta^*)b_i X_i\bigr)^{-1}$ becomes asymptotically efficient in some sense. (A numerical sketch of these two estimators appears after the publication list.)
    Siberian Mathematical Journal 01/2000; 41(1):125-137. DOI:10.1007/BF02674002 · 0.30 Impact Factor
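
The first entry above concerns one-step $M$-estimators in the spirit of Fisher's one-step approximation. As a purely illustrative sketch (not taken from the paper), the idea can be shown on a hypothetical normal location model: start from a rough preliminary estimate and take a single Newton step on the estimating equation $\sum_i \psi(X_i,\theta)=0$; the resulting explicit estimator is typically asymptotically equivalent to the exact $M$-estimator.

```python
# Illustrative sketch of a Fisher-type one-step M-estimator (not code from the
# paper).  We approximate the root of sum_i psi(x_i, theta) = 0 by a single
# Newton step from a cheap preliminary estimate.  The model and psi below are
# hypothetical choices: a normal location model, for which the exact
# M-estimator is the sample mean.
import numpy as np

def one_step(x, theta0, psi, dpsi):
    """Single Newton step from theta0 toward the root of sum(psi(x, theta)) = 0."""
    return theta0 - np.sum(psi(x, theta0)) / np.sum(dpsi(x, theta0))

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=1.0, size=1000)

psi = lambda xi, t: t - xi              # d/dt of the squared-error criterion
dpsi = lambda xi, t: np.ones_like(xi)   # its derivative in t
theta0 = np.median(x)                   # rough preliminary estimate

# Because psi is linear in theta here, the one-step estimate equals the exact
# M-estimator (the sample mean); in general the two agree asymptotically.
print(one_step(x, theta0, psi, dpsi), x.mean())
```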

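The last entry above gives explicit estimators $\theta^*$ and $\theta^{**}$ for the fractional-linear model $X_i=a_i(1+b_i\theta)^{-1}+\sigma_i\xi_i$; the Michaelis–Menten equation $v=V_{\max}S/(K_M+S)$ mentioned in two of the entries is of the same fractional-linear form. The following is a minimal numerical sketch of those two formulas. The sample size, parameter values, and weights are purely illustrative assumptions, not taken from the papers; in particular, the second-step weights $\gamma_i$ below are only a plausible stand-in for the optimal weights derived there.

```python
# Numerical sketch of the explicit estimators theta* and theta** from the last
# entry above, for X_i = a_i/(1 + b_i*theta) + sigma_i*xi_i.  All concrete
# numbers and weight choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 10_000
theta_true = 2.0
a = rng.uniform(1.0, 3.0, size=N)        # known a_i > 0
b = rng.uniform(0.5, 2.0, size=N)        # known b_i > 0
sigma = 0.1 * a                          # unknown to the statistician
X = a / (1.0 + b * theta_true) + sigma * rng.standard_normal(N)

# First step: theta* = sum c_i (a_i - X_i) / sum c_i b_i X_i with c_i = 1
# (an arbitrary weight choice that already gives a consistent estimate).
c = np.ones(N)
theta_star = np.sum(c * (a - X)) / np.sum(c * b * X)

# Second step: re-weight using the first-step estimate.  gamma_i(theta) below
# is only a plausible illustration; the papers derive weights that are optimal
# for a given variance structure.
gamma = 1.0 / (1.0 + b * theta_star)
theta_star2 = np.sum(gamma * (a - X)) / np.sum(gamma * b * X)

print(theta_star, theta_star2)           # both should be close to theta_true = 2.0
```
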
Publication Stats

37 Citations
2.66 Total Impact Points

Institutions

  • 2001–2012
    • Sobolev Institute of Geology and Mineralogy
      Novosibirsk, Russia
  • 2011
    • Novosibirsk State University
      Novosibirsk, Russia