Mahdi Roozbeh
  • Ph.D.
  • Professor (Associate) at Semnan University

About

58 Publications
7,275 Reads
980 Citations
Current institution
Semnan University
Current position
  • Professor (Associate)
Additional affiliations
August 2011 - November 2015
Semnan University
Position
  • Faculty Member

Publications

Publications (58)
Article
Full-text available
Outliers, together with multicollinearity, are a common problem in applied statistics. In this paper, robust Liu estimators are introduced into a partially linear model to combat multicollinearity and outlier challenges when the error terms are not independent and some linear constraints are assumed to hold in the parameter space. Th...
Article
Full-text available
Determining the predictor variables that have a non-linear effect as well as those that have a linear effect on the response variable is crucial in additive semi-parametric models. This issue has been extensively investigated by many researchers in the area of semi-parametric linear additive models, and various separation methods are proposed by th...
Article
Full-text available
Regression analysis frequently encounters two issues: multicollinearity among the explanatory variables, and the existence of outliers in the data set. Multicollinearity in the semiparametric regression model causes the variance of the ordinary least-squares estimator to become inflated. Furthermore, the existence of multicollinearity may lead to w...
Article
Full-text available
Classical regression approaches are not applicable to the analysis of high-dimensional datasets, in which the number of explanatory variables is greater than the number of observations, and their results may be misleading. In this research, we propose to analyze such data by introducing modern and up-to-date techniques such as support vector regression, symmet...
Chapter
With the evolution of science, knowledge, and technology, new and precise methods for measuring, collecting, and recording information have been developed, which has resulted in the emergence and growth of high-dimensional data, in which the number of explanatory variables is much larger than the number of observations. Analysis and modeling of the h...
Article
Full-text available
In many applications, indexing of high-dimensional data has become increasingly important. High-dimensional data are characterized by a very large number of dimensions, sometimes thousands or even millions. Classical methods cannot analyse this kind of data set, so appropriate alternative methods are needed to analyse them. In hi...
Article
Background and purpose: Machine learning is a class of modern and powerful tools that can solve many important problems we face today. Support Vector Regression (SVR), a prominent member of the machine learning family, is a way to build a regression model. SVR has been proven to be an effective tool in real-value fun...
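As a side illustration of the SVR idea discussed above, here is a minimal scikit-learn sketch on synthetic data; the dataset, kernel, and hyperparameters (C, epsilon) are placeholder choices of my own, not the configuration used in the study.
```python
# Minimal SVR illustration with scikit-learn (hypothetical data and settings,
# not the configuration used in the paper).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))              # 200 samples, 5 explanatory variables
y = X @ np.array([1.5, -2.0, 0.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Epsilon-insensitive SVR with an RBF kernel; C and epsilon control the
# trade-off between model flatness and tolerated training error.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```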
Article
Nowadays, high-dimensional data appear in many practical applications such as the biosciences. In the regression analysis literature, the well-known ordinary least-squares estimation may be misleading when the design matrix is not of full rank. As a common issue, outliers may corrupt the normal distribution of the residuals. Thus, since not bein...
Article
As is well known, outliers and multicollinearity in the data set are among the important difficulties in regression models, as they badly affect the least-squares estimators. When multicollinearity and outliers exist in the data set, the prediction performance of the least-squares regression method decreases dramatically. Here, proposing an approxim...
Article
Full-text available
The ridge regression estimator is a commonly used procedure for dealing with multicollinear data. This paper proposes an alternative estimation procedure for high-dimensional multicollinear data. This procedure gives a continuous estimate, including the ridge estimator as a particular case. We study its asymptotic performance for the gro...
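For context, a minimal numpy sketch of the standard ridge estimator \(\hat{\beta}(k) = (X^{\top}X + kI)^{-1}X^{\top}y\), which the paper's procedure contains as a particular case; the simulated data and the value of k below are arbitrary placeholders, not the proposed high-dimensional procedure.
```python
# Standard ridge estimator beta_hat(k) = (X'X + k I)^{-1} X'y -- a baseline
# sketch only; the paper's high-dimensional procedure generalizes this idea.
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 10
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + rng.normal(scale=0.01, size=n)   # induce multicollinearity
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def ridge(X, y, k):
    """Closed-form ridge solution for a given shrinkage parameter k >= 0."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("OLS  :", np.round(ridge(X, y, 0.0), 2))   # k = 0 recovers least squares
print("ridge:", np.round(ridge(X, y, 1.0), 2))
```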
Article
Modern statistical studies often encounter high-dimensional regression models in which the number of features p is greater than the sample size n. Although the theory of linear models is well-established under the traditional assumption p < n, making valid statistical inference in high-dimensional cases is a considerable challenge. With recent ad...
Article
Full-text available
With the advancement of technology, analysis of large-scale data of gene expression is feasible and has become very popular in the era of machine learning. This paper develops an improved ridge approach for the genome regression modeling. When multicollinearity exists in the data set with outliers, we consider a robust ridge estimator, namely the r...
Article
In classical linear regression problems, ordinary least-squares (OLS) estimation is the popular method for obtaining the regression weights, provided the essential assumptions are satisfied. However, in real-life studies, the response data and their associated explanatory variables often do not meet the required conditions, in particular under...
Article
Full-text available
Background and purpose: With the evolution of science, knowledge, and technology, we increasingly deal with high-dimensional data in which the number of predictors may considerably exceed the sample size. The main problems with high-dimensional data are the estimation and interpretation of the coefficients. For high-dimensional problems, classical methods are not reliable...
Chapter
Full-text available
In classical regression analysis, ordinary least-squares estimation is the best method if the essential assumptions are satisfied. However, if the data do not satisfy some of these assumptions, then the results can be misleading. In particular, outliers violate the assumption of normally distributed residuals in least-squares regression. Rob...
Article
This paper applies a ridge estimation approach in an existing partial logistic regression model with exact predictors, intuitionistic fuzzy responses, intuitionistic fuzzy coefficients and intuitionistic fuzzy smooth function to improve an existing intuitionistic fuzzy partial logistic regression model in the presence of multicollinearity. For this...
Article
In fitting a regression model to survey data using additional information or prior knowledge, stochastic uncertainty arises in specifying linear constraints, as in economic and financial studies. These stochastic constraints certainly cause some changes in the classical estimators and their efficiencies. In this paper, stochastic shrinkage estimat...
Article
Some linear stochastic constraints may occur during real data set modeling, based on either additional information or prior knowledge. These stochastic constraints often cause some changes in the behaviors of estimators. In this research, shrinkage ridge estimators as well as their positive parts are proposed in the semi-parametric model when some...
Article
When multicollinearity exists in the context of robust regression, the ridge rank regression estimator can be used as an alternative to the rank estimator. The performance of the ridge rank regression estimator is highly dependent on the ridge parameter, here the tuning parameter. On the other hand, suppose we are provided with some non-sample uncertain pr...
Article
Full-text available
Introduction: Estimation of age plays an important role in legal medicine, endocrine diseases, and clinical dentistry. Correspondingly, evaluation of dental development stages is more valuable than tooth erosion. In this research, the modeling of calendar age has been carried out using new and rich statistical methods. Notably, it can be considered as...
Article
Full-text available
In this paper, a generalized difference-based estimator is introduced for the vector parameter \(\beta \) in partially linear model when the errors are correlated. A generalized difference-based almost unbiased ridge estimator is defined for the vector parameter \(\beta \). Under the linear stochastic constraint \(r=R\beta +e\), a new generalized d...
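For orientation, a sketch of the classical mixed (Theil-Goldberger) estimator under a stochastic restriction of the form \(r = R\beta + e\), assuming the plain linear model \(y = X\beta + \varepsilon\) with \(\operatorname{Var}(\varepsilon) = \sigma^{2}I\), \(E(e) = 0\), \(\operatorname{Var}(e) = \sigma^{2}W\), and \(e\) independent of \(\varepsilon\); these assumptions are mine, and the paper's generalized difference-based ridge estimators build on this basic form rather than being given by it:
\[
\hat{\beta}_{\mathrm{mixed}} = \bigl(X^{\top}X + R^{\top}W^{-1}R\bigr)^{-1}\bigl(X^{\top}y + R^{\top}W^{-1}r\bigr).
\]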
Article
Full-text available
Due to advances in technology, modern statistical studies often encounter high-dimensional linear models, where the number of explanatory variables is larger than the sample size. Estimation in these high-dimensional problems with deterministic covariates or designs is very different from that in the case of random covariates, due to the iden...
Article
Full-text available
It is shown that the prediction performance of the LASSO method is improved for high-dimensional data sets by subtracting structural noises through a sparse additive partially linear model. A mild combination of the partial residual estimation method and the back-fitting algorithm by further applying the LASSO method to the predictors of the linear...
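As a rough sketch of the partial-residual idea only (not the paper's estimator or back-fitting scheme), the snippet below removes a crude smooth in a single nonparametric covariate t and then applies scikit-learn's Lasso to the adjusted linear part; the smoother, the simulated data, and the penalty level are illustrative assumptions.
```python
# Partial-residual flavour sketch: smooth out a nonparametric component in t,
# then run the LASSO on the adjusted linear part. Smoother choice, data and
# penalty level are illustrative assumptions, not the paper's algorithm.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 300, 40
X = rng.normal(size=(n, p))
t = np.sort(rng.uniform(0, 1, size=n))                 # nonparametric covariate
beta = np.zeros(p)
beta[:4] = [3.0, -2.0, 1.5, 1.0]                       # sparse linear part
y = X @ beta + np.sin(2 * np.pi * t) + rng.normal(scale=0.5, size=n)

def running_mean(v, window=15):
    """Crude moving-average smoother over the t-ordering (illustrative only)."""
    kernel = np.ones(window) / window
    return np.convolve(v, kernel, mode="same")

# Partial residuals: remove smoothed trends in t from y and each column of X.
y_tilde = y - running_mean(y)
X_tilde = X - np.apply_along_axis(running_mean, 0, X)

lasso = Lasso(alpha=0.1).fit(X_tilde, y_tilde)
print("selected coefficients:", np.flatnonzero(lasso.coef_))
```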
Article
In this paper, a generalized difference-based estimator is introduced for the vector parameter β in the partially linear model when the errors are correlated. A generalized difference-based Liu estimator is defined for the vector parameter β. Under the linear stochastic constraint r=Rβ+e, a new generalized difference-based weighted mixed Liu estima...
Article
Modern statistical analysis often encounters linear models with the number of explanatory variables much larger than the sample size. Estimation in these high-dimensional problems needs some regularization methods to be employed due to rank deficiency of the design matrix. In this paper, the ridge estimators are considered and their restricted regr...
Article
There are some classes of biased estimators in the statistical literature for handling multicollinearity among the predictor variables. In this research, we propose a modified estimator based on the QR decomposition in semiparametric regression models to combat the multicollinearity problem of the design matrix, which makes the data less disto...
Article
In order to down-weight or ignore unusual data and multicollinearity effects, some alternative robust estimators are introduced. Firstly, a ridge least trimmed squares approach is discussed. Then, based on a penalization scheme, a nonlinear integer programming problem is suggested. Because of its complexity and difficulty, the proposed optimization pro...
Article
Multicollinearity among the predictor variables is a serious problem in regression analysis. There are some classes of biased estimators in the statistical literature for solving the problem. In these biased classes, estimation of the shrinkage parameter plays an important role in data analysis. Using eigenvalue analysis, efforts have been made to dev...
Article
As is well known, the ordinary least-squares estimator (OLSE) is unbiased and has the minimum variance among all linear unbiased estimators. However, under multicollinearity the estimator is generally unstable and poor, in the sense that the variance of the regression coefficients may be inflated and the absolute values of the estimates may be too large....
Article
In this paper, a generalized difference-based estimator is introduced for the vector parameter β in partially linear model when the errors are correlated. A generalized difference-based almost unbiased two parameter estimator is defined for the vector parameter β. Under the linear stochastic constraint r = Rβ + e, we introduce a new generalized dif...
Article
In classical regression analysis, ordinary least-squares estimation is the best strategy when the essential assumptions, such as normality and independence of the error terms as well as negligible multicollinearity among the covariates, are met. However, if one of these assumptions is violated, then the results may be misleading. In particular, outliers...
Article
In classical regression analysis, ordinary least-squares estimation is the best method for obtaining regression weights if the essential assumptions are met. However, if the data do not satisfy some of these assumptions, then the results can be misleading. In particular, outliers violate the assumption of normally distributed residuals in the least-square...
Article
Under some non-stochastic linear restrictions in a semiparametric regression model, based on either additional information or prior knowledge, a family of feasible generalized robust estimators for the regression parameter is proposed. The least trimmed squares (LTS) method, proposed by Rousseeuw as a highly robust regression estimator, is a stat...
Article
Multicollinearity among the explanatory variables is a serious problem in regression analysis. There are some classes of biased estimators in the statistical literature for solving this problem. In these biased classes, estimation of the shrinkage parameter plays an important role in data analysis. Using eigenvalue analysis, efforts have been made to...
Article
In this paper, ridge and non-ridge type estimators and their robust forms are defined in the semiparametric regression model when the errors are dependent and some non-stochastic linear restrictions are imposed under a multicollinearity setting. In the context of ridge regression, the estimation of the shrinkage parameter plays an important role in ana...
Article
Two common problems in applied statistics are multicollinearity between variables and the presence of outliers in the data. In a partially linear regression model, a family of robust ridge estimators for the regression parameters and the nonlinear part is introduced by adding a penalty to the well-known least trimmed squares estimator. The partial...
Article
Full-text available
In this paper, the shrinkage ridge estimator and its positive part are defined for the regression coefficient vector in a partial linear model. The differencing approach is used to simplify parameter estimation after removing the non-parametric part of the model. The exact risk expressions, in addition to biases, are derived for the estimators un...
Article
Full-text available
In the context of ridge regression, the estimation of the ridge (shrinkage) parameter plays an important role in analyzing data. Much effort has been put into developing methods for computing shrinkage estimators for different full-parametric ridge regression approaches using eigenvalues. However, the estimation of the shrinkage parameter is neglec...
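For background only, a numpy sketch of a classical full-parametric shrinkage-parameter estimate of Hoerl-Kennard-Baldwin type, \(\hat{k} = p\hat{\sigma}^{2}/(\hat{\beta}_{\mathrm{OLS}}^{\top}\hat{\beta}_{\mathrm{OLS}})\); it is not the semiparametric estimator developed in the paper, and the simulated data are placeholders.
```python
# Classical full-parametric shrinkage-parameter estimate (Hoerl-Kennard-Baldwin
# style): k_hat = p * sigma_hat^2 / (beta_ols' beta_ols). A baseline sketch only;
# the paper extends this kind of idea to the semiparametric setting.
import numpy as np

rng = np.random.default_rng(3)
n, p = 80, 6
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=1.0, size=n)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols
sigma2_hat = resid @ resid / (n - p)               # usual unbiased variance estimate
k_hat = p * sigma2_hat / (beta_ols @ beta_ols)     # data-driven ridge parameter

beta_ridge = np.linalg.solve(X.T @ X + k_hat * np.eye(p), X.T @ y)
print("k_hat =", round(float(k_hat), 4))
```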
Article
This article considers the problem of point/set estimation in a specific seemingly unrelated regression model, namely the system regression model. A feasible type of shrinkage estimator and its positive part are defined for the effective regression coefficient vector when the covariance matrix of the error term is assumed to be unknown. Their asymptotic...
Article
Fuzzy least-squares regression can be very sensitive to unusual data (e.g., outliers). In this paper, we describe how to fit an alternative robust-regression estimator in a fuzzy environment, which attempts to identify and ignore unusual data. The proposed approach concerns classical robust regression and estimation methods that are insensitive to ou...
Article
Full-text available
In this paper, ridge and non-ridge type shrinkage estimators and their positive parts are defined in the semiparametric regression model when the errors are dependent and some non-stochastic linear restrictions are imposed under a multicollinearity setting. The exact risk expressions in addition to biases are derived for the estimators under study...
Article
Full-text available
In the context of ridge regression, the estimation of the shrinkage parameter plays an important role in analyzing data. Much effort has been put into developing the computation of the risk function in different full-parametric ridge regression approaches using eigenvalues and then deriving an efficient estimator of the shrinkage parameter based on them. In this...
Article
Full-text available
Under a semiparametric regression model, a family of robust estimates for the regression parameter is proposed. The least trimmed squares (LTS) method is a statistical technique for fitting a regression model to a set of points. Given a set of n observations and the integer trimming parameter, the LTS estimator involves computing the hyperplane th...
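A simplified sketch of the LTS idea via random starts and concentration steps, written under my own assumptions (ordinary least-squares refits, a fixed number of iterations, synthetic data); it is not the algorithm used in the paper.
```python
# Least trimmed squares via random elemental starts plus concentration steps:
# repeatedly refit on the h observations with the smallest squared residuals.
# A simplified illustration, not a production FAST-LTS implementation.
import numpy as np

def lts_fit(X, y, h, n_starts=50, n_csteps=10, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        subset = rng.choice(n, size=p + 1, replace=False)    # random elemental start
        beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
        for _ in range(n_csteps):                            # concentration steps
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:h]                        # h smallest residuals
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()         # LTS objective
        if obj < best_obj:
            best_beta, best_obj = beta, obj
    return best_beta

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.3, size=n)
y[:20] += 15.0                                               # gross outliers
print(np.round(lts_fit(X, y, h=int(0.75 * n)), 2))
```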
Article
This paper is concerned with the ridge estimation of the parameter vector \(\beta\) in the partial linear regression model \(y = X\beta + f(t) + \varepsilon\) with correlated errors, that is, when \(\operatorname{Cov}(\varepsilon) = \sigma^{2}V\) with a positive definite matrix \(V\), under the linear constraint \(R\beta = r\) for a given matrix \(R\) and a given vector \(r\). The partial residual estimation method is used to estimate \(\beta\) and the function \(f(\cdot)\). Under appropr...
Article
Full-text available
In this paper, a generalized difference-based estimator is introduced for the vector parameter β in the semiparametric regression model when the errors are correlated. A generalized difference-based Liu estimator is defined for the vector parameter β in the semiparametric regression model. Under the linear nonstochastic constraint Rβ = r, the general...
Article
Full-text available
This article considers estimation in seemingly unrelated semiparametric models when the explanatory variables are affected by multicollinearity. It is also suspected that some additional linear constraints may hold on the whole parameter space. In the sequel, we propose difference-based ridge-type estimators combining the restricted least squares m...
Article
In a partial linear model, some non-stochastic linear restrictions are imposed under a multicollinearity setting. Semiparametric ridge and non-ridge type estimators are defined in a restricted manifold. For practical use, it is assumed that the covariance matrix of the error term is unknown, and thus feasible estimators are substituted and their asymp...
Article
In this paper, an exact sufficient condition for the dominance of the Stein-type shrinkage estimator over the usual unbiased estimator in a partial linear model is exhibited. The comparison is then carried out under the balanced loss function. It is assumed that the vector of disturbances is distributed according to a law belonging to the s...
Article
This article is concerned with the problem of multicollinearity in the linear part of a seemingly unrelated semiparametric (SUS) model. It is also suspected that some additional non-stochastic linear constraints hold on the whole parameter space. In the sequel, we propose semiparametric ridge and non-ridge type estimators combining the restricted l...
Article
Full-text available
In this article, a generalized restricted difference-based ridge estimator is defined for the vector parameter in a partial linear model when the errors are dependent. It is suspected that some additional linear constraints may hold on the whole parameter space. The estimator is a generalization of the well-known restricted least-squares estimat...
Article
Full-text available
In this article, we introduce a semiparametric ridge regression estimator for the vector-parameter in a partial linear model. It is also assumed that some additional artificial linear restrictions are imposed on the whole parameter space and that the errors are dependent. This estimator is a generalization of the well-known restricted least-squares esti...
Article
Full-text available
In this paper, the geometric distribution is considered. The means, variances, and covariances of its order statistics are derived. The Fisher information in any set of order statistics in any distribution can be represented as a sum of Fisher information in at most two order statistics. It is shown that, for the geometric distribution, it can be f...
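As a small numerical companion, a Monte Carlo sketch that approximates the means, variances, and covariances of geometric order statistics; the success probability, sample size, and replication count are arbitrary choices of mine, whereas the paper derives these quantities analytically.
```python
# Monte Carlo approximation of order-statistic moments for a geometric
# distribution; an illustrative numerical check, not the paper's derivations.
import numpy as np

rng = np.random.default_rng(5)
p, n, reps = 0.3, 5, 200_000            # success prob., sample size, replications

# numpy's geometric counts trials until the first success (support 1, 2, ...).
samples = np.sort(rng.geometric(p, size=(reps, n)), axis=1)   # rows = ordered samples

means = samples.mean(axis=0)            # E[X_(1)], ..., E[X_(n)]
cov = np.cov(samples, rowvar=False)     # variances/covariances of order statistics
print("means of order statistics:", np.round(means, 3))
print("variance of the minimum :", round(float(cov[0, 0]), 3))
```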
