# A negative coefficient for a constant in a linear regression?

I just ran a linear regression, in which I got a negative coefficient for the constant. Does this mean the model is wrong?

## Popular Answers

Zachary Gassoumis · University of Southern California

Even if your dependent variable is typically or always positive (i.e., has a positive mean value), it wouldn't necessarily be surprising to have a negative constant. For example, consider an independent variable that has a strongly positive relationship to a dependent variable. The values of the dependent variable are positive and range from 1 to 5, and the values of the independent variable are positive and range from 100 to 110. In this case, it would not be surprising if the regression line crossed the x-axis somewhere between x = 0 and x = 100 (i.e., passed from the first quadrant into the fourth quadrant), which would result in a negative value for the constant.

The bottom line is that you need to have a good sense of your model and the variables within it, and a negative value on the constant should not generally be a cause for concern. Typically, it is the overall relationships between the variables that will be of the most importance in a linear regression model, not the value of the constant.
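The example above can be sketched numerically. The following is a minimal simulation with made-up data matching the answer's hypothetical ranges (predictor in [100, 110], response in roughly [1, 5]); it is an illustration, not anyone's actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data matching the example: the predictor runs from
# 100 to 110 while the response stays positive, roughly in [1, 5].
x = np.linspace(100, 110, 50)
y = 1 + 0.4 * (x - 100) + rng.normal(0, 0.2, size=x.size)

# Ordinary least squares fit of a straight line.
slope, intercept = np.polyfit(x, y, 1)
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
# The intercept is strongly negative (around -39) even though every
# observed y is positive, because x = 0 lies far outside the data.
```

The fitted line is a fine description of the data over 100-110; the negative constant is just what that line does when extrapolated back to x = 0.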

## All Answers (16)


Tatiana Andreeva

Does this make sense?

Kelvyn Jones · University of Bristol

Imagine you are modelling house prices in terms of the number of rooms. If you use the raw data, the intercept is a rather meaningless extrapolation: the price of a zero-roomed house; you are extrapolating beyond the observed data! However, if you transform the predictor so that it is the number of rooms minus the average number of rooms (grand-mean centering), the intercept becomes an interpolation: the price of an average-sized house (the value of the intercept when the transformed variable is zero). Getting a negative value then would certainly cause me to think hard about both the data and the model, as that result is very implausible.

I recommend that students routinely re-parameterise their regression models (by centering continuous predictors on the sample average, and by choosing an appropriate reference category for dummy-coded categorical variables) so that the intercept is interpretable and meaningful. Such centering will not affect the estimates of the other predictors' coefficients.
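The centering point can be illustrated with a small sketch. The data below are hypothetical house prices against room counts; the sketch shows that centering leaves the slope untouched and turns the intercept into the fitted value for an average-sized house (which equals the mean of the response):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: price (arbitrary units) against number of rooms.
rooms = rng.integers(2, 8, size=100).astype(float)
price = 50 + 30 * rooms + rng.normal(0, 10, size=rooms.size)

# Fit on the raw predictor, then on the grand-mean-centered predictor.
slope_raw, int_raw = np.polyfit(rooms, price, 1)
slope_c, int_c = np.polyfit(rooms - rooms.mean(), price, 1)

print(f"raw:      slope = {slope_raw:.2f}, intercept = {int_raw:.2f}")
print(f"centered: slope = {slope_c:.2f}, intercept = {int_c:.2f}")
# The slope is identical in both fits; only the intercept changes.
# After centering, the intercept equals price.mean(): the predicted
# price of an average-sized house, an interpolation within the data.
```

If the centered intercept came out negative here, that really would be a red flag, since it would say the average house has a negative predicted price.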

This is particularly important in multilevel models where you may have a great number of differential intercepts in a random intercepts model. See

Kelvyn Jones and S. V. Subramanian (2012). Developing Multilevel Models for Analysing Contextuality, Heterogeneity and Change, Volume 1, 1-269. University of Bristol.

Which is downloadable from http://www.mendeley.com/profiles/kelvyn-jones/

You may also be interested in this free online course we have developed, which includes training in ordinary regression (you can follow it in R, Stata, or MLwiN):

http://www.bristol.ac.uk/cmm/learning/course.html

Kelvyn Jones · University of Bristol

Can I suggest you post this as a separate question? It is very different, and it is not an answer!

Mara Inés Espinosa H. · University of La Serena

Sorry, this is the first time I have used this site, so I don't yet know how it works.

Thanks!
