# How to convert coefficients of Log-Transformed variables to Odds-Ratio in Logistic Regression?

I'm using some log-transformed variables within my logistic regression model. After fitting the model I would like to convert its coefficients to odds-ratios. Does anyone know how to do this? I'm not sure if it is simply a double exponentiation of the coefficient, like

odds-ratio = exp(exp(beta-hat)) - 1

- Logistic regression models the probability of success through the log-odds, as in logit[P(Y=1)].

It is not necessary to log-transform the independent variables, because logistic regression can handle continuous and categorical data.

Say after fitting the model you get logit[P(Y=1)] = 0.545 + 0.254X1 + 0.786X2.

The estimate for beta1 at X1 is such that 0.254 is the log odds ratio for a one-unit increase in X1.

So if you want the odds ratio for X1, you need to take exp(0.254) = 1.29.
- I don't think the e^e^ approach works, and an exact solution is algebraically messy. However, if you view the odds ratio as a rate of change in the odds, you can get a good approximation using the chain rule for derivatives. Since the derivative of the log is simply 1/x, I think all you need to do is divide the odds ratio by x.
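The exp(0.254) conversion can be checked numerically. A minimal Python sketch, using the illustrative coefficients from this thread (not fitted to real data): the odds ratio for a +1 increase in X1 equals exp(beta1) at any baseline values of X1 and X2.

```python
import math

# Example logit model from the thread: logit[P(Y=1)] = 0.545 + 0.254*X1 + 0.786*X2
b0, b1, b2 = 0.545, 0.254, 0.786

def odds(x1, x2):
    """Odds of Y=1 at the given predictor values: exp(linear predictor)."""
    return math.exp(b0 + b1 * x1 + b2 * x2)

# Odds ratio for a +1 increase in X1, at arbitrary baseline values:
or_x1 = odds(x1=3.0, x2=1.0) / odds(x1=2.0, x2=1.0)
print(round(or_x1, 4))          # same as exp(b1)
print(round(math.exp(b1), 4))   # 1.2892
```

The baseline values (3.0, 1.0) are arbitrary; changing them leaves the ratio unchanged, which is exactly why exp(coefficient) is a valid summary for a +1 increase.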
- > Zamalia Mahmud: since logistic regression assumes a linear relationship between the log-odds and the continuous variable, log-transforming the independent variable can be necessary to fulfill this assumption.

> Ramon Rosa: maybe instead of explaining the results as "exp(beta) = odds-ratio for an increase of +1 in x", as usual in logistic regression, it would be more convenient to express them as "exp(beta) = odds-ratio for a 10-fold increase of x" (assuming you are using a decimal-logarithm transformation of x), since a +1 increase of the decimal log corresponds to multiplying x by 10? That would avoid the inconvenience of complicated algebra, chain rules and so on...

> David Abbott: I wonder if the approximation by the derivative would apply for variations so large that a log-transformation is needed...
- Zamalia Mahmud is correct. To explicate further, the odds for a particular X1, X2 is exp(0.545 + 0.254X1 + 0.786X2); if X1 increases by 1, the new odds is exp(0.545 + 0.254(X1+1) + 0.786X2) = exp(0.254)*exp(0.545 + 0.254X1 + 0.786X2); taking the ratio and simplifying, the odds ratio is exp(0.254). Similarly for any other increment in X1 or X2. This is the simplest method, and it is always correct (assuming the correctness of the logit model). Taking derivatives should yield the same result IF the explanatory variables are continuous. If, say, X1 is a binary variable (0 or 1), it is technically incorrect to take a partial derivative with respect to it, though in fact the answer will be approximately correct.
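The claim that derivatives agree with the exact result for continuous variables can also be verified. A small sketch, again with the thread's illustrative coefficients: the relative rate of change of the odds with respect to X1 recovers the coefficient b1, so exp(b1) is the limit of small-increment odds ratios.

```python
import math

# Continuous-variable case from the thread: odds = exp(0.545 + 0.254*X1 + 0.786*X2)
b0, b1, b2 = 0.545, 0.254, 0.786

def odds(x1, x2=1.0):
    return math.exp(b0 + b1 * x1 + b2 * x2)

# Central-difference numerical derivative of the odds with respect to X1:
x1, h = 2.0, 1e-6
d_odds = (odds(x1 + h) - odds(x1 - h)) / (2 * h)

# The relative rate of change (d_odds / odds) equals the coefficient b1:
print(round(d_odds / odds(x1), 4))   # approx 0.254
```

For a binary X1 this derivative is not meaningful, matching the caveat in the answer above; the exact ratio exp(0.254) remains valid there.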
- > David Collins: the interpretation of the odds-ratio as exp(coefficient of the logit regression) is definitely correct; that was not the point on which I answered Zamalia Mahmud.

The point I mentioned is that, as you write, you assume log(odds) = 0.545 + 0.254 X1 + 0.786 X2, which means a LINEAR relationship between log(odds) and your X1 and X2. But there is no reason for the relation to always be linear in X; you may well have log(odds) = 0.545 + 0.254 log(X1) + 0.786 log(X2), which is a different model.

Exactly as sometimes you have Y = a + b*X and sometimes you have Y = a + b*log(X). So, just as you check that assuming a linear relation between Y and X is not too wrong before using linear regression results, you should check that assuming a linear relation between log(odds) and X is correct.

So the point was "yes, it sometimes makes sense to use log(X) and not X", which is not the same thing, and it was only that first part of Zamalia Mahmud's post that I was answering.

And if you use log(X), the odds-ratio for an increase of 1 in X no longer corresponds to exp(coefficient), which I think was the original question... But the odds-ratio for X -> 10*X, assuming a decimal logarithm, will.
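This 10-fold interpretation follows directly from log10(10*x) = 1 + log10(x). A short Python sketch, with a hypothetical intercept and the thread's coefficient value (both illustrative assumptions): when the predictor enters as log10(X), the odds ratio for multiplying X by 10 equals exp(coefficient), at any starting X.

```python
import math

# Hypothetical model with a decimal-log predictor:
# logit[P(Y=1)] = a + b * log10(X)   (a, b are made-up illustration values)
a, b = -1.0, 0.254

def odds(x):
    return math.exp(a + b * math.log10(x))

x = 7.0
or_tenfold = odds(10 * x) / odds(x)   # odds ratio for X -> 10*X
print(round(or_tenfold, 4))           # equals exp(b), independent of x
print(round(math.exp(b), 4))
```

By contrast, odds(x + 1) / odds(x) here depends on x, which is exactly why "exp(beta) per +1 in X" stops being the right reading once X is log-transformed.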

