Seems to be a lot of answers for interpreting odds ratios < 1 and > 1, but none for odds ratios > 2?
If I have an odds ratio of 2.22, does this mean there is a 122% increase in the odds for a 1 unit increase in the corresponding X?
Thank you.
Odds are ratios, but odds ratios are ratios of odds (ratios of ratios). This gets confusing quickly, and the previous answers seem confused by this. That's because while odds behave very nicely in regression, they map poorly onto everyday numerical intuition.
So, first, odds are ratios, but they're not the ratio we're usually more familiar with: probability. If you're talking about the odds of disease, for example, the odds are likely expressed as the number of disease cases per healthy case. So, if the odds are 1:10 or 0.1, that means you have 1 disease case per 10 healthy cases. However, we're usually used to a different ratio: the number of disease cases per total population, which here is a probability of 1/11. That's not odds: odds are p/(1-p).
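To make the distinction concrete, here is a minimal sketch of the probability-to-odds conversion described above (the helper names are mine, not standard functions):

```python
def prob_to_odds(p):
    """Convert a probability p to odds, odds = p / (1 - p)."""
    return p / (1 - p)

def odds_to_prob(odds):
    """Convert odds back to a probability, p = odds / (1 + odds)."""
    return odds / (1 + odds)

# 1 disease case per 10 healthy cases:
# probability = 1/11, but odds = (1/11) / (10/11) = 0.1
print(prob_to_odds(1 / 11))  # 0.1
print(odds_to_prob(0.1))     # 0.0909... = 1/11
```

Note how the two numbers diverge as probability grows: at p = 0.5 the odds are 1, and at p = 0.9 the odds are 9.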
So now when we think about odds ratios, we need to think in terms of what odds actually mean. If you think of odds as a ratio with a constant denominator (1 is particularly convenient), then a ratio of odds amounts to multiplying the numerator by the OR. Then it's clear that OR = 1 means "no difference": you multiply the odds by 1 and get the same odds back.
An OR = 2.22 means that for each denominator event, there will be 2.22 times as many numerator events for each unit increase in X. So, if you have 0.1 disease per 1 healthy case at X=0, at X=1 you'd have 0.222 disease per 1 healthy case; if you have 1 disease per 1 healthy case at X=0, at X=1 you'd have 2.22 disease per 1 healthy case.
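The worked example above can be sketched as a one-line function. This assumes the OR = 2.22 from the question applies multiplicatively per unit increase in X, which is exactly how a logistic regression coefficient works (the function name is mine, for illustration):

```python
def odds_at(x, baseline_odds, odds_ratio):
    """Odds at covariate value x, given the odds at x=0 and a per-unit OR.

    Each unit increase in x multiplies the odds by odds_ratio.
    """
    return baseline_odds * odds_ratio ** x

# 0.1 disease per healthy case at X=0 -> 0.222 at X=1
print(odds_at(1, 0.1, 2.22))
# 1 disease per healthy case at X=0 -> 2.22 at X=1
print(odds_at(1, 1.0, 2.22))
# The effect compounds: at X=2 the odds are 0.1 * 2.22**2
print(odds_at(2, 0.1, 2.22))
```

This also shows why the "122% increase" reading in the question is right: multiplying by 2.22 is the same as a 122% increase in the odds, per unit of X.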