I have looked at the answers at:
- Difference between logit and probit models
- Interpretation of simple predictions to odds ratios in logistic regression
In a linear regression, you regress $Y$ on $X$. For each subject $i$, you have an $X_i$ and a $Y_i$.
Assuming I want to do the transformation in a logistic regression by hand, how do I obtain $P(Y_i)$ for each subject? I found a website that walks through the procedure for categorical predictors (http://vassarstats.net/logreg1.html), which essentially computes the log odds for each combination of categorical predictors. How do you then deal with continuous predictors?
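To make the categorical case concrete, here is a minimal sketch of the per-group computation (my own illustration with simulated data, not the linked page's example; the predictor `x` and its two levels are made up for this demo):

```r
# Sketch: for a categorical predictor, the "observed" log odds
# can be computed by hand within each category.
set.seed(1)
x <- sample(c("A", "B"), 200, replace = TRUE)  # hypothetical binary predictor
p <- ifelse(x == "A", 0.3, 0.6)                # true success probabilities (assumed)
y <- rbinom(200, 1, p)                         # observed 0/1 outcome

p_hat    <- tapply(y, x, mean)                 # empirical P(Y = 1) per category
log_odds <- log(p_hat / (1 - p_hat))           # observed log odds per category
print(log_odds)
```

Each category has many subjects sharing the same predictor value, so the group proportion serves as the observed $P(Y)$; this is exactly what breaks down for a continuous predictor, where each subject's $X_i$ is typically unique.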
For illustration, I am using the dataset from: https://stats.idre.ucla.edu/r/dae/logit-regression/
bindata <- read.csv("https://stats.idre.ucla.edu/stat/data/binary.csv")
What I would like to achieve is for the results from:
glm.logistic <- glm(admit ~ gpa, data = bindata, family = "binomial")
to match with
glm.linear <- glm(admit_transform ~ gpa, data = bindata, family = "gaussian")
where admit_transform is the log odds of admit.
The whole point of this is really to understand how logistic regression works, not as a practical way to do logistic regression.
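One way to get an observed log odds per subject with a continuous predictor is to bin it and use the within-bin proportion as the empirical $P(Y)$. The sketch below uses simulated data standing in for the UCLA `admit`/`gpa` set (so it runs without internet access); the true coefficients $-4$ and $1.5$ are assumptions of the simulation, not values from the real data:

```r
# Simulated stand-in for the UCLA admissions data.
set.seed(42)
gpa   <- runif(500, 2, 4)
admit <- rbinom(500, 1, plogis(-4 + 1.5 * gpa))  # true model on the logit scale

# 1. Cut gpa into decile bins and compute the empirical P(admit) per bin.
bins <- cut(gpa, breaks = quantile(gpa, probs = seq(0, 1, 0.1)),
            include.lowest = TRUE)
p_hat    <- tapply(admit, bins, mean)
gpa_mid  <- tapply(gpa, bins, mean)              # mean gpa within each bin
log_odds <- log(p_hat / (1 - p_hat))             # "observed" log odds per bin

# 2. A linear regression of the binned log odds on gpa should roughly
#    recover the coefficients of the logistic regression on the raw data.
fit_lin <- lm(log_odds ~ gpa_mid)
fit_log <- glm(admit ~ gpa, family = binomial)
print(coef(fit_lin))
print(coef(fit_log))
```

The two slope estimates agree only approximately: the binned version throws away within-bin variation and breaks down in bins where $\hat p$ is 0 or 1 (the log odds become infinite), which is part of why `glm` fits the logistic model by maximum likelihood rather than by transforming the response.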
There is an observed $Y$ and a fitted $Y$. The observed $Y$ needs to be transformed before it can be fitted, i.e., for each subject I will have a $P(Y)$. The observed $Y$ would also be where the residuals come from; more specifically, there is an observed and a fitted/predicted $\log(P(Y)/(1 - P(Y)))$. My question really is: how do I compute this observed $P(Y)$ manually in the case of continuous predictors? – RJ- Mar 23 '18 at 12:56