I have a mixed-effects logistic regression model. All the predictors are categorical (I need to keep age categorical as well, rather than treating it as a continuous variable). The predictors are coded with orthogonal sum-to-zero contrasts. However, I have a problem of near-perfect separation of the data.
mod <- glmer(score ~ 1 + age + gender + real*speaker + (1|part) + (1|item),
             data = data, family = binomial(link = "logit"),
             control = glmerControl(optimizer = "bobyqa"))
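For context, with all-categorical predictors, separation usually shows up as cells in the outcome-by-predictor cross-tabulation where the response is always 0 or always 1. A quick check (a sketch using the column names from the model above; this assumes `score`, `real`, and `speaker` exist in `data`) is:

```r
# Cross-tabulate the binary outcome against the predictor combinations.
# Any cell with a zero count for one level of score (i.e., score is
# constant within that cell) indicates (quasi-)complete separation.
with(data, table(score, real, speaker))
```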
I have read from the link here that, for mixed models, the only solution is to switch to a Bayesian model, since penalized regression is not available for mixed models. I decided to use the brms package because its syntax is very similar to lme4's. However, I am not confident with Bayesian models and priors, and I am not sure about the results.
mod_bayesian <- brm(score ~ 1 + age + gender + real*speaker + (1|part) + (1|item),
                    data = data, family = bernoulli(),
                    iter = 1000, chains = 4, cores = 4)
Are mod and mod_bayesian equivalent? Or, more precisely, what is the Bayesian equivalent of mod? And how should I go about selecting priors?
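To make the prior question concrete, here is the kind of specification I am asking about. This is only a sketch: `get_prior()` lists the defaults brms would use, and the `normal(0, 2.5)` prior on the fixed effects is a placeholder I have seen suggested for logit-scale coefficients, not a choice I have validated for my data.

```r
library(brms)

# Inspect the default priors brms would assign to this formula and data
get_prior(score ~ 1 + age + gender + real*speaker + (1|part) + (1|item),
          data = data, family = bernoulli())

# Fit with a weakly informative prior on all fixed-effect coefficients
# (class = "b"); normal(0, 2.5) here is a placeholder, not a recommendation
mod_bayesian_wi <- brm(
  score ~ 1 + age + gender + real*speaker + (1|part) + (1|item),
  data = data, family = bernoulli(),
  prior = set_prior("normal(0, 2.5)", class = "b"),
  iter = 1000, chains = 4, cores = 4
)
```

Would something along these lines be a sensible way to regularize the coefficients affected by separation?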