
I am currently writing my master's thesis and am running a logistic regression with the following glm() call in R:

brandrecall_logistic <- glm(unaided_recall ~ frequency + dynamic_ad + 
    dummy_branded + prerecorded_ad + host_credibility + 
    frequency:dynamic_ad + frequency:brandedcontent_ad + 
    frequency:prerecorded_ad + frequency:host_credibility + age, 
    data = thesis_data, family = binomial)

Almost all of my variables are yes/no questions and are therefore coded 0/1 in my data set. When I leave "frequency:brandedcontent_ad" out of the formula, the p-values look reasonable; as soon as I add it, every p-value is around 0.9. I do not know how to fix this. I need the interaction terms because they are moderators in my research, and I need logistic regression because my DV is binary.
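In case it matters, this is the kind of check I have in mind to see whether the outcome is constant within some combination of frequency and brandedcontent_ad (just a sketch, using the column names from my formula above):

    # Cross-tabulate the outcome against frequency and brandedcontent_ad.
    # If unaided_recall is all 0 or all 1 inside one of these cells, that
    # combination separates the outcome perfectly and the interaction
    # coefficient cannot be estimated sensibly.
    with(thesis_data, table(frequency, brandedcontent_ad, unaided_recall))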

I fitted the model with a plain glm() call and expected reasonable p-values. Not everything has to be significant, which is fine, but right now nothing is.

Below is the difference between leaving the interaction term out and including it. I also only get the glm.fit warning when I include the interaction between frequency and the branded-content ad. (A quick check on the fitted model follows the screenshots.)

Interaction term included:

[screenshot of the model summary with the interaction term]

Interaction term excluded:

[screenshot of the model summary without the interaction term]
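For reference, this is roughly how I would check the fitted model itself for the problem the warning describes (again only a sketch against the fitted object above):

    # With the interaction included, fitted probabilities pinned at 0 or 1
    # and very large coefficients/standard errors would point to separation.
    summary(fitted(brandrecall_logistic))
    summary(brandrecall_logistic)$coefficients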

  • https://stats.stackexchange.com/questions/11109/how-to-deal-with-perfect-separation-in-logistic-regression?#68917 – user20650 Aug 10 '23 at 16:49
  • 1
    By misquoting the error message in the title, you make it difficult to research the solution. It is always a good idea to quote software error messages exactly. – whuber Aug 10 '23 at 17:56

0 Answers