We know that in linear regression, when a coefficient is not significant in the multiple regression but is significant in a simple regression, the most likely reason is multicollinearity. But what about the reverse case:

each coefficient is significant in the multiple regression but not significant in a simple regression?

I am not sure whether this is possible. If it is, what could the reasons be?

I may have a misunderstanding: I always thought that a significant overall F test together with non-significant t tests on the individual coefficients was the unique flag of multicollinearity. So the phenomenon above (significant in combination, not significant individually) is also such a flag, right?
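For concreteness, here is a minimal simulation sketch (the setup, parameter values, and variable names are my own, purely illustrative; Python with numpy and statsmodels) in which two strongly correlated predictors have effects that nearly cancel: each simple-regression slope is close to zero and typically non-significant, while both coefficients in the multiple regression are large and highly significant.

```python
# Hypothetical simulation (my own illustrative setup, not from any referenced
# post): two strongly correlated predictors whose true effects nearly cancel.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, r = 200, 0.95
x1 = rng.normal(size=n)
x2 = r * x1 + np.sqrt(1 - r**2) * rng.normal(size=n)  # corr(x1, x2) ~ 0.95
y = x1 - x2 + rng.normal(scale=0.5, size=n)           # effects almost cancel

# Simple regressions: each slope is near +/-0.05 and typically non-significant
for x in (x1, x2):
    fit = sm.OLS(y, sm.add_constant(x)).fit()
    print("simple:  slope = %6.3f, p = %.3f" % (fit.params[1], fit.pvalues[1]))

# Multiple regression: coefficients near +1 and -1, both highly significant
fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("multiple: slopes =", fit.params[1:], " p =", fit.pvalues[1:])
```

Note that this particular mechanism is itself driven by the strong correlation between x1 and x2, i.e. by collinearity; confounding (Simpson's paradox), discussed in the comments below, is a different route to the same pattern.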

user6703592
  • Please read up on Simpson's Paradox. Search for that in the search box and you will find tons of great info. Here is a short answer I gave to one question: https://stats.stackexchange.com/a/478499/141304 – abalter Jul 30 '21 at 19:59
  • @abalter it seems that this is still a problem of multicollinearity? Could you open a question specific to this question? – user6703592 Aug 01 '21 at 17:06
  • I don't understand what you mean by "open a question specific to this question." Multicollinearity becomes a problem as you add more variables (multiple regression), when those variables are connected. You are talking about seeing a 0 slope with a single variable but significant slopes as you add covariates. This sort of behavior is to be expected when you add covariates. It is also easily demonstrated in Simpson's Paradox. – abalter Aug 02 '21 at 18:54
  • @abalter I see where my confusion is now. I always thought that a significant F test with non-significant t tests was the unique flag of multicollinearity. So the above phenomenon is also a flag, right? – user6703592 Aug 03 '21 at 03:08
  • Nope. You are way off. The t-test is just the F-test for when you only have one covariate, i.e. simple regression. You are confused about what collinearity is. The way to test for collinearity is using the VIF, and this has nothing to do with confounding. I strongly recommend you learn what confounding is, and do as I suggested and read about Simpson's Paradox, as it is the simplest way to demonstrate confounding and the power of covariates. https://stats.stackexchange.com/questions/538773/simpsons-paradox, https://stats.stackexchange.com/questions/19525/simpsons-paradox-or-confounding – abalter Aug 03 '21 at 04:11
  • @abalter sorry, by “above phenomenon” I meant Simpson's paradox; it is just due to multicollinearity, right? – user6703592 Aug 03 '21 at 04:17
  • In your question, "not sure if it is possible," what does "it" refer to? It is possible for the F test to be significant while all the t tests are not significant (at the same alpha level) even when the explanatory variables are orthogonal. There are subtle (but essentially trivial) reasons for simple-regression p-values to differ from the multiple-regression p-values: namely, the error degrees of freedom differ, so a different Student t distribution is used to compute the p-values. – whuber Aug 03 '21 at 15:28
  • @whuber I mean that the F test being significant while all the t tests are not significant is a possible signal of multicollinearity, but not always. And here, each coefficient being significant in the multiple regression but not significant in a simple regression is most likely the result of multicollinearity, right? (See the simulation sketch after these comments.) – user6703592 Aug 03 '21 at 16:20
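Following up on the thread above, here is a quick simulation (again a purely illustrative setup of my own, in Python with numpy and statsmodels) of the "classic" flag discussed in the comments: an overall F test that is significant while neither individual t test is, produced by two strongly correlated predictors with weak individual effects.

```python
# Hypothetical check of the "classic" collinearity flag discussed above:
# overall F test significant while neither individual t test is.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, r, hits = 30, 0.95, 0
for _ in range(1000):
    x1 = rng.normal(size=n)
    x2 = r * x1 + np.sqrt(1 - r**2) * rng.normal(size=n)
    y = 0.4 * (x1 + x2) + rng.normal(size=n)  # weak individual effects
    fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
    if fit.f_pvalue < 0.05 and (fit.pvalues[1:] > 0.05).all():
        hits += 1
print(f"F significant, no t significant: {hits}/1000 simulations")
```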
