
I have created two multiple linear regression models of the same data. The two models differ only in that a few variables were dropped.

Model_A includes all of the independent (X) variables (A, B, C, D): [regression output for Model_A]

Model_B includes only A and B, as C and D were found to be insignificant: [regression output for Model_B]

As you can see, the F-statistic has increased but the adjusted R^2 has decreased. My understanding is that F and R^2 should increase concurrently. Can anyone explain these results?
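For reference (standard definitions, with $n$ observations, $p$ predictors, and a fitted intercept), the two statistics penalize model size quite differently:

$$F = \frac{R^{2}/p}{(1-R^{2})/(n-p-1)}, \qquad R^{2}_{\text{adj}} = 1-(1-R^{2})\,\frac{n-1}{n-p-1}.$$

Halving $p$ (dropping C and D) can raise $F$ substantially even if $R^{2}$ falls a little, while $R^{2}_{\text{adj}}$ applies only the mild factor $\frac{n-1}{n-p-1}$ and so can move the other way.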

Nick
  • What makes you think that the F-statistic and R should increase concurrently? – Dayne Mar 27 '22 at 10:10
  • In general, the $F$ statistic can all too often decrease when the number of predictors increases. That's because $F$ normalizes for the number of predictor degrees of freedom $p$ by dividing the regression sum of squares by $p$. $\chi^2$ statistics $\uparrow$ as $p \uparrow$, so the $\chi^2$ statistic is perhaps a better basis for judgment. This is bolstered by the fact that for the linear model the likelihood ratio $\chi^2$ statistic is exactly $-n \times \log(1 - R^{2})$. – Frank Harrell Mar 27 '22 at 12:35
  • @Michael Simply look at the formula relating $F$ to $R^{2}$ and notice that it's something times a monotonic function of $R^{2}$. That 'something' encompasses a complete answer to the question. – Glen_b Mar 27 '22 at 22:04
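Illustrating the comments above, here is a minimal simulation sketch. It does not use the question's actual data: the variable names match but the sample size, effect sizes, and seed are hypothetical, chosen only so that C and D are weak but not pure noise, in which case dropping them tends to raise the overall F while lowering adjusted R^2:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # Hypothetical data: A and B carry strong signal; C and D are weak
    # (|t| roughly 1-2: "insignificant" at 5% but not pure noise).
    A, B, C, D = rng.standard_normal((4, n))
    y = 2*A + 1.5*B + 0.15*C + 0.15*D + rng.standard_normal(n)

    def overall_f_and_adj_r2(X, y):
        """Overall F and adjusted R^2 for an OLS fit with intercept."""
        n, p = X.shape
        X1 = np.column_stack([np.ones(n), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        f = (r2 / p) / ((1 - r2) / (n - p - 1))
        adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
        return f, adj

    for name, X in [("Model_A (A,B,C,D)", np.column_stack([A, B, C, D])),
                    ("Model_B (A,B)", np.column_stack([A, B]))]:
        f, adj = overall_f_and_adj_r2(X, y)
        print(f"{name}: F = {f:.1f}, adjusted R^2 = {adj:.4f}")

A rule of thumb consistent with this: removing a predictor raises adjusted R^2 only when that predictor's partial $F$ (its squared $t$-statistic) is below 1, so dropping "insignificant" variables with $|t|$ between 1 and 2 lowers adjusted R^2 even as the overall $F$, which divides the explained variation by $p$, goes up.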

0 Answers