
I would like to determine a partial correlation for the fixed effects in my linear mixed models.

Suppose, for example, I run the model $A_{ij} = \beta_0 + \beta_1 X + \beta_2 Y + \beta_3 Z + u_i + \epsilon_{ij}$.

Can I calculate a partial correlation coefficient between $X$ and $A$ from $\beta_1$ using the formula

$r = \beta_1 \cdot \operatorname{var}(\beta_1) / \operatorname{sd}(A)$ ?

Walter
    What information do you have available for calculating? Because the units of your proposed formula for $r$ are incorrect, it cannot possibly be right. (The units of $\beta_1$ are in units of $A$ per unit of $X$, so the units for your $r$ are units of $A$ per squared unit of $X,$ rather than being unitless as required of any correlation coefficient.) – whuber Mar 08 '23 at 18:22
  • This is the rescaling formula between the correlation coefficient and the regression coefficient (see this post); the simple-regression version is written out in the note after these comments. It is valid for simple linear regression, but I am wondering whether it is also valid in a more complex model like the one presented here. – Walter Mar 08 '23 at 19:08
  • You haven't quoted the formula correctly. What your readers will understand by "$\operatorname{sd}(\beta_1)$" (which makes no sense when read literally because $\beta_1$ is a number) is that it is the sampling variance of the least squares estimate of $\beta_1.$ – whuber Mar 08 '23 at 19:45
  • Thanks, indeed I mean the SE of the estimate for $\beta_1$. I will change it. – Walter Mar 08 '23 at 20:44
  • The units still are nonsensical. – whuber Mar 09 '23 at 14:06
  • How would you suggest writing it down? Might be my ignorance. – Walter Mar 10 '23 at 21:31
  • I would suggest reviewing the formulas you can find here on CV. – whuber Mar 10 '23 at 22:18
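For reference, the rescaling relation mentioned in the comments holds for simple linear regression of $A$ on a single predictor $X$:

$$ \hat\beta_1 = r_{XA}\,\frac{\operatorname{sd}(A)}{\operatorname{sd}(X)} \qquad\Longleftrightarrow\qquad r_{XA} = \hat\beta_1\,\frac{\operatorname{sd}(X)}{\operatorname{sd}(A)}, $$

which is unitless because $\hat\beta_1$ carries units of $A$ per unit of $X$. With several predictors (and random effects), $\hat\beta_1 \cdot \operatorname{sd}(X)/\operatorname{sd}(A)$ gives a standardized coefficient rather than a partial correlation.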

1 Answer


One solution you can consider is the partR2 package in R (Stoffel et al., 2021). It allows you to derive a partial effect size for each fixed-effect predictor, based on two coefficients. The first is the part $R^2$, defined as:

$$ R^2_{X^*} = \frac{Y_X - Y_{\tilde{X}}}{Y_X + Y_{RE} + Y_R} = \frac{Y_X - Y_{\tilde{X}}}{Y_{\text{Total}}} $$

where $Y_X$ is the variance explained by the fixed effects in the full model and $Y_{\tilde{X}}$ is the variance explained by the fixed effects in a reduced model that omits the predictor of interest. The denominator $Y_X + Y_{RE} + Y_R$ is the total variance in the model (fixed effects + random effects + residual). The numerator therefore captures how much explained variance is lost when that predictor is removed, i.e. the variance it explains uniquely beyond the other predictors. The second coefficient is the inclusive $R^2$, defined as:

$$ IR^2_{X^*} = SC^2 \cdot R^2_{X^*} $$

where $SC$ is the structure coefficient, the correlation between the predictor of interest and the linear predictor (the fitted values from the fixed part of the model). This coefficient quantifies the proportion of variance the predictor explains both uniquely and jointly with the other predictors.
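In practice the workflow looks roughly like the sketch below. The data frame `my_data`, the grouping factor `id`, and the variable names are placeholders mirroring the model in the question, not part of the package.

```r
# Rough sketch, assuming a data frame `my_data` with columns A, X, Y, Z and a
# grouping factor `id` (all placeholder names mirroring the question's model).
library(lme4)
library(partR2)

# Random-intercept model: A_ij = b0 + b1*X + b2*Y + b3*Z + u_i + e_ij
mod <- lmer(A ~ X + Y + Z + (1 | id), data = my_data)

# Part (semi-partial) R2 for each fixed effect, with parametric-bootstrap CIs;
# the output also reports structure coefficients (SC) and inclusive R2 (IR2).
res <- partR2(mod, partvars = c("X", "Y", "Z"), nboot = 100)
summary(res)
```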

Citation

Stoffel, M. A., Nakagawa, S., & Schielzeth, H. (2021). partR2: Partitioning R2 in generalized linear mixed models. PeerJ, 9, e11414. https://doi.org/10.7717/peerj.11414