
Suppose a linear regression model: $$y \sim \mathrm{Normal}(X\beta, \sigma)$$

For our purposes, assume $y$ is a univariate outcome and $X$ is a design matrix containing an intercept and one additional predictor, i.e. $X\beta = \beta_0 + \beta_1 x_1$. The Bayesian equivalent adds priors on $\beta$ and $\sigma$.

Sometimes researchers specify a multivariate prior on the entire $\beta$ vector, such as (in matrix notation) $\beta \sim \mathrm{MVN}(0, \Sigma)$, while in other cases they assign a univariate prior to each element of $\beta$, such as $\beta_0 \sim N(0, \sigma_0)$ and $\beta_1 \sim N(0, \sigma_1)$.

Given that the multivariate approach explicitly places a prior on the covariance between parameters, when might researchers wish to specify multivariate rather than univariate priors?

  • Hierarchical models often employ multivariate priors, but I don't see any reason the multivariate distribution necessarily introduces a hierarchical structure. For example, we might set $\Sigma = I$, the identity matrix, on the standardized coefficients, implying each coefficient is independent of the others.
  • In cases where groups of parameters must satisfy some constraint, such as lying on the unit simplex, it makes sense that the parameters are necessarily correlated with each other and a multivariate prior seems necessary. However, couldn't one also write this out as a set of univariate priors whose equations reference the other parameters?
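To make the first bullet concrete, here is a quick numerical check (a sketch, not part of any particular model) that an $\mathrm{MVN}(0, I)$ prior assigns exactly the same log density as independent $N(0, 1)$ priors on each coefficient, so the multivariate form by itself adds nothing hierarchical. The point $(\beta_0, \beta_1)$ is arbitrary:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# An arbitrary evaluation point (beta_0, beta_1)
beta = np.array([0.3, -1.2])

# Multivariate prior with Sigma = I (unit-variance, zero-covariance components)
lp_mvn = multivariate_normal.logpdf(beta, mean=np.zeros(2), cov=np.eye(2))

# Product of independent univariate N(0, 1) priors, i.e. sum of log densities
lp_uni = norm.logpdf(beta, loc=0, scale=1).sum()

print(np.isclose(lp_mvn, lp_uni))  # → True
```

The two specifications coincide whenever $\Sigma$ is diagonal; correlation between coefficients only enters through off-diagonal elements of $\Sigma$.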
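On the second bullet, one well-known way to "write out" a simplex constraint as univariate priors is the stick-breaking construction: place an independent Beta prior on each of $K-1$ stick fractions and transform them, so each weight depends deterministically on the earlier ones. The Beta(1, 1) parameters below are an arbitrary illustrative choice, not a recommendation:

```python
import numpy as np
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(0)
K = 4  # dimension of the simplex

# K-1 independent univariate Beta priors on stick-breaking fractions
v = beta_dist.rvs(1, 1, size=K - 1, random_state=rng)

# Transform: each weight takes its fraction of the remaining stick,
# so later weights are functions of the earlier draws
w = np.empty(K)
remaining = 1.0
for k in range(K - 1):
    w[k] = v[k] * remaining
    remaining -= w[k]
w[K - 1] = remaining  # last weight absorbs what is left

print(np.isclose(w.sum(), 1.0), (w >= 0).all())  # → True True
```

So yes, the constraint can be expressed through univariate priors plus a deterministic transform; the induced correlation between the weights comes from the transform rather than from an explicit covariance matrix.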
socialscientist
  • Note: this is similar to #2 here https://stats.stackexchange.com/questions/254254/why-do-we-need-multivariate-regression-as-opposed-to-a-bunch-of-univariate-regr, which is unanswered, part of a multi-question post, and not quite precise enough. – socialscientist Sep 27 '22 at 19:56

0 Answers