I am studying normal linear regression and wanted to ask a question about its utility when working with independent RVs. Suppose that for $k \in \{1,\dots,n\}$, $$Y_k = \beta_0 + \beta_1x_{k1} + \beta_2x_{k2} + \dots + \beta_qx_{kq} + \epsilon_k \quad \text{where } \epsilon_k \sim_{ind} N(0,\sigma^2)$$
We can put all the $Y_k$ in a vector $Y$, the explanatory variables in a matrix $X$, and the parameters in a vector $\beta$, and write $Y = X\beta + \epsilon$, where the disturbance vector satisfies $\epsilon \sim N(0,\sigma^2 Id)$.
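To make the matrix form concrete, here is a minimal numpy sketch with simulated data (the dimensions, true $\beta$, and $\sigma$ are made up purely for illustration), fitting $\hat\beta$ by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

n, q = 50, 2  # n observations, q explanatory variables (illustrative values)
# Design matrix X: a column of ones for beta_0, then the explanatory variables
X = np.column_stack([np.ones(n), rng.normal(size=(n, q))])
beta = np.array([1.0, 2.0, -0.5])    # hypothetical true parameters
eps = rng.normal(scale=0.3, size=n)  # iid N(0, sigma^2) disturbances
Y = X @ beta + eps                   # the model Y = X beta + eps

# OLS estimate beta_hat = (X^T X)^{-1} X^T Y, computed via lstsq for stability
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)  # should be close to the true beta
```

The point of the vector notation is that the whole fit reduces to one linear-algebra operation, regardless of $q$ or $n$.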
However, thinking about real life, I do not see the utility of this matrix form if we are studying things that have no link. If $Y_1$ is the number of points you receive when asking a question on this website, $Y_2$ the salary of an employee in a company, and $Y_3$ the number of goals scored by a striker, then these three only share the parameters $\beta_i$, and their explanatory variables can be totally different. So why do we choose to study them jointly?