This has been discussed time and again here, and as noted in Cliff AB's answer, there are two aspects of the assumptions on the regressors, which can be seen in the case of the simple linear regression model:
$\bullet$ When the independent variable is controlled by the experimenter and hence is nonstochastic (see the first sketch below):
$$\begin{align}\mathbb E[Y]&= \beta_0+\beta_1X,\\\mathbb V[Y]&=\sigma^2.\end{align}$$
$\bullet$ When both the dependent and independent variables are stochastic, that is, $X$ and $Y$ are jointly distributed (see the second sketch below):
If the joint distribution is bivariate normal, then
$$\begin{align}\mathbb E[Y\mid X=x]&=\beta_0+\beta_1x,\\\mathbb V[Y\mid X=x]&=\sigma^2_{y\cdot x}\\&=\sigma^2_y(1-\rho^2),\end{align}$$
where, in terms of the parameters of the joint distribution, $\beta_1=\rho\,\sigma_y/\sigma_x$ and $\beta_0=\mu_y-\beta_1\mu_x$.
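A minimal simulation sketch of the first (fixed-design) case, assuming NumPy is available; the values of $\beta_0$, $\beta_1$ and $\sigma$ below are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, for illustration only
beta0, beta1, sigma = 1.0, 2.0, 0.5

# Case 1: the experimenter fixes the design points x; only the error term is random
x = np.linspace(0.0, 10.0, 200)            # controlled, nonstochastic regressor
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)

# OLS recovers beta0 and beta1; the residual variance estimates sigma^2
b1, b0 = np.polyfit(x, y, deg=1)
resid = y - (b0 + b1 * x)
print(b0, b1, resid.var(ddof=2))           # roughly 1.0, 2.0, 0.25
```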
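And a corresponding sketch of the second case, again with hypothetical parameters: $(X,Y)$ are drawn from a bivariate normal, and the regression of $Y$ on $X$ should give a slope of roughly $\rho\,\sigma_y/\sigma_x$ and a residual variance of roughly $\sigma^2_y(1-\rho^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters of the joint bivariate normal, for illustration only
mu_x, mu_y = 0.0, 0.0
sigma_x, sigma_y, rho = 1.0, 2.0, 0.6
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]

# Case 2: both X and Y are stochastic, drawn jointly
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=200_000)
x, y = xy[:, 0], xy[:, 1]

# The fitted slope approximates rho * sigma_y / sigma_x,
# and the residual variance approximates sigma_y^2 * (1 - rho^2)
b1, b0 = np.polyfit(x, y, deg=1)
resid = y - (b0 + b1 * x)
print(b1, resid.var(ddof=2))               # roughly 1.2 and 2.56
```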
Reference:
$\rm[I]$ Linear Models and Generalizations: Least Squares and Alternatives, C. R. Rao, H. Toutenburg, Shalabh, and Christian Heumann, Springer-Verlag, $2008$, sec. $2.1$, $2.15$.