Suppose we model the random variable $Y$ as follows: $$\mathbb{E}[Y]=\beta_0+\beta_1x_1.$$ Now, many statistics textbooks treat the $\beta_i$ as parameters, which are simply constants (correct me if I am wrong), and $x_1$ as an observable variable. My concern is: how can you do that in theory?
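To make this concrete, here is a small simulation sketch (Python/NumPy; the values $\beta_0=1$, $\beta_1=2$, the design points, and the noise scale are made up) of how I understand the textbook "fixed design" reading, where each $x_1$ is a known constant and only the error term is random:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Fixed design" reading: the x1 values are known constants chosen in advance,
# not realizations of a random variable.
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # fixed, observable values (made up)
beta0, beta1 = 1.0, 2.0                     # fixed but unknown parameters (made up)

# Only the error term is random, so for each observation
# E[Y] = beta0 + beta1 * x1 is a number determined by that observation's x1.
y = beta0 + beta1 * x1 + rng.normal(scale=0.5, size=x1.shape)

# Ordinary least squares estimate of (beta0, beta1)
X = np.column_stack([np.ones_like(x1), x1])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # should be close to (1, 2)
```

In this reading, $\mathbb{E}[Y]$ differs across observations only because each observation carries its own fixed $x_1$, not because $x_1$ is itself random, and that is exactly the step I find hard to justify.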
My rationale is that, given the probability space we are running the regression on, the expected value of $Y$ should be a fixed number rather than something that depends on other observable variables. Moreover, it seems more natural to think of $x_1$ as a realization of the random variable $X_1$ and to write the model specification as $$\mathbb{E}[Y\mid X_1=x_1]=\beta_0+\beta_1x_1.$$
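This conditional formulation also seems consistent with my intuition: by the law of total expectation, the unconditional mean is then a single constant, $$\mathbb{E}[Y]=\mathbb{E}\big[\mathbb{E}[Y\mid X_1]\big]=\mathbb{E}[\beta_0+\beta_1 X_1]=\beta_0+\beta_1\,\mathbb{E}[X_1],$$ which does not depend on any particular observed value $x_1$.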
Help is certainly appreciated! Also, any suggestions for texts or articles I should take a look at would be appreciated.