I'm reading a textbook. In the chapter about least squares regression I read that a simple linear least squares model can be described as \begin{equation} Y = \alpha + \beta x + e \end{equation} where $Y$ is the mean response, $x$ is an independent variable, and $e$ is a random variable representing the error. This raises a question for me.
Question 1. If $Y$ is the 'mean' response, why do you need a random error term on the right-hand side?
Now, the textbook says that there are $n$ pairs $(x_{i}, Y_{i})$, where $i$ is the index of the data point. If the estimator of $\alpha$ is $A$ and that of $\beta$ is $B$, then the estimator of $Y_{i}$ is $A + Bx_{i}$. This is also confusing.
Question 2. Isn't it true that $Y_{i} = A + Bx_{i} + e_{i}$?
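To make my confusion concrete, here is a small simulation sketch (the parameter values and noise distribution are my own choices, not from the textbook): I generate data from $Y_i = \alpha + \beta x_i + e_i$, compute the usual least squares estimators $A$ and $B$, and note that the fitted values $A + Bx_i$ contain no error term.

```python
import numpy as np

rng = np.random.default_rng(0)

# True parameters, chosen only for illustration
alpha, beta = 2.0, 0.5
n = 1000
x = rng.uniform(0, 10, n)
e = rng.normal(0, 1, n)        # random error e_i
Y = alpha + beta * x + e       # the response Y_i is a random variable

# Closed-form least squares estimators A and B
B = np.sum((x - x.mean()) * (Y - Y.mean())) / np.sum((x - x.mean()) ** 2)
A = Y.mean() - B * x.mean()

# Fitted values: A + B * x_i -- note there is no e_i here
Y_hat = A + B * x
```

With this much data, $A$ and $B$ land close to $\alpha$ and $\beta$, but the individual $Y_i$ still scatter around the fitted line by $e_i$, which is what prompts my question.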
It seems that I misunderstood the meaning of $Y$ in the first equation. I thought it was the mean of the response, but it might actually be just the response, which is a random variable. Could someone comment on this? In that case I would delete Question 1 and Question 2, leave only this third question, and accept the answer. That might be reasonable, because Questions 1 and 2 are based on a wrong assumption.
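To spell out my tentative reading: if $Y$ is the response itself and the error has mean zero, $\mathrm{E}(e) = 0$ (an assumption I'm adding here; the textbook may state it elsewhere), then taking expectations of the model equation gives \begin{equation} \mathrm{E}(Y \mid x) = \alpha + \beta x, \end{equation} so the mean response satisfies the equation without the error term, while the response itself keeps $e$. Is this the intended distinction?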