I'm studying the cases in which the endogeneity problem arises in OLS regression.
Suppose we have the following population equation:
$y=\beta_0 +\beta_1 x_1 + \dots + \beta_k x_k + \gamma q + \epsilon$
and suppose $E(\epsilon \mid x_1,\dots,x_k,q)=0$, so that $E(y\mid x_1,\dots,x_k,q)=\beta_0 +\beta_1 x_1 + \dots + \beta_k x_k + \gamma q$.
Now suppose $q$ is unobserved, so it is absorbed into the error term, and the population equation becomes
$y=\beta_0 +\beta_1 x_1 + \dots + \beta_k x_k + \nu$, where $\nu=\gamma q + \epsilon$.
Then the slides say that nothing is lost by assuming $E(q)=0$, because an intercept is included in the equation, so that $E(\nu)=0$.
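To spell out the step as I understand it (my own rewriting, not from the slides): taking expectations of the composite error,

$$E(\nu) = \gamma E(q) + E(\epsilon) = \gamma E(q),$$

since $E(\epsilon\mid x_1,\dots,x_k,q)=0$ implies $E(\epsilon)=0$. So $E(\nu)=0$ holds exactly when $E(q)=0$.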
Why is it fine to assume $E(q)=0$ just because an intercept is included in the equation?
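In case a numerical illustration helps pin down what I am asking, here is a small simulation of the setup (all parameter values are my own hypothetical choices, and $q$ is drawn independently of $x$ so that the only effect of omitting it should be on the intercept). Omitting a $q$ with $E(q)\neq 0$ does seem to shift only the intercept, by $\gamma E(q)$, but I would like to understand why that means nothing is lost:

```python
# Minimal simulation of the setup above (hypothetical numbers, pure numpy).
# One regressor x plus an unobserved q with E(q) = 2, drawn independently
# of x, so omitting q should only move the intercept by gamma * E(q).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta0, beta1, gamma = 1.0, 0.5, 2.0   # hypothetical population values

x = rng.normal(size=n)
q = rng.normal(loc=2.0, size=n)       # unobserved, E(q) = 2 (not 0)
eps = rng.normal(size=n)
y = beta0 + beta1 * x + gamma * q + eps

# OLS of y on (1, x): q is folded into the error nu = gamma*q + eps
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(coef[0])  # ~ beta0 + gamma*E(q) = 1 + 2*2 = 5: intercept absorbs E(nu)
print(coef[1])  # ~ beta1 = 0.5, since x is independent of q here
```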