Imagine for the sake of simplicity that I am regressing $Y$ on $X$ with the model
$Y = \beta_0 + \beta_1X + \epsilon$
Now imagine that the observations on $X$ are constant, e.g. take $Y = \{2, 7, 9\}$ and $X = \{3, 3, 3\}$. In this case, I am regressing $Y$ against a constant predictor. What is the statistical implication of this? Specifically, which of the assumptions of OLS am I breaking, and why does the `lm` function in R produce an NA for the estimate of the coefficient $\beta_1$?
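
For concreteness, here is a minimal R sketch of the situation I am describing (the variable names are my own):

```r
# Regressing Y on a constant predictor
y <- c(2, 7, 9)
x <- c(3, 3, 3)

fit <- lm(y ~ x)
coef(fit)
#> (Intercept)           x
#>           6          NA
```

The intercept comes back as the mean of $Y$, while the coefficient on $X$ is reported as NA.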