I started studying linear regression this autumn and got stuck on the estimation of the regression parameters (in matrix form).
In parameter estimation for the general linear model, the OLS method is used to estimate the parameter vector $\beta$, yielding the least squares estimator $\hat{\beta}$ that satisfies the normal equations $$X^\top X\hat{\beta} = X^\top Y.$$
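When $X^\top X$ is non-singular, I understand this system has the unique closed-form solution $$\hat{\beta} = (X^\top X)^{-1} X^\top Y.$$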
So I can handle the case where $X^\top X$ has full rank (i.e., is non-singular). However, as I am a bit rusty on linear algebra, I am confused about how to find the multiple solutions for $\hat{\beta}$ when $X^\top X$ is singular, so I started trying some simple examples.
For instance, I have a linear regression model like this (for the $i$th response): $$y_i = \beta_0 + \beta_1\psi(x_i) + e_i, \quad i = 1, \dots, n,$$ where $\psi$ is a function of $x_i$ and the errors $e_i$ are independent with zero mean.
I am now considering the case where $\psi(x_i) = 0$ for all $i$, so that $$ X^\top X = \begin{pmatrix} n & 0 \\ 0 & 0 \end{pmatrix}. $$
This matrix is singular, so I cannot get a unique solution, and I am trying to find all possible solutions for $\hat{\beta}$.
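Writing out the normal equations for this case, with $X^\top Y = \left(\sum_{i=1}^n y_i,\ 0\right)^\top$, I get $$\begin{pmatrix} n & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} \hat{\beta}_0 \\ \hat{\beta}_1 \end{pmatrix} = \begin{pmatrix} \sum_{i=1}^n y_i \\ 0 \end{pmatrix},$$ so the first equation gives $\hat{\beta}_0 = \bar{y}$, while the second equation, $0 \cdot \hat{\beta}_1 = 0$, seems to place no constraint on $\hat{\beta}_1$ at all.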
I have been looking into methods like Gaussian elimination and the Cholesky decomposition, but being rusty on linear algebra I still cannot get a clue. To make the question concrete, here is a small numerical sketch of the situation (a hypothetical toy example using numpy; as I understand it, `np.linalg.lstsq` returns a minimum-norm least squares solution even when $X$ is rank-deficient):
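```python
import numpy as np

# Toy data: psi(x_i) = 0 for every i, so the second column of X is all zeros.
n = 5
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([np.ones(n), np.zeros(n)])   # columns: intercept, psi(x_i)

print(X.T @ X)   # [[5. 0.], [0. 0.]] -- singular, rank 1

# lstsq handles the rank-deficient case and returns the minimum-norm
# least squares solution: beta0_hat = mean(y), beta1_hat = 0.
beta_hat, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat, rank)    # [3. 0.] 1

# Any value of beta1 gives the same fitted values, since its column is zero.
for b1 in (0.0, 1.0, -7.5):
    beta = np.array([y.mean(), b1])
    print(np.allclose(X @ beta, X @ beta_hat))   # True each time
```

This confirms numerically that many $\hat{\beta}$ fit equally well, but I would like to understand how to derive the full solution set by hand. Could anyone give me some inspiration for the simple case above? Thank you.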