In *The Elements of Statistical Learning* (2nd edition), in the context of smoothing splines, $\pmb{S_{\lambda}}$ is defined as $$ \pmb{S_{\lambda}} = \pmb{N}(\pmb{N}^T\pmb{N} + \lambda \pmb{\Omega_N})^{-1}\pmb{N}^T. $$ Here $\pmb{N}$ is an $N \times N$ square matrix defined by $\pmb{N}_{i,j} = N_j(x_i)$, where the basis functions $N_j$ are given by $$\left\{\begin{array}{ll} N_1(X) = 1\\ N_2(X) = X \\ N_{j+2}(X) = d_j(X)-d_{N-1}(X) \end{array} \right. $$
where the knots $\xi_{j}$ are the $x_j$ and
$$ d_j(X)=\frac{(X-\xi_{j})_+^3-(X-\xi_{N})_+^3}{\xi_{N}-\xi_{j}} $$
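As a numerical sanity check (not a proof), here is a minimal NumPy sketch that builds $\pmb{N}$ from exactly these definitions for a few distinct, sorted points and inspects its rank; the function name and the example points are my own choices, not from the book:

```python
import numpy as np

def natural_spline_basis(x):
    """Basis matrix N[i, j] = N_j(x[i]) with knots xi_j = x_j.

    Follows the definitions above: N_1(X) = 1, N_2(X) = X, and
    N_{j+2}(X) = d_j(X) - d_{N-1}(X). Points are assumed sorted
    and distinct.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    knots = x  # the knots are the data points themselves

    def d(j, X):
        # d_j(X) = [(X - xi_j)_+^3 - (X - xi_N)_+^3] / (xi_N - xi_j)
        return (np.maximum(X - knots[j], 0.0) ** 3
                - np.maximum(X - knots[-1], 0.0) ** 3) / (knots[-1] - knots[j])

    N = np.empty((n, n))
    N[:, 0] = 1.0
    N[:, 1] = x
    for j in range(n - 2):                # columns N_{j+2}, 0-based
        N[:, j + 2] = d(j, x) - d(n - 2, x)   # d_{N-1} is d(n - 2) 0-based
    return N

x = np.array([0.0, 0.3, 1.1, 2.0, 2.5, 4.0])
N = natural_spline_basis(x)
print("rank:", np.linalg.matrix_rank(N), " cond:", np.linalg.cond(N))
```

For distinct knots the rank comes out full in such experiments, which is consistent with the uniqueness of the natural-spline interpolant, but of course a numerical check is no substitute for the proof asked about below.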
Assuming that $\pmb{N}$ is invertible, the Reinsch form can be obtained by inserting $\pmb{N}^T\pmb{N}^{-T} = \pmb{N}^{-1}\pmb{N} = \pmb{I}$: \begin{align} \begin{aligned} \pmb{S_{\lambda}} & = \pmb{N}(\pmb{N}^T\pmb{N} + \lambda \pmb{\Omega_N})^{-1}\pmb{N}^T \\ & = \pmb{N}(\pmb{N}^T\pmb{N} + \lambda\pmb{N}^T\pmb{N}^{-T}\pmb{\Omega_N} \pmb{N}^{-1}\pmb{N})^{-1}\pmb{N}^T \\ & = \pmb{N}[\pmb{N}^T(\pmb{I} + \lambda \pmb{N}^{-T} \pmb{\Omega_N} \pmb{N}^{-1})\pmb{N}]^{-1}\pmb{N}^T \\ & = \pmb{N}\pmb{N}^{-1}(\pmb{I} + \lambda \pmb{N}^{-T} \pmb{\Omega_N} \pmb{N}^{-1})^{-1}\pmb{N}^{-T}\pmb{N}^T \\ & = (\pmb{I} + \lambda \pmb{K})^{-1} \end{aligned} \end{align}
where $\pmb{K} = \pmb{N}^{-T} \pmb{\Omega_N} \pmb{N}^{-1}$.
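The algebra above does not depend on the spline-specific structure of $\pmb{N}$ and $\pmb{\Omega_N}$, only on $\pmb{N}$ being invertible and the penalty being symmetric. That can be verified numerically with stand-in matrices (random, not the actual spline matrices), as a minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
lam = 0.7

# Stand-ins: an (almost surely) invertible N and a symmetric PSD Omega.
N = rng.standard_normal((n, n)) + n * np.eye(n)
A = rng.standard_normal((n, n))
Omega = A @ A.T

# Direct form: N (N^T N + lam * Omega)^{-1} N^T
S_direct = N @ np.linalg.solve(N.T @ N + lam * Omega, N.T)

# Reinsch form: (I + lam * K)^{-1} with K = N^{-T} Omega N^{-1}
K = np.linalg.solve(N.T, Omega) @ np.linalg.inv(N)
S_reinsch = np.linalg.inv(np.eye(n) + lam * K)

print(np.allclose(S_direct, S_reinsch))
```

The two forms agree to machine precision, so the open issue is exactly the invertibility assumption on $\pmb{N}$.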
My question is: how can one show that $\pmb{N}$ is invertible?