I am following Brockwell and Davis, Introduction to Time Series and Forecasting (3rd edition). Proposition 2.2.1 in Chapter 2 states the following.
If $\{Y_t\}$ is a stationary time series with mean zero and autocovariance function $\gamma_Y$, and $\{\psi_j\}_{j = -\infty}^{+\infty}$ is an absolutely summable sequence, then $X_t = \sum_{j = -\infty}^{\infty} \psi_j Y_{t - j}$ is a stationary time series with mean zero and autocovariance function $\gamma_X$.
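For completeness, if I am reading the proposition correctly, the autocovariance function in the conclusion comes from expanding the covariance of the two infinite sums (the interchange of expectation and limit being part of what the proposition guarantees): $$ \gamma_X(h) = \operatorname{Cov}(X_{t+h}, X_t) = \sum_{j = -\infty}^{\infty} \sum_{k = -\infty}^{\infty} \psi_j \psi_k \, \gamma_Y(h - j + k). $$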
I know that $\sum_{j = -\infty}^{\infty} |\psi_j| < \infty$ implies $\sum_{j = -\infty}^{\infty} \psi_j^2 < \infty$. I am trying to show that the latter (square-summability of $\{\psi_j\}$) implies mean-square convergence of the sequence $\sum_{j = -n}^{n} \psi_j Y_{t - j}$.
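The implication I have in mind is simply that the terms of an absolutely summable sequence are bounded, so $$ \sum_{j = -\infty}^{\infty} \psi_j^2 \leq \left( \sup_k |\psi_k| \right) \sum_{j = -\infty}^{\infty} |\psi_j| < \infty. $$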
To that end, I want to prove that the partial sums satisfy the Cauchy criterion in mean square: given $\epsilon > 0$, there exists $N > 0$ such that for all $M \geq N$, $$ E\left [ \left (\sum_{j = -M}^{M} \psi_j Y_{t - j} - \sum_{j = -N}^{N} \psi_j Y_{t - j} \right )^2 \right ] \leq \epsilon. $$
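For reference, since the difference keeps only the terms with $N < |j| \leq M$, and $E[Y_s Y_u] = \gamma_Y(s - u)$ because the mean is zero, I expand the left-hand side as $$ E\left [ \left ( \sum_{N < |j| \leq M} \psi_j Y_{t - j} \right )^2 \right ] = \sum_{N < |j| \leq M} \; \sum_{N < |k| \leq M} \psi_j \psi_k \, \gamma_Y(k - j). $$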
I am able to show that the LHS is at most $$ \left ( \sum_{N < |j| \leq M} \psi_{j}^2 \right ) \cdot 2 \cdot (M - N) \cdot \gamma_Y(0). $$
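In case the route matters, this is how I got there (possibly not the sharpest bound): using $|\gamma_Y(h)| \leq \gamma_Y(0)$ together with $|\psi_j \psi_k| \leq \tfrac{1}{2}(\psi_j^2 + \psi_k^2)$, and the fact that $k$ ranges over $2(M - N)$ values, $$ \sum_{N < |j| \leq M} \sum_{N < |k| \leq M} \psi_j \psi_k \, \gamma_Y(k - j) \leq \gamma_Y(0) \sum_{N < |j| \leq M} \sum_{N < |k| \leq M} \frac{\psi_j^2 + \psi_k^2}{2} = \gamma_Y(0) \cdot 2(M - N) \sum_{N < |j| \leq M} \psi_j^2. $$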
But I'm stuck at this point: the factor $2(M - N)$ grows with $M$, so I don't see how to make this bound small. Any pointers?