Providing an example:
I believe the best way to see how the roots of the characteristic equation relate to covariance stationarity of a time-series process is through an example, in the form of an AR(1) process. Loosely speaking, using the lag operator to obtain the characteristic equation gives a transformation of the autoregressive process that makes stationarity easy to examine: one simply checks whether the characteristic roots lie outside the unit circle. This is particularly convenient when dealing with AR(p) processes for large $p$.
A process is covariance stationary when its first and second moments exist and are time-invariant. Let us consider an AR(1) process of the form:
$$y_t = \phi y_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim WN(\mu,\sigma^2),$$
where $\varepsilon_t$ is a Gaussian white-noise process (Gaussianity is not required, but it is a common assumption for white noise) with constant mean and variance. Using the lag operator $L$, we can rewrite the process as:
$$y_t-\phi y_{t-1}=\varepsilon_t \quad \Rightarrow (1-\phi L)y_t = \varepsilon_t.$$
In general, the time-series process is stationary if the roots of the characteristic equation ($1-\phi_1 z - \cdots-\phi_p z^p=0$) all have modulus greater than 1, $|z|>1$, and thus lie outside the unit circle. Applied to our example, we can find the characteristic root by solving for $z$ in the characteristic equation ($1-\phi z=0$):
$$|z|=\bigg|\frac{1}{\phi}\bigg| >1 \qquad \iff \qquad |\phi|<1$$
Thus, ensuring stationarity in the AR(1) process is equivalent to requiring $|\phi| < 1$, i.e. $-1<\phi <1$. We can verify this by calculating the moments of the AR(1) process and observing which parameter restrictions are needed for weak stationarity.
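As a quick numerical check, the root condition can be tested directly. The sketch below (assuming NumPy is available; the function name `ar_is_stationary` is my own) computes the roots of the characteristic polynomial for an arbitrary AR(p) coefficient vector and verifies that they all lie outside the unit circle:

```python
import numpy as np

def ar_is_stationary(phis):
    """Check covariance stationarity of y_t = phi_1 y_{t-1} + ... + phi_p y_{t-p} + eps_t
    by testing whether all roots of 1 - phi_1 z - ... - phi_p z^p = 0
    lie outside the unit circle."""
    # np.roots expects coefficients from the highest power down:
    # -phi_p z^p - ... - phi_1 z + 1
    coeffs = np.r_[-np.asarray(phis, dtype=float)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

# AR(1): stationary iff |phi| < 1
print(ar_is_stationary([0.5]))   # True  (root z = 2 lies outside the unit circle)
print(ar_is_stationary([1.2]))   # False (root z = 1/1.2 lies inside)
```

For the AR(1) case the single root is $z = 1/\phi$, so the check reduces to $|\phi|<1$, exactly as derived above.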
Calculating the mean:
\begin{align*}
y_t &= \phi y_{t-1} + \varepsilon_t\\
&= \phi (\phi y_{t-2} + \varepsilon_{t-1}) + \varepsilon_t\\
&= \phi^2 y_{t-2} + \phi \varepsilon_{t-1} + \varepsilon_{t}\\
&\: \: \vdots\\
&= \phi^k y_{t-k} + \sum_{i=0}^{k-1} \phi^{i} \varepsilon_{t-i}\\
&= \sum_{i=0}^{\infty} \phi^{i} \varepsilon_{t-i}
\end{align*}
$$\mathbb{E}\left[y_t\right]= \sum_{i=0}^{\infty} \phi^{i} \mathbb{E}\left[\varepsilon_{t-i}\right] = \sum_{i=0}^{\infty} \phi^{i} \mu = \frac{\mu}{1 - \phi}, \qquad |\phi| < 1,$$
where the geometric series converges only for $|\phi|<1$ (the condition $\phi \neq 1$ alone is not sufficient), so this restriction is required for the first moment to be defined.
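To make this concrete, here is a small simulation sketch (NumPy assumed; the parameter values $\phi=0.6$, $\mu=1$, $\sigma=1$ are arbitrary illustrative choices) showing that the sample mean of a long simulated path approaches $\mu/(1-\phi)$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, mu, sigma = 0.6, 1.0, 1.0
n = 200_000

# Simulate y_t = phi*y_{t-1} + eps_t with eps_t ~ N(mu, sigma^2)
eps = rng.normal(mu, sigma, n)
y = np.empty(n)
y[0] = mu / (1 - phi)            # start at the theoretical mean
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

print(y.mean())                  # close to mu/(1-phi) = 2.5
```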
Calculating the variance:
\begin{align*}
\mathbb{V}ar(y_t)&= \phi^2 \mathbb{V}ar(y_{t-1}) + \sigma^2 \\
&= \phi^2 \left(\phi^2 \mathbb{V}ar(y_{t-2}) + \sigma^2\right) + \sigma^2\\
&= \phi^4 \mathbb{V}ar(y_{t-2}) + \phi^2 \sigma^2 + \sigma^2\\
& \: \: \vdots \\
&= \sum_{i=0}^{\infty} \phi^{2i} \sigma^2\\
&= \frac{\sigma^2}{1-\phi^2}, \qquad |\phi|<1.
\end{align*}
This again implies that $|\phi|<1$, and since $z = 1/\phi$, this is equivalent to $|z|>1$; the same restriction is required for the variance to be defined.
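The variance formula can be checked by simulation in the same way (NumPy assumed; parameter values are arbitrary choices, and a short burn-in discards the start-up transient):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, mu, sigma = 0.6, 1.0, 1.0
n = 500_000

eps = rng.normal(mu, sigma, n)
y = np.empty(n)
y[0] = mu / (1 - phi)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

burn = 1_000                     # discard the start-up transient
sample_var = y[burn:].var()
print(sample_var)                # close to sigma^2/(1-phi^2) ≈ 1.5625
```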
The autocovariance:
Calculating the autocovariance $\gamma(k)=\mathbb{C}ov\left[y_t, y_{t-k}\right]$ for each lag $k\geq 1$, we can observe a pattern:
\begin{align*}
\gamma(1) &= \mathbb{C}ov\left[y_t, y_{t-1}\right] = \mathbb{C}ov\left[(\phi y_{t-1} + \varepsilon_t), y_{t-1}\right] = \phi\, \mathbb{C}ov\left[y_{t-1}, y_{t-1}\right] = \phi\, \mathbb{V}ar(y_t)\\
\gamma(2) &= \mathbb{C}ov\left[(\phi y_{t-1} + \varepsilon_t), y_{t-2}\right] = \phi\, \mathbb{C}ov\left[y_{t-1}, y_{t-2}\right] = \phi \gamma(1) = \phi^2\, \mathbb{V}ar(y_t)\\
& \: \: \vdots\\
\gamma(k) &= \phi^k \gamma(0) = \phi^k\, \mathbb{V}ar(y_t)
\end{align*}
where I have used the fact that the white-noise process is uncorrelated with past lags of $y_t$, and that $\mathbb{V}ar(y_{t-1}) = \mathbb{V}ar(y_t)$ under stationarity. The autocovariance therefore exists under the same restriction as the variance, $|\phi|<1$.
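The geometric decay $\gamma(k) = \phi^k \gamma(0)$ can also be verified numerically. This sketch (NumPy assumed; `autocov` is my own helper, and the parameter values are arbitrary) compares sample autocovariance ratios against powers of $\phi$:

```python
import numpy as np

rng = np.random.default_rng(2)
phi, mu, sigma = 0.6, 1.0, 1.0
n = 500_000

eps = rng.normal(mu, sigma, n)
y = np.empty(n)
y[0] = mu / (1 - phi)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

def autocov(x, k):
    """Sample autocovariance at lag k (demeaned)."""
    x = x - x.mean()
    return (x[:-k] * x[k:]).mean() if k else (x * x).mean()

gamma0 = autocov(y, 0)             # close to sigma^2/(1-phi^2)
for k in range(1, 4):
    print(autocov(y, k) / gamma0)  # ratios close to phi, phi^2, phi^3
```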
In conclusion, we require $|\phi|<1$, and thus $|z|>1$, for the moments to exist and be time-invariant, which is exactly the condition for covariance stationarity of $y_t$.