Let's do the math.
Consider zero-mean variables, and the following two regressions:
$$y = \beta x + u$$
$$y = \gamma x + \delta z + v$$
In the first model, we have
$${\rm Var}(\hat \beta) = \frac{\hat \sigma^2_u}{\sum{x^2}} \tag{1}$$
In the second model we have
\begin{align}{\rm V}(\hat \gamma, \hat \delta) &= \hat \sigma^2_v \left[\begin{matrix} \sum x^2 & \sum xz\\ \sum xz & \sum z^2\end{matrix}\right]^{-1} \\
\\
&= \frac{\hat \sigma^2_v}{\left(\sum x^2\right)\left(\sum z^2\right) - \left(\sum xz\right)^2}\left[\begin{matrix} \sum z^2 & -\sum xz\\ -\sum xz & \sum x^2\end{matrix}\right].
\end{align}
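As a sanity check, here is a minimal numeric sketch (not part of the derivation above): it simulates zero-mean data with arbitrary coefficients, assumes the error-variance estimates are of the form $\text{SSR}/n$, and plugs everything into the two variance expressions directly.

```python
import numpy as np

# A sketch under stated assumptions: arbitrary simulated coefficients,
# error variances estimated as SSR/n, variables demeaned to match the
# zero-mean setup of the formulas above.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
z = 0.5 * x + rng.normal(size=n)                     # z correlated with x
y = 1.0 * x + 2.0 * z + rng.normal(size=n)
x, z, y = x - x.mean(), z - z.mean(), y - y.mean()   # enforce zero means

# Short model: y = beta * x + u
beta_hat = (x @ y) / (x @ x)
sigma2_u = ((y - beta_hat * x) ** 2).sum() / n       # SSR/n
var_beta = sigma2_u / (x @ x)                        # formula (1)

# Long model: y = gamma * x + delta * z + v
X = np.column_stack([x, z])
coef = np.linalg.solve(X.T @ X, X.T @ y)
sigma2_v = ((y - X @ coef) ** 2).sum() / n
V = sigma2_v * np.linalg.inv(X.T @ X)                # the 2x2 matrix above
var_gamma = V[0, 0]
```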
The "strange" behavior is when
\begin{align}
{\rm Var}(\hat \gamma) < {\rm Var}(\hat \beta) &\implies \frac{\hat \sigma^2_v\sum z^2}{\left(\sum x^2\right)\left(\sum z^2\right) - \left(\sum xz\right)^2}< \frac{\hat \sigma^2_u}{\sum{x^2}}\\
\\
&\implies \frac{\hat \sigma^2_v}{\hat \sigma^2_u} < \frac{\left(\sum x^2\right)\left(\sum z^2\right) - \left(\sum xz\right)^2}{\left(\sum x^2\right)\left(\sum z^2\right)}\\
&\implies \frac{\hat \sigma^2_v}{\hat \sigma^2_u} < 1- \hat \rho^2_{xz}.
\end{align}
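Continuing the numeric sketch above (it reuses the quantities computed there), the inequality on the left and the condition involving the squared correlation on the right are the same event, so the two printed booleans agree:

```python
# Squared sample correlation between x and z (zero-mean data)
rho2_xz = (x @ z) ** 2 / ((x @ x) * (z @ z))

print(var_gamma < var_beta)                 # Var(gamma_hat) < Var(beta_hat)?
print(sigma2_v / sigma2_u < 1 - rho2_xz)    # derived condition: same truth value
```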
We know that the estimated error variance in the augmented model, $\hat \sigma_v^2$, will be smaller than the corresponding estimate in the short model, $\hat \sigma_u^2$, since adding a regressor cannot increase the residual sum of squares.
So if this reduction, measured as a proportion of $\hat \sigma^2_u$, is bigger than the squared sample correlation between $x$ and $z$, we will observe a smaller standard error for the coefficient of $x$ in the larger model.
If the squared correlation is zero ($x$ and $z$ are uncorrelated), then we will certainly get a smaller standard error for the coefficient of $x$ in the augmented model.
But wait, didn't we just add a lot of independent variation to the model (but presumably related to the dependent variable)? Indeed we did, and this is exactly why "certainty" increased. It is variation that explains.
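To see the uncorrelated case concretely, here is the same kind of sketch with $z$ drawn independently of $x$ but given a large (arbitrary) coefficient in the true model: the standard error of the coefficient of $x$ drops in the augmented regression because $z$ soaks up error variance.

```python
import numpy as np

# Hypothetical simulation: z independent of x, but strongly related to y.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
z = rng.normal(size=n)                          # independent of x
y = 1.0 * x + 3.0 * z + rng.normal(size=n)      # but z matters a lot for y
x, z, y = x - x.mean(), z - z.mean(), y - y.mean()

# Short model: SE of beta_hat
b = (x @ y) / (x @ x)
s2_u = ((y - b * x) ** 2).sum() / n
print("SE(beta_hat): ", np.sqrt(s2_u / (x @ x)))

# Long model: SE of gamma_hat (noticeably smaller)
X = np.column_stack([x, z])
g = np.linalg.solve(X.T @ X, X.T @ y)
s2_v = ((y - X @ g) ** 2).sum() / n
print("SE(gamma_hat):", np.sqrt((s2_v * np.linalg.inv(X.T @ X))[0, 0]))
```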