In this answer, I derive a closed-form expression for $R_x^2 - \Delta R^2$ and analyze its sign. Since the calculations below rely heavily on geometry/linear algebra, let us first fix some notation:
- $x, y, z, d$ are all viewed as column vectors in $\mathbb{R}^n$. By convention, $e$ denotes the all-ones column vector in $\mathbb{R}^n$ (i.e., the intercept column).
- For a vector $x_0 \in \mathbb{R}^n$, we use $\|x_0\|$ to denote its Euclidean norm.
- For any two vectors $x_1, x_2 \in \mathbb{R}^n$, we use $x_1'x_2$ to denote their inner product.
- The subspace of $\mathbb{R}^n$ spanned by vectors $x_1, x_2, \ldots, x_p$ is denoted by $[x_1, x_2, \ldots, x_p]$. The orthogonal complement space of a subspace $S$ is denoted by $S^\perp$.
- We use $P_M x_0$ to denote the orthogonal projection of $x_0$ onto the subspace $M$. In particular, if $\{q_1, \ldots, q_m\}$ is an orthogonal basis of $M$, then
\begin{align}
P_Mx_0 = \frac{x_0'q_1}{q_1'q_1}q_1 + \cdots + \frac{x_0'q_m}{q_m'q_m}q_m.
\end{align}
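As a quick illustration of this formula (just a numerical sketch; numpy is assumed and the variable names are my own), the projection computed from an orthonormal basis coincides with the least-squares fit:
```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 3))       # columns span the subspace M
x0 = rng.normal(size=n)

# Orthonormal basis of M via QR, then apply the projection formula above.
Q, _ = np.linalg.qr(X)            # columns of Q play the role of q_1, ..., q_m
proj_formula = Q @ (Q.T @ x0)     # sum_j (x0'q_j) q_j (these q_j have unit norm)

# The same projection obtained from least squares: P_M x0 = X beta_hat.
beta_hat = np.linalg.lstsq(X, x0, rcond=None)[0]
proj_lstsq = X @ beta_hat

print(np.allclose(proj_formula, proj_lstsq))  # True
```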
With these notations, the various R-squared quantities in the question can be written, by definition, as
\begin{align}
& R_{\text{full}}^2 = \frac{\|P_{[e, x, y, z]}d - P_{[e]}d\|^2}{\|P_{[e]^\perp}d\|^2}, \\
& R_{\text{reduced}}^2 = \frac{\|P_{[e, y, z]}d - P_{[e]}d\|^2}{\|P_{[e]^\perp}d\|^2}, \\
& R_{x}^2 = \frac{\|P_{[e, x]}d - P_{[e]}d\|^2}{\|P_{[e]^\perp}d\|^2}, \\
& \Delta R^2 = R_{\text{full}}^2 - R_{\text{reduced}}^2 = \frac{\|P_{[e, x, y, z]}d\|^2 - \|P_{[e, y, z]}d\|^2}{\|P_{[e]^\perp}d\|^2},
\end{align}
where the last expression for $\Delta R^2$ follows from the Pythagorean theorem: $\|P_M d - P_{[e]}d\|^2 = \|P_M d\|^2 - \|P_{[e]}d\|^2$ whenever $[e] \subseteq M$, applied to $M = [e, x, y, z]$ and $M = [e, y, z]$.
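These projection-based expressions can be verified numerically. The sketch below (an illustration only; all names are my own choices) checks that $R_x^2$ computed from the projection formula agrees with the usual coefficient of determination from regressing $d$ on $x$ with an intercept:
```python
import numpy as np

def proj(A, v):
    """Orthogonal projection of v onto the column space of A."""
    Q, _ = np.linalg.qr(A)
    return Q @ (Q.T @ v)

rng = np.random.default_rng(1)
n = 100
x, d = rng.normal(size=n), rng.normal(size=n)
e = np.ones(n)

# R_x^2 via the projection formula above.
num = np.sum((proj(np.column_stack([e, x]), d) - proj(e[:, None], d)) ** 2)
den = np.sum((d - proj(e[:, None], d)) ** 2)      # ||P_{[e]^perp} d||^2
r2_proj = num / den

# The usual R^2 = 1 - RSS/TSS from the OLS fit of d on [e, x].
fitted = proj(np.column_stack([e, x]), d)
r2_ols = 1 - np.sum((d - fitted) ** 2) / np.sum((d - d.mean()) ** 2)

print(np.allclose(r2_proj, r2_ols))  # True
```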
Therefore, requiring $R_x^2 \geq \Delta R^2$ is equivalent to requiring
\begin{align}
\|P_{[e, x]}d - P_{[e]}d\|^2 \geq \|P_{[e, x, y, z]}d\|^2 - \|P_{[e, y, z]}d\|^2. \tag{1}
\end{align}
Let $\{q_1, q_2, q_3\}$ and $\{q_1, q_2, q_3, q_4\}$ be orthonormal bases of the spaces $[e, y, z]$ and $[e, y, z, x]$, respectively. They can be obtained by the Gram-Schmidt orthogonalization procedure (see, e.g., Algorithm 3.1 in The Elements of Statistical Learning) or, equivalently, by QR decomposition of the corresponding matrices. It can then be shown that the right hand side of $(1)$ equals $(d'q_4)^2$. More specifically, if we let
\begin{align}
& z_1 = e, \; q_1 = z_1/\|z_1\|, \\
& z_2 = y - \frac{y'z_1}{z_1'z_1}z_1, \; q_2 = z_2/\|z_2\|, \\
& z_3 = z - \frac{z'z_1}{z_1'z_1}z_1 - \frac{z'z_2}{z_2'z_2}z_2, \; q_3 = z_3/\|z_3\|, \\
& z_4 = x - \frac{x'z_1}{z_1'z_1}z_1 - \frac{x'z_2}{z_2'z_2}z_2 - \frac{x'z_3}{z_3'z_3}z_3, \; q_4 = z_4/\|z_4\|. \tag{G1}
\end{align}
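For readers who prefer code, the steps in (G1) amount to the classical Gram-Schmidt procedure; a minimal numpy sketch (assuming the input vectors are linearly independent) is:
```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt as in (G1): returns the orthonormal q's."""
    zs, qs = [], []
    for v in vectors:
        zj = np.asarray(v, dtype=float).copy()
        for z_prev in zs:
            zj = zj - (v @ z_prev) / (z_prev @ z_prev) * z_prev  # subtract projection onto z_prev
        zs.append(zj)
        qs.append(zj / np.linalg.norm(zj))
    return qs

# Orthonormalize e, y, z, x in that order; the last output is q_4.
rng = np.random.default_rng(2)
n = 30
e = np.ones(n)
y, z, x = rng.normal(size=n), rng.normal(size=n), rng.normal(size=n)
q1, q2, q3, q4 = gram_schmidt([e, y, z, x])
```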
then
\begin{align}
\|P_{[e, x, y, z]}d\|^2 - \|P_{[e, y, z]}d\|^2
&= d'(q_1q_1' + \cdots + q_4q_4')d - d'(q_1q_1' + \cdots + q_3q_3')d \\
&= (d'q_4)^2.
\end{align}
Treating the left hand side of $(1)$ similarly, it can be shown that
\begin{align}
\|P_{[e, x]}d - P_{[e]}d\|^2 = (d'\tilde{q}_2)^2,
\end{align}
where
\begin{align}
& v_1 = e, \; \tilde{q}_1 = v_1/\|v_1\|, \\
& v_2 = x - \frac{x'v_1}{v_1'v_1}v_1, \; \tilde{q}_2 = v_2/\|v_2\|. \tag{G2}
\end{align}
Therefore,
\begin{align}
\|P_{[e, x]}d - P_{[e]}d\|^2 - (\|P_{[e, x, y, z]}d\|^2 - \|P_{[e, y, z]}d\|^2)
= (d'\tilde{q}_2)^2 - (d'q_4)^2. \tag{2}
\end{align}
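Identity $(2)$ is easy to check numerically for arbitrary data (again just a sketch with my own variable names; the $q$'s are taken from QR factorizations, which agree with Gram-Schmidt up to sign, so the squared terms are unaffected):
```python
import numpy as np

def proj(A, v):
    Q, _ = np.linalg.qr(A)
    return Q @ (Q.T @ v)

rng = np.random.default_rng(3)
n = 40
e = np.ones(n)
for _ in range(5):
    x, y, z, d = (rng.normal(size=n) for _ in range(4))

    # Left-hand side of (2), computed directly from the projections.
    lhs = (np.sum((proj(np.column_stack([e, x]), d) - proj(e[:, None], d)) ** 2)
           - (np.sum(proj(np.column_stack([e, x, y, z]), d) ** 2)
              - np.sum(proj(np.column_stack([e, y, z]), d) ** 2)))

    # Right-hand side of (2): q_4 and q~_2 as the last columns of the QR factors.
    q4 = np.linalg.qr(np.column_stack([e, y, z, x]))[0][:, -1]
    q2t = np.linalg.qr(np.column_stack([e, x]))[0][:, -1]
    rhs = (d @ q2t) ** 2 - (d @ q4) ** 2

    print(np.isclose(lhs, rhs), "sign:", np.sign(rhs))
```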
In view of $(2)$, the sign of $R_x^2 - \Delta R^2$ depends on the lengths of the projections of $d$ onto the vectors $\tilde{q}_2$ and $q_4$, respectively, and is therefore indefinite in general. In particular, if $d$ is perpendicular to $\tilde{q}_2$ but not to $q_4$, then $R_x^2 - \Delta R^2 = \frac{0 - (d'q_4)^2}{\|P_{[e]^\perp}d\|^2} < 0$.

One concrete example can be constructed as follows. Let $y, z$ be arbitrary $n$-vectors such that $e, y, z$ are linearly independent, and let $z_1, z_2, z_3$ be computed by (G1). Define $x = z_1 + z_2 + z_3 + \epsilon$ and $d = -z_2 - z_3 + \epsilon$, where $\epsilon \perp [z_1, z_2, z_3]$ and $\|\epsilon\|^2 = \|z_2\|^2 + \|z_3\|^2$. By (G1) and (G2), it follows that
\begin{align}
& \tilde{q}_2 = \frac{1}{\|z_2 + z_3 + \epsilon\|}(z_2 + z_3 + \epsilon), \\
& q_4 = \frac{1}{\|\epsilon\|}\epsilon.
\end{align}
Hence
\begin{align}
d'\tilde{q}_2 = 0 < d'q_4 = \|\epsilon\|,
\end{align}
so that $R_x^2 = 0 < \Delta R^2$ in this example.
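This construction can also be checked numerically. The sketch below (the specific random choices of $y$, $z$, and $\epsilon$ are arbitrary, subject to the stated conditions) computes $R_x^2$ and $\Delta R^2$ directly:
```python
import numpy as np

def proj(A, v):
    Q, _ = np.linalg.qr(A)
    return Q @ (Q.T @ v)

def r2(cols, d):
    """R^2 of regressing d on the given columns plus an intercept."""
    e = np.ones(len(d))
    M = np.column_stack([e] + list(cols))
    num = np.sum((proj(M, d) - proj(e[:, None], d)) ** 2)
    den = np.sum((d - proj(e[:, None], d)) ** 2)
    return num / den

rng = np.random.default_rng(4)
n = 20
e = np.ones(n)
y, z = rng.normal(size=n), rng.normal(size=n)

# z1, z2, z3 as in (G1).
z1 = e
z2 = y - (y @ z1) / (z1 @ z1) * z1
z3 = z - (z @ z1) / (z1 @ z1) * z1 - (z @ z2) / (z2 @ z2) * z2

# eps perpendicular to [z1, z2, z3], rescaled so ||eps||^2 = ||z2||^2 + ||z3||^2.
eps = rng.normal(size=n)
eps = eps - proj(np.column_stack([z1, z2, z3]), eps)
eps = eps * np.sqrt(z2 @ z2 + z3 @ z3) / np.linalg.norm(eps)

x = z1 + z2 + z3 + eps
d = -z2 - z3 + eps

r2_x = r2([x], d)
delta_r2 = r2([x, y, z], d) - r2([y, z], d)
print(r2_x, delta_r2)  # r2_x is 0 (up to rounding); delta_r2 is strictly positive
```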