To evaluate $E[(X'Y)^2]$ (for clarity, let me use $X, Y$ to denote the two i.i.d. $N_d(\mu, \Sigma)$ random vectors), use the following result (Probability and Measure, Exercise 21.13):
Suppose that $X$ and $Y$ are independent and that $f(x, y)$ is non-negative. Put $g(x) = E[f(x, Y)]$, then $E[g(X)] = E[f(X, Y)]$.
In this case,
\begin{align}
g(x) = E[(x'Y)^2] = \operatorname{Var}(x'Y) + [E(x'Y)]^2 = x'\Sigma x + (x'\mu)^2.
\end{align}
It then follows from the result above, combined with the formula for the expectation of a Gaussian quadratic form, $E[X'AX] = \operatorname{tr}(A\Sigma) + \mu'A\mu$ (applied here with $A = \Sigma$), that
\begin{align}
& E[(X'Y)^2] = E[X'\Sigma X] + E[(X'\mu)^2] = E[X'\Sigma X] + \operatorname{Var}(X'\mu) + [E(X'\mu)]^2 \\
=& \mu'\Sigma\mu + \operatorname{tr}(\Sigma^2) + \mu'\Sigma\mu + \|\mu\|^4 \\
=& 2\mu'\Sigma\mu + \operatorname{tr}(\Sigma^2) + \|\mu\|^4.
\end{align}
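As a sanity check, this closed form can be verified by Monte Carlo simulation. Below is a quick NumPy sketch; the particular $\mu$, $\Sigma$, seed, and sample size are arbitrary choices for illustration, not from the original question:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.5, -1.0, 0.3])
Sigma = np.array([[2.0, 1.0, 0.5],
                  [1.0, 3.0, 0.8],
                  [0.5, 0.8, 1.5]])  # an arbitrary positive-definite matrix

n = 500_000
X = rng.multivariate_normal(mu, Sigma, size=n)
Y = rng.multivariate_normal(mu, Sigma, size=n)  # independent copy of X

# Monte Carlo estimate of E[(X'Y)^2]
mc = np.mean(np.sum(X * Y, axis=1) ** 2)

# Closed form: 2 mu' Sigma mu + tr(Sigma^2) + ||mu||^4
theory = 2 * mu @ Sigma @ mu + np.trace(Sigma @ Sigma) + (mu @ mu) ** 2
```

With these values the closed form evaluates to about $25.44$, and the Monte Carlo estimate should agree with it to within simulation error.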
Therefore, to show your equality, it suffices to show that
\begin{align}
\operatorname{Var}(X'X) = 4\mu'\Sigma\mu + 2\operatorname{tr}(\Sigma^2),
\tag{1}
\end{align}
which is immediate in view of the formula for the variance of Gaussian quadratic forms, $\operatorname{Var}(X'AX) = 2\operatorname{tr}((A\Sigma)^2) + 4\mu'A\Sigma A\mu$ (applied here with $A = I$). For a proof of $(1)$, see this answer.
Since the link above does not treat the non-central case in detail, here is a complete proof of $(1)$.
We first show that when $Z \sim N_d(0, \Sigma)$, it holds that
\begin{align}
\operatorname{Var}(Z'Z) = 2\operatorname{tr}(\Sigma^2). \tag{2}
\end{align}
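Before proving $(2)$, one can sanity-check it numerically (a sketch with an arbitrary, made-up $\Sigma$):

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 1.0, 0.5],
                  [1.0, 3.0, 0.8],
                  [0.5, 0.8, 1.5]])  # an arbitrary positive-definite matrix

n = 1_000_000
Z = rng.multivariate_normal(np.zeros(3), Sigma, size=n)

mc = np.var(np.sum(Z * Z, axis=1))    # Monte Carlo estimate of Var(Z'Z)
theory = 2 * np.trace(Sigma @ Sigma)  # 2 tr(Sigma^2)
```

For this $\Sigma$, $2\operatorname{tr}(\Sigma^2) = 38.06$, and the simulated variance should match it closely.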
To this end, write the entries of $\Sigma$ as $\sigma_{ii}^2 = \operatorname{Var}(Z_i)$ and $\sigma_{ij} = \operatorname{Cov}(Z_i, Z_j)$ for $i \neq j$. By the Gaussian assumption (e.g., by Isserlis' theorem),
\begin{align}
\operatorname{Cov}(Z_i^2, Z_j^2) =
\begin{cases}
2\sigma_{ii}^4 & i = j, \\
2\sigma_{ij}^2 & i \neq j.
\end{cases} \tag{3}
\end{align}
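Identity $(3)$ can likewise be checked by simulation (a sketch; the $\Sigma$ below is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 1.0, 0.5],
                  [1.0, 3.0, 0.8],
                  [0.5, 0.8, 1.5]])  # covariance matrix; diagonal = variances

n = 1_000_000
Z = rng.multivariate_normal(np.zeros(3), Sigma, size=n)

# Empirical covariance matrix of the squared coordinates (Z_1^2, ..., Z_d^2)
emp = np.cov(Z ** 2, rowvar=False)

# Formula (3) in matrix form: Cov(Z_i^2, Z_j^2) = 2 * Cov(Z_i, Z_j)^2,
# i.e. the covariance matrix of the squares equals 2 * (Sigma ∘ Sigma)
theory = 2 * Sigma ** 2
```

The empirical matrix should agree entrywise with $2\Sigma \circ \Sigma$ up to Monte Carlo error.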
In view of $(3)$, if we denote the covariance matrix of $\xi := (Z_1^2, \ldots, Z_d^2)'$ by $\tilde{\Sigma} = (\operatorname{Cov}(Z_i^2, Z_j^2))$, then $\tilde{\Sigma} = 2\Sigma \circ \Sigma$, where "$\circ$" stands for the Hadamard (entrywise) product of matrices. Letting $e$ denote the length-$d$ vector of all ones, it then follows that
\begin{align}
\operatorname{Var}(Z'Z) = \operatorname{Var}(e'\xi) = e'\tilde{\Sigma}e
= 2e'(\Sigma \circ \Sigma)e \color{red}{=} 2\operatorname{tr}(I_d\Sigma I_d\Sigma) = 2\operatorname{tr}(\Sigma^2),
\end{align}
where the red equality uses the Hadamard-product identity $x'(A \circ B)y = \operatorname{tr}(D_x A D_y B')$ with $D_x := \operatorname{diag}(x_1, \ldots, x_d)$, applied with $x = y = e$ and $A = B = \Sigma = \Sigma'$. Therefore, $(2)$ holds.
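The Hadamard-product step is purely algebraic and easy to confirm numerically for any symmetric matrix (again with a made-up $\Sigma$):

```python
import numpy as np

Sigma = np.array([[2.0, 1.0, 0.5],
                  [1.0, 3.0, 0.8],
                  [0.5, 0.8, 1.5]])  # any symmetric matrix works here
e = np.ones(Sigma.shape[0])

lhs = e @ (Sigma * Sigma) @ e  # e'(Sigma ∘ Sigma)e = sum of squared entries
rhs = np.trace(Sigma @ Sigma)  # tr(Sigma^2)
```

Both sides equal the sum of the squared entries of $\Sigma$ (here $19.03$), since $\operatorname{tr}(\Sigma^2) = \sum_{i,j} \sigma_{ij}\sigma_{ji}$ and $\Sigma$ is symmetric.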
For general $X \sim N_d(\mu, \Sigma)$, write $X = \mu + Z$, where $Z \sim N_d(0, \Sigma)$, so that $X'X = \mu'\mu + 2\mu'Z + Z'Z$. Since the constant $\mu'\mu$ does not contribute to the variance, it follows by $(2)$ that:
\begin{align}
& \operatorname{Var}(X'X) = \operatorname{Var}(Z'Z + 2\mu'Z) \\
=& \operatorname{Var}(Z'Z) + 4\operatorname{Var}(\mu'Z) +
4\operatorname{Cov}(Z'Z, \mu'Z) \\
=& 2\operatorname{tr}(\Sigma^2) + 4\mu'\Sigma\mu + 4E[Z'Z\mu'Z]. \tag{4}
\end{align}
Here the last step uses $\operatorname{Cov}(Z'Z, \mu'Z) = E[Z'Z\mu'Z] - E[Z'Z]E[\mu'Z] = E[Z'Z\mu'Z]$, since $E[\mu'Z] = 0$.
Comparing $(4)$ with $(1)$, it remains to show that $E[Z'Z\mu'Z] = 0$. Expanding $Z'Z\mu'Z$ yields
\begin{align}
& E[Z'Z\mu'Z] = \sum_{i = 1}^d \mu_iE[Z_i^3] + \sum_{i = 1}^d\sum_{j \neq i}
\mu_i E[Z_iZ_j^2].
\end{align}
Since $Z_i \sim N(0, \sigma_{ii}^2)$, $E[Z_i^3] = 0$. In addition, when $j \neq i$, since
\begin{align}
\begin{bmatrix} Z_i \\ Z_j \end{bmatrix} \sim
N_2\left(\begin{bmatrix} 0 \\ 0 \end{bmatrix},
\begin{bmatrix}
\sigma_{ii}^2 & \sigma_{ij} \\
\sigma_{ij} & \sigma_{jj}^2
\end{bmatrix} \right),
\end{align}
we have (using the conditional distribution of MVN)
\begin{align}
E[Z_iZ_j^2] = E[E[Z_iZ_j^2|Z_j]] = E[Z_j^2E[Z_i|Z_j]] = \sigma_{ij}\sigma_{jj}^{-2}E[Z_j^3] = 0.
\end{align}
This shows that every term in $E[Z'Z\mu'Z]$ is $0$, whence $E[Z'Z\mu'Z] = 0$. This completes the proof.
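Finally, the whole identity $(1)$ can be confirmed end-to-end by simulation (a sketch with arbitrary, hypothetical $\mu$ and $\Sigma$):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.5, -1.0, 0.3])
Sigma = np.array([[2.0, 1.0, 0.5],
                  [1.0, 3.0, 0.8],
                  [0.5, 0.8, 1.5]])  # an arbitrary positive-definite matrix

n = 1_000_000
X = rng.multivariate_normal(mu, Sigma, size=n)

mc = np.var(np.sum(X * X, axis=1))  # Monte Carlo estimate of Var(X'X)

# Formula (1): Var(X'X) = 4 mu' Sigma mu + 2 tr(Sigma^2)
theory = 4 * mu @ Sigma @ mu + 2 * np.trace(Sigma @ Sigma)
```

For these values the right-hand side of $(1)$ equals $47.28$, and the simulated variance should match it to within Monte Carlo error.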