For the reason I stated in my comment, I assume what you are really interested in is finding an example such that $E[g(X_1)] > E[g(X_2)]$ for some decreasing function $g$ and random variables $X_1, X_2$ with $\mu_1 = E[X_1] > \mu_2 = E[X_2]$.
It is very easy to construct such an example using binary random variables: the idea is that you can tweak the probability mass at $0$, the location of the other, nonzero value, and the speed at which $g$ decreases.
For instance, let $X_1$ follow the distribution $P[X_1 = 0] = 0.9$, $P[X_1 = 10000] = 0.1$, and let $X_2$ follow the distribution $P[X_2 = 0] = P[X_2 = 900] = 0.5$. Clearly $\mu_1 = 1000 > \mu_2 = 450$. Now take $g(x) = -\sqrt{x}$ for $x \geq 0$; then $g$ is decreasing on its domain, and
\begin{align*}
E[g(X_1)] = 0.9 \times 0 + 0.1 \times (-100) = -10 > -15 = 0.5 \times 0 + 0.5 \times (-30) = E[g(X_2)].
\end{align*}
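If it helps, here is a quick numerical check of the computation above (a minimal Python sketch; it only redoes the arithmetic):

```python
import numpy as np

# Two-point example above: g(x) = -sqrt(x) is decreasing, yet the
# expectations are ordered opposite to the means.
g = lambda x: -np.sqrt(x)

E_g_X1 = 0.9 * g(0) + 0.1 * g(10_000)  # = -10.0
E_g_X2 = 0.5 * g(0) + 0.5 * g(900)     # = -15.0
print(E_g_X1, E_g_X2)  # -10.0 > -15.0, although E[X1] = 1000 > E[X2] = 450
```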
After some modification, the example you proposed can also serve as a counterexample. Instead of $g(x) = -\log(x)$ (which is undefined when $X_i \leq 0$), try $g(x) = -e^x$. By definition, if $X_i \sim N(\mu_i, \sigma_i^2)$, then $-g(X_i) = e^{X_i} \sim \text{Lognormal}(\mu_i, \sigma_i^2)$,
whence $E[g(X_i)] = -\exp(\mu_i + \sigma_i^2/2)$. Therefore, even when $\mu_1 > \mu_2$, as long as $\sigma_2^2 > \sigma_1^2 + 2(\mu_1 - \mu_2)$, we have $\mu_1 + \sigma_1^2/2 < \mu_2 + \sigma_2^2/2$ and hence $E[g(X_1)] > E[g(X_2)]$.
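To see this concretely, below is a small simulation sketch; the parameter values are my own illustrative choice, picked to satisfy $\sigma_2^2 > \sigma_1^2 + 2(\mu_1 - \mu_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the question): mu1 > mu2, but
# sigma2^2 = 4 > sigma1^2 + 2*(mu1 - mu2) = 3, so the ordering flips.
mu1, sigma1 = 1.0, 1.0
mu2, sigma2 = 0.0, 2.0

# Closed form: E[g(X_i)] = -exp(mu_i + sigma_i^2 / 2)
print(-np.exp(mu1 + sigma1**2 / 2))  # ~ -4.48
print(-np.exp(mu2 + sigma2**2 / 2))  # ~ -7.39

# Monte Carlo confirmation of the same quantities
x1 = rng.normal(mu1, sigma1, size=1_000_000)
x2 = rng.normal(mu2, sigma2, size=1_000_000)
print(np.mean(-np.exp(x1)), np.mean(-np.exp(x2)))
```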
In conclusion, examples that meet your constraint are fairly trivial to construct. As @User1865345 mentioned in the comment, a more interesting problem is how the stochastic ordering between two random variables is preserved by expectation: that is, $P[X_1 \leq x] \leq P[X_2 \leq x]$ for all $x$ implies that for any nondecreasing function $g$, $E[g(X_1)] \geq E[g(X_2)]$. A proof of this result can be found in this answer.
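As a quick illustration of that implication (the dominating pair below is my own choice, purely for demonstration): if $X_1 \sim N(1, 1)$ and $X_2 \sim N(0, 1)$, then $P[X_1 \leq x] = \Phi(x - 1) \leq \Phi(x) = P[X_2 \leq x]$ for all $x$, and for the nondecreasing $g(x) = x^3$ one gets $E[g(X_1)] = 4 \geq 0 = E[g(X_2)]$:

```python
import numpy as np

rng = np.random.default_rng(0)

# X1 ~ N(1, 1) stochastically dominates X2 ~ N(0, 1), so any
# nondecreasing g should satisfy E[g(X1)] >= E[g(X2)].
g = lambda x: x**3

x1 = rng.normal(1.0, 1.0, size=1_000_000)
x2 = rng.normal(0.0, 1.0, size=1_000_000)
print(g(x1).mean(), g(x2).mean())  # ~ 4 vs ~ 0
```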