For the sake of simplicity, let's assume that the data really are conditionally Poisson and that $\hat{\theta}_i$ really is the true conditional mean ($\lambda_i$) everywhere. For a Poisson distribution, the variance equals the mean, so the variance at every point is $\lambda_i = \hat{\theta}_i$. If you divide by the square root of the expected value, $\sqrt{\hat{\theta}_i}$, you are dividing by the conditional standard deviation, and because that value is keyed to each individual conditional mean, you are stabilizing the variance.
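Here is a minimal simulation sketch of that point (the values of $\lambda$ are just illustrative): the raw residuals have variance that grows with $\lambda$, but dividing by $\sqrt{\lambda}$ gives roughly unit variance at every conditional mean.

```python
import numpy as np

rng = np.random.default_rng(0)

for lam in [2.0, 10.0, 50.0]:
    y = rng.poisson(lam, size=200_000)
    raw = y - lam                 # raw residuals: variance grows with lambda
    std = raw / np.sqrt(lam)      # standardized: variance is ~1 for every lambda
    print(f"lambda={lam:5.1f}  var(raw)={raw.var():7.3f}  var(std)={std.var():5.3f}")
```

The standardized column stays near 1 no matter how large $\lambda$ gets, which is exactly what "stabilizing the variance" means here.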
So why wouldn't it be "approximately" normal (admitting that the hand-waviness of that word carries some of the weight)? Well, the normal distribution extends to infinity in both directions, but the Poisson distribution only extends to positive infinity and is bounded from below by $0$. More specifically, the Poisson distribution is skewed, while the normal is not. However, as $\lambda$ increases, the Poisson distribution becomes less skewed (its skewness is exactly $1/\sqrt{\lambda}$). So when the "$\hat{\theta}_i$ are not too small", that becomes less and less of an issue. In addition, the lower bound at $0$ moves ever more standard deviations below the mean, so it becomes less and less likely that any data would be in its vicinity.
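You can check that skewness claim directly by simulation (again with illustrative values of $\lambda$): the sample skewness of Poisson draws tracks $1/\sqrt{\lambda}$ closely and shrinks toward the normal's skewness of $0$ as $\lambda$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)

for lam in [1.0, 4.0, 25.0, 100.0]:
    y = rng.poisson(lam, size=500_000).astype(float)
    z = (y - y.mean()) / y.std()
    sample_skew = np.mean(z**3)          # standardized third moment
    print(f"lambda={lam:6.1f}  sample skew={sample_skew:6.3f}  "
          f"1/sqrt(lambda)={1/np.sqrt(lam):6.3f}")
```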
The other issue is that the Poisson distribution is discrete, whereas the normal is continuous, so there are 'gaps' in the distribution. First, note that any finite sample will have gaps, no matter how finely measured (and even if the data are drawn from a true normal). Second, with higher $\lambda$ there will be many possible values that are meaningfully likely to occur, and they are closer together (relatively, albeit not absolutely). Moreover, as the observations come from an increasing number of distinct conditional distributions (different $\hat{\theta}_i$), there will be a larger number of possible standardized residual values, which makes the distribution more 'continuous-ish'. Putting all of the above together, the claim is not too surprising.
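That last point can be sketched with a small simulation (the means and sample size are illustrative): with one shared $\lambda$ the standardized residuals live on a single discrete lattice, but with a different $\lambda_i$ per observation the lattices interleave, producing far more distinct residual values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# One shared conditional mean -> one lattice of possible residual values.
lam_fixed = np.full(n, 20.0)
r_fixed = (rng.poisson(lam_fixed) - lam_fixed) / np.sqrt(lam_fixed)

# Many different conditional means -> many interleaved lattices.
lam_varied = rng.uniform(5.0, 50.0, size=n)
r_varied = (rng.poisson(lam_varied) - lam_varied) / np.sqrt(lam_varied)

print("distinct residual values, shared lambda:", len(np.unique(r_fixed)))
print("distinct residual values, varied lambda:", len(np.unique(r_varied)))
```

With varied $\lambda_i$, nearly every standardized residual is distinct, so a histogram of them looks much closer to a smooth curve than the single-$\lambda$ case does.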