Your data will not actually be normal; real data never are. With enough data you would reject normality in any formal test, but that rejection tells you nothing about whether your probability calculations would still be useful approximations.
It's possible to collect some information on-line that can help detect particular kinds of non-normality: skewness and kurtosis can be updated cheaply as each observation arrives and will pick up some departures from normality, though they are blind to other forms of non-normality that could matter for the kind of calculation you want to apply. I wouldn't suggest formal hypothesis testing with these (or with any other measure), and especially not a test repeated every period.
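As a minimal sketch of the on-line updating mentioned above, running skewness and kurtosis can be maintained in one pass using the standard single-observation central-moment update formulas (Pébay's formulas); the class and method names here are illustrative, not from any particular library:

```python
import math

class RunningMoments:
    """One-pass running skewness and kurtosis via single-observation
    central-moment updates (Pébay's formulas). Cheap enough to run on-line."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.M2 = 0.0  # running sum of squared deviations from the mean
        self.M3 = 0.0  # running sum of cubed deviations
        self.M4 = 0.0  # running sum of fourth-power deviations

    def update(self, x):
        self.n += 1
        n = self.n
        delta = x - self.mean
        delta_n = delta / n
        term1 = delta * delta_n * (n - 1)
        self.mean += delta_n
        # M4 and M3 must be updated before M2, since their updates use
        # the old value of the lower-order moments
        self.M4 += (term1 * delta_n * delta_n * (n * n - 3 * n + 3)
                    + 6 * delta_n * delta_n * self.M2 - 4 * delta_n * self.M3)
        self.M3 += term1 * delta_n * (n - 2) - 3 * delta_n * self.M2
        self.M2 += term1

    def skewness(self):
        # sample skewness g1; needs n >= 2 and non-zero variance
        return math.sqrt(self.n) * self.M3 / self.M2 ** 1.5

    def excess_kurtosis(self):
        # excess kurtosis g2; near 0 for normal data in large samples
        return self.n * self.M4 / (self.M2 * self.M2) - 3.0
```

Values far from 0 (skewness) or 0 (excess kurtosis) flag the kinds of non-normality these statistics can see; as noted above, other departures will slip past them.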
However, it sounds like you expect your process to be at least stationary and, given the calculations you want to use it for, actually independent and identically distributed (if you don't, you need a different approach from the one in your question anyway). So I am going to suggest an alternative: run the process for a while to collect the min, max, mean and sd, and use that information to set up bins for a histogram estimator of the distribution, with as many bins as you can reasonably manage. Depending on your needs, it should be feasible to leave open-ended bins at each end holding a small proportion of the values; the more you can afford to put there, the more detail you can have in the middle for a fixed number of bins. You can then get approximate cumulative probabilities (cdf or survivor function values) from the histogram estimate of the pdf.
This avoids the need to assume normality entirely (and also gives you the information you'd need to assess how good a normal approximation would have been).
I'd also suggest updating the first few terms of an ACF, to get at least some idea of whether the iid assumption fails (in the most obvious ways); you may want to weight it exponentially if the process might change over time. [Such a possibility would apply to your means, variances, etc., and even to your histogram estimator, so you may want to consider some way to allow for it in those, too.]
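An exponentially weighted version of the first few ACF terms can also be updated on-line; a minimal sketch, where the class name, the default `alpha` (forgetting rate) and `max_lag` are all illustrative choices:

```python
from collections import deque

class EwAcf:
    """Exponentially weighted estimates of the first few autocorrelations.
    `alpha` controls how fast old observations are forgotten."""
    def __init__(self, max_lag=5, alpha=0.01):
        self.alpha = alpha
        self.max_lag = max_lag
        self.mean = 0.0
        self.initialized = False
        self.cov = [0.0] * (max_lag + 1)     # cov[0] is the variance
        self.recent = deque(maxlen=max_lag)  # recent demeaned values

    def update(self, x):
        # exponentially weighted mean
        if not self.initialized:
            self.mean = x
            self.initialized = True
        else:
            self.mean += self.alpha * (x - self.mean)
        d = x - self.mean
        # exponentially weighted variance and lagged covariances
        self.cov[0] += self.alpha * (d * d - self.cov[0])
        for k, past in enumerate(reversed(self.recent), start=1):
            self.cov[k] += self.alpha * (d * past - self.cov[k])
        self.recent.append(d)

    def acf(self):
        """Autocorrelations at lags 1..max_lag (zeros until variance > 0)."""
        if self.cov[0] <= 0.0:
            return [0.0] * self.max_lag
        return [self.cov[k] / self.cov[0] for k in range(1, self.max_lag + 1)]
```

If the first few of these stay near zero, the iid assumption at least isn't failing in the most obvious way; large values are a warning that the probability calculations (and the histogram estimator itself) may need rethinking.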