Say I have a time series of a parameter $y(t)$. Each value of $y(t)$ has an uncertainty of $\epsilon_y$ due to how it was measured. What is the uncertainty on the calculated RMS or standard deviation?
As an example, say I am measuring a voltage signal $V(t)$. The voltage measurement device has a specified uncertainty $\epsilon_V$ of 0.5%. If I want to state that the AC voltage signal had an RMS of 240 volts $\pm \epsilon_{\mathrm{RMS}}$, how would I determine the uncertainty on that RMS value?
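
For concreteness, here is a minimal numerical sketch of the kind of situation I mean (the 50 Hz waveform, sample rate, and Gaussian noise model are placeholder assumptions, not my actual setup). I can brute-force the spread of the RMS by repeating the simulated measurement, but I would like to know the analytic way to propagate $\epsilon_V$ into $\epsilon_{\mathrm{RMS}}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder signal: 240 V (RMS) sine wave at 50 Hz, sampled at 10 kHz for 0.2 s
f = 50.0
fs = 10_000.0
t = np.arange(0, 0.2, 1 / fs)
V_true = 240 * np.sqrt(2) * np.sin(2 * np.pi * f * t)

# Model the 0.5% per-sample measurement uncertainty as Gaussian noise
eps_rel = 0.005
V_meas = V_true + rng.normal(0.0, eps_rel * np.abs(V_true))

rms = np.sqrt(np.mean(V_meas**2))
print(f"measured RMS ~ {rms:.2f} V")

# Brute-force check: repeat the simulated measurement many times and
# look at the spread of the resulting RMS values
rms_samples = [
    np.sqrt(np.mean((V_true + rng.normal(0.0, eps_rel * np.abs(V_true)))**2))
    for _ in range(2000)
]
print(f"spread of RMS over repeats ~ {np.std(rms_samples):.3f} V")
```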