
I am measuring the time of a computer operation. The operation should take roughly the same time each time I measure it. How many times should I measure it to get a good average and standard deviation?

Originally posted here: https://physics.stackexchange.com/questions/64917/how-many-measurements-should-be-done , but I think this forum is more appropriate for my question.

  • Please note that reposting across SE fora is discouraged. The correct thing to do if you post in one place but think it belongs in another is to flag it and ask for it to be moved. – Glen_b May 18 '13 at 03:07

2 Answers


The easy rule of thumb is that more is better: if measurements are cheap and easy, take more of them. How many you actually need depends on how much variance there is between the measures - that is, how much measurement error there is - and how accurately you want to estimate the quantity. If the measures are always exactly the same, then one will do; if they are widely different, you'll need more.
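
For the timing case in the question, a quick way to get a feel for that variance is to take a small batch of measurements and look at their spread. A minimal Python sketch (my illustration only - `operation()` is just a stand-in for whatever you are actually timing):

```python
import time
import statistics

def operation():
    # stand-in for the operation actually being timed
    sum(i * i for i in range(10_000))

def time_operation(n_runs):
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)      # sample standard deviation
    se = sd / len(samples) ** 0.5       # standard error of the mean
    return mean, sd, se

for n in (5, 20, 100):
    mean, sd, se = time_operation(n)
    print(f"n={n:3d}  mean={mean:.6f}s  sd={sd:.6f}s  se={se:.6f}s")
```

If the standard error is already tiny relative to the precision you care about after a handful of runs, you are done; if not, keep adding runs.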

The problem has existed for a long time in psychometrics - how many questions do I need to ask someone to get an idea of their (say) math ability?

The answer is given by the Spearman-Brown prophecy formula: http://en.wikipedia.org/wiki/Spearman%E2%80%93Brown_prediction_formula . This tells you how the reliability increases as a function of the number of measures.
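
As a rough illustration (my sketch, not part of the original answer), the prediction formula from the linked page is easy to apply directly; the reliability values below are made up:

```python
# Spearman-Brown prediction formula: predicted reliability when the
# number of measures is multiplied by n_factor.

def spearman_brown(reliability, n_factor):
    return n_factor * reliability / (1 + (n_factor - 1) * reliability)

def lengthening_needed(reliability, target):
    """Factor by which to increase the number of measures to reach a target reliability."""
    return target * (1 - reliability) / (reliability * (1 - target))

print(spearman_brown(0.5, 4))        # 0.8: four times as many measures
print(lengthening_needed(0.5, 0.9))  # 9.0: nine times as many measures for 0.9
```

So a single, fairly noisy measure can be brought up to a respectable reliability simply by averaging more of them.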

Jeremy Miles

That really depends on your definition of 'good'.

What kinds of things do you need to say?

If sufficient moments exist, the variance of the mean decreases as $\frac{1}{n}$, and the variances of the sample variance and of the sample standard deviation also decrease as $\frac{1}{n}$ (for large $n$, anyway - e.g. for the variance of the sample variance you end up with $\frac{1}{n}\times$ a term that eventually settles down to a constant).
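
A quick simulation sketch (my addition, with an arbitrary $\sigma$) makes the $\frac{1}{n}$ scaling of the variance of the mean concrete:

```python
import random
import statistics

random.seed(0)
sigma = 2.0  # arbitrary "true" standard deviation of one measurement

for n in (10, 100, 1000):
    # estimate Var(sample mean) from 2000 replicate experiments of size n
    means = [statistics.mean(random.gauss(0.0, sigma) for _ in range(n))
             for _ in range(2000)]
    print(f"n={n:4d}  var(mean)={statistics.variance(means):.5f}  "
          f"sigma^2/n={sigma**2 / n:.5f}")
```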

So, for example, if you specify the width of a confidence interval you require, you can compute the approximate $n$ you should need to obtain it, as long as you have some idea of the quantities involved.
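
For instance, with the usual normal approximation the half-width of a 95% interval for the mean is about $1.96\,\sigma/\sqrt{n}$, so $n \approx (1.96\,\sigma/\text{half-width})^2$. A minimal sketch, assuming you have a rough prior estimate of $\sigma$ (the numbers below are made up):

```python
import math

def n_for_ci_halfwidth(sigma, halfwidth, z=1.96):
    """Normal-approximation sample size so that z * sigma / sqrt(n) <= halfwidth."""
    return math.ceil((z * sigma / halfwidth) ** 2)

# e.g. timings with sd around 5 ms and a target 95% CI of +/- 1 ms for the mean:
print(n_for_ci_halfwidth(sigma=5.0, halfwidth=1.0))  # 97
```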

Glen_b