I have a dataset that contains a temperature measurement for every minute over a certain time period.
I want to look at 10-minute intervals and determine whether two adjacent 10-minute intervals differ significantly. The comparison itself is fairly simple, but first I need to work out how big the difference has to be before I call it significant.
I tried to work this out by looking at the average difference between adjacent 10-minute intervals. That works well enough, but I'm wondering: is it better to characterize each interval by its average or by a min-max (range) kind of difference?
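For concreteness, this is roughly what I mean by the average-difference baseline (a minimal sketch in plain Python; `readings` is just a placeholder for my per-minute temperature list, and the function names are made up):

```python
# Minimal sketch of the baseline I'm computing (plain Python, no libraries).
# `readings` is a placeholder: one temperature per minute, in order.

def interval_means(readings, size=10):
    """Mean of each full, non-overlapping `size`-minute interval."""
    return [sum(readings[i:i + size]) / size
            for i in range(0, len(readings) - size + 1, size)]

def typical_adjacent_diff(readings, size=10):
    """Average absolute difference between the means of adjacent intervals."""
    means = interval_means(readings, size)
    if len(means) < 2:
        raise ValueError("need at least two full intervals")
    diffs = [abs(b - a) for a, b in zip(means, means[1:])]
    return sum(diffs) / len(diffs)
```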
Example:
10, 11, 10, 10, 11, 12, 12, 12, 13, 13 -> Average: 11.4, MinMaxDiff: 3
Let's say something goes wrong with the measurements in the next 10 minutes:
13, 13, 14, 28, 29, 28, 29, 30, 30, 31 -> Average: 24.5, MinMaxDiff: 18
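Just to make those numbers reproducible, here is the same calculation in code (plain Python; the two lists are simply the example intervals above):

```python
# Reproduces the two example intervals above.
interval_a = [10, 11, 10, 10, 11, 12, 12, 12, 13, 13]
interval_b = [13, 13, 14, 28, 29, 28, 29, 30, 30, 31]

def summarize(interval):
    average = sum(interval) / len(interval)        # mean of the interval
    min_max_diff = max(interval) - min(interval)   # range within the interval
    return average, min_max_diff

print(summarize(interval_a))  # (11.4, 3)
print(summarize(interval_b))  # (24.5, 18)
```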
In this example both approaches would obviously work, but is one approach generally better than the other?