
I want to test the Java runtime. I am using System.nanoTime() to determine how long the program takes to execute. My simple idea is to let a variable count up to one million in a loop and measure how long that takes. As you surely know, these measurements often differ slightly, because the computer is doing other things as well. So I wanted to repeat the test run about a thousand times, so I can see an average time. Now here is my problem: the third output is significantly lower, and everything after it comes close to zero, which makes no sense to me. I guessed that this could have something to do with Java itself, but even though I am currently writing a paper for university about Java and Python, I could not figure it out. It would be great if somebody had a hint for this puzzle!

public class Runtime {

    public static void main(String[] args) {
        int count = 0;
        double timesum = 0;
        for (int i = 0; i < 1000; i++) {
            timesum = timesum + testrun();
            count = count + 1;
        }
        System.out.println(timesum / count);
    }

    public static double testrun() {
        final long start = System.nanoTime();
        int sum = 0;
        for (int i = 0; i < 1000000; i++) {
            sum = sum + 1;
        }
        final long finish = System.nanoTime();
        long time = finish - start;
        System.out.println(time * 0.000000001);
        System.out.println("------------------");
        return (time * 0.000000001);
    }
}

The output is something like this:

1752000
0.001752
1514300
0.0015143
45200
4.52E-5
42500
4.25E-5
0
0.0
100
1.0000000000000001E-7
0
0.0
0
0.0
0
0.0
0
0.0
0
0.0
100

etc....
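
To make clearer what I mean by the measurement setup, here is a minimal sketch of the same idea where the computed sum is printed after the timing block, so the loop result is visibly used. The class name RuntimeSketch and the extra print are my own additions for illustration; I am not claiming this changes anything.

public class RuntimeSketch {

    public static void main(String[] args) {
        double timesum = 0;
        int count = 0;
        for (int i = 0; i < 1000; i++) {
            timesum = timesum + testrun();
            count = count + 1;
        }
        System.out.println("average seconds: " + timesum / count);
    }

    // Same measurement as above, but the loop result is printed afterwards,
    // so the work of the loop is actually used somewhere.
    public static double testrun() {
        final long start = System.nanoTime();
        int sum = 0;
        for (int i = 0; i < 1000000; i++) {
            sum = sum + 1;
        }
        final long finish = System.nanoTime();
        System.out.println("sum = " + sum); // use the result of the loop
        double seconds = (finish - start) * 0.000000001;
        System.out.println(seconds);
        System.out.println("------------------");
        return seconds;
    }
}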
