I just wrote a program that finds all the prime numbers up to an upper bound.
The algorithm: Sieve of Eratosthenes.
I wrote it in both C and Java. The upper bound is 666013.
For some reason the C version takes more than 2.5 seconds, while the Java version finishes in about half a second.
Details:
The array in C is of type char.
The array in Java is of type boolean.
C IDE: Code::Blocks
Java IDE: IntelliJ IDEA Community Edition
C code:
#include <stdio.h>

int main(void) {
    int n = 666013;
    char a[n + 1];    /* a[i] == 0 means i is still considered prime; n + 1 so index n is valid */
    int i, k;

    for (i = 2; i <= n; i++)
        a[i] = 0;

    for (i = 2; i <= n; i++)
        if (a[i] == 0) {
            printf("%d\n", i);
            for (k = i + i; k <= n; k += i)    /* mark every multiple of i as composite */
                a[k] = 1;
        }

    return 0;
}
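The 2.5 second figure above is measured from outside the program, while the Java number comes from System.nanoTime() inside the program. For an apples-to-apples comparison, the C version could time itself too; here is a minimal, untested sketch using clock() from <time.h>. Note that clock() reports processor time, so time the program spends blocked on console output may not be fully reflected.

#include <stdio.h>
#include <time.h>

int main(void) {
    int n = 666013;
    char a[n + 1];
    int i, k;
    clock_t start = clock();    /* start of the measured region, analogous to System.nanoTime() */

    for (i = 2; i <= n; i++)
        a[i] = 0;
    for (i = 2; i <= n; i++)
        if (a[i] == 0) {
            printf("%d\n", i);
            for (k = i + i; k <= n; k += i)
                a[k] = 1;
        }

    /* print the elapsed CPU time to stderr so it does not mix with the primes on stdout */
    fprintf(stderr, "%.3f s\n", (double)(clock() - start) / CLOCKS_PER_SEC);
    return 0;
}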
Java code:
package primes;

public class Prime {
    public static void main(String[] args) {
        long starttime = System.nanoTime();
        final int MAXN = 666013;
        boolean[] a = new boolean[MAXN];
        for (int i = 2; i < a.length; i++)
            a[i] = true;
        for (int i = 2; i < a.length; i++)
            if (a[i]) {
                System.out.println(i);
                System.out.printf("");
                for (int j = i + i; j < a.length; j += i) {
                    a[j] = false;
                }
            }
        System.out.println(System.nanoTime() - starttime);
    }
}
Last Edit: measured with System.nanoTime(), the Java version takes about 0.35 seconds.
As far as I can tell, the C algorithm cannot be made any faster. What is the reason Java is faster here?
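To make that claim easier to check, here is another untested sketch: the same C sieve, but counting the primes instead of printing each one, so the sieve loop is timed separately from the console output.

#include <stdio.h>
#include <time.h>

int main(void) {
    int n = 666013;
    char a[n + 1];
    int i, k;
    int count = 0;
    clock_t start = clock();

    for (i = 2; i <= n; i++)
        a[i] = 0;
    for (i = 2; i <= n; i++)
        if (a[i] == 0) {
            count++;    /* count instead of printf("%d\n", i) */
            for (k = i + i; k <= n; k += i)
                a[k] = 1;
        }

    printf("%d primes found in %.3f s\n", count,
           (double)(clock() - start) / CLOCKS_PER_SEC);
    return 0;
}

If the loop by itself finishes in a few milliseconds, that would suggest the measured difference between the two programs is mostly in how fast each environment consumes tens of thousands of printed lines, not in the sieve itself.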