I read someone's blog post claiming that high-precision random numbers are hard to get. The author said that if you want a one-in-a-million chance, using

Random(1,1000) * Random(1,1000) < 2

is better than

Random(1,1000000) < 2

(assume that Random(1,n) generates integers uniformly between 1 and n).
Theoretically the two probabilities are the same, but what is the practical difference between these two expressions in a real program, for example with the rand() function in standard C? Please give more detail.
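To make the comparison concrete, here is roughly what I have in mind in C. I am mapping Random(1,n) onto rand() with the usual rand() % n + 1 idiom; that mapping and the helper name random_1_to_n are my own illustration, not something from the blog:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Random(1,n) approximated with the common modulo idiom.
   Note: this has modulo bias, and RAND_MAX may be as small as 32767. */
static int random_1_to_n(int n)
{
    return rand() % n + 1;
}

int main(void)
{
    srand((unsigned)time(NULL));

    /* Variant A from the blog: product of two smaller draws.
       The product is < 2 only when both draws are 1. */
    int hit_a = random_1_to_n(1000) * random_1_to_n(1000) < 2;

    /* Variant B: a single large draw. */
    int hit_b = random_1_to_n(1000000) < 2;

    printf("variant A hit: %d, variant B hit: %d\n", hit_a, hit_b);
    return 0;
}
```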