I have been trying to simulate a simple problem: take about 100-1000 Ar atoms in an NVE system (fixed particle number, volume, and energy), initialized with equal speeds but randomized velocity directions, and evolve them to obtain a Maxwellian speed distribution. I am using a Lennard-Jones pair potential with no cutoff (despite it being computationally expensive), periodic boundary conditions, and a varying integration time step chosen to keep the numerical error in the total energy below 2%. I observe that my system reaches a Maxwellian-like speed distribution (visually it looks like a good Gaussian) starting from the initially rather uniform speed distribution. But soon afterwards some particles (about 10 in 100) accelerate to very large speeds, even though the total energy is held constant (to within 2% of its initial value), skewing the distribution to one side.
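For concreteness, the force evaluation I have in mind looks roughly like this minimal Python sketch (reduced units with ε = σ = 1 and a cubic box of side `box`; the function name and array layout are just illustrative, not my exact code):

```python
import numpy as np

def lj_forces(pos, box):
    """Lennard-Jones forces, no cutoff, minimum-image PBC (reduced units, eps = sigma = 1)."""
    n = len(pos)
    forces = np.zeros_like(pos)
    potential = 0.0
    for i in range(n - 1):
        # displacement vectors to all later particles, minimum-image convention
        dr = pos[i] - pos[i + 1:]
        dr -= box * np.round(dr / box)
        r2 = np.sum(dr * dr, axis=1)
        inv_r6 = 1.0 / r2**3
        # F = 24 * (2 r^-12 - r^-6) * dr / r^2 for V(r) = 4 * (r^-12 - r^-6)
        f = (24.0 * (2.0 * inv_r6**2 - inv_r6) / r2)[:, None] * dr
        forces[i] += f.sum(axis=0)
        forces[i + 1:] -= f
        potential += np.sum(4.0 * (inv_r6**2 - inv_r6))
    return forces, potential
```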
I have tried varying the number of particles and increasing the temperature to allow re-equilibration of what looks like a growing fluctuation, but I cannot make this effect go away. I am puzzled because I am managing to keep the total energy within the 2% bound. Has anyone else observed such long-time behavior in simple MD simulations, and/or is it well-known behavior? I am using the velocity Verlet scheme, a well-scaled system (i.e., reduced units rather than actual atomic dimensions), and a reasonably well-distributed initial condition.
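The integrator itself is the standard velocity Verlet update; schematically (same illustrative Python, unit masses, with `lj_forces` as in the sketch above and a timestep `dt` that I adapt between runs):

```python
def velocity_verlet_step(pos, vel, forces, dt, box):
    """One velocity Verlet step; masses set to 1 in reduced units."""
    vel_half = vel + 0.5 * dt * forces          # half-kick with current forces
    pos_new = (pos + dt * vel_half) % box       # drift, wrapped into the periodic box
    forces_new, potential = lj_forces(pos_new, box)
    vel_new = vel_half + 0.5 * dt * forces_new  # second half-kick with new forces
    return pos_new, vel_new, forces_new, potential
```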