I am trying to synchronize music played on different computers. I've been collecting quantitative information to try to identify sources of asynchrony.
I would like to know how far apart in time two sounds can be before people stop perceiving them as simultaneous. In other words, how imprecise can my synchronization be and still sound synchronized?
This obviously depends on the sort of music being played, so I'm looking for the lowest number, which I expect corresponds to especially fast, crisp music.
October 6 update
Let's try asking it a different way. I've been looking for all of the possible sources of variable lag and logging them. Since the music still sounds off, at least one of the following must be true:
- There's a bug in my code.
- I'm missing a source of lag.
- A difference of one centisecond in the playing of two sounds is enough for people to hear them as different.
One centisecond is one beat at 6000 bpm, or about one sixty-fourth of a beat at 94 bpm.
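For reference, a few lines of Python checking those conversions (the 94 bpm figure is just the tempo I mentioned above; nothing else is assumed):

```python
# One centisecond expressed in beats at different tempos.
offset_s = 0.01                 # one centisecond, in seconds

# Tempo at which one beat lasts exactly 10 ms.
print(60 / offset_s)            # -> 6000.0 bpm

# Fraction of a beat that 10 ms represents at 94 bpm.
beat_s = 60 / 94                # length of one beat at 94 bpm, ~0.638 s
print(offset_s / beat_s)        # -> ~0.0157, i.e. about 1/64 of a beat
```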
October 10 update
Hmm, I could also test this by offsetting a few sounds by a few centiseconds in an audio program, encoding them, and then playing them back. A rough sketch of that is below.
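Here is a minimal sketch of that test using only the Python standard library: it generates two click tracks, delays one by a configurable number of milliseconds, mixes them, and writes a WAV file to listen to. The file name, click spacing, click tone, and 10 ms offset are all arbitrary choices for illustration, not part of my actual setup.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second

def click_track(duration_s, interval_s, offset_s):
    """Return samples containing short 1 kHz clicks every `interval_s`
    seconds, with the whole track delayed by `offset_s` seconds."""
    samples = [0.0] * int(duration_s * SAMPLE_RATE)
    click_len = int(0.005 * SAMPLE_RATE)  # 5 ms click
    t = offset_s
    while t < duration_s:
        start = int(t * SAMPLE_RATE)
        for i in range(click_len):
            if start + i < len(samples):
                samples[start + i] += 0.5 * math.sin(2 * math.pi * 1000 * i / SAMPLE_RATE)
        t += interval_s
    return samples

def write_wav(filename, samples):
    """Write mono 16-bit PCM."""
    with wave.open(filename, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        frames = b"".join(
            struct.pack("<h", max(-32767, min(32767, int(s * 32767)))) for s in samples
        )
        w.writeframes(frames)

if __name__ == "__main__":
    # One click every half second; the second track lags by one centisecond (10 ms).
    a = click_track(duration_s=5.0, interval_s=0.5, offset_s=0.0)
    b = click_track(duration_s=5.0, interval_s=0.5, offset_s=0.010)
    mixed = [x + y for x, y in zip(a, b)]
    write_wav("offset_test_10ms.wav", mixed)
```

Listening to versions of this file with different offsets should tell me at roughly what offset the clicks stop sounding like a single sound.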
Also, people seem to think that I'm streaming music or that I'm asking about sources of latency. I'm not streaming music, but I'll keep everyone's thoughts on where the latency may be coming from in mind.