
I've tried to measure request-response time programmatically, and ended up with this code:

function fakeRequest(wait) {
  return new Promise(resolve => {
    setTimeout(() => resolve(wait), wait);
  });
}

function calculateTime(fn, params) {
  const startTime = new Date().getTime();
  fn(...params)
    .then(response => {
      const endTime = new Date().getTime();
      const requestTime = endTime - startTime;
      console.log(`
        Request should take ${response} ms
        Request took ${requestTime} ms
      `);
    });
}

calculateTime(fakeRequest, [2000]);

In this example, the resolve time is hardcoded (2000 milliseconds), so in my understanding the final result should always be the same: 2 seconds. But when I run this code on my machine, it gives me results varying between 2000 ms and 2003 ms.

I'm trying to figure out where these extra milliseconds come from:

  1. The execution time of new Date().getTime() itself (but if so, why do we get results varying between 2000 and 2003, rather than the same value on every run?).

  2. The asynchronous nature of the request, even though it has a hardcoded resolve time.

  3. Something else.

I'd like to hear your thoughts and find a way to get the real response time (2 seconds in this case).
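To see how much of the gap comes from the timer itself, one option is to measure the drift of setTimeout directly. A minimal sketch (measureTimerDrift is a made-up helper name, and performance.now() is assumed to be available, as it is in browsers and modern Node.js):

```javascript
// Sketch: measure how late setTimeout fires relative to its nominal delay.
// performance.now() is used for sub-millisecond resolution.
function measureTimerDrift(delay) {
  const start = performance.now();
  return new Promise(resolve => {
    setTimeout(() => resolve(performance.now() - start - delay), delay);
  });
}

measureTimerDrift(2000).then(drift => {
  // Typically a small positive number: the callback fired a bit late.
  console.log(`setTimeout was ${drift.toFixed(2)} ms late`);
});
```

Running this repeatedly should show the drift itself varies from run to run, which would point at the timer rather than at Date() overhead.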

Commercial Suicide

2 Answers


While Felix is right that setTimeout can't guarantee the exact callback time, there are some things to note about your code. You aren't capturing the end time as early as possible: you record it in the .then callback, i.e. after the resolve, rather than inside the setTimeout callback just before it. My test below gets slightly closer to the desired time. My point is that even if your setTimeout were exact, I don't think your log would be correct anyway.

var closeEnd

function fakeRequest(wait) {
  return new Promise(resolve => {
    setTimeout(() => {
      closeEnd = performance.now()
      resolve(wait)
    }, wait);
  })  
}

function calculateTime(fn, params) {
  const startTime = performance.now()
  console.log(startTime)
  fn(...params)
    .then(response => {      
      const requestTime = closeEnd - startTime;      
      console.log(`
        Request should take ${response} ms
        Request took ${requestTime} ms
      `);
    });
}

calculateTime(fakeRequest, [2000]);
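A side note on the switch to performance.now(): unlike Date.now(), it is monotonic and typically has sub-millisecond resolution, which makes it the better tool for measuring durations. A quick sketch of the difference:

```javascript
// Sketch: Date.now() ticks in whole milliseconds, while performance.now()
// is monotonic and typically has sub-millisecond resolution.
console.log(Date.now());        // integer ms since the Unix epoch
console.log(performance.now()); // ms since process/page start, usually fractional

// Two back-to-back reads of performance.now() are a tiny, never-negative
// interval apart, so durations computed from it can't go backwards.
const t0 = performance.now();
const t1 = performance.now();
console.log(t1 - t0);
```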
Matt Way

If you replace setTimeout(() => resolve(wait), wait) with an immediate resolve(wait), you will still get ~5 ms. That is probably due to two things:

1) Date.now() (and new Date().getTime()) does not return an accurate, high-resolution time.

2) Promises are always resolved asynchronously, so there is a small delay until the engine processes the microtask queue.
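Point 2 can be demonstrated directly: even an already-resolved promise never runs its .then callback synchronously; the callback is deferred to the microtask queue. A minimal sketch:

```javascript
// Sketch: .then callbacks always run asynchronously, after the current
// synchronous code finishes, even for an already-resolved promise.
const order = [];

Promise.resolve().then(() => order.push('then callback'));
order.push('synchronous code');

// Flush the microtask queue before inspecting the order.
queueMicrotask(() => console.log(order)); // → ['synchronous code', 'then callback']
```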

So even if setTimeout were accurate (or if you weren't mocking the request), you still wouldn't get an exact result, and there is no way to. And frankly, I see no reason why those few milliseconds would matter.

Jonas Wilms