
So I've been building a ProductSearchPage in React. It has a bunch of filter values that I apply to my product list before showing the results.

Up until now, I've been handling my product list as an array (even though I fetch it as an object, I convert it to an array), and I've been running lots of map, forEach, and especially filter loops over those arrays, over and over again:

  • I take the productList and filter it by category
  • I take the resulting filteredList and filter it by priceFilters
  • I take that filteredList and filter it again by ratingFilter
  • And so on for brandFilter, featuresFilters, etc. (see the sketch right after this list)
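
Roughly like this (the product fields and matching predicates here are hypothetical placeholders, not my real shapes; each step makes a full pass over the previous result):

// Assuming productList and the filter values (category, priceFilters,
// ratingFilter, brandFilter) are in scope, e.g. React state.
const byCategory = productList.filter((p) => p.category === category);
const byPrice = byCategory.filter((p) => p.price >= priceFilters.min && p.price <= priceFilters.max);
const byRating = byPrice.filter((p) => p.rating >= ratingFilter);
const filteredList = byRating.filter((p) => brandFilter.includes(p.brand));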

I began to worry that I might be creating a black hole of iterations that could hurt my performance at some point. All the searching and filtering happens client side, and we're talking about 2k products maximum.

So I wondered if it would be faster to iterate over and filter an object instead of an array, deleting properties and creating new objects along the way.
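
In other words, something like this per filter step (a minimal sketch; the field names and data are made up):

// Start from a keyed object and build a new, smaller object per step.
const allProducts = {
  p1: { category: 'shoes', price: 40 },
  p2: { category: 'hats', price: 15 },
};
const category = 'shoes';

const byCategory = {};
for (const id in allProducts) {
  if (allProducts[id].category === category) {
    byCategory[id] = allProducts[id]; // keep only matching products
  }
}
// ...then repeat the same pattern for price, rating, etc.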

So I wrote the snippet at the bottom of this post to test it.

And to my surprise, the results came out heavily in favor of the array loops:

Looping object with for...in: 0.31ms
Looping array with forEach: 0.08ms
Looping array with filter: 0.10ms
Looping array with map: 0.09ms

QUESTION

Is this enough evidence that looping through arrays is faster than looping through objects, and that I should stick to the forEach, map, and filter methods?

NOTE: This is really simplified. In my real case, each product is an object with several properties (some of them nested). So my options are to keep the list as an array of objects (like I've been doing so far), or to keep one big allProducts object with each product as a property of that object. Could this change the results?
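
Concretely, the two shapes I'm choosing between look like this (the field names are hypothetical):

// Option A: array of objects (what I do today)
const productsArray = [
  { id: 'p1', category: 'shoes', price: 40, specs: { color: 'red' } },
  { id: 'p2', category: 'hats', price: 15, specs: { color: 'blue' } },
];

// Option B: one big object, each product keyed by its id
const allProducts = {
  p1: { category: 'shoes', price: 40, specs: { color: 'red' } },
  p2: { category: 'hats', price: 15, specs: { color: 'blue' } },
};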

// The test snippet referenced above. Each test makes one pass over
// 2k entries and logs the elapsed time.
const myObject = {};
const myArray = [];

// Build 2k entries, matching the product count (i < 2000, not i <= 2000).
for (let i = 0; i < 2000; i++) {
  myObject['prop' + i] = i;
}

for (let k = 0; k < 2000; k++) {
  myArray[k] = k;
}

// Test 1: for...in over the object's keys
const t0 = window.performance.now();

for (const key in myObject) {
  if (myObject[key] % 37 === 0) {
    //console.log(myObject[key] + ' is a multiple of 37');
  }
}

const t1 = window.performance.now();
console.log('Looping object with for...in: ' + (t1 - t0).toFixed(2) + 'ms');

// Test 2: Array.prototype.forEach
const t2 = window.performance.now();

myArray.forEach((item) => {
  if (item % 37 === 0) {
    //console.log(item + ' is a multiple of 37');
  }
});

const t3 = window.performance.now();
console.log('Looping array with forEach: ' + (t3 - t2).toFixed(2) + 'ms');

// Test 3: Array.prototype.filter
const t4 = window.performance.now();

const newArray = myArray.filter((item) => item % 37 === 0);

const t5 = window.performance.now();
console.log('Looping array with filter: ' + (t5 - t4).toFixed(2) + 'ms');

// Test 4: Array.prototype.map
const t6 = window.performance.now();

const newArray2 = myArray.map((item) => item * 2);

const t7 = window.performance.now();
console.log('Looping array with map: ' + (t7 - t6).toFixed(2) + 'ms');
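
Worth noting: a single sub-millisecond run like this is very noisy. Repeating each test many times and averaging gives steadier numbers. A minimal sketch of that idea (the timeIt helper is my own, not a library function):

// Run each test many times and report the average per-run cost.
function timeIt(label, fn, runs = 1000) {
  const start = window.performance.now();
  for (let r = 0; r < runs; r++) {
    fn();
  }
  const total = window.performance.now() - start;
  console.log(label + ': ' + (total / runs).toFixed(4) + 'ms per run');
}

timeIt('array filter', () => myArray.filter((item) => item % 37 === 0));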
– cbdeveloper
  • It absolutely does not matter: a user will not notice a sub-millisecond difference. Even iterating the array multiple times will be received as immediate. – Bergi Jul 04 '19 at 18:01
  • I agree that at N <= 2000, worrying about this seems like premature optimization. Go with whatever is easiest to maintain and understand for others reading the code. Most likely that's the same thing you're doing currently. Also, consider the use case. Are you iterating or enumerating? Does order matter? `for..in` won't preserve ordering. See https://stackoverflow.com/questions/500504/why-is-using-for-in-with-array-iteration-a-bad-idea for a lot more discussion on this topic. – thmsdnnr Jul 04 '19 at 18:34
  • You might apply all your filters just in one loop. E.g. `productList.filter(e => category && priceFilters && ratingFilter etc...)` – Kosh Jul 05 '19 at 15:14
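
A single-pass version of Kosh's suggestion could look something like this (the individual predicates are hypothetical sketches; productList and the filter values are assumed to be in scope):

const results = productList.filter((p) =>
  (!category || p.category === category) &&
  (!priceFilters || (p.price >= priceFilters.min && p.price <= priceFilters.max)) &&
  (!ratingFilter || p.rating >= ratingFilter) &&
  (!brandFilter.length || brandFilter.includes(p.brand))
);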

0 Answers