Recently I was asked to refactor some code that leverages JavaScript's array.reduce()
method because other developers felt the code was hard to read. While doing this, I decided to play around with some JavaScript array iteration performance benchmarks to help me validate the various approaches. I wanted to know the fastest way to reduce an array, but I don't know if I can trust the initial results:
http://jsbench.github.io/#8803598b7401b38d8d09eb7f13f0709a
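For context, the reduce()-based sum being compared looks roughly like this (a simplified sketch, not the actual production code):

```javascript
// Build the same 1000-element array of random numbers used in the benchmark setup
var arr = [];
for (var i = 0; i < 1000; i++) {
  arr[i] = Math.random();
}

// Array.prototype.reduce: fold each element into a running total,
// starting from an initial accumulator value of 0
var sum = arr.reduce(function (acc, n) {
  return acc + n;
}, 0);

console.log(sum); // some value between 0 and 1000
```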
I added the test case for "while loop array.pop() assignments" to the benchmarks linked above (mostly for the fun of it), but I think there must be something wrong with the tests. The variation in ops/sec seems too large to be accurate. I fear that something is wrong with my test case, as I don't understand why this method would be so much faster.
I have researched this quite a bit and have found a lot of conflicting information from over the years. I want to better understand what specifically is causing the high variance measured in the benchmark linked above, which leads to this post: given the benchmark example (linked above and shown below), why does the While Loop test case measure over 5000x faster than its For Loop counterpart?
// Benchmark Setup
var arr = [];
var sum = 0; // set to 0 before each test
for (var i = 0; i < 1000; i++) {
arr[i] = Math.random();
}
// Test Case #1
// While loop, implicit comparison, inline pop code
var i = arr.length;
while ( i-- ) {
sum += arr.pop();
}
// Test Case #2
// Reverse loop, implicit comparison, inline code
for ( var i = arr.length; i--; ) {
sum += arr[i];
}
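For reference, here is a small standalone sketch showing that Array.prototype.pop() mutates the array it is called on, so a second pass over the same array performs no iterations at all. I am not certain this is what the benchmark harness is doing between timed runs, but it seems relevant to the variance:

```javascript
// pop() removes elements as it sums them, draining the array
var arr = [1, 2, 3];
var sum = 0;

var i = arr.length;
while (i--) {
  sum += arr.pop(); // pop() returns and removes the last element
}
console.log(arr.length); // 0 — the array has been emptied

// A hypothetical second timed pass over the same array:
var i2 = arr.length; // 0
while (i2--) {       // 0 is falsy, so the body never executes
  sum += arr.pop();
}
console.log(sum); // still 6 — the second pass did no work
```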
*Edit:*
In response to the downvotes: I want this post to be useful, so I am adding images to provide context for the links. I removed unnecessary details and refined the content to focus on the question I am seeking answers to. I also removed a previous example that was confusing.
The `pop` version just looks like a bunch of no-ops.