Tags: javascript, for-loop, v8, slowdown

JavaScript loop (for statement) slows down after 2.1 billion iterations?


I was benchmarking JavaScript against .NET Core in order to choose a server-side framework for some specific RESTful services that need to iterate over large arrays (about 2.1 billion elements). While working on a simple test, I realized that Node shows strange behavior after a particular number of iterations. I repeated the test on multiple platforms and got the same result. The tested platforms were:

A screen recording shows the processing time surprisingly doubling, from about 300 ms to 600 ms.

Sample code:

1. Node.js:

const logPeriod = 100000000;
const max = 10000000000;
let cnt = 0;
for (let i = 0; i < max; i++) {
  if (i % logPeriod === 0) {
    if (i !== 0) {
      // stop the timer for the previous chunk of logPeriod iterations
      console.timeEnd(String(cnt * logPeriod));
      cnt++;
    }
    // start a timer for the next chunk
    console.time(String(cnt * logPeriod));
  }
}

2. Browser:

<!DOCTYPE html>
<html>
  <head>
    <script>
      function doloop() {
        const logPeriod = 100000000;
        const max = 10000000000;
        let cnt = 0;
        for (let i = 0; i < max; i++) {
          if (i % logPeriod === 0) {
            if (i !== 0) {
              // stop the timer for the previous chunk of logPeriod iterations
              console.timeEnd(String(cnt * logPeriod));
              cnt++;
            }
            // start a timer for the next chunk
            console.time(String(cnt * logPeriod));
          }
        }
      }
    </script>
  </head>
  <body>
    <button onclick="doloop()">doloop</button>
  </body>
</html>


Solution

  • V8 developer here.

    V8's optimizing compiler generates code that uses plain 32-bit integers for numbers as long as it can. Once a number exceeds int32 range (or precision requirements, i.e. when it needs to hold fractional values), then such optimized code is thrown away (or never generated in the first place) and 64-bit doubles are used instead, as the JavaScript spec requires. Arithmetic operations (even something as simple as i++) are slower on 64-bit doubles than they are on 32-bit integers, that's just what hardware does.
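A quick way to see this (a minimal sketch; absolute timings vary by machine and V8 version) is to run the same increment loop once with values that fit in int32 range and once with values that start beyond it:

```javascript
// Minimal sketch: the same loop body, run once with values that fit in
// 32-bit integers and once with values that force 64-bit doubles.
function spin(start, n) {
  let x = start;
  for (let i = 0; i < n; i++) x++;
  return x;
}

const N = 100_000_000;

let t0 = Date.now();
spin(0, N); // stays within int32 range the whole time
const intMs = Date.now() - t0;

t0 = Date.now();
spin(2 ** 31, N); // starts beyond int32 range: doubles internally
const dblMs = Date.now() - t0;

console.log(`int32 range: ${intMs} ms, double range: ${dblMs} ms`);
```

On a typical machine the second call takes noticeably longer, mirroring the 300 ms → 600 ms jump in the question.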

    In terms of behavior, this internal difference is unobservable: numbers always behave as if they were 64-bit doubles. But that doesn't mean that engines actually always use 64-bit doubles under the hood: as you can see here, there is a significant performance benefit when the engine can get away with using 32-bit integers internally.
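For example (a small sketch), crossing the int32 boundary changes nothing observable about the values themselves:

```javascript
// Crossing the int32 boundary is invisible at the language level:
// values compare and compute exactly the same either way.
const a = 2 ** 31 - 1; // 2147483647, fits in a 32-bit integer
const b = a + 1;       // 2147483648, no longer fits in int32

console.log(Number.isInteger(b)); // true
console.log(b - 1 === a);         // true: arithmetic stays exact
console.log(b + 0.5);             // 2147483648.5: same Number type throughout
```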

    choose [JavaScript or .net for] restful services which needed to iterate large arrays(about 2.1 billion)

    That's an easy decision: use .net. V8 (and hence Node) won't let you create arrays with 2.1 billion elements, because the per-object size limit is far lower than that. Sure, var a = new Array(2_100_000_000) will evaluate just fine, but that's because it doesn't actually allocate all that memory. Start filling in elements and watch it crash after a while :-)
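The gap between the spec's length limit and V8's heap limits can be sketched like this (don't actually fill the array unless you want to watch the process die):

```javascript
// The spec allows array lengths up to 2^32 - 1, so this line succeeds --
// it only records the length, it does not allocate 2.1 billion slots:
const a = new Array(2_100_000_000);
console.log(a.length); // 2100000000

// Lengths at or above 2^32 are rejected by the spec itself:
try {
  new Array(2 ** 32);
} catch (e) {
  console.log(e instanceof RangeError); // true
}

// Eagerly filling `a` (e.g. a.fill(0)), however, would need a real
// backing store far beyond V8's per-object limit and typically aborts
// the process with an out-of-memory error.
```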

    And if your actual arrays won't be that big after all, then please define a benchmark that's closer to your actual workload, because its results will be more representative and hence more useful for your decision-making.
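As a hypothetical starting point (the array size, element type, and operation below are placeholders, not a recommendation — substitute your service's real data shape), a workload-shaped benchmark might look like:

```javascript
// Hypothetical workload-shaped benchmark: iterate a realistically sized
// numeric array instead of a bare counting loop. The size (10 million)
// and the summing operation are assumptions for illustration only.
const data = new Float64Array(10_000_000);
for (let i = 0; i < data.length; i++) data[i] = i * 0.5;

const t0 = process.hrtime.bigint();
let sum = 0;
for (let i = 0; i < data.length; i++) sum += data[i];
const elapsedMs = Number(process.hrtime.bigint() - t0) / 1e6;

console.log(`sum = ${sum}, took ${elapsedMs.toFixed(1)} ms`);
```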