Tags: javascript, node.js, fs, node-streams, node.js-stream

JavaScript heap out of memory while creating a huge file using createWriteStream in node.js FATAL ERROR: Reached heap limit Allocation failed


const fs = require('fs');
const file = fs.createWriteStream('./big.file');

for (let i = 0; i < 1e8; i++) {
    file.write(`lorem ipsum ${i}`);
}

file.end();

The above code tries to create a huge file in Node.js using fs.createWriteStream. As far as I know, streams are collections of data that might not be available all at once and don't have to fit in memory. But when I run my script, its memory footprint keeps increasing and eventually causes a JavaScript heap out of memory error. My question is: am I missing something about streams, and why does this happen if streams don't have to fit in memory?

[Screenshot: memory usage before the script was run]

[Screenshot: memory usage after the script was run]

Error

<--- Last few GCs --->

[386723:0x6297e70]    42815 ms: Mark-sweep (reduce) 2046.9 (2081.1) -> 2046.2 (2081.6) MB, 1806.1 / 0.0 ms  (+ 94.3 ms in 13 steps since start of marking, biggest step 7.6 ms, walltime since start of marking 1911 ms) (average mu = 0.341, current mu = 0.10
[386723:0x6297e70]    45153 ms: Mark-sweep (reduce) 2047.7 (2081.8) -> 2046.9 (2082.1) MB, 1919.1 / 0.0 ms  (+ 123.1 ms in 16 steps since start of marking, biggest step 11.3 ms, walltime since start of marking 2057 ms) (average mu = 0.244, current mu = 0.

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb17ec0 node::Abort() [node]
 2: 0xa341f4 node::FatalError(char const*, char const*) [node]
 3: 0xcfe71e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xcfea97 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xee8d35  [node]
 6: 0xef7ab1 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
 7: 0xefad0c v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
 8: 0xec72bb v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [node]
 9: 0x123052b v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [node]
10: 0x16147d9  [node]
Aborted (core dumped)

Solution

  • I believe you'll need to check the return value of write(), stop writing when it returns false, and use the 'drain' event to know when it's safe to start writing again.

    Currently your loop queues writes into the stream's internal buffer much faster than the disk can drain them, so the buffered data grows without bound, which is why your script runs out of memory. (An async/await variant of the fix is sketched after the example below.)

    Something like this should work:

    const fs = require('fs');
    const file = fs.createWriteStream('./big.file');

    const max = 1e8;
    let i = 0;

    file.on('drain', function writeAsMuchAsPossible() {
      // Write until write() returns false (the internal buffer is full)
      // or until every chunk has been queued; each subsequent 'drain'
      // event re-invokes this handler to resume where it left off.
      while (i < max && file.write(`lorem ipsum ${i++}`));
      if (i === max) {
        file.end();
      }
    }).emit('drain'); // kick off the first round of writes
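
    For comparison, here is a minimal async/await sketch of the same backpressure handling (assuming Node.js 11.13+, where events.once is available): write until write() returns false, then await the next 'drain' event before queueing more data.

    const fs = require('fs');
    const { once } = require('events');

    async function writeBigFile() {
      const file = fs.createWriteStream('./big.file');
      for (let i = 0; i < 1e8; i++) {
        // write() returns false once the internal buffer passes highWaterMark;
        // pause the loop until the buffered data has been flushed to disk.
        if (!file.write(`lorem ipsum ${i}`)) {
          await once(file, 'drain');
        }
      }
      file.end();
    }

    writeBigFile();

    Either way, memory stays bounded: the stream never holds much more than its highWaterMark (16 KiB by default for writable streams) of unflushed data at a time.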