I am trying to create a 20MB file, but it throws an out-of-memory error. I set max-old-space-size to 2GB, but it still fails. Can someone explain why writing a 20MB stream consumes so much memory? I have 2.3GB of free memory.
const fs = require('fs');
const writeStream = fs.createWriteStream('./output.txt');

let size = 20 * 1024 * 1024; // 20MB
for (let i = 0; i < size; i++) {
  writeStream.write('A');
}
writeStream.end();
As mentioned in the Node documentation, a Writable stores data in an internal buffer. The amount of data that can be buffered depends on the highWaterMark option passed to the stream's constructor.

As long as the size of the buffered data is below highWaterMark, calls to Writable.write(chunk) will return true. Once the buffered data exceeds the limit specified by highWaterMark, write() returns false. This is when you should stop writing more data to the Writable and wait for the 'drain' event, which indicates that it is appropriate to resume writing.

Your program crashes because it keeps writing even after the internal buffer has exceeded highWaterMark.

Check the docs about the 'drain' event; they include an example program.
This also looks like a nice use case for Readable.pipe(Writable). You can create a generator function that yields a character, create a Readable from that generator using Readable.from(), and then pipe its output to a Writable file stream.
Piping is beneficial here because, per the stream documentation:

"A key goal of the stream API, particularly the stream.pipe() method, is to limit the buffering of data to acceptable levels such that sources and destinations of differing speeds will not overwhelm the available memory." (link)

and

"The flow of data will be automatically managed so that the destination Writable stream is not overwhelmed by a faster Readable stream." (link)
const { Readable } = require('stream');
const fs = require('fs');

const size = 20 * 1024 * 1024; // 20MB

// Generator that yields one 'A' per requested character.
function* generator(numberOfChars) {
  while (numberOfChars--) {
    yield 'A';
  }
}

const writeStream = fs.createWriteStream('./output.txt');
const readable = Readable.from(generator(size));

// pipe() manages backpressure automatically.
readable.pipe(writeStream);