Write a very large array to a file using csv-write-stream and fs stream.
I have a small app that writes a huge array of data (thousands of entries) to a CSV file. To do this I use the aforementioned library, which is essentially a thin convenience wrapper around an fs write stream. However, the application crashes at runtime and I can't work out why.
The file is created and the stream starts writing to it, but partway through the run I always hit the same error:
    events.js:141
          throw er; // Unhandled 'error' event
          ^

    Error: write after end
        at writeAfterEnd (_stream_writable.js:166:12)
        at WriteStream.Writable.write (_stream_writable.js:211:5)
        at ondata (_stream_readable.js:536:20)
        at emitOne (events.js:77:13)
        at emit (events.js:169:7)
        at Readable.read (_stream_readable.js:368:10)
        at flow (_stream_readable.js:751:26)
        at WriteStream.<anonymous> (_stream_readable.js:609:7)
        at emitNone (events.js:67:13)
        at WriteStream.emit (events.js:166:7)
I know this error is related to this piece of code:

    const csvWriter = require('csv-write-stream');
    const fs = require('fs');

    let writer = csvWriter({
      headers: ['country', 'postalCode']
    });
    let fsStream = fs.createWriteStream('output.csv');
    writer.pipe(fsStream);

    // very big array!
    let currCountryCodes = [{ country: 'A', postalCode: 0 }, { country: 'B', postalCode: 1 }];
    for (let j = 0; j < currCountryCodes.length; j++) {
      writer.write(currCountryCodes[j]);
    }

    writer.end(function() {
      fsStream.end(function() {
        console.log('=== CSV written successfully, stopping application ===');
        process.exit();
      });
    });
More specifically, to the execution of:

    fsStream.end(function() {
      console.log('=== CSV written successfully, stopping application ===');
      process.exit();
    });
But I can't understand why it happens. Strangely, it only happens when the array I am writing has thousands of entries; with only a dozen or so, it runs perfectly fine.
From reading the error I get the impression that I am still writing to the file after closing the stream, but that can't be the case, because those callbacks only run once everything has been flushed to the file (right?)
The issue is that you (understandably) assume that once the callback for writer.end() has been called, no data will be written to fsStream anymore, which is not the case (as the error suggests).

writer is a duplex stream, which means it is both readable and writable. By calling the .end() method, you are telling it that you won't be writing to it anymore. However, that does not mean it has stopped being readable: when you subsequently end fsStream, there can still be unflushed data in some buffer that will get piped to fsStream, yielding the error.
You can solve this by listening for the end event on writer, which is the readable-side equivalent of signaling that there is no data left to be read:

    writer.end(function() {
      writer.on('end', function() {
        fsStream.end(function() {
          console.log('=== CSV written successfully, stopping application ===');
          process.exit();
        });
      });
    });
EDIT: I think that when writer.end() gets called, fsStream will automatically be ended as well (so fsStream.end() is superfluous, and its callback may not get called because the finish event has already been emitted). You can listen for the finish event on fsStream if you want to make sure that all data has been flushed to the output file:

    writer.end(function() {
      fsStream.on('finish', function() {
        console.log('=== CSV written successfully, stopping application ===');
        process.exit();
      });
    });
Or even:

    fsStream.on('finish', function() {
      console.log('=== CSV written successfully, stopping application ===');
      process.exit();
    });
    writer.end();