Tags: node.js, csv, dump, fs, csv-write-stream

fs dump equivalent in Node.js?


Objective

Forcing fs (and the libraries that use it) to write everything to the files before terminating the application.

Background

I am writing an object to a CSV file using the npm package csv-write-stream.

Once the library is done writing the CSV file, I want to terminate my application using process.exit().

Code

To achieve the aforementioned objective, I have written the following:

const fs = require('fs');
const csvWriter = require('csv-write-stream');

let writer = csvWriter({
  headers: ['country', 'postalCode']
});

writer.pipe(fs.createWriteStream('myOutputFile.csv'));

// Very big array with a lot of postal code info
let currCountryCodes = [{country: 'Portugal', postalCode: '2950-286'}, {country: 'Barcelona', postalCode: '08013'}];

for (let j = 0; j < currCountryCodes.length; j++) {
  writer.write(currCountryCodes[j]);
}

writer.end(function() {
  console.log('=== CSV written successfully, stopping application ===');
  process.exit();
});

Problem

The problem here is that if I execute process.exit(), the library won't have time to write to the file, and the file will be empty.

Since the library uses fs under the hood, my idea was to force an fs.dump() or something similar in Node.js before exiting, but after searching I found nothing that does this.
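The closest candidate seems to be fs.fsync(), which flushes an open file descriptor's kernel buffers to disk; a minimal sketch (opening the file separately, purely for illustration) is below, but it would not help here because the unwritten rows sit in the stream's in-memory buffer, not in the kernel:

const fs = require('fs');

// Illustration only: fsync flushes the kernel's buffers for this file
// descriptor to disk, but it cannot flush data still held in a stream's
// internal (user-space) buffer.
const fd = fs.openSync('myOutputFile.csv', 'r+');
fs.fsyncSync(fd);
fs.closeSync(fd);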

Questions

  1. How can I force fs to dump (push) all the content to the file before exiting the application?
  2. If the first option is not possible, is there a way to wait for the write to finish and then close the application?

Solution

  • I think your guess is right. When you call process.exit(), the piped write stream hasn't finished writing yet.

    If you really want to terminate the application explicitly, this will do:

    let r = fs.createWriteStream('myOutputFile.csv');
    writer.pipe(r);
    
    ...
    
    writer.end(function() {
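      // the file stream's own end() callback fires only after its buffered data has been flushed to the file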
      r.end(function() {
        console.log('=== CSV written successfully, stopping application ===');
        process.exit();
      });
    });
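
    Equivalently, a sketch using only the standard streams API (same writer and r as above): end the writer and wait for the file stream's 'finish' event, which is emitted once all of its data has been flushed to the file.

    writer.end();

    r.on('finish', function() {
      console.log('=== CSV written successfully, stopping application ===');
      process.exit();
    });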