Tags: node.js, amazon-s3, zip, nodejs-stream, archiverjs

Archiver combine/pipe with PassThrough(), append and finalize not a function - Nodejs Stream


So, I am downloading files with the help of Axios streams, zipping them with archiver, and then I want to upload the zip to an S3 bucket. At first I saved the zip in a local directory, and everything worked fine that way. I used the multipipe lib to combine the streams, following Combine archiver zip stream with fs.createWriteStream:

function uploadFromStream(name) {
  const output = fs.createWriteStream(__dirname + `/${name}.zip`);

  const zip = combine(archive, output);
  return zip;
}

On the Axios end:

zip.append(response.data.pipe(new PassThrough()), { name: name });

Everything worked as expected; the zip was saved with all the files in it.

Then I researched how to upload to S3 with a stream and found this approach: Pipe a stream to s3.upload()

inputStream.pipe(uploadFromStream(s3));

function uploadFromStream(s3) {
  var pass = new stream.PassThrough();

  var params = {Bucket: BUCKET, Key: KEY, Body: pass};
  s3.upload(params, function(err, data) {
    console.log(err, data);
  });

  return pass;
}

I adapted this for archiver like this:

function uploadFromStream(bucket, destination) {
  var pass = new stream.PassThrough();

  var params = { Bucket: bucket, Key: destination, Body: pass };
  S3.upload(params, function (err, data) {
    console.log(err, data);
  });

  const s3Stream = combine(archive, pass);

  return s3Stream;
}

Now when I try to append, I get the error append is not a function, and likewise finalize is not a function. When I checked, I found that s3Stream.append and s3Stream.finalize are undefined. I didn't think that should happen, but it seems combining with the PassThrough drops archiver's methods from the combined stream.

Piping the streams like this also did not work:

  return archive.pipe(pass);

Solution

  • So after logging the streams many times, I realized that pipe does not merge the methods of two streams; it only creates a data flow between them. So I just need to pipe the archiver into the PassThrough and return the archiver itself, so files can still be appended to it.

    function uploadFromStream(bucket, destination) {
      const pass = new stream.PassThrough();

      // The upload reads from the PassThrough, which the archiver feeds.
      const params = { Bucket: bucket, Key: destination, Body: pass };

      S3.upload(params, function (err, data) {
        console.log(err, data);
      });
      archive.pipe(pass);

      // Return the archiver itself so append()/finalize() stay available.
      return archive;
    }
    

    By doing this I no longer hit the errors mentioned above. My mistake came from working with streams for the first time.