The context behind this question is that I am taking an image buffer, compressing it with pngquant, and then piping the compressed image to the response. Something like:
// https://www.npmjs.com/package/pngquant
const PngQuant = require('pngquant');
// start with base64-encoded png image data:
var base64data = '.......';
// then create buffer from this, as per:
// https://stackoverflow.com/a/28440633/4070848
// https://stackoverflow.com/a/52257416/4070848
var imgBuffer = Buffer.from(base64data, 'base64');
// set up pngquant...
const optionsArr = [ ..... ];
const myPngQuanter = new PngQuant(optionsArr);
// convert buffer into stream, as per:
// https://stackoverflow.com/a/16044400/4070848
const stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(imgBuffer);
// pipe the image buffer (stream) through pngquant (to compress it) and then to res...
bufferStream.pipe(myPngQuanter).pipe(res);
I want to determine the compression ratio achieved by the pngquant operation. I can easily find the starting size with:
const sizeBefore = imgBuffer.length;
I also need the size of the compressed stream, and this information must be available before the stream is piped to the res destination, because I need to add a header to res based on the compression stats.
To get sizeAfter I've tried the length-stream module, where you can insert a listener into the pipe (between myPngQuanter and res) to determine the length as it passes through. Whilst this does seem to work for determining the length of the compressed stream, it doesn't happen in time to add any headers to res. I've also tried stream-length, but cannot get it to work at all.
Any help appreciated.
Well, streams by their nature don't really have length information (a stream can be infinite, e.g. opening /dev/random), so the easiest option I can see is using another temporary buffer. It is unfortunate that pngquant doesn't have options for operating on buffers, but there is not much you can do about that, besides using a different package altogether.
Second edit, since stream-buffer might not work: there is a package called stream-to-array, which allows easy implementation of a stream-to-buffer conversion. As per its README, the code should be modified to:
const toArray = require('stream-to-array');
const util = require('util');
toArray(bufferStream.pipe(myPngQuanter))
  .then(function (parts) {
    const buffers = parts
      .map(part => util.isBuffer(part) ? part : Buffer.from(part));
    const compressedBuffer = Buffer.concat(buffers);
    console.log(compressedBuffer.length); // here is the size of the compressed data
    res.write(compressedBuffer);
  });
Or alternatively with await, if you happen to be in an async context:
const toArray = require('stream-to-array');
const util = require('util');
const parts = await toArray(bufferStream.pipe(myPngQuanter));
const buffers = parts.map(part => util.isBuffer(part) ? part : Buffer.from(part));
const compressedBuffer = Buffer.concat(buffers);
console.log(compressedBuffer.length); // here is the size of the compressed data
res.write(compressedBuffer);
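Once the compressed buffer is in hand, the ratio can be computed and attached before any body bytes go out. A sketch under two assumptions: sendWithStats is an illustrative helper, and "X-Compression-Ratio" is a made-up header name, not a standard one:

```javascript
// Set illustrative headers from the before/after sizes, then send the body.
// res here is anything with setHeader/end, e.g. an http.ServerResponse.
// "X-Compression-Ratio" is a made-up header name, not a standard one.
function sendWithStats(res, originalBuffer, compressedBuffer) {
  const ratio = compressedBuffer.length / originalBuffer.length;
  res.setHeader('X-Compression-Ratio', ratio.toFixed(3));
  res.setHeader('Content-Length', String(compressedBuffer.length));
  res.end(compressedBuffer);
  return ratio;
}
```

Because the whole payload is buffered first, the headers are guaranteed to be set before the body is written, which is exactly the ordering the question needs.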