node.js, stream, interprocess

Can you consume a NodeJS read stream from another process?


If I open a read stream in Node.js, can I then serve it up via the file system without actually writing a file to disk, so that another application could specify "somefile.json" and read it as if it were a normal file?


Solution

  • Yes. The data could be bounced via a network socket (a rough sketch of that approach is included at the end of this answer). However, if both processes run on the same Unix host, the data can be written to a named pipe from Node and then read by the other process.

    Named pipes (also called FIFOs) are a convenient way to provide cross-process communication through the OS's regular filesystem API.

    A named pipe can be created from NodeJS using a third-party lib and used as a writable stream sink:

    import fifo from "mkfifo";
    import util from "util";
    import crypto from "crypto";
    import stream from "stream";
    import fs from "fs";
    
    (async () => {
      // Create the named pipe (FIFO) on the filesystem; promisify so that a
      // mkfifo error rejects the promise instead of being silently swallowed
      await util.promisify(fifo.mkfifo)("fifo-file", 0o600);

      // Stream 100 newline-delimited JSON records into the FIFO
      await util.promisify(stream.pipeline)(
        function*(){
          for(let i=0;i<100;i++){
            yield JSON.stringify({ id: crypto.randomBytes(6).toString("hex") }) + "\n";
          }
        },
        fs.createWriteStream("fifo-file")
      );
    })()
      .catch(console.error);
    

    The script will pause on execution (due to backpressure) until some other process opens the named pipe and reads from it; a minimal example of such a reader follows.
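
    For illustration, the reading process could be as simple as running "cat fifo-file" in a shell, or a small NodeJS script that treats the pipe like an ordinary file (a minimal sketch; "fifo-file" is the pipe created above, and any language's normal file API would work just as well):

    import fs from "fs";
    import readline from "readline";

    (async () => {
      // Open the FIFO exactly like a regular file and consume it line by line
      const rl = readline.createInterface({
        input: fs.createReadStream("fifo-file"),
        crlfDelay: Infinity
      });

      for await (const line of rl) {
        console.log(JSON.parse(line).id);
      }
    })()
      .catch(console.error);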
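
    The network-socket alternative mentioned at the top works across hosts as well. A rough sketch of the writer side is shown below (the port number 8124 is an arbitrary choice for this example; the reading process would connect to that port and consume the socket as a read stream):

    import net from "net";
    import util from "util";
    import crypto from "crypto";
    import stream from "stream";

    // Serve the generated records to whichever process connects
    const server = net.createServer(socket => {
      util.promisify(stream.pipeline)(
        function* () {
          for (let i = 0; i < 100; i++) {
            yield JSON.stringify({ id: crypto.randomBytes(6).toString("hex") }) + "\n";
          }
        },
        socket
      ).catch(console.error);
    });

    server.listen(8124);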