Tags: node.js, async-await, stream, async-iterator

Does the Node.js "request" library support an async-iterable response stream?


I'm somewhat new to Node.js libraries and I'm trying to figure out how to use async iteration over an HTTP response stream. My overall goal is to read a large response stream and process it as chunks arrive, currently via a generator function. I cannot store the entire response in memory for processing.

I'm using the request library to execute the HTTP request as follows.

const request = require("request");

// contrived chunk-by-chunk stream processing 
async function* getChunks(stream) {
  for await (const chunk of stream) {
    yield chunk[0];
  }
}

async function doWork() {
  const response = request.get("https://pastebin.com/raw/x4Nn0Tby");
  for await (const c of getChunks(response)) {
    console.log(c);
  }
}

When I run doWork(), I get an error stating that the stream parameter of getChunks() is not async-iterable:

TypeError: stream is not async iterable

This is surprising, as I thought that all readable streams are generally async-iterable, and that the request library returns a stream when no callback is provided. When I replace request.get(...) with fs.createReadStream(...) on a local file, everything works as expected.
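For comparison, here is the file-based version that works as expected (the path is just a placeholder):

const fs = require("fs");

// Core readable streams implement Symbol.asyncIterator (Node.js 10+),
// so async iteration works out of the box here.
async function doWork() {
  const stream = fs.createReadStream("./some-local-file.txt"); // placeholder path
  for await (const c of getChunks(stream)) {
    console.log(c);
  }
}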

Perhaps the request library doesn't support this. If so, what do I need to do to process HTTP response streams via async-iteration?

Using Node.js 11.13 and request 2.88.0.


Solution

  • I did some more experimenting with the request and request-promise-native libraries and don't think this is possible under their current implementation. The resulting stream does not appear to be async-iterable at all; as far as I can tell, this is because request's Request object inherits from the legacy Stream base class rather than from stream.Readable, so it never picks up the Symbol.asyncIterator method that modern readable streams provide. Furthermore, a proper implementation will need to await the response before processing the stream (as suggested by @JBone's answer). But if you call await request.get(...), you retrieve the entire contents of the response, which is undesirable for large responses.

    const r = require("request");
    const rp = require("request-promise-native");
    
    // contrived chunk-by-chunk stream processing 
    async function* getChunks(stream) {
      for await (const chunk of stream) {
        yield chunk[0];
      }
    }
    
    async function doWork() {
      const url = "https://pastebin.com/raw/x4Nn0Tby";
      const response = r.get(url);         // returns a non-async-iterable Request object.
      const response2 = await rp.get(url); // resolves to the entire body of url as a string.
    
      for await (const c of getChunks(response)) {  // throws: stream is not async iterable.
        console.log(c);
      }
    }
    

    My solution to this problem was to replace usage of request and request-promise-native with the axios library. The libraries are functionally similar, but axios allows you to specify that a request should resolve to a stream; as expected, the stream is async-iterable.

    const axios = require("axios");
    
    async function doWork() {
      const response = await axios.request({
        method: "GET",
        url: "https://pastebin.com/raw/x4Nn0Tby",
        responseType: "stream",
      });
    
      for await (const c of getChunks(response.data)) {  // async-iteration over the response stream works as expected.
        console.log(c);
      }
    }
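
    Since response.data with responseType: "stream" is a standard Node.js readable stream, you can also iterate over it directly, without the getChunks wrapper. A minimal variant (the function name is just illustrative):

    const axios = require("axios");
    
    async function printChunkSizes() {
      const response = await axios.request({
        method: "GET",
        url: "https://pastebin.com/raw/x4Nn0Tby",
        responseType: "stream",
      });
    
      // response.data is a readable stream, so it is directly async-iterable.
      for await (const chunk of response.data) {
        console.log(chunk.length); // each chunk is a Buffer
      }
    }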