node.js connection-pooling

Generic Pool max and min not working


I'm trying to access Redis through a Node.js server and send data via an API. To reduce stress on the server since this will be called multiple times, I'm using connection pooling with the generic-pool library. I have configured everything, and it's working, but the connection pooling isn't functioning as expected. When I set the max value to 20 and send 30 requests asynchronously, all the requests remain in a pending state. I'm not sure what's going wrong.

Here's the relevant code:

var express = require("express");
var redis = require("redis");
var { createPool } = require("generic-pool");
var config = require("./config"); // holds REDIS_HOST, REDIS_PORT, REDIS_PASSWORD

var app = express();
app.use(express.json());

var PASSWORD = config.REDIS_PASSWORD;
var pool = createPool({
  create: () => {
    console.log("Creating new Redis client...");
    return new Promise((resolve, reject) => {
      const client = PASSWORD
        ? redis.createClient({
            host: config.REDIS_HOST,
            port: config.REDIS_PORT,
            password: PASSWORD,
          })
        : redis.createClient({
            host: config.REDIS_HOST,
            port: config.REDIS_PORT,
          });

      client.on("error", (err) => {
        console.error("Redis error:", err);
        reject(err);
      });

      client.on("ready", () => {
        console.log("Redis client ready");
        resolve(client);
      });
    });
  },
  destroy: (client) => {
    console.log("Destroying Redis client...");
    return new Promise((resolve) => {
      client.quit(() => {
        console.log("Redis client destroyed");
        resolve();
      });
    });
  },
  max: 20,
  min: 3,
});

const logPoolStats = () => {
  console.log(`Pool Size: ${pool.size}`);
  console.log(`Available: ${pool.available}`);
  console.log(`Borrowed: ${pool.borrowed}`);
  console.log(`Pending: ${pool.pending}\n`);
};

app.post("/executeCommand", async (req, res) => {
  const { redisKey, command, args = [] } = req.body;

  if (!redisKey || !command) {
    console.log("Invalid parameters received:", req.body);
    return res.status(400).json({ error: "Invalid parameters" });
  }

  let argumentsArray = Array.isArray(args) ? args : JSON.parse(args);
  argumentsArray = [redisKey, ...argumentsArray];

  let client;
  try {
    logPoolStats();
    client = await pool.acquire();
    logPoolStats();
    if (typeof client[command] === "function") {
      client[command](...argumentsArray, (err, result) => {
        pool.release(client);
        console.log("Client released back to pool");
        logPoolStats();
        if (err) {
          console.error(`Error executing command ${command}:`, err);
          return res
            .status(500)
            .json({ error: "Failed to execute Redis command" });
        }
        console.log(`Executed command ${command} with result:`);
        return res.json({
          command: command,
          data: result,
        });
      });
    } else {
      console.log(`Invalid Redis command attempted: ${command}`);
      pool.release(client);
      return res
        .status(400)
        .json({ error: `Invalid Redis command: ${command}` });
    }
  } catch (error) {
    if (client) {
      pool.release(client);
    }
    console.error("Failed to execute Redis command:", error);
    return res.status(500).json({ error: "Failed to execute Redis command" });
  }
});

I attempted to increase the maximum value for the connection pool, but even after doing so, the requests are still not being processed as expected. When I send asynchronous requests beyond the maximum limit, they remain in a pending state rather than being executed. This issue persists regardless of how I adjust the pool settings. I'm unsure about the underlying problem and would appreciate any insights or suggestions.


Solution

  • Your problem is client = await pool.acquire().
    You put await on the promise, so the handler waits for a connection before doing anything else, which means only one connection is at work at a time.
    Keep it as a promise and build the logic in a .then() block (or something similar), so the main thread keeps running and control switches between connections faster: finish whatever the process needs to do after getting the value back, release the client to the pool, and then the next command can acquire it and keep working.
    The way you built it is synchronous usage of a connection pool.

    Also, the options (e.g. max and min) should be a separate object passed to createPool as a second parameter.
    From the docs:

    const opts = {
      max: 10, // maximum size of the pool
      min: 2 // minimum size of the pool
    };
    
    const myPool = genericPool.createPool(factory, opts);
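To see why that second parameter matters, here is a minimal, self-contained stand-in for createPool. It is a from-scratch sketch, not generic-pool's actual code; it only mimics the (factory, opts) signature to show that sizing comes exclusively from opts (generic-pool's default max is 1), so max/min folded into the factory object are silently ignored and extra acquires just queue:

```javascript
// Minimal stand-in for generic-pool's createPool(factory, opts).
// NOT the real library code; it only mimics the sizing behavior.
function createPool(factory, opts = {}) {
  const max = opts.max ?? 1; // sizing comes only from opts; defaults to 1
  let size = 0;
  const idle = [];
  const waiters = [];

  return {
    acquire() {
      if (idle.length > 0) return Promise.resolve(idle.pop());
      if (size < max) {
        size += 1;
        return Promise.resolve(factory.create());
      }
      // Pool exhausted: this acquire stays pending until a release.
      return new Promise((resolve) => waiters.push(resolve));
    },
    release(client) {
      const waiter = waiters.shift();
      if (waiter) waiter(client);
      else idle.push(client);
    },
    get size() { return size; },
    get pending() { return waiters.length; },
  };
}

// max/min folded into the factory object: the pool never sees them,
// so max falls back to the default of 1 and extra acquires queue up.
const brokenPool = createPool({
  create: () => ({ id: "conn" }),
  max: 20, // ignored here
  min: 3,  // ignored here
});

// Correct shape: factory first, sizing options as a second argument.
const workingPool = createPool(
  { create: () => ({ id: "conn" }) },
  { max: 20, min: 3 }
);

brokenPool.acquire();
brokenPool.acquire(); // second caller is stuck waiting
console.log(brokenPool.size, brokenPool.pending);   // 1 1

for (let i = 0; i < 5; i++) workingPool.acquire();
console.log(workingPool.size, workingPool.pending); // 5 0
```

This matches the symptom you describe: with max effectively 1, 30 concurrent requests all wait on a single connection, so most of them sit in a pending state no matter what value you put inside the factory object.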