Tags: python-3.x, async-await, tornado

How does tornado deal with concurrency?


The Tornado-based web application runs with 4 forked sub-processes, as follows.

import asyncio

import tornado.httpserver
import tornado.netutil
import tornado.process
import tornado.web


async def post_fork_main():
    app = tornado.web.Application([(r'/', MyHandler)])
    server = tornado.httpserver.HTTPServer(app)
    server.add_sockets(sockets)
    await asyncio.Event().wait()   # keep the child process serving forever


if __name__ == '__main__':
    sockets = tornado.netutil.bind_sockets(8080)
    tornado.process.fork_processes(4)
    # Everything below runs in each of the 4 child processes.
    print("Service is started successfully ...")
    asyncio.run(post_fork_main())

The MyHandler class is:

class MyHandler(tornado.web.RequestHandler):
    async def get(self):
        # some other processing code.
        await call_api()   # replies very slowly.
        # some other processing code.
        self.write('OK')

What will happen if there are more than 4 requests to my service while the current 4 sub-processes are all waiting for the response of call_api()? Is the CPU almost idle?


Solution

  • The requests are queued: pending connections wait in the kernel backlog of the shared listening socket until one of the 4 processes is free to accept and handle them.

    You can test this yourself; for example, if the implementation of call_api() were:

    import time

    async def call_api():
        print("Dealing with request")
        time.sleep(1)   # blocking sleep: the event loop cannot serve other requests meanwhile
    

    You could run the following bash snippet, which sends 5 requests concurrently (each curl runs in the background, so the loop does not block waiting for a response):

    for i in {1..5}
    do
      curl localhost:8080/ &
    done
    

    Then you will see the server immediately print "Dealing with request" four times, and the fifth will be printed only after one of the first four requests has returned a response. A sketch of the contrasting non-blocking case is shown below.
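
    For contrast, here is a minimal sketch of the non-blocking case, assuming call_api is swapped for something that genuinely awaits; asyncio.sleep(1) is used as a hypothetical stand-in for the slow upstream API:

    import asyncio

    async def call_api():
        print("Dealing with request")
        # Awaiting instead of blocking: the event loop stays free,
        # so the same process can accept and serve other requests meanwhile.
        await asyncio.sleep(1)

    With this variant, all 5 curl calls print "Dealing with request" almost immediately and every response arrives after roughly one second. The CPU is indeed almost idle during the wait, but the processes are not stuck, so additional requests no longer queue behind one another.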