
Additional thread pools within Tomcat apps


I work with a Java webapp that runs on Apache Tomcat. The max threads for the Tomcat thread pool is 800, and minSpareThreads is 25. At runtime it usually sits at around 400 live threads at any given time.

Let's say I have a computationally expensive, non-blocking task to run in my Tomcat app, and I use ForkJoinPool.commonPool() to solve it more efficiently.

Because my Apache Tomcat app already has a large thread pool, does the Tomcat thread pool reduce the performance gains I would get from using a ForkJoinPool (or any thread pool, for that matter)? Could the cost of running the Tomcat thread pool alongside a ForkJoinPool negate those gains, given that there would now be far more threads than CPUs?

Is adding any sort of additional thread pool to an Apache Tomcat app bad for the performance of the entire application?


Solution

  • It's hard to give a general answer to this question, because it depends so much on the specific workload. There's no substitute for testing and profiling your own application. Here are some things to think about, however.

    Running a CPU-bound task in a separate thread pool isn't guaranteed to give you any performance benefit at all. There are two main reasons it might be beneficial:

    1. When one thread submits a task to an executor to run on a separate thread, it can then continue on to do other work concurrently.
    2. If the task can be broken down into multiple sub-tasks that are each run on separate threads, you can take advantage of parallel processing on multiple CPU cores to get it done faster.
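    As a sketch of point 2 (using a hypothetical array-sum task standing in for your computation), a CPU-bound job can be split into sub-tasks with a RecursiveTask so the common pool can run the halves on separate cores:

    ```java
    import java.util.concurrent.ForkJoinPool;
    import java.util.concurrent.RecursiveTask;

    // Hypothetical sub-task: sums a slice of an array, splitting until small enough
    class SumTask extends RecursiveTask<Long> {
        private static final int THRESHOLD = 1_000;
        private final long[] data;
        private final int lo, hi;

        SumTask(long[] data, int lo, int hi) {
            this.data = data; this.lo = lo; this.hi = hi;
        }

        @Override
        protected Long compute() {
            if (hi - lo <= THRESHOLD) {
                long sum = 0;
                for (int i = lo; i < hi; i++) sum += data[i];
                return sum;
            }
            int mid = (lo + hi) >>> 1;
            SumTask left = new SumTask(data, lo, mid);
            SumTask right = new SumTask(data, mid, hi);
            left.fork();                        // run the left half asynchronously
            long rightResult = right.compute(); // compute the right half on this thread
            return left.join() + rightResult;
        }
    }

    public class Main {
        public static void main(String[] args) {
            long[] data = new long[10_000];
            for (int i = 0; i < data.length; i++) data[i] = i;
            long sum = ForkJoinPool.commonPool().invoke(new SumTask(data, 0, data.length));
            System.out.println(sum); // prints 49995000
        }
    }
    ```

    Note that this only helps if the task genuinely divides into independent pieces; a task that can't be split gets no speedup from the pool's parallelism.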

    The costs of having more threads are:

    1. Memory allocated to each thread.
    2. Latency caused by context switching when the OS reallocates CPU time from one thread to another.

    The defaults for the Tomcat request thread pool size are based on the common situation that threads spend a lot of time blocking on I/O: reading the request over the network, making database queries and updates, and writing the response back to the client. Because these threads can't make use of all available CPU time, it is beneficial to have many more threads than CPU cores, so that runnable threads can use the CPU while others are blocked.
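    By contrast, ForkJoinPool.commonPool() is sized for CPU-bound work: its default parallelism is the number of available processors minus one (with a minimum of 1), which you can check directly:

    ```java
    import java.util.concurrent.ForkJoinPool;

    public class PoolSizes {
        public static void main(String[] args) {
            int cores = Runtime.getRuntime().availableProcessors();
            int parallelism = ForkJoinPool.commonPool().getParallelism();
            // Default parallelism is max(1, cores - 1), unless overridden via the
            // java.util.concurrent.ForkJoinPool.common.parallelism system property
            System.out.println(cores + " cores, common-pool parallelism " + parallelism);
        }
    }
    ```

    The contrast with Tomcat's default of hundreds of request threads reflects the two different workloads: mostly-blocked I/O threads versus always-runnable CPU workers.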

    So, a big question is what is invoking these tasks: is it a request thread? If so, what does that request thread do while the task is in progress? Is it making blocking I/O calls? Is it just waiting for the task to complete?

    If most of your requests invoke one of these CPU-intensive tasks and then block waiting for it to complete, and the tasks are not split up to run in parallel on multiple cores, then you might get no benefit at all from a separate thread pool. It may be better to avoid the overhead of context switching and run the tasks directly on the request thread. If this type of task dominates your workload, you might also want to reduce the number of request threads in the Tomcat thread pool, because your actual concurrency is limited by the available CPU time. Otherwise you can end up with a large number of waiting threads and high response latency, and those requests may then time out on the client side, wasting server resources on requests that ultimately fail.
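    A sketch of that point, with a hypothetical expensiveComputation() standing in for your task: if the request thread submits the work and immediately blocks on join(), offloading buys nothing over running the task inline, while still paying for the handoff and extra context switches:

    ```java
    import java.util.concurrent.ForkJoinPool;

    public class InlineVsOffload {
        // Hypothetical stand-in for the CPU-bound task
        static long expensiveComputation() {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
            return sum;
        }

        public static void main(String[] args) {
            // Offload to the common pool, then immediately block for the result:
            // the request thread does no other work in the meantime
            long offloaded = ForkJoinPool.commonPool()
                    .submit(() -> expensiveComputation())
                    .join();

            // Run inline on the request thread: same result, same latency,
            // without the submission and wake-up overhead
            long inline = expensiveComputation();

            System.out.println(offloaded == inline); // prints true
        }
    }
    ```

    Offloading starts to pay off only when the request thread has other useful work to do before joining, or when the task itself is decomposed to run on multiple cores.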