Tags: java, multithreading, spark-framework

Concurrency inside embedded Jetty of Spark Java


We have a simple Java REST API that is supported by Spark framework.

We have configured the thread pool as indicated in http://sparkjava.com/documentation#embedded-web-server, with the following snippet, which is called from the Java main method of our application:

int maxThreads = 8;
int minThreads = 2;
int timeOutMillis = 30000;
threadPool(maxThreads, minThreads, timeOutMillis);
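For context, here is a minimal sketch of how that configuration fits into a Spark application, assuming the standard `spark-core` dependency on the classpath. Note that `threadPool` (like `port`) must be called before the first route is mapped, otherwise Spark throws an IllegalStateException. The `/hello` route is only an illustrative placeholder.

```java
import static spark.Spark.*;

public class App {
    public static void main(String[] args) {
        // Pool configuration must precede route mapping.
        int maxThreads = 8;
        int minThreads = 2;
        int timeOutMillis = 30000;
        threadPool(maxThreads, minThreads, timeOutMillis);

        // Routes are mapped afterwards; each incoming request is
        // served by a thread from the embedded Jetty pool.
        get("/hello", (req, res) -> "Hello");
    }
}
```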

However, when we simulated simultaneous requests, we observed that the threads are created but the HTTP requests are queued and handled sequentially, although we expected them to be handled concurrently.

Is this normal? Is it Spark framework's usual behavior to have the server create the configured number of threads but make requests wait in a queue before they are effectively processed?


Solution

  • I have been investigating this, and what I know now is that this is the usual way to proceed, especially for tiny web frameworks like Spark Java.

    The reason behind it is to regulate the traffic to the server and avoid throttling.
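    To make the queuing concrete, here is a self-contained sketch using a plain `java.util.concurrent` fixed-size pool, which behaves like Jetty's QueuedThreadPool in this respect: once all worker threads are busy, additional tasks wait in a queue instead of running in parallel. The class and method names below are illustrative only, not part of Spark or Jetty.

    ```java
    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.atomic.AtomicInteger;

    public class PoolQueueDemo {

        // Runs `tasks` sleeping jobs on a pool of `poolSize` threads and
        // reports the highest number of jobs that ever ran at the same time.
        static int peakConcurrency(int poolSize, int tasks) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(poolSize);
            AtomicInteger active = new AtomicInteger();
            AtomicInteger peak = new AtomicInteger();
            CountDownLatch done = new CountDownLatch(tasks);

            for (int i = 0; i < tasks; i++) {
                pool.submit(() -> {
                    int running = active.incrementAndGet();
                    peak.accumulateAndGet(running, Math::max);
                    try {
                        Thread.sleep(100); // simulate request handling work
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } finally {
                        active.decrementAndGet();
                        done.countDown();
                    }
                });
            }
            done.await();
            pool.shutdown();
            return peak.get();
        }

        public static void main(String[] args) throws InterruptedException {
            // 6 "requests" against 2 worker threads: at most 2 run at
            // once, the other 4 wait in the queue until a thread frees up.
            System.out.println("peak concurrency = " + peakConcurrency(2, 6));
        }
    }
    ```

    The same pattern explains the behavior described in the question: requests beyond the pool's capacity are not rejected but queued until a worker thread becomes available.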