I am using the Java 11 HttpClient to post requests to an HTTP/2 server. The HttpClient object is created as a singleton Spring bean, as shown below.
@Bean
public HttpClient getClient() {
    return HttpClient.newBuilder()
            .version(Version.HTTP_2)
            .executor(Executors.newFixedThreadPool(20))
            .followRedirects(Redirect.NORMAL)
            .connectTimeout(Duration.ofSeconds(20))
            .build();
}
I am using the sendAsync method to send the requests asynchronously.
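The sending code looks roughly like the sketch below (simplified; the URI, payload and request count are placeholders, and client is the injected HttpClient bean):

import java.net.URI;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Build one POST request (placeholder endpoint and body):
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://example.com/api"))
        .POST(HttpRequest.BodyPublishers.ofString("{}"))
        .build();

// Fire requests in a tight loop without waiting for earlier ones to complete:
for (int i = 0; i < 10_000; i++) {
    client.sendAsync(request, HttpResponse.BodyHandlers.ofString())
          .thenAccept(response -> System.out.println(response.statusCode()));
}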
When I hit the server continuously, after a certain time I receive the error "java.io.IOException: too many concurrent streams". I used a fixed thread pool when building the client to try to overcome this error, but it still fails with the same exception.
The exception stack trace is:
java.util.concurrent.CompletionException: java.io.IOException: too many concurrent streams
at java.base/java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:367) ~[?:?]
at java.base/java.util.concurrent.CompletableFuture.uniComposeStage(CompletableFuture.java:1108) ~[?:?]
at java.base/java.util.concurrent.CompletableFuture.thenCompose(CompletableFuture.java:2235) ~[?:?]
at java.net.http/jdk.internal.net.http.MultiExchange.responseAsyncImpl(MultiExchange.java:345) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.MultiExchange.lambda$responseAsync0$2(MultiExchange.java:250) ~[java.net.http:?]
at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1072) ~[?:?]
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) ~[?:?]
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705) ~[?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.base/java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: java.io.IOException: too many concurrent streams
at java.net.http/jdk.internal.net.http.Http2Connection.reserveStream(Http2Connection.java:440) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.Http2ClientImpl.getConnectionFor(Http2ClientImpl.java:103) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.ExchangeImpl.get(ExchangeImpl.java:88) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.Exchange.establishExchange(Exchange.java:293) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.Exchange.responseAsyncImpl0(Exchange.java:425) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.Exchange.responseAsyncImpl(Exchange.java:330) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.Exchange.responseAsync(Exchange.java:322) ~[java.net.http:?]
at java.net.http/jdk.internal.net.http.MultiExchange.responseAsyncImpl(MultiExchange.java:304) ~[java.net.http:?]
Can someone help me fix this issue?
The server is Tomcat 9 and its max concurrent streams setting is left at the default.
"When I try to hit the server continuously"

The server has a max_concurrent_streams setting that is communicated to the client during the initial establishment of an HTTP/2 connection. If you blindly "hit the server continuously" using sendAsync, you are not waiting for previous requests to finish, so you eventually exceed the max_concurrent_streams value and receive the error above.
The solution is to send concurrently a number of requests that is less than max_concurrent_streams; after that, you only send a new request when a previous one completes. This can easily be implemented on the client using a Semaphore or something similar, as sketched below.
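A minimal sketch of such a limiter (the permit count of 50 is an arbitrary choice for illustration; pick a value below the max_concurrent_streams your server advertises):

import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Semaphore;

public class ThrottledSender {

    private final HttpClient client;
    // Keep the permit count below the server's advertised max_concurrent_streams.
    private final Semaphore permits = new Semaphore(50);

    public ThrottledSender(HttpClient client) {
        this.client = client;
    }

    public CompletableFuture<HttpResponse<String>> send(HttpRequest request) throws InterruptedException {
        permits.acquire();  // blocks the caller once the limit of in-flight requests is reached
        return client.sendAsync(request, HttpResponse.BodyHandlers.ofString())
                     .whenComplete((response, error) -> permits.release()); // free the slot on success or failure
    }
}

With this in place you can keep submitting requests from your loop: the submitting thread simply blocks once 50 streams are in flight and resumes as responses complete, so the client never tries to reserve more streams than the server allows.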