I have an old Java application running on Java 1.6 on Tomcat 6. Due to the way it is set up in the environment, it is hard to do any internal diagnostics; basically I can't touch it, so it is a black box.
The app is crashing due to a lack of free connections. The limit is set high (a maximum of 255 parallel connections), but it still crashes even when the number of open connections is only around 60.
netstat shows a lot of data sitting in the Recv-Q (just an example):
tcp 1464 0 localhost:7076 remote-host1:3120 ESTABLISHED
tcp 2512 0 localhost:7611 remote-host2:3120 ESTABLISHED
tcp 6184 0 localhost:4825 remote-host3:3120 ESTABLISHED
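A quick way to spot the affected sockets is to filter the netstat output by the Recv-Q column. A minimal sketch, assuming Linux netstat's column layout (Recv-Q is column 2, state is column 6); the 1000-byte threshold and the sample lines are made up for illustration — in practice you would pipe in `netstat -ant` directly:

```shell
# Sample lines mimicking the netstat output above; replace with: netstat -ant
netstat_output='tcp 1464 0 localhost:7076 remote-host1:3120 ESTABLISHED
tcp 12 0 localhost:7080 remote-host4:3120 ESTABLISHED
tcp 6184 0 localhost:4825 remote-host3:3120 ESTABLISHED'

# Print established connections whose Recv-Q exceeds the (assumed) threshold
printf '%s\n' "$netstat_output" \
  | awk '$2 > 1000 && $6 == "ESTABLISHED" { print $4, "Recv-Q:", $2 }'
```

Running this periodically (e.g. under `watch`) shows whether the Recv-Q on a given socket keeps growing, which would confirm the application has stopped reading from it.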
I couldn't find any useful hints about this case (a similar issue is described here: https://serverfault.com/questions/672730/no-connection-available-in-pool-netstat-recvq-shows-high-number).
The questions:
1) Why is the application not reading all the data it has received?
2) Because not all the data is read, another connection to the DB is opened. Am I right?
Any ideas will be appreciated.
1) What does the application do with the data it reads? Maybe it can't write to disk, it's waiting on some other condition, there's a thread lock, etc.
2) New connections are opened because the existing ones are still in use, regardless of the Recv-Q.
Regarding the number of connections, you should count half-closed connections too; these TCP states mean the connection is still active:
ESTABLISHED
FIN_WAIT_1
FIN_WAIT_2
TIME_WAIT
On Linux:
netstat -ant | grep -E 'ESTABLISHED|FIN_WAIT_1|FIN_WAIT_2|TIME_WAIT' | sort -k 6,6
To troubleshoot further, it's suggested to take thread and/or heap dumps and analyze them.
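On a JDK 6 installation the dumps can be taken from outside the application, which fits the "can't touch it" constraint. A sketch of the usual commands; the PID-lookup pattern and the output paths are assumptions, adjust them to your environment:

```shell
# Find the Tomcat JVM's PID (Bootstrap is Tomcat's standard entry-point class)
PID=$(pgrep -f org.apache.catalina.startup.Bootstrap | head -n 1)

# Thread dump: shows what every thread is doing (blocked? waiting on a lock?)
jstack "$PID" > /tmp/thread-dump.txt

# Heap dump: binary format, for analysis in e.g. Eclipse Memory Analyzer
jmap -dump:format=b,file=/tmp/heap.hprof "$PID"

# If jstack is unavailable, SIGQUIT makes the JVM print a thread dump
# to its stdout (typically catalina.out for Tomcat)
kill -3 "$PID"
```

If threads are stuck, several consecutive thread dumps taken a few seconds apart will show the same stack traces repeating, which usually points at the lock or condition they are waiting on.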
Another case involving TIME_WAIT.