I have between 1000 and 2000 webpages to download from a single server, and I am using goroutines and channels to fetch them efficiently. The problem is that nearly every time I run my program, up to 400 requests fail with the error "connection reset by peer". Rarely (maybe 1 out of 10 runs), no requests fail.
What can I do to prevent this?
One interesting thing is that when I ran this program on a server in the same country as the one hosting the website, 0 requests failed, so I am guessing the problem is related to latency (the program is now running on a server on a different continent).
The code I am using is basically just a simple `http.Get(url)` request, with no extra parameters and no custom client.
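For reference, a hypothetical reconstruction of roughly what that looks like (the `urls` slice and `results` channel are placeholders, not my actual code):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	urls := []string{ /* 1000-2000 URLs */ }
	results := make(chan string, len(urls))

	// One goroutine per URL, each doing a bare http.Get with the default client.
	for _, url := range urls {
		go func(url string) {
			resp, err := http.Get(url)
			if err != nil {
				results <- err.Error() // "connection reset by peer" shows up here
				return
			}
			defer resp.Body.Close()
			body, _ := io.ReadAll(resp.Body)
			results <- string(body)
		}(url)
	}

	for range urls {
		fmt.Println(<-results)
	}
}
```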
The message `connection reset by peer` indicates that the remote server sent an RST to forcefully close the connection, either deliberately as a mechanism to limit connections, or as a result of a lack of resources. Either way, you are likely opening too many connections, or reconnecting too fast.
Starting 1000-2000 connections in parallel is rarely the most efficient way to download that many pages, especially when most or all of them come from a single server. If you test the throughput, you will find an optimal concurrency level that is far lower.
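As a sketch of what that might look like, here is a bounded worker pool: a fixed number of goroutines pull URLs from a channel, instead of spawning one goroutine per URL. The worker count of 20 is an assumption; measure throughput to find your own optimum.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

func main() {
	urls := []string{ /* your 1000-2000 URLs */ }

	const workers = 20 // assumed starting point; benchmark to find the optimum
	jobs := make(chan string)

	var wg sync.WaitGroup
	wg.Add(workers)
	for i := 0; i < workers; i++ {
		go func() {
			defer wg.Done()
			for url := range jobs {
				resp, err := http.Get(url)
				if err != nil {
					fmt.Println(err)
					continue
				}
				// Drain and close the body so the connection can be reused.
				io.Copy(io.Discard, resp.Body)
				resp.Body.Close()
			}
		}()
	}

	for _, u := range urls {
		jobs <- u
	}
	close(jobs)
	wg.Wait()
}
```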
You will also want to set `Transport.MaxIdleConnsPerHost` to match your level of concurrency. If `MaxIdleConnsPerHost` is lower than the expected number of concurrent connections (the default is only 2), server connections will often be closed after a request, only to be immediately reopened -- this will slow your progress significantly and may hit connection limits imposed by the server.
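A minimal sketch of that configuration, assuming the 20-worker pool above (the worker count and URL are placeholders):

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	const workers = 20 // match this to your worker-pool size (assumed value)

	// The zero-value Transport uses sensible defaults for everything else;
	// MaxIdleConnsPerHost defaults to 2, far too low for 20 workers
	// hitting a single host.
	client := &http.Client{
		Transport: &http.Transport{
			MaxIdleConnsPerHost: workers,
		},
	}

	resp, err := client.Get("https://example.com/") // placeholder URL
	if err != nil {
		fmt.Println(err)
		return
	}
	resp.Body.Close()
}
```

Each worker would then call `client.Get(url)` instead of `http.Get(url)`, so every request goes through the shared, tuned transport.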