I am trying to run the following code,
import urllib
import urllib2

for parname in parss:
    data = {'action': 'listp', 'parish': parname}
    data = urllib.urlencode(data)
    req = urllib2.Request('http://www.irishancestors.ie/search/townlands/ded_index.php', data)
    response = urllib2.urlopen(req)
but a few minutes after the code starts executing I get the error below:
urllib2.URLError: <urlopen error [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
These are my proxy settings:
Any help is highly appreciated.
As discussed in the comments, executing a large number of requests in a very short time can lead the server, especially a web server, to block your connection attempts.
This is a common countermeasure against automated attacks on the web. Depending on the server, waiting a short time between requests should solve your problem.
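A minimal sketch of that fixed-delay approach, reusing the request code and the parss list from your question (the one-second pause is just an assumption, you may need a longer or shorter delay for this particular server):

import time
import urllib
import urllib2

URL = 'http://www.irishancestors.ie/search/townlands/ded_index.php'

for parname in parss:  # parss is your list of parish names from the question
    data = urllib.urlencode({'action': 'listp', 'parish': parname})
    response = urllib2.urlopen(urllib2.Request(URL, data))
    # ... process response.read() here ...
    time.sleep(1)  # pause between requests; 1 second is a guess, tune it for this server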
You could also use a more dynamic approach. First, execute as many requests as possible with no waits in between. If a request takes significantly longer than usual, it has most likely timed out and you have to wait. At that point, cancel the request, wait, and try again. If the subsequent try also results in a timeout, double the waiting time. With this procedure, called adaptive backoff, you should (hopefully) be able to access the data you want with minimal overhead.
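A rough sketch of that idea, built on the urllib2 calls from your question (the timeout value, initial wait, and retry limit are arbitrary numbers you would need to tune):

import socket
import time
import urllib
import urllib2

URL = 'http://www.irishancestors.ie/search/townlands/ded_index.php'

def fetch_with_backoff(parname, initial_wait=1, max_retries=5, timeout=10):
    """Fetch one parish listing, doubling the wait time after each timeout."""
    wait = initial_wait
    for attempt in range(max_retries):
        data = urllib.urlencode({'action': 'listp', 'parish': parname})
        try:
            # the timeout argument cancels a request that hangs for too long
            return urllib2.urlopen(urllib2.Request(URL, data), timeout=timeout).read()
        except (urllib2.URLError, socket.timeout):
            # the server is probably throttling us: wait, then double the wait
            time.sleep(wait)
            wait *= 2
    raise RuntimeError('giving up on %s after %d attempts' % (parname, max_retries))

for parname in parss:  # parss is your list of parish names from the question
    page = fetch_with_backoff(parname)

The timeout argument to urllib2.urlopen is what lets you treat a hanging request as a failed attempt instead of waiting indefinitely, and doubling the wait after each failure keeps the overhead low while still backing off quickly when the server starts refusing you.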