The following script works perfectly with a file containing 2 rows, but when I tried a 2,500-row file I got 429 exceptions. So I increased the delay between queries to 5 seconds, and I also filled in the user agent. After more unsuccessful attempts, I connected to a VPN to get a 'fresh' IP, but I got 429 errors again. Is there something I am missing here? Nominatim's policy specifies no more than 1 request per second, and I am doing one per 5 seconds... any help would be appreciated!
from geopy.geocoders import Nominatim
import pandas
from functools import partial
from geopy.extra.rate_limiter import RateLimiter
nom = Nominatim(user_agent="xxx@gmail.com")
geocode = RateLimiter(nom.geocode, min_delay_seconds=5)
df=pandas.read_csv('Book1.csv', engine='python')
df["ALL"] = df['Address'].apply(partial(nom.geocode, timeout=1000, language='en'))
df["Latitude"] = df["ALL"].apply(lambda x: x.latitude if x != None else None)
df["Longitude"] = df["ALL"].apply(lambda x: x.longitude if x != None else None)
writer = pandas.ExcelWriter('Book1.xlsx')
df.to_excel(writer, 'new_sheet')
writer.save()
Error message:
Traceback (most recent call last):
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\site-packages\geopy\geocoders\base.py", line 355, in _call_geocoder
page = requester(req, timeout=timeout, **kwargs)
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\urllib\request.py", line 531, in open
response = meth(req, response)
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\urllib\request.py", line 641, in http_response
'http', request, response, code, msg, hdrs)
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\urllib\request.py", line 569, in error
return self._call_chain(*args)
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\urllib\request.py", line 503, in _call_chain
result = func(*args)
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\urllib\request.py", line 649, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 429: Too Many Requests
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:/Users/u6022697/Documents/python work/Multiple GPS Nom Pandas.py", line 14, in <module>
df["ALL"] = df['Address'].apply(partial(nom.geocode, timeout=1000, language='en'))
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\site-packages\pandas\core\series.py", line 3849, in apply
mapped = lib.map_infer(values, f, convert=convert_dtype)
File "pandas\_libs\lib.pyx", line 2327, in pandas._libs.lib.map_infer
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\site-packages\geopy\geocoders\osm.py", line 406, in geocode
self._call_geocoder(url, timeout=timeout), exactly_one
File "C:\Users\u6022697\AppData\Local\Programs\Python\Python37\lib\site-packages\geopy\geocoders\base.py", line 373, in _call_geocoder
raise ERROR_CODE_MAP[code](message)
geopy.exc.GeocoderQuotaExceeded: HTTP Error 429: Too Many Requests
I've done reverse geocoding of ~10K different lat-lon combinations in less than a day. Nominatim doesn't like bulk queries, so the idea is to avoid looking like one. Here's what I suggest:
Make sure that you only query unique items. I've found that repeated queries for the same lat-lon combination are blocked by Nominatim, and the same can be true for addresses. You can use unq_address = df['Address'].unique()
and then query that array instead. You may even end up with far fewer addresses to look up.
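A minimal sketch of that unique-then-map pattern, runnable offline: fake_geocode below is a stand-in for a real geopy call (an assumption so the example doesn't hit the network); in real code you would swap in RateLimiter(nom.geocode, ...) in its place.

```python
import pandas as pd

calls = []  # records every address actually sent to the "geocoder"

def fake_geocode(address):
    # Stand-in for geolocator.geocode(address); returns a placeholder
    # instead of a geopy Location object.
    calls.append(address)
    return 'geocoded:{}'.format(address)

df = pd.DataFrame({'Address': ['175 5th Ave NYC',
                               '175 5th Ave NYC',      # duplicate row
                               'Champ de Mars Paris']})

unq_address = df['Address'].unique()   # 2 unique values from 3 rows
results = {addr: fake_geocode(addr) for addr in unq_address}
df['ALL'] = df['Address'].map(results) # map results back onto all rows

print(len(calls))  # → 2: one query per unique address, not per row
```

The duplicate rows still get their coordinates via .map(), but the service only ever sees each distinct address once.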
The time between queries should be random, and I also set the user_agent to a random number on every run. In my case, I use the following code:
import logging
from time import sleep
from random import randint
from geopy.geocoders import Nominatim
from geopy.exc import GeocoderTimedOut, GeocoderServiceError

user_agent = 'user_me_{}'.format(randint(10000, 99999))
geolocator = Nominatim(user_agent=user_agent)

def reverse_geocode(geolocator, latlon, sleep_sec):
    try:
        return geolocator.reverse(latlon)
    except GeocoderTimedOut:
        logging.info('TIMED OUT: GeocoderTimedOut: Retrying...')
        sleep(randint(1 * 100, sleep_sec * 100) / 100)
        return reverse_geocode(geolocator, latlon, sleep_sec)
    except GeocoderServiceError as e:
        logging.info('CONNECTION REFUSED: GeocoderServiceError encountered.')
        logging.error(e)
        return None
    except Exception as e:
        logging.info('ERROR: Terminating due to exception {}'.format(e))
        return None
I find that the line sleep(randint(1*100,sleep_sec*100)/100)
does the trick for me.
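To see what that expression actually produces, here is a small offline sketch that samples the delay values instead of sleeping (the helper name random_delay and the 1-5 second bounds are illustrative assumptions):

```python
from random import randint

def random_delay(min_sec, max_sec):
    # Same trick as sleep(randint(1*100, sleep_sec*100)/100):
    # randint over hundredths of a second yields delays like 2.37 s.
    return randint(min_sec * 100, max_sec * 100) / 100

# Sample the distribution rather than sleeping, to keep the sketch fast.
delays = [random_delay(1, 5) for _ in range(1000)]
print(min(delays) >= 1.0 and max(delays) <= 5.0)  # → True
```

Because every pause lands on a different hundredth of a second, the request timing never looks machine-regular the way a fixed 5-second interval does.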