In my Django project, I want to use the MaxMind DB file to look up information about IP addresses. The file is about 90 megabytes. I expect about 50 requests per second, and the application will be hosted on Gunicorn with 3 workers. Will I run into resource limits if I open the file on every request?
import maxminddb

with maxminddb.open_database('GeoLite2-City.mmdb') as mmdb:
    info = mmdb.get('152.216.7.110')  # get() returns a plain dict (or None)
    return Response({'city': info['city'], 'other': 'some_other_information_about_ip'})
or should I open the file in settings.py once:
MMDB = maxminddb.open_database('GeoLite2-City.mmdb')  # upper-case, so django.conf.settings exposes it
and use that reader in my views.py?
from django.conf import settings
....
info = settings.MMDB.get('152.216.7.110')
return Response({'city': info['city'], 'other': 'some_other_information_about_ip'})
I am concerned about several potential problems with each approach.
If you open the database per request with a "with" block, you'll be alright. The maxminddb module will not load the file into memory if it can avoid it (see the documentation). You shouldn't open it in settings.py, but in your view.
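If you want to be explicit about that, here is a minimal sketch (reusing the GeoLite2-City.mmdb path and the example IP from the question) that asks maxminddb to memory-map the file:

import maxminddb

# MODE_MMAP memory-maps the ~90 MB file instead of reading it all into the process;
# the default MODE_AUTO already prefers a memory-mapped reader when it can.
reader = maxminddb.open_database('GeoLite2-City.mmdb', maxminddb.MODE_MMAP)
info = reader.get('152.216.7.110')  # plain dict (or None for an unknown IP)
reader.close()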
You can also slap an lru_cache decorator on the mmdb-accessing code, so each worker will cache up to (e.g.) the 128 most recently used results. And, of course, you can replace that with the Django cache to share results between workers (a rough sketch of that follows the example below).
import functools

import maxminddb
from rest_framework.response import Response  # assuming Django REST framework


@functools.lru_cache(maxsize=128)
def get_ip_info(ip):
    # Each worker caches the 128 most recently requested IPs.
    with maxminddb.open_database("GeoLite2-City.mmdb") as mmdb:
        return mmdb.get(ip)  # plain dict of MaxMind data (or None)


def my_view(request):
    info = get_ip_info(...)  # pass the client IP here
    return Response(
        {
            "city": info["city"],
            "other": "some_other_information_about_ip",
        }
    )
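If you want the cached results shared between the 3 Gunicorn workers, the Django-cache variant could look roughly like this (the helper name get_ip_info_shared, the cache key format, and the one-hour timeout are arbitrary choices of mine, and it assumes a cross-process cache backend such as Redis or memcached is configured):

import maxminddb
from django.core.cache import cache


def get_ip_info_shared(ip):
    # Unlike lru_cache, a cache backend that lives outside the process
    # (Redis, memcached, ...) is reused by every Gunicorn worker.
    key = f"ip-info:{ip}"
    info = cache.get(key)
    if info is None:
        with maxminddb.open_database("GeoLite2-City.mmdb") as mmdb:
            info = mmdb.get(ip)  # plain dict, so the cache can pickle it
        cache.set(key, info, timeout=60 * 60)  # keep results for an hour
    return info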