I have a view in a Django application that takes a large CSV file as input. It loops over all the rows and inserts the data into the DB.
Locally I have no problems, but when I deploy the project online the view returns a timeout after a while.
My webserver configuration is Nginx + Gunicorn.
I tried increasing the timeout on Gunicorn and the proxy_connect_timeout / proxy_read_timeout on Nginx by setting a large value (120000). It's better now: I get a timeout after about 90 seconds instead of 30 (Gunicorn's default), but it's still not what I expected and not enough to finish my job.
I also don't like this approach very much: I don't want an infinite timeout for every request.
What's the best approach to handle long-running requests without hitting a timeout?
Maybe by responding to the user with a message right away and then running the job in the background? Any suggestions?
Thanks.
Use Celery with Django for background task processing, i.e. process the CSV file in an asynchronous Celery task. The view only saves the file and enqueues the task, then responds immediately.
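A minimal sketch of what that could look like, assuming Celery is already wired up in your project; `myapp`, `MyModel`, `import_csv` and the column names are placeholders for whatever your project actually uses:

```python
# myapp/tasks.py -- the heavy work happens in the Celery worker, not the web process.
import csv

from celery import shared_task

from myapp.models import MyModel  # placeholder model


@shared_task
def import_csv(path):
    """Read the uploaded CSV from disk and bulk-insert its rows."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        rows = [MyModel(name=r["name"], value=r["value"]) for r in reader]
    # bulk_create is much faster than saving row by row
    MyModel.objects.bulk_create(rows, batch_size=1000)


# myapp/views.py -- the view saves the file, enqueues the task and returns at once.
from django.core.files.storage import default_storage
from django.http import JsonResponse

from myapp.tasks import import_csv


def upload_csv(request):
    path = default_storage.save("uploads/data.csv", request.FILES["file"])
    import_csv.delay(default_storage.path(path))  # runs in the Celery worker
    return JsonResponse({"status": "processing"})
```

Note that the task is given the file path rather than the file object, because task arguments have to be serializable; `default_storage.path()` assumes local filesystem storage.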
OR
As a quick hack, if you don't want to use Celery, use multi-threading to process the CSV, save the result of the processing to the DB or a file, and serve the result from that DB or file. A rough sketch follows below.
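A rough sketch of that hack; again the module and function names are placeholders, and keep in mind that a plain thread dies if the Gunicorn worker is recycled, so this is only a stopgap:

```python
# views.py -- quick-and-dirty background thread (placeholder names throughout).
import threading

from django.core.files.storage import default_storage
from django.http import JsonResponse

from myapp.csv_import import process_csv  # your existing row-by-row import logic


def upload_csv(request):
    path = default_storage.save("uploads/data.csv", request.FILES["file"])
    # Process the file in a separate thread so the response returns right away.
    # daemon=True means the thread is killed if the worker process exits.
    threading.Thread(
        target=process_csv,
        args=(default_storage.path(path),),
        daemon=True,
    ).start()
    return JsonResponse({"status": "processing"})
```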
Note: never process big files on the main thread; always try to use a different server to process them. If a different server is not possible, then process them in a background task.