I've integrated django-import-export in my admin, and it is working as expected. I have created a resource for the model I want to export, and the export works fine for smaller datasets. However, on my (small) production environment, I have around 8k entries that I want to export, and my small server is not able to handle that. Whenever I try to export those records, it keeps loading for a while and I get a 504 error.
Right now my production environment is running on a t3.micro instance set up using AWS Elastic Beanstalk.
How can I manage exporting this many records?
Is there any way I can configure django-import-export to export them in batches? I can't set up a background worker right now: a client wants these files ASAP, and I don't have enough experience with background jobs to set one up on such short notice.
If you have access to the server itself, you could create a management command to export the data programmatically. You could then write the output to S3 or to a file, or even print it to the console and copy it from there. If you are short on time, this is probably your best option.
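A minimal sketch of that approach, with all model and field names illustrative: inside a management command's `handle()` you can call your resource class's `export()` (it returns a tablib `Dataset`, whose `.csv` property gives the whole file as a string) and write that to disk. If even building the full dataset in one go is too heavy for the t3.micro, the same command can stream rows in batches instead; the helper below shows that batching core using only the standard library:

```python
import csv
import io


def export_in_batches(rows, fieldnames, batch_size=1000):
    """Write an iterable of dict rows to CSV, flushing every batch_size rows.

    `rows` can be any iterable. In a Django management command it would be
    something like YourModel.objects.values(*fieldnames).iterator()
    (illustrative names), so the whole table never sits in memory at once.
    """
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fieldnames)
    writer.writeheader()
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            writer.writerows(batch)
            batch.clear()
    writer.writerows(batch)  # flush the final partial batch
    return buffer.getvalue()
```

Wrap this in a `BaseCommand` and run it on the server with `python manage.py <yourcommand>`; since no web request is involved, there is no 504 to hit.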
Using the UI, you can export in batches if you enable export via action:
Here, you select the records you want and then export them. That is probably not practical for 8k records as-is, but it becomes feasible if you increase the number of records shown per page. For example:
from import_export.admin import ExportActionMixin, ImportExportModelAdmin

# ExportActionMixin is declared to enable export via an admin action
class YourModelAdmin(ExportActionMixin, ImportExportModelAdmin):
    resource_classes = [YourModelResource]
    # show 1000 records per page so that only ~8 exports are required
    list_per_page = 1000
Another option for long-running exports is django-import-export-celery.
Finally, if you can log in to the database, a plain SQL query is perhaps the best way to do this.
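For instance, on Postgres a single `COPY (SELECT ...) TO STDOUT WITH CSV HEADER` run through `psql` exports everything server-side in one shot. The sketch below illustrates the same select-and-dump idea with the standard library's `sqlite3` so it stays self-contained; the table name and columns are made up:

```python
import csv
import io
import sqlite3

# In-memory stand-in for the real database (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yourapp_yourmodel (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO yourapp_yourmodel VALUES (?, ?)",
    [(1, "first"), (2, "second")],
)

# Plain SQL export: select everything and dump the rows as CSV.
cursor = conn.execute("SELECT id, name FROM yourapp_yourmodel ORDER BY id")
out = io.StringIO()
writer = csv.writer(out)
writer.writerow([col[0] for col in cursor.description])  # header row
writer.writerows(cursor)

print(out.getvalue())
```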