django, amazon-web-services, python, django-storages

django-storages with multiple S3 Buckets


I am using AWS, and my application will use several different buckets. I am also using django-storages. Is there a way to specify which bucket I want to upload a file to (for example, as a parameter to the save() call)?

I saw this related question, Django - Error importing storages.backends, but I don't understand how it is supposed to be used.


Solution

  • S3BotoStorage takes the bucket name as a parameter. If it isn't given, the backend falls back to the AWS_STORAGE_BUCKET_NAME setting. That means that if you want to make S3BotoStorage the default storage backend via DEFAULT_FILE_STORAGE, it must use the default bucket.
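
    For example, a minimal settings sketch (the backend path matches the import used below; the bucket name and credentials are placeholders):

    # settings.py
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_STORAGE_BUCKET_NAME = 'my-default-bucket'  # hypothetical default bucket
    AWS_ACCESS_KEY_ID = '...'
    AWS_SECRET_ACCESS_KEY = '...'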

    However you can also assign a storage on a field level:

    from django.db import models
    from storages.backends.s3boto import S3BotoStorage
    
    class MyModel(models.Model):
        file_1 = models.FileField()  # uses the default storage, i.e. the default bucket
        file_2 = models.FileField(storage=S3BotoStorage(bucket='other-bucket'))  # explicit bucket
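
    With that model, each field writes to its own bucket. A quick usage sketch (the file names and contents are illustrative):

    from django.core.files.base import ContentFile

    obj = MyModel()
    obj.file_1.save('notes.txt', ContentFile(b'goes to the default bucket'), save=False)
    obj.file_2.save('notes.txt', ContentFile(b'goes to other-bucket'), save=False)
    obj.save()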
    

    Edit:

    Comments are getting out of hand, so I'll update my answer. Changing the parameters of the storage backend on a per-instance basis is not something that the Django storage API was designed to do. The storage backend has no knowledge of the model instance, because storages can be used outside the context of a model, such as with static files. That's not completely unreasonable, but it's not a use case that Django or django-storages set out to solve, and I don't expect you to find a drop-in storage backend that handles it for you.

    The docs describe how you can manage files manually: https://docs.djangoproject.com/en/1.9/topics/files/#storage-objects. At a minimum, you would need to store the bucket where you saved each file so that you can find it later when you query the model.
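
    A hedged sketch of that manual approach (the model, method names, and the bucket bookkeeping are illustrative assumptions, not a django-storages API):

    from django.db import models
    from storages.backends.s3boto import S3BotoStorage

    class Upload(models.Model):
        stored_name = models.CharField(max_length=255)
        bucket_name = models.CharField(max_length=63)  # remember which bucket holds the file

        def save_file(self, name, content, bucket):
            # Storage.save() returns the name actually used, which may differ from `name`
            storage = S3BotoStorage(bucket=bucket)
            self.stored_name = storage.save(name, content)
            self.bucket_name = bucket

        def open_file(self):
            # Re-create a storage pointing at the recorded bucket to read the file back
            return S3BotoStorage(bucket=self.bucket_name).open(self.stored_name)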