I'm using django-storages with Amazon S3, with a configuration close to this guide: https://simpleisbetterthancomplex.com/tutorial/2017/08/01/how-to-setup-amazon-s3-in-a-django-project.html
Now I have a situation where I want to rename a lot of files when the model is saved. Before I implemented the S3 storage backend, I simply called os.rename:
os.rename(initial_path, new_path)
Since that obviously doesn't work with django-storages, is there a way of doing this differently using the storage's capabilities?
As you have probably found out by now, S3 only offers create, copy and delete operations, so a rename has to be implemented as copy-with-the-new-name followed by delete-of-the-old-key. You can then hook such a function in by overriding the model's save method OR by using Django signals for that model (a sketch of the save-override variant follows the pseudo code below).
Pseudo code:
def rename_s3(old_key, new_key):
    s3.copy(old_key, new_key)
    s3.delete(old_key)

def rename_multiple(dict_of_keys):
    # dict_of_keys maps old keys to new keys
    for old_key, new_key in dict_of_keys.items():
        rename_s3(old_key, new_key)
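As a rough illustration of the save-override option mentioned above (the Document model, its file field and the key comparison are assumptions for this sketch, not from the question):

from django.db import models

class Document(models.Model):
    file = models.FileField(upload_to="documents/")

    def save(self, *args, **kwargs):
        # On updates, compare the stored key with the new one and rename on change.
        if self.pk:
            old = Document.objects.filter(pk=self.pk).first()
            if old and old.file.name != self.file.name:
                rename_s3(old.file.name, self.file.name)
        super().save(*args, **kwargs)

The same check can live in a pre_save signal handler instead, which keeps the model class untouched.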
I can't remember exactly, but the actual copying code can look like this:
from boto3.session import Session

session = Session(aws_access_key_id="X", aws_secret_access_key="Y")
s3 = session.resource('s3')

source_bucket = s3.Bucket("bucket-A")
target_bucket = s3.Bucket("bucket-B")  # this can be "bucket-A" if you want to copy within the same bucket

old_key = "path/to/old/file/my.csv"
new_key = "path/to/new/file/my_new_name.csv"

# Server-side copy: the object data never leaves S3
target_bucket.copy(Key=new_key, CopySource={"Bucket": "bucket-A", "Key": old_key})
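To turn that copy into a rename, the old object still has to be deleted afterwards; with the same resource object that could look like this (only run it once the copy has succeeded):

# Remove the original so the whole operation behaves like a rename
s3.Object("bucket-A", old_key).delete()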
The solution presented above has the advantage of copying files between buckets belonging to different accounts (as long as account-B has read access to account-A).
Note that you can also use an accelerated copy via S3.Client.copy; however, it does not support copying files between different regions (see the note in the documentation).
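A minimal sketch of the S3.Client.copy variant, assuming credentials are resolved from the environment and the bucket/key names are just placeholders:

import boto3
from boto3.s3.transfer import TransferConfig

s3_client = boto3.client("s3")

# Managed copy; large objects are copied in parts according to TransferConfig
s3_client.copy(
    CopySource={"Bucket": "bucket-A", "Key": "path/to/old/file/my.csv"},
    Bucket="bucket-A",
    Key="path/to/new/file/my_new_name.csv",
    Config=TransferConfig(multipart_threshold=64 * 1024 * 1024),
)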
Some nice links:
Boto3 documentation: S3.Bucket.copy
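If you prefer to stay at the django-storages level instead of calling boto3 directly, a rename can also be sketched through Django's storage API; this streams the file through Django rather than copying server-side, so it is slower for large files, and the function name here is only illustrative:

from django.core.files.storage import default_storage

def rename_via_storage(old_name, new_name):
    # Re-upload the content under the new name, then drop the original.
    # Works with any storage backend, not only S3.
    with default_storage.open(old_name, "rb") as source:
        default_storage.save(new_name, source)
    default_storage.delete(old_name)

Keep in mind that default_storage.save may return a modified name if new_name is already taken, so the returned value is worth checking.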