I'm experiencing an issue where images uploaded through Django (using boto3) to SeaweedFS's S3 API are corrupted, while uploads through the S3 Browser desktop app work correctly. The uploaded files are 55 bytes larger than the originals and carry a Content-Encoding: aws-chunked header, which makes the images unopenable.
When uploading through the S3 Browser desktop app, the file is stored correctly.

When uploading through Django/boto3, the object is stored with:

Content-Encoding: aws-chunked

and the file body itself contains chunked-framing fragments such as:

100000
).x-amz-checksum-
Original file size: 12345 bytes
Uploaded file size: 12400 bytes (+55 bytes)
First bytes: 100000...
Last bytes: ...x-amz-checksum-crc32:SJJ2UA==
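Those extra bytes appear to be the aws-chunked transfer framing (a hex chunk-size line up front and an x-amz-checksum trailer at the end) stored verbatim instead of being decoded by the server. For objects that are already corrupted, here is a minimal repair sketch, assuming an unsigned chunked body with a trailing checksum like the dump above (the function name is mine, not from any library):

```python
def strip_aws_chunked(data: bytes) -> bytes:
    """Remove aws-chunked framing: hex chunk-size lines, CRLF
    separators, and the final x-amz-checksum trailer."""
    out = bytearray()
    pos = 0
    while pos < len(data):
        eol = data.index(b"\r\n", pos)
        header = data[pos:eol]
        # A signed chunk header carries ";chunk-signature=..."; the
        # size is the hex value before the ';'.
        size = int(header.split(b";")[0], 16)
        pos = eol + 2
        if size == 0:
            # Zero-size chunk marks the end; the checksum trailer
            # (e.g. x-amz-checksum-crc32:...) follows and is discarded.
            break
        out += data[pos:pos + size]
        pos += size + 2  # skip chunk data and its trailing CRLF
    return bytes(out)
```

This is only a recovery stopgap; the real fix is to stop boto3 from sending chunked bodies in the first place, as below.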
All of the following upload methods produce the same corrupted result:

# Method 1: Using ContentFile
storage.save(path, ContentFile(file_content))

# Method 2: Using a Django File object
storage.save(path, File(file))

# Method 3: Direct boto3 upload
client.upload_fileobj(f, bucket_name, path)
How can I disable the aws-chunked encoding in boto3? Any help or insights would be greatly appreciated 🙏!
import boto3

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
API_URL = ''
bucket_name = 'sample-bucket'

s3 = boto3.client(
    's3',
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    endpoint_url=API_URL,
)

testfile = r"image.png"
s3.upload_file(testfile, bucket_name, 'sample.png', ExtraArgs={'ContentType': 'image/png'})
Following the answer in https://github.com/boto/boto3/issues/4435#issuecomment-2648819900, I added these lines at the top of my settings.py file and the problem was solved:
import os
os.environ["AWS_REQUEST_CHECKSUM_CALCULATION"] = "when_required"
os.environ["AWS_RESPONSE_CHECKSUM_VALIDATION"] = "when_required"