I'm working with AWS S3 buckets using Boto3 in Python. I need to find the size of each bucket to monitor storage usage. I have found some examples using the boto3 library, but I'm not sure how to modify the code to retrieve the size of each individual bucket.
I currently have the following code snippet that lists the buckets:
import boto3
s3 = boto3.resource('s3')
buckets = list(s3.buckets.all())
for bucket in buckets:
    print(bucket.name)
But this only lists the bucket names. How can I extend this code to calculate and print the size of each bucket, or is there an alternative approach to achieve this using Boto3?
Any assistance or code examples would be highly appreciated. Thank you!
Amazon CloudWatch automatically collects metrics on Amazon S3, including BucketSizeBytes:
The amount of data in bytes stored in a bucket in the STANDARD storage class, INTELLIGENT_TIERING storage class, Standard-Infrequent Access (STANDARD_IA) storage class, One Zone-Infrequent Access (ONEZONE_IA) storage class, Reduced Redundancy Storage (RRS) class, or Glacier (GLACIER) storage class. This value is calculated by summing the size of all objects in the bucket (both current and noncurrent objects), including the size of all parts for all incomplete multipart uploads to the bucket.
See: Monitoring Metrics with Amazon CloudWatch - Amazon Simple Storage Service
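As a sketch of how you might query that metric with Boto3's CloudWatch client (the helper names and the two-day look-back window are my own choices; note that BucketSizeBytes is reported separately per storage class, so the `StandardStorage` dimension below only covers STANDARD objects):

```python
import datetime


def latest_average(datapoints):
    """Pick the Average value from the most recent datapoint, or 0.0 if there are none."""
    if not datapoints:
        return 0.0
    newest = max(datapoints, key=lambda d: d['Timestamp'])
    return newest['Average']


def print_bucket_sizes():
    """Print the CloudWatch BucketSizeBytes metric for every bucket in the account."""
    import boto3

    cloudwatch = boto3.client('cloudwatch')
    s3 = boto3.resource('s3')
    now = datetime.datetime.utcnow()

    for bucket in s3.buckets.all():
        response = cloudwatch.get_metric_statistics(
            Namespace='AWS/S3',
            MetricName='BucketSizeBytes',
            Dimensions=[
                {'Name': 'BucketName', 'Value': bucket.name},
                # The metric is emitted per storage class; repeat the call with
                # other StorageType values if you use other classes.
                {'Name': 'StorageType', 'Value': 'StandardStorage'},
            ],
            # CloudWatch reports this metric roughly once a day, so look
            # back two days to be sure at least one datapoint is captured.
            StartTime=now - datetime.timedelta(days=2),
            EndTime=now,
            Period=86400,
            Statistics=['Average'],
        )
        size = latest_average(response['Datapoints'])
        print(f'{bucket.name}: {size:.0f} bytes')


if __name__ == '__main__':
    print_bucket_sizes()
```

This avoids listing every object yourself, which matters for large buckets: iterating over `bucket.objects.all()` and summing `obj.size` also works, but costs one LIST request per 1,000 objects, while the CloudWatch metric is free to query.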