I need to copy data across Google Cloud Platform (GCP) Cloud Storage (GCS) buckets (both the source and the destination are GCS buckets). Since I perform the copy along with some other operations in small batches, I use the gsutil cp command from a bash shell script.
The exact command I use is as follows:
# objpaths_file has object paths as gs://source_bucket/obj1, ...
objlist=objpaths_file
cat "$objlist" | gsutil -m cp -I gs://target_bucket
The objects to be copied have custom metadata fields. Copying objects this way with "gsutil cp" does preserve custom metadata key/value pairs, provided the metadata key has an associated non-null value. If a custom metadata key has a null value, the copied object does not have that key in the destination (the key with the null value is dropped during the copy), as the comparison sketched below shows.
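One way to see this (a minimal sketch; obj1 and the bucket names are just the placeholders from the example above) is to compare the custom metadata of the source and copied objects with gsutil stat:
# Compare the "Metadata:" section of source and destination;
# custom keys with null values appear on the source but are missing on the copy.
gsutil stat gs://source_bucket/obj1
gsutil stat gs://target_bucket/obj1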
So my questions are:
And another less relevant question :-)
Thanks for your response
Yogesh
Since the gsutil cp command loses this metadata, use the gcloud storage cp command instead.
The command below copies all metadata fields, including keys with null values:
gcloud storage cp gs://<source bucket>/objectname gs://<target bucket>/
gcloud storage cp also accepts multiple source arguments, which behaves like the gsutil batch mode, so I could use:
gcloud storage cp gs://<source bucket>/object1 gs://<source bucket>/object2 ...<I tried 1000 objects> gs://<target bucket>/
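To drive this from the same objpaths_file as in the question, here is a minimal bash sketch (the batch size of 1000 paths per invocation is arbitrary, and gs://target_bucket/ is the placeholder from above) that uses xargs to build the argument batches:
# Group up to 1000 object paths per gcloud storage cp call;
# "$@" expands to the batch of source paths and the target bucket is appended last.
xargs -n 1000 sh -c 'gcloud storage cp "$@" gs://target_bucket/' _ < objpaths_file
Depending on your gcloud version, gcloud storage cp may also offer a --read-paths-from-stdin (-I) flag mirroring gsutil's -I, which would let the original cat-into-cp pattern carry over unchanged; check gcloud storage cp --help before relying on it.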