python · google-cloud-storage-transfer

Google Cloud Storage Transfer with a secured TSV URL list file


I am trying to transfer the content of a publicly available URL to a GCS bucket.

For that, I use the Google Cloud Storage Transfer API, which requires me to perform two steps:

  1. Create a .tsv file containing the list of my public URLs.
  2. Create the transfer (here using the Python API).
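Step 1 can be scripted. A minimal sketch (file name and URLs are placeholders, not the ones from the real job) that writes a URL list in the TsvHttpData-1.0 format expected by Storage Transfer Service:

```python
# Write a URL list file in the TsvHttpData-1.0 format: a header line
# followed by one publicly reachable URL per line.
urls = [
    "https://example.com/a.jpg",  # placeholder URLs
    "https://example.com/b.jpg",
]

with open("transfer.tsv", "w") as f:
    f.write("TsvHttpData-1.0\n")
    f.write("\n".join(urls) + "\n")
```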

To launch the script, I use a service account that has the Storage Object Admin role on both the bucket containing the transfer.tsv file and the sink bucket.

I can only seem to make it work when the transfer.tsv file is uploaded to a bucket that is publicly readable on the internet.

Do you know if it is possible to put it on a secured bucket, and to give permission to the service account that creates the transfer?

So far, all my attempts have yielded the following error.

error

PERMISSION_DENIED   1   
https://storage.googleapis.com/my-private-bucket/transfer.tsv   
Received HTTP error code 403.   

transfer.tsv

TsvHttpData-1.0
https://image.shutterstock.com/image-photo/portrait-surprised-cat-scottish-straight-260nw-499196506.jpg
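As an aside, each line of the URL list may optionally carry tab-separated size and base64-encoded MD5 columns, which Storage Transfer Service uses to validate each download. A small hypothetical helper (not part of any Google library) to build such a line:

```python
import base64
import hashlib

def tsv_line(url: str, data: bytes = None) -> str:
    # A bare URL is a valid line; the size in bytes and the base64-encoded
    # MD5 digest are optional extra columns used for integrity checks.
    if data is None:
        return url
    md5_b64 = base64.b64encode(hashlib.md5(data).digest()).decode()
    return f"{url}\t{len(data)}\t{md5_b64}"
```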

python script

from google.cloud import storage_transfer
from datetime import datetime

def create_one_time_http_transfer(
    project_id: str,
    description: str,
    list_url: str,
    sink_bucket: str,
):
    """Creates a one-time transfer job from Amazon S3 to Google Cloud
    Storage."""

    client = storage_transfer.StorageTransferServiceClient()

    # setting the start and end dates to the same day creates a one-time transfer
    now = datetime.now()
    one_time_schedule = {"day": now.day, "month": now.month, "year": now.year}

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "schedule": {
                    "schedule_start_date": one_time_schedule,
                    "schedule_end_date": one_time_schedule,
                },
                "transfer_spec": {
                    "http_data_source": storage_transfer.HttpData(list_url=list_url),
                    "gcs_data_sink": {
                        "bucket_name": sink_bucket,
                    },
                },
            }
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

And I call the function:

create_one_time_http_transfer(
        project_id="my-project-id",
        description="first transfer",
        list_url=tsv_url,
        sink_bucket="my-destination-bucket",
    )
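For clarity, the one-time schedule passed to the transfer job is just a date dict used for both the start and end dates. A small hypothetical helper (not part of the google-cloud-storage-transfer API) that builds it:

```python
from datetime import datetime

def one_time_schedule(when: datetime) -> dict:
    # Using the same date for schedule_start_date and schedule_end_date
    # makes Storage Transfer Service run the job exactly once.
    date = {"day": when.day, "month": when.month, "year": when.year}
    return {"schedule_start_date": date, "schedule_end_date": date}
```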

Solution

  • Found a way to make it work.

    When uploading the transfer.tsv file to storage, I return a signed URL instead of the public URL:

    from datetime import datetime, timedelta
    from google.cloud import storage
    
    def upload_to_storage(
        file_input_path: str, file_output_path: str, bucket_name: str
    ) -> str:
        gcs = storage.Client()
    
        # Get the bucket that the file will be uploaded to.
        bucket = gcs.get_bucket(bucket_name)
    
        # Create a new blob and upload the file's content.
        blob = bucket.blob(file_output_path)
    
        blob.upload_from_filename(file_input_path)
    
        # The expiration must lie in the future; passing datetime.now()
        # alone would yield an already-expired URL.
        return blob.generate_signed_url(datetime.now() + timedelta(hours=1))
    

    This signed URL is then passed as list_url to the create_one_time_http_transfer function mentioned above.