I have two GCloud accounts; call them x and y.
I ran the command gcloud config set account x,
since only account x has access to the gcloud project in question.
But every time I run a local trainer task such as:
python -m trainer.task --bucket=bucket-name --output_dir=outputdir --job-dir=./tmp --train_steps=200
I get the following error:
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "y does not have storage.objects.create access to bucket-name."
      }
    ],
    "code": 403,
    "message": "y does not have storage.objects.create access to bucket-name."
  }
}'
It seems that the command line is using account y even though I am logged in as x. I double-checked that I am logged into the correct account, the one with access to the right project.
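For reference, this is how I verified the active gcloud account (standard gcloud commands, nothing project-specific assumed):

# Show all credentialed accounts and which one is active
gcloud auth list
# Print just the currently active account
gcloud config get-value account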
I ran into this issue trying to upload some files to GCS through an API.
It seems that the client libraries use the Application Default Credentials (ADC), not the account you are signed in to with gcloud. The two are separate, so setting the active account with gcloud config set account does not change what the API calls use. Run:
gcloud auth application-default login
Followed by the auth flow.
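To confirm that ADC is now configured, you can run the command below; it prints an access token when credentials are in place:

gcloud auth application-default print-access-token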
That fixed it for me, as @joan grau noel mentioned. I had to go through a new auth flow in the browser to grant the Google Auth Library access. After pointing the default (ADC) credentials at the account I needed, the errors stopped.
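Alternatively, if you want ADC to use a service account rather than user credentials, you can point the GOOGLE_APPLICATION_CREDENTIALS environment variable at a key file before running the task (the key path below is hypothetical; the trainer command is the one from the question):

# Make ADC resolve to a service account key instead of user credentials
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
python -m trainer.task --bucket=bucket-name --output_dir=outputdir --job-dir=./tmp --train_steps=200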