I have a plumber API running on Cloud Run, and I would like to access files in a Google Cloud Storage bucket within the same project. The API runs fine, but I cannot get past authentication. I am trying to use the googleAuthR and gargle libraries, but I am doing something wrong.
Here is my api.R:
#* @get /
function() {
  text <- "Hello, this is from a Cloud Run function...."
  return(text)
}
#* @get /haveToken
function() {
  require(googleAuthR)
  require(googleCloudStorageR)
  require(gargle)
  text <- paste0("Do I have a token: ", gar_has_token())
  # gcs_auth(token = token_fetch())
  gar_auth(gar_gce_auth_default())
  text <- paste0(text, "<br>Do I have a token: ", gar_has_token())
  return(gcs_list_buckets(projectId = "<MYPROJECT>"))
}
My Dockerfile is:
FROM gcr.io/gcer-public/googlecloudrunner:master
COPY api.R .
ENTRYPOINT ["R", "-e", "pr <- plumber::plumb(commandArgs()[4]); pr$run(host='0.0.0.0', port=as.numeric(Sys.getenv('PORT')))"]
CMD ["api.R"]
I build and deploy with:
docker build -t gcs_test1 .
docker image tag gcs_test1:latest gcr.io/<MYPROJECT>/gcscr:latest
gcloud run deploy gcs-test1 --image=gcr.io/<MYPROJECT>/gcscr:latest --platform managed --allow-unauthenticated --service-account=gcs-sa@<MYPROJECT>.iam.gserviceaccount.com
My service account currently has the Editor role, but I would like to cut it down to Storage Admin + Cloud Run Service Agent.
If you have any advice on how to access GCS, I would greatly appreciate it.
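For reference, the scope-down I have in mind would be something like the following (the role IDs roles/storage.admin and roles/run.serviceAgent are my assumption of what those console names map to):

```shell
# Grant the custom service account Storage Admin on the project
gcloud projects add-iam-policy-binding <MYPROJECT> \
  --member="serviceAccount:gcs-sa@<MYPROJECT>.iam.gserviceaccount.com" \
  --role="roles/storage.admin"

# And the Cloud Run Service Agent role
gcloud projects add-iam-policy-binding <MYPROJECT> \
  --member="serviceAccount:gcs-sa@<MYPROJECT>.iam.gserviceaccount.com" \
  --role="roles/run.serviceAgent"
```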
Use googleAuthR::gar_gce_auth() to get a token from a machine running on Google Cloud (it pulls credentials for the attached service account from the metadata server, so no JSON key file is needed).
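A minimal sketch of the /haveToken endpoint rewritten this way — assuming the cloud-platform scope is sufficient for googleCloudStorageR and that your version of googleAuthR accepts a scopes argument to gar_gce_auth():

```r
#* @get /haveToken
function() {
  library(googleAuthR)
  library(googleCloudStorageR)

  # On Cloud Run / GCE this fetches a token for the deployed
  # service account from the metadata server.
  gar_gce_auth(scopes = "https://www.googleapis.com/auth/cloud-platform")

  # With a token in place, googleCloudStorageR calls should work.
  gcs_list_buckets(projectId = "<MYPROJECT>")
}
```

Note this replaces the gar_auth(gar_gce_auth_default()) call entirely; the token only exists inside the container on Google Cloud, so test it against the deployed service rather than locally.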