Locally, I'm able to grant a GAE project access to Drive/Sheets so that a Python script querying BigQuery can read data stored in Sheets.
I did this by running:
gcloud auth application-default login --scopes=openid,https://www.googleapis.com/auth/userinfo.email,https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/bigquery
When my code runs in the cloud, I get:
google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
How do I grant the default credentials in Google Cloud access to Drive/Sheets?
Thanks
You can use the app_engine module in google.auth
. This solution requires that you have enabled the legacy bundled services for Python 3, because google.auth.app_engine
makes use of app_identity
, which is a bundled service. A call is also made to the memcache bundled API.
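The bundled services are enabled with a single setting in app.yaml; a minimal sketch (the runtime version here is just an example, use whichever Python 3 runtime your app targets):

```yaml
runtime: python39
app_engine_apis: true
```

After adding the flag, redeploy the app with gcloud app deploy so the SDK wires in the bundled-services proxy.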
Sample code to list files in Google Drive is as follows (I tested this in production and it works). Part of this code is taken from Google's sample found here
import json

from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from google.auth import app_engine

# Create the credentials using the scopes you need.
# The call to Credentials optionally accepts a service account. If you
# don't provide one, the default App Engine service account is used.
creds = app_engine.Credentials(
    scopes=["https://www.googleapis.com/auth/userinfo.email",
            "https://www.googleapis.com/auth/drive"]
)


def list_drive_files():
    """Call this from your request handler."""
    try:
        # Create the Drive API client
        drive_client = build('drive', 'v3', credentials=creds)
        files = []
        # Get all the files in Drive that are accessible to the
        # service account
        response = drive_client.files().list().execute()
        for file in response.get('files', []):
            print(f'Found file: {file.get("name")}, {file.get("id")}')
        files.extend(response.get('files', []))
    except HttpError as error:
        print(f'An error occurred: {error}')
        files = None
    return json.dumps(files)