I am trying to create a BigQuery external table by reading a file from Google Drive - it works with an inline schema but fails with the --autodetect flag.
Document referenced:
https://cloud.google.com/bigquery/external-data-drive
Schema File:
$ bq mkdef --autodetect --source_format=CSV "https://drive.google.com/open?id=<file-id>" > schema.json
schema.json:
{
"autodetect": true,
"csvOptions": {
"encoding": "UTF-8",
"quote": "\""
},
"sourceFormat": "CSV",
"sourceUris": [
"https://drive.google.com/open?id=<file-id>"
]
}
External Table:
$ bq mk --external_table_definition=schema.json mydataset.mytable
BigQuery error in mk operation: Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
It works with an inline schema:
$ bq mk --external_table_definition=col1:INTEGER,col2:STRING@CSV=https://drive.google.com/open?id=<file-id> mydataset.mytable
Table 'myproject:mydataset.mytable' successfully created.
Note: I have enabled Google Drive access by using
$ gcloud auth login --enable-gdrive-access
The issue actually seems to be related to authentication. For what it's worth, gcloud uses a different OAuth token than bq.
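A quick way to see the two credential locations side by side (a sketch; it assumes .bigqueryrc uses the plain "credential_file = <path>" format the answer below refers to):

```shell
# Where gcloud keeps its own configuration and tokens:
gcloud info --format='value(config.paths.global_config_dir)'
# Where bq looks for its OAuth token, per the credential_file line in .bigqueryrc:
sed -n 's/^credential_file *= *//p' "$HOME/.bigqueryrc"
```

If the second path points at a token that was minted before Drive access was enabled, bq can keep failing even though gcloud itself is fine.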
I think the best course of action at this point is to:

1. Open $HOME/.bigqueryrc; there is a credential_file = ... line.
2. Delete the credential_file referenced in the previous step (on Linux/macOS it's probably something like .config/gcloud/...).
3. Re-run gcloud auth login --enable-gdrive-access --force; the OAuth window should ask you for permission to use GDrive too.
4. If it still doesn't work, you can look up what scopes your token has by previewing the file referenced by credential_file. It's a simple JSON file, and the scopes are just a list of URIs; there should be one ending in drive or drive.readonly.
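The scope check in the last step can be scripted; a minimal sketch, assuming the credential file stores its scopes as plain URI strings as described above:

```shell
# Pull the credential_file path out of .bigqueryrc, then list every
# Google OAuth scope URI stored in that file. For Drive access there
# should be a scope ending in drive or drive.readonly.
CRED_FILE=$(sed -n 's/^credential_file *= *//p' "$HOME/.bigqueryrc")
grep -o 'https://www\.googleapis\.com/auth/[a-z.]*' "$CRED_FILE"
```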