If I run the same query in the GCP BigQuery console UI it works, but with Airflow it fails with the error:
google.api_core.exceptions.BadRequest: 400 EXPORT to AWS S3 is only supported for tables present in BigQuery Omni AWS regions.; reason: invalidQuery, message: EXPORT to AWS S3 is only supported for tables present in BigQuery Omni AWS regions.
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="sql_dag",
    schedule=None,
):
    query = """
    EXPORT DATA WITH CONNECTION `aws-eu-west-1.export_to_s3`
    OPTIONS(uri="s3://XXXXXXX/export_1/*.json", format="json")
    AS SELECT 10, 10
    """
    BigQueryInsertJobOperator(
        task_id="bq_extract",
        location="EU",
        gcp_conn_id="bq",
        configuration={
            "query": {
                "query": query,
                "useLegacySql": False,
            }
        },
    )
You must not set a location for an EXPORT DATA query that uses a pre-existing BigQuery external connection. The job has to run in the connection's BigQuery Omni region (here aws-eu-west-1), and passing location="EU" forces it into the EU multi-region instead, which produces the error above. So remove or comment out the location argument:

    # location="EU",
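For reference, a minimal sketch of the corrected operator (reusing the query, connection, and placeholder bucket from the question): with location omitted, BigQuery resolves the job location from the Omni connection named in EXPORT DATA WITH CONNECTION.

    BigQueryInsertJobOperator(
        task_id="bq_extract",
        # no location arg: BigQuery derives it from the aws-eu-west-1 connection
        gcp_conn_id="bq",
        configuration={
            "query": {
                "query": query,
                "useLegacySql": False,
            }
        },
    )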