scala · apache-spark · snowflake-cloud-data-platform

Kill Snowflake queries from Spark Connector


Is there a way to kill Snowflake queries using the Spark connector? Alternatively, is there a way to grab the last query ID or session ID in Spark, so the query can be killed outside of Spark?

The use case is user-controlled, long-running Spark jobs that issue long-running Snowflake queries. When a user kills the Spark job, the current Snowflake query keeps running (for many hours).

Thank you


Solution

  • There isn't a direct way to kill Snowflake queries through the Spark connector itself.

    However, you can retrieve the ID of the running query and cancel it outside of Spark with SELECT SYSTEM$CANCEL_QUERY('<query_id>'); (note that SYSTEM$CANCEL_QUERY is a system function, so it is invoked with SELECT rather than CALL). A sketch of this approach follows.
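
Below is a minimal Scala sketch of that workflow, going through the Snowflake JDBC driver directly rather than the Spark connector. It is illustrative, not a definitive implementation: the connection placeholders (<account>, <user>, <password>, <warehouse>, <database>) are assumptions to replace with your own values, and the INFORMATION_SCHEMA.QUERY_HISTORY filter assumes the connecting role can see the Spark connector's queries. It lists currently running queries and cancels one by ID with SYSTEM$CANCEL_QUERY.

    import java.sql.{Connection, DriverManager}
    import java.util.Properties
    import scala.collection.mutable.ListBuffer

    object SnowflakeQueryKiller {

      // All connection details below are placeholders -- substitute your
      // own account, credentials, warehouse, and database.
      private def connect(): Connection = {
        val props = new Properties()
        props.put("user", "<user>")
        props.put("password", "<password>")
        props.put("warehouse", "<warehouse>")
        props.put("db", "<database>") // needed so INFORMATION_SCHEMA resolves
        DriverManager.getConnection(
          "jdbc:snowflake://<account>.snowflakecomputing.com/", props)
      }

      /** Return the IDs of queries currently running and visible to this role. */
      def runningQueryIds(): Seq[String] = {
        val conn = connect()
        try {
          // Exclude this lookup query itself from the results.
          val rs = conn.createStatement().executeQuery(
            """SELECT query_id, query_text
              |FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
              |WHERE execution_status = 'RUNNING'
              |  AND query_text NOT ILIKE '%QUERY_HISTORY%'""".stripMargin)
          val ids = ListBuffer[String]()
          while (rs.next()) ids += rs.getString("QUERY_ID")
          ids.toList
        } finally conn.close()
      }

      /** Ask Snowflake to cancel a specific query by its query ID. */
      def cancelQuery(queryId: String): Unit = {
        val conn = connect()
        try {
          val stmt = conn.prepareStatement("SELECT SYSTEM$CANCEL_QUERY(?)")
          stmt.setString(1, queryId)
          stmt.executeQuery()
        } finally conn.close()
      }
    }

A cleanup step (for example, a shutdown hook or the handler that reacts to the user killing the Spark job) could then call SnowflakeQueryKiller.runningQueryIds(), pick out the connector's queries, and pass each ID to cancelQuery so the Snowflake side stops shortly after the Spark side does.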