python · apache-spark · pyspark · apache-spark-sql · databricks

How to handle an AnalysisException on Spark SQL?


I am trying to execute a list of queries in Spark, but if a query does not run correctly, Spark throws the following error: AnalysisException: "ALTER TABLE CHANGE COLUMN is not supported for changing ...

This is part of my code (I'm using Python and Spark SQL on Databricks):

for index, row in df_tables.iterrows():
  query = row["query"]
  print("Executing query: ")
  try:
      spark.sql(query)
      print("Query executed")
  except (ValueError, RuntimeError, TypeError, NameError):
      print("Unable to process your query dude!!")
  else:
      # do another thing
      pass

Is there any way to catch that exception? Catching ValueError, RuntimeError, TypeError, and NameError doesn't seem to work, and there isn't much information about this on the Spark website.


Solution

  • You can modify the try/except statement as below:

    try:
      spark.sql(query)
      print("Query executed")
    except Exception as x:
      print("Unable to process your query dude!!" + \
            "\n" + "ERROR: " + str(x))