Tags: python, apache-spark, exception, pyspark, try-catch

How to import AnalysisException in PySpark


I can't find where to import AnalysisException from in PySpark so that I can catch it. For example:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Two columns share the name 'A', so select('A') is ambiguous
df = spark.createDataFrame([[1, 2], [1, 2]], ['A', 'A'])

try:
    df.select('A')
except AnalysisException as e:
    print(e)

Error message:

NameError: name 'AnalysisException' is not defined

Solution

  • You can import it from pyspark.sql.utils:

    from pyspark.sql.utils import AnalysisException
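
    If you are on PySpark 3.4 or newer, the same class should also be
    importable from the pyspark.errors module (the pyspark.sql.utils path
    is kept as an alias):

    from pyspark.errors import AnalysisException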
    

    The pyspark.sql.utils path also appears in the error traceback, for example:

    Traceback (most recent call last):
      ...
      File "<string>", line 3, in raise_from
    pyspark.sql.utils.AnalysisException: cannot resolve ...
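
    With the import in place, the snippet from the question prints the
    AnalysisException message instead of failing with a NameError. A minimal
    end-to-end sketch (the duplicate 'A' column names are what trigger the
    exception; the local session setup is just for illustration):

    from pyspark.sql import SparkSession
    from pyspark.sql.utils import AnalysisException

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([[1, 2], [1, 2]], ['A', 'A'])

    try:
        df.select('A')
    except AnalysisException as e:
        print(e)  # e.g. Reference 'A' is ambiguous, could be: A, A.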