python sql sql-server pyspark databricks

Why does appending data with PySpark raise a "SQLServerException: CREATE TABLE permission denied" exception?


In my Databricks cluster I'm trying to write a DataFrame to my table with the following code:

df.write.jdbc(url=JDBCURL, table=table_name, mode="append")

And this line fails with

Py4JJavaError: An error occurred while calling o53884.jdbc. :
com.microsoft.sqlserver.jdbc.SQLServerException: CREATE TABLE permission denied in database '[REDACTED]'.
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:265)
    at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1676)

What I don't understand is why: I'm not trying to create a table, I'm trying to append data to a table that already exists.

(EDIT: The table did not, in fact, exist.)

In this example, the user I specified in JDBCURL has SELECT, INSERT, UPDATE, and DELETE rights on the destination schema.


Solution

  • Databricks (Spark) will often try to do things automatically that you aren't explicitly asking it to do. Without your exact DataFrame and database table schemas it's hard to say for certain, but check for the following issues, either of which can result in Spark trying to create the table:

    1. Confirm the table actually exists. If you're sure it does, then...

    2. Confirm the schemas are a perfect match, column names and types both. A mismatch is the cause I see most often for this CREATE TABLE permission error.

    You can also log the exact commands being issued to the database for more advanced troubleshooting, but I've only needed to do that a handful of times.
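    To check both points in one pass, you can read the destination table back over the same JDBC connection and diff its dtypes against your DataFrame's. This is a sketch using the question's `spark`, `df`, `JDBCURL`, and `table_name`; the sample dtypes below are illustrative, not from your actual schema:

    ```python
    def schema_diff(df_dtypes, table_dtypes):
        """Return the (column_name, type_string) pairs present on one side only.

        Both arguments are lists of (name, type) pairs, the format
        returned by DataFrame.dtypes in PySpark. An empty result means
        the schemas match exactly.
        """
        return sorted(set(df_dtypes) ^ set(table_dtypes))

    # In practice, compare df.dtypes against the destination table:
    #     existing = spark.read.jdbc(url=JDBCURL, table=table_name)
    # (this read also confirms the table exists -- it raises if it doesn't),
    # then call schema_diff(df.dtypes, existing.dtypes).
    # Illustrative values standing in for the real dtypes:
    df_dtypes = [("id", "bigint"), ("name", "string"), ("amount", "double")]
    table_dtypes = [("id", "bigint"), ("name", "string"), ("amount", "decimal(10,2)")]

    print(schema_diff(df_dtypes, table_dtypes))
    # A non-empty diff means Spark may decide the target doesn't match and
    # attempt a CREATE TABLE on write.
    ```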