Tags: azure, apache-spark-sql, azure-synapse, spark-notebook

Can I run a stored procedure in a Synapse Spark pool?


I wanted to know how we can run a stored procedure, which I have created in a dedicated SQL pool, from a Spark pool (Azure Synapse). Also, can we run SQL queries from a notebook to access data in the dedicated SQL pool?


Solution

  • It is possible to do this (e.g. using an ODBC connection as described here; a pyodbc sketch of that approach appears at the end of this answer), but you would be better off just using a Synapse Pipeline to do the orchestration:

    1. run a Stored Procedure activity which places the data you want to work with in a relevant table or storage account
    2. call a Notebook activity that reads that data using the spark.read.synapsesql method, as described in detail here (see the second sketch at the end of this answer).

    The pattern:

    [Diagram: Synapse pipeline with a Stored Procedure activity followed by a Notebook activity]

    Is there a particular reason you are copying existing data from the SQL pool into Spark? I use a very similar pattern, but reserve it for things I can't already do in SQL, such as sophisticated transforms, RegEx, heavy maths, complex string manipulation, etc.
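
    For reference, here is a minimal sketch of the first approach: calling the dedicated SQL pool stored procedure directly from a Synapse notebook over ODBC, using pyodbc as one common way to make that connection (it is available on recent Synapse Spark runtimes, or can be added as a workspace package). The server, database, credentials and procedure name below are placeholders, not values from the question.

        import pyodbc

        # Connect to the dedicated SQL pool over ODBC.
        # All names/credentials are placeholders -- substitute your own, and
        # prefer pulling secrets from Azure Key Vault over hard-coding them.
        conn = pyodbc.connect(
            "Driver={ODBC Driver 17 for SQL Server};"
            "Server=tcp:<your-workspace>.sql.azuresynapse.net,1433;"
            "Database=<your_dedicated_pool>;"
            "Uid=<sql_user>;Pwd=<password>;"
            "Encrypt=yes;TrustServerCertificate=no;",
            autocommit=True,  # commit whatever the procedure does immediately
        )

        # Run the (hypothetical) stored procedure, then clean up.
        cursor = conn.cursor()
        cursor.execute("EXEC dbo.usp_prepare_spark_input")
        cursor.close()
        conn.close()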
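
    And a sketch of the second step: reading the prepared table into a Spark DataFrame with the Azure Synapse Dedicated SQL Pool connector. This assumes a Spark 3.x pool in the same workspace, where the connector is available by default and can authenticate with the notebook's AAD identity, so a three-part table name is usually all that is needed; the database, schema and table names are placeholders.

        # 'spark' is the SparkSession predefined in a Synapse notebook.
        # Read a dedicated SQL pool table into a Spark DataFrame; the
        # three-part name below is a placeholder.
        df = spark.read.synapsesql("<your_dedicated_pool>.dbo.spark_input")

        df.printSchema()
        df.show(10)

        # The same connector also supports writing results back, e.g.
        # df.write.synapsesql("<your_dedicated_pool>.dbo.spark_output")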