I want to execute a Databricks notebook's code via the Databricks API and get the notebook's output back as the response.
Is this possible, or is there any workaround for it?
Is the same possible with the Databricks SQL API?
You can either create a job via the UI and then trigger it with the run-now API, or use the run-submit API to create an ephemeral job. Both calls return a Run ID. You can then monitor the status via the get-run API and wait until the notebook has finished executing. After that, you can fetch the notebook's output with the get-output API.
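As a sketch of that flow, here is the run-submit → poll → get-output sequence against the Jobs API 2.1, using only the standard library. The workspace URL, token, notebook path, and cluster spec are all placeholders you would substitute with your own values:

```python
import json
import time
import urllib.request

# Placeholders -- substitute your workspace URL and a personal access token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

def api_url(endpoint: str) -> str:
    """Full URL for a Jobs API 2.1 endpoint such as 'runs/submit'."""
    return f"{HOST}/api/2.1/jobs/{endpoint}"

def submit_payload(notebook_path: str) -> dict:
    """runs/submit body creating an ephemeral one-task notebook job."""
    return {
        "run_name": "api-triggered-run",
        "tasks": [{
            "task_key": "notebook_task",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {                 # cluster spec is illustrative only
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 1,
            },
        }],
    }

def call(method: str, endpoint: str, params: dict) -> dict:
    """Minimal authenticated JSON call to the Jobs API."""
    data = json.dumps(params).encode() if method == "POST" else None
    url = api_url(endpoint)
    if method == "GET":
        url += "?" + "&".join(f"{k}={v}" for k, v in params.items())
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def run_notebook(notebook_path: str) -> str:
    # 1. Create the ephemeral job; the response contains a run_id.
    run_id = call("POST", "runs/submit", submit_payload(notebook_path))["run_id"]
    # 2. Poll runs/get until the run reaches a terminal state.
    while True:
        state = call("GET", "runs/get", {"run_id": run_id})["state"]
        if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            break
        time.sleep(10)
    # 3. Fetch the notebook's output via runs/get-output.
    out = call("GET", "runs/get-output", {"run_id": run_id})
    return out.get("notebook_output", {}).get("result", "")
```

Note that get-output returns only the value the notebook passes to `dbutils.notebook.exit(...)` (and the returned string is limited in size), not the rendered cell results.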
You can also execute individual commands on an existing cluster using the older Command Execution API 1.2.
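A sketch of that older flow, under the same placeholder assumptions (workspace URL, token, and an existing cluster ID are yours to fill in): create an execution context on the cluster, execute a command in it, then poll the command's status until it finishes.

```python
import json
import time
import urllib.request

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
CLUSTER_ID = "<existing-cluster-id>"                     # placeholder

def url_1_2(endpoint: str, **params) -> str:
    """URL for a legacy 1.2 endpoint, e.g. 'contexts/create'."""
    query = "&".join(f"{k}={v}" for k, v in params.items())
    return f"{HOST}/api/1.2/{endpoint}" + (f"?{query}" if query else "")

def post(endpoint: str, payload: dict) -> dict:
    req = urllib.request.Request(
        url_1_2(endpoint),
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def get(endpoint: str, **params) -> dict:
    req = urllib.request.Request(
        url_1_2(endpoint, **params),
        headers={"Authorization": f"Bearer {TOKEN}"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def execute_command(command: str) -> dict:
    # 1. Create an execution context on the existing cluster.
    ctx = post("contexts/create",
               {"language": "python", "clusterId": CLUSTER_ID})["id"]
    # 2. Run a single command inside that context.
    cmd = post("commands/execute",
               {"language": "python", "clusterId": CLUSTER_ID,
                "contextId": ctx, "command": command})["id"]
    # 3. Poll commands/status until the command reaches a terminal state.
    while True:
        status = get("commands/status", clusterId=CLUSTER_ID,
                     contextId=ctx, commandId=cmd)
        if status["status"] in ("Finished", "Cancelled", "Error"):
            return status.get("results", {})
        time.sleep(2)
```

This avoids creating a job at all, but it runs arbitrary commands rather than a whole notebook, so you would need to send the notebook's code yourself.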