In Databricks there is a magic command, %sh, that allows you to run bash commands in a notebook. For example, if I wanted to run the following code in Databricks:
pip install great_expectations
I would issue the following code:
%sh
pip install great_expectations
Can someone let me know what the equivalent is in an Apache Spark notebook in Azure Synapse? It may well be that this isn't possible in Azure Synapse, but I don't know.
Just to add to this question: in Databricks, when I run the following command
great_expectations init
the command remains stuck in a running state (see image).
However, on a regular Linux OS I would expect the same command to return
OK to proceed? [Y/n]:
Is there something I could add to
great_expectations init
to make the command return
OK to proceed? [Y/n]:
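A common way to answer such a prompt non-interactively is to pipe the answer into the command's standard input. The sketch below uses `read` as a stand-in for the prompt so it runs anywhere; whether `echo "Y" | great_expectations init` behaves the same way in a notebook cell is an assumption, not something verified against the great_expectations CLI:

```shell
# Piping input pre-answers an interactive prompt so the command never blocks.
# 'read' stands in for the prompt here; for the real case you would try
# something like:  echo "Y" | great_expectations init   (assumption, untested)
echo "Y" | sh -c 'read answer; echo "received: $answer"'
```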
To install packages, just run pip install preceded with a %, right in the Synapse notebook, like this:
%pip install great_expectations
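As a quick sanity check that a %pip install took effect in the current session, you can probe for the module with the standard library. A minimal sketch (the module names below are purely illustrative):

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the module can be imported in the current session."""
    return importlib.util.find_spec(module_name) is not None

# After %pip install great_expectations succeeds, that package becomes
# importable; "json" is used here only because it is always present.
print(is_installed("json"))  # prints True
```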
I am also able to run shell commands in the notebook using %%sh. For example:
%%sh
lsb_release -a
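One thing to keep in mind, based on how Jupyter-style cell magics usually behave (so treat this as an assumption for Synapse specifically): each %%sh cell runs in its own subprocess, so working-directory changes and environment variables do not persist into later cells:

```shell
# State set inside a %%sh cell, such as this variable and directory
# change, lasts only for the cell's own subshell.
export DEMO_VAR="only visible here"
cd /tmp
echo "$DEMO_VAR (cwd: $(pwd))"
```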