Is it possible to use the Scala worksheet in IntelliJ as an alternative to a Jupyter notebook? I followed the solution mentioned here, but it runs the code locally rather than on a remote cluster. My main challenge is that IntelliJ is running locally on my laptop while the Spark cluster is in the cloud. How do I tell IntelliJ to use the remote SBT?
If you are just trying out a tutorial in a more interactive/ad hoc fashion to learn Spark programming and concepts, the approach in the link you mentioned is the way to do it. In that mode you are essentially simulating a single-node Spark cluster (i.e. your local machine), which acts as both the driver and the executor, all in one.
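As a minimal sketch of what that local, worksheet-style setup looks like (assuming the `spark-sql` dependency is on your worksheet/project classpath; the app name and the toy query are just illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Local, single-JVM Spark session: the worksheet process is both driver and executor.
val spark = SparkSession.builder()
  .appName("worksheet-playground") // illustrative name
  .master("local[*]")              // use all local cores; no remote cluster involved
  .getOrCreate()

// A toy computation just to exercise the session.
val df = spark.range(1, 1000).toDF("n")
df.selectExpr("sum(n) as total").show()

spark.stop()
```

Note that `master("local[*]")` is exactly why the code runs on your laptop: the master URL points at the local JVM, not at a remote cluster.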
However, that's not how you'd actually submit and run a Spark application on an ACTUAL Spark cluster in a real-world scenario. For that, you need one of the two deploy modes Spark offers, client mode or cluster mode, and you use the spark-submit command-line utility to send your compiled application (packaged as a fat/assembly JAR) to the cluster as a Spark job. More details here.
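A rough sketch of such a submission, where the class name, master URL, resource settings, and JAR path are all placeholders for your own project and cluster:

```
# Build the assembly ("fat") JAR first, e.g. with the sbt-assembly plugin:
#   sbt assembly
#
# Then submit it to the cluster (all values below are placeholders):
spark-submit \
  --class com.example.MyApp \
  --master spark://my-cluster-host:7077 \
  --deploy-mode cluster \
  --executor-memory 4G \
  target/scala-2.12/my-app-assembly-0.1.jar
```

The deploy mode matters for your situation: in client mode the driver runs on the machine you submit from, while in cluster mode the driver itself runs inside the cluster, which is usually what you want when your laptop sits outside the cluster's network.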