apache-spark, pyspark, hdfs, spark-shell

Why do I get a "spark-shell: Permission denied" error in my Spark setup?


I am new to Apache Spark. I am trying to set up Apache Spark on my MacBook. I downloaded the file "spark-2.4.0-bin-hadoop2.7" from the official Apache Spark web site.
When I try to run ./bin/spark-shell or ./bin/pyspark I get a "Permission denied" error.
I just want to run Spark on my local machine.
I also tried giving permissions to all the folders, but it does not help. Why do I get this error?
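For context, the kind of permission fix attempted above would look roughly like this (a sketch, assuming the downloaded archive was extracted into the current directory; the directory name matches the download mentioned in the question):

```shell
# From the directory where the Spark archive was extracted:
cd spark-2.4.0-bin-hadoop2.7

# Make the launcher scripts executable
chmod +x bin/*

# Check the result: the permission string should include "x"
ls -l bin/spark-shell
```

If the scripts already have the executable bit set, the "Permission denied" error usually points at something else, such as the path Spark is being launched from.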


Solution

  • I solved this issue by adding the /libexec folder to the Spark home path.

    Set $SPARK_HOME to

    /usr/local/Cellar/apache-spark/<your_spark_version>/libexec
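In shell-profile form, the setting above would look roughly like this (a sketch assuming a Homebrew install of Spark; the version directory 2.4.0 is only an example, substitute whatever version is actually installed):

```shell
# Point SPARK_HOME at the Homebrew libexec directory (example version shown)
export SPARK_HOME="/usr/local/Cellar/apache-spark/2.4.0/libexec"

# Put the Spark launcher scripts on the PATH
export PATH="$SPARK_HOME/bin:$PATH"
```

After adding these lines to ~/.bash_profile (or ~/.zshrc) and opening a new shell, spark-shell and pyspark should resolve from any directory.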