I am trying to execute a Spark application, built in Scala IDE, against my standalone Spark service running on the Cloudera QuickStart VM 5.3.0.
My cloudera account's JAVA_HOME is /usr/java/default.
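For reference, this is how the value can be checked from the cloudera account (plain shell, nothing CDH-specific):

echo $JAVA_HOME               # prints /usr/java/default
ls -l $JAVA_HOME/bin/java     # confirm the JDK binary actually exists there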
However, I am getting the error below when I run the start-all.sh script as the cloudera user:
[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password:
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out' for reading: No such file or directory
localhost: full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
I had already added export CMF_AGENT_JAVA_HOME=/usr/java/default to /etc/default/cloudera-scm-agent and run sudo service cloudera-scm-agent restart (see How to set CMF_AGENT_JAVA_HOME).
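Roughly, that change looked like this (only the line I added is shown; the rest of the file is untouched):

# appended to /etc/default/cloudera-scm-agent
export CMF_AGENT_JAVA_HOME=/usr/java/default

# then restart the agent so it picks up the new value
sudo service cloudera-scm-agent restart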
I had also added export JAVA_HOME=/usr/java/default inside the locate_java_home function definition in /usr/share/cmf/bin/cmf-server, then restarted the cluster and the standalone Spark service.
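That edit was along these lines (a sketch only; the body of locate_java_home is abbreviated here and may differ between CDH releases):

locate_java_home() {
    # force the JDK path explicitly instead of relying on auto-detection
    export JAVA_HOME=/usr/java/default
    # ... original detection logic of the script continues here ...
}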
But the error below keeps coming back when I start the Spark service as the root user:
[root@localhost spark]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
failed to launch org.apache.spark.deploy.master.Master:
JAVA_HOME is not set
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
root@localhost's password:
localhost: Connection closed by UNKNOWN
Can anybody suggest how to set JAVA_HOME so as to start the standalone Spark service on Cloudera Manager?
The solution turned out to be quite easy and straightforward: I just added export JAVA_HOME=/usr/java/default to /root/.bashrc, and the Spark services started successfully from the root user without the "JAVA_HOME is not set" error. Hope it helps somebody facing the same problem.
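For completeness, these are the commands I mean, run as root (source just reloads the profile in the current shell; the Spark directory is the same parcel path as above):

echo 'export JAVA_HOME=/usr/java/default' >> /root/.bashrc
source /root/.bashrc
cd /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark
sbin/start-all.sh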