I'm trying to run the Spark examples from Eclipse and I keep getting this generic error:

> Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources.
The version I have is spark-1.6.2-bin-hadoop2.6.
I started the Spark master with `./sbin/start-master.sh` from a shell, and set up my SparkConf like this:

```java
SparkConf conf = new SparkConf().setAppName("Simple Application");
conf.setMaster("spark://My-Mac-mini.local:7077");
```
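For reference, a minimal driver that triggers the error looks like this (the class name and the toy dataset are just for illustration):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;

public class SimpleApp {
    public static void main(String[] args) {
        // Same configuration as above; the master URL points at the
        // standalone master started with start-master.sh.
        SparkConf conf = new SparkConf().setAppName("Simple Application");
        conf.setMaster("spark://My-Mac-mini.local:7077");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // A trivial job: count a small in-memory dataset. Even this is
        // enough to produce the "Initial job has not accepted any
        // resources" warning.
        JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3, 4));
        System.out.println("count = " + rdd.count());

        sc.stop();
    }
}
```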
I'm not including any other code here because this error pops up with every example I run. The machine is a Mac running OS X, and I'm pretty sure it has enough resources for even the simplest examples.
What am I missing?
The error indicates that your cluster has insufficient resources for the current job. Since you have not started any slaves (workers), the cluster has no resources to allocate to your job. Starting a slave should fix it:

`./sbin/start-slave.sh spark://master-ip:7077`
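For example, with everything on one machine, a minimal sequence looks like this (substitute your actual master URL, which the master prints in its log and shows in its web UI at http://localhost:8080):

```sh
# Start the standalone master; it logs the spark://host:port URL
# that workers and drivers register against.
./sbin/start-master.sh

# Start one worker and register it with the master.
./sbin/start-slave.sh spark://My-Mac-mini.local:7077

# Optionally limit what the worker offers, e.g. 2 cores and 2 GB of memory:
# ./sbin/start-slave.sh spark://My-Mac-mini.local:7077 --cores 2 --memory 2g
```

Once the worker shows up under "Workers" in the master UI with free cores and memory, resubmit your job.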