I have an Apache Spark 1.6.1 standalone cluster set up on a single machine whose CPU has 4 physical cores.
I didn't set anything, so Spark takes the default values; for cores, the default is "all the available cores". Based on that, my question is:

Why is Spark detecting 8 cores when I only have 4?
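For reference, you can check what the JVM (and therefore Spark) sees from a spark-shell session; a minimal sketch, run against Spark 1.6.x:

```scala
// Inside spark-shell, where `sc` is the predefined SparkContext.
// The JVM reports *logical* processors, which is what Spark's
// "all the available cores" default is based on.
println(Runtime.getRuntime.availableProcessors) // 8 on a 4-core CPU with Hyper-Threading
println(sc.defaultParallelism)                  // typically matches the cores offered to the app
```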
I assume that "all the available cores" means Spark counts logical (virtual) cores rather than physical ones. Since your CPU supports Hyper-Threading, each of its 4 physical cores exposes 2 logical cores to the OS, so Spark sees 8. If you want Spark to use only 4 cores, you can limit what the standalone worker offers by setting SPARK_WORKER_CORES in conf/spark-env.sh, or cap a single application with spark.cores.max, as sketched below.
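A minimal sketch of the per-application approach; the app name and the master URL (spark://localhost:7077) are assumptions for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Cap this application at 4 cores across the standalone cluster.
// Note: spark.cores.max limits what the application requests, not what
// the worker detects; to change the worker itself, set SPARK_WORKER_CORES=4
// in conf/spark-env.sh before starting it.
val conf = new SparkConf()
  .setAppName("four-cores-app")            // hypothetical app name
  .setMaster("spark://localhost:7077")     // assumed standalone master URL
  .set("spark.cores.max", "4")
val sc = new SparkContext(conf)
```

Keep in mind Spark can't tell physical cores from Hyper-Threaded ones; both settings just cap the count, so setting 4 pins you to 4 logical cores, not necessarily 4 distinct physical ones.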