amazon-web-services, apache-spark, amazon-s3, spark-submit, qubole

How to pass --properties-file to spark-submit in Qubole?


I am using Spark on Qubole, with the clusters created in AWS. In the Qubole Workbench, when I execute the command line below, it works fine and the command succeeds:

/usr/lib/spark/bin/spark-submit s3://bucket-name/SparkScripts/test.py

But when I execute the same command with the --properties-file option:

/usr/lib/spark/bin/spark-submit --properties-file s3://bucket-name/SparkScripts/properties.file s3://bucket-name/SparkScripts/test.py

it fails with the error message below:

Qubole > Shell Command failed with exit code: 1

App > Error occurred when getting effective config required to initialize Qubole security provider

App > Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Properties file s3:/bucket-name/SparkScripts/properties.file does not exist

Can someone help me fix this? I need some application properties to be stored in a separate file on Amazon S3 and passed via --properties-file to my Spark program.


Solution

  • @saravanan - Qubole currently does not support specifying --properties-file from an S3 path. Support for this will be available in release 59.
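Until that release, one possible workaround (a sketch, not a Qubole-documented feature) is to avoid having spark-submit read the file from S3 at all: either copy the properties file to the cluster's local disk first and pass the local path, or inline each property as a --conf flag. The bucket path and the `props_to_conf` helper below are hypothetical names for illustration:

```shell
#!/bin/sh
# Workaround sketch 1 (assumed, not Qubole-documented): copy the properties
# file from S3 to local disk, then pass the local path, which spark-submit
# can always read.
#   aws s3 cp s3://bucket-name/SparkScripts/properties.file /tmp/spark.properties
#   /usr/lib/spark/bin/spark-submit \
#     --properties-file /tmp/spark.properties \
#     s3://bucket-name/SparkScripts/test.py

# Workaround sketch 2: expand the properties file into --conf flags so no
# --properties-file is needed. props_to_conf is a hypothetical helper that
# turns "key value" or "key=value" lines into "--conf key=value" arguments,
# skipping comments and blank lines.
props_to_conf() {
  awk '!/^[[:space:]]*(#|$)/ {
    if (index($0, "=")) print "--conf", $0
    else                print "--conf", $1 "=" $2
  }' "$1"
}

# Example usage (after fetching the file locally as in sketch 1):
#   /usr/lib/spark/bin/spark-submit $(props_to_conf /tmp/spark.properties) \
#     s3://bucket-name/SparkScripts/test.py
```

Note that the `$(...)` expansion in the usage example assumes property values contain no whitespace; quoted values would need more careful handling.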