Tags: apache-spark, log4j, hadoop-yarn

How to use Spark application name in logs?


I am running an Apache Spark application on a YARN cluster. I submit the same application multiple times via spark-submit with different parameters (one of them is --name), and I have a common log4j.xml file for logging.

I would like to use something like ${spark.app.name} in log4j.xml to distinguish the logs of individual Spark jobs.

I tried using ${spark.app.name} in log4j.xml, but it does not work.
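For context, a sketch of what such an attempt typically looks like (this is a hypothetical log4j 1.x appender, not my exact file). In log4j 1.x, a ${...} variable in the config is substituted from Java system properties at the moment log4j parses the file; log4j usually initializes before Spark has set any spark.* values, so the variable expands to an empty string:

```xml
<!-- Hypothetical log4j 1.x appender: ${spark.app.name} resolves only if a
     Java system property with that exact key is already set when log4j
     parses this file, which is normally not the case. -->
<appender name="console" class="org.apache.log4j.ConsoleAppender">
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d [${spark.app.name}] %p %c - %m%n"/>
  </layout>
</appender>
```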

  1. What am I doing wrong?
  2. Is there a better way to log the Spark application name?

Tried: using ${spark.app.name} and ${spark.app.id} in log4j.xml.

Expected to happen: the Spark job name appears on every log line.

What actually resulted: the Spark job name did not appear in the log; the placeholder was empty.

Note: I have gone through this question: Output Spark application name in driver log. It did not help.
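One workaround commonly suggested for this situation (a sketch, not the asker's setup) is to pass the application name explicitly as a JVM system property, so it exists when log4j configures itself. The property key app.name and the jar/job names below are hypothetical:

```shell
# Sketch: make the app name visible to log4j as a plain system property.
APP_NAME="my-etl-job"   # hypothetical job name

spark-submit \
  --name "$APP_NAME" \
  --conf "spark.driver.extraJavaOptions=-Dapp.name=$APP_NAME" \
  --conf "spark.executor.extraJavaOptions=-Dapp.name=$APP_NAME" \
  my_app.jar
```

The config file then references the property directly: ${app.name} in a log4j 1.x pattern, or ${sys:app.name} with log4j 2's system-property lookup.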


Solution

  • I used a separate log4j configuration file for each spark-submit script.
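A sketch of how that per-submission approach is usually wired up on YARN (file and job names here are hypothetical): ship a job-specific config with --files so it lands in each container's working directory, and point the driver and executor JVMs at it:

```shell
# Sketch: one log4j file per job, shipped alongside the application.
spark-submit \
  --name "job-a" \
  --files log4j-job-a.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-job-a.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-job-a.properties" \
  my_app.jar
```

Each job's properties file can then hard-code its own log file name or pattern prefix, which sidesteps the variable-substitution problem entirely.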