An approach for Logging in Spark jobs




The Spark website provides three options for using a custom log4j configuration for logging:

  1. Upload a custom log4j.properties with every job
  2. Add -Dlog4j.configuration={location of log4j config} to the JVM options with every job
  3. Update the $SPARK_CONF_DIR/log4j.properties file

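Option 2 (combined with Option 1's file upload) might look something like the following spark-submit invocation — a sketch only, where the config path and job jar name are placeholders:

```shell
# Ship a custom log4j.properties with the job and point both the driver
# and executor JVMs at it. Paths and jar name are illustrative.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /path/to/custom-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:custom-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:custom-log4j.properties" \
  my-job.jar
```

The drawback, and the reason we preferred Option 3, is that every job submission has to carry these extra arguments.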
In one of our products, which runs Spark jobs in YARN mode on an EMR cluster through a Livy server, we selected Option 3. Here I explain how we did it and the challenges we faced.

First, we prepared a custom log4j.properties file with all the configurations we want: custom loggers, their log file locations, the log rotation strategy, etc. This file's content is appended to /etc/spark/conf.dist/log4j.properties during EMR deployment. The log file location is important here, as the log files will be created on both the master and all executors in the cluster.

The Spark website suggests using ${spark.yarn.app.container.log.dir} as the reference directory for the log file location. Example:


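The example fragment was lost from this page; it presumably looked something like the following log4j.properties snippet, where the appender name and rotation settings are illustrative:

```properties
# Route application logs into the YARN container log directory,
# which YARN resolves per container at runtime.
log4j.rootLogger=INFO, appFile
log4j.appender.appFile=org.apache.log4j.RollingFileAppender
log4j.appender.appFile.File=${spark.yarn.app.container.log.dir}/application.log
log4j.appender.appFile.MaxFileSize=50MB
log4j.appender.appFile.MaxBackupIndex=5
log4j.appender.appFile.layout=org.apache.log4j.PatternLayout
log4j.appender.appFile.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```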
On EMR executors, the variable ${spark.yarn.app.container.log.dir} resolves to /var/log/hadoop-yarn/containers/{application_id}/{container_dir}/. This path differs for each application (job) and for each container within the application, so a separate application.log file is created for every execution of the job on each executor.

Another catch is that the variable ${spark.yarn.app.container.log.dir} is not available on the master. However, when the application is started, log4j reads the configuration and tries to create/write the log files on the master as well. Since the variable is not available there, the log file location evaluates to /application.log, i.e., in the root directory, and the write fails with a “Permission Denied” exception. The solution to this was to create an empty /application.log as part of the deployment itself and give permission to the user who runs the job. In our case, since we run the Spark jobs through a Livy server, which runs as a process under the “livy” user, we had to give the livy user write permission on the /application.log file.
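The deployment step described above might be sketched as the following bootstrap fragment, run with root privileges during EMR provisioning. The variable names are assumptions for illustration; the concrete values in our setup were /application.log and the "livy" user.

```shell
#!/bin/sh
# Hypothetical EMR bootstrap-action fragment: pre-create the placeholder
# log file so that log4j on the master does not fail with
# "Permission Denied" when ${spark.yarn.app.container.log.dir} is unset.
APP_LOG_FILE="/application.log"   # path log4j falls back to on the master
APP_LOG_USER="livy"               # user running the Spark jobs via Livy

sudo touch "$APP_LOG_FILE"                    # create the empty placeholder
sudo chown "$APP_LOG_USER" "$APP_LOG_FILE"    # let the livy user write to it
sudo chmod 644 "$APP_LOG_FILE"
```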

Once we had done this, we observed complete logging for the Spark jobs. However, depending on what gets executed where (master or executors), the logging happens locally on each node. To aggregate and analyze the logs from all the nodes, we use Splunk.

7 thoughts on “An approach for Logging in Spark jobs”

    1. Thanks Haarsh. The file looks something like this


      log4j.appender.appFile.layout.ConversionPattern=%d{ISO8601} : %X{APP} : %X{CID} : %-5p - %m%n

      log4j.appender.appJsonFile.layout.ConversionPattern={"timestamp": "%d{ISO8601}", "appName": "%X{APP}","correlationId": "%X{CID}", "logLevel": "%p", "message": "%m"}%n


  1. Hi Anil:
    A huge thank you for the pointers in this post. I’m a little lost with the snippet below:

    The solution to this was to create an empty /application.log as part of Deployment itself and give permission to the user who runs the job.

    I tried to accomplish this by ssh-ing onto the master and executing mkdir /application.log.
    I get a “permission denied” message.

    There is perhaps a gap in my knowledge. How do I create /application.log as part of the deployment on EMR?

    Thank You.


  2. Hello Anil,

    At the outset, thanks a lot for such a valuable blog. It would be really great if you could help me understand the following:

    Can you please provide any documentation/resources to support “the variable ${spark.yarn.app.container.log.dir} is not available on the Master”? Do you mean that ${spark.yarn.app.container.log.dir} is not available on the Master initially when the application is submitted, but that later, when YARN reserves resources and launches the executor containers, it will be set by YARN dynamically? And once YARN sets it, is the variable then available on the Master? If so, can we configure log4j to read the configuration at a delayed instant rather than first on the Master?

    Or is ${spark.yarn.app.container.log.dir} never available on the Master, since YARN sets it dynamically on executor containers only?

    Looking forward to your response.

    Thanking You,
    Amiya Chakraborty.

