It looks like you'll need to append to the JVM arguments used when launching your tasks/jobs. Try editing conf/spark-defaults.conf as described in the Spark configuration documentation. Alternatively, try editing conf/spark-env.sh to add the same JVM argument, although the entries in conf/spark-defaults.conf should work.
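For the conf/spark-defaults.conf approach, a minimal sketch might look like this (the file name log4j-executor.properties is a placeholder for your own log4j configuration; the driver entry is only needed if you also want to redirect driver-side logging):

```
spark.executor.extraJavaOptions  -Dlog4j.configuration=log4j-executor.properties
spark.driver.extraJavaOptions    -Dlog4j.configuration=log4j-executor.properties
```

Both spark.executor.extraJavaOptions and spark.driver.extraJavaOptions are standard Spark configuration keys, so entries here are picked up automatically by spark-submit without any extra flags.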
If you're still not getting any joy, you can explicitly pass the location of your log4j.properties file on the spark-submit command line. If the file is contained within your JAR, in the root directory of your classpath, invoke it like this:
spark-submit --class sparky.MyApp --master spark://my.host.com:7077 --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-executor.properties" myapp.jar
If the file is not on your classpath, use the file: prefix and the full path, like this:
spark-submit ... --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/apps/spark-1.2.0/conf/log4j-executor.properties" ...
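For reference, a minimal log4j-executor.properties in log4j 1.x syntax (which Spark 1.x uses) might look like the sketch below; the appender name and conversion pattern are just illustrative:

```
# Send everything at INFO and above to the console,
# which is captured in each executor's stderr log
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Note that -Dlog4j.configuration only takes effect at JVM startup, which is why it must be passed via spark.executor.extraJavaOptions rather than set programmatically inside your job.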