I am running a Spark Streaming application on a standalone setup (version 1.6.1). When I run the application with spark-submit, the logs show up on the terminal. These are useful later on, for instance to understand why the application failed, if it failed. Following the documentation, I have the spark.eventLog.enabled flag set to true, but that only saves the event logs to the tmp/spark-events folder, and as far as I can tell those logs are not of much use to me. My jobs fail often, due to many exceptions. What is the correct way to store the logs that show up in the terminal (the driver logs, I am guessing?) so that I can analyse my exceptions?
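For reference, the approach discussed in the comments below is to point the driver's log4j at a file appender in addition to the console. A minimal log4j.properties sketch along those lines; the appender name FILE, the log path and the rollover sizes are placeholders, not values from the original thread:

    # console appender (Spark's default pattern)
    log4j.rootCategory=INFO, console, FILE
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # rolling file appender so the driver output survives the terminal session
    # (the path and sizes below are placeholders)
    log4j.appender.FILE=org.apache.log4j.RollingFileAppender
    log4j.appender.FILE.File=/var/log/spark/driver.log
    log4j.appender.FILE.MaxFileSize=10MB
    log4j.appender.FILE.MaxBackupIndex=10
    log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
    log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

The driver can then be pointed at this file with the standard log4j system property, e.g.:

    spark-submit \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.properties" \
      <the rest of your usual arguments>

spark.driver.extraJavaOptions and -Dlog4j.configuration are standard Spark 1.x / log4j 1.2 settings; the paths are assumptions to be adjusted to your setup.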


Thanks for the reply! I haven't had a chance to test this out. Will get back asap

2019-04-19

Hey, I just tried this and am getting many log4j:WARN statements, ending with: '....log4j:ERROR No appender named [FILE] could be found. Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties'

2019-04-18

@Aswin Fixed it. It should be "RollingFileAppender".

2019-04-18

Thanks, but I am still getting ERROR logs. Any idea why these might be happening? I've added the logs to the Edit section in the question.

2019-04-19

@Aswin For some reason it isn't finding the RollingFileAppender class. Perhaps you want to explicitly pass the log4j jar to the Spark driver via spark.driver.extraClassPath (a sketch of that is at the end of this thread).

2019-04-18
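A sketch of what that last suggestion might look like, assuming the log4j 1.2 jar shipped with Spark 1.6 and placeholder paths:

    spark-submit \
      --conf "spark.driver.extraClassPath=/path/to/log4j-1.2.17.jar" \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.properties" \
      <the rest of your usual arguments>

spark.driver.extraClassPath is a standard Spark configuration key; the jar version and paths here are assumptions, not values from the original thread.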