spark-user mailing list archives

From Yong Zhang <java8...@hotmail.com>
Subject RE: Logging in executors
Date Wed, 13 Apr 2016 17:35:39 GMT
Is the env/dev/log4j-executor.properties file inside your jar file? Does the path inside the
jar match what you specified as env/dev/log4j-executor.properties?
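A quick way to check is to list the jar contents (a sketch; replace your-app.jar with the jar
you actually pass to spark-submit):

jar tf your-app.jar | grep log4j-executor.properties

If that prints env/dev/log4j-executor.properties, the relative path in -Dlog4j.configuration
should resolve from the classpath.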
See the log4j documentation here: https://logging.apache.org/log4j/1.2/manual.html
When you specify log4j.configuration=my_custom.properties, you have 2 options:
1) my_custom.properties has to be in the jar (or on the classpath). In your case, since you
specify a package-style path, you need to make sure it matches the layout inside your jar file.
2) Use log4j.configuration=file:///tmp/my_custom.properties. In this case, you need to make sure
the file my_custom.properties exists in the /tmp folder on ALL of your worker nodes.
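For example, option 2 would look like this (a sketch based on the spark-submit command quoted
below; the file locations are illustrative):

bin/spark-submit --master spark://localhost:7077 \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///tmp/log4j-driver.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///tmp/log4j-executor.properties" \
  --class ...

Depending on the cluster manager, you can also ship the file with --files log4j-executor.properties
and then reference it by its bare name in spark.executor.extraJavaOptions, since --files copies
it into each executor's working directory.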
Yong

Date: Wed, 13 Apr 2016 14:18:24 -0300
Subject: Re: Logging in executors
From: cmatas@despegar.com
To: yuzhihong@gmail.com
CC: user@spark.apache.org

Thanks for your response, Ted. You're right, there was a typo. I changed it; now I'm executing:
bin/spark-submit --master spark://localhost:7077 --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=env/dev/log4j-driver.properties"
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=env/dev/log4j-executor.properties"
--class....
The content of the executor properties file is:
# Set everything to be logged to the console
log4j.rootCategory=INFO, FILE
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/tmp/executor.log
log4j.appender.FILE.ImmediateFlush=true
log4j.appender.FILE.Threshold=debug
log4j.appender.FILE.Append=true
log4j.appender.FILE.MaxFileSize=100MB
log4j.appender.FILE.MaxBackupIndex=5
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
log4j.logger.com.despegar.p13n=DEBUG

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

Finally, the code on which I'm using logging in the executor is:
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.Time
import org.apache.spark.streaming.dstream.DStream

def groupAndCount(keys: DStream[(String, List[String])])(handler: ResultHandler) = {

  val result = keys.reduceByKey((prior, current) => prior ::: current)
    .flatMap {
      case (date, values) =>
        values.groupBy(identity).map { obs =>
          // "date" is expected to look like "<day>@<time>"; anything else throws a MatchError
          val (d, t) = date.split("@") match {
            case Array(day, time) => (day, time)
          }
          // Import and logger deliberately inside the closure, so they are resolved on the executor
          import org.apache.log4j.Logger
          val logger: Logger = Logger.getRootLogger
          logger.info(s"Metric retrieved $d")
          Metric("PV", d, obs._1, t, obs._2.size)
        }
    }

  result.foreachRDD((rdd: RDD[Metric], time: Time) => handler(rdd, time))
}
Originally the import and logger object were outside the map function. I'm also using the root
logger just to see if it's working, but nothing gets logged. I've checked that the property
is set correctly on the executor side through println(System.getProperty("log4j.configuration"))
and it is OK, but logging still isn't working.
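For reference, a common pattern to keep the logger out of closure serialization is a @transient
lazy val inside an object, so the logger is re-created on each executor JVM instead of being
shipped from the driver (a minimal sketch; ExecutorLog and the logger name are hypothetical):

import org.apache.log4j.Logger

object ExecutorLog extends Serializable {
  // Not serialized with the closure; initialized lazily on each executor JVM.
  @transient lazy val log: Logger = Logger.getLogger("executor-logging")
}

Inside the map one would then call ExecutorLog.log.info(s"Metric retrieved $d").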
Thanks again,
-carlos.