hadoop-hdfs-user mailing list archives

From kumar r <kumarc...@gmail.com>
Subject Hadoop log larger file size
Date Tue, 08 Nov 2016 07:17:06 GMT
I have configured a Kerberos-enabled Hadoop 2.7.2 cluster on Windows.

I have noticed that the hadoop.log file has grown to more than 5 GB.

log4j is configured with a max file size of 256 MB and a max backup index
of 20, but I don't know why Hadoop keeps appending logs to a single file.

hadoop.log.maxfilesize=256MB
hadoop.log.maxbackupindex=20
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=${hadoop.log.maxfilesize}
log4j.appender.RFA.MaxBackupIndex=${hadoop.log.maxbackupindex}
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
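The behaviour I expect from MaxFileSize/MaxBackupIndex is ordinary size-based
rollover with a bounded number of backups. As an illustration only (this is
Python's stdlib RotatingFileHandler, not Hadoop code, and the tiny sizes are
just for the demo), that semantics looks like:

```python
import logging
import logging.handlers
import os
import tempfile

# RotatingFileHandler implements the same size-based rollover semantics
# that log4j's RollingFileAppender is supposed to provide:
# one active file plus at most `backupCount` rotated backups.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "hadoop.log")

handler = logging.handlers.RotatingFileHandler(
    log_path,
    maxBytes=1024,   # analogue of hadoop.log.maxfilesize (tiny for the demo)
    backupCount=3,   # analogue of hadoop.log.maxbackupindex
)
logger = logging.getLogger("rollover-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Write far more than maxBytes * (backupCount + 1) worth of log lines.
for i in range(200):
    logger.info("message %d: %s", i, "x" * 50)

# Only hadoop.log plus hadoop.log.1 .. hadoop.log.3 remain, each bounded
# in size: no single file grows without limit.
files = sorted(os.listdir(log_dir))
print(files)
```

With a working rolling appender, hadoop.log should behave the same way:
capped at 256 MB with at most 20 backups, instead of growing past 5 GB.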

How can I set the max file size and backup index so the log actually rolls over?
