hadoop-mapreduce-user mailing list archives

From Ravi Prakash <ravihad...@gmail.com>
Subject Re: Hadoop log larger file size
Date Tue, 08 Nov 2016 19:37:21 GMT
Hi Kumar!

You have to be careful about which log4j.properties file is on the classpath
of the daemon that is producing the big logs. Often there are multiple
log4j.properties files, either on the classpath directly or inside one of the
jars on the classpath. Are you sure the log4j.properties file you edited is
the only one loaded by the classloader?
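
One quick way to check is to list every log4j.properties that the daemon's
classloader can see. Here is a minimal sketch (the class name is just for
illustration; run it with the same classpath as the daemon, i.e. the output
of "hadoop classpath"):

import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

// Prints every log4j.properties resource visible on the classpath.
public class FindLog4jConfigs {
    public static void main(String[] args) throws IOException {
        Enumeration<URL> configs = FindLog4jConfigs.class
                .getClassLoader().getResources("log4j.properties");
        while (configs.hasMoreElements()) {
            // log4j 1.x typically configures itself from the first match;
            // any later entries printed here are shadowed by it.
            System.out.println(configs.nextElement());
        }
    }
}

Alternatively, starting the daemon's JVM with -Dlog4j.debug=true makes log4j
1.x print which configuration file it actually loaded.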

HTH
Ravi

On Mon, Nov 7, 2016 at 11:17 PM, kumar r <kumarccpp@gmail.com> wrote:

> I have configured a Kerberos-enabled Hadoop 2.7.2 cluster on Windows.
>
> I have noticed that the logs have grown to more than 5 GB in the hadoop.log
> file.
>
> log4j is configured with a max file size of 256 MB and a max backup index
> of 20, but I don't know why Hadoop keeps appending logs to a single file.
>
> hadoop.log.maxfilesize=256MB
> hadoop.log.maxbackupindex=20
> log4j.appender.RFA=org.apache.log4j.RollingFileAppender
> log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
> log4j.appender.RFA.MaxFileSize=${hadoop.log.maxfilesize}
> log4j.appender.RFA.MaxBackupIndex=${hadoop.log.maxbackupindex}
> log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
>
> # Pattern format: Date LogLevel LoggerName LogMessage
> log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>
> How can I set the max file size and backup index?
>
