hadoop-common-user mailing list archives

From Leo Alekseyev <dnqu...@gmail.com>
Subject Re: How to change logging from DRFA to RFA? Is it a good idea?
Date Tue, 28 Sep 2010 21:13:02 GMT
I have all of the above in my log4j.properties; every line that
mentions DRFA is commented out.  Yet I still get the following
errors:

log4j:ERROR Could not find value for key log4j.appender.DRFA
log4j:ERROR Could not instantiate appender named "DRFA".

Is there another config file?  Is DRFA hard-coded somewhere?
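One place worth checking (this is an assumption about the setup, not something stated in the thread): Hadoop's daemon start script, bin/hadoop-daemon.sh, supplies a default root logger of "INFO,DRFA" via the HADOOP_ROOT_LOGGER environment variable, and that value is passed to log4j at startup, overriding hadoop.root.logger in log4j.properties. A minimal sketch of overriding it, e.g. in conf/hadoop-env.sh, before restarting the daemons:

```shell
# Sketch, assuming the stock hadoop-daemon.sh, which defaults the
# root logger roughly like this if the variable is unset:
#   export HADOOP_ROOT_LOGGER=${HADOOP_ROOT_LOGGER:-"INFO,DRFA"}
#
# Setting it explicitly (for example in conf/hadoop-env.sh) makes the
# daemons use the RFA appender defined in log4j.properties instead:
export HADOOP_ROOT_LOGGER="INFO,RFA"
```

If the variable is already exported when the daemons start, the DRFA lookup (and the "Could not instantiate appender named \"DRFA\"" error) should no longer be triggered by the scripts.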



On Mon, Sep 27, 2010 at 5:28 PM, Boris Shkolnik <borya@yahoo-inc.com> wrote:
> log4j.appender.RFA=org.apache.log4j.RollingFileAppender
> log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
>
> log4j.appender.RFA.MaxFileSize=1MB
> log4j.appender.RFA.MaxBackupIndex=30
>
> hadoop.root.logger=INFO,RFA
>
>
> On 9/27/10 4:12 PM, "Leo Alekseyev" <dnquark@gmail.com> wrote:
>
> We are looking for ways to prevent Hadoop daemon logs from piling up
> (over time they can reach several tens of GB and become a nuisance).
> Unfortunately, the log4j DRFA (DailyRollingFileAppender) class
> doesn't seem to provide an easy way to limit the number of files it
> creates.  I would like to try switching to RFA
> (RollingFileAppender) with MaxFileSize and MaxBackupIndex set, since
> it
> looks like that's going to solve the log accumulation problem, but I
> can't figure out how to change the default logging class for the
> daemons.  Can anyone give me some hints on how to do it?
>
> Alternatively, please let me know if there's a better solution to
> control log accumulation.
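For reference, the RFA settings quoted above typically also need a layout, since RollingFileAppender has no default one; a minimal log4j.properties sketch (the conversion pattern below is illustrative, not from the thread):

```properties
# Rolling file appender: caps each log at 1 MB and keeps at most
# 30 rotated backups, bounding total daemon log size to ~30 MB.
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=1MB
log4j.appender.RFA.MaxBackupIndex=30
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

# Route the root logger at the RFA appender.
hadoop.root.logger=INFO,RFA
```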
