hadoop-common-user mailing list archives

From Edward Capriolo <edlinuxg...@gmail.com>
Subject Re: syslog-ng and hadoop
Date Thu, 20 Aug 2009 15:16:32 GMT
On Thu, Aug 20, 2009 at 10:49 AM, mike anderson <saidtherobot@gmail.com> wrote:
> Yeah, that is interesting Edward. I don't need syslog-ng for any particular
> reason, other than that I'm familiar with it. If there were another way to
> get all my logs collated into one log file that would be great.
> mike
>
> On Thu, Aug 20, 2009 at 10:44 AM, Edward Capriolo <edlinuxguru@gmail.com> wrote:
>
>> On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman <bbockelm@cse.unl.edu>
>> wrote:
>> > Hey Mike,
>> >
>> > Yup.  We find the stock log4j needs two things:
>> >
>> > 1) Set the rootLogger manually.  The way 0.19.x has the root logger set up
>> > breaks when adding new appenders.  I.e., do:
>> >
>> > log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
>> >
>> > 2) Add the headers; otherwise log4j is not compatible with syslog:
>> >
>> > log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> > log4j.appender.SYSLOG.facility=local0
>> > log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> > log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
>> > log4j.appender.SYSLOG.SyslogHost=red
>> > log4j.appender.SYSLOG.threshold=ERROR
>> > log4j.appender.SYSLOG.Header=true
>> > log4j.appender.SYSLOG.FacilityPrinting=true
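>> >
>> > (On the syslog-ng side the filter then has to match the same facility,
>> > something like: filter f_hadoop { facility(local0); };)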
>> >
>> > Brian
>> >
>> > On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
>> >
>> >> Has anybody had any luck setting up the log4j.properties file to send
>> >> logs to a syslog-ng server?
>> >> My log4j.properties excerpt:
>> >> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> >> log4j.appender.SYSLOG.syslogHost=10.0.20.164
>> >> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> >> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> >> log4j.appender.SYSLOG.Facility=HADOOP
>> >>
>> >> and my syslog-ng.conf file running on 10.0.20.164
>> >>
>> >> source s_hadoop {
>> >>       # message generated by Syslog-NG
>> >>       internal();
>> >>       # standard Linux log source (this is the default place for the
>> >>       # syslog() function to send logs to)
>> >>       unix-stream("/dev/log");
>> >>       udp();
>> >> };
>> >> destination df_hadoop { file("/var/log/hadoop/hadoop.log");};
>> >> filter f_hadoop {facility(hadoop);};
>> >> log {
>> >>       source(s_hadoop);
>> >>       filter(f_hadoop);
>> >>       destination(df_hadoop);
>> >> };
>> >>
>> >>
>> >> Thanks in advance,
>> >> Mike
>> >
>> >
>>
>> Mike, slightly off topic, but you can also run a Log4j server which
>> transports the messages fired off by Log4j intact. The
>> Log4j->syslog route loses or changes some information. If anyone is
>> interested in this, let me know and I will write up something about it.
>>
>

Mike,
I just put this up for you.
http://www.edwardcapriolo.com/wiki/en/Log4j_Server

All of the functionality is in the class
org.apache.log4j.net.SocketServer, which ships as part of Log4j.
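
Roughly, it looks like this (the hostname, jar name, and config paths below
are just placeholders; 4560 is the SocketAppender default port). Start the
server with something like:

java -cp log4j-1.2.15.jar org.apache.log4j.net.SocketServer 4560 server.properties /path/to/client-configs

and point each Hadoop node at it from its log4j.properties:

log4j.rootLogger=INFO,REMOTE,console,DRFA,EventCounter
log4j.appender.REMOTE=org.apache.log4j.net.SocketAppender
log4j.appender.REMOTE.RemoteHost=loghost
log4j.appender.REMOTE.Port=4560
log4j.appender.REMOTE.ReconnectionDelay=10000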

I pretty much followed this: http://timarcher.com/node/10

I started with the syslog appender but it had some quirks. Mostly, the
syslog appender can only write syslog-format messages, so it loses some
information. The Log4j server transfers the log.error("whatever") event
as-is, and you can handle it on the server end through the server's own
logging properties. Cool stuff.
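
On the receiving end, the server's own properties file can dump everything
into one file (the appender name and path here are just an example):

log4j.rootLogger=INFO,COLLECTED
log4j.appender.COLLECTED=org.apache.log4j.DailyRollingFileAppender
log4j.appender.COLLECTED.File=/var/log/hadoop/collected.log
log4j.appender.COLLECTED.layout=org.apache.log4j.PatternLayout
log4j.appender.COLLECTED.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n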
