Date: Tue, 1 Dec 2015 15:43:41 +0530
Subject: Re: Flume log4j Appender issue
From: yogendra reddy
To: user@flume.apache.org

Update: I ran the Flume agent first, then made the changes to the Hadoop log4j properties file, and after the restart this started working fine. But now the Hive service is not coming up because of the avro-ipc jar that I had to add to the hadoop-hdfs lib to get the Flume log4j appender working.

I would like to know if anybody here has used Flume to copy Hadoop daemon/service logs?

Thanks,
Yogendra

On Wed, Nov 25, 2015 at 2:03 PM, yogendra reddy wrote:
> Hi All,
>
> I'm trying to configure Flume to write Hadoop service logs to a common
> sink.
>
> Here's what I have added to the HDFS log4j.properties:
>
> # Define the root logger to the system property "hadoop.root.logger".
> log4j.rootLogger=${hadoop.root.logger}, flume
>
> # Flume appender
> log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
> log4j.appender.flume.Hostname = localhost
> log4j.appender.flume.Port = 41414
>
> and when I run a sample pi job I get this error:
>
> $ hadoop jar hadoop-mapreduce-examples.jar pi 10 10
> log4j:ERROR Could not find value for key log4j.appender.flume.layout
> 15/11/25 07:23:26 WARN api.NettyAvroRpcClient: Using default maxIOWorkers
> log4j:ERROR RPC client creation failed! NettyAvroRpcClient { host: localhost, port: 41414 }: RPC connection error
> Exception in thread "main" java.lang.ExceptionInInitializerError
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:200)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: org.apache.commons.logging.LogConfigurationException: User-specified log class 'org.apache.commons.logging.impl.Log4JLogger' cannot be found or is not useable.
>         at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:804)
>         at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
>         at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
>         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
>         at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
>         ... 2 more
>
> I have added these jars to the hadoop-hdfs lib:
>
> avro-ipc-1.7.3.jar
> flume-avro-source-1.5.2.2.2.7.1-33.jar
> flume-ng-log4jappender-1.5.2.2.2.7.1-33.jar
> flume-hdfs-sink-1.5.2.2.2.7.1-33.jar
> flume-ng-sdk-1.5.2.2.2.7.1-33.jar
>
> and I do have the commons-logging (commons-logging-1.1.3.jar) and
> log4j (1.2.17) jars present in the hdfs lib. Any pointers to debug this
> issue?
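[Editor's note] The first log4j:ERROR in the quoted trace complains that the key log4j.appender.flume.layout has no value. One way to silence it is to give the appender an explicit layout; a sketch of the quoted appender block with a layout added is below. The PatternLayout class and the conversion pattern are assumptions for illustration, not taken from this thread (Flume's Log4jAppender serializes the logging event itself, so the layout is mainly a fallback):

```
# Sketch of the log4j.properties fragment from the quoted mail, with a
# layout added. Layout class and pattern are illustrative assumptions.
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
log4j.appender.flume.layout = org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern = %d{ISO8601} %p %c: %m%n
```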
> Thanks,
> Yogendra
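[Editor's note] The "RPC connection error" in the quoted trace generally means nothing was listening on the configured port, i.e. the Flume agent with its Avro source has to be up before the log4j appender tries to connect; that matches the update above, where starting the agent first made things work. A minimal agent configuration for this setup might look like the following sketch; the agent/channel/sink names and the logger sink are illustrative assumptions, and only the host/port come from the thread:

```
# Hypothetical flume.conf for an agent receiving Log4jAppender events.
# Names (a1, r1, c1, k1) and the logger sink are assumptions.
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Avro source matching log4j.appender.flume.Hostname/Port
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 41414
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Logger sink for smoke-testing; swap in an HDFS sink for real use
a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
```

Such an agent would typically be started with something like `flume-ng agent --conf conf --conf-file flume.conf --name a1` before launching the Hadoop job.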