chukwa-user mailing list archives

From Eric Yang <eric...@gmail.com>
Subject Re: Hadoop metric are not collected on amazon EMR
Date Thu, 05 Apr 2012 18:00:44 GMT
Hi Meena,

I can't pinpoint the exact problem, but there are two possible
scenarios that could lead to this issue.

1. There may be a mismatch between the Hadoop jar file used by your
Hadoop cluster and the Hadoop jar file in
CHUKWA_HOME/share/chukwa/lib.  By default, Chukwa ships with
hadoop-core-1.0.0.jar.  You should replace hadoop-*.jar with the
0.20.205 jar files, then restart the collector.
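
For reference, a rough sketch of what that swap could look like on the
collector host (CHUKWA_HOME, HADOOP_HOME, the exact jar name, and the
restart step are assumptions about a default layout; adjust to your install):

  # back up the bundled hadoop-core-1.0.0.jar (and any other hadoop-*.jar)
  cd $CHUKWA_HOME/share/chukwa/lib
  mkdir -p backup && mv hadoop-*.jar backup/

  # copy in the jar that matches the running cluster (0.20.205 here)
  cp $HADOOP_HOME/hadoop-core-0.20.205.0.jar .

  # then restart the collector with whatever start/stop script your
  # install provides, so it picks up the matching jar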

2. Is your cluster running with Kerberos security turned on?  If yes,
make sure you have the proper core-site.xml on the Chukwa classpath, the
user keytabs set up, and kinit initialized for the Chukwa user.
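
As a rough illustration of those checks (the principal, keytab path, and
conf directory below are placeholders, not values from your cluster):

  # get a ticket for the user that runs the collector, then verify it
  kinit -kt /etc/security/keytabs/chukwa.keytab chukwa/$(hostname -f)@EXAMPLE.COM
  klist

  # make sure the cluster's core-site.xml (the one that sets
  # hadoop.security.authentication=kerberos) is on the collector's
  # classpath, e.g. by copying it into the Chukwa conf directory
  cp $HADOOP_CONF_DIR/core-site.xml $CHUKWA_HOME/etc/chukwa/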

regards,
Eric

On Wed, Apr 4, 2012 at 10:12 PM, meena k.s <imsmiling.meena@gmail.com> wrote:
> Hi Eric,
>
> I had modified the chukwa-collector-conf.xml to stream to HDFS as mentioned
> in the link, but I still get the same error.
>
> I'm using chukwa 0.5 and hadoop version 0.20.205
>
> Thanks,
> Meena
>
>
> On Thu, Apr 5, 2012 at 8:34 AM, Eric Yang <eric818@gmail.com> wrote:
>>
>> Which version of Hadoop are you using?  This looks like a mismatch
>> between how Chukwa calls the DFS client and the Hadoop jar file.
>>
>> regards,
>> Eric
>>
>> On Wed, Apr 4, 2012 at 1:43 AM, meena k.s <imsmiling.meena@gmail.com>
>> wrote:
>> > Hi Eric,
>> >
>> > Regarding my earlier query, I see the following error message in
>> > chukwa-collector.log
>> >
>> >  WARN main SeqFileWriter - Got an exception trying to rotate. Will try again in 300 seconds.
>> > org.apache.hadoop.ipc.RemoteException: java.io.IOException:
>> > java.lang.NoSuchMethodException:
>> > org.apache.hadoop.hdfs.protocol.ClientProtocol.create(java.lang.String,
>> > org.apache.hadoop.fs.permission.FsPermission, java.lang.String, boolean,
>> > boolean, short, long)
>> >         at java.lang.Class.getMethod(Class.java:1605)
>> >         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
>> >         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>> >         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>> >         at java.security.AccessController.doPrivileged(Native Method)
>> >         at javax.security.auth.Subject.doAs(Subject.java:396)
>> > Thanks,
>> > Meena
>> >
>> > On Wed, Apr 4, 2012 at 1:33 PM, meena k.s <imsmiling.meena@gmail.com>
>> > wrote:
>> >>
>> >> Hi Eric,
>> >>
>> >> Thanks for the quick reply. I have tried to install Chukwa 0.5
>> >> accordingly, but there seems to be a problem while starting the
>> >> chukwa-collector. In the log files I could see something related to
>> >> HBase. I haven't installed HBase in my cluster, so I commented out the
>> >> HBase-related parameters in my chukwa-collector-conf.xml. Is it
>> >> necessary to have an HBase installation for Chukwa 0.5 to work?
>> >>
>> >> Thanks,
>> >> Meena
>> >>
>> >> On Tue, Apr 3, 2012 at 9:36 PM, Eric Yang <eric818@gmail.com> wrote:
>> >>>
>> >>> Hi Meena,
>> >>>
>> >>> Hadoop 0.20.205 and Chukwa 0.4 are a mismatch.  Hadoop 0.20.20x has
>> >>> been modified to use MetricsSink instead of MetricsContext.
>> >>> Therefore, you will not see Hadoop metrics when using Chukwa 0.4.  You
>> >>> will need Chukwa 0.5 to monitor Hadoop 0.20.205+.  In addition, you
>> >>> should copy hadoop-metrics2.properties to $HADOOP_CONF_DIR to monitor
>> >>> Hadoop.  Hope this helps.
>> >>>
>> >>> regards,
>> >>> Eric
>> >>>
>> >>> On Tue, Apr 3, 2012 at 3:33 AM, Meena K.S <ksmeena1806@gmail.com>
>> >>> wrote:
>> >>> > Hi,
>> >>> >
>> >>> >
>> >>> >
>> >>> > We want to use Chukwa for Hadoop log collection. We are running hive
>> >>> > queries on EMR, so for the purpose of experimenting with Chukwa we
>> >>> > configured Chukwa on a 2-node EMR cluster.  The collector and the
>> >>> > agent are running and the system logs are generated, but the Hadoop
>> >>> > mapred and rpc/jvm logs are not being generated. Chukwa version: 0.4
>> >>> > and Hadoop version: 0.20.205.
>> >>> >
>> >>> >
>> >>> >
>> >>> > My hadoop-metrics.properties is as below :
>> >>> >
>> >>> >
>> >>> > # Licensed to the Apache Software Foundation (ASF) under one or more
>> >>> > # contributor license agreements.  See the NOTICE file distributed with
>> >>> > # this work for additional information regarding copyright ownership.
>> >>> > # The ASF licenses this file to You under the Apache License, Version 2.0
>> >>> > # (the "License"); you may not use this file except in compliance with
>> >>> > # the License.  You may obtain a copy of the License at
>> >>> > #
>> >>> > #     http://www.apache.org/licenses/LICENSE-2.0
>> >>> > #
>> >>> > # Unless required by applicable law or agreed to in writing, software
>> >>> > # distributed under the License is distributed on an "AS IS" BASIS,
>> >>> > # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>> >>> > # See the License for the specific language governing permissions and
>> >>> > # limitations under the License.
>> >>> >
>> >>> > log4j.appender.chukwa.rpc.recordType=HadoopMetricsProcessor
>> >>> > log4j.appender.chukwa.rpc.chukwaClientHostname=localhost
>> >>> > log4j.appender.chukwa.rpc.chukwaClientPortNum=9093
>> >>> > log4j.appender.chukwa.rpc.DatePattern=.yyyy-MM-dd
>> >>> > log4j.appender.chukwa.rpc.layout=org.apache.log4j.PatternLayout
>> >>> > log4j.appender.chukwa.rpc.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> >>> > log4j.appender.chukwa.rpc.Dir=/tmp/chukwa/log/metrics
>> >>> >
>> >>> > log4j.appender.chukwa.dfs.recordType=HadoopMetricsProcessor
>> >>> > log4j.appender.chukwa.dfs.chukwaClientHostname=localhost
>> >>> > log4j.appender.chukwa.dfs.chukwaClientPortNum=9093
>> >>> > log4j.appender.chukwa.dfs.DatePattern=.yyyy-MM-dd
>> >>> > log4j.appender.chukwa.dfs.layout=org.apache.log4j.PatternLayout
>> >>> > log4j.appender.chukwa.dfs.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> >>> > log4j.appender.chukwa.dfs.Dir=/tmp/chukwa/log/metrics
>> >>> >
>> >>> > log4j.appender.chukwa.mapred.recordType=HadoopMetricsProcessor
>> >>> > log4j.appender.chukwa.mapred.chukwaClientHostname=localhost
>> >>> > log4j.appender.chukwa.mapred.chukwaClientPortNum=9093
>> >>> > log4j.appender.chukwa.mapred.DatePattern=.yyyy-MM-dd
>> >>> > log4j.appender.chukwa.mapred.layout=org.apache.log4j.PatternLayout
>> >>> > log4j.appender.chukwa.mapred.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> >>> > log4j.appender.chukwa.mapred.Dir=/tmp/chukwa/log/metrics
>> >>> >
>> >>> >
>> >>> >
>> >>> > I have also copied chukwa-hadoop-*-client.jar and json.jar to
>> >>> > HADOOP_HOME/lib and restarted Hadoop, but still the logs were not
>> >>> > generated.
>> >>> >
>> >>> > Also, when I run telnet localhost 9093, the jvm and other Hadoop
>> >>> > metrics adapters are not running, but the other system adapters are
>> >>> > running correctly.
>> >>> >
>> >>> > Can someone help me here?
>> >>> >
>> >>> > Thanks
>> >>> > Meena
>> >>
>> >>
>> >
>
>
