Subject: Re: Error while starting the collector
From: Ahmed Fathalla <afathalla@gmail.com>
To: chukwa-user@incubator.apache.org
Date: Mon, 14 Nov 2011 16:29:23 +0200

Do you have /usr/bin/sar installed? I think it should be there by default on Ubuntu.

On Mon, Nov 14, 2011 at 3:26 PM, Mohammad Tariq wrote:
> One more strange thing I have noticed: if I remove "initial_adaptors", I am
> able to start the agent. But if the "initial_adaptors" file is present
> inside "conf", I get the following errors -
>
> tariq@ubuntu:~/chukwa-0.4.0$ bin/chukwa agent
> tariq@ubuntu:~/chukwa-0.4.0$ java.io.IOException: Cannot run program
> "/usr/bin/sar": java.io.IOException: error=2, No such file or directory
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
>         at java.lang.Runtime.exec(Runtime.java:593)
>         at java.lang.Runtime.exec(Runtime.java:431)
>         at java.lang.Runtime.exec(Runtime.java:328)
>         at org.apache.hadoop.chukwa.inputtools.plugin.ExecPlugin.execute(ExecPlugin.java:66)
>         at org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor$RunToolTask.run(ExecAdaptor.java:68)
>         at java.util.TimerThread.mainLoop(Timer.java:512)
>         at java.util.TimerThread.run(Timer.java:462)
> Caused by: java.io.IOException: java.io.IOException: error=2, No such
> file or directory
>         at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
>         at java.lang.ProcessImpl.start(ProcessImpl.java:65)
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
>         ... 7 more
> java.io.IOException: Cannot run program "/usr/bin/iostat":
> java.io.IOException: error=2, No such file or directory
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
>         at java.lang.Runtime.exec(Runtime.java:593)
>         at java.lang.Runtime.exec(Runtime.java:431)
>         at java.lang.Runtime.exec(Runtime.java:328)
>         at org.apache.hadoop.chukwa.inputtools.plugin.ExecPlugin.execute(ExecPlugin.java:66)
>         at org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor$RunToolTask.run(ExecAdaptor.java:68)
>         at java.util.TimerThread.mainLoop(Timer.java:512)
>         at java.util.TimerThread.run(Timer.java:462)
> Caused by: java.io.IOException: java.io.IOException: error=2, No such
> file or directory
>         at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
>         at java.lang.ProcessImpl.start(ProcessImpl.java:65)
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
>         ... 7 more
>
> Regards,
>     Mohammad Tariq
>
>
> On Mon, Nov 14, 2011 at 6:49 PM, TARIQ <dontariq@gmail.com> wrote:
> > Hello Ahmed,
> >    Thanks for your valuable reply. Actually, earlier it was
> > hdfs://localhost:9000, but it was not working, so I made it 9999. But
> > 9999 is also not working. Here is my core-site.xml file -
> >
> > <configuration>
> >   <property>
> >     <name>dfs.replication</name>
> >     <value>1</value>
> >   </property>
> >
> >   <property>
> >     <name>dfs.data.dir</name>
> >     <value>/home/tariq/hdfs/data</value>
> >   </property>
> >
> >   <property>
> >     <name>dfs.name.dir</name>
> >     <value>/home/tariq/hdfs/name</value>
> >   </property>
> > </configuration>
> >
> > And hdfs-site.xml -
> >
> > <configuration>
> >   <property>
> >     <name>fs.default.name</name>
> >     <value>hdfs://localhost:9000</value>
> >   </property>
> >
> >   <property>
> >     <name>hadoop.tmp.dir</name>
> >     <value>file:///home/tariq/hadoop_tmp</value>
> >   </property>
> > </configuration>
> >
> > Regards,
> >     Mohammad Tariq
> >
> >
> > On Mon, Nov 14, 2011 at 5:21 PM, Ahmed Fathalla [via Apache Chukwa]
> > <[hidden email]> wrote:
> >> I think the problem you have is in this line:
> >>   <property>
> >>     <name>writer.hdfs.filesystem</name>
> >>     <value>hdfs://localhost:9999/</value>
> >>     <description>HDFS to dump to</description>
> >>   </property>
> >>
> >> Are you sure you've got HDFS running on port 9999 on your local machine?
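On the port question above: a quick way to see which port the NameNode actually uses is to read `fs.default.name` out of the Hadoop config and compare it against `writer.hdfs.filesystem`. A minimal sketch, using an inline stand-in for the real `conf/core-site.xml` (the path and URI value here are illustrative, not from a live install):

```shell
# Extract the NameNode URI from a core-site.xml-style snippet. In practice,
# point sed at $HADOOP_HOME/conf/core-site.xml instead of the heredoc.
uri=$(sed -n 's|.*<value>\(hdfs://[^<]*\)</value>.*|\1|p' <<'EOF'
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
EOF
)
echo "NameNode URI from config: $uri"
# Then confirm something is actually listening on that port before pointing
# Chukwa's writer.hdfs.filesystem at it, e.g.:
#   netstat -tln | grep :9000
```

If the config says 9000 but the collector is set to 9999, the two will never meet; the port in `writer.hdfs.filesystem` has to match the one the NameNode was started with.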
> >> On Mon, Nov 14, 2011 at 1:18 PM, Mohammad Tariq <[hidden email]> wrote:
> >>>
> >>> Whenever I try to start the collector using "bin/chukwa collector",
> >>> I get the following lines on the terminal, and the terminal gets
> >>> stuck there -
> >>>
> >>> tariq@ubuntu:~/chukwa-0.4.0$ bin/chukwa collector
> >>> tariq@ubuntu:~/chukwa-0.4.0$ 2011-11-14 16:36:28.888::INFO:  Logging to STDERR via org.mortbay.log.StdErrLog
> >>> 2011-11-14 16:36:28.911::INFO:  jetty-6.1.11
> >>>
> >>>
> >>> And this is the content of my collector.log file -
> >>>
> >>> 2011-11-14 16:36:27,955 INFO main ChukwaConfiguration - chukwaConf is /home/tariq/chukwa-0.4.0/bin/../conf
> >>> 2011-11-14 16:36:28,096 INFO main root - initing servletCollector
> >>> 2011-11-14 16:36:28,098 INFO main PipelineStageWriter - using pipelined writers, pipe length is 2
> >>> 2011-11-14 16:36:28,100 INFO Thread-6 SocketTeeWriter - listen thread started
> >>> 2011-11-14 16:36:28,102 INFO main SeqFileWriter - rotateInterval is 300000
> >>> 2011-11-14 16:36:28,102 INFO main SeqFileWriter - outputDir is /chukwa
> >>> 2011-11-14 16:36:28,102 INFO main SeqFileWriter - fsname is hdfs://localhost:9999/
> >>> 2011-11-14 16:36:28,102 INFO main SeqFileWriter - filesystem type from core-default.xml is org.apache.hadoop.hdfs.DistributedFileSystem
> >>> 2011-11-14 16:36:28,196 ERROR main SeqFileWriter - can't connect to HDFS, trying default file system instead (likely to be local)
> >>> java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
> >>>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
> >>>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
> >>>         at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
> >>>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
> >>>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
> >>>         at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
> >>>         at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
> >>>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:189)
> >>>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
> >>>         at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
> >>>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
> >>>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
> >>>         at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
> >>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
> >>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
> >>>         at org.apache.hadoop.chukwa.datacollection.writer.SeqFileWriter.init(SeqFileWriter.java:123)
> >>>         at org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter.init(PipelineStageWriter.java:88)
> >>>         at org.apache.hadoop.chukwa.datacollection.collector.servlet.ServletCollector.init(ServletCollector.java:112)
> >>>         at org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:433)
> >>>         at org.mortbay.jetty.servlet.ServletHolder.doStart(ServletHolder.java:256)
> >>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >>>         at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:616)
> >>>         at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
> >>>         at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:513)
> >>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >>>         at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
> >>>         at org.mortbay.jetty.Server.doStart(Server.java:222)
> >>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >>>         at org.apache.hadoop.chukwa.datacollection.collector.CollectorStub.main(CollectorStub.java:121)
> >>> Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
> >>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >>>         at java.security.AccessController.doPrivileged(Native Method)
> >>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >>>         ... 29 more
> >>>
> >>> Could anyone point out the issue, if possible?
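A note on the `NoClassDefFoundError` in the log above: it typically means `commons-configuration-*.jar` is absent from the classpath the collector was launched with. A hedged sketch of the check — the jar names and paths below are made up for illustration; in practice inspect the CLASSPATH that `bin/chukwa` assembles, or look for the jar under the Hadoop and Chukwa `lib/` directories:

```shell
# Simulate checking an assembled classpath string for commons-configuration.
# Replace this illustrative value with the real collector classpath.
classpath="/opt/chukwa-0.4.0/chukwa-core-0.4.0.jar:/opt/hadoop/lib/commons-logging-1.0.4.jar"
if printf '%s\n' "$classpath" | grep -q 'commons-configuration'; then
  echo "commons-configuration: present"
else
  echo "commons-configuration: MISSING from classpath"
fi
```

If it is missing, copying the jar from Hadoop's `lib/` into the directory the collector scripts put on the classpath is the usual remedy.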
> >>> Although I am able to start the agent using "bin/chukwa agent". I am
> >>> using Chukwa 0.4.0 on a single machine. The chukwa-collector-conf.xml
> >>> file looks like this -
> >>>
> >>> <configuration>
> >>>   <property>
> >>>     <name>chukwaCollector.writerClass</name>
> >>>     <value>org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter</value>
> >>>   </property>
> >>>
> >>>   <property>
> >>>     <name>chukwaCollector.pipeline</name>
> >>>     <value>org.apache.hadoop.chukwa.datacollection.writer.SocketTeeWriter,org.apache.hadoop.chukwa.datacollection.writer.SeqFileWriter</value>
> >>>   </property>
> >>>
> >>>   <!-- LocalWriter parameters
> >>>   <property>
> >>>     <name>chukwaCollector.localOutputDir</name>
> >>>     <value>/tmp/chukwa/dataSink/</value>
> >>>     <description>Chukwa local data sink directory, see LocalWriter.java</description>
> >>>   </property>
> >>>
> >>>   <property>
> >>>     <name>chukwaCollector.writerClass</name>
> >>>     <value>org.apache.hadoop.chukwa.datacollection.writer.localfs.LocalWriter</value>
> >>>     <description>Local chukwa writer, see LocalWriter.java</description>
> >>>   </property>
> >>>   -->
> >>>
> >>>   <property>
> >>>     <name>writer.hdfs.filesystem</name>
> >>>     <value>hdfs://localhost:9999/</value>
> >>>     <description>HDFS to dump to</description>
> >>>   </property>
> >>>
> >>>   <property>
> >>>     <name>chukwaCollector.outputDir</name>
> >>>     <value>/chukwa/logs/</value>
> >>>     <description>Chukwa data sink directory</description>
> >>>   </property>
> >>>
> >>>   <property>
> >>>     <name>chukwaCollector.rotateInterval</name>
> >>>     <value>300000</value>
> >>>     <description>Chukwa rotate interval (ms)</description>
> >>>   </property>
> >>>
> >>>   <property>
> >>>     <name>chukwaCollector.http.port</name>
> >>>     <value>8080</value>
> >>>     <description>The HTTP port number the collector will listen on</description>
> >>>   </property>
> >>> </configuration>
> >>>
> >>> And both the "collectors" and "agents" files have only one line, i.e.
> >>> "localhost".
> >>>
> >>> Many thanks in advance.
> >>> Regards,
> >>>     Mohammad Tariq
> >>
> >> --
> >> Ahmed Fathalla
> >
> > View this message in context:
> > http://apache-chukwa.679492.n3.nabble.com/Error-while-starting-the-collector-tp3506534p3506606.html

--
Ahmed Fathalla
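On the agent errors quoted earlier in the thread ("Cannot run program /usr/bin/sar" and the matching iostat failure): on Debian/Ubuntu both binaries are provided by the `sysstat` package, which is not always present on a fresh install. A small presence-check sketch (the package name assumes an apt-based system; the install command is left as a comment since it needs root):

```shell
# Check whether the system tools Chukwa's initial_adaptors invoke exist.
for tool in sar iostat; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"   # fix with: sudo apt-get install sysstat
  fi
done
```

If either tool is missing, the ExecAdaptor entries in `conf/initial_adaptors` will keep throwing `error=2, No such file or directory` until sysstat is installed or those adaptors are removed.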