incubator-chukwa-user mailing list archives

From AD <straightfl...@gmail.com>
Subject Re: How to begin with Chukwa
Date Thu, 10 Nov 2011 22:18:29 GMT
Do you have a /chukwa/logs directory?  If not, you may want to update this
property in collector-conf.xml:

 <property>
    <name>chukwaCollector.outputDir</name>
    <value>/chukwa/logs/</value>
    <description>Chukwa data sink directory</description>
  </property>
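If the sink directory doesn't exist yet, one way to create it (a sketch assuming a standard Hadoop CLI and the path above) is:

```shell
# Create the Chukwa data sink directory in HDFS (path from collector-conf.xml)
# and make sure the user running the collector can write to it.
hadoop fs -mkdir /chukwa/logs
hadoop fs -chown $(whoami) /chukwa/logs
hadoop fs -ls /chukwa
```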

Also try commenting out

 <property>
    <name>writer.hdfs.filesystem</name>
    <value>@TODO-COLLECTORS-NAMENODE@</value>
    <description>HDFS to dump to</description>
  </property>

to avoid the HDFS error.
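Separately, the NoClassDefFoundError for org/apache/commons/configuration/Configuration in your log usually means commons-configuration isn't on the collector's classpath. A sketch of one way to check and fix it (jar names and paths are illustrative for a 0.20-era Hadoop install):

```shell
# Look for the commons-configuration jar that ships with Hadoop ...
ls $HADOOP_HOME/lib/commons-configuration-*.jar
# ... and make it visible to the Chukwa collector, e.g. by copying it into
# Chukwa's lib directory, then restart the collector.
cp $HADOOP_HOME/lib/commons-configuration-*.jar $CHUKWA_HOME/lib/
```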

On Thu, Nov 10, 2011 at 4:51 PM, TARIQ <dontariq@gmail.com> wrote:

> Here is the collector.log file -
>
> 2011-11-11 03:20:37,394 INFO main ChukwaConfiguration - chukwaConf is /home/prashant/chukwa-0.4.0/bin/../conf
> 2011-11-11 03:20:38,960 INFO main root - initing servletCollector
> 2011-11-11 03:20:38,967 INFO main PipelineStageWriter - using pipelined writers, pipe length is 2
> 2011-11-11 03:20:38,972 INFO Thread-6 SocketTeeWriter - listen thread started
> 2011-11-11 03:20:38,979 INFO main SeqFileWriter - rotateInterval is 300000
> 2011-11-11 03:20:38,979 INFO main SeqFileWriter - outputDir is /chukwa/logs/
> 2011-11-11 03:20:38,979 INFO main SeqFileWriter - fsname is hdfs://localhost:9000/
> 2011-11-11 03:20:38,979 INFO main SeqFileWriter - filesystem type from core-default.xml is org.apache.hadoop.hdfs.DistributedFileSystem
> 2011-11-11 03:20:39,205 ERROR main SeqFileWriter - can't connect to HDFS, trying default file system instead (likely to be local)
> java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
>         at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
>         at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
>         at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:189)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
>         at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
>         at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
>         at org.apache.hadoop.chukwa.datacollection.writer.SeqFileWriter.init(SeqFileWriter.java:123)
>         at org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter.init(PipelineStageWriter.java:88)
>         at org.apache.hadoop.chukwa.datacollection.collector.servlet.ServletCollector.init(ServletCollector.java:112)
>         at org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:433)
>         at org.mortbay.jetty.servlet.ServletHolder.doStart(ServletHolder.java:256)
>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
>         at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:616)
>         at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
>         at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:513)
>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
>         at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
>         at org.mortbay.jetty.Server.doStart(Server.java:222)
>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
>         at org.apache.hadoop.chukwa.datacollection.collector.CollectorStub.main(CollectorStub.java:121)
> Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 29 more
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Fri, Nov 11, 2011 at 3:09 AM, AD [via Apache Chukwa]
> <[hidden email]> wrote:
>
> > can you share the collector.log file?
> > Also are you setting CHUKWA_HOME variable properly ?
> > the iostat error is a bit concerning.
> >
> > On Thu, Nov 10, 2011 at 2:53 PM, Mohammad Tariq <[hidden email]> wrote:
> >>
> >> Could you share the 'configured' Chukwa configuration files needed to
> >> run the agent and collector, if you have them handy? I'd like to
> >> compare them with mine and figure out what I am missing.
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >>
> >> On Fri, Nov 11, 2011 at 1:16 AM, Mohammad Tariq <[hidden email]> wrote:
> >> > This is the output on my terminal when I issue
> >> > "prashant@ubuntu:~/chukwa-0.4.0$ bin/chukwa agent" -
> >> > prashant@ubuntu:~/chukwa-0.4.0$ java.io.IOException: Cannot run program "/usr/bin/iostat": java.io.IOException: error=2, No such file or directory
> >> >        at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
> >> >        at java.lang.Runtime.exec(Runtime.java:593)
> >> >        at java.lang.Runtime.exec(Runtime.java:431)
> >> >        at java.lang.Runtime.exec(Runtime.java:328)
> >> >        at org.apache.hadoop.chukwa.inputtools.plugin.ExecPlugin.execute(ExecPlugin.java:66)
> >> >        at org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor$RunToolTask.run(ExecAdaptor.java:68)
> >> >        at java.util.TimerThread.mainLoop(Timer.java:512)
> >> >        at java.util.TimerThread.run(Timer.java:462)
> >> > Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
> >> >        at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
> >> >        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
> >> >        at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
> >> >        ... 7 more
> >> > java.io.IOException: Cannot run program "/usr/bin/sar": java.io.IOException: error=2, No such file or directory
> >> >        at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
> >> >        at java.lang.Runtime.exec(Runtime.java:593)
> >> >        at java.lang.Runtime.exec(Runtime.java:431)
> >> >        at java.lang.Runtime.exec(Runtime.java:328)
> >> >        at org.apache.hadoop.chukwa.inputtools.plugin.ExecPlugin.execute(ExecPlugin.java:66)
> >> >        at org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor$RunToolTask.run(ExecAdaptor.java:68)
> >> >        at java.util.TimerThread.mainLoop(Timer.java:512)
> >> >        at java.util.TimerThread.run(Timer.java:462)
> >> > Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
> >> >        at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
> >> >        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
> >> >        at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
> >> >        ... 7 more
> >> >
> >> > and when I issue "prashant@ubuntu:~/chukwa-0.4.0$ bin/chukwa
> >> > collector" I get this -
> >> > prashant@ubuntu:~/chukwa-0.4.0$ 2011-11-11 01:04:25.851::INFO:
> >> > Logging to STDERR via org.mortbay.log.StdErrLog
> >> > 2011-11-11 01:04:25.894::INFO:  jetty-6.1.11
> >> >
> >> > Regards,
> >> >     Mohammad Tariq
> >> >
> >> >
> >> >
> >> > On Fri, Nov 11, 2011 at 1:12 AM, Mohammad Tariq <[hidden email]>
> wrote:
> >> >> Hello AD,
> >> >>  this my initial_adaptors file -
> >> >>
> >> >> add org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor Iostat 60 /usr/bin/iostat -x -k 55 2 0
> >> >> add org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor Df 60 /bin/df -l 0
> >> >> add org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor Sar 60 /usr/bin/sar -q -r -n ALL 55 0
> >> >> add org.apache.hadoop.chukwa.datacollection.adaptor.ExecAdaptor Top 60 /usr/bin/top -b -n 1 -c 0
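For the record, the Iostat and Sar adaptors above shell out to /usr/bin/iostat and /usr/bin/sar, which the "Cannot run program" errors earlier in the thread suggest are missing; on Debian/Ubuntu both come from the sysstat package. A sketch (assuming an Ubuntu host):

```shell
# iostat and sar are provided by sysstat on Debian/Ubuntu
sudo apt-get install sysstat
# verify the paths the ExecAdaptor lines expect
which iostat sar
```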
> >> >>
> >> >> and this is the collectors file -
> >> >> localhost
> >> >>
> >> >> I am trying to run both agent and collector on the same machine.
> >> >>
> >> >> The collector.log file looks like this -
> >> >> 2011-11-11 01:04:25,180 INFO main ChukwaConfiguration - chukwaConf is /home/prashant/chukwa-0.4.0/bin/../conf
> >> >> 2011-11-11 01:04:25,434 INFO main root - initing servletCollector
> >> >> 2011-11-11 01:04:25,438 INFO main PipelineStageWriter - using pipelined writers, pipe length is 2
> >> >> 2011-11-11 01:04:25,448 INFO main SeqFileWriter - rotateInterval is 300000
> >> >> 2011-11-11 01:04:25,448 INFO main SeqFileWriter - outputDir is /chukwa/logs/
> >> >> 2011-11-11 01:04:25,448 INFO main SeqFileWriter - fsname is hdfs://localhost:9000/
> >> >> 2011-11-11 01:04:25,448 INFO main SeqFileWriter - filesystem type from core-default.xml is org.apache.hadoop.hdfs.DistributedFileSystem
> >> >> 2011-11-11 01:04:25,455 INFO Thread-6 SocketTeeWriter - listen thread started
> >> >> 2011-11-11 01:04:25,593 ERROR main SeqFileWriter - can't connect to HDFS, trying default file system instead (likely to be local)
> >> >> java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
> >> >>        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
> >> >>        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
> >> >>        at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
> >> >>        at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:189)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
> >> >>        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
> >> >>        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
> >> >>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
> >> >>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
> >> >>        at org.apache.hadoop.chukwa.datacollection.writer.SeqFileWriter.init(SeqFileWriter.java:123)
> >> >>        at org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter.init(PipelineStageWriter.java:88)
> >> >>        at org.apache.hadoop.chukwa.datacollection.collector.servlet.ServletCollector.init(ServletCollector.java:112)
> >> >>        at org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:433)
> >> >>        at org.mortbay.jetty.servlet.ServletHolder.doStart(ServletHolder.java:256)
> >> >>        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >> >>        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:616)
> >> >>        at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
> >> >>        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:513)
> >> >>        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >> >>        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
> >> >>        at org.mortbay.jetty.Server.doStart(Server.java:222)
> >> >>        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
> >> >>        at org.apache.hadoop.chukwa.datacollection.collector.CollectorStub.main(CollectorStub.java:121)
> >> >> Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
> >> >>        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >> >>        at java.security.AccessController.doPrivileged(Native Method)
> >> >>        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >> >>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >> >>        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >> >>        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >> >>        ... 29 more
> >> >>
> >> >> Regards,
> >> >>     Mohammad Tariq
> >> >>
> >> >>
> >> >>
> >> >> On Thu, Nov 10, 2011 at 8:30 PM, AD <[hidden email]> wrote:
> >> >>> what happens when you tail the collector.log file?
> >> >>> can you share your initial_adaptors and collectors file in conf?
> >> >>>
> >> >>> On Thu, Nov 10, 2011 at 8:13 AM, Mohammad Tariq <[hidden email]>
> >> >>> wrote:
> >> >>>>
> >> >>>> Hello AD,
> >> >>>>   I am sorry for leaving the discussion in between; I was busy
> >> >>>> with my exams. Now I am back and need your help. I started with
> >> >>>> what you have suggested and read about the concepts (agents,
> >> >>>> collectors, etc.). Now I am trying to collect data from a file on
> >> >>>> the local file system and dump it into HDFS, running the agent
> >> >>>> and collector on the same machine. I was able to start the agent,
> >> >>>> but when I tried to start the collector I got the following line
> >> >>>> at the terminal and the terminal got stuck there -
> >> >>>> tariq@ubuntu:~/chukwa-0.4.0$ 2011-11-10 18:42:09.521::INFO:
> >> >>>> Logging to STDERR via org.mortbay.log.StdErrLog
> >> >>>> 2011-11-10 18:42:09.544::INFO:  jetty-6.1.11
> >> >>>>
> >> >>>> When I issued jps, I could see only the agent.
> >> >>>>
> >> >>>> Regards,
> >> >>>>     Mohammad Tariq
> >> >>>>
> >> >>>>
> >> >>>>
> >> >>>> On Sat, Nov 5, 2011 at 5:38 AM, Bill Graham <[hidden email]>
> wrote:
> >> >>>> > One difference between Chukwa and Flume is that in Chukwa you
> >> >>>> > configure each component individually to know how to handle
> >> >>>> > different datatypes. This makes it fairly straightforward to set
> >> >>>> > up, but can be painful to manage for very large installs (so
> >> >>>> > I've heard). Flume centralized the configs of "flows", which
> >> >>>> > helps when managing larger installs, at the cost of additional
> >> >>>> > complexity.
> >> >>>> >
> >> >>>> > On Fri, Nov 4, 2011 at 7:02 AM, AD <[hidden email]>
wrote:
> >> >>>> >>
> >> >>>> >> Are you running the agent and collector on the same machine?
> >> >>>> >> Off the cuff it's something like:
> >> >>>> >> 1 - set your CHUKWA_HOME env variable
> >> >>>> >> 2 - edit conf/chukwa-env.sh and make sure JAVA_HOME and all
> >> >>>> >> the vars look good
> >> >>>> >> 3 - edit conf/initial_adaptors with the adaptors you need.
> >> >>>> >> note here that the dataType
> >> >>>> >> 4 - edit conf/collectors and add the hostname/ip of the
> >> >>>> >> collectors
> >> >>>> >> 5 - edit conf/agents and add ip/hostname of the agents
> >> >>>> >> 6 - edit conf/chukwa-demux.conf and make sure you have a demux
> >> >>>> >> processor for your dataType (name from initial_adaptors)
> >> >>>> >> On agent node run bin/chukwa agent, tail ${CHUKWA_LOGS}/agent.log
> >> >>>> >> On collector node run bin/chukwa collector, tail
> >> >>>> >> ${CHUKWA_LOGS}/collector.log
> >> >>>> >> On collector node run bin/chukwa Demux, tail
> >> >>>> >> ${CHUKWA_LOGS}/Demux.log
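The steps quoted above can be sketched as a shell session (paths and hostnames are illustrative, assuming a single-node setup with a chukwa-0.4.0 tarball layout):

```shell
# one-node sketch: agent and collector on the same box
export CHUKWA_HOME=~/chukwa-0.4.0        # step 1
cd "$CHUKWA_HOME"
$EDITOR conf/chukwa-env.sh               # step 2: check JAVA_HOME and friends
$EDITOR conf/initial_adaptors            # step 3: adaptors and their dataType
echo localhost > conf/collectors         # step 4
echo localhost > conf/agents             # step 5
$EDITOR conf/chukwa-demux.conf           # step 6: demux processor per dataType
bin/chukwa agent                         # then: tail ${CHUKWA_LOGS}/agent.log
bin/chukwa collector                     # then: tail ${CHUKWA_LOGS}/collector.log
```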
> >> >>>> >>
> >> >>>> >>
> >> >>>> >> On Fri, Nov 4, 2011 at 9:48 AM, Mohammad Tariq <[hidden
email]>
> >> >>>> >> wrote:
> >> >>>> >>>
> >> >>>> >>> Hello AD,
> >> >>>> >>>   Yes, I have read the wiki and right now I am trying to set
> >> >>>> >>> up a Chukwa cluster using the wiki itself. Are there any
> >> >>>> >>> other links you are aware of that can help me learn and use
> >> >>>> >>> Chukwa in a better fashion? I am trying to use Chukwa for
> >> >>>> >>> data in "The Wellsite Information Transfer Standard Markup
> >> >>>> >>> Language (WITSML)" format.
> >> >>>> >>>
> >> >>>> >>> Regards,
> >> >>>> >>>     Mohammad Tariq
> >> >>>> >>>
> >> >>>> >>>
> >> >>>> >>> On Fri, Nov 4, 2011 at 6:49 PM, AD <[hidden
email]> wrote:
> >> >>>> >>> >
> >> >>>> >>> > I think it helps to read the wiki and understand the basic
> >> >>>> >>> > concepts, like agents and collectors.  What kind of data do
> >> >>>> >>> > you want to parse?  I have used both Flume and Chukwa
> >> >>>> >>> > myself and can say for sure Chukwa is miles easier to work
> >> >>>> >>> > with and configure.  Flume may have some more advanced
> >> >>>> >>> > features, but so far I found Chukwa both easy to extend
> >> >>>> >>> > from a development standpoint, and much easier to
> >> >>>> >>> > configure.
> >> >>>> >>> > Have you read the wiki pages?
> >> >>>> >>> >
> >> >>>> >>> > On Fri, Nov 4, 2011 at 8:39 AM, Mohammad
Tariq <[hidden
> email]>
> >> >>>> >>> > wrote:
> >> >>>> >>> >>
> >> >>>> >>> >> Hello list,
> >> >>>> >>> >>   I am totally new to Chukwa and I have to learn and
> >> >>>> >>> >> start using it as soon as possible as part of my new
> >> >>>> >>> >> project. Could anyone let me know how to begin? It would
> >> >>>> >>> >> be a great favor to me. I know Flume and have used it
> >> >>>> >>> >> already. Also, I would like to know how different Chukwa
> >> >>>> >>> >> is from Flume.
> >> >>>> >>> >>   Many thanks in advance.
> >> >>>> >>> >>
> >> >>>> >>> >>   Regards,
> >> >>>> >>> >>     Mohammad Tariq
> >> >>>> >>> >
> >> >>>> >>
> >> >>>> >
> >> >>>> >
> >> >>>
> >> >>>
> >> >>
> >> >
> >
> >
> >
