chukwa-user mailing list archives

From Ariel Rabkin <asrab...@gmail.com>
Subject Re: Chukwa setup question
Date Wed, 21 Oct 2009 03:06:46 GMT
Howdy!

The code to harvest Hadoop logs requires Hadoop 0.20.  The agents and
collectors themselves should be fine with 0.18 or above.  I use 0.20
throughout, and recommend that.  With Hadoop 0.20, you have a choice
of configuration files.  You can *either* use
mapred-site/hdfs-site/core-site, OR you can stick everything in
hadoop-site.  The first approach is preferred, since hadoop-site is
going away in the future.
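To make the preferred layout concrete, here's a sketch of what the split conf/ directory looks like under Hadoop 0.20. The scratch directory and the empty `<configuration>` bodies are placeholders, not values from this thread; point at your real conf/ directory instead.

```shell
# Sketch only: the preferred Hadoop 0.20 layout keeps settings split
# across three files instead of a single hadoop-site.xml.
# A scratch dir stands in for your real conf/ directory.
CONF=$(mktemp -d)
for f in core-site hdfs-site mapred-site; do
  cat > "$CONF/$f.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
</configuration>
EOF
done
ls "$CONF"
```

The hack of cat-ing everything into one hadoop-site.xml still works on 0.20, but since that file is deprecated, the three-file split is the safer bet going forward.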

You do not need to start Chukwa from init.  You have a bunch of other
options, depending on how you want to do your daemon management.
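For example, one option that doesn't involve /etc/init.d at all is to daemonize the script yourself with nohup. In this sketch, `sleep 30` stands in for bin/agent.sh so the snippet runs anywhere; swap in the real script on your cluster.

```shell
# Sketch: run a foreground script as a daemon with nohup, no init needed.
# "sleep 30" is a stand-in for bin/agent.sh.
PIDFILE=$(mktemp)                  # stands in for a real pid file location
nohup sleep 30 >/dev/null 2>&1 &
echo $! > "$PIDFILE"               # remember the PID so we can stop it later
kill "$(cat "$PIDFILE")"           # the matching "stop" step
```

The same pattern works for the collector; anything that keeps the process alive after you log out (nohup, screen, a supervisor of your choice) is fine.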

The minimum you need to run Chukwa is an agent on each machine you're
monitoring, plus a collector to write the collected data to HDFS.  The
basic command to start an agent is bin/agent.sh; the basic command to
start a collector is bin/jettyCollector.sh.

If you want to start a bunch of agents, you can use the
bin/start-agents.sh script. This just uses ssh to start agents on a
list of machines given in conf/agents. It's exactly parallel to
Hadoop's start-dfs.sh script.  There's also a bin/start-collectors.sh
that does the same to start collectors on the machines listed in
conf/collectors.  One hostname per line.
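To make the file format concrete, here's a sketch of writing conf/agents and conf/collectors. The hostnames and the scratch directory are placeholders for your own machines and Chukwa's real conf/ directory.

```shell
# Sketch: conf/agents and conf/collectors are plain text files,
# one hostname per line.  These hostnames are placeholders.
CHUKWA_CONF=$(mktemp -d)            # stands in for Chukwa's conf/ dir
printf '%s\n' node1.example.com node2.example.com > "$CHUKWA_CONF/agents"
printf '%s\n' collector1.example.com > "$CHUKWA_CONF/collectors"
wc -l < "$CHUKWA_CONF/agents"
```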

There are also stop scripts that do the exact opposite of the start commands.

I'm not sure if that answered your question. Let me know if not.

--Ari

On Tue, Oct 20, 2009 at 6:30 PM, Sagar <snaik@attributor.com> wrote:
> I'm trying to install the new Chukwa release
> (http://people.apache.org/~asrabkin/chukwa-0.3.0-candidate-3/)
>
> I am a bit confused:
> I tried to compile it with the hadoop-18 jars. It wouldn't, so I tried
> with hadoop-20 and it did.
> So I assumed we need hadoop-20, and I did the hadoop-20 setup.
>
> Now, as I go through admin.pdf (in the tar), it says to link to
> hadoop-site.xml (which is in hadoop-18 and not in hadoop-20).
> So my questions are:
> - what version of hadoop do we need?
> - if we need 20, do we need hdfs-site.xml / mapred-site.xml / core-site.xml?
>  Or, as a hack, cat all 3 files into conf/hadoop-site.xml? Or do we need all of them?
>
> I'm on Ubuntu.
> I don't have /etc/init.d/functions in either path. As a result, I can't
> start anything. Can you suggest a hack or simple start commands?
>
>
>
> Thanks,
> -Sagar
>



-- 
Ari Rabkin asrabkin@gmail.com
UC Berkeley Computer Science Department
