incubator-chukwa-user mailing list archives

From Ariel Rabkin <>
Subject Re: Chukwa questions
Date Fri, 09 Jul 2010 16:19:33 GMT

This question should probably go to the Chukwa-user list, not the Hadoop lists.

You do not need Hadoop on the machines running Chukwa agents. You don't
even technically need it on the machines hosting collectors -- you can
point a collector at a Hadoop cluster across the network, if you like.
That said, best practice is to put collectors on machines that also run
Hadoop DataNodes.
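
To illustrate the cross-network setup, here is a sketch of the relevant part of a collector's chukwa-collector-conf.xml. The property names follow the stock Chukwa collector configuration; the hostname and ports are placeholders for your own cluster, so treat this as an assumption-laden example rather than a drop-in config:

```xml
<!-- chukwa-collector-conf.xml (sketch; hostname/ports are placeholders) -->
<configuration>
  <property>
    <name>writer.hdfs.filesystem</name>
    <!-- The HDFS the collector writes to; it can be a remote cluster,
         so the collector host itself need not run Hadoop daemons. -->
    <value>hdfs://namenode.example.com:9000/</value>
  </property>
  <property>
    <name>chukwaCollector.http.port</name>
    <!-- Port on which the collector accepts chunks posted by agents. -->
    <value>8080</value>
  </property>
</configuration>
```

With this in place, agents only need the Chukwa agent process and network access to the collector's HTTP port; the collector in turn only needs network access to the NameNode and DataNodes.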

"Hadoop" per se mostly refers to mapreduce + HDFS.

On Fri, Jul 9, 2010 at 8:33 AM, Blargy <> wrote:
> I am looking into Chukwa to collect/aggregate our search logs from across
> multiple hosts. As I understand it, I need to have an agent/adaptor running on
> each host, which in turn forwards data to a collector (across the
> network), which will then write out to HDFS. Correct?
> Does Hadoop need to be installed on the host machines that are running the
> agent/adaptors, or just Chukwa? Is Hadoop by itself anything, or is Hadoop
> just a collection of tools... HDFS, Hive, Chukwa, Mahout, etc?
> Thanks

Ari Rabkin
UC Berkeley Computer Science Department
