hadoop-common-user mailing list archives

From Blargy <zman...@hotmail.com>
Subject Chukwa questions
Date Fri, 09 Jul 2010 15:33:52 GMT

I am looking into Chukwa to collect/aggregate our search logs from across
multiple hosts. As I understand it, I need to have an agent/adaptor running on
each host, which in turn forwards the data to a collector (across the
network), which then writes it out to HDFS. Correct?
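For concreteness, here is roughly what I picture running on each search host,
going by the examples in the Chukwa docs (the collector hostname, port, log
path, and adaptor class are placeholders I copied from those examples, so
please correct me if I have the layout wrong):

    # conf/collectors on each agent host: where the agent ships its chunks
    http://collector01.example.com:8080/

    # conf/initial_adaptors: tail our search log and tag it with a data type
    add filetailer.FileTailingAdaptor SearchLog /var/log/search/search.log 0

If that is right, the collector would be the only piece that talks to HDFS.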

Does Hadoop need to be installed on the host machines that are running the
agents/adaptors, or just Chukwa? Also, is Hadoop itself a single thing, or is
it just a collection of tools... HDFS, Hive, Chukwa, Mahout, etc.?

View this message in context: http://lucene.472066.n3.nabble.com/Chukwa-questions-tp954643p954643.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.
