hadoop-hdfs-user mailing list archives

From Yatong Zhang <bluefl...@gmail.com>
Subject Need some recommendations on hardware
Date Tue, 23 Dec 2014 12:30:12 GMT
Hi there,
I am going to build a 30-node cluster, and the basic idea is:
1. Hadoop for the base distributed file system, and Spark for the map-reduce processing
2. HBase for data storage
3. Kafka for ingesting outside data
4. Storm to read messages from Kafka and write them to HBase and Solr
5. Solr to index the data and provide the search & query services

I am planning to build this with commodity PC hardware: i5 or i7 CPUs,
16–32 GB of memory, and possibly SSDs.
So I would appreciate some suggestions/recommendations on:
1. Hardware recommendations for each subsystem (HBase, Kafka, Solr, etc.)
2. The number of machines for each subsystem

I have about 50M messages per day, each message about 400–600 bytes,
with about 10 fields to index.
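For a rough sense of scale, here is a back-of-envelope sizing sketch for the numbers above. The 500-byte average and the HDFS replication factor of 3 are assumptions on my part (the midpoint of the stated message size and the HDFS default), not measured figures:

```python
# Back-of-envelope sizing for ~50M messages/day.
# Assumptions (not from the message itself):
#   - 500-byte average message size (midpoint of 400-600 bytes)
#   - HDFS default replication factor of 3
MESSAGES_PER_DAY = 50_000_000
AVG_MSG_BYTES = 500
REPLICATION = 3

raw_gb_per_day = MESSAGES_PER_DAY * AVG_MSG_BYTES / 1e9
replicated_gb_per_day = raw_gb_per_day * REPLICATION
raw_tb_per_year = raw_gb_per_day * 365 / 1000

print(f"raw ingest:        {raw_gb_per_day:.1f} GB/day")        # 25.0 GB/day
print(f"after replication: {replicated_gb_per_day:.1f} GB/day")  # 75.0 GB/day
print(f"raw per year:      {raw_tb_per_year:.2f} TB/year")       # 9.12 TB/year
```

So the raw ingest is on the order of 25 GB/day (roughly 75 GB/day after HDFS replication, before compression and before counting the Solr index), which may help frame the disk-capacity side of the hardware question.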

Thanks and any suggestions are appreciated~
