hadoop-common-user mailing list archives

From erolagnab <trung....@gmail.com>
Subject How much RAM is needed...
Date Sun, 15 Jul 2007 15:04:38 GMT

I have an HDFS cluster with 2 datanodes and 1 namenode on 3 different machines,
each with 2 GB of RAM.
Datanode A holds around 700,000 blocks and Datanode B holds 1,200,000+ blocks.
The namenode fails to start with an out-of-memory error while adding Datanode B
to its rack. I have raised the Java heap to 1600 MB, which is the maximum my
setup allows, but it still runs out of memory.
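
For reference, this is how I raised the heap, in conf/hadoop-env.sh
(HADOOP_HEAPSIZE is in MB; on my 32-bit JVM anything much above 1600
refuses to start):

    # conf/hadoop-env.sh
    # Maximum heap, in MB, given to each Hadoop daemon (namenode included).
    export HADOOP_HEAPSIZE=1600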

AFAIK, the namenode loads all block information into memory. If so, is there
any way to estimate how much RAM an HDFS cluster needs for a given number of
blocks on each datanode?
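
To make the question concrete, this is the kind of back-of-envelope estimate
I am after; the 150-bytes-per-block figure below is my own guess, not
something I have verified, and it ignores the objects that files and
directories take on top of blocks:

    public class NamenodeHeapEstimate {
        // Assumed, not measured: roughly 150 bytes of namenode heap per
        // block object; the real figure depends on Hadoop version and JVM.
        private static final long BYTES_PER_BLOCK = 150L;

        public static void main(String[] args) {
            long blocks = 700000L + 1200000L;          // datanode A + datanode B
            long heapBytes = blocks * BYTES_PER_BLOCK; // lower-bound estimate
            System.out.println("Estimated heap: "
                    + (heapBytes / (1024L * 1024L)) + " MB");
        }
    }

By that guess my ~1.9 million blocks would need only a few hundred MB, far
below the 1600 MB heap, which is why I suspect my per-block assumption is
wrong and would like to know the real numbers.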
-- 
View this message in context: http://www.nabble.com/How-much-RAMs-needed...-tf4082367.html#a11603027
Sent from the Hadoop Users mailing list archive at Nabble.com.

