hadoop-common-user mailing list archives

From jcuencaa <jordi.cuenca.aub...@everis.com>
Subject Server sizing Lucene + Hadoop
Date Thu, 31 May 2012 14:19:26 GMT
Hello! 
I need to do capacity planning (server sizing) for a Lucene + Hadoop
deployment; that is, plan how many servers and what hardware (CPU, memory,
etc.) I need in order to handle the maximum workload my organization
requires in a given period.
I haven’t found documentation about this on the Lucene or Hadoop sites, or
at least about which factors should be taken into account when doing the
server sizing. Obviously sizing depends on many factors, but for application
servers or web servers, for example, sizing is normally done by inferring
hardware needs from benchmarks used as a baseline.
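As an illustration of the kind of estimate I mean, here is a rough
back-of-envelope storage sizing sketch; every figure in it is a hypothetical
assumption I made up for the example, not a measured benchmark:

// Hypothetical back-of-envelope HDFS storage sizing sketch.
// All input figures are illustrative assumptions, not measurements.
public class ClusterSizingSketch {
    public static void main(String[] args) {
        double rawDataTb = 20.0;          // assumed raw data volume to store/index (TB)
        double replicationFactor = 3.0;   // HDFS default replication factor
        double overheadFactor = 1.25;     // assumed headroom for temp/intermediate output
        double usableDiskPerNodeTb = 8.0; // assumed usable disk per DataNode (TB)

        // Total raw capacity required across the cluster.
        double totalStorageTb = rawDataTb * replicationFactor * overheadFactor;

        // Minimum number of DataNodes needed to hold that capacity.
        int dataNodes = (int) Math.ceil(totalStorageTb / usableDiskPerNodeTb);

        System.out.printf("Total storage needed: %.1f TB, DataNodes: %d%n",
                totalStorageTb, dataNodes);
    }
}

I'm looking for guidance on how to do this kind of estimate properly for
CPU, memory, and indexing/query throughput as well, ideally against some
published baseline.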
I’d be grateful if someone could help me.
Thanks in advance.

--
View this message in context: http://lucene.472066.n3.nabble.com/Server-sizing-Lucene-Hadoop-tp3987033.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.
