hadoop-common-user mailing list archives

From Ted Dunning <tdunn...@veoh.com>
Subject Re: OutOfMemory
Date Sun, 01 Jul 2007 21:10:48 GMT

If you are using machines with only 512MB of memory, it is probably a very
bad idea to set the minimum heap size so large.

-Xms400M might be more appropriate.
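For reference, the flags handed to each task JVM are normally set through the
mapred.child.java.opts property in hadoop-site.xml; a minimal sketch for a
512MB node (property name as in the stock 0.x configs, values purely
illustrative, not a recommendation):

  <property>
    <name>mapred.child.java.opts</name>
    <!-- Keep the maximum heap well below physical RAM so the tasktracker,
         datanode, and OS still fit. -Xms, if set at all, must not exceed
         -Xmx. -->
    <value>-Xms200m -Xmx400m</value>
  </property>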

I should say, though, that if you have a program that is worth running on
Hadoop, you have a problem that is worth putting more memory in each machine.
Most of the work I do benefits more from memory than from processor, at
least up to 1-2GB of RAM.

On 6/30/07 11:51 AM, "Avinash Lakshman" <alakshman@facebook.com> wrote:

> There is an element in the config for Java params. Set it to -Xms1024M
> and give it a shot. It definitely seems like a case of you running out
> of heap space.
> 
> A
> -----Original Message-----
> From: Emmanuel JOKE [mailto:jokeout@gmail.com]
>  ...
> My cluster of 2 machines has 512 MB of memory on each. Isn't that enough?
> What is the best practice?
> 
> Do you have any idea whether this is a bug, or is it just my conf that
> is not correct?
> 
> Thanks for your help
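
On the sizing question above: the heaps of the Hadoop daemons themselves
(namenode, datanode, jobtracker, tasktracker) are controlled separately from
the task JVMs, through HADOOP_HEAPSIZE in conf/hadoop-env.sh. A sketch only,
assuming the stock 0.x variable; the figure is illustrative for a 512MB box,
not a recommendation:

  # conf/hadoop-env.sh
  # Maximum heap, in MB, given to each Hadoop daemon started on this node.
  # The shipped default (1000) is more than a 512MB machine can back with
  # real memory, so it is lowered here.
  export HADOOP_HEAPSIZE=256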

