hadoop-common-dev mailing list archives

From Davanum Srinivas <dava...@gmail.com>
Subject Re: Out of Memory Exception while building hadoop
Date Mon, 05 Sep 2011 02:36:19 GMT
John,

Your box is probably using gcj or some other non-Sun JVM
(http://jeffchannell.com/Flex-3/gc-warning.html); you may want to try
the Sun Java 6 JDK instead.

-- dims

On Sun, Sep 4, 2011 at 10:26 PM, Rottinghuis, Joep
<jrottinghuis@ebay.com> wrote:
> Hi John,
>
> I have not seen this with Cocoon, but I have seen it with FindBugs.
> For those errors I had to pass an additional parameter to give FindBugs more memory.
>
> What else is running on the box? How much heap is available?
>
> Cheers,
>
> Joep
> ________________________________________
> From: john smith [js1987.smith@gmail.com]
> Sent: Sunday, September 04, 2011 9:50 AM
> To: common-user@hadoop.apache.org; common-dev@hadoop.apache.org
> Subject: Out of Memory Exception while building hadoop
>
> Hey folks,
>
> Strangely, I get an out-of-memory exception while building Hadoop from source.
> I have 2 GB of RAM, and I've tried building from both Eclipse and the
> command line.
>
> http://pastebin.com/9pcHg1P9 has the full stack trace. Can anyone help me
> out with this?
>
> Thanks,
> John Smith
>
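The FindBugs heap fix Joep mentions can be sketched like this, assuming an Ant-based build. The exact property name differs between build files, so treat the values below as illustrative, not as the precise Hadoop build invocation:

```shell
# Sketch, assuming an Ant-driven build: give the Ant JVM more heap via
# ANT_OPTS, which Ant's launcher script reads at startup.
export ANT_OPTS="-Xmx1024m"
echo "$ANT_OPTS"

# The FindBugs Ant task also accepts a jvmargs attribute for the JVM it
# forks, e.g. in build.xml (attribute value is an example, not from the
# original thread):
#   <findbugs home="${findbugs.home}" jvmargs="-Xmx1024m" ...>
```

Raising `ANT_OPTS` helps when Ant itself (or a task run in-process) exhausts the heap; `jvmargs` is needed when FindBugs is forked into its own JVM, since a forked process does not inherit Ant's heap settings.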



-- 
Davanum Srinivas :: http://davanum.wordpress.com
