hadoop-common-dev mailing list archives

From "Cagdas Gerede" <cagdas.ger...@gmail.com>
Subject Re: Hadoop and VMware
Date Tue, 06 May 2008 16:03:07 GMT
See these:

Heap size: 16 GB
https://issues.apache.org/jira/browse/HADOOP-3248?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel

Heap size: 3.2 GB
https://issues.apache.org/jira/browse/HADOOP-3022?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12594019#action_12594019
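For context, the daemon heap is controlled by HADOOP_HEAPSIZE in conf/hadoop-env.sh (a value in MB, with 1000 as the shipped default), while the heap of the map/reduce task JVMs is set separately via mapred.child.java.opts. A minimal sketch for a memory-constrained VM; the 512 value below is illustrative, not a recommendation:

```shell
# conf/hadoop-env.sh -- maximum heap for the Hadoop daemons, in MB.
# The shipped default is 1000; lower it when the guest VM has little RAM.
export HADOOP_HEAPSIZE=512

# Optional extra JVM flags for the daemons (e.g. a fixed initial heap):
# export HADOOP_OPTS="-Xms256m"
```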

Cagdas

-- 
------------
Best Regards, Cagdas Evren Gerede
Home Page: http://cagdasgerede.info


On Tue, May 6, 2008 at 6:33 AM, Ahmad Humayun <ahmad.humyn@gmail.com> wrote:

> I set the value to -Xms512m, and it now works with 512 MB assigned to my VM :)
>
> Is there a recommended heap size I should use with Hadoop? Is 512 MB too
> little?
>
>
> Regards,
>
> On Tue, May 6, 2008 at 6:17 PM, Ahmad Humayun <ahmad.humyn@gmail.com>
> wrote:
>
> > 32 bit JVM
> >
> >
> > > On Tue, May 6, 2008 at 3:20 PM, Steve Loughran <stevel@apache.org>
> > > wrote:
> >
> > > Ahmad Humayun wrote:
> > >
> > > > Just tried with 512 MB and 1 GB....and guess what .... it started
> > > > (finally!!) working at a GB.
> > > >
> > > > Is there a way to lower this requirement?.....I'll also just try to
> > > > home in on the minimum amount of RAM needed.
> > > >
> > > > I really can't afford to use a GB of RAM for my VM....I will soon
> > > > run out of juice that way :(
> > > >
> > > >
> > > > Regards,
> > > >
> > > > On Tue, May 6, 2008 at 12:49 AM, Ahmad Humayun <ahmad.humyn@gmail.com>
> > > > wrote:
> > > >
> > > > > Well my VM is allocated 256 MB.....I'll just increase it and
> > > > > report back.
> > > > >
> > > > > Plus I have just tried HelloWorld programs....and since they
> > > > > hardly have any memory usage, they work.
> > > > >
> > > > >
> > > > > Regards,
> > > > >
> > > > >
> > > > > On Tue, May 6, 2008 at 12:41 AM, Christophe Taton <taton@apache.org>
> > > > > wrote:
> > > > >
> > > > >  Hi Ahmad,
> > > > > >
> > > > > > As the error message suggests, your issue is likely related to
> > > > > > the amount of memory available:
> > > > > >    Error occurred during initialization of VM
> > > > > >   Could not reserve enough space for object heap
> > > > > >
> > > > > > How much memory did you allocate to your VM? Can you run any
> > > > > > other Java applications in your JVM?
> > > > > >
> > > > > > Christophe
> > > > > >
> > > > > > On Mon, May 5, 2008 at 9:33 PM, Ahmad Humayun <ahmad.humyn@gmail.com>
> > > > > > wrote:
> > > > > >
> > > > > >  Hi there,
> > > > > > >
> > > > > > > Has anybody tried running Hadoop on VMware (6.0)? I have
> > > > > > > installed openSUSE 10.2 as a guest OS....and I have been trying
> > > > > > > to get Hadoop started, but whatever I do with bin/hadoop, I keep
> > > > > > > getting this error:
> > > > > > > *Error occurred during initialization of VM
> > > > > > > Could not reserve enough space for object heap
> > > > > > > Could not create the Java virtual machine.*
> > > > > > >
> > > > > > > Any ideas? Is it a problem with VMware? Or maybe my Java
> > > > > > > environment setup? Or am I simply doing something wrong in
> > > > > > > setting up Hadoop?
> > > > > > >
> > > > > >
> > > Are you using a 64-bit or a 32-bit JVM?
> > >
> >
> >
> >
> > --
> > Ahmad Humayun
> > Research Assistant
> > Computer Science Dpt., LUMS
> > +92 321 4457315
> >
>
>
>
> --
> Ahmad Humayun
> Research Assistant
> Computer Science Dpt., LUMS
> +92 321 4457315
>
