ambari-user mailing list archives

From Darpan Patel <darpa...@gmail.com>
Subject Re: Ambari Server going Out Of Memory?
Date Tue, 01 Dec 2015 15:58:39 GMT
Hi Myroslav,

You were correct. Of the 14 GB of physical memory, only around 150 MB was
available. I manually killed a few processes, but when I run the command to
check the biggest memory eaters I see many hung processes related to Hive.
I am not sure how so much memory got eaten up. Any guidelines on killing
these processes?
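
For what it's worth, here is roughly the kind of check I mean. This is only
a minimal sketch, assuming the psutil package is installed (pip install
psutil); the "hive" substring match and the top-10 cutoff are just
illustrative:

    # List the largest processes by resident memory (RSS), flagging any
    # whose command line mentions "hive". Minimal sketch; needs psutil.
    import psutil

    rows = []
    for p in psutil.process_iter(attrs=["pid", "name", "memory_info", "cmdline"]):
        info = p.info
        if info["memory_info"] is None:  # access denied to this process
            continue
        cmd = " ".join(info["cmdline"] or [])
        rows.append((info["memory_info"].rss, info["pid"], info["name"], cmd))

    # Print the ten biggest consumers, largest first.
    for rss, pid, name, cmd in sorted(rows, reverse=True)[:10]:
        flag = "  <-- hive?" if "hive" in cmd.lower() else ""
        print("%6d MB  pid=%-6d %s%s" % (rss // (1024 * 1024), pid, name, flag))

    # Only terminate a PID after confirming what it actually is:
    #   psutil.Process(pid).terminate()  # SIGTERM; .kill() sends SIGKILL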


On 1 December 2015 at 15:27, Myroslav Papyrkovskyy <mpapyrkovskyy@hortonworks.com> wrote:

> Hello. This is a JVM error, not an Ambari one.
> It basically says that there's not enough memory on the host to commit a
> heap of the allowed size (2 GB by default).
> In your case it tries to allocate ~400 MB (402653184 bytes = 384 MB) and
> is unable to do so. Maybe some other application consumed all the
> available memory.
>
> Regards,
> Myroslav
>
> > On 1 Dec 2015, at 17:19, Darpan R <darpanbe@gmail.com> wrote:
> >
> > Hi all,
> >
> > Ambari was working fine in production for the last 30 days, and today I
> > see that it is down. While checking the logs and trying to restart the
> > Ambari Server I see the following error.
> >
> > Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000a0000000, 402653184, 0) failed; error='Cannot allocate memory' (errno=12)
> > #
> > # There is insufficient memory for the Java Runtime Environment to continue.
> > # Native memory allocation (mmap) failed to map 402653184 bytes for committing reserved memory.
> > # An error report file with more information is saved as:
> > # //hs_err_pid31856.log
> >
> > On opening hs_err_pid31856.log I see that it says out of memory. I am
> > attaching the log file herewith for your information.
> >
> > Thanks,
> > DP
> >
> >
> > <AmbariRestart.log>
>
>
