hadoop-common-user mailing list archives

From "Jaideep Dhok" <jaideep.d...@gmail.com>
Subject Re: about the exception in mapreduce program?
Date Fri, 01 Feb 2008 05:25:31 GMT
You can change the maximum memory used by the JVM with the -Xmx option. There is
also a HADOOP_HEAPSIZE setting in hadoop-env.sh, which you can increase.
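For example (a sketch only; exact property names depend on your Hadoop version, and the values shown here are placeholders you should tune to your machine):

```shell
# In conf/hadoop-env.sh: heap size for the Hadoop daemon JVMs, in MB
# (default is 1000; raising it helps if the daemons themselves run out of heap)
export HADOOP_HEAPSIZE=2000
```

Note that map and reduce tasks run in separate child JVMs whose heap is controlled not by HADOOP_HEAPSIZE but by the Java options passed to the child, e.g. setting mapred.child.java.opts to -Xmx512m in your site configuration. If the "out of memory" error comes from a task rather than a daemon, that is the setting to raise.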

On Feb 1, 2008 10:22 AM, ma qiang <maqiang1984@gmail.com> wrote:

> Hi all:
>      I have run into the following problem:
>      My map function reads from a table in HBase, merges several
> strings, and finally saves those strings into another HBase table.
> Both the number of strings and their lengths are large. After ten
> minutes, Hadoop prints the error "out of memory, java heap is not
> enough". I tested the program with small strings and there was no
> error, but when the number and length of the strings grew large, the
> error appeared. I installed Hadoop in non-distributed mode, and my
> computer has 2 GB of memory, which in theory should be enough for
> my simple program.
>     Who can tell me why?
>     Thank you very much!
>
>
> Best Wishes!
>



-- 
Jaideep Dhok
