hadoop-common-user mailing list archives

From "Amit k. Saha" <amitsaha...@gmail.com>
Subject Re: JNI Crash using hadoop
Date Sat, 25 Oct 2008 16:34:02 GMT

On Sat, Oct 25, 2008 at 6:48 PM, lamfeeling <lamfeeling@126.com> wrote:
>  Dear all:
>    I'm new to Hadoop, and I want to migrate my existing C++ project to Hadoop using JNI.
>    All the features seem to work, except one method.
>    When it is invoked, Hadoop gives me an error message: bad_alloc.
>    I googled this message; it is said to be a common problem when memory is exhausted, but my memory is not full yet.
>    Are there any memory limitations in Hadoop, especially when using JNI methods?
>    This program has been tested millions of times, so the problem should not be in my C++ program.
>    Could anyone give me an answer? Thanks a lot!!

Consider using Hadoop 'Pipes', the C++ interface to Hadoop MapReduce, instead of JNI:

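For reference, existing C++ code can often be wrapped as a Pipes task rather than called through JNI. Below is a minimal sketch based on the canonical word-count example, assuming the Hadoop Pipes C++ headers and libraries are installed; your own logic would replace the map/reduce bodies:

```cpp
// Minimal Hadoop Pipes word-count sketch (assumes hadoop/Pipes.hh,
// hadoop/TemplateFactory.hh and hadoop/StringUtils.hh are available).
#include <string>
#include <vector>
#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

class WordCountMapper : public HadoopPipes::Mapper {
public:
  WordCountMapper(HadoopPipes::TaskContext& /*context*/) {}
  void map(HadoopPipes::MapContext& context) {
    // Split each input line into words and emit <word, "1">.
    std::vector<std::string> words =
        HadoopUtils::splitString(context.getInputValue(), " ");
    for (size_t i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
  }
};

class WordCountReducer : public HadoopPipes::Reducer {
public:
  WordCountReducer(HadoopPipes::TaskContext& /*context*/) {}
  void reduce(HadoopPipes::ReduceContext& context) {
    // Sum the counts emitted for each word.
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
  }
};

int main(int argc, char* argv[]) {
  // Hand control to the Pipes framework, which drives the mapper/reducer.
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMapper, WordCountReducer>());
}
```

The binary is compiled against the Hadoop Pipes library and submitted with the `hadoop pipes` command; the framework then runs the C++ process alongside each task, which avoids embedding native code in the JVM via JNI.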

Amit Kumar Saha
Skype: amitkumarsaha
