hadoop-mapreduce-dev mailing list archives

From Robert Evans <ev...@yahoo-inc.com>
Subject Re: division by zero in getLocalPathForWrite()
Date Thu, 25 Oct 2012 14:07:54 GMT
It looks like you are running with an older version of 2.0, though that
does not really make much of a difference in this case. The issue shows
up when getLocalPathForWrite thinks there is no space to write to on
any of the disks it has configured. That could be because you do not have
any local directories configured, or it might be disk fail-in-place
removing disks because of other issues; I don't know for sure exactly
what is happening. Either way, we should file a JIRA against Hadoop so
that we never get the "/ by zero" error and handle the possible causes
more gracefully.
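As a rough illustration of how such a failure can occur (this is a minimal, hypothetical sketch, not the actual LocalDirAllocator code): if a path is chosen with probability weighted by each directory's free space, and every configured directory reports zero bytes available (or no directories are configured at all), the modulo over the total available space divides by zero:

```java
import java.util.Random;

public class WeightedDirPicker {
    // Hypothetical stand-in for space-weighted directory selection:
    // pick a directory index with probability proportional to its
    // reported free space.
    static int pickDir(long[] availableBytes, Random rng) {
        long total = 0;
        for (long a : availableBytes) {
            total += a;
        }
        // If every disk reports 0 bytes free (or the array is empty),
        // total is 0 and this modulo throws
        // java.lang.ArithmeticException: / by zero.
        long target = Math.abs(rng.nextLong() % total);
        for (int i = 0; i < availableBytes.length; i++) {
            target -= availableBytes[i];
            if (target < 0) {
                return i;
            }
        }
        return availableBytes.length - 1;
    }

    public static void main(String[] args) {
        Random rng = new Random();
        long[] healthy = {100L, 300L};
        System.out.println("picked=" + pickDir(healthy, rng));
        try {
            pickDir(new long[]{0L, 0L}, rng); // all disks full or failed
        } catch (ArithmeticException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

A guard such as `if (total <= 0) throw new IOException("No space available in any of the local directories")` before the modulo would turn the opaque arithmetic error into an actionable message, which is roughly what a JIRA fix would aim for.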

--Bobby Evans

On 10/24/12 11:54 PM, "Ted Yu" <yuzhihong@gmail.com> wrote:

>HBase has a Jenkins build that runs against Hadoop 2.0.
>I was checking why TestRowCounter sometimes failed, and I think the
>following could be the cause:
>2012-10-22 23:46:32,571 WARN  [AsyncDispatcher event handler]
>resourcemanager.RMAuditLogger(255): USER=jenkins	OPERATION=Application
>failed with state: FAILED	PERMISSIONS=Application
>application_1350949562159_0002 failed 1 times due to AM Container for
>appattempt_1350949562159_0002_000001 exited with  exitCode: -1000 due
>to: java.lang.ArithmeticException: / by zero
>	at ... (stack frames truncated in the archive)
>However, I can't find where in getLocalPathForWrite() a division by
>zero could arise.
>Comments / hints are welcome.
