hadoop-common-user mailing list archives

From jason hadoop <jason.had...@gmail.com>
Subject Re: IO Exception in Map Tasks
Date Mon, 27 Apr 2009 16:20:55 GMT
You will need to figure out why your task crashed.
Check the task logs; there may be some messages there that give you a hint
as to what is going on.
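(For reference, on a 0.19/0.20-era cluster the per-attempt logs normally sit under
the tasktracker's log directory; the attempt id below is only an example:

    ${HADOOP_LOG_DIR}/userlogs/attempt_200904270915_0012_m_000003_0/stdout
    ${HADOOP_LOG_DIR}/userlogs/attempt_200904270915_0012_m_000003_0/stderr
    ${HADOOP_LOG_DIR}/userlogs/attempt_200904270915_0012_m_000003_0/syslog

An exit status of 255 with nothing useful in syslog usually means the child JVM
itself died, so also look at stderr and at any hs_err_pid*.log the JVM may have
left in the task working directory.)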

You can enable saving the failed task's files and then run the task standalone in
the IsolationRunner. Chapter 7 of my book (alpha available) provides details on
this; the hope is that the failure repeats in the controlled environment.
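Roughly, the setup looks like this (exact property names and local-dir layout vary
a little between Hadoop versions; this sketch is for 0.19/0.20):

    # in your job configuration (JobConf) or mapred-site.xml
    <property>
      <name>keep.failed.task.files</name>
      <value>true</value>
    </property>

    # then, on the tasktracker that ran the failed attempt, from that attempt's
    # working directory under mapred.local.dir:
    cd ${mapred.local.dir}/taskTracker/jobcache/job_.../attempt_.../work
    bin/hadoop org.apache.hadoop.mapred.IsolationRunner ../job.xml

The IsolationRunner re-executes just that one task against the same input, so you
can attach a debugger or add extra logging without re-running the whole job.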

You could unlimit the core dump size via hadoop-env.sh (*ulimit -c unlimited*),
but that will require that the failed task's files be kept, as the core
will be in the task working directory.
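A minimal hadoop-env.sh change would be something like:

    # hadoop-env.sh on the tasktracker nodes: let the child JVMs dump core on a
    # hard crash (restart the tasktrackers so the limit is inherited)
    ulimit -c unlimited

Combined with keep.failed.task.files above, the core file is then left in the
failed attempt's work directory, where you can examine it (e.g. with gdb).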


On Mon, Apr 27, 2009 at 1:30 AM, Rakhi Khatwani <rakhi.khatwani@gmail.com> wrote:

> Thanks Jason,
>              is there any way we can avoid this exception??
>
> Thanks,
> Raakhi
>
> On Mon, Apr 27, 2009 at 1:20 PM, jason hadoop <jason.hadoop@gmail.com> wrote:
>
> > The jvm had a hard failure and crashed
> >
> >
> > On Sun, Apr 26, 2009 at 11:34 PM, Rakhi Khatwani <rakhi.khatwani@gmail.com> wrote:
> >
> > > Hi,
> > >
> > >      In one of the map tasks, I get the following exception:
> > >      java.io.IOException: Task process exit with nonzero status of 255.
> > > at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:424)
> > >
> > > What could be the reason?
> > >
> > > Thanks,
> > > Raakhi
> > >
> >
> >
> >
> > --
> > Alpha Chapters of my book on Hadoop are available
> > http://www.apress.com/book/view/9781430219422
> >
>



-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
