hadoop-mapreduce-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: location of Java heap dumps
Date Thu, 19 Jul 2012 16:44:31 GMT
You need to tell your job not to discard failed task files. Otherwise they
get cleared away (except for logs), which is why you cannot find them
afterwards.

If you're using 1.x/0.20.x, set "keep.failed.task.files" to true in
your JobConf/Job.getConfiguration object before submitting your job.
Afterwards, visit the failed task's node and cd into the right
mapred.local.dir subdirectory for the attempt; you should see your file
in the task's working directory there. This setting prevents the failed
task's files from being removed.
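If you cannot edit the driver code, the same property can usually be passed on the command line instead, assuming the job's driver goes through ToolRunner/GenericOptionsParser (the jar name, driver class, and paths below are placeholders):

```shell
# Keep failed task attempt files (including any .hprof heap dumps) so they
# survive under the attempt's mapred.local.dir subdirectory on the failing node.
hadoop jar my-job.jar com.example.MyDriver \
  -D keep.failed.task.files=true \
  /input /output
```

Note the space after -D: that is the Hadoop generic-option form, which sets a job configuration property rather than a JVM system property.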

Hope this helps!

On Thu, Jul 19, 2012 at 1:37 PM, Marek Miglinski <mmiglinski@seven.com> wrote:
> Thanks Markus,
>
> But as I said, I only have read access on the nodes and I can't make that change. So
the question remains open.
>
>
> Marek M.
> ________________________________________
> From: Markus Jelsma [markus.jelsma@openindex.io]
> Sent: Wednesday, July 18, 2012 9:06 PM
> To: mapreduce-user@hadoop.apache.org
> Subject: RE: location of Java heap dumps
>
> -XX:HeapDumpPath=/path/to/heap.dump
>
>
> -----Original message-----
>> From:Marek Miglinski <mmiglinski@seven.com>
>> Sent: Wed 18-Jul-2012 19:51
>> To: mapreduce-user@hadoop.apache.org
>> Subject: location of Java heap dumps
>>
>> Hi all,
>>
>> I have -XX:+HeapDumpOnOutOfMemoryError set on all nodes and I don't have
permission to add a location where those dumps will be saved, so I get this message in my mapred
process:
>>
>> java.lang.OutOfMemoryError: Java heap space
>> Dumping heap to java_pid10687.hprof ...
>> Heap dump file created [1385031743 bytes in 30.259 secs]
>>
>> Where can I find those dumps? I can't locate them anywhere.
>>
>>
>> Thanks,
>> Marek M.
>>



-- 
Harsh J
