hadoop-mapreduce-user mailing list archives

From: Arun C Murthy <...@hortonworks.com>
Subject: Re: Unexpected end of input stream: how to locate related file(s)/
Date: Wed, 19 Jun 2013 06:28:27 GMT
Robin,

On Jun 18, 2013, at 11:12 PM, Robin Verlangen <robin@us2.nl> wrote:

> Hi Arun,
> 
> Thank you for your reply. We run Hadoop 2.0.0 with MapReduce 0.20 packaged by Cloudera.
> 
> Do you know where to find the log files related to a specific task? Is that also in the folder /var/log/hadoop-0.20-mapreduce/userlogs/job_ID/?

Unfortunately it's hard for me to say - try looking in /var/log/hadoop-0.20-mapreduce/userlogs/job_ID/taskAttemptID/syslog.

At least on hadoop-1.2+ and hadoop-2.x you should see the following log msg at the top of
the map-tasks' log files:
LOG.info("Processing split: " + inputSplit);
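
To make that concrete, here's a rough, untested sketch (the class name and the job-log-directory argument are only examples) of scanning each task attempt's syslog under that userlogs directory for the "Processing split:" line:

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

// Minimal sketch: walk /var/log/hadoop-0.20-mapreduce/userlogs/<jobId>,
// open each task attempt's syslog and print any "Processing split:" lines.
// For file-based splits the split's string form should include the input path,
// so this points you at the file each map task was reading when it failed.
public class FindProcessedSplits {
    public static void main(String[] args) throws IOException {
        File jobLogDir = new File(args[0]); // e.g. .../userlogs/job_ID
        File[] attempts = jobLogDir.listFiles(File::isDirectory);
        if (attempts == null) {
            return;
        }
        for (File attemptDir : attempts) {
            File syslog = new File(attemptDir, "syslog");
            if (!syslog.isFile()) {
                continue;
            }
            try (BufferedReader reader = new BufferedReader(new FileReader(syslog))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    if (line.contains("Processing split:")) {
                        System.out.println(attemptDir.getName() + ": " + line);
                    }
                }
            }
        }
    }
}
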
hth,
Arun

> 
> Best regards, 
> 
> Robin Verlangen
> Data Architect
> 
> W http://www.robinverlangen.nl
> E robin@us2.nl
> 
> 
> What is CloudPelican?
> 
> 
> 
> On Wed, Jun 19, 2013 at 7:59 AM, Arun C Murthy <acm@hortonworks.com> wrote:
> What version of MapReduce are you using? At the beginning of the log-file you should be able to see a log msg with the input-split file name for the map.
> 
> thanks,
> Arun
> 
> On Jun 18, 2013, at 10:54 PM, Robin Verlangen <robin@us2.nl> wrote:
> 
>> Hi there,
>> 
>> How can I locate the files that cause these errors in my Map/Reduce jobs?
>> 
>> java.io.IOException: java.io.EOFException: Unexpected end of input stream
>> 	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>> 	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>> 	at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:276)
>> 	at org.apache.hadoop.hive.ql.io.HiveRecordReader.doNext(HiveRecordReader.java:79)
>> 	at org.apache.hadoop.hive.ql.io.HiveRecordReader.doNext(HiveRecordReader.java:33)
>> 	at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:108)
>> 	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:215)
>> 	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:200)
>> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
>> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
>> 	at org.apache.hadoop
>> 
>> Best regards, 
>> 
>> Robin Verlangen
>> Data Architect
>> 
>> W http://www.robinverlangen.nl
>> E robin@us2.nl
>> 
>> 
>> What is CloudPelican?
>> 
> 
> --
> Arun C. Murthy
> Hortonworks Inc.
> http://hortonworks.com/
> 
> 
> 

--
Arun C. Murthy
Hortonworks Inc.
http://hortonworks.com/


