hadoop-general mailing list archives

From venkata subbarayudu <avsrit2...@gmail.com>
Subject Too many fetch failures
Date Wed, 04 Nov 2009 08:49:54 GMT
Hi All,
          I have set up a single-node Hadoop cluster (hadoop-0.20.0) and am able to
run jobs that have only map tasks, but if a job has both map and reduce tasks, the
map tasks throw exceptions saying "Too many fetch failures" and the
reduce task never starts. Can someone please explain what this error
means and how to resolve it?
         One thing I noticed is that even though I configured Hadoop to use localhost (in the
config files), the TaskTracker and NameNode logs show the machine's IP,
e.g. "TaskTracker starting on server : <ip of the machine>", instead of
localhost or 127.0.0.1, so I am not sure where Hadoop is
picking up the machine's IP from.
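         By "setup hadoop as localhost" I mean something along the lines of the
standard single-node configuration below (the port numbers are just the usual
quickstart defaults, not necessarily exactly what my setup uses):

    <!-- conf/core-site.xml: points the filesystem at a local HDFS instance -->
    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
      </property>
    </configuration>

    <!-- conf/mapred-site.xml: points MapReduce at a local JobTracker -->
    <configuration>
      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
      </property>
    </configuration>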

                               Thanks for your help in advance,
Thanks,
Rayudu.
