hadoop-general mailing list archives

From prashant sethia <prashantsethia...@gmail.com>
Subject Shuffle Error: Exceeded MAX_FAILED_UNIQUE_FETCHES; bailing-out.
Date Thu, 02 Dec 2010 05:24:07 GMT
Hi all,

I have installed Hadoop 0.20.2 on a cluster of 6 computers. Everything runs
fine when I run a job with only one system in the cluster, but as soon as I
include more than one system and then run a job, I get the following error
after the map tasks complete:

Shuffle Error: Exceeded MAX_FAILED_UNIQUE_FETCHES; bailing-out.

The value of "ulimit -n" is set to 35000. I am not using any DNS names, so
DNS resolution should not be an issue.
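(For reference: this error means reduce tasks repeatedly failed to fetch map output from the other TaskTrackers, which is served over the TaskTracker HTTP port, 50060 by default in 0.20.x. A common culprit even when no DNS names are used is an /etc/hosts entry mapping a machine's own hostname to a loopback address, so its TaskTracker advertises 127.0.0.1 and the other nodes cannot reach it. A minimal sketch of a check for that pitfall; the helper and sample entries below are hypothetical, not part of Hadoop:)

```python
def loopback_mapped_hostnames(hosts_text, hostname):
    """Return /etc/hosts lines that map `hostname` to a loopback
    address (127.x.x.x) -- a frequent cause of
    MAX_FAILED_UNIQUE_FETCHES, because the TaskTracker then
    advertises an address other nodes cannot connect to."""
    bad = []
    for line in hosts_text.splitlines():
        line = line.split('#', 1)[0].strip()  # drop comments/blanks
        if not line:
            continue
        fields = line.split()
        addr, names = fields[0], fields[1:]
        if addr.startswith('127.') and hostname in names:
            bad.append(line)
    return bad

# Hypothetical example: hadoop1's own entry points at loopback, so
# reducers on the other nodes would fail to fetch its map output.
sample = """
127.0.0.1   localhost
127.0.1.1   hadoop1
192.168.0.2 hadoop2
"""
print(loopback_mapped_hostnames(sample, "hadoop1"))
# -> ['127.0.1.1   hadoop1']
```

If a machine's hostname appears on a 127.x line, moving it to the node's real LAN address on every host (and restarting the daemons) is the usual fix for this class of shuffle failure.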

What could the problem be?

The same happens with Hadoop 0.21.0.


Thanks,
Prashant.
