hadoop-common-user mailing list archives

From "Ashish Venugopal" <...@andrew.cmu.edu>
Subject using too many mappers?
Date Fri, 18 Jul 2008 15:59:36 GMT
Is it possible that using too many mappers causes issues in Hadoop 0.17.1? I
have an input data directory with 100 files in it. I am running a job that
takes these files as input. When I set "-jobconf mapred.map.tasks=200" in
the job invocation, it seems like the mappers receive "empty" inputs (which
my binary does not cleanly handle). When I unset the mapred.map.tasks
parameter, the job runs fine, and many mappers still get used because the
input files are manually split. Can anyone offer an explanation, or have
there been changes in the use of this parameter between 0.16.4 and 0.17.1?
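For context, mapred.map.tasks is only a hint: FileInputFormat derives a goal
split size from total input size divided by the requested map count, clamped
to the block size and a minimum split size, and then carves each file
separately. A rough sketch of that sizing logic (simplified; not Hadoop's
actual code, and the file sizes below are made up for illustration):

```python
# Simplified sketch of FileInputFormat-style split sizing.
# Hadoop computes goalSize = totalSize / requestedMaps, then uses
# splitSize = max(minSize, min(goalSize, blockSize)) per file.

def split_size(total_size, requested_maps, block_size=64 * 1024 * 1024, min_size=1):
    """Goal split size given the requested number of map tasks."""
    goal = total_size // max(requested_maps, 1)
    return max(min_size, min(goal, block_size))

def count_splits(file_sizes, requested_maps):
    """Total splits: each file is carved independently into ceil(size/splitSize) pieces."""
    size = split_size(sum(file_sizes), requested_maps)
    return sum(-(-f // size) for f in file_sizes)

# 100 files of 10 MB each. Asking for 200 maps halves the goal split size,
# so every file is cut into two 5 MB splits -> 200 mappers, each with less data.
files = [10 * 2**20] * 100
print(count_splits(files, 200))  # 200 splits
print(count_splits(files, 2))    # 100 splits (block size caps the goal)
```

So raising mapred.map.tasks above the natural file count shrinks each
mapper's input rather than leaving some mappers idle, which may explain
why a binary that assumes substantial per-mapper input misbehaves.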
Ashish
