hive-user mailing list archives

From Steven Wong <sw...@netflix.com>
Subject RE: Does block size determine the number of map tasks
Date Thu, 02 Jun 2011 01:44:20 GMT
When using CombineHiveInputFormat, parameters such as mapred.max.split.size (among others) help
determine how the input is split across mappers. Another factor is whether your input files are
in a splittable format.
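For reference, these split-related parameters can be set per session in Hive before running a query. A minimal sketch; the 64 MB values below are only illustrative, not recommendations:

```sql
-- Pack input into combined splits via CombineHiveInputFormat.
SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;

-- Upper bound on split size, in bytes (64 MB here, illustrative only).
SET mapred.max.split.size=67108864;

-- Minimum bytes to accumulate at node/rack level before a split is
-- allowed to combine data across locality boundaries.
SET mapred.min.split.size.per.node=67108864;
SET mapred.min.split.size.per.rack=67108864;
```

Note that if the input files are compressed with a non-splittable codec (e.g. gzip), each file yields at most one split regardless of these settings.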

Hope this helps.


From: Junxian Yan [mailto:junxian.yan@gmail.com]
Sent: Wednesday, June 01, 2011 12:45 AM
To: user@hive.apache.org
Subject: Does block size determine the number of map tasks

I saw this in hadoop wiki: http://wiki.apache.org/hadoop/HowManyMapsAndReduces

But in my experiment I saw a different result. I set CombineHiveInputFormat in Hive, and
according to the doc the default block size should be 64 MB, but even though my input files
are larger than 64 MB, Hadoop still created only one map task to handle all the data.

Can you help me figure out what is wrong?

R
