hadoop-common-user mailing list archives

From <commodor...@ymail.com>
Subject Terasort's performance anomaly
Date Thu, 24 Oct 2013 18:46:43 GMT
Hi,
I've been running Terasort on Hadoop 2.1.0-beta. I have a 6-node cluster; 5 of the nodes run
a NodeManager and all of them run a DataNode. I don't understand why performance is bad in
most cases yet good in some (e.g. 10 GB Terasort with 2 reducers).

* When I run 10, 20 and 30 GB with 1 reducer, I get the following results:

Total time: 1100, 3300 and 5700 sec

Avg map time: 29, 50 and 72 sec
Avg reduce time: 870, 2700 and 4800 sec

Killed map tasks: 2, 5 and 5

* When I run 10, 20 and 30 GB with 2 reducers, I get the following results:
Total time: 385, 4575 and 7379 sec
Avg map time: 35, 52 and 70 sec
Avg reduce time: 225, 3879 and 6116 sec
Killed map tasks: 1, 4 and 5
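For comparison, here is the rough sort rate each run implies (data size divided by total time; a back-of-the-envelope figure, not an official benchmark metric):

```python
# Rough throughput implied by the run times above: (size in MB, total seconds).
runs = {
    ("10GB", 1): (10_240, 1100), ("20GB", 1): (20_480, 3300), ("30GB", 1): (30_720, 5700),
    ("10GB", 2): (10_240, 385),  ("20GB", 2): (20_480, 4575), ("30GB", 2): (30_720, 7379),
}
for (size, reducers), (mb, secs) in runs.items():
    print(f"{size} with {reducers} reducer(s): {mb / secs:.1f} MB/s")
```

The 10 GB / 2-reducer run stands out at roughly 26 MB/s, while every other run is below 10 MB/s.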


* These results don't make sense to me; I can't see a consistent pattern. Somehow, the 10 GB
Terasort runs much faster with 2 reducers than with 1, while in the other cases increasing the
number of reducers actually decreases performance. When I check the ApplicationMaster logs,
I see a lot of "Container killed on request. Exit code is 143" errors (exit code 143 is 128 + 15,
i.e. the container received SIGTERM), which are generally followed by "[AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
Diagnostics report from attempt_1382628871858_0002_m_000021_0: Container killed by the ApplicationMaster."
(e.g. there are 219 of them in the 30 GB Terasort with 2 reducers) - which doesn't give much information.
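For what it's worth, rather than eyeballing those 219 messages, a small script like this (a sketch; it assumes the ApplicationMaster log has been saved to a local text file in the format quoted above) can tally the kill diagnostics and split them into map vs. reduce attempts:

```python
import re
from collections import Counter

# Matches diagnostics lines like the one quoted above:
#   Diagnostics report from attempt_1382628871858_0002_m_000021_0:
#   Container killed by the ApplicationMaster.
KILL_RE = re.compile(
    r"Diagnostics report from (attempt_\d+_\d+_([mr])_\d+_\d+):\s*"
    r"Container killed by the ApplicationMaster"
)

def tally_kills(log_text: str) -> Counter:
    """Count ApplicationMaster container kills, keyed by attempt type."""
    return Counter(
        "map" if m.group(2) == "m" else "reduce"
        for m in KILL_RE.finditer(log_text)
    )
```

Run against the saved AM log, this would show whether the kills hit only map attempts (speculative duplicates being torn down, which is harmless) or reducers as well.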

* I mostly use the default settings; apart from setting the HDFS replication factor to 1, the
only changes that may have an impact are the following:
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce.shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>

* I observed that in all cases "Map output materialized bytes" is slightly larger than "Map
output bytes" (which seems fine, since I don't use any compression).

Here is one of the terminal outputs (from 20 GB Terasort with 2 reducers):

13/10/23 22:31:32 INFO mapreduce.Job:  map 0% reduce 0%

13/10/23 22:31:45 INFO mapreduce.Job:  map 1% reduce 0%
13/10/23 22:31:47 INFO mapreduce.Job:  map 3% reduce 0%
13/10/23 22:31:48 INFO mapreduce.Job:  map 6% reduce 0%
13/10/23 22:31:49 INFO mapreduce.Job:  map 7% reduce 0%
13/10/23 22:31:50 INFO mapreduce.Job:  map 8% reduce 0%
13/10/23 22:31:51 INFO mapreduce.Job:  map 10% reduce 0%
13/10/23 22:31:53 INFO mapreduce.Job:  map 11% reduce 0%
13/10/23 22:31:55 INFO mapreduce.Job:  map 12% reduce 0%
13/10/23 22:31:57 INFO mapreduce.Job:  map 14% reduce 0%
13/10/23 22:31:59 INFO mapreduce.Job:  map 15% reduce 0%
13/10/23 22:32:00 INFO mapreduce.Job:  map 16% reduce 0%
13/10/23 22:32:02 INFO mapreduce.Job:  map 17% reduce 0%
13/10/23 22:32:03 INFO mapreduce.Job:  map 18% reduce 0%
13/10/23 22:32:08 INFO mapreduce.Job:  map 20% reduce 0%
13/10/23 22:32:09 INFO mapreduce.Job:  map 21% reduce 0%
13/10/23 22:32:12 INFO mapreduce.Job:  map 22% reduce 0%
13/10/23 22:32:15 INFO mapreduce.Job:  map 23% reduce 0%
13/10/23 22:32:22 INFO mapreduce.Job:  map 25% reduce 0%
13/10/23 22:32:28 INFO mapreduce.Job:  map 30% reduce 1%
13/10/23 22:32:32 INFO mapreduce.Job:  map 36% reduce 1%
13/10/23 22:32:35 INFO mapreduce.Job:  map 38% reduce 1%
13/10/23 22:32:41 INFO mapreduce.Job:  map 39% reduce 1%
13/10/23 22:32:43 INFO mapreduce.Job:  map 40% reduce 1%
13/10/23 22:32:44 INFO mapreduce.Job:  map 41% reduce 2%
13/10/23 22:32:47 INFO mapreduce.Job:  map 42% reduce 2%
13/10/23 22:32:49 INFO mapreduce.Job:  map 43% reduce 2%
13/10/23 22:32:58 INFO mapreduce.Job:  map 44% reduce 2%
13/10/23 22:33:01 INFO mapreduce.Job:  map 47% reduce 2%
13/10/23 22:33:04 INFO mapreduce.Job:  map 48% reduce 2%
13/10/23 22:33:05 INFO mapreduce.Job:  map 49% reduce 2%
13/10/23 22:33:07 INFO mapreduce.Job:  map 49% reduce 3%
13/10/23 22:33:08 INFO mapreduce.Job:  map 50% reduce 3%
13/10/23 22:33:12 INFO mapreduce.Job:  map 51% reduce 3%
13/10/23 22:33:13 INFO mapreduce.Job:  map 52% reduce 3%
13/10/23 22:33:24 INFO mapreduce.Job:  map 55% reduce 4%
13/10/23 22:33:36 INFO mapreduce.Job:  map 60% reduce 5%
13/10/23 22:33:39 INFO mapreduce.Job:  map 61% reduce 5%
13/10/23 22:33:44 INFO mapreduce.Job:  map 62% reduce 5%
13/10/23 22:33:49 INFO mapreduce.Job:  map 64% reduce 5%
13/10/23 22:33:54 INFO mapreduce.Job:  map 67% reduce 6%
13/10/23 22:33:57 INFO mapreduce.Job:  map 69% reduce 6%
13/10/23 22:34:00 INFO mapreduce.Job:  map 70% reduce 6%
13/10/23 22:34:02 INFO mapreduce.Job:  map 71% reduce 6%
13/10/23 22:34:03 INFO mapreduce.Job:  map 72% reduce 6%
13/10/23 22:34:05 INFO mapreduce.Job:  map 73% reduce 6%
13/10/23 22:34:07 INFO mapreduce.Job:  map 74% reduce 6%
13/10/23 22:34:16 INFO mapreduce.Job:  map 76% reduce 6%
13/10/23 22:34:19 INFO mapreduce.Job:  map 77% reduce 7%
13/10/23 22:34:22 INFO mapreduce.Job:  map 78% reduce 7%
13/10/23 22:34:24 INFO mapreduce.Job:  map 79% reduce 7%
13/10/23 22:34:27 INFO mapreduce.Job:  map 80% reduce 7%
13/10/23 22:34:30 INFO mapreduce.Job:  map 81% reduce 7%
13/10/23 22:34:36 INFO mapreduce.Job:  map 82% reduce 7%
13/10/23 22:34:40 INFO mapreduce.Job:  map 83% reduce 7%
13/10/23 22:34:41 INFO mapreduce.Job:  map 83% reduce 8%
13/10/23 22:34:45 INFO mapreduce.Job:  map 84% reduce 8%
13/10/23 22:34:51 INFO mapreduce.Job:  map 85% reduce 8%
13/10/23 22:34:54 INFO mapreduce.Job:  map 86% reduce 8%
13/10/23 22:35:01 INFO mapreduce.Job:  map 87% reduce 8%
13/10/23 22:35:05 INFO mapreduce.Job:  map 88% reduce 8%
13/10/23 22:35:08 INFO mapreduce.Job:  map 89% reduce 8%
13/10/23 22:35:10 INFO mapreduce.Job:  map 90% reduce 8%
13/10/23 22:35:11 INFO mapreduce.Job:  map 90% reduce 9%
13/10/23 22:35:14 INFO mapreduce.Job:  map 91% reduce 9%
13/10/23 22:35:16 INFO mapreduce.Job:  map 92% reduce 9%
13/10/23 22:35:19 INFO mapreduce.Job:  map 93% reduce 9%
13/10/23 22:35:22 INFO mapreduce.Job:  map 94% reduce 9%
13/10/23 22:35:32 INFO mapreduce.Job:  map 94% reduce 10%
13/10/23 22:35:44 INFO mapreduce.Job:  map 95% reduce 11%
13/10/23 22:35:47 INFO mapreduce.Job:  map 96% reduce 11%
13/10/23 22:35:53 INFO mapreduce.Job:  map 97% reduce 11%
13/10/23 22:35:55 INFO mapreduce.Job:  map 98% reduce 11%
13/10/23 22:35:59 INFO mapreduce.Job:  map 99% reduce 11%
13/10/23 22:36:02 INFO mapreduce.Job:  map 100% reduce 11%
13/10/23 22:36:14 INFO mapreduce.Job:  map 100% reduce 12%
13/10/23 22:36:29 INFO mapreduce.Job:  map 100% reduce 13%
13/10/23 22:36:35 INFO mapreduce.Job:  map 100% reduce 14%
13/10/23 22:37:12 INFO mapreduce.Job:  map 100% reduce 15%
13/10/23 22:37:36 INFO mapreduce.Job:  map 100% reduce 16%
13/10/23 22:37:52 INFO mapreduce.Job:  map 100% reduce 17%
13/10/23 22:38:13 INFO mapreduce.Job:  map 100% reduce 18%
13/10/23 22:38:28 INFO mapreduce.Job:  map 100% reduce 19%
13/10/23 22:38:42 INFO mapreduce.Job:  map 100% reduce 20%
13/10/23 22:39:10 INFO mapreduce.Job:  map 100% reduce 21%
13/10/23 22:39:24 INFO mapreduce.Job:  map 100% reduce 22%
13/10/23 22:39:35 INFO mapreduce.Job:  map 100% reduce 23%
13/10/23 22:40:15 INFO mapreduce.Job:  map 100% reduce 24%
13/10/23 22:40:26 INFO mapreduce.Job:  map 100% reduce 25%
13/10/23 22:40:56 INFO mapreduce.Job:  map 100% reduce 26%
13/10/23 22:41:09 INFO mapreduce.Job:  map 100% reduce 27%
13/10/23 22:41:31 INFO mapreduce.Job:  map 100% reduce 28%
13/10/23 22:41:37 INFO mapreduce.Job:  map 100% reduce 29%
13/10/23 22:42:06 INFO mapreduce.Job:  map 100% reduce 30%
13/10/23 22:42:45 INFO mapreduce.Job:  map 100% reduce 31%
13/10/23 22:42:53 INFO mapreduce.Job:  map 100% reduce 32%
13/10/23 22:42:59 INFO mapreduce.Job:  map 100% reduce 33%
13/10/23 23:04:24 INFO mapreduce.Job:  map 100% reduce 34%
13/10/23 23:04:48 INFO mapreduce.Job:  map 100% reduce 35%
13/10/23 23:05:10 INFO mapreduce.Job:  map 100% reduce 36%
13/10/23 23:05:43 INFO mapreduce.Job:  map 100% reduce 37%
13/10/23 23:06:14 INFO mapreduce.Job:  map 100% reduce 38%
13/10/23 23:06:28 INFO mapreduce.Job:  map 100% reduce 39%
13/10/23 23:06:54 INFO mapreduce.Job:  map 100% reduce 40%
13/10/23 23:07:20 INFO mapreduce.Job:  map 100% reduce 41%
13/10/23 23:07:38 INFO mapreduce.Job:  map 100% reduce 42%
13/10/23 23:08:02 INFO mapreduce.Job:  map 100% reduce 43%
13/10/23 23:08:23 INFO mapreduce.Job:  map 100% reduce 44%
13/10/23 23:08:47 INFO mapreduce.Job:  map 100% reduce 45%
13/10/23 23:09:05 INFO mapreduce.Job:  map 100% reduce 46%
13/10/23 23:09:24 INFO mapreduce.Job:  map 100% reduce 47%
13/10/23 23:09:49 INFO mapreduce.Job:  map 100% reduce 48%
13/10/23 23:10:12 INFO mapreduce.Job:  map 100% reduce 49%
13/10/23 23:10:34 INFO mapreduce.Job:  map 100% reduce 50%
13/10/23 23:11:01 INFO mapreduce.Job:  map 100% reduce 51%
13/10/23 23:11:26 INFO mapreduce.Job:  map 100% reduce 52%
13/10/23 23:11:51 INFO mapreduce.Job:  map 100% reduce 53%
13/10/23 23:12:16 INFO mapreduce.Job:  map 100% reduce 54%
13/10/23 23:12:43 INFO mapreduce.Job:  map 100% reduce 55%
13/10/23 23:13:06 INFO mapreduce.Job:  map 100% reduce 56%
13/10/23 23:13:33 INFO mapreduce.Job:  map 100% reduce 57%
13/10/23 23:13:57 INFO mapreduce.Job:  map 100% reduce 58%
13/10/23 23:14:18 INFO mapreduce.Job:  map 100% reduce 59%
13/10/23 23:14:47 INFO mapreduce.Job:  map 100% reduce 60%
13/10/23 23:15:13 INFO mapreduce.Job:  map 100% reduce 61%
13/10/23 23:15:38 INFO mapreduce.Job:  map 100% reduce 62%
13/10/23 23:16:03 INFO mapreduce.Job:  map 100% reduce 63%
13/10/23 23:16:25 INFO mapreduce.Job:  map 100% reduce 64%
13/10/23 23:16:48 INFO mapreduce.Job:  map 100% reduce 65%
13/10/23 23:17:16 INFO mapreduce.Job:  map 100% reduce 66%
13/10/23 23:17:50 INFO mapreduce.Job:  map 100% reduce 67%
13/10/23 23:18:29 INFO mapreduce.Job:  map 100% reduce 68%
13/10/23 23:19:23 INFO mapreduce.Job:  map 100% reduce 69%
13/10/23 23:20:16 INFO mapreduce.Job:  map 100% reduce 70%
13/10/23 23:21:05 INFO mapreduce.Job:  map 100% reduce 71%
13/10/23 23:21:55 INFO mapreduce.Job:  map 100% reduce 72%
13/10/23 23:22:47 INFO mapreduce.Job:  map 100% reduce 73%
13/10/23 23:23:39 INFO mapreduce.Job:  map 100% reduce 74%
13/10/23 23:24:32 INFO mapreduce.Job:  map 100% reduce 75%
13/10/23 23:25:24 INFO mapreduce.Job:  map 100% reduce 76%
13/10/23 23:26:15 INFO mapreduce.Job:  map 100% reduce 77%
13/10/23 23:27:10 INFO mapreduce.Job:  map 100% reduce 78%
13/10/23 23:28:01 INFO mapreduce.Job:  map 100% reduce 79%
13/10/23 23:28:54 INFO mapreduce.Job:  map 100% reduce 80%
13/10/23 23:29:51 INFO mapreduce.Job:  map 100% reduce 81%
13/10/23 23:30:47 INFO mapreduce.Job:  map 100% reduce 82%
13/10/23 23:31:45 INFO mapreduce.Job:  map 100% reduce 83%
13/10/23 23:32:45 INFO mapreduce.Job:  map 100% reduce 84%
13/10/23 23:33:41 INFO mapreduce.Job:  map 100% reduce 85%
13/10/23 23:34:40 INFO mapreduce.Job:  map 100% reduce 86%
13/10/23 23:35:38 INFO mapreduce.Job:  map 100% reduce 87%
13/10/23 23:36:34 INFO mapreduce.Job:  map 100% reduce 88%
13/10/23 23:37:26 INFO mapreduce.Job:  map 100% reduce 89%
13/10/23 23:38:21 INFO mapreduce.Job:  map 100% reduce 90%
13/10/23 23:39:17 INFO mapreduce.Job:  map 100% reduce 91%
13/10/23 23:40:12 INFO mapreduce.Job:  map 100% reduce 92%
13/10/23 23:41:09 INFO mapreduce.Job:  map 100% reduce 93%
13/10/23 23:41:59 INFO mapreduce.Job:  map 100% reduce 94%
13/10/23 23:42:55 INFO mapreduce.Job:  map 100% reduce 95%
13/10/23 23:43:46 INFO mapreduce.Job:  map 100% reduce 96%
13/10/23 23:44:43 INFO mapreduce.Job:  map 100% reduce 97%
13/10/23 23:45:37 INFO mapreduce.Job:  map 100% reduce 98%
13/10/23 23:46:30 INFO mapreduce.Job:  map 100% reduce 99%
13/10/23 23:47:25 INFO mapreduce.Job:  map 100% reduce 100%
13/10/23 23:47:46 INFO mapreduce.Job: Job job_1382586652459_0004 completed successfully
13/10/23 23:47:46 INFO mapreduce.Job: Counters: 45
File System Counters
FILE: Number of bytes read=76352219884
FILE: Number of bytes written=98872759630
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=21474854960
HDFS: Number of bytes written=21474836400
HDFS: Number of read operations=486
HDFS: Number of large read operations=0
HDFS: Number of write operations=4
Job Counters 
Killed map tasks=4
Launched map tasks=165
Launched reduce tasks=2
Data-local map tasks=68
Rack-local map tasks=97
Total time spent by all maps in occupied slots (ms)=8584982
Total time spent by all reduces in occupied slots (ms)=9069877
Map-Reduce Framework
Map input records=214748364
Map output records=214748364
Map output bytes=21904333128
Map output materialized bytes=22333831776
Input split bytes=18560
Combine input records=0
Combine output records=0
Reduce input groups=214748364
Reduce shuffle bytes=22333831776
Reduce input records=214748364
Reduce output records=214748364
Spilled Records=948897880
Shuffled Maps =320
Failed Shuffles=0
Merged Map outputs=320
GC time elapsed (ms)=53546
CPU time spent (ms)=3088460
Physical memory (bytes) snapshot=43805261824
Virtual memory (bytes) snapshot=181438750720
Total committed heap usage (bytes)=30207311872
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters 
Bytes Read=21474836400
File Output Format Counters 
Bytes Written=21474836400
13/10/23 23:47:46 INFO terasort.TeraSort: done

Any feedback on why this might be happening is appreciated...
Thanks