Subject: hadoop inconsistent behaviour
From: Rahul Singh <smart.rahul.iiit@gmail.com>
To: user@hadoop.apache.org
Date: Mon, 14 Apr 2014 12:08:53 +0530

Hi,

I am running a job (the wordcount example) on a 3-node cluster (1 master and 2 slaves). Sometimes the job passes, but sometimes it fails because the reduce phase fails, even though the input data is only a few KBs. I am not able to nail down the reason for this inconsistency.

Failed log:

14/04/14 11:57:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/14 11:57:25 INFO client.RMProxy: Connecting to ResourceManager at /20.0.1.206:8032
14/04/14 11:57:26 INFO input.FileInputFormat: Total input paths to process : 1
14/04/14 11:57:26 INFO mapreduce.JobSubmitter: number of splits:1
14/04/14 11:57:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397454060494_0003
14/04/14 11:57:26 INFO impl.YarnClientImpl: Submitted application application_1397454060494_0003
14/04/14 11:57:26 INFO mapreduce.Job: The url to track the job: http://20.0.1.206:8088/proxy/application_1397454060494_0003/
14/04/14 11:57:26 INFO mapreduce.Job: Running job: job_1397454060494_0003
14/04/14 11:57:34 INFO mapreduce.Job: Job job_1397454060494_0003 running in uber mode : false
14/04/14 11:57:34 INFO mapreduce.Job:  map 0% reduce 0%
14/04/14 11:57:40 INFO mapreduce.Job:  map 100% reduce 0%
14/04/14 11:57:46 INFO mapreduce.Job:  map 100% reduce 13%
14/04/14 11:57:48 INFO mapreduce.Job:  map 100% reduce 25%
14/04/14 11:57:49 INFO mapreduce.Job:  map 100% reduce 38%
14/04/14 11:57:50 INFO mapreduce.Job:  map 100% reduce 50%
14/04/14 11:57:54 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_0, Status : FAILED
14/04/14 11:57:54 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000001_0, Status : FAILED
14/04/14 11:57:56 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_0, Status : FAILED
14/04/14 11:57:56 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000007_0, Status : FAILED
14/04/14 11:58:02 INFO mapreduce.Job:  map 100% reduce 63%
14/04/14 11:58:04 INFO mapreduce.Job:  map 100% reduce 75%
14/04/14 11:58:09 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_1, Status : FAILED
14/04/14 11:58:11 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_1, Status : FAILED
14/04/14 11:58:24 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000003_2, Status : FAILED
14/04/14 11:58:26 INFO mapreduce.Job: Task Id : attempt_1397454060494_0003_r_000005_2, Status : FAILED
14/04/14 11:58:40 INFO mapreduce.Job:  map 100% reduce 100%
14/04/14 11:58:40 INFO mapreduce.Job: Job job_1397454060494_0003 failed with state FAILED due to: Task failed task_1397454060494_0003_r_000003
Job failed as tasks failed. failedMaps:0 failedReduces:1

14/04/14 11:58:40 INFO mapreduce.Job: Counters: 51
    File System Counters
        FILE: Number of bytes read=80
        FILE: Number of bytes written=596766
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=175
        HDFS: Number of bytes written=28
        HDFS: Number of read operations=21
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=12
    Job Counters
        Failed reduce tasks=9
        Killed reduce tasks=1
        Launched map tasks=1
        Launched reduce tasks=16
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=3477
        Total time spent by all reduces in occupied slots (ms)=148867
        Total time spent by all map tasks (ms)=3477
        Total time spent by all reduce tasks (ms)=148867
        Total vcore-seconds taken by all map tasks=3477
        Total vcore-seconds taken by all reduce tasks=148867
        Total megabyte-seconds taken by all map tasks=3560448
        Total megabyte-seconds taken by all reduce tasks=152439808
    Map-Reduce Framework
        Map input records=3
        Map output records=13
        Map output bytes=110
        Map output materialized bytes=112
        Input split bytes=117
        Combine input records=13
        Combine output records=6
        Reduce input groups=4
        Reduce shuffle bytes=80
        Reduce input records=4
        Reduce output records=4
        Spilled Records=10
        Shuffled Maps =6
        Failed Shuffles=0
        Merged Map outputs=6
        GC time elapsed (ms)=142
        CPU time spent (ms)=6420
        Physical memory (bytes) snapshot=1100853248
        Virtual memory (bytes) snapshot=4468314112
        Total committed heap usage (bytes)=1406992384
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=58
    File Output Format Counters
        Bytes Written=28

Passing job log:

hadoop jar ../share/hadoop/mapreduce/hadoop-mapreduce-examples-2.3.0.jar wordcount /user/hduser/input /user/hduser/output_wordcount9
14/04/14 11:47:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/14 11:47:28 INFO client.RMProxy: Connecting to ResourceManager at /20.0.1.206:8032
14/04/14 11:47:28 INFO input.FileInputFormat: Total input paths to process : 1
14/04/14 11:47:29 INFO mapreduce.JobSubmitter: number of splits:1
14/04/14 11:47:29 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397454060494_0002
14/04/14 11:47:29 INFO impl.YarnClientImpl: Submitted application application_1397454060494_0002
14/04/14 11:47:29 INFO mapreduce.Job: The url to track the job: http://20.0.1.206:8088/proxy/application_1397454060494_0002/
14/04/14 11:47:29 INFO mapreduce.Job: Running job: job_1397454060494_0002
14/04/14 11:47:36 INFO mapreduce.Job: Job job_1397454060494_0002 running in uber mode : false
14/04/14 11:47:36 INFO mapreduce.Job:  map 0% reduce 0%
14/04/14 11:47:50 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_0, Status : FAILED
14/04/14 11:48:05 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_1, Status : FAILED
14/04/14 11:48:20 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_m_000000_2, Status : FAILED
14/04/14 11:48:26 INFO mapreduce.Job:  map 100% reduce 0%
14/04/14 11:48:34 INFO mapreduce.Job:  map 100% reduce 13%
14/04/14 11:48:35 INFO mapreduce.Job:  map 100% reduce 25%
14/04/14 11:48:37 INFO mapreduce.Job:  map 100% reduce 50%
14/04/14 11:48:41 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000001_0, Status : FAILED
14/04/14 11:48:42 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000003_0, Status : FAILED
14/04/14 11:48:43 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000005_0, Status : FAILED
14/04/14 11:48:44 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000007_0, Status : FAILED
14/04/14 11:48:50 INFO mapreduce.Job:  map 100% reduce 63%
14/04/14 11:48:51 INFO mapreduce.Job:  map 100% reduce 75%
14/04/14 11:48:52 INFO mapreduce.Job:  map 100% reduce 88%
14/04/14 11:48:58 INFO mapreduce.Job: Task Id : attempt_1397454060494_0002_r_000005_1, Status : FAILED
14/04/14 11:49:05 INFO mapreduce.Job:  map 100% reduce 100%
14/04/14 11:49:06 INFO mapreduce.Job: Job job_1397454060494_0002 completed successfully
14/04/14 11:49:06 INFO mapreduce.Job: Counters: 52
    File System Counters
        FILE: Number of bytes read=112
        FILE: Number of bytes written=767175
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=175
        HDFS: Number of bytes written=40
        HDFS: Number of read operations=27
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=16
    Job Counters
        Failed map tasks=3
        Failed reduce tasks=5
        Launched map tasks=4
        Launched reduce tasks=13
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=41629
        Total time spent by all reduces in occupied slots (ms)=104530
        Total time spent by all map tasks (ms)=41629
        Total time spent by all reduce tasks (ms)=104530
        Total vcore-seconds taken by all map tasks=41629
        Total vcore-seconds taken by all reduce tasks=104530
        Total megabyte-seconds taken by all map tasks=42628096
        Total megabyte-seconds taken by all reduce tasks=107038720
    Map-Reduce Framework
        Map input records=3
        Map output records=13
        Map output bytes=110
        Map output materialized bytes=112
        Input split bytes=117
        Combine input records=13
        Combine output records=6
        Reduce input groups=6
        Reduce shuffle bytes=112
        Reduce input records=6
        Reduce output records=6
        Spilled Records=12
        Shuffled Maps =8
        Failed Shuffles=0
        Merged Map outputs=8
        GC time elapsed (ms)=186
        CPU time spent (ms)=8890
        Physical memory (bytes) snapshot=1408913408
        Virtual memory (bytes) snapshot=5727019008
        Total committed heap usage (bytes)=1808990208
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=58
    File Output Format Counters
        Bytes Written=40

Thanks and Regards,
-Rahul Singh
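
P.S. The console output above only reports the reduce attempts as FAILED without the underlying error. If more detail would help, I can pull the full container logs for the failed run with the YARN CLI, assuming log aggregation is enabled on the cluster; something like the following should work (the output file name is just an example):

    # requires yarn.log-aggregation-enable=true; the output file name is only an example
    yarn logs -applicationId application_1397454060494_0003 > failed_job_container_logs.txt

The per-attempt syslog/stderr in that dump should show why the reducers are dying.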